
WIC Infant and Toddler Feeding Practices Study-2 (WIC ITFPS-2)


OMB: 0584-0580


APPENDIX S2

NASS Review Response



We appreciate the comments and questions from NASS. Our responses to the reviewer's questions follow.



Part A:

  1. How did you arrive at your expectations for study participants to return measurement cards or printout measurements from providers' offices?



Response: Our original expectations were based on the preliminary distribution of reporting modes observed while the 48-month measurement data collection was still in progress. With that collection now largely complete (and more measurements received from provider records), we have revised our expected distribution of reporting modes at 72 months. Of the measurements received at 48 months, 39.2% were completed at WIC, 32.4% came from measurement cards completed at providers' offices, and 28.3% came from printouts from providers' offices. At 72 months, we expect a total of 1,331 participants to provide measurements. We anticipate a decrease in measurements completed at WIC (to 36%, or 480 participants), reflecting the fact that respondents will be less likely to return to WIC as their WIC experience becomes more distant. We expect that 451 respondents (34%) will bring their measurement card to their doctor's office for measurement, and that 400 (30%) will provide a printout from a recent provider visit. These percentages are similar to those observed for the 48-month measurement data collection.
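
As an arithmetic check on the figures above (the rounding shown here is ours, not stated in the original response), the projected counts sum exactly to the 1,331 expected measurements, and the stated percentages are those counts expressed as shares of the total:

   Completed at WIC: 480 / 1,331 ≈ 36%
   Measurement card completed at a provider's office: 451 / 1,331 ≈ 34%
   Printout from a provider's office: 400 / 1,331 ≈ 30%
   Total: 480 + 451 + 400 = 1,331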



  2. In A.3, the content does not seem to make sense. How can you anticipate that 100 percent of interview respondents will submit responses electronically? Further, it later states that only 48 percent of responses will be collected electronically. The same is true later for non-electronic data collection.



Response: The statement was intended to convey that 100% of responses received from respondents who complete the interview will be submitted electronically through the computer-assisted telephone interview; it did not account for non-respondents. We have revised this section to state that we anticipate approximately 2,091 respondents, across the 72-month interview and the replicate interview administered to a subset of participants, will submit responses electronically. Thus, considering both respondents and non-respondents, 48 percent of the interview responses will be collected electronically. All other responses will be non-electronic.
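
As a rough reconciliation of the two figures (the total number of cases in the denominator is not stated in the response and is inferred here from the figures given): if the approximately 2,091 electronically submitted responses represent 48 percent of all responses counted in A.3 (respondents and non-respondents combined), the implied total is on the order of 2,091 / 0.48 ≈ 4,356 cases.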

  3. I don’t see Appendices R1 and R2 as referenced in A.6. Further, in Part B, response rates are listed and all are below 100%.



Response: Appendices R1 (NASS review) and R2 (NASS review response) contained only placeholders because we had not yet received the NASS review when the OMB package was submitted to PRAO for review. The NASS review and NASS review response are now included as Appendices S1 and S2.




Part B:

With Appendix V, everything seems to be justified appropriately and well documented. The only items that were unclear are as follows:


  1. In Table B1.2, actual rates were higher than projected. Will new sample rates also be higher for subsequent studies?


Response: The projected response rates for the 42-month through 60-month interviews do not fall under the burden for the current study expansion; they were calculated for the study extension to age 5 (ICR Reference No. 201601-0584-008; expiration date: 07/31/2019). Given that there will be a 1-year gap in data collection between the 60-month and 72-month interviews, with no interview at 66 months, we believe the response rate for the 72-month interview will be lower than for earlier interviews.


  2. Before the first 72-month interview is fielded, 15 percent of participants will be randomly selected for the replicate AMPM interview. What methodology will be used to randomly select these 15 percent? Certain methodologies may allow for targeting specific groups to ensure you do get completed reports for 10 percent of the sample. For B2, the same applies to the 10 percent subsample of caregivers who complete the first interview. What methodology will be used to subsample the caregivers to ensure they are randomly selected?


Response: Prior to fielding the first 72-month interview, we will use the SAS procedure SURVEYSELECT to select a probability sample of 15 percent of all participants who are still enrolled in the study to receive the replicate AMPM interview. We will specify simple random sampling as the selection method, that is, selection with equal probability and without replacement. Some of the 15 percent of respondents selected for the replicate AMPM interview will not complete the initial AMPM interview and thus will not be eligible for the replicate interview; others who complete the initial AMPM interview will choose not to do the replicate interview. Our experience with prior rounds of AMPM data collection for this study has shown that selecting an initial random sample of 15 percent of the eligible population ensures completed replicate AMPM interviews for at least 10 percent of the respondents who complete the initial AMPM interview.
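
For illustration, the selection step described above might look like the following minimal PROC SURVEYSELECT sketch; the data set names (enrolled_72mo, ampm_replicate) and the seed value are placeholders rather than the study's actual files or parameters:

    proc surveyselect data=enrolled_72mo   /* participants still enrolled before the 72-month fielding */
                      out=ampm_replicate   /* selected 15 percent subsample for the replicate AMPM interview */
                      method=srs           /* simple random sampling: equal probability, without replacement */
                      samprate=0.15        /* select 15 percent of the eligible participants */
                      seed=123456;         /* fixed seed so the selection can be reproduced and documented */
    run;

The 15 percent sampling rate builds in a margin over the 10 percent target, so that completed replicate interviews are still obtained for at least 10 percent of initial AMPM completers after some selected participants prove ineligible or decline.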
