OMB: 0584-0562

Responses to Comments from NASS

Special Nutrition Program Operations Study (SN-OPS)



We have carefully reviewed the comments and suggestions. Editorial comments not specific to the content of the study were taken into consideration and incorporated where necessary. Below we provide a response to each comment and note the changes made to the surveys.

Part A

  1. Why do the pretests have such a long non-response time?

We adjusted these to 30 minutes. This does not impact the burden because we expected zero nonresponses.

  2. Fill in Type of Respondent for Follow-up Email – Week 6 and Reminder Email – Week 7.

Done in this version.

  3. Please use the same terminology for SD and SFA surveys. Pick one for each web-based/online, follow-up/reminder, hard copy pretest/hard copy survey for pre-test.

Done in this version.

  4. If you are expecting 50 CN director respondents, why are you planning to send 56 thank you letters? Also this does not match the stated expected response rate of 95% for this group.

We corrected this; the number of thank you letters is now 53.
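A quick check of the arithmetic behind the revised figure (assuming, as the comment implies, that the count applies the stated 95% expected response rate to 56 sampled CN directors — an inference, not stated explicitly in the response):

\[
56 \times 0.95 = 53.2 \approx 53
\]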

  5. Please use the same terminology for SD and SFA surveys. Pick one for each web-based/online, follow-up/reminder, hard copy pretest/hard copy survey for pre-test.

Done in this version.

  6. Why do these rows not match the rows in Table A1? Some survey instruments are omitted (e.g., Follow-up Email – Week 3), and some new ones are added (e.g., Telephone Script – Week 3).

The rows now match between Tables A1 and A2.

  7. Table A1 shows a total of 2565.63 burden hours.

We now use the exact burden total from Table A1. (The values differ from the previous draft because they reflect pre-test information.)

  8. Table A3 says data collection starts April 10.

This has been revised. Data collection will begin following OMB approval.

  9. Table A3 says data analysis will not start until August 29.

This has been revised.

  10. Please check all SY references in this paragraph to make sure they are referring to the correct SY.

This has been revised.

  11. Was any non-response weighting done?

No. The nonresponses were from territorial CNs and were deemed too few to affect the estimates.

Part B

  1. American Samoa and the Marshall Islands are not included in any region. Are any SFAs from these territories included in the study?

No.

Appendix A

  1. C8 is listed twice.

We corrected this.

Appendix B

  1. Why are some numbers bolded?

They should not have been in bold font. We corrected this.

Appendix C

  1. Regarding the research question on requesting budget reallocation, “Do you mean 2013-14?”

No, the desired response is for 2012-13.

  2. Regarding the research question on the number of breakfast and lunch serving days in 2013-14, “Will they be able to accurately answer this since the school year will not be complete at the time of response?”

Yes, by the middle of April, directors should know the number of days for the current school year.

Appendix D2

  1. Regarding the inclusion of the CN Director Survey, “If this is different from the survey in Appendix E, please include it in this appendix.”

This refers to the survey in Appendix E.

  2. Regarding the CN Director Survey FAQ, “Do you mean State?”

Yes, we changed the FAQ.

  3. Regarding the CN Director Survey Information Sheet, “Would the CN directors just need to provide their state if they lose their login?”

Yes, we changed the Information Sheet.

Appendix D3

  1. Regarding the burden statement, “I understand the response time is specifically for the email, but I am concerned it might be misleading to the respondent who could assume the response time was related to the survey as a whole.”

We changed the burden statement to say specifically that the time is to read the letter and log in to the survey.

Appendix D4

  1. Regarding comments on the FAQ, “Do you mean state?”, “This answer refers only to SFAs. Please reformulate the answer for CN directors,” and “Fill in (phone) number.”

We have now included the correct FAQ with the appropriate phone number.

Appendix D7

  1. Regarding the inclusion of the survey instrument, “If this is different from the survey in Appendix F, please include it in this appendix.”

This is the survey included in Appendix F.

  2. Regarding the burden statement, “I understand the response time is specifically for the email, but I am concerned it might be misleading to the respondent who could assume the response time was related to the survey as a whole.”

We changed the burden statement to say specifically that the time is to read the letter and log in to the survey.

Appendix D8

  1. Regarding the burden statement, “I understand the response time is specifically for the email, but I am concerned it might be misleading to the respondent who could assume the response time was related to the survey as a whole.”

We changed the burden statement to say specifically that the time is to read the email and log in to the survey.



Appendix E: State Child Nutrition Director Survey

The comment-and-response pairs below summarize the changes made to the State Child Nutrition Director Survey.

NASS Comment: A10a: Are you asking if their original findings resulted in FA, if the results after the appeal resulted in FA, or both?

Response: The question is to determine the total number of appeals and the number of appeals resulting in FA. The second part of the question asks about the SFAs that appealed findings from the new Administrative Review process. (This is question A12a in the revised draft.)

NASS Comment: A11: Unclear as to how findings increase or decrease. Do you mean a change to the number of reviews done? The number of problems found? The number of schools in or out of compliance with the program standards?

Response: We revised the question stem to refer to the number of findings resulting from reviews, not the number of reviews performed. (This is question A13 in the revised draft.)

NASS Comment: B1: Please verify that all references to the 2012-13 SY are correct. Is it possible any of these references should be changed to 2013-14?

Response: The reference to SY 2012-2013 is correct. The Year 1 survey (2011-2012) referenced SY 2010-2011, and the Year 2 survey (2012-2013) referenced SY 2011-2012.

NASS Comment: C3: Do you mean C4?

Response: We corrected the skip instruction.

NASS Comment: C10: Please renumber your answer categories.

Response: We corrected the numbering.

NASS Comment: C10: Why is there not an arrow to C10a like in previous questions?

Response: We added an arrow.

NASS Comment: D2: Add dotted line to make formatting match other answers.

Response: We corrected the formatting. (This is question D3 in the revised draft.)

NASS Comment: D3: Verification and validation of what?

Response: We deleted this question from the revised draft.

NASS Comment: D3: Add dotted line.

Response: We deleted this question from the revised draft.

NASS Comment: D10: Is it possible that different SFAs/schools in a state would use different systems?

Response: We added “State” to the question stem to make it clearer that the question pertains to State Agencies. While SFAs within a State may use different software systems, we do not anticipate that a State Agency would use more than one for its own activities. (This is question D12 in the revised draft.)





Appendix F: SFA Director Survey

The comment-and-response pairs below summarize the changes made to the SFA Director Survey.

NASS Comment: 1.2c: Is this the correct school year?

Response: The Year 2 survey used “if 40% or more of the lunches served by the school are served free or at a reduced price in the second preceding year.” Since Year 3 is fielded in SY 2013-2014, we simplified the instructions by specifying the actual school year.

NASS Comment: 2.1: Why don’t these answer boxes have notches between each number like the other questions requesting numeric answers?

Response: The other questions can more readily accommodate an appropriate number of digits. This question asks about the number of students. The largest district in the U.S., Los Angeles Unified, has a total enrollment of over 900,000. It is not possible to fit an appropriate number of notches across all five columns in this hard-copy version of the instrument. However, the web survey will be programmed to accommodate the correct number of digits.

NASS Comment: 2.2: Will this answer be known if the respondent is completing the survey before the end of the 2013-14 SY?

Response: FNS has already collected information on SY 2012-2013 and prior years but needs SY 2013-2014 for the longitudinal analyses. Pretest participants did not indicate a problem with this question when we asked whether it would be difficult to answer any of the questions about the current school year. The survey will be fielded in late April at the earliest, so most serving days will be known (weather-related school closures are unlikely in late spring).

NASS Comment: 3.3: Are you asking how likely it is that schools not currently operating under the Community Eligibility Provision would switch to it? If an SFA already has school(s) operating under the CEP, wouldn’t they automatically be Very Likely to continue doing so?

Response: We revised the question to specify Provision 1, 2, or 3.

NASS Comment: 3.4: Why is there not an arrow to 3.5 like in previous questions?

Response: There was no arrow because the skip broke across two pages. The revised draft has different pagination, so an arrow was included. (This is question 3.5 in the revised draft.)

NASS Comment: 3.8a: Is this instruction necessary? 4.1 is the next question on the survey.

Response: We deleted the instruction.

NASS Comment: 5.4: Should this be 5.6?

Response: The skip instruction is correct in the revised draft.

NASS Comment: 5.4: In another part of the docket, reference was made to sodium content. Should another line for sodium be included here?

Response: We did not add sodium to this question because it is addressed elsewhere in the survey.

NASS Comment: 5.8, 5.9, 5.10, 5.11, 5.12a, 5.14: Should these categories refer to school type instead? As it is, these grade levels are not mutually exclusive. A respondent thinking of a K-8 school might mark an answer for K-5, 6-8, and other because they are unsure of the correct category.

Response: We revised the wording. (Questions 5.8 and 5.9 were deleted from the revised draft. The remainder are now questions 5.13, 5.14, 5.15a, and 5.18 in the revised draft.)

NASS Comment: 6.1, 6.2: Are the prices necessarily the same for all schools of the same type in an SFA? Should you be asking for the average price instead?

Response: We did not change these questions in order to maintain comparability with the Year 2 survey.

NASS Comment: 6.5: Please review all references to October 2012 and verify they are correct and should not be changed to October 2013.

Response: We updated the year.

NASS Comment: 8.5: Other questions in this section ask about the 2012-13 SY. This question sounds like it is asking about current 2013-14 SY practices. Please clarify.

Response: We added a reference year.

NASS Comment: 9.4: Why is there not an arrow to 9.5 like in previous questions?

Response: There is no arrow because the skip breaks across pages.

NASS Comment: 10.3: Add dotted line to make formatting match other answers.

Response: We corrected the formatting.


