Responses to Comments from NASS
Special Nutrition Program Operations Study (SN-OPS)
We have carefully reviewed the comments and suggestions. Editorial comments not specific to the content of the study were taken into consideration and incorporated where necessary. Below we provide responses to each comment and note the corresponding changes to the surveys.
We adjusted these to 30 minutes. This does not impact the burden because we expected zero nonresponses.
Done in this version.
Done in this version.
We fixed this. The number is 53.
Done in this version.
The rows now match between A1 and A2.
We use the exact burden from Table A1. (Values are different now, reflecting pre-test information.)
This has been revised. Data collection will begin following OMB approval.
This has been revised.
This has been revised.
No. The nonresponses were from territorial CNs and were deemed to be small enough so as to not affect the estimates.
No.
We corrected this.
They should not have been in bold font. We corrected this.
No, the desired response is for 2012-13.
Yes, by the middle of April, directors should know the number of days for the current school year.
This refers to the survey in Appendix E.
Yes, we changed the FAQ.
Yes, we changed the Information Sheet.
We changed the burden statement to specifically say the time is to read the letter and log in to the survey.
We have now included the correct FAQ with appropriate phone number.
This is the survey included in Appendix F.
We changed the burden statement to specifically say the time is to read the letter and log in to the survey.
We changed the burden statement to specifically say the time is to read the email and log in to the survey.
The table below summarizes the changes made to the State Child Nutrition Director Survey.
NASS Comment: A10a: Are you asking if their original findings resulted in FA, if the results after the appeal resulted in FA, or both?
Response: The question is to determine the total number of appeals and the number of appeals resulting in FA. The second part of the question asks about "the SFAs that appealed findings from the new Administrative Review process." (This is question A12a in the revised draft.)

NASS Comment: A11: Unclear as to how findings increase or decrease. Do you mean a change to the number of reviews done? The number of problems found? The number of schools in or out of compliance with the program standards?
Response: We revised the question stem to refer to the number of findings resulting from reviews, not the number of reviews performed. (This is question A13 in the revised draft.)

NASS Comment: B1: Please verify that all references to the 2012-13 SY are correct. Is it possible any of these references should be changed to 2013-14?
Response: The reference to SY 2012-2013 is correct. The Year 1 survey (2011-2012) referenced SY 2010-2011, and the Year 2 survey (2012-2013) referenced SY 2011-2012.

NASS Comment: C3: Do you mean C4?
Response: We corrected the skip instruction.

NASS Comment: C10: Please renumber your answer categories.
Response: We corrected the numbering.

NASS Comment: C10: Why is there not an arrow to C10a like in previous questions?
Response: We added an arrow.

NASS Comment: D2: Add dotted line to make formatting match other answers.
Response: We corrected the formatting. (This is question D3 in the revised draft.)

NASS Comment: D3: Verification and validation of what?
Response: We deleted this question from the revised draft.

NASS Comment: D3: Add dotted line.
Response: We deleted this question from the revised draft.

NASS Comment: D10: Is it possible that different SFAs/schools in a state would use different systems?
Response: We added "State" to the question stem to make it clearer that the question pertains to State Agencies. While SFAs within a State may use different software systems, we do not anticipate that a State Agency would use more than one for its own activities. (This is question D12 in the revised draft.)
The table below summarizes the changes made to the SFA Director Survey.
NASS Comment: 1.2c: Is this the correct school year?
Response: The Year 2 survey used "if 40% or more of the lunches served by the school are served free or at a reduced price in the second preceding year." Since Year 3 is fielded in SY 2013-2014, we simplified the instructions by specifying the actual school year.

NASS Comment: 2.1: Why don't these answer boxes have notches between each number like the other questions requesting numeric answers?
Response: The other questions can more readily accommodate an appropriate number of digits. This question asks about the number of students. The largest district in the U.S., Los Angeles Unified, has a total enrollment of over 900,000. It is not possible to fit an appropriate number of notches across all five columns in this hard-copy version of the instrument. However, the web survey will be programmed to accommodate the correct number of digits.

NASS Comment: 2.2: Will this answer be known if the respondent is completing the survey before the end of the 2013-14 SY?
Response: FNS has already collected information on SY 2012-2013 and prior years but needs SY 2013-2014 for the longitudinal analyses. Pretest participants did not indicate a problem with the question when we asked whether it would be difficult to answer any of the questions about the current school year. The survey will be fielded in late April at the earliest, so most serving days will be known (weather-related school closures are unlikely in late spring).

NASS Comment: 3.3: Are you asking how likely it is that schools not currently operating under the Community Eligibility Provision would switch to it? If an SFA already has school(s) operating under the CEP, wouldn't they automatically be Very Likely to continue doing so?
Response: We revised the question to specify Provision 1, 2, or 3.

NASS Comment: 3.4: Why is there not an arrow to 3.5 like in previous questions?
Response: There was no arrow because the skip broke across two pages. The revised draft has different pagination, so an arrow was included. (This is question 3.5 in the revised draft.)

NASS Comment: 3.8a: Is this instruction necessary? 4.1 is the next question on the survey.
Response: We deleted the instruction.

NASS Comment: 5.4: Should this be 5.6?
Response: The skip instruction is correct in the revised draft.

NASS Comment: 5.4: In another part of the docket, reference was made to sodium content. Should another line for sodium be included here?
Response: We did not add sodium to this question because it is addressed elsewhere in the survey.

NASS Comment: 5.8, 5.9, 5.10, 5.11, 5.12a, 5.14: Should these categories refer to school type instead? As it is, these grade levels are not mutually exclusive. A respondent thinking of a K-8 school might mark an answer for K-5, 6-8, and other because they are unsure of the correct category.
Response: We revised the wording. (Questions 5.8 and 5.9 were deleted from the revised draft. The remainder are now questions 5.13, 5.14, 5.15a, and 5.18 in the revised draft.)

NASS Comment: 6.1, 6.2: Are the prices necessarily the same for all schools of the same type in an SFA? Should you be asking for the average price instead?
Response: We did not change these questions in order to maintain comparability with the Year 2 survey.

NASS Comment: 6.5: Please review all references to October 2012 and verify they are correct and should not be changed to October 2013.
Response: We updated the year.

NASS Comment: 8.5: Other questions in this section ask about the 2012-13 SY. This question sounds like it is asking about current 2013-14 SY practices. Please clarify.
Response: We added a reference year.

NASS Comment: 9.4: Why is there not an arrow to 9.5 like in previous questions?
Response: There is no arrow because the skip breaks across pages.

NASS Comment: 10.3: Add dotted line to make formatting match other answers.
Response: We corrected the formatting.
Author | Kim Standing |
File Created | 2021-01-28 |