Child Nutrition Program Operations Study II
(CN-OPS-II) Year 4 Pre-Test Report
August 31, 2018
CONTENTS

INTRODUCTION
1. State Agency Director Pre-test Survey Findings
2. SFA Director Pre-test Survey Findings

TABLES
1. State CN Director Survey Pre-test Feedback
2. SFA Director Pre-test Survey Section Completion Times
3. SFA Director Survey Pre-test Feedback
INTRODUCTION
In August 2018, Mathematica Policy Research (under subcontract to 2M Research) fielded
pre-test versions of the draft State Agency Director and School Food Authority (SFA) Director
surveys for Year 4 of the Child Nutrition Program Operations Study II (CN-OPS-II). The pre-test survey
instruments included newly developed questions and questions that were heavily edited from the
previous Special Nutrition Program Operations Study (SN-OPS) surveys and CN-OPS-II Years
1, 2, and 3 surveys. Two State Child Nutrition (CN) directors and five SFA directors from three
States participated in the pre-test.
Mathematica conducted pre-tests with two State CN directors, in North Carolina and
Pennsylvania, and five SFA directors, in North Carolina, Pennsylvania, and Wisconsin. These
respondents were selected on the basis of their availability within the pre-test timeframe.
Mathematica emailed printable versions of the instruments to confirmed
respondents and scheduled debriefing appointments. Participants printed the surveys, filled
them out by hand, and returned scanned copies of the completed surveys by email. Mathematica
survey staff conducted debriefing interviews with each pre-test respondent to solicit feedback.
Debriefing interviews were scheduled for 30 minutes; the two State CN Director interviews
lasted 50 minutes and 1 hour and 20 minutes, and the SFA Director interviews lasted between
18 and 45 minutes. The interviews focused on asking respondents to identify any questions and
sections that were unclear and to recommend changes to the wording of questions.
This report summarizes key findings from the pre-tests and the revisions that Mathematica
made to the CN-OPS-II Year 4 State Agency and SFA Director Surveys as a result of these
findings.
1. State Agency Director Pre-test Survey Findings
a. Discussion of burden
The two pre-test respondents provided feedback on new survey items in Sections 2
(Subsidies, Resources, and Funding) and 3 (Buy American), and the full set of questions in
Sections 1 (School Nutrition Service Administration, formerly Food Service Administration)
and 5 (Professional Standards), which both included a large portion of newly developed or
heavily revised questions. The focus of the pre-test of the State Agency Director Survey was to
obtain feedback on the quality of the survey items, rather than precise timing estimates. Although
neither respondent was able to time herself completing the survey, neither reported concerns
about the survey being overly time-consuming or burdensome. Based on timings of items from
previous years and feedback from the respondents, we estimate the total State Agency Director
Survey will take about two hours (40 minutes to gather the necessary data to answer the
questions and 80 minutes to respond to the questions). This estimated burden is consistent with
what is planned for the State Agency Director Survey.
b. Discussion of item improvements
Table 1 summarizes the pre-test respondents’ feedback and the changes we made to the
survey to address the respondents’ concerns. Using this feedback, we inserted additional
instructions and revised questions to clarify wording, primarily in Sections 1 and 5. Additionally,
we made several minor wording changes throughout the survey in response to suggestions from
the pre-test respondents. For instance, we replaced “food services” with “school nutrition
services” throughout the survey at the suggestion of one pre-test respondent. Below we discuss
the most substantive changes made to the survey.
We added several response options throughout the survey based on feedback from the pre-test respondents. In 1.12 and 1.15, which ask about the top challenges that charter schools and
Residential Childcare Institutions (RCCIs), respectively, experience with Child Nutrition
program administration, we added the response options, “lack of vendors that can comply with
school nutrition requirements,” “meeting different nutrition standards for separate Child
Nutrition programs (for example, NSLP and CACFP),” “challenging to comply with
procurement regulations,” and modified the response option, “lack of staff to manage the meal
service” to read “lack of qualified staff or dedicated staff positions to manage the meal service.”
In 1.12, which asks about charter schools, we added the response option, “low student
participation that discourages school program participation.” In 1.15, which asks specifically
about RCCIs, we added the response option "Operating Child Nutrition programs in a non-school setting." These additional response options will better capture common challenges.
We reorganized Section 2 (Subsidies, Resources, and Funding) after pre-test respondents
expressed some confusion about question 2.11 (formerly 2.1), which asks about State budget cuts
for Child Nutrition operations. Both respondents reported that their States do not have a budget
for Child Nutrition operations, separate from the Federal funding they receive. We moved the
questions about State provided subsidies and State Administrative Expense (SAE) funds to the
beginning of the section, before questions about State budgets. We added a question (2.10) that
asks if the State has a budget for Child Nutrition that is in addition to Federal funding and
included the clarification, “This question is about your State’s CN budget that is in addition to
Federal funding, such as additional per meal reimbursements, State grants, and in-kind
contributions like office space and computer access.” The survey then asks whether the State
enacted budget cuts in the last few years that have affected Child Nutrition operations. By
organizing the section this way, respondents are able to report on the most common aspects of
funding first, and then, if applicable, report on the impact of State budget cuts on Child Nutrition
operations.
We compared two procedures to determine how to best administer questions 5.1 and 5.2,
which ask about training and technical assistance provided to SFAs by the State agency. In the
original draft of the questions, from the Year 1 survey, the items are set up as grids in which
respondents report whether training or technical assistance was provided for each topic in one
question (5.1) and then indicate who provided the training or technical assistance for each topic
in another grid (5.2). In an effort to determine whether respondents prefer answering one
question for all training and technical assistance topics before moving on to the next question or
responding to all questions about a specific topic before moving on to the next topic, we drafted
separate questions in the pre-test version of the instrument. In this version, respondents were
asked questions 5.1 and 5.2 about each training and technical assistance topic separately before
moving on to the next topic.
Respondents were asked for their thoughts on the question format during the debrief calls.
One respondent reported that completing the survey this way caused fatigue and indicated that
she would prefer that all the topics be listed for one question at once, similar to the way the
question was originally formatted. The other respondent indicated that the new format with
separate questions helped her to think about each topic more carefully, but also indicated that she
thought the survey would take longer to complete this way. Increasing the time needed to
complete the survey may produce more item nonresponse or only partially completed surveys if
fatigued or frustrated respondents opt to exit and not return to answer remaining questions.
Based on this feedback, we plan to keep questions 5.1 and 5.2 in a grid format to reduce burden
on respondents and to maintain consistency with the Year 1 survey.
Table 1. State CN Director Survey Pre-test Feedback

General
Respondent comments: One respondent recommended using the term "school nutrition services" instead of "food services."
Survey revisions: We changed the term "food services" to "school nutrition services" throughout the survey.

1.1-1.3
Respondent comments: Both respondents were unsure whether to include charter schools in their counts of schools.
Survey revisions: We added "including charter schools" to the question stems to clarify that respondents should include charter schools in their counts.

1.4-1.10
Respondent comments: One respondent pointed out that public charter schools were only specified in 1.11.
Survey revisions: We added "public" to the stems in 1.4-1.10 to specify that respondents should be thinking about the public charter schools in their States.

1.11
Respondent comments: One respondent commented that the main way the State agency encourages charter school participation is by coordinating with another State agency that oversees charter schools.
Survey revisions: We added the response option "Working with State agency that oversees charter schools to conduct outreach."

1.12, 1.15 (formerly 1.14)
Respondent comments: One respondent added that charter schools experience a lack of vendors that can comply with meal pattern and food safety requirements.
Survey revisions: We added the response option, "Lack of vendors that can comply with school nutrition requirements."

1.12, 1.15 (formerly 1.14)
Respondent comments: One respondent suggested adding other response options for low student participation and procurement.
Survey revisions: We added the response option "Challenging to comply with procurement regulations" to both questions and "Low student participation that discourages school program participation" to 1.12.

1.12, 1.15 (formerly 1.14)
Respondent comments: Both respondents clarified that charter schools experience challenges with a lack of staff to manage the meal service because there is a lack of qualified or dedicated staff.
Survey revisions: We modified the response option from "lack of staff to manage the meal service" to "Lack of qualified staff or dedicated staff positions to manage the meal service."

1.12, 1.15 (formerly 1.14)
Respondent comments: One respondent suggested that charter schools may lack understanding of the scope and complexity of the school nutrition programs.
Survey revisions: We added the response option "Lack of understanding of scope and complexity of Child Nutrition program" to this question and the related item about RCCIs.

1.15, 1.12 (formerly 1.18)
Respondent comments: One respondent noted that meeting different nutrition standards for breakfast/lunch and dinner/snacks is a challenge that residential child care institutions (RCCIs) experience.
Survey revisions: We added the response option "Meeting different nutrition standards for separate Child Nutrition programs (for example, NSLP and CACFP)" to this survey item and a related survey item about challenges in charter schools.

1.15 (formerly 1.18)
Respondent comments: One respondent commented that one of the biggest challenges with RCCIs is that they do not operate in schools.
Survey revisions: We added the response option, "Operating Child Nutrition programs in a non-school setting."

1.17 (formerly 1.23)
Respondent comments: One respondent expressed confusion about this question, noting that they do not collect meal count data, except as part of the claims for reimbursement.
Survey revisions: We replaced "meal count data" with "meal reimbursement claims data" in questions 1.23-1.26.

1.20-1.22 (formerly 1.26)
Respondent comments: Both respondents expressed confusion with this question, which asks whether the State Agency collects data on various school meal program operations at the school or SFA level, and they were unsure why it was being asked.
Survey revisions: We added an introduction to frame the topic and separated the single grid into three separate questions so respondents can think about the three topics separately.

1.27, 1.28
Respondent comments: Both respondents indicated that these open-ended questions are difficult to answer because they are vague and States oftentimes try to limit the data they collect from SFAs.
Survey revisions: We removed these open-ended questions because previous questions already address the research question.

2.10 (formerly 2.1)
Respondent comments: Both respondents indicated that their States do not have a budget for Child Nutrition operations.
Survey revisions: We added a filter question, 2.10, to ask if the State has a budget for Child Nutrition that is in addition to Federal funding. We also added the note, "This question is about your State's CN budget that is in addition to the Federal funding, such as additional per meal reimbursements, State grants, and in-kind contributions like office space and computer access."

2.11 (formerly 2.1)
Respondent comments: One respondent said they were thinking of the last two to three years when probed for how they defined "recent years."
Survey revisions: We changed "recent years" to "the last two years" to give respondents a specific time frame.

2.12 (formerly 2.2)
Respondent comments: One respondent indicated that it was unclear whether this question was about strategies applied at the State-level or at both the State-level and SFA-level.
Survey revisions: We modified response options to clarify that they were State-level strategies. We changed "Reduce operating days or hours" to "Direct SFAs to reduce program operating days or hours" and changed "Reduce non-Federal meal subsidies" to "Reduce State-provided meal subsidies."

2.13 (formerly 2.3)
Respondent comments: One respondent suggested adding a "Not applicable" response option.
Survey revisions: We added a "Not applicable" response option.

3.1
Respondent comments: Both respondents expressed some confusion with this question because their State policies are the Federal Buy American policy.
Survey revisions: We modified the question stem to clarify that "State policies may be identical to the Federal policy or may include Federal and/or State-specific policy components." As a result, the next survey question (3.2) will capture the components described in the State's policy regardless of whether the State policy is the same as the Federal policy.

5.1
Respondent comments: Both respondents expressed some confusion about the professional standards topics listed and requested examples or further clarification.
Survey revisions: We added an introduction to explain the topics and will link to the Professional Standards Learning Objectives in the survey.

5.1, 5.2, 5.3
Respondent comments: One respondent noted that it is rare for a State to support Child Nutrition administration because there are SAE funds for this purpose.
Survey revisions: This item was modified from Year 1 to include "or fund" in the question stem and read "For each of the following topic areas, did your State agency provide or fund (or does it plan to provide or fund) any training or technical assistance to SFAs in the 2018-19 school year?" The new wording was meant to help States report on trainings even if not provided by the State Agency. We removed "or fund" from the question stem and added the instruction, "Please include training or technical assistance offered by your State agency and non-State agency providers."

5.2
Respondent comments: One respondent indicated that "office staff" in response option 2 could be interpreted as clerical staff.
Survey revisions: We revised response option 2 to read "State Child Nutrition Agency staff."

5.2
Respondent comments: Both respondents recommended adding a response option to account for education contractors and subject matter experts who might provide the training.
Survey revisions: We added the response option, "Education contractor or subject matter expert."

5.3
Respondent comments: One respondent suggested clarifying how "In-person" is different from "Local meetings" or "Conferences."
Survey revisions: We reordered the response options, revised existing response option "Online course" to say "Online course or E-module," and changed "In-person" to "Other in-person format."

5.5-5.10 (formerly 1.14-1.16 and 1.20-1.22)
Respondent comments: One respondent suggested that these questions about challenges charter schools and RCCIs face in meeting professional standards would be more appropriate in Section 5 because they ask about professional standards.
Survey revisions: We moved these questions to Section 5.

5.5, 5.8, 5.11 (formerly 1.15, 1.21, 5.5)
Respondent comments: Both respondents noted several other challenges, including inadequate resources to compensate staff for CE time, a lack of dedicated school nutrition staff, and for RCCIs and charter schools, frequent use of community volunteers. One respondent noted that staff frequently have multiple duties and are unable to be away from school to attend professional training.
Survey revisions: We added item c, "Non-SFA personnel and/or volunteers do not attend training." We revised item b from "Low attendance at trainings (for example SFA personnel lack time or buy-in)" to "SFA personnel lack time to attend training." We modified item h to include "trainings, travel, or hourly wage may be cost-prohibitive."

5.6, 5.9, 5.12 (formerly 1.16, 1.22, 5.6)
Respondent comments: One respondent clarified that it is a challenge to both recruit and compensate qualified applicants that meet the professional standards hiring requirements.
Survey revisions: We changed "find" in items a, b, and c to "recruit or hire."

5.7, 5.10 (formerly 1.14, 1.20)
Respondent comments: One respondent suggested that charter schools may lack understanding of the professional standards and the scope and complexity of the school nutrition programs.
Survey revisions: We added the response options "Schools do not understand or prioritize professional standards" and "RCCIs do not understand or prioritize professional standards."

5.7, 5.10 (formerly 1.16, 1.22)
Respondent comments: One respondent reported that charter schools often have trouble recruiting, hiring, and retaining staff who meet standards.
Survey revisions: We replaced the original response options "Cannot afford to hire individuals who meet standards" and "Applicants do not meet professional standards" with "Unable to retain personnel who meet standards." We removed the response options "Training to meet standards is not accessible or too expensive" and "School nutrition service staff have other responsibilities" because these are now captured in 5.5 and 5.8.

5.11 (formerly 5.5)
Respondent comments: Both respondents suggested including the USDA definition of rural SFAs.
Survey revisions: We added the USDA definition of rural SFAs to the stem.

5.19 (formerly 5.13)
Respondent comments: One respondent expressed confusion about whether this question was asking about State CN directors or local CN directors as well.
Survey revisions: We revised the question stem to clarify that the question is about requirements for State CN directors.
2. SFA Director Pre-test Survey Findings
a. Discussion of burden
Average completion times for individual sections and the total pre-test survey are presented
in Table 2. The pre-test survey included six sections (compared to eight sections in the full SFA
Director Survey). Each pre-test survey section included only newly developed or previously
developed items that had been heavily revised; therefore, the pre-test survey did not test or time
the SFA Director Survey in its entirety.
Respondents were asked to record completion times for three of the six sections pre-tested,
including Sections 5 (Meal Pattern Requirements), 7 (Training and Professional Standards), and
8 (Financial Management). They recorded completion times for these three sections because a
large portion of the items in these sections was new or revised. Pre-test survey completion times
for these three sections were compared to expected completion times based on the assumption
that simple survey items take 15 seconds to complete and complex items take one minute to
complete. To limit the burden of the pre-test, respondents were not asked to record individual
section completion times for pre-tested items in Sections 1 (School Participation), 3 (Meal Prices
and Meal Counting), and 4 (Eligibility Determination and Verification). Therefore, Table 2
includes only calculated expected completion times for pre-tested survey Sections 1, 3, and 4.
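The expected-time calculation behind Table 2 reduces to simple arithmetic over item counts. The minimal Python sketch below restates that rule; the item counts shown are hypothetical placeholders for illustration, not counts taken from the actual instrument.

    def expected_minutes(simple_items, complex_items):
        # Assumes 15 seconds per simple (single-response) item and
        # one minute per complex (multiple-response) item.
        return (simple_items * 15 + complex_items * 60) / 60

    # Hypothetical item mix, for illustration only: 20 simple items and
    # 7 complex items imply an expected completion time of 12 minutes.
    print(expected_minutes(simple_items=20, complex_items=7))  # 12.0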
Completion times varied among SFA respondents, which may be attributed to several
factors. First, the SFAs’ characteristics and operations affect the number and types of responses
completed. Further, SFA directors varied with respect to the amount of time they spent
documenting issues they experienced with the survey, such as items that were challenging to answer, and
with their efficiency at following survey instructions and skip logic on a hard copy instrument.
As noted in Table 2, the total pre-test survey time is overestimated for SFA respondent 2 (SFA2)
because the respondent did not print out the survey to complete it. Instead, the participant
completed the survey on the computer in a Word document not formatted for responses.
Additionally, SFA4 noted that she was frequently interrupted while taking the survey and so
SFA4's survey section completion times are excluded from the table.
Table 2. SFA Director Pre-test Survey Section Completion Times (in minutes)

Section                                           SFA1   SFA2(a)   SFA3   SFA4(b)            SFA5   Average   Expected(c)
Section 1: School Participation                    -       -        -      -                  -       -          2
Section 3: Meal Prices and Meal Counting           -       -        -      -                  -       -          7
Section 4: Eligibility Determination and
  Verification                                     -       -        -      -                  -       -          6
Section 5: Meal Pattern Requirements               6      15        8      -                  7       9         12
Section 7: Training and Professional Standards     4       5        3      -                  4       4          7
Section 8: Financial Management                    2       3        2      -                  3       3          3
Total pre-test survey time(d)                     23      38       23     Estimated 30-40    40      29         37

(a) SFA respondent 2 completed the survey on the computer in a Word document not formatted for responses.
(b) SFA respondent 4 was frequently interrupted while taking the survey.
(c) Expected completion times are calculated based on assuming single-response survey items take 15 seconds to complete, and multiple-response survey items take one minute to complete.
(d) Total pre-test survey time refers to the time it took respondents to complete the pre-test survey in its entirety, which included noting feedback for the interviewers. SFA respondent 4 estimated that the survey would take 30 to 40 minutes without interruptions. To calculate the average total pre-test survey time, we used 35 minutes for SFA4.
New content that was pre-tested and timed took respondents 29 minutes, on average. Using
our assumption that simple survey items take 15 seconds to complete and complex items take
one minute to complete, we estimated the total pre-test survey would take respondents 37
minutes, on average. We expect that the difference between our estimated completion time and
the average reported completion time is primarily because respondents did not answer all
questions due to the skip logic (the calculated expected completion time does not account for
skipped items). This is particularly evident in Sections 5 and 7 (Meal Pattern Requirements and
Training and Professional Standards, respectively), where the expected completion times were
higher than the reported averages. Most respondents reported they do not provide
accommodations to students with dietary preferences (5.41) and were not currently using or
planning to use the USDA Child Nutrition Program’s Professional Standards Training Tracker
Tool (PSTTT) (7.9) and so they skipped a number of related questions. Pre-test respondents also
reported that they provided estimated percentages for items such as 1% flavored milk purchases
and special orders for students with disabilities and conditions or dietary preferences, rather than
looking up the information, which reduced the time they spent completing the survey.
With this consideration, we averaged the expected pre-test completion time and the average
reported time to estimate that the pre-tested items will take respondents 32 minutes to complete,
on average. Remaining SFA Survey content is expected to take 71 minutes to complete, on
average, for a full instrument estimated completion time of 103 minutes. Factoring in an
additional 15-20 minutes (or 17 minutes, on average) for SFA directors to review their responses,
the total estimated time burden associated with the SFA Director Survey is 2 hours per
respondent. If FNS would like to reduce the burden of the SFA Director Survey, the following
options would result in substantial decreases:
- Removing items 5.1 to 5.6d on USDA Foods would reduce length by an estimated 7.5 minutes.
- Removing items 5.24 to 5.30 about student demand for certain types of food and student acceptance strategies would reduce length by an estimated 7 minutes.
- Removing items 7.1 to 7.6 about training and technical assistance would reduce length by 12 minutes.
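As a quick check on the burden roll-up above, the rounded figures quoted in this section can be restated as arithmetic; the sketch below uses only numbers reported in the text, and the variable names are ours.

    # Burden roll-up, in minutes, using the rounded figures quoted above.
    pretested_items = 32     # averaged estimate for the pre-tested items
    remaining_content = 71   # expected time for remaining SFA Survey content
    review_time = 17         # average time for SFA directors to review responses

    full_instrument = pretested_items + remaining_content  # 103 minutes
    total_burden = full_instrument + review_time           # 120 minutes = 2 hours

    # Estimated savings from the optional removals listed above.
    reductions = {"5.1-5.6d": 7.5, "5.24-5.30": 7.0, "7.1-7.6": 12.0}
    print(total_burden, sum(reductions.values()))          # 120 26.5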
b. Discussion of item improvements
The details of respondent feedback and resulting instrument revisions are summarized in
Table 3. Overall, pre-test respondents reported that the questions in most sections of the SFA
Director Pre-test Survey (Appendix B) were clearly worded and easy to answer. Pre-test
respondents offered several recommendations for minor wording changes to question stems and
response options as well as opportunities to filter respondents out of answering questions. Below,
we discuss the most substantive changes made to the survey.
Because several respondents commented that the items in questions 3.11 and 3.12 were too
numerous and were listed in a confusing order, we reordered the items based on descending
order of reported frequency in Year 2. This will help respondents more easily identify which
point of service methods schools in their SFAs use.[1] We also made some changes to the item
wording. We modified item c, "Portable Scanners or PIN Entry Pad," to include the clarification, "that
are staff operated,” because respondents reported that they did not understand how this item was
different from option b, “Personal Identification Numbers (PINs) that are student-entered.” This
will help clarify that item c is asking about staff operated PINs or scanners, rather than student
entered PINs. We also combined former items a and h to be the new item g, “Rosters, cashier
lists, cash register tapes, or manual entry” because respondents reported that they consider
manual entry to be the same as these other methods.

[1] These questions have a yes/no format to ensure respondents must still read through the entire list of methods instead of marking the first applicable method and moving to the next question without reading the remaining response options.

Some respondents reported confusion when responding to questions 3.11 and 3.12 about
methods used to track the number of free, reduced price, and paid breakfasts and lunches served
to students in non-cafeteria point of service methods. Respondents were unsure what constitutes
a non-cafeteria point of service or said that they do not use them. We modified questions 3.11
and 3.12 to ask about methods for each cafeteria point of service. Then we created new
questions, 3.11a and 3.12a, which ask respondents to select all the methods that schools use in
non-cafeteria points of service. The "select all that apply" format will streamline the
respondent’s experience and provide a list of methods used among all non-cafeteria points of
service. Furthermore, we added examples of non-cafeteria point of service methods (classroom,
bus, or outdoors) to the question and included a "Not applicable- schools do not offer non-cafeteria points of service" response option. The new questions will help respondents think
separately about non-cafeteria points of service methods, and the “Not applicable” response
option will allow respondents to indicate that the schools in their SFA do not offer non-cafeteria
points of service. We removed question 3.13 because respondents expressed confusion with the
term “alternatives to the traditional cashier model” and reported that they did not understand how
3.13 differed from the previous set of questions. With questions 3.9 through 3.12, we are still
able to answer the research question, “What alternatives to the traditional cashier model are
used?” which question 3.13 was meant to address.
Some respondents reported difficulty responding to questions 5.31 and 5.32, noting that the
wording and the request to report amounts as a percentage of dollars was confusing. We changed
the questions to first ask about how much total milk in dollars was purchased and then ask how
much 1% flavored milk in dollars was purchased in the 2017-18 and 2018-19 school years.
Although this increases the number of questions in the survey, framing the questions in this way
will reduce the burden for SFA respondents because they will no longer have to calculate the
percentages of 1% flavored milk purchased. One respondent indicated that they would not be
able to report on total milk purchases in SY 2018-19 until July 2019. We changed these
questions to ask for milk purchases "to date in SY 2018-19." Note that dollar amounts reported mid-school year may lead to analysis challenges because the full 2017-18 school year of milk
purchases will be compared to a partial 2018-19 school year of milk purchases. That is, the
calculated percentage of 1% milk purchases in the second school year may be overestimated or
underestimated if SFAs stagger or vary the size of 1% milk purchases throughout the year.
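To make the partial-year comparison issue concrete, the share of 1% flavored milk implied by the revised questions is the ratio of the two reported dollar amounts. The sketch below uses hypothetical purchase figures (not survey data) to show how a front-loaded ordering pattern can inflate a year-to-date share.

    def flavored_share(flavored_dollars, total_dollars):
        # Percentage of total milk purchases (in dollars) that is 1% flavored milk.
        return 100 * flavored_dollars / total_dollars

    # Hypothetical purchases, for illustration only (not survey data).
    sy_2017_18 = flavored_share(12_000, 80_000)  # full year: 15.0 percent
    sy_2018_19 = flavored_share(9_000, 45_000)   # year to date: 20.0 percent
    print(sy_2017_18, sy_2018_19)  # 15.0 20.0
    # If 1% flavored milk orders are front-loaded, the year-to-date share
    # overstates the eventual full-year share.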
Several respondents reported that they do not offer accommodations for students with
dietary preferences and expressed some confusion about having to answer questions about these
accommodations in 5.33 through 5.42 (now 5.36-5.45). In response, we reorganized these
questions to first ask whether the SFA has any schools that accommodate students with
disabilities or conditions, including food allergies, or other dietary preferences (5.36). If SFAs
respond “no”, they now skip all the questions that ask about accommodations. If they do have
schools that make accommodations, they first go through all the questions about
accommodations for students with disabilities or conditions (5.37-5.40). They then receive a
question about which dietary preferences they accommodate (5.41) and, if any are selected, are
asked about accommodations for students with dietary preferences (5.42-5.45). This organization
will reduce burden and potential confusion by filtering respondents out of questions that are not
applicable to schools in their SFA. Moreover, respondents will be prompted to think first about
accommodations for students with disabilities and conditions and then about accommodations
for students with dietary preferences.
Table 3. SFA Director Survey Pre-test Feedback

General
Respondent comments: Participants suggested including a list of the resources they may need, like their latest milk invoice and paid meal pricing, at the beginning of the survey so they do not have to frequently pause to look up information while completing the survey.
Survey revisions: As in previous years, we will include details on the resources respondents may want to gather before beginning the survey in the Frequently Asked Questions document provided to respondents.

1.3
Respondent comments: SFA1 suggested adding a filter question because some SFAs may not operate any Child Nutrition programs other than SBP and NSLP.
Survey revisions: We added question 1.3 as a filter question to gauge if the SFA participates in any Child Nutrition programs in addition to NSLP and SBP.

1.6 (formerly 1.3), 1.4
Respondent comments: SFA1 was confused about how to answer this question because their schools have a catering contract for CACFP, but they are not the sponsor.
Survey revisions: We added "regardless of whether your SFA is the sponsor" to the beginning of question 1.4 to clarify that respondents should be thinking specifically about school participation in CACFP.

1.14 (formerly 1.13)
Respondent comments: SFA1 suggested adding a filter question because some SFAs may not operate any Child Nutrition programs other than SBP and NSLP.
Survey revisions: We removed "if your SFA participates in one or more Child Nutrition programs in addition to NSLP and SBP" from the question stem in 1.14 because respondents will only receive this question if they participate in other Child Nutrition programs. The question now reads "What are the major administrative challenges that your SFA encounters in participating in any child nutrition programs other than the NSLP and SBP?" We also deleted the "Not Applicable" response option due to new skip logic from 1.3.

3.4
Respondent comments: SFA4 noted that they do not offer adult meals, only a la carte items.
Survey revisions: We added a "not applicable" option and included an instruction: "If reimbursable breakfasts or lunches are not served to adults or adults purchase reimbursable meal components a la carte, please check the appropriate box."

3.5, 3.5a, 3.6
Respondent comments: SFA2 noted in the debrief call that they misread the question about Paid Lunch Equity exemption and incorrectly marked no.
Survey revisions: We added the new question 3.5, "Does your SFA's paid meal pricing comply with the Paid Lunch Equity provision?" and simplified the response options in new question 3.5a (formerly 3.5) so the response options are simply "Yes", "No", and "Don't know." We also revised response options 1 and 2 in 3.6 for clarity and removed the response option, "Paid lunch pricing already complied with provision," because respondents will skip this item if they respond "yes" to 3.5.

3.9, 3.10
Respondent comments: SFA1 and SFA2 said that point of service methods are how they count students but the question asks about how they are serving meals and suggested removing "point of" from the term. SFA2 reported that they did not understand the difference between Grab 'N' Go and kiosk or cart.
Survey revisions: We removed "point of" so that the question asks about service methods. We moved the "meal delivery to the classroom" item from item b to c and edited the new item b to read "Kiosk or cart (not for Grab 'N' Go)".

3.11, 3.11a, 3.12, 3.12a
Respondent comments: SFA1 checked methods that they used when normal methods were down, in addition to those they regularly use. Several respondents thought the response items were too numerous and were listed in a confusing order. SFA2 reported that former c and d (PINs and PIN entry pads) were the same thing. SFA3 and SFA5 were unsure what non-cafeteria POS were and suggested adding examples.
Survey revisions: We clarified the question by asking which methods schools "regularly" use for tracking. We changed the order of the response options based on respondent comments and the prevalence of responses from Year 2, listing the most prevalent methods first. We added the clarification "that are staff operated" to item c, Portable Scanners or Pin Entry Pads. We combined former items a and h to be item g, "Rosters, cashier lists, cash register tapes, or manual entry." We moved the question about non-cafeteria POS methods to be a separate question, which asks respondents to select only the methods used, rather than responding "Yes" or "No" to whether schools use each item. We added examples of non-cafeteria POS methods to the question and included a "Not applicable- schools do not offer non-cafeteria points of service" response option.

Formerly 3.13
Respondent comments: Respondents reported that they did not understand the term "alternatives to the traditional cashier model" and did not understand how 3.13 differed from the previous set of questions.
Survey revisions: We recommend omitting this question because we capture alternatives to the traditional cashier model in questions 3.9-3.12.

4.1
Respondent comments: SFA1 was not familiar with Provisions 2 or 3.
Survey revisions: We added definitions for Provision 2, Provision 3, and the Community Eligibility Provision (CEP) from SNMCS-II.

4.6 (formerly 4.6 and 4.7)
Respondent comments: SFA3 suggested that question 4.6 was unnecessary because all SFAs should answer yes to this question. SFA1 suggested adding examples such as household letter and district webpage. SFA4 suggested adding "all call" to response option 3.
Survey revisions: We recommend omitting question 4.6. In the new question 4.6, we added examples to response options 1, 3, and 4, removed response option 5, and included a "not applicable" category.

4.11 (formerly 4.12)
Respondent comments: SFA1 reported that the question was confusing because they thought SFAs were not allowed to start verification before October 1st.
Survey revisions: We added "no longer" to the beginning of the question stem to clarify the verification process requirement change.

4.15 (formerly 4.16)
Respondent comments: SFA1 was confused by this question because their direct certification process is automated by the State.
Survey revisions: We added "or State" to the question so that respondents select "Yes" if their SFA or State uses the direct certification process.

4.16 (formerly 4.17)
Respondent comments: SFA1 listed Medicaid in the other response option.
Survey revisions: We added instructions that "we will ask about Medicaid in the next question."

4.20 (formerly 4.21)
Respondent comments: SFA4 was confused about how to answer this question because they conduct direct certification and offer online applications at the same time.
Survey revisions: We added the instructions, "If these processes are conducted at the same time, please select No" to clarify which option to choose if direct certification is conducted at the same time household applications are available.

5.28j
Respondent comments: SFA4 checked the FFVP response option but noted that her SFA used this strategy in past years, not the school year in question.
Survey revisions: We added "in SY 2018-19" to the FFVP response option to reiterate that respondents should only be thinking about activities in the current school year.

5.31-5.34 (formerly 5.31-5.32)
Respondent comments: SFA1 noted that they will not have information on the amount of 1% flavored milk purchased in SY 2018-19 until July 2019. Respondents differed in their answers on which units would be easiest to report and expressed some confusion about giving percentages of specific units. When probed, multiple respondents indicated it would be easier to report on the total amount of milk purchased and total amount of 1% flavored milk purchased in a specific unit. SFA5 suggested emphasizing the years because they initially thought these questions were the same.
Survey revisions: We changed the questions to first ask about how much total milk was purchased, in dollars, and then ask how much 1% flavored milk was purchased, in dollars, in the 2017-18 and 2018-19 school years. New question 5.31 asks respondents about total milk purchased in SY 2017-18. New question 5.32 asks respondents about total 1% flavored milk purchased in SY 2017-18. New question 5.33 asks respondents about total milk purchased to date in SY 2018-19. New question 5.34 asks respondents about total 1% flavored milk purchased to date in SY 2018-19.

5.35-5.44 (formerly 5.33-5.42)
Respondent comments: Most respondents reported that they do not offer accommodations for students with dietary preferences and expressed some confusion about having to answer questions about them.
Survey revisions: We reorganized these questions to first ask whether the SFA has any schools that accommodate students with disabilities or conditions or other dietary preferences, including food allergies, followed by questions about dietary preferences. We added skip logic to 5.35 so respondents will skip all questions about accommodations if the SFA does not provide them to students.

5.38, 5.43 (formerly 5.39, 5.40)
Respondent comments: SFA2 was not sure if the question is referring to activities conducted by only food service staff or the SFA as a whole. SFA3 said that getting the required documentation from the physician through the school nurse or parent requires the most resources.
Survey revisions: We eliminated "or food service staff," which is consistent with other items in this section. We added "work with parents or school personnel to obtain required documentation from the physician" as response option 1 for 5.38. We removed the "not applicable" response option since respondents that do not offer accommodations will skip the question.

5.40 (formerly 5.34)
Respondent comments: SFA2 suggested adding "individual" before students because they initially misinterpreted the question to include vegetarian planning. SFA1 said that the gluten-free category can be interpreted as preferential or allergy-related.
Survey revisions: We added instructions that the next set of questions ask about how SFAs accommodate students with dietary preferences in SY 2018-19. We clarified the meaning of dietary preferences by specifying "individual" students and adding instructions to exclude options offered to all students on the regular menu. We added "excluding allergies and medical conditions" to 5.33c.

7.1, 7.2, 7.4
Respondent comments: We modified these questions based on feedback to questions 5.1-5.3 in the State Agency Director pre-test.
Survey revisions: We added an introduction to explain the topics and will link to the Professional Standards Learning Objectives in the survey. We reordered the items to more closely align with those in the State Agency Director Survey.

7.5, 7.6 (formerly 7.7 and 7.8)
Respondent comments: SFA1 said that these questions may be hard to responsibly answer because they ask about staff opinions.
Survey revisions: We added "don't know" response options.

7.9 (formerly 7.11)
Respondent comments: SFA2 and SFA5 were not familiar with the FNS tracking tool in original 7.11a; both referred to using a USDA tracking tool. SFA4 said that a lot of SFAs use Excel spreadsheets to track training.
Survey revisions: We replaced "The FNS Professional Standards Training Tracking Tool 2.0" with "The USDA Child Nutrition Program's Professional Standards Training Tracker Tool (PSTTT)" in 7.11a. We replaced FNS with USDA in 7.11-7.13. In 7.9, we also added a new option for "SNA Developed Professional Standards Online Training Tracking Tool". We edited 7.11c to read "Computer- or Excel-based tracking tool other than USDA PSTTT and SNA online training tool".

7.20, 7.18, 7.19 (formerly 7.5, 7.6)
Respondent comments: Respondents expressed some confusion at these questions and were unsure what was meant by "State-recognized certificate."
Survey revisions: We added "This could include a School Nutrition Association (SNA) Certificate or other State-recognized certificate" to the question. We moved former questions 7.5 and 7.6 to 7.18 and 7.19 to be in the same set as other questions that ask about SFA director credentials and experience to better frame the questions for respondents.

8.11
Respondent comments: SFA1 listed "nightly notifications: email and call parents" in the other response category. Respondents were confused by 8.11e. Respondents noted that 8.11f varies by grade and suggested examples to include.
Survey revisions: We reordered response options. We added response options "notify households of negative balances" and "encourage households to apply for free or reduced price meals". We added "events" and "prom" as examples of using administrative actions. We added the example of retroactively approving the student. We removed the response option "no effort made" because SFAs that make no effort to recover money for unpaid meal charges will select "no" to all items.

8.12
Respondent comments: SFA1 suggested adding a "none" category because she did not consider her SFA successful in recovering funds.
Survey revisions: We changed directions to "below" since response options are below, not above, the question. We added response option "none of these steps were successful".