Community Eligibility Option Evaluation

OMB: 0584-0570

Appendix K: Summaries of Instrument Pretesting

Introduction

This document presents the results of the pretesting of the various data collection instruments and procedures planned for use in the evaluation.

Pretest Scope. In February and March 2012, Abt Associates conducted pretests of 14 CE Option Evaluation survey instruments for which OMB clearance is being requested. Telephone interviews were conducted to test the State Education Agency (SEA) Survey, Pre-Visit LEA Foodservice Director Questionnaire, and Pre-Visit School Information Questionnaire. Field procedures and testing of on-site school instruments were conducted at six schools in Massachusetts in March. The following instruments and field procedures were tested in three Massachusetts schools: Menu Survey, Meal and Cashier Observation Form, LEA Meal Counting and Claiming Review Form, and Meal Count Verification Form. The Certification Record Abstraction Form, Application Data Form, and Administrative Cost Interview Guide and Preparation Forms were tested in the remaining three Massachusetts school districts. Abt Associates conducted paper-version pretests of the web-based surveys of Participating, Eligible Non-Participating, and Near-Eligible LEAs in the CE Option participating States of Illinois, Kentucky, and Michigan.

The primary objectives of the pretest were to evaluate the:

  • Ability of respondents to understand and respond to questions;

  • Appropriateness of response categories;

  • Assumptions regarding availability of certain data items;

  • Methods of administering the survey instruments; and

  • Length of time required to administer the survey instruments.

We have grouped the pretest procedures and findings into three main sections by the type of data collection they represent. These sections, Telephone Instruments, In-School Instruments, and Web Survey Instruments, are presented below.

Telephone Instruments

State Education Agency (SEA) Survey

Field Procedures

Abt Associates enlisted the help of FNS in contacting the West, Southwest, Mid-Atlantic, and Mountain Plains Regions about recruiting States into the CE Option pretest. FNS emailed each Region to briefly describe the pretest, and Abt Associates followed up with the Regions, asking each to help recruit two States into the pretest. Based on the information provided by the Regions, selected States in the West, Southwest, and Mid-Atlantic Regions were then asked to complete the State Education Agency (SEA) Survey, as well as to provide contact information for LEAs that would consider participating in other components of the pretest (see "Field Procedures" details for the Pre-Visit LEA Foodservice Director Questionnaire and the Local Education Agency (LEA) Survey on Participation, Enrollment, Attendance, and Revenues (PEAR Survey) below). The States in the Mountain Plains Region were asked only to complete the SEA Survey and were not asked to recruit any LEAs for the pretest.

General Findings

The State contact provided by the Region was often the State's Child Nutrition (CN) Director at the Department of Education. After these respondents were asked to help recruit LEAs within their State for the Pre-Visit LEA Questionnaire and the PEAR Survey, they were asked to identify a person at the State level who could answer questions about fund allocation for the SEA Survey. The questions were either reviewed over the telephone or sent by email to help the contact identify the correct respondent for the SEA Survey. This process revealed that no one person at the State level can speak to all the items in the SEA Survey. Among the eight completed SEA Surveys, our final respondents by State included one Director of the Department of Education, three Directors of the Child Nutrition Program, one Operations Director at the Department of Education, and three directors associated with the financial department. The identified respondent for the pretest was often unable to answer with any certainty the SEA Survey questions about the use of Free and Reduced Price Lunch (FRPL) data for a wide variety of programs. Interview debriefings revealed that the final respondents for the pretest would, and sometimes did, poll a variety of State offices to gather more definitive information on the use of FRPL data. The reported burden for this task was low, and it was often conducted via email between officials at their respective State agencies.

Specific Findings

Based on the findings from the pretest, a pre-interview form was created to allow State respondents to poll other State offices and gather more accurate information about the use of FRPL data. Responsibility for completing the questionnaire was kept with a single State contact to reduce non-response through the perceived lack of responsibility inherent in scenarios where instruments are passed from one person to another. The pre-interview form should not only increase the response rate to the SEA Survey, but should also increase the accuracy of answers.

The SEA Survey asks about the use of FRPL data by States and by LEAs to allocate funds to various programs. State respondents had difficulty describing the LEAs’ use of FRPL data to allocate State funds, noting that this decision was left to the LEA and differed among LEAs within a State. In response to this issue, the SEA Survey questions were modified to ask if the State “requires” LEAs to use FRPL data to allocate funds for certain programs.

Finally, questions about possible substitutes for FRPL data were rephrased to elicit the respondent's opinion, as many States in the pretest had not yet discussed, or thought about, the need for alternate data options.

Pre-Visit Local Education Agency (LEA) Foodservice Director Questionnaire

Field Procedures

This questionnaire was pretested at the LEA level in the six Massachusetts school districts selected to pretest the in-school instruments (see the next section for details). These LEAs were also asked for permission to visit one school within the district to field a set of in-school instruments. Additionally, one of the two States selected for the SEA Survey in each of the West, Southwest, and Mid-Atlantic Regions provided an LEA that volunteered to pretest this instrument. In total, nine Pre-Visit LEA Questionnaires were completed.

General Findings

A few confusing questions were identified during the pretest of the Pre-Visit LEA Questionnaire and clarified by adding instructions for the interviewer. For example, Question 2 asked whether there was another person responsible for foodservice accounting, when that person is usually the respondent (the LEA foodservice director). The wording was changed to verify that the LEA foodservice director is the person responsible for foodservice accounting and, if not, to identify who that person is. In Question 3, the answer choices were reordered from most to least frequently selected.

In addition, the method of drawing a sample of applications to be reviewed on site was revised. Previous instructions asked the school to provide a range of student ID numbers, from which the sample was to be selected. However, student ID numbers were not assigned sequentially in each school, so IDs drawn from a range could not be matched to students attending the selected school. In the revised protocol, we will ask about the LEA foodservice director's ability to sort and generate lists of approved and identified, denied, and directly certified students so that the sampling can be done on site in an efficient manner, as sketched below.
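
As an illustration only (the roster format and function below are hypothetical, not part of the study protocol), on-site sampling from such a sorted list could be as simple as a systematic draw:

```python
import random

def systematic_sample(roster, n=50, seed=None):
    """Draw a systematic sample of n records from a sorted student roster.

    roster: a list generated by the LEA foodservice director (approved and
    identified, denied, and directly certified students, concatenated).
    Takes every k-th record after a random start within the first interval.
    """
    if len(roster) <= n:
        return list(roster)
    k = len(roster) / n                        # fractional sampling interval
    start = random.Random(seed).random() * k   # random start in [0, k)
    return [roster[int(start + i * k)] for i in range(n)]
```

The sample size of 50 matches the number of applications reviewed per site during the pretest (see the Certification Record Abstraction Form section below).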

Pre-Visit School Information Questionnaire

Field Procedures

The Pre-Visit School Information Questionnaire was completed by telephone, prior to the site visit, with the school cafeteria manager in each of the three Massachusetts schools selected to participate in an on-site pretest of the Menu Survey, Meal and Cashier Observation Form, and Meal Count Verification Form. The three pre-visit interviews were conducted with cafeteria managers at one elementary school, one middle school, and one high school.

General Findings

Based on the responses and feedback received during and after the interviews, a few revisions were made to improve the flow of the instrument. Some questions were eliminated because responses showed their level of detail to be excessive. The questionnaire previously began by stating that an on-site cafeteria manager would be the only eligible respondent; however, the pretest revealed that other staff, such as liaisons between the district and the school, may be familiar enough with day-to-day procedures in the cafeteria to be suitable respondents. The language of some questions was also updated to reflect changes made to the Menu Survey, which the respondent would also be responsible for completing.

In-School Instruments

An experienced Abt Associates field manager with established relationships with many Massachusetts school districts was able to recruit six Massachusetts LEAs into the CE Option pretest. Each LEA was then asked to identify one school to participate in the pretest. Each school was then asked to pretest one of two sets of in-school instruments:

  • Menu and Observation: Three schools, one in each of three separate school districts, completed the Pre-Visit School Information Questionnaire by telephone (see above for details), and the Menu Survey in paper booklet form. Abt Associates staff visited these same three schools to conduct the Meal and Cashier Observations, and gather data for the Meal Count Verification Form. The Local Education Agency (LEA) Meal Counting and Claiming Review Form was completed at the corresponding LEA for each of these three schools to compare against the data in the Meal Count Verification Form.

  • Record Abstraction: The remaining three LEAs were selected to complete the Pre-Visit LEA Questionnaire by telephone (see above for details), and schedule an appointment for Abt Associates staff to visit the LEA and complete the Certification Record Abstraction Form, Application Data Form, and the Administrative Cost Interview.

Menu Survey

Field Procedures

The Menu Survey was fielded at three Massachusetts schools, each in a different LEA. Surveys were completed in one elementary school, one middle school, and one high school. During the calls to each school's cafeteria manager to complete the Pre-Visit School Information Questionnaire (see above for details), two appointments were scheduled for Abt Associates staff to visit the school. The Menu Survey and its instructions were reviewed in detail with the cafeteria manager on the first visit, and the Menu Survey was left for the cafeteria manager to complete in time for the second visit two days later. While the final CE Option Evaluation will ask managers to complete the Menu Survey for five days (a target week), the pretest version captured details about lunch and breakfast for only one day. Completed Menu Surveys were retrieved during Abt Associates' second visit to the school, the day the Meal and Cashier Observations (see below for details) were completed. Field staff reviewed each completed survey for accuracy while on site, and this review informed the debriefing conducted with each cafeteria manager.

General Findings

Based on feedback received from cafeteria managers, many small adjustments were made to make the forms more user-friendly for respondents. These changes included: reordering some of the pre-listed foods; providing more detailed explanations at the top of each new section; including all of these new details in the Menu Survey instruction booklet; and clarifying when to use Recipe Forms and Made-to-Order Bar forms. Other modifications reflect regulatory changes beginning in the 2012/13 school year, such as removing 2% milk as a pre-filled option, since it will no longer be offered as part of a reimbursable meal. Some respondents were unsure where to find a product code on a case or product label, so the instruction booklet now includes an example of a typical school food label showing how to identify the product code. A Daily Reminder Card was also created after the pretest to help respondents stay organized, as there will be numerous documents and forms to keep track of throughout the target week while completing the Menu Survey.

Meal and Cashier Observation Form

Field Procedures

The Meal and Cashier Observation Form was pretested by two Abt Associates researchers in each of three Massachusetts school cafeterias: one high school, one middle school, and one elementary school. At each location, researchers observed and recorded paired observations of specific foods taken and cashier transactions for 40 students at breakfast and 60 students at lunch. Observations were divided among the cashier lines offering reimbursable meals, and were further divided between multiple lunch periods where available. If the cafeteria (or meal time) had only one cashier, researchers took turns alternating student observations within the line. In addition to capturing the food taken by students, researchers observed how payment was tallied by the cashier and whether meals were counted as reimbursable at the point of transaction. Registers with electronic screen displays were used in two of the three schools, and reimbursable meals were displayed as such on the screen. The third school recorded reimbursable meals with a keypad machine; researchers watched to see whether the cashier pressed a dedicated button on the keypad to mark a meal as reimbursable.

General Findings

Prior to pretesting the instrument, we made one small revision: updating the instructions for coding, in the second column, the meal component that corresponds to each reimbursable food offering and its contribution to the meal pattern. In addition to the meal component abbreviations ("M" for meat/meat alternate, "G" for grains, "V" for vegetable, "F" for fruit, and "Mlk" for milk), we included abbreviations for schools that plan their breakfast meals using nutrient-standard menu planning during SY 2012: "E" for entrée and "SD" for side dish. These are expected to be used only for breakfast observations, where applicable. These were the only changes made to the instrument.

The pretest itself did not reveal any problems with the Meal and Cashier Observation Form, but it did help us develop a list of items to incorporate in observer training. These include meeting with the cafeteria manager and/or head cashier before the observation to determine the best place for observers to stand, so that they can see both the students and the cashiers without getting in the way of either. Observers should also ask the cafeteria manager or head cashier to show, at some point before the meal service, how a reimbursable meal displays when recorded. Finally, observers should confirm the number of students expected at each lunch period so that the division of observations between periods can be calculated before the start of lunch, as sketched below.
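
As an illustration of that last calculation (a minimal sketch with hypothetical period names; the function is ours, not part of the observer training materials), the 60 lunch observations could be allocated in proportion to expected attendance:

```python
def allocate_observations(expected_by_period, total_obs=60):
    """Divide a fixed number of student observations across lunch periods
    in proportion to expected attendance, using largest-remainder rounding."""
    total = sum(expected_by_period.values())
    raw = {p: total_obs * n / total for p, n in expected_by_period.items()}
    alloc = {p: int(v) for p, v in raw.items()}
    # Hand any remaining observations to the periods with the largest remainders.
    leftover = total_obs - sum(alloc.values())
    for p in sorted(raw, key=lambda q: raw[q] - alloc[q], reverse=True)[:leftover]:
        alloc[p] += 1
    return alloc

# For example, with 200, 150, and 250 students expected across three periods:
print(allocate_observations({"first": 200, "second": 150, "third": 250}))
# -> {'first': 20, 'second': 15, 'third': 25}
```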

Local Education Agency (LEA) Meal Counting and Claiming Review Form and Meal Count Verification Form

Field Procedures

Both the Local Education Agency (LEA) Meal Counting and Claiming Review Form and the Meal Count Verification Form were pretested at the three Massachusetts schools and districts that participated in the Menu Survey and the Meal and Cashier Observation pretests. The school in the largest district was asked for the information on the Meal Count Verification Form during the visit when researchers dropped off the Menu Survey, and it was able to provide all the information needed during the observation visit two days later. The information was printed directly from the computer, as counts are computerized with no intermediate forms or tracking. The largest district contained approximately 120 schools and, as of the day we visited the target school for observations, had not completed the Pre-Visit LEA Survey by telephone per the protocol. The LEA was therefore visited on the same day as the target school's Meal and Cashier Observation, and the Pre-Visit LEA Survey was completed in person. At the same time, the LEA was told what information was needed for the LEA Meal Counting and Claiming Review Form, and it was able to send the data for all schools by email the following week. Since all information is computerized, we received one sheet of meal counts per school. These counts were summarized to calculate what should have been claimed and compared against the information in the district's claim form; follow-up calls and emails captured the reason for any discrepancy. A sketch of this comparison appears below.

Of the remaining two districts, one provided both computer printouts and handwritten counts, and its target school provided meal count information directly from its computer system. The final district provided meal counts by day as a printout from its computer system. Its target school also provided computerized counts; while the cafeteria manager was unable to provide school enrollment and eligibility data, this information was collected at the school's main office.
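
The comparison described above is simple arithmetic. The following minimal sketch (the data layout and function name are ours, for illustration only) sums each school's computerized counts and reports any per-category discrepancy against the district's claim:

```python
def check_claim(school_counts, claimed_totals):
    """Compare meal counts summed across schools with the district's claim.

    school_counts: {school: {"free": n, "reduced": n, "paid": n}}
    claimed_totals: {"free": n, "reduced": n, "paid": n} from the claim form
    Returns the per-category discrepancy (computed minus claimed).
    """
    computed = {"free": 0, "reduced": 0, "paid": 0}
    for counts in school_counts.values():
        for category in computed:
            computed[category] += counts.get(category, 0)
    return {c: computed[c] - claimed_totals.get(c, 0) for c in computed}
```

A nonzero entry simply flags a discrepancy to be resolved through the follow-up calls and emails described above.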

General Findings

Prior to the pretest, the forms were reviewed by the staff who would perform the data collection. Instructions were rewritten, and questions reordered for improved flow and clarity. To facilitate data collection, the forms were split into a CE Option version and a non-CE Option version.

The data items for enrollment, average attendance, and number of students approved for free or reduced price meals were moved from the Meal Count Verification Form to the LEA Counting and Claiming Form, as the district was able to provide this information more easily and more consistently than the schools. The data collection protocol and training instructions will be updated so that districts containing more than 15 schools are told, prior to the school visits, what data are needed for these meal count forms, giving them additional time to collect the data.

Certification Record Abstraction Form

Field Procedures

The Certification Record Abstraction Form was fielded at three LEAs in Massachusetts. The forms were completed in each LEA's foodservice office, where the application records were kept. Data was recorded for only one school within the district, chosen for the pretest as the most convenient for the LEA foodservice director to access. In the evaluation, the Certification Record Abstraction Form will be completed for three different pre-selected schools within each LEA. The physical location of the applications (at the school or the foodservice office) was determined during the Pre-Visit LEA Telephone Interview. On site, 50 Free/Reduced Price Lunch (FRPL) applications from the selected school were reviewed in detail and compared, using this form, to a list of approved, denied, and directly certified students generated by the LEA foodservice director. Field staff collected this data independently in the LEA foodservice director's office, asking questions if any issues arose. The Certification Record Abstraction Form, Application Data Form, and Administrative Cost Interview Guide were completed during the same site visit.

General Findings

Based on experience using the Certification Record Abstraction Form during the pretest, a column was added for "Application Number," which helped data collectors keep track of source documents. The "Opted Out" column was removed because it was rare for a student to apply and be approved for FRPL and then opt out of the benefit. Additionally, the flow of information from the different source documents made it difficult to complete the form quickly, so columns I, J, and K were reorganized to be completed for all applications. Check boxes replaced text fields to improve clarity and accuracy and to make the form easier to fill out. In general, the changes were made to improve the efficiency and accuracy of record abstractions.

Application Data Form

Field Procedures

The Application Data Form was also fielded at three Massachusetts LEAs and completed in the LEA foodservice director’s office. The Application Data Form was used on all approved/identified and denied applications and completed in conjunction with the Certification Record Abstraction Form. Roughly 50 Application Data Forms were completed at each site.

General Findings

The Application Data Form took about 3 minutes per form to complete. Data collectors drew information from applications, "Approved/Identified and Denied" lists, and electronic queries. The multiple sources of information led to some disorganization at first, and it was determined that using pre-printed labels in Section A in the full evaluation will help data collectors complete the form more quickly. In addition, more instructions were needed on the form itself so that data collectors know exactly where to find the information for specific sections; this will also be addressed in training. Pre-populated answers were added to make data collection clearer and easier to complete, and questions that were redundant with the Certification Record Abstraction Form were removed.

Administrative Cost Forms: Self-Administered Questionnaire

Field Procedures

During the pretest, the Administrative Cost Interview Self-Administered Questionnaire was sent by email to LEA foodservice directors between 3 and 4 weeks before the site visits. The questionnaire used for the pretest included the Activity Summary Grid and the Staff Rosters. The LEA foodservice directors were asked to complete these forms and to return the Activity Summary Grid by email or fax before the site visit. One LEA returned the Activity Summary Grid before the site visit as requested, and two LEAs completed the grid by telephone before the site visit. One LEA completed and returned the Staff Rosters two weeks before the site visit, one LEA completed the Staff Rosters during the site visit (as planned), and one LEA completed the Central Foodservice and Cafeteria Staff Rosters, but not the School Administrator Staff Roster, after the site visit.

General Findings

Based on the pretest experience, we plan to send out the Administrative Cost Interview Self-Administered Questionnaire 4 to 6 weeks before the site visits. We will request that the LEA return the Activity Summary Grid and the questions on indirect costs and fringe benefits within two weeks. This will allow sufficient time to complete these forms and to follow up and obtain the information by telephone if they are not returned in time to plan and schedule the site visits.

During the pretest, we found that the Summary Grid had to be reviewed over the phone during the Pre-Visit LEA Questionnaire to clarify the sites' responses, and that sites needed extra time after the site visit to fill in the Staff Rosters. With the Summary Grid, we realized that asking about activities alone was not a detailed enough prompt; we needed to ask about the tasks within each activity to determine more accurately which staff conduct these activities. The Summary Grid was updated accordingly. The questions on fringe rates and indirect costs were part of the Administrative Cost Interview Field Questionnaire during the pretest. Because the LEA foodservice director needed more time to obtain this information from other LEA officials, we moved these questions to the Administrative Cost Interview Self-Administered Questionnaire, to be requested in advance; data collectors will follow up if needed during the on-site data collection. The interview questions were modified to better guide the respondent through the form with clear skip patterns, and the Staff Rosters were made easier to complete and clearer to understand by adding check boxes and lines for recording data.

Administrative Cost Interview: Field Questionnaire

Field Procedures

The Administrative Cost Interview was also pretested with LEA foodservice directors at three Massachusetts LEAs. Interviews with the cafeteria manager and school principal were conducted if the LEA foodservice director identified them as taking part in any activities listed on the Activity Summary Grid (see Administrative Cost Forms above for details on this grid). Each interview was recorded for note-taking purposes.

General Findings

During the pretest, we found that capturing time spent on activities was challenging because tasks were performed episodically and respondents preferred to answer in varying time units. To address this, we streamlined time capture by adding check boxes for the unit of time and a column for the period of time. We then subdivided each task into four rows to allow multiple entries, so that different rates of time and different types of employees could be reported. To account for groups of tasks whose time the respondent could not tease apart, we added extra rows at the bottom of each grid where the interviewer can record the task numbers to be grouped and the time spent on them. We also added a section at the end of each grid where respondents can record activities we may have missed. Because the instructions now differ, we developed separate scripts for the LEA foodservice director and for all other respondents. In addition, we changed some questions for the LEA foodservice director so that they are less leading and less sensitive (for example, a question asking if food quality has increased).

Web Survey Instruments

Local Education Agency (LEA) Survey on Participation, Enrollment, Attendance, and Revenues (PEAR Survey)

Field Procedures

Local Education Agencies were recruited for participation in either the PEAR Survey or one of the LEA Foodservice Director Web Surveys (for Participating LEAs, for Eligible Non-Participating LEAs, or for Near-Eligible LEAs; see the sections below for details on each survey) from each of the three CE Option participating States (Illinois, Kentucky, and Michigan) during an exploratory interview with the State's Child Nutrition Director. An additional five LEAs were recruited for the PEAR Survey: two from States in the Mid-Atlantic Region, two from a Western Region State, and one from a Southwest Region State. One respondent recruited from the Western Region State was ineligible to complete the survey because her LEA operated Provision 2 schools, and one respondent from the Mid-Atlantic Region State was dropped from the pretest because her LEA participated in the Summer Foods Study (a USDA program evaluation) and we did not want to overburden it. A total of six LEAs participated in the pretest for the PEAR Survey.

LEAs that were identified by their respective States were initially contacted by email. A follow-up email or phone call was made if necessary. Once LEAs agreed to take part, a paper version of the web survey was either sent to the LEA by FedEx or emailed to the respondent as an attachment. Completed surveys were returned either by pre-paid FedEx or fax. The mode of delivery depended on respondent preference and time constraints of the pretest deadlines. Once the survey was completed and returned, an Abt research team member contacted the respondent for a 15 to 30-minute debriefing interview.

General Findings

The first three respondents (Michigan, Illinois, and Kentucky) completed the attached version of the PEAR Survey for the pretest. Two of these respondents each took 3 hours and 55 minutes to complete the survey. The remaining respondent completed it in 2 hours and 8 minutes, but left all revenue categories blank except for NSLP and SBP payments. This respondent reported that completing the remainder of the survey would have been very time consuming and he wanted to get the survey returned. In addition, he was a new foodservice director who worked for a management company and did not have access to records for SY 2008/09. This raised some concern that a foodservice director employed by a foodservice management company may not be forthcoming with revenue data; in such cases, an introductory letter from USDA may help facilitate the collection of this data in the field.

The majority of respondents expressed some confusion over what we wanted them to report in the “Year End Total for SY” row in the meal counts table. Some totaled the rows above, for the months of October through December. We also discovered that revenues across the months of October, November, and December were quite variable and often depended on the number of school days within the month, which differs due to holidays, snow days, etc.

Based on findings from the first three debriefing interviews, a revised version of the PEAR Survey was developed before the remaining LEAs were recruited into the pretest. The revisions included the addition of a "Number of Operating Days" column in the lunch meal count tables, as well as a separate table to collect year-end total meal counts for breakfast and lunch. The revised survey was administered to two LEAs (in the Western and Mid-Atlantic Region States). The changes helped the LEAs understand and complete the form, and clarified the collection of the meal count data. These two respondents completed the survey in 3.5 hours and 4 hours, respectively. After completing the pretest with five survey respondents, we were unable to recruit additional LEAs before the pretest deadline.

The main barrier to recruitment, and the overall concern, was the length of the survey, which took respondents twice as long to complete as anticipated. To drastically reduce the survey length, it was decided to eliminate the collection of the following data: meal counts, number of operating days, enrollment, average daily attendance, revenue from Federal payments, revenue as the value of commodities received, and revenue from State payments. Instead, these data will be obtained at the State level as part of the CE Option Evaluation data request made to each participating State. Another concern was obtaining complete revenue data for the first half of each school year, since the months of October through December did not appear to be a good proxy and this data was necessary as a comparison for the data collected for SY 2012/13, which will not include year-end totals. Changes were made to allow LEAs to report revenue data either as one figure for the first half of the year or by month (August through December), since not all LEAs report revenue data for the same time period. Other changes include the addition of a question about changes in the average pay per hour of cafeteria staff (to better interpret any changes reported in per meal labor cost) and minor rewording of questions and text to improve flow.

After this considerably shortened version of the PEAR Survey was developed, we received a reply from an LEA in a Southwest Region State that we had originally tried, and failed, to recruit and that was now willing to participate in the pretest. Abt Associates emailed her the new shortened version of the PEAR Survey and conducted a debriefing interview. It took her 14 minutes to complete the survey; however, she was unable to complete the revenue table, the most time consuming section, because her computer was unavailable. She estimated that she would be able to complete the revenue section over the course of two days, accounting for her workload and typical interruptions, but she could not estimate the actual hours it would take.

Overall, respondents reported that the revenue section took the longest amount of time, accounting for about half of the response time. Aside from what has been detailed above, all other survey questions, terms, definitions, and instructions were reported to be clear. Based on the pretest results, we estimate the average completion time for the shortened survey will be 75 minutes.

Although unlikely, in case a participating State is unable to provide data on meal counts, number of operating days, enrollment, average daily attendance, and select revenue categories for sampled LEAs, we will retain a longer version of the PEAR Survey that collects these data items. The main revisions to that instrument are consistent with what has been described.

Local Education Agency (LEA) Foodservice Director Web Surveys

Field Procedures

Pretest activities were conducted with the three LEA implementation web surveys in the three States participating in the Community Eligibility Option in the first year: Illinois, Kentucky, and Michigan. These three surveys were developed for and pretested with three distinct groups of LEAs: 1) eligible and participating (EP) LEAs; 2) eligible and not participating (EN) LEAs; and 3) nearly eligible (NE) LEAs. Because some questions are repeated across the three surveys, and to ensure that no question was asked of more than 9 respondents, some LEAs received the full survey while others received partial surveys containing only the questions specific to that instrument. The goal was 3 completed full surveys from each of the 3 respondent groups and 6 completed partial surveys from each of the EP and EN respondent groups. As the NE survey did not have any unique substantive questions, a partial version of this survey was not pretested. Respondents were contacted and surveys mailed over the course of 3 weeks. A total of 19 pretests (90 percent of the goal) were conducted with the LEAs that responded to the pretest request. Two pretests could not be completed: one full survey was missing from the EP group and one partial survey from the EN group. The breakdown of survey completion and time to complete is as follows:

  • Eligible, participating (EP) LEAs: 2 completed a full survey and 6 completed a partial survey. The full surveys took 29 minutes and 35 minutes to complete; the partial surveys took between 15 and 20 minutes to complete.

  • Eligible, non-participating (EN) LEAs: 3 completed a full survey and 5 completed a partial survey. The full surveys took between 27 and 40 minutes to complete; the partial surveys took less than 10 minutes to complete.

  • Nearly eligible (NE) LEAs: 3 completed a full survey. The full surveys took between 20 and 30 minutes to complete.

General Findings

Some consistent themes emerged across the three surveys, and each survey also received specific feedback. Across the three surveys, respondents reported that the surveys were easy to understand, most questions were clear, response categories were appropriate, and most topics were covered. There were, however, some general issues.

First, there was confusion around the Identified Student Percentage (ISP): its definition, its utility, how it is calculated, and which students can be counted in the calculation. Once the study team explained the ISP, most respondents recognized the term, but few could report the ISP for their LEA. Respondents were more familiar with the term "reimbursement rate" and often responded with that number (i.e., their free meals claiming percentage) when asked for their ISP. The surveys have been clarified and examples added to address this confusion, and questions have been added to assess who calculated the ISP for the LEA.

A second issue was that most respondents were not familiar with the terms Provision 2 and Provision 3, especially in States or LEAs where these Provisions are not used; respondents reported having to look up the terms on the Internet. In response, definitions of these terms have been added to the survey and will be available for respondents to reference as they complete it.

Finally, the question that collects data on students approved for free and reduced price meals by various categories was confusing for some (explained more in the EP section), and data was obtained from different sources to complete the table. Respondents across all three surveys said they would like advance notice that they will need to access data before taking the survey. Accordingly, the question has been revised from its original format, a column has been added to indicate the data source, and an introduction will be included in the cover letter for this survey. Respondents will also have the option of accessing a worksheet in PDF format at the beginning of the web survey, which they can print and use to gather data before completing the web survey. Other general revisions to all surveys included improving clarity and flow by rewording unclear questions, breaking questions into two or more parts, reordering questions, and adding or deleting response options. The issues specific to each survey are presented below, after the following illustration of the ISP calculation.
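
As an illustration, with hypothetical numbers (the 1.6 multiplier is the one respondents cite in the Eligible Non-Participating findings below): a school with 500 enrolled students, 225 of whom are identified students (students certified for free meals without a household application, such as those directly certified through SNAP), has an ISP of 225 ÷ 500 = 45 percent. Under the CE Option, meals are claimed at the free rate for the ISP multiplied by 1.6 (capped at 100 percent), so 45% × 1.6 = 72 percent of this school's meals would be reimbursed at the free rate and the remaining 28 percent at the paid rate. The resulting 72 percent is the free meals claiming percentage that respondents tended to report when asked for their ISP.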

Local Education Agency (LEA) Foodservice Director Web Survey for Participating LEAs

Specific Findings

Respondents who were eligible and participating found the question asking for free and reduced price meals eligibility data confusing, especially those LEAs operating only CE Option schools, since they no longer categorize their students this way for the purpose of counting school meals. Some LEAs reported last year's numbers, and others reported categories derived from an alternative household income form (used in place of the household application for funding purposes); all eligible participating LEAs did collect household income using such an alternative form. Based on this finding, the collection of free and reduced price data was eliminated from the surveys and will be obtained at the State level, and questions about the use of an alternative household income form were added.

This respondent group also reported difficulties with the questions about incentives, barriers, benefits, and problems with implementation of the CE Option. These questions required the respondent to read through a list of response options, choose the ones they felt applied, and rate the importance of the ones chosen, with the option to write in a response if desired. However, very few wrote in responses, and respondents reported that additional response options were not needed. Many respondents checked only a few responses, or checked all of them. In the follow-up debriefing, some respondents reported that they had just skimmed the list and picked a few for ease of responding; several felt the options were too specific or that there were too many to read through. Since these questions are a key piece of the implementation survey, the response options have been consolidated to focus on the most relevant issues and reduced in number to alleviate respondent fatigue. The response options were also reviewed for consistency across the three versions of the implementation surveys and revised accordingly.

Local Education Agency (LEA) Foodservice Director Web Survey for Eligible Non-Participating LEAs

Specific Findings

Respondents who were eligible and not participating reported some barriers to participating that were not previously included in the response categories. These barriers included: stigma imposed on the schools in a district that were eligible to participate relative to those that were not; not having appropriate staff to meet a potential increase in school lunch participation; concerns about the longevity of the CE Option; and using resources to provide free meals to students who otherwise would not need financial help. Several respondents reported that the biggest barrier was concern about the impact of not collecting the household application data needed for other funding streams, and felt the survey did not allow them the opportunity to rate this as a substantial barrier. They also indicated that the Option was not financially viable for them at the 1.6 multiplier. Respondents further reported that there was not sufficient communication about the program, or enough time to address concerns related to perceived barriers, before the decision to implement the Community Eligibility Option had to be made. Accordingly, questions have been added to this survey to address these themes.

Local Education Agency (LEA) Foodservice Director Web Survey for Near-Eligible LEAs

Specific Findings

Respondents who were nearly eligible reported concern about how they would learn, or be notified, if their eligibility status for the CE Option changed in subsequent years. This concern led a couple of respondents to conclude that communication around the Option was not sufficient for their LEA. Accordingly, a question was added to the survey to obtain this information in more detail.

