Attachment D.2: Cognitive Interview Testing Report

Review of Child Nutrition Data & Analysis for Program Management

OMB Control Number: 0584-0620




1. Participant Recruitment

The survey development researchers recruited a cognitive interview sample of nine participants, consisting of representatives from three State agencies and six School Food Authorities (SFAs), using data from a preliminary examination of the variety of data systems used by State agencies and SFAs. The selected States and SFAs represented a range of geographic areas, population sizes, socioeconomic make-ups, and levels of management information system (MIS) sophistication.

Table 1 shows the States and SFAs that agreed to participate, along with the date and time of each cognitive interview.

TABLE 1: Cognitive Interview Subjects for State and SFA Survey Instruments

Name | State/SFA | Interview Date | Interview Time (EST)
Vonda Cooke | Pennsylvania | 1/11/2016 | 1:30-4:30 PM
Marla Moss, Peter Cyril Jones, and Diane Golzynski | Michigan | 1/13/2016 | 2:00-4:00 PM
Karen Wooton and Dana Doerhoff | Missouri | 1/14/2016 | 1:00-4:00 PM
Christopher Melonas | Wirt County Schools, WV | 1/12/2016 | 11:00 AM-2:00 PM
Marcie Christiansen | Lake Oswego School District, OR | 1/12/2016 | 1:00-4:00 PM
Peggy Terry | Wilcox County Schools, AL | 1/12/2016 | 3:00-5:00 PM
Sidney Vinson | Detroit Public Schools, MI | 1/14/2016 | 9:00 AM-12:00 PM
Lilly Bouie and Kathy Brown | Little Rock School District, AR | 1/14/2016 | 11:00 AM-1:00 PM
Marlene Pfeiffer and Susan Barks | Parkway School District, MO | 1/15/2016 | 12:00-3:00 PM



All interviewees received instructions, a copy of the consent form, and the draft survey instrument in advance of the interview. Interviewees had the option to invite additional staff to participate based on their preliminary review of the survey. Appointments were scheduled for a two- to three-hour time block.

2. Cognitive Interview Guides

The survey development team developed cognitive interview guides designed to test the adequacy and clarity of survey content. Before creating the cognitive guides, the survey instruments were modified in response to FNS comments. The guides contained interview questions designed to detect problems such as poorly understood questions, terms that were not well-defined, inadequate response categories, difficult transitions between topics, and unclear instructions.

All five IMPAQ staff who conducted the interviews attended a three-hour training on the cognitive interview process and guides. During this training, interviewers received instruction in standardized interview techniques, engaging respondents, and conducting the consent process.

3. Conducting Interviews

Teams of two survey development staff conducted each interview by phone during the week of January 11-15, 2016. Each interview covered the entire survey instrument and took an average of 2½ hours to conduct. After obtaining consent, one member of the team took extensive notes and ensured that the interview was recorded. The lead interviewer followed the cognitive interview guide for the applicable survey, reading survey instructions and questions aloud. Interviewees confirmed their survey answers and responded to the interviewer’s probe questions about how they understood each survey question and its response options, and about the process they used to arrive at their responses. The interviewer then engaged the respondent in a conversation exploring any areas of confusion in the question wording or response options. FNS staff were invited to observe all cognitive interviews.


4. Cognitive Test Results

Based on feedback from interview subjects, we revised the draft surveys around the following themes:

  1. Improving the clarity of questions by changing wording, or adding descriptions and/or examples;

  2. Improving the clarity of response options by adding descriptions and examples, or deleting unnecessary language;

  3. Changing or adding response options to reduce respondent burden by making the question easier for respondents to answer;

  4. Consolidating or moving questions, or structurally changing them to ensure that items are in logical order and flow well for the respondent; and

  5. Deleting unnecessary questions or questions that were difficult for respondents to answer reliably.

This section provides a brief overview of the revisions made to each section of the draft surveys based on the themes listed above. One global change made, based on interviewee feedback, was to replace the term “child nutrition” with “school nutrition” to reflect more accurately the survey’s focus on the National School Lunch Program and School Breakfast Program (NSLP/SBP) only, and not on other Federal child nutrition programs. This distinction is especially important for State agency respondents, as they frequently administer multiple programs.

4.1 Changes to Section 1: Description of School Nutrition MIS

In this section, changes were implemented based on all five of the themes described above. For example, based on interviewee feedback, in question 1 (In general, how do you manage your State agency’s school nutrition program data?), we expanded the descriptions in the response options to include the term commercial off-the-shelf (COTS) software (theme 2 above).


Similarly, because respondents had difficulty calculating the number of full-time equivalent staff, which was initially asked in question 3, we deleted that item (theme 5 above). The sub-item asking respondents to provide the number of SFAs accessing the system was also deleted, as this information is available elsewhere (theme 5). Additionally, interviewees had difficulty understanding what the survey meant by “users” of the system. To address that confusion, we added a definition of users to the question (theme 1).


Other changes to this section included deleting the former question 6, which asked how long the organization has had its MIS. Instead, we included that information for each MIS module in question 4, as different modules may be acquired at different times (theme 4). Changes were also made to the list of modules based on interviewee feedback (themes 1 and 3). Similarly, questions on the costs of initial MIS development and upgrades were restructured to ask about each module within the MIS, as respondents could be using multiple systems, each with its own cost structure. Because organizations could have multiple systems or vendors, response options were also changed, such as in question 11, which asks about having a maintenance contract with vendors. These response options were altered to allow respondents to say they had such contracts with “all” versus “some” or “none” of their vendors (theme 3).


4.2 Changes to Section 2: Data Elements and Reporting

The survey development team changed this section following the themes listed above. For example, questions asking how frequently certain data elements are “collected” were rephrased to ask how often data are “entered”, as interviewees were confused about whether to report how often the data were actually entered into the system versus how often the data are reported (theme 1). Response options were further clarified. For example, for question 26 in the State survey, we changed the first response option from “upload” to “upload and enter”, as data could be entered into the system by either means (theme 2). Some response options were also added; for example, “on an as needed basis” was added to the questions asking about the frequency of data entry, based on interviewee feedback (theme 3).


In addition, the order of some multi-part questions was changed to group the sub-questions thematically instead of alphabetically so that responses flow more easily for respondents; for an example, see question 35 in the State survey, which asks about administrative review tools and forms. In other cases, we structurally changed questions for greater ease of response, such as consolidating the items within State survey questions 30 and 31 on data availability into a simple Yes/No question, instead of using multiple columns to obtain the same response (theme 4).


Finally, one question was deleted from this section (theme 5): the open-ended question at the end of the section asking about the biggest challenge with the MIS. Based on the responses received, we deemed the data resulting from this question to be of minimal value, as respondents already have sufficient opportunities to indicate the challenges they face in prior questions about data collection, aggregation, and reporting. In addition, we found that views on the greatest challenge could vary among respondents from the same organization.


4.3 Changes to Section 3: Technical Features of the School Nutrition MIS

Changes to this section were minimal, reflecting just four of the themes. Question wording was clarified with additional definitions; for example, in questions 50 and 51 of the SFA survey, “share data” was explained to mean “send and receive data”, as some interviewees were unsure whether receiving data should be included in sharing data (theme 1). Some response options were also enhanced, such as adding an option to the question about copyright of the system that allows the organization to indicate it does not have a custom-built system (theme 3). Other response options were edited for clarity, such as deleting the words “from external sources” from the response option “Manually key in data” in question 51 of the SFA survey, which is about sharing data with the State agency, as the deleted phrase was deemed redundant (theme 2). Finally, one item (the question on how the MIS is deployed) was deleted, as the next item (question 52, about where the data are stored) adequately addresses the same concept (theme 5).


4.4 Estimated Time to Complete the Survey

At the end of the cognitive interview, each interview subject was asked to estimate the amount of time necessary to complete the survey. All but one of the six SFA subjects thought it would take 60 minutes or less; their responses ranged from 30 to 120 minutes and averaged 57.5 minutes. The three States gave a wide range of time estimates: 30, 120, and 180 minutes. We are confident that, after streamlining the surveys and reducing the number of questions, both States and SFAs will be able to complete the survey within the 60-minute time frame estimated in the 60-day notice. We therefore do not think it is necessary to change the time estimates in the OMB package.


5. Other Changes Made to the Instrument

5.1 FNS-Suggested Changes

IMPAQ carefully reviewed the FNS comments on the initial draft surveys, received on December 28, 2015. Most FNS comments were incorporated into the revised surveys used for cognitive testing. Any remaining FNS comments were considered, along with the results of the cognitive test, in developing the revised survey instruments provided with this deliverable. Broad changes affecting both surveys include the change from “child nutrition” to “school nutrition” and the change of the response category “not applicable” to “do not have” for many of the data element charts. In addition, the list of software vendors in both surveys was reviewed and updated.

Additional changes were made in response to the final round of FNS comments on the revised surveys, provided to IMPAQ on February 19 and 22, 2016. A question on importing direct certification data into point-of-sale (POS) systems was added to the SFA survey, and a question on database structure was added to both the SFA and State surveys.

6. Final Surveys

See Attachments B.9 and B.10 for the final versions of the State agency and SFA surveys.


