Appendix Z. WIC pretest memo

WIC & FMNP Outreach, Innovation, and Modernization Evaluation



Memo

To: Carol Dreibelbis

From: Caroline Lauver and the WIC & FMNP Outreach, Innovation and Modernization Evaluation Instrument Development Team

Date: 8/5/2024

Subject: Pre-Test Memorandum (Deliverable 4.3)



This memorandum describes the pre-test procedures for the WIC & FMNP Outreach, Innovation and Modernization Evaluation (WIC Modernization Evaluation); summarizes the pre-test findings; and lists the instrument changes we implemented based on the findings. We pre-tested the following instruments: (1) the State agency interview protocol; (2) the waiver pulse survey; (3) the local agency interview protocol; (4) the vendor1 staff interview protocol; (5) the program staff experience survey; (6) the vendor staff experience survey; and (7) the WIC participant experience survey.

There are three components to the WIC Modernization Evaluation, each focused on one research objective.

  1. An implementation study will examine and document the implementation of the modernization projects through a review of project materials, interviews with State agencies, and case studies that include interviews with local agency staff, clinic staff, and vendor staff and focus groups with participants.

  2. A waiver study will examine and document the implementation of waivers to support modernization projects using administrative information on waivers, pulse surveys, and interviews with State agencies.

  3. An impact study will assess outcomes and measure causal impacts using administrative data and surveys of participants, vendor staff, and WIC staff at the State and local levels.

Pre-Test Recruitment and Pre-Test Procedures

Recruitment

  • Pre-test recruitment timeline. Starting in April 2024, the study team worked with the Food and Nutrition Service (FNS) to recruit State agencies to participate in the pre-test. We then worked with the State agencies to identify and recruit local agencies, vendor staff, and WIC participants for the pre-test. FNS and Mathematica conducted recruitment outreach on a rolling basis through July 2024, when the final WIC vendor staff were recruited.

  • Pre-test recruitment universe. FNS conducted pre-test recruitment outreach via email to all seven of the WIC regional offices, which include the Mid-Atlantic Regional Office (MARO), the Midwest Regional Office (MWRO), the Mountain Plains Regional Office (MPRO), the Northeast Regional Office (NERO), the Southeast Regional Office (SERO), the Southwest Regional Office (SWRO), and the Western Regional Office (WRO). The study team then worked with State agencies that volunteered for the pre-test to recruit the local agencies, vendor staff, and WIC participants associated with them. This recruitment effort yielded pre-test volunteers from four State agencies: Michigan (MWRO), Minnesota (MWRO), Mississippi (SERO), and Vermont (NERO).

  • Pre-test recruitment activities, including outreach to different respondents. In late April 2024, FNS sent an email to all the Regional Offices to solicit State agency nominees. FNS also disseminated a call for State agency pre-test volunteers through the National WIC Association’s June 10 Monday Update. This outreach correspondence included information about the purpose of the pre-test, the pre-test activities and what FNS would ask of each type of pre-test volunteer, the timeline for the pre-test activities, and the compensation that FNS would provide to WIC vendor staff and WIC participants. As State agency staff volunteered to participate in the pre-test, the study team worked with those State agencies to obtain the names and contact information of local agency staff, WIC vendor staff, and WIC participants who might be interested and willing to participate in the pre-test. The study team recruited pre-test volunteers primarily via email; however, the study team also held separate Webex conference calls with three State agencies so they could ask questions and better understand the pre-test task.

Pre-testing overview

  • The pre-test task lead conducted a training on June 17, 2024, for members of the study team who would be involved in the pre-test. This training provided an overview of the study’s background and objectives, introduced the team to the study’s design and data collection instruments, and covered the pre-test plan, including the pre-test schedule and the steps for conducting the pre-test for each instrument. The training also introduced the study team to the tools and resources the team would use to collect and organize the pre-test data, which the team would then use to review the instruments.

  • The study team conducted the first pre-test interview and debriefing on June 26, 2024, and conducted the final pre-test debriefing on July 24, 2024. The study team used customized pre-test debriefing spreadsheets to gather feedback from pre-test volunteers and compile that feedback in a uniform manner to inform instrument revisions.

Pre-test procedures for the implementation and waiver studies

  • Purpose of the pre-test. The study team pre-tested the State agency interview protocol, the waiver pulse survey, and the case study interview protocols, which included the vendor staff interview protocol and the local agency staff interview protocol. The purpose of this pre-test was to (1) test how long it took to complete the interviews and the waiver pulse survey, (2) make sure respondents could easily understand the terminology, (3) make sure the questions were logical and could gather the type of data needed to conduct this evaluation, and (4) identify any questions to add. In addition, this pre-test enabled the study team to test its procedures for preparing for the implementation study interviews, including reviewing documents about the modernization activities and customizing the interview protocols based on that information.

  • Pre-test activities. As part of the pre-test, the study team asked volunteers to review the recruitment materials related to their respective interviews so the study team could obtain their feedback on the clarity and usefulness of these materials during the pre-test debriefing. The study team asked pre-test volunteers to participate in a mock interview (and for the State agency staff, we also asked that they complete a five-minute waiver survey before their mock interview). Pre-test volunteers then participated in a 30-minute debriefing immediately following the pre-test interview so the study team could get their feedback on the respective data collection instrument(s) and the associated recruitment materials. The study team sought feedback on how easy or difficult it was to understand and answer the questions; the flow and organization of the instrument; how appropriate the terminology was; and how easy or difficult it was for respondents to differentiate between grants and the associated activities when answering questions. The study team also asked for pre-test volunteers’ feedback on whether the team should remove any questions due to redundancy and whether to add any questions to gather important information.

Pre-test procedures for the impact study

  • Purpose of the pre-test. The study team pre-tested the program staff experience survey, the vendor staff experience survey, and the WIC participant experience survey. The purpose of this pre-test was to (1) test how long it took respondents to complete the experience surveys, (2) make sure respondents could easily understand the terminology used in the surveys, (3) make sure the questions were logical and could gather the type of data needed to conduct this evaluation, and (4) identify response options to add or remove from the response option lists.

  • Pre-test activities. As part of the pre-test, the study team asked volunteers to review the recruitment materials related to their respective surveys so the study team could obtain their feedback on the clarity and usefulness of these materials during the pre-test debriefing. The study team also asked the pre-test volunteers to complete their respective experience surveys and send their completed surveys back before their scheduled debriefing. Each pre-test volunteer participated in a 30-minute debriefing and provided their feedback to the study team on their respective survey and associated recruitment materials. The study team sought their feedback on how easy or difficult it was to understand and answer the different types of survey questions, how clear and appropriate the terminology was, and what they thought of the flow and organization of the survey. The study team asked for pre-test volunteers’ feedback on whether to remove any questions due to redundancy and whether to add any questions to gather important information. Finally, the study team asked whether the response options for various questions made sense and whether we should add or remove any response options.

Pre-Test Findings

We provide a high-level overview of the pre-test task using Table 1 before discussing in more detail the findings and resulting changes to each instrument. We will also summarize the pre-test findings in the Office of Management and Budget package for the study. The final versions of the instruments will incorporate the changes based on the pre-test.

Table 1. Overview of pre-tests

State agency interview protocol
  Number of pre-tests completed: 4
  Average time per pre-test: The average interview length was about 60 minutes (55, 69, 64, and 50 minutes). However, for three of the pre-tests, the study team wrapped up the interview close to the 60-minute mark to be respectful of the pre-test volunteers’ time, even though they did not get through all of the protocol questions for every set of activities.
  Target length: 60 minutes

Waiver pulse survey
  Number of pre-tests completed: 4
  Average time per pre-test: One respondent reported it took 5 minutes to complete the survey. One respondent reported it took about 10 minutes because they had to verify information before responding to some of the waiver questions. One respondent could not provide an exact time estimate because they completed the survey in more than one sitting, between meetings and competing priorities; they also had to consult with colleagues to answer multiple questions. One respondent noted that the survey would have taken only about 5 minutes had they been more confident in their knowledge of the waivers, but it took a bit longer because they had to look up information to respond to some of the questions.
  Target length: 5 minutes

Local agency staff interview protocol
  Number of pre-tests completed: 4
  Average time per pre-test: The average interview length was about 57 minutes (53, 57, 60, and 56 minutes). One interviewee skipped some questions to stay within the 60-minute time limit; the remaining interviews covered the content within the allotted time.
  Target length: 60 minutes

Vendor staff interview protocol
  Number of pre-tests completed: 4
  Average time per pre-test: The average interview length was 53.5 minutes (49, 45, 60, and 60 minutes).
  Target length: 60 minutes

Program staff experience survey
  Number of pre-tests completed: 4
  Average time per pre-test: The average time to complete the survey was 14 minutes (7, 20, 14, and 15 minutes).
  Target length: 10 minutes

Vendor staff experience survey
  Number of pre-tests completed: 4
  Average time per pre-test: The average time to complete the survey was 21 minutes (10, 30, 19, and 25 minutes). The respondent who took 19 minutes noted it took longer because they struggled with completing the survey in Word and dealt with some interruptions. The respondent who took 30 minutes noted they struggled to navigate the survey skip logic in Word and accidentally answered questions from a section that did not pertain to them, which is why the survey took longer to complete.
  Target length: 10 minutes

WIC participant experience survey
  Number of pre-tests completed: 4
  Average time per pre-test: The average time to complete the survey was about 21 minutes (13, 12, 34, and 24 minutes). The respondent who took 34 minutes noted they spent time rereading the questions and jotting down comments while completing the survey. The respondent who took 24 minutes noted they were distracted by their baby and could have completed the survey faster without that distraction.
  Target length: 10 minutes



Implementation and waiver studies

  • Summary of the State agency interview protocol and recruitment materials pre-test findings. The pre-test found that the interview protocol was too lengthy to fit into a 60-minute time limit, particularly when there were multiple grants and multiple associated activities to discuss. The pre-test also found that many of the probes were unnecessary, as the pre-test volunteers could answer the questions without prompts or probes. The study team removed questions that were redundant with information already being gathered through the document review and streamlined the waiver-related interview questions, keeping only those directly tied to the waiver study’s research questions. The study team also revised the recruitment materials by adding a link to a webpage that explains the WIC modernization effort, streamlined the presentation of the data collection activities so it was clear what was expected of respondents, and added information about respondent incentives.

  • Summary of the waiver survey and recruitment materials pre-test findings. The study team revised the waiver pulse survey based on feedback gathered through the pre-test. Most of the revisions centered on the survey’s introduction: adding language that encouraged respondents to consult with colleagues if needed, and recognizing that some of these waivers might have transitioned to American Rescue Plan Act (ARPA)-related waivers after originally being granted under the Families First Coronavirus Response Act. The study team removed a lengthy question about which waiver requests were withdrawn and why; if a respondent notes this occurred, the study team will gather contextual information during the State agency interview.

  • Summary of case study interview protocols and recruitment materials pre-test findings.

    • Local agency staff interview. The study team revised the local agency staff interview protocol based on feedback gathered through the pre-test. The study team removed some extraneous probes and redundant questions to streamline the interview. They also added some introductory language before specific sets of questions so the respondent would understand what would be discussed next. The study team also made wording revisions to streamline some of the questions and make them clearer to the respondent. We also added to the study description one-pager a link to an FNS website that provides more information about the WIC modernization efforts. The study team revised the recruitment materials by adding a link to a webpage that explains the WIC modernization effort, streamlined the presentation of the data collection activities so it was clear what was expected of respondents, and added information about respondent incentives.

    • Vendor staff interview. The study team revised the vendor staff interview protocol based on feedback gathered through the pre-test. The team removed questions from across the sections asking vendor staff what they believed WIC participants were saying about their shopping experience. They also reframed overarching questions about the shopping experience to better understand what that experience looked like and how the modernization activities had changed it, rather than asking the vendor staff to summarize what a typical WIC shopping experience looked like. They added definitions to some of the terms to make sure respondents had the same understanding. They added a question to understand how the eSolution and WIC cash value benefit (CVB) had changed the reimbursement process for FMNP-authorized outlets. We also added to the study description one-pager a link to an FNS website that provides more information about the WIC modernization efforts. The study team revised the recruitment materials by adding a link to a webpage that explains the WIC modernization effort.

  • Adjustments to instrument implementation based on pre-test findings.

    • The pre-test offered the study team an opportunity to test the processes and procedures the team would follow to prepare to conduct the implementation study interviews and waiver survey. In response to questions about interview topics from pre-test volunteers, the study team subsequently created and shared a summary of the grants and their associated activities they would discuss during the mock interviews. The study team also shared an overview of the types of questions the interviews would focus on. During the pre-test debriefings, the study team asked pre-test volunteers if this was a helpful resource, and the feedback was positive. The study team plans to use such summaries during data collection to help interviewees prepare for their interviews.

    • The study team remains concerned about the length of the State agency interview, especially for State agencies with multiple grants and/or multiple projects. One option is to conduct a longer first annual interview and reduce the time for subsequent interviews (for most State agencies), as later interviews can focus on updates to the information from the first interview. Given the need to streamline the State agency interview, and because we could gather some of the information about grant funding, timelines, and waivers outside the interview, the study team suggests removing some of those questions from the State agency interview protocol and gathering that information through the document review and via email with the respondent. The study team will also do detailed follow-up by email and phone with State agency interview respondents who are unable to answer the “reach” and “intensity” questions during the interview. These are important topics, and the study team recognizes the respondent may need to consult with colleagues and documentation to provide a sufficient response.

Detailed descriptions of revisions to each instrument and their respective recruitment materials, as well as the rationale behind those revisions, are available in Table 2.

Table 2. Revisions to implementation and waiver study instruments

State agency interview protocol and recruitment materials

State agency interview

Introduction

Revision: Streamlined some of the wording so it reads, “We would like to learn about [STATE AGENCY NAME] and your experience and progress implementing the WIC and/or FMNP modernization grants. I will review the list of grants we have on file for [STATE AGENCY] once we begin the interview.”
Rationale: The original wording of this sentence was lengthy, clunky, and hard to follow.

Section C. Grant background

Revision: Summarize the activities for each grant, not just the grant name, and ask the respondent how they typically refer to the grant.
Rationale: Summarizing the activities gets the interviewer and respondent on the same page. Using the preferred name for the grant reduces the respondent’s cognitive burden.

Revision: Remove the grant funding and timeline questions from the protocol; gather and confirm this information through the document review and/or email when sharing the pre-interview summary with the respondent.
Rationale: The interview was too long to fit within the allotted hour. Suggest removing questions about information that can be gathered elsewhere.

Revisions across project activity sections (outreach, technology, in-person shopping, online shopping, and farmers’ market shopping)

Revision: Moved a question to the start of each section and revised it to give the respondent an opportunity to provide a high-level overview of the activities and their implementation.
Rationale: This type of question at the beginning was helpful in orienting the rest of the discussion.

Revision: Removed questions about how much funding was committed to the activities.
Rationale: This is information we can obtain through the document review or email before the interview.

Revision: Removed extraneous probes throughout these sections.
Rationale: Pre-test volunteers could answer questions without the probes. Removing them enables interviewers to focus on the main questions.

Revision: Removed extraneous interviewer instructions throughout these sections.
Rationale: This makes the protocol easier to follow. Interviewer instructions will be covered in depth in training.

Revision: Removed rollout timing questions, as they were redundant with another question.
Rationale: Reduced redundancy and streamlined the interview protocol.

Revision: Made the successes and challenges questions more open-ended rather than listing the successes and challenges for the respondent.
Rationale: This streamlined the interview, and the respondents spoke at length about both. We will train interviewers to have a list of strengths and challenges on hand if needed but to give the respondent a chance to speak openly to start.

Revision: Replaced “challenges” with “barriers.”
Rationale: Pre-test volunteers felt more open talking about challenges framed as barriers; challenges can be a sensitive topic.

Revision: Removed questions about waivers from these sections.
Rationale: The interview was lengthy, and we will discuss waivers at the end.

Revision: Made the questions about equitable WIC access more open-ended.
Rationale: Moved away from a yes/no format and framed the questions for a more nuanced response.

Revision: Removed probes about challenges working with technology vendors.
Rationale: Respondents mentioned such challenges without prompts; removing the probes streamlines the interview.

Section D. Outreach activities

Revision: Added wording to the first question to guide the respondent to discuss only non-Community Innovation and Outreach (CIAO) activities.
Rationale: This study is not evaluating outreach activities related to CIAO grants.

Section H. Farmers’ market shopping experience

Revision: Added a question about how the eSolution/WIC CVB changed the reimbursement process.
Rationale: This is an important aspect of the eSolution/WIC CVB.

Section I. Grant effectiveness and sustainability

Revision: Consolidated the questions in this section and removed probes.
Rationale: During the pre-test, this section took too long to complete with the separate questions and probes.

Section J. Waivers

Revision: Removed questions about why a waiver was requested.
Rationale: Many waivers were originally granted under the Families First Coronavirus Response Act and transitioned to ARPA. Given this history, it was difficult for respondents to recall why a waiver was requested.

Revision: Consolidated questions about implementing the waivers and removed extraneous probes.
Rationale: Streamline and simplify the protocol to make it easier to answer.

Revision: Added a question about barriers to waiver use and a question about long-term waiver use.
Rationale: These new questions are directly tied to the waiver study’s research questions.

Revision: Simplified question wording about equitable program access.
Rationale: Make the question easier for respondents to answer.

Revision: Removed questions about how waivers affected program participation and benefit redemption; removed the set of questions about waivers that were approved but not implemented.
Rationale: These questions are not tied to the waiver study research questions.

Revision: Added one question to gather information about which waiver requests were withdrawn and why.
Rationale: This information was originally included in the waiver pulse survey; we moved it to the interview protocol because it works better as an interview question.

Recruitment materials

USDA endorsement letter for WIC State agencies

Revision: Revised the sentence to read, “We will also be leveraging the WIC Participant and Program Characteristics (PC) data collection for the WIC Modernization Evaluation to minimize the burden on State agencies.”
Rationale: A State agency noted that the PC Plus data collection is a large project and is an extra burden in and of itself. Suggest removing the last part of the sentence.

Study description for WIC State agencies

Revision: Added a link to the WIC modernization efforts.
Rationale: Added this link as a resource for respondents who want more information.

Revision: Added information about the WIC participant and vendor staff incentives for data collection.
Rationale: Make State agencies aware of the incentives so they can communicate that information.

Revision: Explained how we would intentionally select 32 local agencies nationally for the case studies.
Rationale: Clarify how these agencies would be selected.

Revision: Streamlined the information about the different data collection activities.
Rationale: Reduce confusion about the different data collection activities.

WIC State agency study recruitment email from the Mathematica study team

No suggested revisions.


Waiver pulse survey and recruitment materials

Waiver pulse survey

Introduction

Revision: Added language recognizing that some waivers might have transitioned from originally being granted under the Families First Coronavirus Response Act to being ARPA-related waivers.
Rationale: Pre-test respondents noted this type of clarification would help.

Revision: Added instructions for respondents to consult with colleagues if they need help answering any survey questions.
Rationale: Pre-test respondents often needed to speak to colleagues to complete the survey.

Revision: Revised the wording to make it clear the survey focused on which waivers were issued and implemented, not how they were used.
Rationale: Pre-test respondents noted the survey was about waiver issuance, not about how the waivers were used.

Waiver issuance and use

Revision: Removed B8, a lengthy question that asked for details about which waiver requests were withdrawn, when, and why.
Rationale: This question increased respondent burden. If a respondent notes they withdrew any waiver requests, we will gather information about it during the interview, not the survey.

Recruitment materials

WIC State agency pulse survey invitation email

No suggested revisions.


Local agency staff interview protocol and recruitment materials

Local agency staff interview protocol

Introduction

Revision: Revised wording on how we will combine the responses when we present the information in the reports.
Rationale: This makes the introductory language clearer.

Section B. Respondent and local agency or clinic background

Revision: Removed the question “Please describe the location or setting of the WIC clinics” based on pre-test feedback and an effort to remove redundancy.
Rationale: There was overlap in asking the pre-test respondents to describe the setting or location and then describe the community.

Section C. Outreach, services, and support to WIC participants

Revision: Added introductory language before sets of questions about the online application, virtual appointments, and online administrative services.
Rationale: This introductory language helps the respondent understand the type of services we will talk about and how we define them.

Revision: Removed the question “How easy or difficult will it be to continue to use [SERVICE]?” from the online application, virtual appointments, online administrative services, and participant communication sets of questions.
Rationale: This sustainability question was redundant with existing questions; removed it to streamline the interview.

Revision: Revised the introductory language for outreach to potential participants to recognize that outreach can happen at both the State and local levels and to ask the respondent to describe it at the local level.
Rationale: Pre-test volunteers felt the original wording, which stated the State led the activities, minimized the work the local agencies do.

Revision: Removed the question about how the respondent feels about virtual appointments.
Rationale: This question was redundant with existing questions; removed it to streamline the interview.

Revision: Removed probes about language barriers during participant communication.
Rationale: Removed probes to streamline the interview and focus on the main questions.

Revision: Added a question at the end of this section on whether the respondent would make any changes to the communication tools and methods to make them more useful or easier to understand.
Rationale: During the pre-test, this kind of question yielded rich information.

Section D. Improving the in-person shopping experience

Revision: Added “shopping” to the term “WIC Shopping App.”
Rationale: Some pre-test respondents initially confused it with the breastfeeding app; adding “shopping” clarified what we meant.

Supporting the workforce

Revision: Added language to the interviewer instructions to ask this section only of local agencies involved with the workforce grants.
Rationale: Clarifies who should be asked this set of questions.

Revision: Added language to this section to remind the respondent that any answers they provide during the interview will not be shared with others, including their colleagues.
Rationale: Added this language to help the respondent feel more comfortable answering questions candidly.

Revision: Removed some extraneous probes.
Rationale: Removed probes to streamline the interview.

Revision: Simplified the question about staff knowledge of participant needs so it now reads, “How does your agency/clinic show it supports and values staff’s knowledge of the needs of participants of different backgrounds and the lived experience of its staff? Please describe.”
Rationale: This streamlined the question and improved clarity and readability for the respondent.

Recruitment materials

Study description for WIC local agencies

Revision: Added a link to the WIC modernization efforts.
Rationale: Added this link as a resource for respondents who want more information about the modernization efforts.

Revision: Added information about WIC participant and vendor staff data collection incentives.
Rationale: Added this information so local agency staff are aware and can communicate it, if needed.

Revision: Streamlined the information in the document.
Rationale: Streamlined the information so the data collection activities are clear.

WIC local agency case study recruitment email from the Mathematica study team

No suggested revisions.


Vendor staff interview protocol and recruitment materials

Vendor staff interview

Revisions across project activity sections (online shopping, in-person shopping, and farmers’ market shopping)

Revision: Removed questions about what vendor staff believe WIC participants are saying about their shopping experience.
Rationale: Pre-test respondents noted they would not be able to speak to a WIC participant’s shopping experience.

Revision: Reframed the overarching question “What is the typical WIC participant shopping experience?” to be broader, to understand what the shopping experience looks like and how the modernization activities have changed it.
Rationale: Reframed the question so it asks the vendor staff to give their own perspective, rather than asking them to put themselves in WIC participants’ shoes.

Section D. In-person shopping activities

Revision: Added a definition for “in-person shopping.”
Rationale: Defined the term so all respondents have the same understanding.

Section E. Farmers’ market shopping experience

Revision: Added a question about how the eSolution and accepting the WIC CVB have changed the reimbursement process and timeline for FMNP-authorized outlets.
Rationale: This topic came up during the pre-test interview, and the pre-test respondent spoke at length about how this process has changed.

Section G. Closing questions

Revision: Revised the first question in this section to ask about the respondent’s experience being a WIC vendor and why it was a positive or negative one.
Rationale: The study is interested in the experience of being a WIC vendor, rather than just overall satisfaction or dissatisfaction.

Recruitment materials

Study description for staff from WIC vendors, farmers’ markets, and roadside stands

Revision: Added a link to the WIC modernization efforts.
Rationale: Added this link as a resource for respondents who want more information about the modernization efforts.

WIC vendor staff and farmers’ market and roadside stand sellers and staff interview recruitment email from the Mathematica team

No suggested revisions.




Impact study

  • Summary of the program staff experience survey pre-test findings. The study team revised the survey instrument based on feedback gathered through the pre-test. The study team spelled out each acronym at first mention, clarified that the respondent should respond only with ARPA-funded modernization activities in mind, and added language to remind the respondent that their responses would not be shared with their workplace. The study team added examples to illustrate some of the response options so they were clearer for respondents, and added response options that pre-test respondents highlighted as relevant and important. Finally, the study team revised a complex question by placing it in grid format, enabling respondents to respond separately to each component.

  • Summary of the vendor staff experience survey pre-test findings. The study team revised the survey instrument based on feedback gathered through the pre-test. The study team added wording to the survey introduction encouraging respondents to consult with colleagues if they had difficulty answering any questions. We added examples to illustrate some of the response options, while clarifying that the list of examples is not exhaustive. We added a definition of an A50 vendor to improve respondents’ understanding of the term. We replaced the term “WIC/FMNP authorized outlet” with “WIC vendor or farmer/market outlet” because the original wording was confusing for respondents. We expanded the list of response options for some of the questions based on respondents’ feedback.

  • Summary of the WIC participant experience survey pre-test findings. The study team revised the survey instrument based on feedback gathered through the pre-test. The study team added wording to the introduction for some of the survey sections to give the respondent a better understanding of the type of information the survey sought to gather in that section. We expanded some of the response options to make them relevant to a wider group of respondents, without making the list of responses lengthier. We added response options to some of the questions based on pre-test feedback and the experiences of our pre-test respondents. We made minor wording changes to some of the survey language to improve readability for respondents.

  • Adjustments to instrument implementation based on pre-test findings. The pre-test offered the study team an opportunity to test the processes and procedures the team would follow to prepare to field the program staff, vendor staff, and WIC participant experience surveys. The study team identified some adjustments to the data collection process to ensure the collection of high-quality data.

    • The pre-test results suggest the experience surveys may be at least 10 minutes long, possibly a little longer. We expect actual online completion times to be shorter than the paper completions during the pre-test, because the skip logic will be programmed so respondents receive only the questions intended for them, and respondents will not be writing down pre-test feedback. Even so, we may want to remove questions that are not critical to answering the study’s research questions.

    • For the vendor staff experience survey, it will be important to send the survey to vendor staff who are knowledgeable about working with WIC and implementing the modernization efforts. To accomplish this, we will provide guidelines during outreach to vendors to make sure the survey goes to the staff member most knowledgeable about WIC.

    • For the program staff experience survey, it will be important to understand whether the local agency staff are also involved with the local clinics, as this will dictate the sets of questions the respondent receives. We have added a question to the start of the survey to gather this information, as it might not be known during the sampling process.

Detailed descriptions of revisions to each instrument and their respective recruitment materials, as well as the rationale behind those revisions, are available in Table 3.

Table 3. Revisions to the impact study instruments

Program staff experience survey and recruitment materials

Program staff experience survey

Overarching revisions

Revision: Spell out all acronyms at first mention, even those that seem like common knowledge.
Rationale: Improve respondents’ understanding of the terminology in the survey.

Introduction

Revision: Clarify that this survey is interested only in the ARPA-funded modernization efforts.
Rationale: Some State and local agencies might have other modernization efforts happening.

Section A. Background characteristics

Revision: Added language to A3 and A4 to remind the respondent that their survey responses would not be shared with their workplace.
Rationale: Added this language to help the respondent feel comfortable answering candidly about how satisfied they were at work and their likelihood of leaving.

Sections B–C. Experience with the WIC modernization activities

Revision: Introduction: Added language reiterating that we are interested in the ARPA-funded modernization efforts.
Rationale: Clarified for the respondents which modernization efforts they should be thinking of when responding.

Revision: B1 and B2: Added “Since January 2022” to these questions.
Rationale: Adding this specific time frame helped the respondent frame their thinking and made it easier to respond.

Revision: Reworded B1 to read, “Since January 2022, which activities have you been focused on with the WIC modernization efforts, through applying for funding for new efforts, planning for new efforts, or conducting new efforts?”
Rationale: Expanded the question to include applying for funding for the various types of modernization efforts, as this is an important part of implementation.

Revision: B1, B2, and B3: Added examples in the list of response options to illustrate the response options (B1–B3 share the same list).
Rationale: These examples help the respondent understand what we are referring to in each response option.

Revision: B4: Added the response options “Feedback and data on modernization effort progress and impacts” and “Having more knowledge/training/education about existing technologies.”
Rationale: Pre-test volunteers felt these would be relevant response options to include.

Revision: B11 and B18: Added the response option “Upgrade the program’s software so it works reliably and efficiently.”
Rationale: A pre-test volunteer suggested this response option because it does not technically fall under “hardware.”

Revision: B19: Revised the first response option by adding “mid-certification/eligibility appointment” to “Certification.”
Rationale: A pre-test volunteer suggested this revision to make the option more inclusive.

Revision: C2: Revised the wording to read, “The Lab at OPM has provided trainings in human-centered design (HCD) techniques for some WIC staff. If you have taken one of these trainings, have you used any HCD techniques in your work?”
Rationale: Streamlined the wording and explained the types of training we were interested in. Removed “FNS-sponsored trainings” because that phrase was confusing for respondents.

Revision: C6: Revised the question, putting it in grid form, allowing the respondent to provide separate answers about how staff provide care that is tailored to meet WIC participants’ needs.
Rationale: The original question asked the respondent to provide only one response while considering multiple elements, which was difficult. Also clarified that we were interested in whether this was currently happening.

Recruitment materials

Study description for WIC State agencies

WIC staff survey invitation email from the Mathematica study team

No suggested revisions.


Vendor staff experience survey and recruitment materials

Vendor staff experience survey

Introduction

Revision: Added the wording, “If you don’t know the response to a question, please feel free to consult with a colleague.”
Rationale: Some respondents noted that it was hard to answer some questions on their own.

Section A. Vendor/outlet background

Revision: A1: Added examples of conventional grocery stores and, throughout the list of response options, clarified that the list of examples is not exhaustive.
Rationale: Made these revisions to improve respondents’ understanding of the response options.

Revision: A1a: Added a definition of an A50 vendor.
Rationale: Some of the pre-test respondents were not familiar with the term “A50 vendor.”

Revision: A1b: Replaced the term “WIC/FMNP authorized outlet” with “WIC vendor or farmer/market outlet” here and throughout.
Rationale: Respondents found the original term confusing.

Revision: A1c: Added instructions that ask respondents to select the role they spend the most time in if they fill multiple roles. Also added the response category “Information technology (IT) staff.”
Rationale: Pre-test respondents noted staff could fill multiple roles. A pre-test respondent also noted that IT staff could play an important role at a vendor and suggested adding this option to the response list.

Section B. WIC vendor staff satisfaction and experiences

Revision: B18: Spelled out “EBT” (electronic benefit transfer) at first use. Also added the response option “Improve the checkout processing software to reduce the number of technical difficulties.”
Rationale: Vendor staff found acronyms confusing, so we defined them at first use.

Section D. WIC vendor and farmer/market outlet staff satisfaction with State agency communications and interactions

Revision: D4: Added response options about the frequency of in-person visits by WIC staff to offer support and of in-person auditing visits.
Rationale: Added these response options based on pre-test input; per pre-test feedback, separated supportive visits by WIC staff from auditing visits.

Recruitment materials

Study description for staff from WIC vendors, farmers’ markets, and roadside stands

WIC vendor staff, farmers’ market, and roadside stand sellers and staff survey invitation email from Mathematica

No suggested revisions.


WIC participant experience survey and recruitment materials

WIC participant experience survey

Section A. Your Family’s Participation in WIC

Revision: A2: Added the wording “if applicable” to the question, “This question asks about WIC participation for people in your family. Please include any foster children, if applicable …”
Rationale: A pre-test respondent suggested adding this wording to make the question clearer to respondents.

Section B. Experience with the WIC Program

Revision: In the section’s introduction, added the wording “We want to learn about how satisfied you’ve been with different parts of the WIC program and if your satisfaction has changed over time.”
Rationale: This change gives the respondent a better understanding of the type of information this section is trying to gather.

Revision: B3: Replaced “How did you hear about the WIC program?” with “How did you learn about the WIC program?” Also added the response option “Visiting your State’s WIC website.”
Rationale: A pre-test respondent suggested rephrasing, as many people might have heard about WIC. The new response option is more specific than “doing an internet search.”

Revision: B9: Added “or eat” to the response option “My family and I don’t like or eat some WIC foods.” Also expanded a response option by adding “almost always” to “I always/almost always redeem all my benefits.”
Rationale: A pre-test respondent noted they do not eat some of the WIC-eligible foods, and another noted she almost always redeems her benefits.

Revision: Introductory language before B11: Broke the introduction into two sentences and simplified the language.
Rationale: These revisions sought to improve readability.

Revision: B13: Added the response option “I email to schedule my appointments.”
Rationale: One pre-test respondent noted they use email to schedule their WIC appointments.

Revision: B17: Added the response option “Technical difficulties made it hard to use [WIC STATE APP NAME].” Also added the response option “I do not find the [STATE WIC APP NAME] helpful.”
Rationale: A pre-test respondent experienced technical difficulties that prevented her from using the app. Another respondent did not use the app because they did not find it helpful.

Revision: Introductory language before B20: Expanded the introduction by adding the line, “We want to learn about how satisfied you’ve been with these experiences and if your satisfaction has changed over time.”
Rationale: Expanded the introduction so respondents have a better understanding of the type of information we hope to gather in the coming questions.

Revision: Introductory language before B25: Expanded the introduction by adding the line, “We want to learn about how satisfied you’ve been with this experience and if your satisfaction has changed over time.”
Rationale: Expanded the introduction so respondents have a better understanding of the type of information we hope to gather in the coming questions.

Revision: B27: Made minor wording changes to some of the response options.
Rationale: The minor wording changes sought to make the response options clearer for the respondents.

Recruitment materials

Study description for WIC participants

Revision: Added a link to the WIC modernization efforts.
Rationale: Added this link as a resource for respondents who want more information about the modernization efforts.

WIC participant survey invitation email from WIC State agency

Revision: Added “the United States Department of Agriculture (USDA)” to the invitation letter to accompany “FNS.”
Rationale: A pre-test respondent noted that we neglected to include “USDA” in the survey invitation email; added it to be consistent with the terms used in other materials.


1 For ease of exposition, we use the term vendor to mean WIC vendors and farmers or markets that are WIC-authorized outlets.
