
APPENDIX F1



PRE-TEST MEMORANDUM



Third National Survey of WIC Participants (NSWP-III)


Capital Consulting Corporation

2M Research Services

Abt Associates, Inc.

Order # AG-3198-K-15-0077

Tony Panzera, COR


March 21, 2017

Deliverable 3.4 Pretest Memorandum

Table of Contents

Introduction

I. Overview

II. Challenges in Conducting the Pretest

A. Obtaining OMB Clearance to Conduct Pretest

B. Obtaining Participant Data from State and Local Agencies

C. Recruiting Respondents

State Agency Survey

I. State Agency Survey Pretest Methodology

A. Instrument Description

B. Respondent Selection

C. Pretesting and Debriefing Procedures

D. Analysis

II. Summary of Results of State Agency Survey Pretest

III. State Agency Survey Pretest Findings and Recommendations

IV. State Agency Survey Recruitment and Communication Materials Pretested in the Lab

Local Agency Survey

I. Local Agency Survey Pretest Methodology

A. Instrument Description

B. Respondent Selection

C. Pretesting and Debriefing Procedures

D. Analysis

II. Pretest Burden Estimates of the Local Agency Survey

III. Local Agency Survey Pretest Findings and Recommendations

IV. Local Agency Survey Recruitment and Communication Materials Pretested in the Lab

Certification Survey

I. Certification Survey Pretest Methodology

A. Instrument Description

B. Respondent Selection

C. Pretesting and Debriefing Procedures

D. Analysis

II. Pretest Burden Estimates of the Certification Survey

III. Certification Survey Pretest Findings and Recommendations

A. Proof of Identity

B. Proof of Residency

C. Participant Category (Infant/Child Version)

D. Adjunctive Eligibility

E. Income Source Documentation

F. Debriefing

G. Recommendations

Denied Applicant Survey

I. Denied Applicant Survey Pretest Methodology

A. Instrument Description

B. Respondent Selection

C. Pretesting and Debriefing Procedures

D. Analysis

II. Pretest Burden Estimates of the Denied Applicant Survey

III. Denied Applicant Survey Pretest Findings and Recommendations

A. Proof of Identity

B. Proof of Residency

C. Participant Category

D. Adjunctive Eligibility

E. Income Source Documentation

F. Debriefing

G. Recommendations

Program Experiences Survey

I. Program Experiences Survey Pretest Methodology

A. Instrument Description

B. Respondent Selection

C. Pretesting and Debriefing Procedures

D. Analysis

II. Pretest Burden Estimates of the Program Experiences Survey

III. Program Experiences Survey Pretest Findings and Recommendations

Former Participant Case Study

I. Former Participant Case Study Pretest Methodology

A. Instrument Description

B. Respondent Selection

C. Pretesting and Debriefing Procedures

D. Analysis

II. Pretest Burden Estimates of the Former Participant Case Study

III. Former Participant Case Study Pretest Findings and Recommendations

Discussion and Conclusion

I. Lessons Learned for Full Data Collection

A. Obtaining Data from State and Local Agencies

B. Recruiting Respondents

C. Questionnaire Development

II. Next Steps




List of Tables

Table 1. List of Data Collection Instruments Pretested in the Field and in the Lab

Table 2. Number of Respondents Who Participated in the Pretest by Data Collection Activity

Table 3. State Agency Survey Pretest Timeline

Table 4. Burden on State Agency Survey Respondents

Table 5. State Agency Survey Pretest Feedback

Table 6. State Agency Survey Invitation Email Pretest Feedback

Table 7. State Agency Survey Recruitment and Communication Materials Pretest Feedback

Table 8. Local Agency Survey Pretest Timeline

Table 9. Burden on Local Agency Survey Respondents

Table 10. Local Agency Survey Pretest Feedback

Table 11. Local Agency Survey Invitation Email Pretest Feedback

Table 12. Local Agency Survey Recruitment and Communication Materials Pretest Feedback

Table 13. Certification Survey Topic Areas

Table 14. Certification Survey Pretest Respondents

Table 15. Number of Days Between Certification Date and Date of Certification Survey

Table 16. Administration Time (mins) for Certification Survey Pretest, Overall and by Language and Version

Table 17. Income Sources and Documentation for Adjunctively and Non-Adjunctively Income-Eligible Respondents to the Certification Survey Pretest

Table 18. Denied Applicant Survey Topic Areas

Table 19. Denied Applicant Survey Pretest Respondents for In-Person and Abbreviated Versions

Table 20. Denied Applicant Survey In-Person Pretest Respondents

Table 21. Number of Days Between Application Date and Survey Administration Date for the Denied Applicant Pretest

Table 22. Abbreviated Adult Version of Denied Applicant Survey Administered by Telephone

Table 23. Administration Time (mins) for Denied Applicant Survey Pretest, Overall and by Language and Version

Table 24. Income Sources and Documentation for Respondents to the Denied Applicant Survey Pretest

Table 25. Program Experiences Survey Pretest Timeline

Table 26. Administration Time (mins) for Program Experiences Survey Pretest, Overall and by Language and Version

Table 27. Program Experiences Survey Pretest Feedback

Table 28. Program Experiences Survey Invitation Telephone Script Pretest Feedback

Table 29. Former Participant Case Study Pretest Timeline

Table 30. Administration Time (mins) for Former Participant Case Study Pretest, Overall and by Language and Version

Table 31. Former Participant Case Study Pretest Feedback

Table 32. Former Participant Case Study Interview Invitation Telephone Script Pretest Feedback



Appendices

Appendix A1. NSWP-III Research Purpose and Objectives

Appendix B1. State Agency Survey

Appendix B2. Local Agency Survey

Appendix B3.a Certification Survey: Version A (Adult)-English

Appendix B3.b Certification Survey: Version B (Infant/Child)-English

Appendix B4.a Denied Applicant Survey: Version A (Adult)-English

Appendix B4.b Denied Applicant Survey: Version B (Infant/Child)-English

Appendix B5.a Program Experiences Survey: Version A (Adult)-English

Appendix B5.b Program Experiences Survey: Version B (Infant/Child)-English

Appendix B6.a Former Participant Case Study: Interview Guide-English

Appendix C1. Notification Email to Regional and State Offices

Appendix C2. Letter to State Agencies from Regional Offices

Appendix C3. State Agency Survey Invitation Email

Appendix C4. State Agency Survey Invitation Letter with Instrument

Appendix C5. State Agency Survey Reminder Email

Appendix C6. State Agency Survey Reminder Telephone Script

Appendix C7. Local Agency Survey Invitation Email

Appendix C8. Local Agency Survey Invitation Letter with Instrument

Appendix C9. Local Agency Survey Reminder Email

Appendix C10. Local Agency Survey Reminder Telephone Script

Appendix C11.a Certification Survey Recruitment Telephone Script-English

Appendix C12.a Certification Survey Recruitment In-Person Script-English

Appendix C13.a Text Message Reminder for Scheduled Certification Survey-English

Appendix C14.a Telephone Reminder for Scheduled Certification Survey-English

Appendix C15.a Denied WIC Applicant Survey Recruitment Telephone Script-English

Appendix C17.a Text Message Reminder for Scheduled Denied Applicant Survey-English

Appendix C18.a Telephone Reminder for Scheduled Denied Applicant Survey-English

Appendix C19.a Program Experiences Survey Invitation Telephone Script-English

Appendix C23.a Former Participant Case Study Interview Invitation Telephone Script-English

Appendix D1. Study Description for State and Local Agencies

Appendix D2. State Agency Survey Thank You Letter

Appendix D3. Certification End Date Verification Email

Appendix D4. Certification End Date Verification Reminder Telephone Script

Appendix D5. Local Agency Survey Thank You Letter

Appendix D6.a Certification Survey Information Letter from State Agencies-English

Appendix D7.a Participant Consent Form-Certification Survey-English

Appendix D9.a Participant Consent Form-Denied Applicant Survey-English

Appendix F1. State Agency Survey Debriefing Interview Guide

Appendix F2. Local Agency Survey Debriefing Interview Guide

Appendix F3.a Certification Survey Debriefing Interview Guide-English

Appendix F4.a Denied Applicant Survey Debriefing Interview Guide-English

Appendix F5.a Program Experiences Survey Debriefing Interview Guide-English

Appendix F6.a Former Participant Case Study Interview Debriefing Interview Guide-English

Introduction

I. Overview

This memorandum describes the pretesting of data collection procedures and provides suggested revisions to the data collection instruments that will be used in full data collection for the “Third National Survey of WIC Participants” (NSWP-III) in 2018. The objectives of the pretest were to:

1. Understand respondent burden and determine the best ways to keep respondent burden within target range;

2. Assess and improve the survey flow;

3. Gain understanding of parts of the survey that need clarification; and

4. Ensure response categories and questions are adequate and cover all applicable topics.


A primary objective of NSWP-III is to provide the U.S. Department of Agriculture (USDA) Food and Nutrition Service (FNS) with nationally representative estimates of improper payments from the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) arising from errors in the certification or denial of WIC applicants, in order to fulfill the requirements of the Improper Payments Elimination and Recovery Improvement Act of 2012 (IPERIA).1 In addition, the study will investigate potential State agency (SA) and local agency (LA) characteristics that may correlate with these errors and will assess WIC participants’ reasons for satisfaction or dissatisfaction with the program. NSWP-III builds on three previous studies and reports that span several decades. The Research Purpose and Objectives for NSWP-III can be found in Appendix A1.

To accomplish study objectives, Capital Consulting Corporation, Abt Associates, and 2M Research Services (henceforth referred to as “the research team”) planned the following full data collection activities for 2018: (1) a State Agency Survey with 90 agencies, including 50 States and the District of Columbia, 34 Indian Tribal Organizations (ITOs), and 5 U.S. Territories; (2) a Local Agency Survey with 1,500 local agency directors; (3) a Certification Survey with up to 2,000 recently certified WIC participants; (4) a Denied Applicant Survey with up to 240 WIC applicants who did not qualify for the program; (5) a Program Experiences Survey with up to 2,500 current WIC program participants; and (6) a Former Participant Case Study with 125 inactive WIC program participants who have stopped redeeming WIC benefits.

Data collection materials were tested in one of two ways: in the field with recruited respondents, or in the research team’s lab by trained research assistants. A complete list of the data collection materials pretested in the field and in the lab for the full data collection in 2018 is shown in Table 1. Materials with a pretest method of “n/a” will be used only in the full study and were not part of the pretest methodology.

Table 1. List of Data Collection Instruments Pretested in the Field and in the Lab

Appendix ID (English) | Appendix ID (Spanish) | Document Name | Type | Pretest Method
B1 | | State Agency Survey | instrument | Field
C1 | | Notification Email to Regional and State Offices | recruitment materials | Lab
C2 | | Letter to State Agencies from Regional Offices | recruitment materials | Lab
C3 | | State Agency Survey Invitation Email | recruitment materials | Field
C4 | | State Agency Survey Invitation Letter with Instrument | recruitment materials | Lab
C5 | | State Agency Survey Reminder Email | recruitment materials | Lab
C6 | | State Agency Survey Reminder Telephone Script | recruitment materials | Lab
D1 | | Study Description for State and Local Agencies | communication materials | Lab
D2 | | State Agency Survey Thank You Letter | communication materials | Lab
B2 | | Local Agency Survey | instrument | Field
B7 | | Denied Applicant Log | instrument | Lab
C7 | | Local Agency Survey Invitation Email | recruitment materials | Field
C8 | | Local Agency Survey Invitation Letter with Instrument | recruitment materials | Lab
C9 | | Local Agency Survey Reminder Email | recruitment materials | Lab
C10 | | Local Agency Survey Reminder Telephone Script | recruitment materials | Lab
D1 | | Study Description for State and Local Agencies | communication materials | Lab
D3 | | Certification End Date Verification Email | communication materials | Lab
D4 | | Certification End Date Verification Reminder Telephone Script | communication materials | Lab
D5 | | Local Agency Survey Thank You Letter | communication materials | Lab
B3.a | B3.c | Certification Survey: Version A (Adult) | instrument | Field
B3.b | B3.d | Certification Survey: Version B (Infant/Child) | instrument | Field
C11.a | C11.b | Certification Survey Recruitment Telephone Script | recruitment materials | Field
C12.a | C12.b | Certification Survey Recruitment In-Person Script | recruitment materials | n/a
C13.a | C13.b | Text Message Reminder for Scheduled Certification Survey | recruitment materials | Field
C14.a | C14.b | Telephone Reminder for Scheduled Certification Survey | recruitment materials | Field
D6.a | D6.b | Certification Survey Information Letter from State Agencies | communication materials | n/a
D7.a | D7.b | Participant Consent Form-Certification Survey | communication materials | Field
B4.a | B4.c | Denied Applicant Survey: Version A (Adult) | instrument | Field
B4.b | B4.d | Denied Applicant Survey: Version B (Infant/Child) | instrument | Field
C15.a | C15.b | Denied WIC Applicant Survey Recruitment Telephone Script | recruitment materials | Field
C16.a | C16.b | Denied WIC Applicant Survey Recruitment In-Person Script | recruitment materials | n/a
C17.a | C17.b | Text Message Reminder for Scheduled Denied Applicant Survey | recruitment materials | Field
C18.a | C18.b | Telephone Reminder for Scheduled Denied Applicant Survey | recruitment materials | Field
D8.a | D8.b | Denied WIC Applicant Information Letter from State Agencies | communication materials | n/a
D9.a | D9.b | Participant Consent Form-Denied Applicant Survey | communication materials | Field
B5.a | B5.c | Program Experiences Survey: Version A (Adult) | instrument | Field
B5.b | B5.d | Program Experiences Survey: Version B (Infant/Child) | instrument | Field
C19.a | C19.b | Program Experiences Survey Invitation Telephone Script | recruitment materials | Field
C20.a | C20.b | Program Experiences Survey Invitation Letter | recruitment materials | n/a
C21.a | C21.b | Program Experiences Survey Invitation Email | recruitment materials | n/a
C22.a | C22.b | Program Experiences Survey In-Person Script | recruitment materials | n/a
D10.a | D10.b | Program Experiences Survey Invitation Postcard | communication materials | n/a
D11.a | D11.b | Participant Information Brochure | communication materials | n/a
D12.a | D12.b | Program Experiences Survey Thank You Letter and Gift Card | communication materials | n/a
B6.a | B6.b | Former Participant Case Study Interview Guide | instrument | Field
C23.a | C23.b | Former Participant Case Study Interview Invitation Telephone Script | recruitment materials | Field
D13.a | D13.b | Former Participant Case Study Interview Thank You Letter and Gift Card | communication materials | n/a

Note: A blank Spanish column indicates there is no Spanish version of the material.


Data collection materials pretested in the field were tested with respondents recruited using SA and LA contact information supplied by FNS and participant/applicant data provided by one SA. Table 2 below provides a summary of the number of respondents who pretested each data collection activity. Detailed results and findings from the pretest of each data collection activity can be found in their respective chapters.

Table 2. Number of Respondents Who Participated in the Pretest by Data Collection Activity

Respondent Type | Research Activity | # of Respondents
State WIC Agency Director | State Agency Survey | 9
Local WIC Agency Director | Local Agency Survey | 9
WIC Participants | WIC Participant Certification Survey | 9
WIC Participants | Program Experiences Survey | 9
WIC Participants | Former Participant Case Study | 9
Denied WIC Applicants | Denied Applicant Survey | 9
Total | | 54



II. Challenges in Conducting the Pretest



The research team experienced several challenges in conducting the pretest. These challenges were described in monthly reports delivered to FNS between April 2016 and March 2017, and in further detail in three memoranda submitted to FNS on January 6, 2017 (Update on Pretest Schedule), January 24, 2017 (Update on Pretest Progress and Sampling for NSWP-III), and March 3, 2017 (Status of Denied Applicant Pretest). The challenges fall into three broad categories: (a) obtaining OMB clearance, (b) obtaining participant data from SAs and LAs, and (c) recruiting respondents. These challenges are discussed in this section, and lessons learned with recommendations are presented in the final section of this memorandum.



A. Obtaining OMB Clearance to Conduct Pretest



On April 15, 2016, the research team joined FNS in a session with FNS’s Planning & Regulatory Affairs Office (PRAO) to learn more about the process of obtaining OMB clearance. During this session, FNS informed the research team that instead of each instrument representing a separate data collection limited to nine respondents without OMB approval, the entire study would be considered one data collection and could include only nine respondents overall. Since each of the six instruments required nine respondents for pretesting (54 respondents in all), OMB clearance was required to pretest the data collection instruments. This added task required significant changes to the study timeline and budget; the revised schedule of deliverables (dated June 3, 2016, known as “Version 4”) and revised budget were approved by FNS with a contract modification on June 3, 2016.

FNS notified the research team on July 19, 2016 that the “Supporting Statement” for generic OMB clearance for the pretest of recruitment materials and data collection instruments was submitted to the Office of the Chief Information Officer (OCIO). FNS submitted the “Supporting Statement” for generic OMB clearance for the pretest of recruitment materials and data collection instruments to OMB on July 21, 2016. Version 4 of the schedule of deliverables stated that the pretest data collection would take place from August 1 to September 9, 2016. The research team was notified on September 22, 2016 that OMB clearance was obtained.

B. Obtaining Participant Data from State and Local Agencies



The research team initiated the process of obtaining participant data needed for respondent recruitment for the pretest of the Certification Survey, Denied Applicant Survey, Program Experiences Survey, and the Former Participant Case Study in August 2016. Two SAs, Massachusetts and Illinois, were selected to provide participant data for the pretest. The research team encountered several challenges and delays while obtaining participant data. FNS sent the “Pretest Notification Emails” (approved by FNS on July 28, 2016) to Regional Offices (ROs) on August 3, 2016, followed by hard copies. The ROs were instructed to notify the SA directors in both States that the research team was interested in obtaining participant data from one or more of three preferred LAs in their jurisdiction.

The SAs were requested to identify the most suitable candidate from the listed LAs and notify them of the study. The research team contacted the selected LA and requested the participant and denied applicant information needed for the pretest of the Certification Survey, Denied Applicant Survey, Program Experiences Survey, and Former Participant Case Study. This approach met with several barriers, detailed in the sections for each State below.

Illinois

The research team approached both the Illinois SA (August 9, 2016) and LA (August 10, 2016) to obtain participant data needed. After several discussions with the Illinois LA, the research team learned that the only data output it could provide was in the form of predefined, printed reports from the State’s Management Information System (MIS). Furthermore, when the research team provided a data use agreement (DUA), the LA was unclear whether it had the authority to sign as the “provider” of the data since the data were retained in the State’s database.

The research team and LA contacted the Illinois SA on August 24, 2016 for clarification. The SA contact referred the question to its legal team. On September 1, 2016, the SA notified all parties that it would be the provider of the data and that it would be the signatory of the DUA and would pull the data for the research team. The Illinois SA provided the research team with its own version of a Data Sharing Agreement (DSA) and a 19-page IT security questionnaire to be completed by each company that would have access to the pretest data. The research team returned the questionnaires on September 19, 2016. On September 30, 2016, the Illinois SA provided a second, 22-page version of the IT security questionnaires for each research team contractor to complete. Additional clarification questions were also asked regarding the responses in the first questionnaire.

The research team sent revised IT security questionnaires and other materials to the Illinois SA on October 31, 2016. The research team received a revised DSA and received feedback on the research team’s IT security questionnaires from the Illinois SA on November 21, 2016. The research team responded to questions regarding the security questionnaires and submitted revised versions to the SA on December 9, 2016 and again on December 12, 2016. The revised IT security questionnaires and DSA were sent by Illinois Department of Human Services (IDHS) to the Illinois SA’s legal department on December 12, 2016 and on December 19, 2016.

The research team submitted a memorandum on January 6, 2017 to update FNS on pretesting that had already begun in Massachusetts and to present contingency plans if Illinois further delayed access to its data. The memorandum proposed that if the research team was unable to obtain participant data from the Illinois SA by January 19, 2017, the pretest should proceed using only Massachusetts participant data. On January 20, 2017, the Illinois SA stated it was not comfortable releasing the data to the research team and would seek more guidance from FNS. On January 25, 2017, FNS gave the research team permission to conduct all pretest activities in Massachusetts only.

If necessary, the research team will continue to work with FNS to obtain participant data needed from the Illinois SA for the full data collection of the Certification Survey, Denied Applicant Survey, Program Experiences Survey, and the Former Participant Case Study. The current sample does not include Illinois.

Massachusetts

The research team made initial contact with the Massachusetts SA on August 11, 2016. On August 23, 2016, the Massachusetts SA refused to provide the information, stating that the request was “not in line with USDA’s regulations around participant confidentiality.” The SA stated that the research team would need to obtain consent from each participant before their contact data was released, and that the study would likely need to go through the State’s Institutional Review Board (IRB) for review. FNS prepared a letter explaining to the Massachusetts SA that it could release contact data and that the study’s current IRB approval should suffice. This letter was finalized and sent by FNS in October 2016.

Following receipt of the letter from FNS, the Massachusetts SA contacted the research team on October 27, 2016 and stated it would participate but would need to submit the study for IRB clearance. During the IRB review process, the Massachusetts SA provided the research team with its Confidentiality Agreement to prepare for a data pull once approvals were obtained. The research team received a revised DSA from Massachusetts on November 21, 2016. Notification of the Massachusetts IRB approval was received on December 6, 2016, and the DSA was reviewed and signed by all contracted parties on December 14, 2016. The research team obtained participant data from the Massachusetts SA on December 15, 2016.

Lessons Learned from Illinois and Massachusetts

The pretest revealed that obtaining data from States can take several months. The language provided by FNS should be very helpful going forward, but the research team believes it should be prepared to submit protocols to State IRBs, negotiate different types of data sharing arrangements, and establish communication channels with the various personnel who will be providing the data.



C. Recruiting Respondents

Several challenges were encountered during data collection. Some of these challenges were anticipated by the research team and emphasized by members of the Expert Panel.2

This section provides a brief overview of the general challenges experienced in recruiting for agency surveys and participant surveys. Challenges that were particular to individual surveys will be discussed in the related survey sections that follow.

State and Local Agency Surveys

Once all approvals were received, FNS sent letters on December 9, 2016 to the ROs to forward to SAs that had been selected to participate in the State Agency Survey. Timing of the initial contact near the holidays delayed the initial response for several agencies. Asking each SA to recommend an LA for inclusion also delayed identification of the LAs that would participate in the study. These delays should be reduced in the national survey, since the LAs that SAs will notify will already have been identified.

Participant Surveys

Challenges in completing the participant surveys were similar to those expressed by members of the Expert Panel and experienced in prior surveys of this population. The greatest barrier was getting people to answer the phone. Since the objective of the pretest was obtaining a convenience sample to test the instruments, not all of the protocols that will be employed in the full study were used for pretest recruitment, notably:

  • Introduction letter

  • Expanded evening and weekend calling

  • Voicemails to all non-respondents

  • Reminder postcard

  • A voicemail on the seventh call informing the respondent that if they do not answer the next call or call back, an interviewer may visit them for an in-person interview

  • A door knock for selected non-respondents (Program Experiences Survey, Certification Survey, and Denied Applicant Survey)

Other challenges arose from the delay in obtaining data from Illinois and the resulting pausing and restarting of data collection activities for in-person surveys in Massachusetts. In particular, the predetermined mix of Spanish, Adult, and Child versions of the surveys needed to fulfill the pretest was not reflective of the limited sample the LA was able to provide. This circumstance required using additional LAs to fulfill the pretest quotas, especially after Illinois did not provide data.

While the challenge related to setting quotas that were not reflective of a small Denied Applicant (DA) population may not be as acute in the national survey, other challenges are likely to occur. In particular, LAs may not keep routine records of DAs. For such cases, the research team created a simple logging system to help the LA track DAs and then provide contacts to the research team. Additionally, the small numbers encountered in the pretest are a function of the short timeframe allowed for the pretest data collection window. In the full study, the longer data collection window will result in more cases of DAs.
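The memorandum does not reproduce the logging system itself (it is fielded as the Denied Applicant Log, Appendix B7). As a purely illustrative sketch, a minimal tabular log of this kind might capture the fields an LA would need to identify and refer DAs for recruitment; every field name below is an assumption, not the actual instrument:

```python
import csv

# Illustrative only: these field names are assumptions, not the fields of
# the actual Denied Applicant Log (Appendix B7).
LOG_FIELDS = [
    "denial_date",         # date the WIC application was denied
    "applicant_category",  # determines adult vs. infant/child survey version
    "denial_reason",       # e.g., income, residency, categorical
    "preferred_language",  # English or Spanish instrument assignment
    "phone_number",        # contact information for recruitment
    "consent_to_contact",  # yes/no: applicant agreed to be contacted
]

def create_log_template(path: str) -> None:
    """Write an empty log with a header row for LA staff to fill in."""
    with open(path, "w", newline="") as f:
        csv.DictWriter(f, fieldnames=LOG_FIELDS).writeheader()

if __name__ == "__main__":
    create_log_template("denied_applicant_log.csv")
```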

State Agency Survey

I. State Agency Survey Pretest Methodology

A. Instrument Description

Federal guidelines grant SAs substantial authority to determine their own program operations and procedures, including providing guidance to LAs on implementing procedures to determine an applicant’s eligibility, negotiating and determining food options, establishing application and payment procedures, and establishing program data management systems and procedures. The State Agency Survey is designed to identify certification-related policies and practices that each SA has established under these discretionary powers and to enable comparisons of their potential effects. The data collected from this survey will be used to determine the association between SA policies and the national certification error rate. The State Agency Survey was created by incorporating and modifying questions from NSWP-II. Some questions are new to the NSWP-III survey; some NSWP-II questions have been dropped due to changes in the program since the prior study was conducted, and others have been dropped because they were outside the scope of research questions detailed in the NSWP-III Performance Work Statement (PWS).

B. Respondent Selection

The research team selected three FNS Regional Offices and nine SAs under the jurisdiction of those offices as the primary SA sample. The research team submitted “Memo for Recommended Pretest ROs and States” to FNS on September 6, 2016. FNS approved the recommended SAs for pretest on September 22, 2016. The research team sought a mix of urbanized and rural States from each of three regions. The research team also included an ITO and a U.S. Territory among the nine responding SAs.

FNS aided the research team in contacting the ROs of the three selected regions. The ROs were asked in a December 9, 2016 letter from FNS to notify the nine selected SAs to expect an invitation to participate in the pretest and to confirm contact information of the person who will be responsible for completing the survey.

C. Pretesting and Debriefing Procedures

The research team mailed a hard copy of the invitation letter and two hard copies of the State Agency Survey (Appendix B1), one to keep and one to return, to the nine selected SA directors. The package was sent using FedEx priority mail. The State Agency Survey was administered as a paper-and-pencil, self-administered questionnaire.

The research team followed up with an email and a telephone call within 2 days of expected delivery of the package to the SA director. The purpose of the email and call was to answer any questions, identify the point of contact, and emphasize the timing of the pretest. The SA director was also asked to provide the research team with contact information for primary and backup LA directors. These appointed LAs became the respondent sample for the pretest of the Local Agency Survey. The ITO and U.S. Territory included in the sample also acted as the LA for their jurisdictions; therefore, two LAs were selected from the backup LAs provided by other SAs. Reminder emails were sent to the SA directors each week of the data collection period, up to a maximum of four. Follow-up phone calls continued periodically until a contact and completed questionnaire were obtained.
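To make the cadence just described concrete, the following minimal sketch generates the follow-up contact dates for one SA package. The two-day initial follow-up, weekly reminders, and four-email cap come from the text above; everything else (function name, output format, the example date) is illustrative:

```python
from datetime import date, timedelta

def follow_up_schedule(expected_delivery: date, max_reminders: int = 4):
    """Follow-up cadence described above: an email and phone call within
    2 days of expected delivery, then weekly reminder emails (max four)."""
    contacts = [(expected_delivery + timedelta(days=2), "email + phone call")]
    for week in range(1, max_reminders + 1):
        contacts.append(
            (expected_delivery + timedelta(weeks=week), f"reminder email #{week}")
        )
    return contacts

# Illustration: most pretest packages were sent on December 13, 2016 (Table 3);
# the delivery date here is assumed for the example.
for when, contact_type in follow_up_schedule(date(2016, 12, 13)):
    print(when.isoformat(), contact_type)
```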

Once the research team received a completed State Agency Survey, a trained interviewer contacted the SA director to schedule a 30-minute debriefing telephone interview. During the debriefing telephone interview, the interviewer asked the SA director to estimate how long it took them to complete the survey, and to identify any survey questions that were confusing or difficult to answer (Appendix F1). Once the pretest data collection period ended, the research team sent a thank-you email to all SA directors who participated in the pretest. Feedback provided during the pretest was used to revise the State Agency Survey. The survey will be programmed into the web-based survey software Qualtrics for use with the full study sample. Table 3 shows the pretest timeline for each SA.

Table 3. State Agency Survey Pretest Timeline

State Agency | Survey Sent | Completed Survey | Completed Debriefing | Provided LA Contact Info
SA-1 | December 13, 2016 | January 20, 2017 | February 3, 2017 | December 27, 2016
SA-2 | December 13, 2016 | February 17, 2017 | February 23, 2017 | February 24, 2017
SA-3 | December 13, 2016 | December 21, 2016 | December 28, 2016 | December 21, 2016
SA-4 | December 13, 2016 | December 30, 2016 | January 13, 2017 | January 13, 2017
SA-5 | December 13, 2016 | January 10, 2017 | January 17, 2017 | N/A
SA-6 | December 13, 2016 | December 16, 2016 | December 21, 2016 | December 21, 2016
SA-7 | December 14, 2016 | January 4, 2017 | January 19, 2017 | N/A
SA-8 | December 14, 2016 | December 22, 2016 | December 22, 2016 | December 23, 2016
SA-9 | December 13, 2016 | February 15, 2017 | February 22, 2017 | February 15, 2017


D. Analysis

The data collected during instrument pretesting, as well as feedback received during debriefing interviews, were evaluated and discussed by the research team. The research team has provided recommendations, supported by the analysis of respondent feedback, for each instrument, following the reported results in each section. Findings are individually presented for each of the pretested instruments in the sections that follow.

II. Summary of Results of State Agency Survey Pretest

The State Agency Survey was estimated to take 66 minutes to complete. Respondents were asked to indicate approximately how long it took them to complete the State Agency Survey (see Table 4). Response times ranged from 30 minutes to 3 hours, with the average (median) response being 83 (65) minutes. Respondents who completed the State Agency Survey in less than 1 hour answered most questions themselves (seeking little support from others), and the skip pattern in the survey allowed them to skip certain questions. Respondents who completed the State Agency Survey in 2 or more hours relied heavily on others to answer questions. The most time-consuming questions were in the “Denied Applicants,” “Retention,” “Manufacturer Rebates,” and “Record Keeping and Systems” sections; Q22 and Q22A took the longest. The difficulties encountered in answering these questions are described in more detail in Table 5. Several respondents said during their debriefing interview that much of the time spent answering the survey was consumed by “looking up information” and “combing through policies to make sure the right information was provided.” In response to these comments, the research team created a list of information that the SA will need to complete the survey. Moreover, many questions were streamlined, while others were dropped. As a result of these post-pretest revisions, we believe that the best estimate for burden remains 66 minutes.

Table 4. Burden on State Agency Survey Respondents

State Agency | Self-Reported Time to Complete | Minutes Used for Burden Calculation
SA-1 | Less than 1 hour | 50
SA-2 | About 1.5 hours | 90
SA-3 | Little over 1 hour | 70
SA-4 | 65 minutes | 65
SA-5 | 30 minutes | 30
SA-6 | 1 hour | 60
SA-7 | Less than 1 hour | 50
SA-8 | 3 hours | 180
SA-9 | 2.5 hours | 150
Average | | 83
Median | | 65
Minimum | | 30
Maximum | | 180
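The summary rows of Table 4 follow directly from the “Minutes Used for Burden Calculation” column; a minimal sketch of the computation:

```python
from statistics import mean, median

# Minutes used for burden calculation, SA-1 through SA-9 (Table 4).
minutes = [50, 90, 70, 65, 30, 60, 50, 180, 150]

print(round(mean(minutes)))        # 83 (average)
print(median(minutes))             # 65 (median)
print(min(minutes), max(minutes))  # 30 180 (minimum, maximum)
```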


Generally, feedback on the State Agency Survey was positive. Aside from suggestions to revise specific questions (described below), comments included, “it was pretty easy to respond to,” “in the past we have done studies that take hours and are confusing but this one is pretty clear,” and “these questions [. . .] are pretty well-designed.” None of the respondents thought that the questions felt repetitive.

III. State Agency Survey Pretest Findings and Recommendations

Based on the feedback on wording, survey flow, and response choices received on the State Agency Survey from the nine SAs with which the survey was pretested, the research team suggests several general changes to this instrument. Specific changes are also suggested in Table 5.

Table 5. State Agency Survey Pretest Feedback

Question #

Respondent Comments

Recommended Survey Revisions

Q1

Most respondents (seven of nine) wrote additional types of documentation in response to this question. These included: green card; self-declaration form for migrants, homeless, and victims of disaster; Medicaid referrals; baptismal record or confirmation record with church seal; adoption papers; tribal identification card; Temporary Assistance for Needy Families (TANF) card; recent paystub or leave and earnings statement (LES); loan papers from a bank/financial company; healthcare ID card; WIC or eWIC folder. Some SAs also provided suggestions for revisions: separate “crib card” from “hospital discharge papers or hospital ID card;” “government issued” should be added to “driver’s license, State ID;” add "official" to “immunization record;” and change “foster placement letter” to “foster care placement letter.” Several SAs added “with printed name” to the response options.

  1. Add additional response options.

  2. Revise response options.

  3. Reword question to include “(assuming the documentation includes a printed name).”

Q2

Many respondents wrote in additional types of documentation in response to this question. These included: property tax bill; SSI statement; recent paystub or LES; signed letter from shelter/hotel/motel where residing; credit card bill; school records; shelter verification letter; foster care placement letters; voter registration card; and Medicaid card. Several SAs added “with printed address” to the response options.

  1. Add additional response options.

  2. Revise response options.

  3. Reword question to include “(assuming the documentation includes a printed address).”

Q2A

Multiple respondents incorrectly marked the first response choice instead of the second response choice. One of these respondents suggested moving the second response choice to the first position because it will be the most common response from States. This is supported by the responses received from other States; seven of the nine respondents selected the second response choice.

  1. Move the second response choice to the first response choice.

Q4

Eight respondents selected the second response option: “The infant becomes categorically ineligible and needs to be certified again based on criteria used for children.” Three respondents were confused by the wording of this question since their States had differing actions for children who had been certified between the ages of 0-6 months or 7-11 months.

  1. Revise question to allow for coding of differing actions by infant age for States that have such policies.

Q7

Some respondents (three of nine) were confused by the wording of this question. Feedback received from respondents included comments such as, “Proxies and authorized representatives are different. The authorized representative is permitted to do all things in the list; however, the proxy can only redeem benefits.” Recommendations included dividing each column into two so that one column asks about the authorized representative and the other asks about the proxy.

  1. Restructure question. Add columns to each category in table so that respondent can differentiate an authorized representative from a proxy.

Q8

Most respondents (six of nine) were confused by the wording and intent of this question. Some were confused by the word “application” in this question. Some SAs document the screening/intake portion of the application online and the other nutrition assessment portion on paper. Recommendations included defining what is included in an application. Other SAs were not sure which response to choose because they only do paper applications in emergency or disaster situations.

  1. Rephrase question to understand the role of paper as part of the application process; under what circumstances paper applications are used (e.g., nutrition assessment, only during emergency or disaster situations).

Q9

A few respondents (three of nine) were unsure how to answer this question. Some SAs stated that the application is programmed in their MIS system but the probing questions can be tailored to each applicant/participant depending on their situation.

  1. Delete question.

Q11A

Two respondents were confused by the response options provided for this question. Feedback included, “’Public Assistance’ is not specific enough” and “how would ‘Public Assistance’ be different from ‘Welfare,’ ‘Energy assistance,’ and ‘Rental assistance’?” One SA shared their State policy which states that their SA does accept forms of “Public assistance” and the SA defines public assistance/welfare as not including food stamps (Supplemental Nutrition Assistance Program - SNAP), Temporary Assistance for Needy Families - TANF, or Medicaid.

  1. Remove “Public Assistance” from the list; other forms of public assistance unique to a State may be listed in the “Other” response categories.

Q18

During the debriefing interview, the research team discovered that each of the SAs defined “retention rate” differently. Examples of retention rate among respondents included:

-Participants who finish their certification period and are certified for the next participant category

-Certified and eligible participants who pick up their food instruments as scheduled

-Overall caseload counts compared to an earlier time period

  1. Provide definition of “retention rates” after additional discussion with FNS.

Q19A

Seven of nine respondents did not answer this question because their SA does not calculate retention rates. The two SAs that do stated “computer generated report” and “(monthly USDA participation - monthly USDA participation same month prior year) / (monthly USDA participation - same month prior year).”

  1. Renumber: Q19A, Q19B, and Q19C are incorrectly numbered; as sub-questions of Q18, they should be Q18A, Q18B, and Q18C.


Q19B

Six of nine respondents did not answer this question because the SA does not calculate retention rates. One of the three respondents who did answer the question selected two response choices.

  1. Revise format of question from “select one” to “select all that apply.”

  2. Revise the focus of the question from how often the retention rate is calculated to how often it is reported internally or shared with LAs.

Q19C

Three respondents answered this question. Answers ranged from percentages to whole numbers. One SA was only able to provide retention rates for infants and children. One SA was not able to provide the FY2016 data because it will not be ready until April 1, 2017; instead, this SA provided FY2011 data. One SA was confused about whether the question was asking for a monthly average or a cumulative figure.

  1. Add “not available” response choice to each year data is requested.

  2. Specify that we are asking for infant/child retention rates.

  3. Provide instructions on how to calculate an annual retention rate depending on the answer to Q19B.

Q20 and

Q21

Five respondents had difficulty answering these questions. All nine SAs were unsure of how to record their numbers of LAs and clinics or sites. One SA has “county health department clinics and contracted clinics” and was not sure if they should be counted. Another SA “does not have any LAs but has many clinics.” One SA suggested revising questions so that the first question asks if any WIC clinics or sites see participants; the second question asks if it is a permanent site versus a satellite; and the third question asks how many LAs are administrative and see participants and how many LAs just see participants.

  1. Add sub-questions that will give us a better understanding of the structure and tailor the follow-up question based on the response. This could include a matrix of LA and clinic types that add up to a total.

Q22

Six SAs had difficulty answering this question. Multiple SAs were confused about which type to list. One SA explained that “formula comes in powder, concentrate, and ready to feed forms. It’s also available in soy and non-soy.” Several SAs explained that this information is not readily available and this question took a long time to answer. Four SAs did not answer the question about the top three infant cereals because they do not have contracts for infant cereal. One SA mentioned that this question was missing the element of time but assumed the question was about FY2016.

  1. Add timeframe “FY2016” to question.

  2. Add a column to capture the type of formulas.

  3. Provide a list of data needed in the invitation email.

  4. Add “N/A” response option.

  5. Add the FY 2016-2017 timeframe to the question.

Q22A

All nine SAs reported having difficulty answering this question because the information is not readily available. Three SAs were confused by what was meant by “Current unit price” and “Define unit.” For “Current unit price,” one SA listed the “Net price per can based on manufacturer's lowest national wholesale price less rebate per can." Another SA listed "Wholesale Price per Unit." In both of these instances, the SA was confused about whether they should be providing the actual price or the rebate price of the infant formula. Another SA stated that this question was “hard to calculate and time consuming.” This SA also could not figure out which of the top three formulas listed in the previous question (Q22) should be carried over to Q22A.

  1. Inform SAs that they may need to contact others in order to answer some of the questions.

  2. Define “Current unit price” to be the unit price after rebate.

  3. Revise response options under “Define the ’Unit.’”

  4. Reword the question to be more specific and bold “the most frequently purchased product.”

Q23

All nine SAs reported that they keep all response options “Over 1 Year.” During the debriefing interview, SAs explained that records are “easily accessible” for 3-7 years. After that, records are archived for up to 10 years, and then some agencies move the records to a server where they are stored perpetually. Some SAs were confused about whether the question was asking about electronic or paper copies; which category (adult records and child records are maintained for different periods of time); and active versus inactive participant records. Four SAs stated that the response options that ask about “Proofs” need to be revised, since SAs only visually verify proofs. Two SAs recommended changing the wording to “Document proofs.”

  1. Reword question to be more specific about the type and accessibility of records (easily accessible versus archived; paper versus electronic; adult versus child; active versus inactive).

  2. Refine the year scale to include additional years.

  3. Revise wording of the three response options asking about proofs.

Q25

Two SAs were confused by the wording of “denials” and which type of records the question was asking about (electronic versus paper).

  1. Reword “denials” to “denied applications.”

  2. Specify that the question is about electronic records.

Q26

Six SAs reported having difficulty answering this question because the wording was confusing, and many SAs who answered it selected an incorrect response. During the debriefing interviews, some SAs shared that their MIS provides interface access to view certain variables for a select program; however, other SAs reported that their MIS does not provide them with access to other programs’ records.

  1. Reword question to “Is your WIC MIS system connected to other programs’ systems, such as TANF or Medicaid, to facilitate record keeping or certification?"

Q27

Four SAs reported having difficulty answering this question because they were not familiar with their system. Each SA had to contact their MIS/IT person to obtain this information.

  1. Inform SAs that they may need to contact others in order to answer some of the questions.


Q27A

Four SAs reported having difficulty answering this question because they were not familiar with their system. Each SA had to contact their MIS/IT person to obtain this information.

  1. Inform SAs that they may need to contact others in order to answer some of the questions.
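The year-over-year formula quoted in the Q19A row above can be made concrete. The sketch below assumes the quoted expression means the percent change in monthly participation relative to the same month of the prior year; that interpretation, and the participation counts, are ours for illustration, not the SA’s:

```python
def yoy_retention_rate(current: int, same_month_prior_year: int) -> float:
    """One SA's reported calculation, read as a year-over-year percent
    change in monthly participation (our interpretation, not the SA's)."""
    return (current - same_month_prior_year) / same_month_prior_year * 100

# Hypothetical counts: 9,500 participants this month vs. 10,000 in the
# same month one year earlier.
print(round(yoy_retention_rate(9500, 10000), 1))  # -5.0, i.e., a 5% decline
```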



The research team also pretested the State Agency Survey Invitation Email recruitment material in the field with respondents. Respondents were not specifically asked about the invitation email; however, Telephone Interviewers (TIs) made notes based on feedback received during the debriefing interviews. The research team suggests several general and specific changes to this instrument. These are listed in Table 6. The average amount of time needed to read the State Agency Survey Invitation Email was estimated using feedback gathered in the lab. It is estimated that, on average, the State Agency Survey Invitation Email will take respondents about 1 minute to read (0.02 hours), which is lower than the estimated 3 minutes (0.05 hours) approximated before the pretest.
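The hour figures paired with minute estimates here and in the burden feedback tables that follow are simply minutes divided by 60, rounded to two decimal places, as this quick check confirms:

```python
# Verify the minutes-to-hours conversions used in the burden estimates.
for minutes in (1, 3, 4.2, 5):
    print(f"{minutes} min = {round(minutes / 60, 2)} hours")
# 1 min = 0.02 hours; 3 min = 0.05 hours; 4.2 min = 0.07 hours; 5 min = 0.08 hours
```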

Table 6. State Agency Survey Invitation Email Pretest Feedback

Appendix ID

Document Name

Research Team Comments

Recommended Revisions

C3

State Agency Survey Invitation Email

The research team received feedback during the debriefing interviews indicating that additional information would be helpful. SA directors requested more detail about the types of information they would need and the types of staff they may need to collaborate with to complete the survey.

  1. Add a brief list of types of data the SA director may need to access or request.

  2. Provide a list of types of staff the SA director may need to contact.



IV. State Agency Survey Recruitment and Communication Materials Pretested in the Lab

Research assistants (RAs) reviewed several recruitment and communication materials in the research team’s lab. The wording, understandability, and flow were analyzed. All recruitment and communication materials were pretested in English. The research team suggests some changes to these materials. These suggestions, along with burden estimates, are provided in Table 7. The average amount of time needed to read each of these materials was estimated using feedback gathered in the lab.

Table 7. State Agency Survey Recruitment and Communication Materials Pretest Feedback

Appendix ID

Document Name

Average Burden

Research Team Comments

Recommended Revisions

C1

Notification Email to Regional and State Offices

1 minute

None

No changes at this time.

C2

Letter to State Agencies from Regional Offices

1 minute

None

No changes at this time.

C4

State Agency Survey Invitation Letter with Instrument

3 minutes

The phrase in the third paragraph, “Let me encourage you to complete…,” would be better phrased as “I encourage you to complete….”

RAs recommended moving the sentence, “Know that the information you provide will be kept private to the extent allowed by law,” in the third paragraph to the end of the fourth paragraph so as not to separate the idea of the physical survey and where it should be mailed.

  1. Revise “Let me encourage you to complete…” to “I encourage you to complete….”

  2. Move “Know that the information you provide will be kept private to the extent allowed by law.” from the third paragraph to the fourth paragraph.

  3. Revise estimated average respondent burden from 4.2 minutes (0.07 hours) to 3 minutes (0.05 hours).

C5

State Agency Survey Reminder Email

1 minute

The phrase in the fourth paragraph, “Let me encourage you to complete…,” would be better phrased as “I would like to encourage you to complete….”

  1. Revise “Let me encourage you to complete…” to “I would like to encourage you to complete....”

  2. Revise estimated average respondent burden from 3 minutes (0.05 hours) to 1 minute (0.02 hours).


C6

State Agency Survey Reminder Telephone Script

3 minutes

None

  1. Revise estimated average respondent burden from 5 minutes (0.08 hours) to 3 minutes (0.05 hours).

D1

Study Description for State and Local Agencies

3 minutes

None

No changes at this time.

D2

State Agency Survey Thank You Letter

1 minute

None

No changes at this time.



Local Agency Survey

I. Local Agency Survey Pretest Methodology

A. Instrument Description

The Local Agency Survey focuses on the services the LA provides to WIC participants, as well as the infrastructure of the WIC agency itself, including structure of the agency, clinics and sites under the LA, income eligibility procedures, certification procedures, and food instrument or food distribution procedures. LAs will also be asked about the policies and practices at their sites. These questions will characterize heterogeneity in site-level policies and practices across the nation. The Local Agency Survey was created by incorporating and modifying questions from NSWP-II. Some questions are new to the NSWP-III survey, some NSWP-II questions have been dropped due to changes in the program since the prior study was conducted, and other questions have been dropped because they were outside the scope of research questions detailed in the PWS.

B. Respondent Selection

SAs that agreed to participate in the pretest of the State Agency Survey were asked to help the research team identify one or two LAs that could participate in this pretest. SAs were encouraged to select LAs that they believed would be most cooperative in the short timeframe of the pretest. Two SAs (an ITO and a U.S. Territory) were not able to provide LA director contact information because they do not have LAs (only clinics). Two States that participated in the State Agency Survey provided contact information for a second LA director in their respective States. A total of nine LA directors pretested the Local Agency Survey.

C. Pretesting and Debriefing Procedures

The research team emailed an invitation to selected LAs and mailed a hard copy of the invitation letter with two hard copies of the Local Agency Survey (Appendix B2) (one to keep and one to return). A paper-and-pencil, self-administered Local Agency Survey was sent to the nine selected LAs using FedEx priority mail.

The research team followed up with an email and a telephone call 2 days after the package was expected to be delivered. The purpose of this call was to answer any questions, identify the point of contact, and emphasize the timing of the pretest. Reminder emails were sent each week of the data collection period. No more than four reminder emails were sent.

Once the research team received a completed Local Agency Survey, a TI contacted the LA director to schedule a 30-minute debriefing telephone interview. During the debriefing telephone interview, the TI asked the LA director to estimate how long it took them to complete the survey, and to identify any survey questions that were confusing or difficult to answer (Appendix F2). After the pretest data collection period ended, the research team sent a thank-you email to all LA directors who participated in the pretest. Feedback provided during the pretest was used to revise the Local Agency Survey. The survey will be programmed into the web-based survey software Qualtrics for use with the full study sample. Table 8 shows the pretest timeline for each LA.

Table 8. Local Agency Survey Pretest Timeline

Local Agency | Survey Sent | Completed Survey | Completed Debriefing
LA-1 | December 27, 2016 | January 10, 2017 | January 13, 2017
LA-2 | February 24, 2017 | March 3, 2017 | March 7, 2017
LA-3 | December 27, 2016 | January 13, 2017 | January 19, 2017
LA-4 | January 9, 2017 | January 17, 2017 | January 19, 2017
LA-5 | January 13, 2017 | January 23, 2017 | January 26, 2017
LA-6 | December 27, 2016 | January 4, 2017 | January 12, 2017
LA-7 | December 27, 2016 | January 5, 2017 | January 11, 2017
LA-8 | December 27, 2016 | January 9, 2017 | January 12, 2017
LA-9 | February 17, 2017 | February 23, 2017 | March 3, 2017


D. Analysis

The data collected during instrument pretesting, as well as feedback received during debriefing interviews, were evaluated and discussed by the research team. The research team has provided recommendations, supported by the analysis of respondent feedback, for each instrument, following the reported results in each section. Findings are presented for each of the pretested instruments individually in the sections that follow.

II. Pretest Burden Estimates of the Local Agency Survey

The Local Agency Survey was estimated to take 40 minutes to complete. Respondents were asked to indicate approximately how long it took them to complete the Local Agency Survey (see Table 9). Response times ranged from 15-20 minutes to 2 hours, with the average response being 67 minutes. The questions that took respondents the longest included ones about the staffing across all clinics under the LA (Q27) and questions in the Retention section (Q33-Q35).

Accounting for post-pretest revisions, the research team suggests increasing the estimated completion time to 60 minutes.

Table 9. Burden on Local Agency Survey Respondents

Local Agency | Self-Reported Time to Complete | Minutes Used for Burden Calculation
LA-1 | Less than 1 hour | 50
LA-2 | About 1.5 hours | 90
LA-3 | Little over 1 hour | 70
LA-4 | 1-2 hours | 90
LA-5 | 15-20 minutes* | 18
LA-6 | 45 minutes-1 hour | 53
LA-7 | 1 hour | 60
LA-8 | 2 hours | 120
LA-9 | Less than 1 hour | 50
Average | | 67
Median | | 60
Minimum | | 18
Maximum | | 120
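
For transparency in how the summary rows of Table 9 are derived, the following minimal sketch (in Python; illustrative only and not part of the study's actual data processing) reproduces the average, median, minimum, and maximum from the per-LA minutes:

    # Minimal sketch: reproduce the Table 9 summary statistics from the
    # per-LA minutes used for the burden calculation.
    import statistics

    minutes_used = {
        "LA-1": 50, "LA-2": 90, "LA-3": 70, "LA-4": 90, "LA-5": 18,
        "LA-6": 53, "LA-7": 60, "LA-8": 120, "LA-9": 50,
    }
    values = list(minutes_used.values())
    print(round(statistics.mean(values)))  # 67
    print(statistics.median(values))       # 60
    print(min(values), max(values))        # 18 120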

Overall, feedback on the Local Agency Survey was positive. Aside from suggestions to revise specific questions (described below), comments included, “pretty thorough,” “fairly straightforward,” and “easy to follow.”

III. Local Agency Survey Pretest Findings and Recommendations

Based on the feedback on wording, survey flow, and response choices received on the Local Agency Survey from the nine LAs with which the survey was pretested, the research team suggests several general and specific changes to this instrument, detailed in Table 10.

Table 10. Local Agency Survey Pretest Feedback

Question #

Respondent Comments

Recommended Survey Revisions

S1

One respondent indicated that the first response option (i.e., “Agency to which this survey was addressed does certifications”) could be confusing if the person responding is housed in a purely administrative office. This LA suggested removing the second response option (i.e., “Agency serves as a purely administrative office”) to reduce confusion.

  1. Instruct LAs that are purely administrative offices to contact the project manager for a discussion. In most cases, it would be appropriate for the LA to respond.

Q1

Eight of the nine respondents added response options in the “Other: PLEASE SPECIFY” section of the question. Specific additional response options included: Medicaid card with name; recent paystub (30 days or less) with name; court-issued document (guardianship papers); WIC EBT card; child support/alimony award letter; employment paystubs; income return/W2; medical marijuana card; vehicle registration; WIC checks (if former participants); Mexican Matricula Consular ID; and eWIC card holder.

  1. Add the frequently written-in responses: employment paystubs and Medicaid card.

  2. Add “(either electronically and/or physically)” to the question wording.


Q2

All nine respondents added response options in the “Other: PLEASE SPECIFY” section of the question. Specific additional response options included: rent/mortgage receipts; letter on shelter letterhead; vehicle registration card with name/address; voter registration card; employment check stubs with address; hospital/clinic record; income tax return/W2; letter from landlord; foster placement letter; property tax bill; Social Security statement; Medicaid document showing current eligibility; TANF document showing current eligibility; recent leave and earnings statement; mailed bank statement; and phone bill.

  1. Add the frequently written-in responses (i.e., rent/mortgage receipts, shelter documentation, vehicle registration, recent paystub, property tax bill).

  2. Add “(either electronically and/or physically)” to the question wording.


Q3

Five of the nine respondents added response options in the “Other documents” section of the question. Specific additional response options included: letter from employer; foster care placement letter; scholarship/financial aid letter; cash receipts; college grants/loans; signed statement by employer; SNAP letter showing current eligibility; 1040 farm or non-farm/self-employment; Medicaid award letter; and State Healthcare Authority Eligibility/Verification system.

  1. Add the frequently written-in responses (i.e., foster care placement letter, scholarship/financial aid letter, SNAP letter showing current eligibility).

Q4

Two of the nine respondents indicated that they were unclear about what was meant by “c. Active program voucher.” None of the nine pretest respondents selected this option, and fewer than 6 percent of respondents selected it in the NSWP-II survey.

  1. Delete the response option; it can be picked up in the “Other” field and is less likely to occur than in NSWP-II.


Q5

Six of the nine respondents indicated either that the requested figures were difficult to calculate or that it was difficult to determine what was being asked of them. Specifically, two people indicated that it was unclear whether the question was about all infants or only breastfeeding infants. One person suggested revising the question to read: “In your estimation, at what ages are breastfeeding infants being certified to receive ‘fully’ (rather than ‘partially’) breastfeeding food packages?” Four of the six respondents who reported difficulty did not actually calculate these numbers but indicated that they provided estimates.

  1. Revise the question to be clearer. Provide the definition of the denominator (all infants who were certified to receive fully breastfeeding food packages).

  2. Emphasize that an estimate or “best guess” is okay if the information is not readily available.


Q6

One respondent indicated that the information on denied applications is kept in their State MIS system, but is not called “denied applications” anymore.

  1. Reword the title of the section from “Denied Applications” to “Denied Applicants.”

Q7

One respondent indicated confusion about what was meant by an option in column C, “Where Retained…Your Local Agency.” This respondent asked whether the purpose of this question was to determine if records were kept at a centralized location.

  1. Reword to emphasize whether the record is readily accessible to the SA, LA, and/or clinic.

Q8

All respondents chose the option, “<10%.”

  1. Revise answer categories to include smaller percentages, as all respondents answered less than 10 percent.

Q9

One respondent understood this question to refer to current participants seeking certification for the next participant category and believed that it included any reason for denial. They indicated that they were unable to calculate this number but were able to estimate it. This was the only respondent who chose an option greater than 10 percent (11-20 percent).

  1. Clarify that this question applies to current participants.

  2. Revise answer categories to include smaller percentages, as eight of nine respondents answered less than 10 percent.

Q10

Five of the nine respondents indicated some degree of difficulty in determining the percentage of denials attributed to various eligibility problems. One respondent indicated that “no nutritional risk” is not applicable for their LA, as every WIC participant in their LA is assigned a nutritional risk code; only 3 percent selected this category in NSWP-II. Two respondents estimated these numbers based on their experience and on reports provided by their State, respectively. One respondent did not answer this question because they did not have the requested information, and another had to estimate based on a report provided by their SA.

  1. Remove the “no nutritional risk” option, as no pretest respondents selected it. Occurrences of “no nutritional risk” will be picked up in the “Other” category, if needed.

Q11

Two of the nine respondents indicated that their LA does not send a letter; rather, the letter is handed to the applicant.

  1. Replace the word “send” with “provide” to be clearer.

Q13

Two of the nine respondents indicated that they were confused about the row asking about denials determined through screening phone calls. Only three of nine responded to the question, and the percentages screened by phone were 10 percent, 1 percent, and 0 percent. One of these LAs indicated that Federal regulations require potential WIC participants to be physically present at the time of certification. The second LA stated that it is very rare for an application to be started over the phone and for the LA to later find out that the applicant is not eligible for WIC. A third LA noted that data are not entered during screening calls.

  1. Delete the Certification column. Consider dropping this question entirely: only three respondents answered it, and the percentages screened by phone were very small.

Q14

No respondents indicated difficulty with this question; however, during pretesting the research team identified areas for improvement.

  1. Include “or” between “offer” and “provide” so that the question reads: “Does your agency offer or provide....”

  2. Add a skip pattern such that if “No” is selected, the survey goes to Q16. Further, add a response option, “Yes, only during disaster/emergency situations.”

  3. Include a definition of “alternative site” that clarifies distinctiveness from a standing, regular clinic.

Q15

Two of the nine respondents noted confusion on how to answer this question because they were unclear on the definition of “alternative site.”

  1. Include the definition of “alternative site” that will be added to Q14.

Q16 and Q16A

Two of the nine respondents referred to revised Federal regulations that require a separation of duties between income eligibility and nutritional assessment tasks. Some clarified that staff may be cross-trained in all tasks but may not perform all of them for a single applicant.

  1. Revise the questions to focus on cross-training so as not to conflict with Federal regulations that require separation of duties.

Q17

Because certifications are a joint effort among several employees, four respondents reported that they do not have individual caseloads. Those who did answer the question divided total enrollment by the number of staff.

  1. Drop the question and calculate the number of participants per clinical staff member.

Q19

Three of the nine respondents added response options in the “Other: PLEASE SPECIFY” section of the question. Specific additional response options included: standalone/business community complex; Headstart; Indian Health Service facility; store front; and modular office. One respondent indicated that they were unsure whether this question was asking about geographic location. A second respondent indicated that their LA is a standalone nonprofit and that on other surveys, they select “nonprofit” to describe the structure.

  1. Add “standalone WIC clinic” and “nonprofit” as response choices.

Q20

One respondent noted that they were unclear about whether the question was asking about their LA or across their clinics. This respondent was also unclear whether they should respond to the question in general or about the clinics specifically, and whether to answer thinking about an “ideal world” or “reality.” A second respondent noted that their LA offers separate breastfeeding rooms and was unclear whether that should be included in the “Other: PLEASE SPECIFY” response option or in one of the prespecified response options.

  1. Reword question to “Of the spaces available at WIC clinics in your jurisdiction, how adequate….”

Q22

One respondent was confused by the word “other” in the question and suggested deleting it for clarity.

  1. Delete “other.”

Q25

One respondent indicated that this question was difficult to estimate because their offices are open a different number of days of the week and for different hours.

  1. Revised instructions will be provided for this question to address such contingencies.

Q26

Four of the nine respondents seemed to have difficulty answering this question due to variations in their clinics’ hours and days.

  1. Reword the question to clarify that it asks about the hours open to the public.

  2. Add screening question to branch to alternative question if hours vary by clinic.

Q27

Five of the nine respondents noted that this question was difficult and took the longest time to answer. All respondents indicated that reporting the raw numbers would be easier. Two respondents added response options in the “Other: PLEASE SPECIFY” section of the question. Specific additional response options included: breastfeeding peer counselors; IBCLCs (International Board Certified Lactation Consultants); lab staff; and clerks.

  1. Change the last three columns from percentages to raw number text fields (research team will calculate percentages).

  2. Add reminder for the respondent to include themselves in the calculation.

  3. Use floating definitions to clarify types of employees that should be considered administrative support (e.g., clerks) or full-time/part-time (e.g., include contract workers).

  4. Add “breastfeeding peer counselors” and “IBCLC” as response options or add into a floating definition for “Other Professional.”

Q33, Q33A, Q33B, and Q33C

Five of the nine respondents had some difficulty understanding what was meant by the word “retention,” and/or had difficulty finding retention rates for 2012 and 2013. There were also different definitions of retention between respondents. For example, one respondent defined “retention” as “average daily participation rates,” and focused on what percentage of their goal they were meeting for enrollment. A second respondent defined “retention” as “how long did someone stay on the [WIC] program.”

  1. Add response options: “No, calculated by State agency,” “No, not calculated at all,” and “Not sure.”

  2. Align questions to match revised SA questions regarding retention.

Q34, Q34A, and Q34B

One respondent did not answer these questions and the rest answered “No.” One respondent thought that this question would be more relevant for the State agency to answer.

  1. Remove these questions and focus on the desired definition of “retention.”

Q35

Two respondents were confused about what the question was asking. Four respondents indicated that they were unsure how to obtain the information needed to answer the question. One respondent suggested that this question have a skip pattern or be moved to the State Agency Survey.

  1. Add a response option so respondents may indicate that this information is calculated at the State level.

Q35

Two of the nine respondents added response options in the “Other: PLEASE SPECIFY” section of the question. Specific additional response options included: automated phone call for the appointment reminder and for missed appointments, receive call to reschedule, mailed reminders for appointments and recertifications, and one respondent added “Instagram” to the social media response option.

  1. Add “Instagram” to social media response option.

  2. Remove the reference to “decreasing stigma”; no reference is made to it in the NSWP-II report, and none of the pretest respondents chose it. The “Other” category should be sufficient to capture any such efforts.


The research team also pretested the Local Agency Survey Invitation Email recruitment material in the field with respondents. Respondents were not specifically asked about the narrative and screening questions, but TIs made notes based on the types of reactions and questions they received during pretest recruitment. The research team suggests several general and specific changes to this instrument, as presented in Table 11. The average amount of time needed to read the Local Agency Survey Invitation Email was estimated using feedback gathered in the lab. It is estimated that, on average, the Local Agency Survey Invitation Email will take respondents about 1 minute (0.02 hours) to read, which is lower than the 3 minutes (0.05 hours) estimated before the pretest.

Table 11. Local Agency Survey Invitation Email Pretest Feedback

Appendix ID | Document Name | Research Team Comments | Recommended Revisions
C7 | Local Agency Survey Invitation Email | The research team received feedback during the debriefing interviews indicating that additional information would be helpful. LA directors requested more detail about the types of information they would need to complete the survey. | 1. Add a brief list of the types of data the LA director may need to access or request.



IV. Local Agency Survey Recruitment and Communication Materials Pretested in the Lab

RAs reviewed several recruitment and communication materials in the research team’s lab. The wording, understandability, and flow were analyzed. All recruitment and communication materials were pretested in English. The research team suggests some changes to these materials. These are listed in Table 12. Estimates of burden on respondents are also included in the table. The average amount of time needed to read each of these materials was estimated using feedback gathered in the lab.

Table 12. Local Agency Survey Recruitment and Communication Materials Pretest Feedback

Appendix ID | Document Name | Average Burden | Research Team Comments | Recommended Revisions
B7 | Denied Applicant Log3 | --- | --- | ---
C8 | Local Agency Survey Invitation Letter with Instrument | 3 minutes | The phrase in the third paragraph, “Let me encourage you to complete…,” would be better phrased as “I would like to encourage you to complete….” | 1. Revise “Let me encourage you to complete…” to “I would like to encourage you to complete….” 2. Revise the burden estimate from 4.2 minutes (0.07 hours) to 3 minutes (0.05 hours).
C9 | Local Agency Survey Reminder Email | 1 minute | The phrase in the fourth paragraph, “Let me encourage you to complete…,” would be better phrased as “I would like to encourage you to complete….” | 1. Revise “Let me encourage you to complete…” to “I would like to encourage you to complete….” 2. Revise the estimated average burden from 3 minutes (0.05 hours) to 1 minute (0.02 hours).
C10 | Local Agency Survey Reminder Telephone Script | 3 minutes | None | No changes at this time.
D1 | Study Description for State and Local Agencies | 3 minutes | None | No changes at this time.
D3 | Certification End Date Verification Email4 | --- | --- | ---
D4 | Certification End Date Verification Reminder Telephone Script5 | --- | --- | ---
D5 | Local Agency Survey Thank-You Letter | 1 minute | None | No changes at this time.





Certification Survey

I. Certification Survey Pretest Methodology

A. Instrument Description

The purpose of the Certification Survey is to meet the objective of calculating improper payment rates in the WIC program. Data from the survey, combined with State administrative data on WIC participants and redemptions, will allow the research team to estimate case error and improper payments in a nationally representative sample, both overall and for each of the five certification categories (pregnant women, breastfeeding women, non-breastfeeding postpartum women, infants, and children), by asking respondents questions related to their eligibility for WIC. Survey respondents for sampled infants and children will be the parent or legal guardian, or the applicant who applied on the infant or child’s behalf.

The Certification Survey (Appendices B3a, B3b, B3c, and B3d) is designed to assess whether or not a WIC participant met four of the five eligibility criteria for participation in the WIC program (the survey does not include documentation of nutritional risk). When a new applicant or an existing WIC participant reapplies for WIC, LAs apply the following criteria:

  • Identity—Applicant must show a valid form of identification.

  • Residency—Applicant must reside in the jurisdiction of a State in which an application is submitted and show proof of residency, or meet ITO residency requirements.6,7

  • Participant category—Applicant must fall within one of the following five WIC participant categories:

    • Pregnant (includes up to 6 weeks after birth of an infant or end of a pregnancy)

    • Breastfeeding (a postpartum woman who is breastfeeding up to 1 year after birth of an infant)

    • Non-breastfeeding women (a postpartum or previously pregnant woman who is not breastfeeding, up to 6 months after the end of a pregnancy)

    • Infant (includes birth up to the last day of the month in which the first birthday falls)

    • Child (more than 12 months of age up to the last day of the month in which the fifth birthday falls)

  • Income—The applicant must meet one of the following income eligibility criteria (a minimal logic sketch follows this list):

    • The income of the applicant’s family economic unit must be no greater than 185 percent of the Federal Poverty Level (FPL) (or lower, at State discretion, but no lower than 100 percent of FPL); or

    • The applicant must be adjunctively or automatically income-eligible. Adjunctive or automatic income eligibility is met if

      • the applicant participates in SNAP;

      • the applicant participates in the State’s TANF program;

      • the applicant participates in a qualifying State Medicaid program (in the pretest State, the SA specified the types of Medicaid coverage that did, and did not, confer adjunctive eligibility);

      • the applicant has a family member who is a pregnant or infant participant in the State’s Medicaid program; or

      • the applicant is a participant in an FNS-approved, State-administered program that documents income eligibility and has income limits at or below WIC income limits.
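
To make the income criteria above concrete, the following is a minimal sketch of the decision logic (in Python; the function name and the FPL figures are illustrative assumptions only, and actual determinations follow each State's procedures):

    # Illustrative sketch of the WIC income eligibility logic; the FPL values
    # and names below are placeholders, not official figures or study code.
    ADJUNCTIVE_PROGRAMS = {"SNAP", "TANF", "QUALIFYING_MEDICAID"}

    def is_income_eligible(monthly_income, family_size, enrolled_programs,
                           fpl_monthly_by_size, state_cutoff_pct=185):
        # Adjunctive/automatic eligibility short-circuits the income test.
        if ADJUNCTIVE_PROGRAMS & set(enrolled_programs):
            return True
        # States may set a cutoff below 185 percent of FPL, but not below 100.
        assert 100 <= state_cutoff_pct <= 185
        cutoff = fpl_monthly_by_size[family_size] * state_cutoff_pct / 100
        return monthly_income <= cutoff

    # Hypothetical monthly FPL values by family size, for illustration only:
    fpl = {1: 1005, 2: 1354, 3: 1702, 4: 2050}
    print(is_income_eligible(2500, 3, [], fpl))  # True: 2500 <= 1.85 * 1702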

The NSWP-III Certification Survey will provide data required to estimate the case error rate: the proportion of participants that are ineligible for WIC due to errors in identity documentation, residency documentation, certification category, and/or income eligibility. Once case errors have been identified from Certification Survey data, these data will be used in conjunction with State redemption data to calculate dollar error amounts and rates.8
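
As a rough, unweighted illustration of how identified case errors feed the dollar error calculation (the actual estimation will apply sampling weights and the study's formal error definitions; the data below are fabricated toy values):

    # Simplified, unweighted illustration of case error and dollar error rates.
    def error_rates(cases):
        # cases: list of dicts with 'is_case_error' (bool) and 'redemptions'
        # (dollars redeemed during the certification period).
        case_error_rate = sum(c["is_case_error"] for c in cases) / len(cases)
        total = sum(c["redemptions"] for c in cases)
        improper = sum(c["redemptions"] for c in cases if c["is_case_error"])
        return case_error_rate, improper, improper / total

    toy_cases = [
        {"is_case_error": False, "redemptions": 120.0},
        {"is_case_error": True, "redemptions": 80.0},
    ]
    print(error_rates(toy_cases))  # (0.5, 80.0, 0.4)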

The NSWP-III Certification Survey has two versions. The “Adult” version is used when the sampled participant is a pregnant, breastfeeding, or postpartum (non-breastfeeding) woman (Appendices B3a and B3c). The “Infant/Child” version is used when the participant is a child or infant (Appendices B3b and B3d). The survey respondent for the Infant/Child version is the adult who applied for WIC for the child or infant.

The NSWP-III version of the Certification Survey is adapted from the version used in NSWP-II. This adaptation minimizes differences in data collection and permits direct comparison of improper payment estimates between the two studies. The survey is organized into the topics shown in Table 13:



Table 13. Certification Survey Topic Areas

Survey Topic | Description
Participant Identity | Valid documentation of identity
Participant Residency | Valid documentation of residency within the State’s jurisdiction (or documentation of meeting ITO residency requirements)
Information Relevant to Participant’s Certification Category | Birth dates of infant and child WIC participants (a)
Household Composition | Numbers of adults, children, and infants in the household; determination of which household members were part of the family economic unit (i.e., members who share income and expenses) at the time of certification
Income of Each Member of Family/Economic Unit | Documentation of income sources and amounts for each member of the family economic unit at the time of certification, including documenting income for adjunctively or automatically eligible WIC participants
Documentation of Adjunctive/Automatic Income Eligibility | Documentation of current participation in a program conferring adjunctive income eligibility (SNAP, TANF, or Medicaid), or a State program conferring automatic income eligibility at the time of certification (b)

(a) To be consistent with NSWP-II, the research team will look for possible participant category errors only for infant and child WIC participants and not for potential category errors for women WIC participants.

(b) For the pretest, the SA’s FY16 program manual identified which programs conferred (or did not confer) adjunctive income eligibility. No program conferred automatic income eligibility.

B. Respondent Selection

The initial pretest sample included recently certified WIC participants from four LAs in the Commonwealth of Massachusetts. Table 14 summarizes characteristics of the nine pretest respondents.

Table 14. Certification Survey Pretest Respondents

Language | Adult Version | Infant/Child Version | Total
English | 2 | 2 | 4
Spanish | 4 | 1 | 5
Total | 6 | 3 | 9



As shown in Table 15, all respondents had been certified less than 6 weeks prior to the survey administration date. The average time elapsed between certification and survey administration was 23 days.

Table 15. Number of Days Between Certification Date and Date of Certification Survey

N | 9 respondents
Average | 23 days
Median | 23 days
Minimum | 16 days
Maximum | 35 days

C. Pretesting and Debriefing Procedures

Prior to pretest data collection, the research team conducted two 4-hour training sessions with a bilingual (English/Spanish) Field Interviewer (FI) in the participating State. The training included an overview of the purpose of the study, the pretest, and definitions of key terms relevant to the WIC program and certification procedures; an introduction to the main topics of the survey; a question-by-question review of each instrument (both the adult and infant/child versions of the Certification Survey and Denied Applicant Survey) and each debriefing guide (Appendices F3a and F3b); a review of the informed consent form (Appendices D7a and D7b); and the field procedures for survey administration and tracking of materials.

Recruitment of WIC participants focused on those who had been recently certified, to maximize the likelihood that survey participants would have income documentation—required for certification unless the applicant is adjunctively income eligible—on hand at the time of the Certification Survey. The goal was to administer the Certification Survey no later than 6 weeks after the participant’s certification date. To further increase the likelihood that respondents could show income documentation, both the telephone recruiter and the FI asked each respondent who agreed to participate to collect identification and income documents in preparation for the survey. The recruitment script included the following: “In one part of the interview, the interviewer will ask you to show her certain documents. It will help speed up the interview if you can gather together a few things before she arrives. These documents include some form of identification [IF INFANT/CHILD: for you and your child], a piece of mail or something that shows your current address, and any pay stubs or documents showing your family’s income in the month before [CERT_DATE] when you had your recent WIC appointment.” The reminder script included the following: “Remember that it will help speed up the survey if you have some form of ID [IF INFANT/CHILD: for you and your child], something with your current address [IF NECESSARY: like a recent piece of mail you received at your address], and recent pay stubs and other income documents.”

For each Certification Survey respondent, the research team prepared a customized paper version of the survey using data provided by the SA. These data included

  • the participant’s name, address and contact information (and for infant/child participants, the name of the parent);

  • the participant’s most recent date of certification;

  • the participant category (pregnant, breastfeeding, postpartum, infant or child); and

  • whether or not the participant was determined to be adjunctively income eligible, and if so, the name of the program (SNAP, TANF, or Medicaid) that conferred adjunctive income eligibility.

Once customized with this information, the prepared survey was mailed to an FI prior to the scheduled appointment. The FI confirmed scheduled appointments with each respondent at least 1 day in advance. After arriving at the respondent’s home, the FI obtained the respondent’s consent to participate and administered the survey. The FI read questions aloud from the survey, marked the respondent’s answers on the paper copy, and followed any relevant skip patterns based on these answers. The FI also noted directly on the survey any difficulty the respondent had understanding a question and kept track of the start and end times for key sections of the survey. Immediately following the survey, the FI administered a debriefing questionnaire, thanked the respondent, and gave her a $25 Walmart gift card. After leaving the respondent’s residence, the FI reviewed the survey for completion, completed tracking paperwork, and mailed the forms to the research team for analysis.

D. Analysis

For each completed survey and debriefing guide, the research team reviewed the data collected along with FI notations to look for any items where respondents had difficulty understanding the question, providing a valid response, or showing documentation. This review focused on the following:

  • The average burden of the survey across respondents

  • Respondents’ provision of valid identification and proof of residence

  • Respondents’ self-report of the infant/child participant’s date of birth

  • Respondents’ ability to show proof of participation in a program conferring adjunctive (or automatic) income eligibility (i.e., for cases where the SA-provided data indicated that the participant was adjunctively eligible)

  • Respondents’ ability to show documentation of income sources for relevant family members

  • Respondents’ difficulty understanding terms or questions used in the survey

Section II summarizes data on the burden of the survey for respondents, followed by a qualitative summary of the other analyses in Section III.

II. Pretest Burden Estimates of the Certification Survey

The average time to administer the phone recruitment was 11 minutes, and the average time to review the consent form was 5 minutes. On average, the Certification Survey required 42 minutes to administer (Table 16; a brief consistency-check sketch follows the table). Administration time differed by language: The English version required 32 minutes on average, whereas the Spanish version required 50 minutes on average. Although the administration times for the English adult and infant/child versions were approximately equivalent, the Spanish adult version took 7 minutes longer on average than the Spanish infant/child version. However, these data reflect just one infant/child interview for the Spanish version and two for the English version.

Table 16. Administration Time (mins) for Certification Survey Pretest, Overall and by Language and Version


Statistic | Overall | English Adult (n=2) | English Infant/Child (n=2) | Overall English (n=4) | Spanish Adult (n=4) | Spanish Infant/Child (n=1) | Overall Spanish (n=5)
Average | 42 | 32 | 33 | 32 | 51 | 44 | 50
Median | 40 | 32 | 33 | 33 | 52 | 44 | 48
Minimum | 24 | 24 | 26 | 24 | 35 | 44 | 35
Maximum | 65 | 39 | 40 | 40 | 65 | 44 | 65
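
As a brief consistency check (illustrative Python; all figures taken from Table 16), the overall average equals the sample-size-weighted mean of the subgroup averages:

    # Overall average administration time as a weighted mean of the subgroups.
    subgroups = [(32, 2), (33, 2), (51, 4), (44, 1)]  # (mean minutes, n)
    total_n = sum(n for _, n in subgroups)
    overall = sum(mean * n for mean, n in subgroups) / total_n
    print(round(overall))  # 42, matching the Overall column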



The average burden per respondent was nearly 40 percent longer than estimated before the pretest; that estimate was an average of 30 minutes per respondent. It also appears that the Spanish language version increases the total burden relative to the English version. At least part of this additional burden likely resulted from the fact that the pretest (both English and Spanish versions) was administered using a paper form that required the FI to manually follow written skip instructions. In contrast, for the main study, FIs will use a computer-assisted personal interview (CAPI) version that automates these skip patterns. Administration time also tends to decrease with practice.
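
To illustrate what automating skip patterns means, the following is a minimal, hypothetical sketch of CAPI-style routing, borrowing the Q14-to-Q16 skip recommended for the Local Agency Survey in Table 10 (the question IDs and rule below are paraphrased for illustration and are not the instrument's actual logic):

    # Hypothetical CAPI-style routing: each answer determines the next
    # question, so interviewers never follow written skip instructions by hand.
    QUESTION_ORDER = ["Q14", "Q15", "Q16"]

    def next_question(current_id, answer):
        # Example skip pattern: if Q14 (alternative sites offered?) is
        # answered "No", skip Q15 and go directly to Q16.
        if current_id == "Q14" and answer == "No":
            return "Q16"
        idx = QUESTION_ORDER.index(current_id)
        return QUESTION_ORDER[idx + 1] if idx + 1 < len(QUESTION_ORDER) else None

    print(next_question("Q14", "No"))   # Q16 (skips Q15)
    print(next_question("Q14", "Yes"))  # Q15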

III. Certification Survey Pretest Findings and Recommendations

Below, we summarize findings from each of the main sections of the Certification Survey that will affect an analytic determination of the respondent’s eligibility for WIC.

A. Proof of Identity

All but one respondent showed valid identification (in all cases, either a driver’s license or passport). One respondent presented an expired driver’s license, but also produced a lease with her name and current address as proof of residency (the lease showing her name constitutes valid proof of identity and residency). For the three infant/child WIC participants, one respondent showed a passport, and two others showed the child’s birth certificate.

B. Proof of Residency

For proof of residency, all but one respondent showed valid documentation of their current home address. The remaining respondent had moved (within the State) after her WIC certification appointment but prior to the administration of the Certification Survey. She was able to show proof of her prior address but did not have documentation of her current address. Nevertheless, she lived within the same State as the LA where she was certified (lack of residency proof due to a move that occurred after the certification date will not count as an “error” with respect to proof of residency).

C. Participant Category (Infant/Child Version)

For all three infant/child WIC participants, the respondents were able to provide the participant’s date of birth, and in all instances, the age of the participant at the time of certification showed that the certification category assigned by the LA was correct.

D. Adjunctive Eligibility

For eight of nine respondents, the data from the SA indicated that the WIC participant was determined to be adjunctively eligible based on participation by the participant or a family member in a qualifying Medicaid program (i.e., a Medicaid plan identified by the SA in its certification procedures as conferring adjunctive eligibility).9 Of these eight adjunctively income-eligible participants, three showed an award letter to document participation in Medicaid; the other five were unable to show valid documentation (three showed a Medicaid card, one showed a card from the State Department of Transitional Assistance, and one showed no documentation; the cards are not valid proof of participation because they lack participation dates).

It is important to note that staff in LAs in the pilot State use an Eligibility Verification System (EVS) to confirm participation in a qualifying Medicaid plan. As described in the NSWP-III study plan, the research team will ask SAs to confirm that the LA checked a WIC participant’s adjunctive eligibility electronically or via phone call to a relevant case worker in instances where the Certification Survey respondent cannot produce documentation of qualifying participation in SNAP, TANF, or Medicaid.

E. Income Source Documentation

Although the recruitment script asked respondents to have identification and income documentation available at the time of the scheduled appointment, five of nine respondents were unable to show any documentation of income; two others were able to show partial documentation, and two were able to show complete documentation for all income sources reported (Table 17). Each of the two respondents able to show complete income documentation received weekly wages and showed a recent pay stub covering 1 week of wages; no other member of the family received another type of income. Another adult WIC participant showed proof of wages she received but was unable to show proof of wages received by other family members; she also received occasional alimony and child support payments but could not provide documentation. Other respondents were also unable to provide proof of wages received by other family members. Respondents likewise reported income for which they lacked documentation from the following sources: TANF, Social Security, Federal and State supplemental security income (SSI), and income from self-employment.

For four respondents, the FI noted the amount that a respondent reported receiving as SNAP benefits, but this is not countable income for purposes of determining income eligibility for WIC. We will address this issue in training for the full study to ensure that FIs do not include SNAP benefits as “public assistance/TANF” or any other countable source of income.

For one respondent, the FI reported that a household member “shared food,” but not other expenses or income with the participant, and marked “separate finances” on the survey. We will address this error in training for FIs: sharing food with a member of the household means that this person should be included as “sharing finances” and thus be included as a member of the family economic unit.

F. Debriefing

One respondent did not understand the Spanish phrase used to refer to income from “child support” (manutención o pensión alimenticia); when the FI clarified in English that she was asking about “child support,” the respondent understood. We will identify a meaningful translation into Spanish. One other respondent to the Adult version reported difficulty understanding some items or terms on the survey but could not identify the particular items; however, the FI noted that this respondent had difficulty reporting her own age and the ages of other members of her family. No other respondents reported difficulty understanding items or terms used in the survey.

G. Recommendations

Results of the pretest suggest that respondents to the Certification Survey may lack sufficient income documentation. We discuss these results and potential solutions before summarizing a small number of other modifications suggested by the pretest.

Recommendation: Discuss with FNS the lack of documentation to determine income eligibility (adjunctive or otherwise). The fact that a majority of respondents were unable to show documentation for some or all sources of the family economic unit’s income may present a major risk to the study. Although eight of the nine pretest respondents were adjunctively income-eligible, and thus not required to show their local agency proof of income, one of the NSWP-III objectives is to estimate the average family income for both adjunctively and non-adjunctively eligible participants. The one respondent in the pretest final sample who was not adjunctively eligible was unable to show proof of one of two sources of income.

For each study participant who is not adjunctively income-eligible, and who cannot produce documentation of income, it will not be possible to determine with sufficient confidence whether the participant met income eligibility requirements. Lack of documentation means that data on the gross or net amount of income at the time of certification may be inaccurate (i.e., if self-reported) or missing. Inaccurate or missing data will jeopardize the determination of which cases may have been certified erroneously.

It is important to recall the following:

  1. The median number of days elapsed between the pretest respondent’s certification date and interview was 23 days (the maximum was 35 days).

  2. The recruitment and reminder scripts used in the pretest asked the respondent to gather income documents in preparation for the in-person survey.

For each respondent, the research team also checked the length of the certification period. WIC agencies have discretion to issue a temporary, 30-day certification if an applicant lacks proof of income but otherwise appears likely to qualify for the program; therefore, we examined the possibility that one or more pretest respondents had received a temporary certification. However, none of the pretest respondents had a 30-day certification period, so all would have been required to provide proof of income at the time of certification if they were not adjunctively income-eligible.

Possible solutions. To address this lack of income documentation, we would like to discuss the following possible remedies with FNS. First, we propose to strengthen the language in the recruitment script to emphasize the importance of income documentation for all family members. For example, we will modify the recruitment script as follows:

  • Ask whether or not the respondent showed proof of income to their LA, and if so, ask the respondent to collect the same documents; and/or emphasize the importance of collecting income documents for the month in which the respondent was certified.

  • Expand the list of documents that the field interviewer may need to see. This list should include any award letter or notice of benefits showing enrollment and expiration dates for participation in SNAP, TANF, and/or Medicaid; pay stubs for all family members earning wages or a salary; and documentation of income (for all family members) from alimony, child support agreements, Social Security, Federal or State SSI, etc.

  • Ask respondents to have copies of bank statements and tax returns available, in case any income documentation (i.e., showing the gross income) is unavailable. If the preferred documents are not available (e.g., missing pay stubs for wages), a bank statement showing the amount of income deposited into a participant’s checking account is preferred to no documentation.

Although it is uncertain to what extent a revised recruitment script might improve the availability of income documentation, FNS agreed that:

  1. Some documentation of income sources is preferred to no documentation, even if the available document(s) do not meet all of the criteria that a local WIC agency might require. For example, a bank statement showing an amount of wages deposited into a checking account after withholdings (e.g., for FICA or income taxes) would not provide the gross income amount required by the WIC program; but in the absence of documentation of the gross income, documentation of the net income is preferred to self-reported income (i.e., no documentation);

  2. If no documentation is available for a particular income source, the research team will use self-reported income in study analyses and will note the prevalence of missing income documentation and the limitations of self-reported income.



Recommendation: Drop the “Alternate Income Reference Period” procedure. The revised draft of the Certification Survey (3.2.2) included procedures to review a respondent’s income sources for a period of 3 to 6 months prior to the certification date if preliminary results of the survey suggest that the WIC participant should have been deemed ineligible due to income, based on current income sources (defined as the 30 days prior to the certification date). Having completed a pretest of the Certification Survey without this procedure, we recommend dropping the Alternate Income Reference Period procedure for the following reasons:

  1. Results of the pretest show that the Certification Survey already imposes an administration time that is 40 percent longer than expected, on average; including the Alternate Income Reference Period would increase the average burden further.

  2. Results of the pretest indicate that respondents had difficulty providing documentation of income dated within 30 days of the certification date; requesting additional income documentation for earlier periods of time would likely yield little additional data while imposing added administration time.

  3. This alternate income procedure was not implemented in the prior study (NSWP-II), so including it in NSWP-III would likely impair valid comparisons of results of the two studies.

  4. As explained in the draft survey “note for reviewers,” Section 246.7(d)(2)(i) of the Federal WIC regulations gives SAs and LAs flexibility to make independent and non-replicable decisions about what timeframe for assessing family income is most accurate.10 The research team is concerned that any implementation of an alternate income reference period would not accurately reflect LA procedures for determining income eligibility.

Other recommendations. In addition to addressing the need for income documentation, we note the following additional recommendations arising from the pretest:

  • For the Spanish version of the survey, we will identify a more meaningful way of referring to child support payments in the Income Sources section of the Income Eligibility module.

  • In States like the pilot State, where LA staff can use an electronic EVS to determine whether an applicant participates in SNAP, TANF, or a qualifying Medicaid program, the research team has already described plans for confirming adjunctive eligibility when the respondent cannot document participation in one of these programs during the in-person survey. In such cases, the research team will ask the State to confirm that the WIC participant was a current enrollee in one of these programs.

Table 17. Income Sources and Documentation for Adjunctively and Non-Adjunctively Income-Eligible Respondents to the Certification Survey Pretest


Version (ENG=English, SP=Spanish) | Adjunctively Income-Eligible (per State WIC Agency Data) | Applicable Program | Able to Show Proof of Adjunctive Eligibility? | Type of Proof | Income Sources Reported With Valid Documentation | Income Sources Reported Without Valid Documentation
Adult, SP | Y | Medicaid | Yes | Award letter | Wages, salary, or fees (participant) | ---
Adult, SP | Y | Medicaid | None | NA | Wages, salary, or fees (participant) | Wages, salary, or fees (two family members); alimony; child support
Adult, SP | Y | Medicaid | Yes | Award letter | --- | Income from self-employment (family member); income is cash-only
Adult, ENG | Y | Medicaid | Partial (Invalid) | Medicaid card (no dates) | --- | Social Security benefits; Federal SSI, State SSI, or State disability insurance; disability; TANF
Adult, ENG | Y | Medicaid | Partial (Invalid) | Card from Department of Transitional Assistance (DTA), no dates | --- | Wages, salary, or fees (family member)
Adult, SP | Y | Medicaid | Partial (Invalid) | Medicaid card (no dates) | --- | Cash payments for childcare; TANF
Infant/Child, ENG | N | NA | NA | NA | Wages, salary, or fees (participant) | Wages, salary, or fees (family member)
Infant/Child, SP | Y | Medicaid | Yes | Award letter | --- | Wages, salary, or fees (family member)
Infant/Child, ENG | Y | Medicaid | Partial (Invalid) | Medicaid card (no dates) | Wages, salary, or fees (participant) | ---




Denied Applicant Survey

I. Denied Applicant Survey Pretest Methodology

A. Instrument Description

The predominant purpose of the Denied Applicant Survey is to determine the rate of denials, the proportion of denials made erroneously, and the dollar cost and rate of associated underpayments.11 An additional purpose is to describe the reasons why applicants were correctly or erroneously denied WIC benefits. To answer the Study Objectives, the Denied Applicant Survey will differ from the instrument used in NSWP-II, which did not ask the questions needed to determine whether a denial was made correctly or erroneously. That is, in order to know whether an applicant was correctly or erroneously denied, the study must confirm the applicant’s residence, certification category, and income at the time of application. To achieve these objectives, the Denied Applicant Survey will, in large part, mirror the Certification Survey, with appropriate modifications to introductory language and question stems (for example, referring to the applicant’s “date of application” rather than “date of certification”). Like the Certification Survey, the NSWP-III Denied Applicant Survey has two versions: an “adult” version for use when the denied applicant is a woman (Appendices B4a and B4c), and an “infant/child” version for use when the denied applicant is an infant or child (Appendices B4b and B4d). The survey respondent for the infant/child version is the adult who applied for WIC for the child or infant.

The main topics for the survey are the same as for the Certification Survey. Some individual items differ, however, because the survey’s respondents will not be receiving WIC benefits. In addition, the survey includes some items to determine what an LA asked the applicant to do (e.g., whether the WIC agency asked an applicant to show proof of identity). In contrast to the adult version of the Certification Survey and the infant/child version of the Denied Applicant Survey, the adult version of the Denied Applicant Survey included five to six items intended to determine the adult applicant’s likely participant category (pregnant, breastfeeding, or postpartum). See Table 18 for these items.

A subset of the adult items was included in an “abbreviated version” of the adult Denied Applicant Survey for use in a telephone survey. The abbreviated version included items from the Identity, Residency, and Participant Category sections.



Table 18. Denied Applicant Survey Topic Areas

Survey Topic | Description
Applicant Identity | Valid documentation of identity
Applicant Residency | Valid documentation of residency within the State’s jurisdiction (or documentation of meeting ITO residency requirements)
Information Relevant to Applicant’s Certification Category | Birth dates of infant and child WIC applicants; likely participant category for adult WIC applicants (pregnant, breastfeeding, or postpartum and not breastfeeding)
Household Composition | Numbers of adults, children, and infants in the household; determination of which household members were part of the family economic unit (i.e., members who share income and expenses) at the time of application
Income of Each Member of Family/Economic Unit | Documentation of income sources and amounts for each member of the family economic unit at the time of application, including documenting income for adjunctively or automatically eligible WIC applicants
Documentation of Adjunctive/Automatic Income Eligibility | Documentation of current participation in a program conferring adjunctive income eligibility (SNAP, TANF, or Medicaid), or a State program conferring automatic income eligibility at the time of application



B. Respondent Selection

The pretest sample for the Denied Applicant Survey had to be compiled from several sources. The initially selected LA was unable to provide the needed number of denied applicants, and the denied applicants it provided did not meet the quotas set for Spanish, English, adult, and infant/child respondents. The lack of appropriately apportioned denied applicants required the research team to expand data collection into three additional LAs in the Boston area. The in-person interviews conducted with denied applicants from these LAs were then augmented with a statewide sample of denied applicants to make up for the lack of adult applicants provided by the LAs. A sample of recently certified Spanish-speaking participants had to be used as a substitute sample to test the abbreviated version of the Spanish adult Denied Applicant Survey. The statewide denied applicant sample and the recently certified Spanish-speaking adult sample were administered an abbreviated version of the Denied Applicant Survey.

The resulting pretest in-person and telephone interviews for the Denied Applicant Survey are shown in Table 19. The breakdown of the in-person/telephone interviews and instrument versions is detailed in the discussion that follows.

Table 19. Denied Applicant Survey Pretest Respondents for In-Person and Abbreviated Versions

Language | Adult Version | Infant/Child Version | Total
English | 2 | 3 | 5
Spanish | 3 | 1 | 4
Total | 5 | 4 | 9



In-Person Interviews

A total of five denied applicants were interviewed in person. Table 20 presents the breakdown of Adult and Infant/child versions and English/Spanish versions of the instrument that were pretested in person.

Table 20. Denied Applicant Survey In-Person Pretest Respondents

Language | Adult Version | Infant/Child Version | Total
English | 1 | 3 | 4
Spanish | 0 | 1 | 1
Total | 1 | 4 | 5



Because we anticipated very few denied applicants in the population, the research team included denied applicants whose scheduled survey administration was several months after their application date. Table 21 shows that the average elapsed time between date of application and survey administration for in-person pretest respondents was 161 days (a brief computational sketch follows the table).

Table 21. Number of Days Between Application Date and Survey Administration Date for the Denied Applicant Pretest

N | 5 respondents
Average | 161 days (5.3 months)
Median | 215 days (7.1 months)
Minimum | 34 days (1.1 months)
Maximum | 279 days (9.2 months)
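
For reference, the elapsed-day figures in Table 21 are simple date differences; a minimal sketch follows (Python; the date pairs are hypothetical examples, not actual respondent data):

    # Minimal sketch of the elapsed-days calculation behind Table 21.
    from datetime import date
    from statistics import mean, median

    pairs = [  # (application date, survey administration date); hypothetical
        (date(2016, 4, 1), date(2017, 1, 5)),
        (date(2016, 12, 1), date(2017, 1, 4)),
    ]
    elapsed = [(survey - applied).days for applied, survey in pairs]
    print(mean(elapsed), median(elapsed), min(elapsed), max(elapsed))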



Telephone Interviews

Although the pretest yielded just one full administration of the adult version of the Denied Applicant Survey, the main substantive difference between the adult version and the infant/child version is the inclusion of five to six items designed to determine a mother’s likely participant category at the time of her application. After exhausting the census of denied applicants from four local agencies, the research team recruited the following to pretest this subset of participant category items:

  • an additional 31 adult denied applicants from LAs across the State; and

  • an additional sample of recently certified, Spanish-speaking adult WIC participants from LAs across the State.

With these two samples, the research team administered an abbreviated version of the adult Denied Applicant Survey including items from the Identity, Residency, and Participant Category sections. The first sample (31 denied applicants from LAs statewide) yielded one English-speaking adult denied applicant (who completed the abbreviated adult Denied Applicant Survey by telephone), but no Spanish-speaking denied applicants. The statewide sample of Spanish-speaking, recently certified adult WIC participants yielded three respondents who completed the abbreviated version of the adult Denied Applicant Survey. (Because the survey was deliberately abbreviated, administration times are not included in estimates of burden for the Denied Applicant Survey.)

Table 22. Abbreviated Adult Version of Denied Applicant Survey Administered by Telephone


Survey administered by telephone to:
Language | Adult Denied Applicant | Adult Certified Participant
English | 1 | 0
Spanish | 0 | 3
Total | 1 | 3



C. Pretesting and Debriefing Procedures

Prior to pretest data collection, the research team conducted two 4-hour training sessions (covering both the Certification Survey and Denied Applicant Survey) with a bilingual FI in the participating State. The training included an overview of the purpose of the study, the pretest, definitions of key terms relevant to the WIC program, and certification procedures; an introduction to the main topics of the survey; a question-by-question review of each instrument (both the Adult and Infant/Child versions of the Certification Survey and Denied Applicant Survey) and each debriefing guide (Appendices F4a and F4b); a review of the informed consent form (Appendices D9a and D9b); and the field procedures for survey administration and tracking of materials.

As with the Certification Survey, both the telephone recruiter and the FI asked each respondent who agreed to participate to collect identification and income documents in preparation for the survey. The recruitment script included the following: “In one part of the interview, I’ll ask you to show me certain documents. It will help speed up the interview if you can gather together a few things before I arrive. These documents include some form of identification [IF INFANT/CHILD: for you and your child], a piece of mail or something that shows your current address, and any pay stubs or documents showing your family’s income in the month before [APPLICATION_DATE] when you had your WIC appointment.” The reminder script included the following: “Remember that it will help speed up the survey if you have some form of ID [IF INFANT/CHILD: for you and your child], something with your current address [IF NECESSARY: like a recent piece of mail you received at your address], and recent pay stubs or bank statements covering the past month or two.”

For each Denied Applicant Survey respondent, the research team prepared a customized paper version of the survey using data provided by the SA. These data included

  • the applicant’s name, address, and contact information (and for infant/child applicants, the name of the parent);

  • the applicant’s most recent date of application; and

  • the applicant category, if available (pregnant, breastfeeding, postpartum, infant or child).

The paper version of this customized survey was mailed to a Spanish-English bilingual FI prior to each scheduled appointment. The FI confirmed the scheduled appointment with each respondent at least 1 day in advance. The FI then administered the informed consent form, obtained the respondent’s consent to participate, and began to administer the survey. The FI read questions aloud from the survey, marked the respondent’s answers on the paper copy, and followed any relevant skip patterns based on these answers. The FI also noted directly on the survey any difficulty the respondent had understanding a question. Notations on the survey prompted the FI to record the start and end times for key sections of the survey. Immediately following the survey, the FI administered the debriefing questionnaire, thanked the respondent, and gave her a $25 Walmart gift card. After leaving the respondent’s residence, the FI reviewed the survey for completion, completed tracking paperwork, and shipped the forms to the research team for analysis.

For the telephone administration of the abbreviated version of the Denied Applicant Survey, TIs were trained on the brief survey, which omitted the detailed income questions contained in the in-person survey. Interviews were monitored by project management staff. Immediately following the survey, the TI administered the debriefing questionnaire, thanked the respondent, and told her a $25 Walmart gift card would be mailed to her. The TIs met with project managers to discuss observations of the survey and debriefing interview.

D. Analysis

For each completed survey and debriefing guide, the research team reviewed the data collected along with FI notations to look for any items where respondents had difficulty understanding the question, providing a valid response, or showing documentation. This review focused on the following:

  • The average burden of the survey across respondents

  • Respondents’ provision of valid identification and proof of residence

  • Respondents’ self-report of the infant/child applicant’s date of birth, or the adult applicant’s responses to questions about pregnant, breastfeeding, or postpartum (and not breastfeeding) status

  • Respondents’ ability to show proof of participation in a program conferring adjunctive (or automatic) income eligibility (i.e., whether or not the SA-provided data indicated that the applicant was adjunctively eligible)

  • Respondents’ ability to show documentation of income sources for relevant family members

  • Respondents’ difficulty understanding terms or questions used in the survey

Section II summarizes data on the burden of the survey for respondents, followed by a qualitative summary of the other analyses in Section III.

II. Pretest Burden Estimates of the Denied Applicant Survey

The average time to administer the phone recruitment was 8 minutes, and the average time to review the consent form was 5 minutes. On average, the Denied Applicant Survey required 39 minutes to administer (Table 23). Administration time differed by language: the English version required 33 minutes on average, whereas the single Spanish interview (n=1) required 64 minutes. No Spanish-speaking adult denied applicants were available, so the administration time for the Spanish adult version could not be determined.

At 39 minutes, the average administration time for the Denied Applicant Survey was about 11 percent longer than the estimated burden of 35 minutes ((39 − 35)/35 ≈ 0.11).

Table 23. Administration Time (mins) for Denied Applicant Survey Pretest, Overall and by Language and Version

|         | Overall | English: Adult Version (n=1) | English: Infant/Child Version (n=3) | Overall (English, n=4) | Spanish: Adult Version (n=0) | Spanish: Infant/Child Version (n=1) | Overall (Spanish, n=1) |
|---------|---------|------------------------------|-------------------------------------|------------------------|------------------------------|-------------------------------------|------------------------|
| Average | 39      | 34                           | 32                                  | 33                     | --                           | 64                                  | 64                     |
| Median  | 34      | 34                           | 30                                  | 32                     | --                           | 64                                  | 64                     |
| Minimum | 28      | 34                           | 28                                  | 28                     | --                           | 64                                  | 64                     |
| Maximum | 64      | 34                           | 39                                  | 39                     | --                           | 64                                  | 64                     |



The average administration time for the abbreviated adult Denied Applicant Survey administered by telephone was 5 minutes (across four respondents).

III. Denied Applicant Survey Pretest Findings and Recommendations

Below, we summarize the results of each of the key sections of the Denied Applicant Survey used to determine potential eligibility for the WIC program.

A. Proof of Identity

All in-person respondents with an infant/child applicant showed valid proof of identity for the infant/child, such as a birth certificate, Social Security card, green card, or a letter from a doctor or government agency showing the infant/child’s name. Four of the five in-person respondents also showed valid proof of their own identity (a driver’s license, passport, or State/tribal-issued identification card). One adult denied applicant presented an insurance bill showing her name and address, but the FI could not confirm the date of this bill or whether it had been postmarked recently (a requirement). (We will address this issue in training for data collectors prior to the start of data collection.)

For the abbreviated telephone survey, three of the four respondents stated that they had the same form of identification that they showed at the time of application. The forms shown included a driver’s license, a Medicaid card, and a “state ID” that was not a driver’s license. The fourth respondent stated that she did not have to show identification because she was previously enrolled in WIC.

B. Proof of Residency

For proof of residency, four of the five in-person respondents presented valid documentation: two showed both a driver’s license with their name and address and either a utility bill, rent/mortgage receipt, or lease with name and address; one showed a medical bill addressed to her at the address on record; and the fourth showed a letter from a government agency with her name and home address.

For the abbreviated telephone survey, two of the four respondents stated that they did not have to show proof of residency at the time of application. The two respondents who did show proof of residency provided utility bills at the time of application.

C. Participant Category

All five denied applicants were able to answer the questions in this section. For all four infant/child denied applicants, the respondent provided the infant/child’s date of birth. The adult denied applicant had recently given birth and recalled that she was breastfeeding at the time of application. Interestingly, the participant category supplied by the SA indicated that she was considered a pregnant applicant at the time of her application. Based on the self-reported date of birth of her infant, however, she gave birth 1 month prior to her application date and therefore should have been categorized as either a breastfeeding or postpartum applicant, not a pregnant applicant. It is unclear, however, whether her self-report of breastfeeding at the time of application is a reliable indicator of her participant category: in contrast to proof of identity, residency, and income eligibility, no documentation is available with which to check a denied applicant’s self-report.

For the abbreviated telephone survey, all four respondents were able to answer the participant category questions. Three respondents reported that they were pregnant at the time of application, and one reported that she had recently given birth. All three of those pregnant at the time of application affirmed that they told the LA that they were pregnant at the time; the respondent who had recently given birth reported her infant’s birthdate and indicated that she was breastfeeding at the time of her application.

D. Adjunctive Eligibility

Three of the five denied applicants indicated that they were not participating in a program that would confer adjunctive eligibility (see Table 24). Two of the five showed a Medicaid card that lacked the dates of participation needed for valid proof of adjunctive income eligibility. Without valid proof of participation in a qualifying Medicaid program, the research team cannot determine with any certainty whether these applicants were adjunctively income-eligible, and therefore it is not possible to determine whether the LA erred in denying the applications.

E. Income Source Documentation

Four of five respondents were able to show income documentation, but for two of these four, the dates on the income documentation cast doubt on the validity of this proof for purposes of determining whether the applicant was correctly or erroneously denied: one respondent showed a pay stub dated 8 months before her application date and reported that her income had not changed by the time of her WIC application; another showed a pay stub dated 8 months after her application date and reported that the income included a $2/hour raise since the time of her application (see Table 24). As in the pretest of the Certification Survey, some respondents were unable to show documentation for wages earned by a family member, for alimony or child support, or for regular cash contributions from others.

F. Debriefing

None of the denied applicants who completed the full Denied Applicant Survey corresponding to their participant category at the time of application reported difficulty answering items in any section, and none reported any unfamiliar terms. The same was true of the denied applicant and the three certified applicants who answered the abbreviated version of the Denied Applicant Survey by telephone.

G. Recommendations

Results of the pretest suggest that respondents to the Denied Applicant Survey may lack sufficient documentation of income and/or of participation in a program that would have conferred adjunctive income eligibility at the time of application. We discuss these results and potential solutions before summarizing other recommendations based on the pretest results.

Lack of documentation to determine income eligibility (adjunctive or otherwise). Lack of documentation for some or all sources of the family economic unit’s income (n=4 respondents), together with the absence of valid documentation (i.e., documentation with dates of participation) for participation in a program that may have conferred adjunctive income eligibility at the time of application, presents a major risk to the study.

For each study participant who is not adjunctively income-eligible and who cannot produce documentation of income, it will not be possible to determine with sufficient confidence whether the applicant met income eligibility requirements. Lack of documentation means that data on the gross or net amount of income at the time of certification may be inaccurate (i.e., if self-reported) or missing. Inaccurate or missing data will jeopardize the determination of which cases may have been erroneously denied. Similarly, lack of documentation for participation in SNAP, TANF, or Medicaid at the time of application will make it difficult to determine whether an applicant was adjunctively income-eligible at the time of application. It is important to recall the following:

  1. The median number of days elapsed between pretest respondents’ application dates and their interviews was 215 days (the maximum was 279 days).

  2. The recruitment and reminder scripts used in the pretest asked the respondent to gather income documents in preparation for the in-person survey.

Possible Solutions. To address this lack of documentation, we would like to discuss the following possible remedies with FNS. First, we suggest limiting the dates of application for sampled denied applicants to three months prior to the target date of the survey administration; the longer the time that elapses between the survey administration and the application date, the higher the risk that the respondent will no longer have the necessary documentation available. Because denied applicants are rare, and because erroneous denials are therefore a small portion of the total dollar error resulting from the sum of over- and underpayments, limiting the initial sample of denied applicants will have little effect on the overall precision of estimates for dollar error.

Second, as with the Certification Survey, we propose to strengthen the language in the recruitment script to emphasize the importance of income documentation for all family members, dated within the same or an adjacent month of the date of application. For example, we propose to modify the recruitment script to:

  • Ask whether or not the respondent showed proof of income to their LA, and if so, ask the respondent to collect the same documents; and/or emphasize the importance of collecting income documents dated within the month that the respondent applied for WIC.

  • Expand the list of documents that the FI may need to see. This list should include any award letter or notice of benefits showing enrollment and expiration dates for participation in SNAP, TANF, and/or Medicaid at the time of application; pay stubs for all family members earning wages or a salary; and documentation of income (for all family members) from alimony, child support agreements, Social Security, Federal or State SSI, etc.

  • Ask respondents to have copies of bank statements and tax returns available, in case any income documentation (i.e., showing the gross income) is unavailable. If the preferred documents are not available (e.g., missing pay stubs for wages), a bank statement showing the amount of income deposited into a participant’s checking account is preferred to no documentation.

Although it is uncertain how effective a revised recruitment script might be in improving the availability of income documentation, FNS has agreed to the following (a sketch of the resulting decision rule follows the list):

  1. Some documentation of income sources is preferred to no documentation, even if the available document(s) do not meet all of the criteria that a local WIC agency might require. For example, a bank statement showing an amount of wages deposited into a checking account after withholdings (e.g., for FICA or income taxes) would not provide the gross income amount required by the WIC program; but in the absence of documentation of the gross income, documentation of the net income is preferred to self-reported income (i.e., no documentation);

  2. If no documentation is available for a particular income source, the research team will use self-reported income in study analyses and will note the prevalence of missing income documentation and the limitations of self-reported income.
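
To illustrate the agreed-upon hierarchy, here is a minimal sketch of how it could be encoded as an analysis decision rule, using assumed labels rather than the study’s actual analysis code:

```python
# Preference order agreed with FNS: gross-income documentation first,
# net-income documentation (e.g., a bank statement) next, and
# self-reported income only when no documentation exists.
PREFERENCE = ("gross_documented", "net_documented", "self_reported")

def best_income_evidence(available: set) -> str:
    """Return the most preferred form of income evidence available for
    one income source; self-report is flagged in the limitations note."""
    for level in PREFERENCE:
        if level in available:
            return level
    raise ValueError("every income source should at least have a self-report")

# Example: a respondent shows only a bank statement (net income).
assert best_income_evidence({"net_documented", "self_reported"}) == "net_documented"
```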



Recommendation: Drop the “Alternate Income Reference Period” procedure. As described above for the Certification Survey, the revised draft of the Denied Applicant Survey (3.2.2) included procedures to review a respondent’s income sources for a period of 3 to 6 months prior to the application date if preliminary results of the survey suggest that the WIC participant should have been deemed eligible due to income, based on current income sources (defined as the 30 days prior to application date). Having completed a pretest of the Denied Applicant Survey without this procedure, we recommend dropping the Alternate Income Reference Period procedure for the following reasons:

  1. Results of the pretest indicate that respondents had difficulty providing documentation of income dated within 30 days of the application date; requesting additional income documentation for earlier periods of time would likely yield little additional data while imposing added administration time;

  2. As explained in the draft survey “note for reviewers,” Section 246.7(d)(2)(i) of the Federal WIC regulations gives SAs and LAs flexibility to make independent and non-replicable decisions about what time frame for assessing family income is most accurate.12 The research team is concerned that any implementation of an alternate income reference period would not accurately reflect LA procedures for determining income eligibility of applicants.

Recommendation: Drop questions about the adult applicant’s likely participant category. To reduce the administration time and burden of the Denied Applicant Survey, we propose eliminating questions about the applicant’s likely participant category from the adult Denied Applicant Survey. These items do not contribute all of the data necessary to determine whether an applicant was correctly or erroneously denied (self-reported data on an applicant’s pregnancy or breastfeeding status at the time of application can neither confirm nor disconfirm an LA’s determination of the applicant’s participant category). For this reason, we opted not to include similar items in the Certification Survey, as previously discussed with and approved by FNS. Our recommendation to eliminate the questions about participant category from the adult Denied Applicant Survey rests on the logic of that previous decision and on the burden concerns raised by the pretest findings on administration time for this survey.



Table 24. Income Sources and Documentation for Respondents to the Denied Applicant Survey Pretest

| Version (ENG=English, SP=Spanish) | Able to Show Proof of Adjunctive Income Eligibility? | Type of Proof for Adjunctive Income Eligibility | Income Sources Reported With Valid Documentation | Income Sources Reported Without Valid Documentation |
|---|---|---|---|---|
| Infant/Child, SP | Partial (invalid) | Medicaid certification card (no date) | Wages, salary, or fees (applicant’s parent); wages, salary, or fees (family member) | Alimony or child support |
| Infant/Child, ENG | N | (none; self-reported participation in MassHealth) | Wages, salary, or fees (applicant’s parent) | -- |
| Infant/Child, ENG | N | N/A | -- | Wages, salary, or fees (applicant’s parent); documentation was dated more than 8 months prior to the date of application for WIC, and the respondent self-reported that income had not changed since the date of documentation |
| Infant/Child, ENG | Partial (invalid) | Medicaid certification card (no date) | -- | Regular contributions from someone not in the household; wages, salary, or fees (family member) |
| Adult, ENG | N/A | N/A | -- | Wages, salary, or fees (applicant’s family member); documentation was dated more than 8 months after the date of application for WIC and reflected a $2/hour raise |





Program Experiences Survey

I. Program Experiences Survey Pretest Methodology

A. Instrument Description

The Program Experiences Survey (Appendices B5a, B5b, B5c, and B5d) will collect data on WIC participants’ program experiences, participation in other programs, food security, and other characteristics not available from administrative data. The Program Experiences Survey was created by incorporating and modifying questions from NSWP-II. Some questions are new to the NSWP-III survey.

B. Respondent Selection

Current participants certified within the past 6 months were recruited for the Program Experiences Survey. The research team requested a sample of current WIC participants from the Massachusetts SA, which provided a sample of current participants who had been certified within the prior 6 months, but no more recently than 6 weeks. Participant data were delivered to the research team on December 15, 2016, from the initially selected Boston-area LA. The research team recruited from the sampled current participants between January 26, 2017, and March 11, 2017, using a programmed Program Experiences Survey Invitation Telephone Script. The research team made up to five telephone calls in an attempt to reach each potential respondent and left voicemails by the fourth call attempt.

C. Pretesting and Debriefing Procedures

Trained TIs administered the survey using paper copies that had the same questions, response options, and prompts as the questionnaires that will be programmed for use with the full study sample. After verbally obtaining informed consent by telephone, the TIs administered the survey, reading each question aloud, following interviewer instructions, and writing down the responses indicated by the WIC participant. The TIs recorded the total duration of the survey. Interviews were conducted by telephone with a total of nine current WIC participants, covering all versions of the Program Experiences Survey: two English Version A interviews with adult WIC participants; three Spanish Version A interviews with adult WIC participants; two English Version B interviews with adult applicants of infant/child WIC participants; and two Spanish Version B interviews with adult applicants of infant/child WIC participants. In total, four surveys were conducted in English and five in Spanish; five used Version A (Adult) and four used Version B (Infant/Child). Table 25 provides a summary of the Program Experiences Survey pretest timeline.

The TIs conducted a debriefing interview with each respondent immediately following the completion of the Program Experiences Survey to identify any questions that were confusing or difficult to answer (Appendices F5a and F5b). After completing the survey and debriefing interview, each pretest participant was mailed a $25 Walmart gift card. Feedback from pretest respondents was used to improve the instrument, which will be programmed for computer assisted telephone interviewing (CATI) for use with the full study sample.

Table 25. Program Experiences Survey Pretest Timeline

| Respondent | Completed Survey  | Version                 | Language |
|------------|-------------------|-------------------------|----------|
| PES-1      | January 30, 2017  | Version B: Infant/Child | English  |
| PES-2      | February 15, 2017 | Version A: Adult        | Spanish  |
| PES-3      | February 16, 2017 | Version B: Infant/Child | Spanish  |
| PES-4      | February 21, 2017 | Version A: Adult        | Spanish  |
| PES-5      | February 21, 2017 | Version A: Adult        | Spanish  |
| PES-6      | March 2, 2017     | Version A: Adult        | English  |
| PES-7      | March 9, 2017     | Version B: Infant/Child | Spanish  |
| PES-8      | March 10, 2017    | Version B: Infant/Child | English  |
| PES-9      | March 11, 2017    | Version A: Adult        | English  |



D. Analysis

The data collected during instrument pretesting, as well as feedback received during debriefing interviews, were evaluated and discussed by the research team. The research team has provided recommendations, supported by the analysis of respondent feedback, for each instrument, following the reported results in each section. Findings are presented for each of the pretested instruments individually in the sections that follow.

II. Pretest Burden Estimates of the Program Experiences Survey

The Program Experiences Survey was estimated to take 40 minutes to complete. The TIs tracked approximately how long respondents took to complete the survey (see Table 26). Response times ranged from 28 minutes to 48 minutes, with an average of 38 minutes. Based on these findings, we recommend revising the burden estimate from 40 to 38 minutes.

Table 26. Administration Time (mins) for Program Experiences Survey Pretest, Overall and by Language and Version

| Minutes | Overall (n=9) | English: Adult Version (n=2) | English: Infant/Child Version (n=2) | Overall (English, n=4) | Spanish: Adult Version (n=3) | Spanish: Infant/Child Version (n=2) | Overall (Spanish, n=5) |
|---------|---------------|------------------------------|-------------------------------------|------------------------|------------------------------|-------------------------------------|------------------------|
| Average | 38            | 32                           | 42                                  | 37                     | 39                           | 38                                  | 38                     |
| Median  | 35            | 32                           | 42                                  | 35                     | 35                           | 38                                  | 35                     |
| Minimum | 28            | 31                           | 38                                  | 31                     | 33                           | 28                                  | 28                     |
| Maximum | 48            | 32                           | 45                                  | 45                     | 48                           | 48                                  | 48                     |


General feedback on the Program Experiences Survey was collected from respondents during the debriefing interviews. Despite the difficulties noted by the TIs or observed by the call monitor (described in detail below), all nine respondents reported that the survey questions were easy to answer, that none of the questions were confusing or difficult, and that none included unfamiliar words. None of the nine respondents offered suggestions for improving the Program Experiences Survey.

III. Program Experiences Survey Pretest Findings and Recommendations

Based on the feedback received on the Program Experiences Survey from the nine current WIC participants with whom the survey was pretested, the research team suggests several general changes to this instrument, along with some specific changes. These suggested changes appear in Table 27. All recommendations described below are for the English version of the Program Experiences Survey unless otherwise specified. Revisions to the English version will be made, when appropriate, in the corresponding Spanish version.

Table 27. Program Experiences Survey Pretest Feedback

Question #

Respondent Comments

Recommended Instrument Revisions

Q1

Two respondents to Version B: Infant/Child of the Program Experiences Survey were confused by this question and answered “currently participating.”

  1. Reword Q1 (in Version B) to “Is this the first time you’ve received benefits for your child, or has your child participated before this time?”

Q7

Four respondents hesitated when answering this question and one of them asked, “What do you mean?” In the Spanish version of the survey, the TIs noted a lack of understanding for some of the Spanish words used.

  1. Reword question to be more specific and clear.

  2. Revise Spanish translation of this question.

Q8

Five respondents answered this question with a current date (possibly giving the date of their or their child’s last WIC appointment).

  1. Reword question to better define what is meant by “last time.”


Q12

Six respondents struggled to learn and use the scale provided in this question. Even with repetition of the scale by the TI, respondents found this question to be difficult. One respondent asked what “satisfied” meant.

  1. Explore revision of scale. However, changing the scale will hinder comparisons with NSWP-II on this question.

Q13

Five respondents had difficulty using and remembering the scale in this question. The question also assumes that the respondent knows about each of the services listed and has had some experience using them. TIs often had to explain at least one of the services to the respondent.

  1. Add a question before Q13 to ask which services the respondent uses or has used before asking them to rate the program.

Q15

All respondents hesitated when answering this question. Several asked for examples and others would say “everything” or “the help.”

  1. Design more effective probes.

Q16

Two of the four Spanish-speaking respondents did not understand this question.

  1. Revise Spanish translation of this question.

Q17A

Only three respondents were able to provide an estimate, in miles, of the distance from their home to the WIC office. Respondents found it easier to provide an answer in minutes.

  1. Measure distance using time combined with transportation mode, not using miles.

Q18

Eight respondents struggled to answer this question. TIs noted that the question, as worded, does not make sense and would be clearer if reworded. The scale for this question may also need to be revised. TIs also noted that the example in (d) is long and distracting; several respondents answered “yes” instead of using the scale after hearing the example.

  1. Delete the example provided in (d).

  2. Reword question.

Q20A

Six respondents struggled with this question. Some respondents were unsure of the question and others had difficulty using the scale despite being reminded by the TI.

  1. Reword question.

  2. Simplify scale.

Q30

Only three respondents were able to provide a response to how far in miles the store they used was from their home. One respondent gave a range of miles, the second respondent provided a range of minutes, and the third respondent responded with “I don’t know. It’s not far.”

  1. Measure distance using time combined with transportation mode, not using miles.

Q33A

Only one respondent reported “sometimes not having enough food” in response to Q33; therefore, the food security questions were pretested with only one respondent. The TI noted that the respondent was very confused by the questions in the Q33A series.

  1. Revise items in Q33A to read as questions.

Q38A and Q38B

TIs noted that this question caused some confusion with respondents and was difficult to administer. The ordering of “currently” and “ever” was cumbersome.

  1. Reorder the columns. Place column Q38B “ever” before column Q38A “currently.” If a respondent says “no” to “ever,” skip the “currently” item (a sketch of this reordered skip logic follows this table).

Q47

Three respondents were unable to answer the question about their child’s height and one respondent refused. The need for this question was not substantiated by the guidance in the PWS.

  1. Delete question.

Q48

Three respondents were unable to answer the question about their child’s weight and one respondent refused. The need for this question was not substantiated by the guidance in the PWS.

  1. Delete question.
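
To make the Q38 recommendation above concrete, the following is a minimal sketch of the reordered ever/currently skip logic, assuming hypothetical question keys and scripted responses in place of live CATI input; it is not the actual CATI program:

```python
# Reordered Q38 logic: ask "ever" (Q38B) first; ask "currently" (Q38A)
# only when the respondent has ever participated in the program.
def ask_q38(program: str, answers: dict) -> dict:
    """Apply the ever-then-currently ordering for one program.

    `answers` maps hypothetical question keys to scripted yes/no
    responses, standing in for live interviewer input."""
    ever = answers[f"{program}_ever"]
    # Skip the "currently" item entirely on a "no" to "ever".
    currently = answers.get(f"{program}_currently", False) if ever else False
    return {"program": program, "ever": ever, "currently": currently}

# Example: a respondent who has used SNAP and still does, but never TANF.
scripted = {"SNAP_ever": True, "SNAP_currently": True, "TANF_ever": False}
for program in ("SNAP", "TANF"):
    print(ask_q38(program, scripted))
```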


The research team also pretested the Program Experiences Survey Invitation Telephone Script in the field with respondents. Respondents were not specifically asked about the narrative and screening questions; instead, TIs made notes based on the types of reactions and questions they received during pretest recruitment. The research team suggests several general and specific changes to this script, shown in Table 28. All recommendations are for the English version of the Program Experiences Survey Invitation Telephone Script unless otherwise specified; revisions to the English version will be made, when appropriate, in the corresponding Spanish version. On average, the recruitment script took 7 minutes to deliver, and the research team recommends revising the estimated average respondent burden from 5 minutes (0.08 hours) to 7 minutes (0.12 hours).

Table 28. Program Experiences Survey Invitation Telephone Script Pretest Feedback

Appendix ID and Language: C19.a (English); C19.b (Spanish version of the same script)

Document Name: Program Experiences Survey Invitation Telephone Script

Research Team Comments:

Q1. TIs felt the introduction should be brief and should identify whether the respondent was a WIC participant right away, because many respondents contacted would hang up during the introduction or interrupt the TI to ask about the purpose of the call.

Q2. TIs reported that respondents reacted to this question with hesitation or concern.

Q6. TIs reported that the mention of a gift card appeared late in the recruitment script and would be better placed at the beginning of the introduction to the survey.

Recommended Revisions:

  1. Revise the introduction narrative in Q1.

  2. Reword Q2 to “Is now a good time to talk, and are you in a place where you feel comfortable talking?”

  3. Move the sentence about receiving a gift card toward the beginning of the script in Q6.



Former Participant Case Study

I. Former Participant Case Study Pretest Methodology

A. Instrument Description

The Former Participant Case Study consists of a qualitative semi-structured interview with a purposive sample of inactive WIC participants (i.e., those who have stopped redeeming WIC benefits prior to the end of their certification period) and is designed to examine the barriers and facilitators to WIC program retention. The instrument also seeks to determine reasons why a participant stopped redeeming benefits within their period of eligibility. The Former Participant Case Study provides a novel perspective to the NSWP series of studies.

B. Respondent Selection

The Former Participant Case Study interviews were conducted with former WIC participants who are not using WIC benefits or services, even though they have not yet reached the end of their certification period or otherwise remain eligible for WIC. The research team requested samples of former WIC participants from the Massachusetts SA. The Massachusetts SA provided the research team with a sample of “inactive” participants (identified by the termination code “no FI pickup - 2 months”). Participant data were delivered to the research team on December 15, 2016, from the initially selected Boston-area LA. The research team recruited from this sample, and from additionally sampled former participants, between February 15, 2017, and March 13, 2017, using a programmed Former Participant Case Study Interview Invitation Telephone Script. The research team made up to five telephone calls in an attempt to reach each potential respondent and left voicemails on every other call attempt.

C. Pretesting and Debriefing Procedures

The Former Participant Case Study interviews were conducted by telephone and in English (Appendix B6a) or Spanish (Appendix B6b), as appropriate. TIs pretested paper versions of the interview guide. After verbally obtaining informed consent by telephone, the TIs administered the survey, reading each question aloud, following interviewer instructions, and audio recording the interview, when permitted by the respondent. If permission was not given to audio-record the interview, the call monitor was available to take notes. However, none of the respondents refused to be recorded. The TIs recorded the full duration of the surveys.

Telephone interviews were conducted with a total of nine selected former WIC participants, using a single instrument designed for administration in both adult and infant/child participant cases. A total of five Former Participant interviews were conducted in English and four in Spanish. Respondents included four adults who had participated in the WIC program themselves and five adults whose children had participated. Table 29 provides a summary of the Former Participant Case Study pretest timeline.

The TIs conducted a debriefing interview with each respondent immediately following the completion of the Former Participant Case Study interview to identify any questions that were confusing or difficult to answer (Appendices F6a and F6b). After completing the interview and debriefing, each respondent was mailed a $25 Walmart gift card. Feedback from pretest respondents was used to improve the instrument, which will be programmed for CATI administration for use with the full study sample.

Table 29. Former Participant Case Study Pretest Timeline

| Respondent | Completed Survey  | Participant Type | Language |
|------------|-------------------|------------------|----------|
| FPS-1      | February 15, 2017 | Infant/Child     | English  |
| FPS-2      | February 21, 2017 | Adult            | Spanish  |
| FPS-3      | February 23, 2017 | Infant/Child     | Spanish  |
| FPS-4      | February 27, 2017 | Infant/Child     | English  |
| FPS-5      | March 1, 2017     | Adult            | English  |
| FPS-6      | March 3, 2017     | Infant/Child     | English  |
| FPS-7      | March 10, 2017    | Adult            | English  |
| FPS-8      | March 11, 2017    | Adult            | Spanish  |
| FPS-9      | March 13, 2017    | Infant/Child     | Spanish  |


D. Analysis

The data collected during instrument pretesting, as well as feedback received during debriefing interviews, were evaluated and discussed by the research team. The research team has provided recommendations, supported by the analysis of respondent feedback, for each instrument, following the reported results in each section. Findings are individually presented for each of the pretested instruments in the sections that follow.

II. Pretest Burden Estimates of the Former Participant Case Study

The Former Participant Case Study interview was estimated to take 20 to 40 minutes to complete. The TIs tracked approximately how long respondents took to complete the interview (see Table 30). Response times ranged from 20 minutes to 36 minutes, with an average of 27 minutes. Because most respondents took more than the estimated minimum of 20 minutes but well under the estimated maximum of 40 minutes, the research team recommends revising the completion-time estimate to 30 minutes.

Table 30. Administration Time (mins) for Former Participant Case Study Pretest, Overall and by Language and Version

| Minutes | Overall (n=9) | English: Adult Version (n=2) | English: Infant/Child Version (n=3) | Overall (English, n=5) | Spanish: Adult Version (n=2) | Spanish: Infant/Child Version (n=2) | Overall (Spanish, n=4) |
|---------|---------------|------------------------------|-------------------------------------|------------------------|------------------------------|-------------------------------------|------------------------|
| Average | 27            | 29                           | 25                                  | 26                     | 23                           | 32                                  | 28                     |
| Median  | 25            | 29                           | 20                                  | 25                     | 23                           | 32                                  | 27                     |
| Minimum | 20            | 25                           | 20                                  | 20                     | 21                           | 28                                  | 21                     |
| Maximum | 36            | 33                           | 34                                  | 34                     | 25                           | 36                                  | 36                     |

General feedback on the Former Participant Case Study was collected from respondents during the debriefing interviews. Despite the difficulties noted by the TIs (described in detail below), all nine respondents reported that the survey questions were easy to answer, that none of the questions were confusing or difficult, and that none included unfamiliar words. None of the nine respondents offered suggestions for improving the Former Participant Case Study.

III. Former Participant Case Study Pretest Findings and Recommendations

Based on the feedback received during the Former Participant Case Study interviews with the nine former WIC participants with whom the interview guide was pretested, the research team suggests several general changes to this instrument, along with some specific changes. These suggested changes appear in Table 31. All recommendations described below are for the English version of the Former Participant Case Study unless otherwise specified. Revisions to the English version will be made, when appropriate, in the corresponding Spanish version.

Table 31. Former Participant Case Study Pretest Feedback

Question #

Respondent Comments

Recommended Instrument Revisions

Q1

The first respondent to the Former Participant Case Study seemed to have difficulty transitioning into the more in-depth questions. Demographic questions generally work better toward the end of a survey and were moved there for the remaining pretest respondents.

  1. Move the former Q10 to replace Q1 to provide a smoother transition into the questions and build rapport with the respondent.

  2. Move demographic questions (previously Q1-Q6) to the end of the survey (now Q34-Q39).

All question number references from this point forward refer to the renumbered questions in the instrument provided with this memo.

Q3

Five respondents reported participating in the WIC program at least once before the most recent time. Throughout the nine interviews, TIs noted that respondents would often talk about previous experiences as well as recent experiences in the program.

  1. Provide the respondent with a short narrative before the survey begins that reminds them of their most recent certification. The narrative would include the most recent certification date and the associated certification end date, and would incorporate the termination date (for not picking up food instruments for the past 2 months). The TI may repeat this narrative when appropriate.

Q3

All nine respondents provided the date of the first time they had ever participated in the program. Three respondents were confused by the term “WIC clinic”; one mentioned that they had never heard that term used before. During the debriefing interviews, these respondents stated that they use the term “WIC office.”

  1. Reword to “When is the last time you [or your child] participated in WIC?”

  2. Reword question. Use “WIC office or clinic” instead of “WIC clinic.” Make this change throughout the instrument.

Q8

Eight respondents reported using “food benefits” while they were in the program; however, the TIs often had to first explain what was meant by “food benefits.”

  1. Program the survey so that the type of food benefit (food voucher check or EBT card) administered in the respondent’s State will appear as an example (a sketch of this substitution follows this table). Make this change globally. Apply the change to “WIC benefits.”

Q9

Five respondents who completed the survey were adult applicants of infant/child participants and were selected for the survey on behalf of their child. These respondents often described their own experience in the program, especially in response to this question.

  1. Tailor questions, when appropriate, so that the wording reminds the respondent of the perspective sought.

Q14Aa and Q14Ab

Six respondents mentioned the number of appointments available during their preferred time frame. The TIs often reworded the example “participant work schedule” to “your work schedule” to make it more understandable to the respondent.

  1. Reword question. Use “WIC office or clinic” instead of “WIC clinic.”

  2. Add “number of appointments available during your preferred time frame” to the examples.

  3. Reword example from “participant work schedule” to “your work schedule.”

Q14B

Six respondents stated that they brought their children to the WIC appointments.

  1. Reword question to “What did you do with your child(ren) while you were at your appointment? Probe: Did that make it easier or more difficult for you to follow through with your appointment?”

Q14C

Six respondents hesitated before answering this question, indicating to the TI that there was some confusion about the intent of the question. In fact, one respondent reported during their debriefing interview that the question was “worded weirdly.” This respondent recommended changing the wording. Some of the terms used in the Spanish translation will be clarified as well.

  1. Reword question to “How did you usually get to your WIC office or clinic? Was that easy or difficult? Can you tell me more about that?”

  2. Revise Spanish translation of this question.

Q16

Seven respondents provided the TI with specific names of stores instead of describing the type of store. One respondent was confused by the question because it was written as a statement, not a question.

  1. Reword question to “What types of stores did you shop at when using your WIC benefits (voucher or EBT card)? PROBE: What kinds of stores are those? [EXAMPLES: grocery stores, farmers’ markets, convenience stores, commissaries, etc.]”

Q17

Respondents reacted to this question much as they did to Q14C, described above. TIs also recommended revising the order of this question and suggested moving Q20 to follow Q17 so that the respondent is presented with a logical flow of questions, beginning with where they shopped and how they got there. The TI who administered the Spanish versions of the survey noted that the translation of this question is complex and broad and recommended choosing simpler words.

  1. Reword question to “How did you usually get to the store where you used your WIC benefits? Was that easy or difficult? Can you tell me more about that?”

  2. Revise translation of this question.

  3. Move Q20 to follow Q17.

Q18

Four respondents provided the TI with the same or similar responses as were provided in Q18, often making statements such as “Like I said…” and “Again,…” During one debriefing interview, one respondent specifically stated that Q22 seemed repetitive.

  1. Replace the wording of Q18 with the wording of Q22.

Q21

Several respondents hesitated before answering this question, indicating to the TI that there was some confusion about the intent of the question. The TI often read the succeeding question Q23A as a prompt.

  1. Reword Q23 to “Next, I would like to ask you about your experience with the staff at the store(s) where you usually used your WIC benefits. How would you describe the way the staff treated you?”

Q20

The TI often read the preceding question Q23A as a prompt for Q23.

  1. Combine Q23A into Q23.

Q22

Several respondents hesitated before answering this question, indicating to the TI that there was some confusion about the intent of the question. The TI often read the succeeding question Q24A as a prompt.

  1. Reword Q24 to “Now I would like to ask you about the other customers at the store(s) where you usually used your WIC benefits. How would you describe the way the other customers treated you?”

  2. Program the survey so that the type of food benefit (food voucher check or EBT card) that the respondent’s State administered will appear instead of “WIC benefit.” Make change globally.

Q24A

The TI often read the preceding question Q23A as a prompt for Q24.

  1. Combine Q24A into Q24 as described above.

Q23

In a previous question, eight respondents reported using “food benefits” while they were in the program; however, the TIs often had to first explain what was meant by “food benefits.” When administering the Spanish version of the survey, the need for a simplified translation was evident.

  1. Reword to “What could be done to improve the shopping experience and using WIC benefits at stores for other WIC participants?”

  2. Program the survey so that the type of food benefit (food voucher check or EBT card) that the respondent’s State administered will appear instead of “WIC benefit.” Make change globally.

  3. Revise Spanish translation.

Q24

Though eight respondents provided a response to the question, the opening statement, “For what reasons…,” may come across as harsh. Additionally, four respondents reported that they did not leave the program but that they or their child were no longer eligible to receive WIC benefits. One respondent reported that they no longer had custody of their child. While neither of these circumstances results in a revision to the survey, the research team will revise the protocol for the Former Participant Case Study sample.

  1. Add a probe when appropriate “Can you explain why you think that you are [eligible or not eligible]?”

  2. Revise some wording in the Spanish translation.

Q29

Though four respondents provided a response to the question, the opening statement “What would stop you…” sounds abrupt. The probe was also ineffective. The five respondents who did not provide a sufficient response to this question responded with “I don’t know.”

  1. Reword question to “Is there anything that would stop you from participating in the WIC program again?”

  2. Add additional probes.

Q30

Four respondents were confused by this question. Although the question asks whether the respondent has ever participated, or is currently participating, in any of the listed programs, many respondents asked, “Do you mean now or at all?” TIs recommended adding a timeframe to help frame the question. Also, when the TI read the response options “NSLP” and “SBP,” many respondents asked whether those meant free and reduced-price meals.

  1. Reword question to "[Were you OR Was your child] enrolled in any of the following programs between [CERT DATE] and [CERT END DATE]?”

  2. Group NSLP and SBP response options together.

  3. Add “also known as free and reduced-price meals at school.”

Q30A

All nine respondents were confused by this question. The question is worded in a way that presumes that these programs influenced the respondent’s decision to leave the WIC program.

  1. Reword question to “Did your participation in these programs influence your decision to leave the WIC program? PROBE: Can you tell me more about that?”

Demographics Section

Some respondents refused to answer these questions; however, a “refused to answer” response option is missing.

  1. Add “refused” as a response option for questions in the “Demographics” section.

Q35

One respondent stated that they had not attended school and another respondent answered “8th grade.”

  1. Add “Less than high school” to the response options (to match the categories of educational attainment used by the Census13).

Q36

All nine respondents hesitated when answering this question. TIs recommended adding a prompt to encourage their response.

  1. Add “Would you say…” as a prompt to follow the statement.
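
Several rows above recommend programming the survey to display the State-administered benefit type in place of generic wording. A minimal sketch of that substitution follows, with an illustrative (not factual) State-to-benefit mapping:

```python
# Illustrative mapping only; in practice the benefit type per State would
# come from SA-provided program data, not a hard-coded table.
STATE_BENEFIT = {
    "MA": "WIC EBT card",
    "XX": "food voucher check",  # placeholder State code
}

def fill_benefit_wording(question: str, state: str) -> str:
    """Replace a generic [WIC BENEFIT] token with the State's benefit type,
    falling back to the generic phrase when the State is unknown."""
    return question.replace("[WIC BENEFIT]", STATE_BENEFIT.get(state, "WIC benefits"))

print(fill_benefit_wording("Did you use your [WIC BENEFIT] at any stores?", "MA"))
```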



The research team also pretested the Former Participant Case Study Interview Invitation Telephone Script recruitment materials, which were programmed in Qualtrics/WinCATI, in the field with respondents. Respondents were not specifically asked about the narrative and screening questions; instead, TIs made notes based on the types of reactions and questions they received during pretest recruitment. The research team suggests several general and specific changes to this script, shown in Table 32. Revisions to the English version of the Former Participant Case Study Invitation Telephone Script will be made, when appropriate, in the corresponding Spanish version. On average, the recruitment script took 9 minutes to deliver, and the research team recommends revising the estimated burden from 5 minutes (0.08 hours) to 9 minutes (0.15 hours).

Table 32. Former Participant Case Study Interview Invitation Telephone Script Pretest Feedback

Appendix ID and Language: C23.a (English); C23.b (Spanish version of the same script)

Document Name: Former Participant Case Study Interview Invitation Telephone Script

Research Team Comments:

Q1. TIs felt the introduction should be brief and should identify whether the respondent was a former WIC participant right away, because many respondents contacted would hang up during the introduction or interrupt the TI to ask about the purpose of the call.

Q2. TIs reported that respondents reacted to this question with hesitation or concern.

Q6. TIs reported that the mention of a gift card appeared late in the recruitment script and would be better placed at the beginning of the introduction to the survey.

Recommended Revisions:

  1. Revise the introduction narrative in Q1.

  2. Reword Q2 to “Is now a good time to talk, and are you in a place where you feel comfortable talking?”

  3. Move the sentence about receiving a gift card toward the beginning of the script in Q6.



Discussion and Conclusion

I. Lessons Learned for Full Data Collection

Several lessons were learned during this pretest data collection that will help prepare for the larger data collection in 2018.

A. Obtaining Data from State and Local Agencies

Our experience with obtaining data from SAs demonstrates the need to start this process early. Once notified by FNS, the SAs will need to be approached by the research team to identify the person who will work with us to provide participant data. This process will involve getting DUAs in place and communicating data needs. The research team may need the assistance of FNS to help convince SAs that their participation is necessary; FNS’ involvement was important in responding to the Massachusetts and Illinois SAs’ concerns.

The data from each SA are likely to vary in how they are categorized, organized, and defined. Prior to the start of 2018 data collection, the research team recommends making two trial runs to pull data from each participating SA to ensure that the research team’s data specifications are communicated clearly and that the SA can provide the data as specified. If permitted prior to OMB approval of the data collection for the study, the research team proposes to make these trial runs on a rolling basis as each SA executes a DUA with the research team. Once a DUA is in place, the research team will conduct the first trial run, process the data as if setting up for data collection, and examine the data to ensure that all specifications were met. The team will follow up with each SA to clarify any questions and then proceed with the second trial run. The experience will be used to build a set of decision rules regarding the data collected from the participating SAs.
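
A minimal sketch of the kind of specification check a trial run could apply to each SA extract; the required column names here are hypothetical stand-ins for the actual data specifications:

```python
import csv

# Hypothetical required fields for an SA participant extract.
REQUIRED_COLUMNS = {
    "participant_id", "name", "address", "phone",
    "certification_date", "certification_end_date", "participant_category",
}

def check_extract(path: str) -> list:
    """Return a list of specification problems found in one SA data file."""
    problems = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            problems.append(f"missing columns: {sorted(missing)}")
        for line_no, row in enumerate(reader, start=2):
            if not row.get("certification_date"):
                problems.append(f"row {line_no}: empty certification_date")
    return problems
```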

This process will be repeated in October 2017 with a larger, if not complete, set of participating SAs’ data, which will be examined in the same way. If our experience with Massachusetts is indicative of other SAs, obtaining additional data is a minor burden once the data pulls have been defined by IT staff. The research team will assess its preparedness at the end of the second trial run and, if needed, will perform a third trial run prior to the study’s launch date.

Massachusetts was able to provide data that were 1 or 2 days old, and in our discussions with other SAs, most had MIS systems that were updated daily. For the Certification and Denied Applicant Surveys in particular, we recommend pulling the most recent data available to increase the likelihood that we can observe the same types of proofs that respondents brought to their WIC clinic.

B. Recruiting Respondents

Difficulty in recruiting respondents was anticipated by the research team and reinforced by the experiences of the Expert Panel. Several protocols that were not used in this pretest will be employed in the full study (a sketch of the call-attempt rules follows the list). These include:

  • Introduction letter when feasible

  • Expanded evening and weekend calling

  • Voicemails for all non-respondents

  • Reminder post card

  • A voicemail on the seventh call stating that if the respondent does not answer the next call or call us back, we may visit them for an in-person interview

  • A door knock for selected non-respondents (Program Experiences Survey, Certification Survey, and Denied Applicant Survey)
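
A sketch of how the call-attempt rules above could be encoded; the eight-attempt cap is an assumption for illustration, and details such as calling windows are omitted:

```python
MAX_ATTEMPTS = 8  # assumed cap; the pretest itself used up to five calls

def next_action(attempt: int, reached: bool, door_knock_eligible: bool) -> str:
    """Decide the follow-up step after call attempt `attempt` (1-indexed)."""
    if reached:
        return "conduct interview"
    if attempt == 7:
        return "leave voicemail noting a possible in-person visit"
    if attempt >= MAX_ATTEMPTS and door_knock_eligible:
        # Program Experiences, Certification, and Denied Applicant Surveys only
        return "assign door knock"
    return "leave voicemail and schedule the next call"

print(next_action(7, reached=False, door_knock_eligible=True))
```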

In addition to the protocols described above, the research team also recommends the following changes to protocol:

  • Change our voicemail message so that we identify ourselves and state that we are calling on “behalf of a Federal food program that you may be participating in” or something similar. The research team understands the need to maintain privacy; however, the current voicemail messages do not distinguish our calls from those of a marketing research company or telemarketing firm.

  • The voicemail message should include mention of the incentive.

  • The research team is discussing changes in the way we identify ourselves. Mentioning Capital Consulting Corporation in the first line of the introduction does not let respondents know that we are a research company. Since 2M Research is doing the phone recruitment, we are considering introducing ourselves as “2M Research” and then introducing the remainder of the team later in the interview after gaining respondent buy-in.

C. Questionnaire Development

For each of the surveys discussed in this memorandum, a number of changes are recommended. With FNS approval, the research team will incorporate these changes in the final revision of the data collection instruments and recruitment materials.

Several challenges and proposed solutions were discussed in the Certification Survey and Denied Applicant Survey sections of this memorandum. During the upcoming weeks, the research team and FNS will need to discuss the best approach for each of these challenges.

II. Next Steps

After FNS reviews these findings, the research team will finalize the instruments (proposed date, May 1, 2017) and the OMB package (proposed date, July 17, 2017). The research team anticipates delivering a sampling memorandum on April 28, 2017. After this memorandum is accepted, the research team will begin recruiting SAs.

1 Improper Payment Elimination and Recovery Act of 2012, Pub. L. No. 112-248. (https://www.gpo.gov/fdsys/pkg/PLAW-112publ248/html/PLAW-112publ248.htm)

2 Third National Survey of WIC Participants (NSWP-III) Expert Panel Meeting. (2016). Alexandria, Virginia. 19 January, 2016 [transcript].

3 As of March 20, 2017, the Appendix B7 “Denied Applicant Log” was not pretested. The research team plans to request assistance from the Massachusetts SA in the upcoming weeks to pretest these items using their agency staff as a proxy to estimate burden time.

4 As of March 20, 2017, the Appendix D3 “Certification End Date Verification Email” was not pretested. The research team plans to request assistance from the Massachusetts SA in the upcoming weeks to pretest these items using their agency staff as a proxy to estimate burden time.

5 As of March 20, 2017, the Appendix D4 “Certification End Date Verification Reminder Phone Script” was not pretested. The research team plans to request assistance from the Massachusetts SA in the upcoming weeks to pretest these items using their agency staff as a proxy to estimate burden time.

6 SAs with reciprocity agreements may allow applicants to document residence in a neighboring State with which the agency has a reciprocity agreement.

7 Individuals served by an ITO who reside on a remote Indian reservation, village, or pueblo may establish proof of residency by providing a mailing address and the name of the Indian village in which they reside.

8 An additional type of error, expired certification error, occurs when a WIC participant redeems food instruments after the end of a certification period. Data to inform expired certification error estimates will come from State certification data and telephone follow up with LAs to confirm certification dates in apparent cases of expired certification error.

9 In the pilot State, not all Medicaid plans offered in the State conferred adjunctive eligibility; the qualifying plans were listed in the SA’s program and procedures manual for LAs.

10 The following is the note included in the Revised Certification Survey: “Note for reviewers: The next set of questions asks for income sources and amounts during the 30 days prior to the participant’s certification date. Federal WIC regulations (Section 246.7(d)(2)(i)) permit SAs to instruct LAs to determine whether the current rate of income or income over the prior 12 months most accurately reflects the family status (with two exceptions described below). Although policy guidance provides some recommendations, this regulation gives LAs some flexibility to make independent and non-replicable decisions about which time frame is more accurate. As a result, FIs will first assess family income based on the current rate of income (defined as the 30 days prior to certification date). If preliminary results suggest that the WIC participant should have been deemed ineligible due to income, the FI will re-assess the family’s income using a reference period of at least 30 days that falls sometime within the year prior to CERT_DATE. The FI will first attempt to obtain income documentation for a total of 30 days during the 3 months prior to CERT_DATE. Given that families may have sparse documentation for income from prior periods, the FI will accept any proof of income that spans a total of 30 days within the past 6 months (for income from self-employment, rental income, and royalties, FIs will have already requested proof of income over the past 12 months.)”

11 The rate of denials will be estimated from data collected to construct the sampling frame, not from the data collected in the survey itself.



12 The following is the note included in the Revised Denied Applicant Survey: “Note for reviewers: The next set of questions asks for income sources and amounts during the 30 days prior to the participant’s application date. Federal WIC regulations (Section 246.7(d)(2)(i)) permit SAs to instruct LAs to determine whether the current rate of income or income over the prior 12 months most accurately reflects the family status (with two exceptions described below). Although policy guidance provides some recommendations, this regulation gives LAs some flexibility to make independent and non-replicable decisions about which time frame is more accurate. As a result, FIs will first assess family income based on the current rate of income (defined as the 30 days prior to certification date). If preliminary results suggest that the WIC participant should have been deemed eligible due to income, the FI will re-assess the family’s income using a reference period of at least 30 days that falls sometime within the year prior to APP_DATE. The FI will first attempt to obtain income documentation for a total of 30 days during the 3 months prior to APP_DATE. Given that families may have sparse documentation for income from prior periods, the FI will accept any proof of income that spans a total of 30 days within the past 6 months. (For income from self-employment, rental income, and royalties, FIs will have already requested proof of income over the past 12 months.)”

13 U.S. Census Bureau, 2015 Current Population Survey.
