Supporting Statement Part B 4-01-13


Study of the Food Distribution Program on Indian Reservations (FDPIR)

OMB: 0584-0583




SUPPORTING STATEMENT FOR


Study of the Food Distribution Program on Indian Reservations (FDPIR)




Bob Dalrymple, Ph.D., Senior Analyst


Office of Research & Analysis

Food and Nutrition Service, USDA

3101 Park Center Drive, 2016

Alexandria, VA 22302

703-305-2122

[email protected]








































Study of the Food Distribution Program on Indian Reservations (FDPIR)


TABLE OF CONTENTS


SUPPORTING STATEMENT PART B FOR REQUEST FOR CLEARANCE



Part B


B1. Respondent Universe, Sample Selection and Expected Response Rates

1a. Respondent Universe

1b. Sample Selection

1c. Expected Response Rates

B2. Procedures for the Collection of Information

B3. Methods to Maximize Response Rates and to Deal with Issues of Non-response

B4. Pre-testing of Procedures and Methods

B5. Individuals or Contractors Responsible for Statistical Aspects of the Design


ATTACHMENTS REFERENCED IN PART B

ATTACHMENT B1c: Case Record Review Pretest Outreach Script

ATTACHMENT B2b: Household Survey Pretest Outreach Script

ATTACHMENT B2d: Pretest Debriefing Guide

ATTACHMENT B4e: Onsite Official Interview Pretest Outreach Email

ATTACHMENT B5f: Discussion Group Pretest Outreach Script

ATTACHMENT G: List of Sampled FDPIR Sites

ATTACHMENT H: Outreach Process

ATTACHMENT I1: FDPIR Letter #1 – to Tribal Leadership from USDA

ATTACHMENT I2: FDPIR Letter #2 – to FDPIR Director from USDA (Outreach)

ATTACHMENT I3: FDPIR Letter #3 – to FDPIR Director from USDA (Invitation Accepted)

ATTACHMENT I4: FDPIR Letter #4 – to Household Survey Respondents from USDA

ATTACHMENT I5: FDPIR Letter #5 – to Tribal Leaders from Urban Institute

ATTACHMENT I6: FDPIR Study Informational Brochure

ATTACHMENT J: Invitation to the Reservation Template

ATTACHMENT K: Procedures for Accessing Case Records and Abstracting Data

ATTACHMENT L: Responsibilities of Field Interviewers

ATTACHMENT M: Procedures for Conducting Site Visits




B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

Data collection for this study includes three samples: a nationally representative sample of households participating in FDPIR whose case records will be reviewed and who will be surveyed; a purposive sample of Tribal leaders and program staff who will be interviewed; and a purposive sample of FDPIR participants and eligible non-participants for discussion groups.

1a. Respondent universe

As of the end of Federal Fiscal Year 2011, there were a total of 112 FDPIR programs operated by Indian Tribal Organizations (ITOs) and State organizations, serving a total of 34,563 participating households. As described below in response 1b, only participants in the 104 programs with more than 33 participating households were eligible for the sample. Table B-1 summarizes the respondent universe, sample size, and sample selection method for each data collection component.



Table B-1: Respondent Universe and Sample Size

Data Collection Component: Participant Survey and Case Record Reviews
  Respondent universe: Households participating in FDPIR programs that have more than 33 participating households
  Universe size: 34,394 households in 104 programs
  Sample size: 1,040 households in 25 programs
  Sampling method: Programs sampled with probability proportionate to size (PPS), with equal numbers of households per program, to obtain an equal-probability sample of households
  Expected response rate: 80% for household survey; 100% for case record review

Data Collection Component: Tribal Site Visits – Program Interviews
  Respondent universe: Tribal leaders, program administrators, and program staff of FDPIR programs participating in the survey and case record reviews
  Universe size: 25 FDPIR programs, 3-20 staff to interview per program
  Sample size: 170 interviews (17 programs, average of 10 interviews per program)
  Sampling method: Purposive, to achieve diversity of programs on size, FNS region, economy, participation change, program administration, access to other food programs, and program features such as food delivery method(s)
  Expected response rate: 100%

Data Collection Component: Tribal Site Visits – Discussion Groups
  Respondent universe: FDPIR participants and low-income non-participants at the programs participating in the survey and case record reviews
  Universe size: 25 FDPIR programs, about 40,000 participants; the number of low-income non-participants cannot be estimated but is assumed to be about 20,000 (half the number of participants)
  Sample size: 304 in 17 programs
  Sampling method: Convenience sample, recruited by FDPIR and other Tribal program staff as determined through Tribal consultations and the outreach process
  Expected response rate: 75%


1b. Sample selection

The selection of the sample uses a two-stage design. The first stage is the selection of ITOs and State organizations operating FDPIR programs: 25 ITOs and State organizations were selected with probability proportionate to the number of households participating in the FDPIR program, a method referred to as probability proportionate to size (PPS). Monthly household participation data for FY 2011 were used to draw the sample. The 6 largest ITOs were included in the sample with certainty. A second sample of non-certainty sites was drawn to serve as replacement sites if needed; both samples were chosen with PPS. The sites were drawn by implicitly stratifying the programs, sorting the programs (i.e., ITOs) first on region (FDPIR programs operate in 6 FNS regions) and then on whether participation between 2001 and 2011 fell by more than 25 percent, fell by less than 25 percent, or increased. The latter dimension was added to ensure that the sample matches the distribution of all participants across growing and shrinking programs. Small programs are represented in proportion to their numbers of participating households, rather than oversampled to ensure a target number of small ITOs.
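The PPS selection described above can be sketched as follows. This is an illustrative sketch with made-up program sizes, not the study's actual sampling code; it shows why a program larger than the skip interval is selected with certainty (and can even be hit more than once), which is why the six largest ITOs were set aside as certainty selections before the non-certainty draw.

```python
import random

def pps_systematic(programs, n_select, seed=1):
    """Select n_select programs with probability proportionate to size
    (PPS) using systematic selection over cumulative size totals."""
    total = sum(size for _, size in programs)
    interval = total / n_select          # skip interval in "household" units
    random.seed(seed)
    start = random.uniform(0, interval)  # random start within first interval
    points = [start + i * interval for i in range(n_select)]
    selected, cum = [], 0.0
    point_iter = iter(points)
    point = next(point_iter)
    for name, size in programs:
        cum += size
        # A program is selected once for each selection point falling
        # inside its cumulative-size range.
        while point is not None and point <= cum:
            selected.append(name)
            point = next(point_iter, None)
    return selected

# Hypothetical program sizes (participating households), already sorted
# to mimic the implicit stratification by region and participation change.
programs = [("A", 5000), ("B", 3000), ("C", 2000), ("D", 1000),
            ("E", 800), ("F", 600), ("G", 400), ("H", 200)]
print(pps_systematic(programs, 4))
```

With these hypothetical sizes the interval is 13,000/4 = 3,250 households, so program A (5,000 households) exceeds the interval and is always selected; in a real draw such certainty sites are removed before sampling the remainder.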

Only participants in the 104 programs with more than 33 participating households were eligible for the sample. This threshold is large enough to give a high probability of achieving the target number of interviews. It excludes only 8 small programs with FDPIR participants and provides coverage of 99.5 percent of FDPIR participating households; the excluded programs represent less than 1 percent (0.54%) of the population. The cutoff for inclusion was chosen to ensure that a sufficient number of interviews with participating households can be obtained in each of the sampled areas (see Attachment G for a list of sites in the sample).

The samples of participating households will be selected from administrative records in each of the 25 Tribal areas. FDPIR eligibility is by household unit, and case records are maintained by household. From the sampled households, the interview will be conducted with the person (participant) who applied for FDPIR assistance (noted as the Head of Household on some forms) or his/her proxy. Information about all members of the household will be abstracted from the case record to develop the profile of participants.  

Case records and surveys will be obtained for 30 households in all but the two largest Tribal areas. In each of the two largest programs (Cherokee and Navajo), which together represent 23 percent of the FDPIR participants, case records and surveys will be obtained for 68 households (combined yielding 16.3 percent of the population). The increased sample size in these two programs reduces the extent to which weighting will be needed to represent the entire population.


1c. Expected response rates

In total, a random sample of 1,040 households will be selected. The anticipated response rate for the participant survey is 80 percent, which will yield 832 completed surveys (the data abstraction for the case record review will yield a dataset that is 100 percent complete). If the response rates for the survey are lower than expected, the researchers will describe the differences between respondents and non-respondents by comparing case record data for the two groups and, if needed, will adjust for differential non-response using weights. This design allows for computation of national estimates of the characteristics of participating households as well as estimates for large subgroups, such as households with elderly participants. A 100 percent response rate is anticipated for interviews with Tribal leaders, FDPIR program administrators, and other community leaders, and a 75 percent attendance rate is anticipated for the discussion groups.

To ensure a high response rate, informational webinars have been conducted with Tribal leaders and their designated representatives during formal consultations, and a presentation and discussion session was conducted at the annual meeting of the National Association of FDPIR (NAFDPIR). A series of formal in-person consultations with Tribal leaders of Tribes selected for the sample has recently been completed in order to provide information about the study and encourage participation. The invitation letter that USDA sent to Tribal leaders is included as Attachment I1. USDA will also make the initial contact with FDPIR directors at the start of in-depth outreach and recruitment for data collection (see Attachments I2 and I3) and with participants selected for the survey sample (Attachment I4). The study team plans a systematic outreach effort with each selected Tribe, including disseminating materials in advance to familiarize Tribal leaders, elders, and ITOs with the study and the study team; outreach telephone calls or webinar conferences; compliance with Tribal IRB procedures; Tribal capacity-building activities; and onsite visits where needed. These activities are discussed in more detail in the response to question 3 and in Attachment H. A sample letter from the research team to follow up after USDA’s initial letters is included in Attachment I5. This will be tailored as needed based on responses to initial communications from USDA. For example, it may be sent to the Tribal Leader or to the FDPIR Director, or both, and it may be sent via email. A draft brochure announcing the study can be found in Attachment I6. This will be sent to potential survey respondents with the letter in Attachment I4. Some reservations require an invitation letter from the Tribal Leader to permit access to Tribal areas. Attachment J is a template of such a letter for consideration by such Tribes.


  2. Describe the procedures for the collection of information including:

    • Statistical methodology for stratification and sample selection,

    • Estimation procedure,

    • Degree of accuracy needed for the purpose described in the justification,

    • Unusual problems requiring specialized sampling procedures, and

    • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

This section describes procedures for selecting the household sample, locating and extracting data for case records, administering the household survey, and selecting the site visit sample.

Sample of participating households. To select the sample of participating households, it is necessary to define the reference month and construct the sampling frame prior to implementing systematic random sampling. The sampling frame will identify every household that received FDPIR commodities during the reference month in each of the 25 programs in the sample. Identification of the reference month is pending. Considerations in determining the reference month are the amount of time needed for sample frame construction and the anticipated timing of data collection. After outreach to participating programs has begun, the researchers will have a much better idea of the time needed for sample frame construction. It is also important to consider seasonal fluctuation in FDPIR participation in order to select a month that reflects typical program activity. Based on the information to date, it is anticipated that the reference month will be in the spring or early summer (May or June).

The sampling frame for each ITO will consist of a list of all FDPIR participating households for the reference month. Each ITO in the sample will be asked to prepare the list of heads of household (i.e., the member of the household who applied for the program and is participating) in alphabetical order by last name and send the list by secure mail or via FTP within a specified time frame. FNS staff at Regional Offices indicated that all ITOs have such lists, which are based on the Household Issuance History Report generated by FNS’ Automated Inventory System used for program recordkeeping, so producing it should not be burdensome. It is possible that some ITOs may be reluctant to share the list of FDPIR households in advance of the researchers’ visit to conduct the case record review. Should this be the case, project staff will obtain the list of FDPIR program households from the ITO for the reference month during the onsite case record review. The necessary information can be obtained from the paper case records, issuance cards, or an electronic file. Research staff will then create the list of participating households.

Once the list is established, systematic random sampling will be implemented at each ITO. Using the site-specific list of participants for the designated reference month, a sampling statistician will select a systematic random sample of the required size. Systematic random sampling involves identification of a random start (r) and a sampling or skip interval (k). The interval is determined by dividing the number of participants on the list (N) by the second-stage sample size (n) (i.e., k = N/n). The value k is a constant interval between participants on the list, and r is a random number between 1 and k. All participants have an equal chance of being selected as the starting point or the initial subject. Using the constant interval (k), every kth unit will be selected (i.e., r, r+k, r+2k, etc.). The advantages of using systematic sampling over simple random sampling are that it is simple to understand, easy to implement, ensures that the population list is evenly sampled, and avoids clustering of selections. The research team will identify the service area of each of the 25 ITOs in the sample, the dispersion of participating households across a geographic area (such as on the reservation only, in contiguous counties, within a 20-mile radius, etc.), and the food distribution points. For example, some programs may operate more than one warehouse, with food distribution at each warehouse; other Tribes operate tailgate distributions at several remote locations within their service area. The procedures will be modified to implement sub-sampling, as needed, and any resulting design effects will be identified. Because each ITO may have different data formats and procedures, the research team is prepared to respond to the following scenarios:

  • If the participating household list and frame size (N) are obtained and known in advance, then research staff will implement the systematic random sampling procedure and identify the participants in the sample. ITO staff will "pull" the corresponding case record for each household and NORC will abstract the relevant data onsite.

  • If the case record information is to be shared electronically, then the ITO will extract the corresponding data elements for each household and transmit the dataset in a secure manner to NORC for analysis.

  • If the participating household list and frame size (N) are not obtained and known in advance, then research staff will construct the list and obtain the sample size while onsite. Once collected, this information will be communicated to the sampling statistician. The statistician will determine the units to be selected in the sample. Research staff will then implement the systematic random sampling procedure. Research staff will coordinate and conduct these steps by telephone or email. In all cases, the research team will be prepared to have the ITO participate in the process; for example, implementing the systematic random sampling procedure.
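Under any of these scenarios, the within-site selection reduces to the same random-start/skip-interval procedure. A minimal sketch, using a hypothetical alphabetized frame of 300 heads of household and the study's site sample size of 30:

```python
import random

def systematic_sample(frame, n, seed=42):
    """Systematic random sample of n units from an ordered frame:
    choose a random start r within the first interval, then take
    every k-th unit, where k = N / n."""
    N = len(frame)
    k = N / n                    # skip interval (may be fractional)
    random.seed(seed)
    r = random.uniform(0, k)     # random start within [0, k)
    # Selection points r, r+k, r+2k, ... truncated to list positions.
    return [frame[int(r + i * k)] for i in range(n)]

# Hypothetical alphabetized list of heads of household (N = 300)
frame = [f"Household {i:03d}" for i in range(1, 301)]
sample = systematic_sample(frame, 30)   # n = 30, so k = 10
print(len(sample))  # 30
```

Because the selection points advance by a constant interval, the sample is spread evenly across the alphabetized list, which is the "evenly sampled" property noted above.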

Weighting and Precision of Estimates. Following the collection of participant data, the research team will create weights. The weights will: give certainty sites their proper share of the population; adjust for changes in the distribution of sites between site selection and the month of participant selection (the reference month); and adjust for non-response to the survey. In addition, consideration will be given to adjusting weights to ensure that the interviewed sample's distribution across regions matches that of the full population of participants. It is anticipated that most of the analysis will report the proportion of households with a given characteristic or with members who have that characteristic (for example, the proportion of households with a child, rather than the proportion of participants who are children). This approach requires less weighting for analysis.
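The weighting logic can be illustrated as follows (all numbers below are hypothetical, not drawn from the study): the base weight is the inverse of a household's overall selection probability across the two stages, and the non-response adjustment inflates respondent weights by the inverse of the site's response rate so that respondents stand in for sampled non-respondents.

```python
def base_weight(p_program, n_sampled, n_households):
    """Base weight = 1 / (P(program selected under PPS)
    * P(household selected within program))."""
    p_household = n_sampled / n_households
    return 1.0 / (p_program * p_household)

def nr_adjusted(weight, n_sampled, n_responded):
    """Non-response adjustment: inflate respondent weights by the
    inverse of the site's response rate."""
    return weight * (n_sampled / n_responded)

# Hypothetical program: selected with probability 0.4 under PPS,
# 30 of its 600 households sampled, 24 of the 30 respond (80%).
w = base_weight(p_program=0.4, n_sampled=30, n_households=600)
print(nr_adjusted(w, 30, 24))  # 62.5
```

In a PPS-with-equal-takes design like this one, base weights are roughly equal across non-certainty sites; the adjustments for certainty sites, frame changes, and non-response are what introduce weight variation.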

Using this sample, the research team is 95 percent confident of obtaining estimates between 44.8 and 55.2 percent for a measure with a population mean of 0.5, such as proportion of households with children or proportion of households with someone age 60 or older.  These estimates are sufficiently precise to meet the FNS goal of estimating the broad characteristics of participants across the country.
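The quoted interval of 44.8 to 55.2 percent corresponds to a confidence-interval half-width of about 5.2 percentage points at n = 832. For a simple random sample the half-width would be about 3.4 points; the wider band is consistent with a design effect of roughly 2.3 from clustering and weighting (a back-calculation for illustration, not a figure stated in the study):

```python
import math

def ci_half_width(p, n, deff=1.0, z=1.96):
    """Half-width of a 95% confidence interval for a proportion,
    inflated by the design effect (deff) from clustering and
    unequal weighting."""
    return z * math.sqrt(deff * p * (1 - p) / n)

# Simple random sampling at n = 832 and p = 0.5:
print(round(100 * ci_half_width(0.5, 832), 1))            # 3.4 points
# A design effect near 2.3 reproduces the +/- 5.2-point band:
print(round(100 * ci_half_width(0.5, 832, deff=2.3), 1))  # 5.2 points
```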

Methods for locating case records and procedures for extracting data. The following steps will be taken to collect consistent data from case record files for the target sample of 832 households (with oversampling, the beginning sample will be 1,040):

  • Conduct a teleconference with each ITO to discuss the approval process for obtaining the monthly participant list generated by the Automated Inventory System and access to the case records data;

  • Negotiate a data sharing agreement;

  • Obtain the list of FDPIR participating households;

  • Implement site-specific sampling as described in response to B.1;

  • Access or obtain case record data—assuming the majority of the ITOs do not have automated case record files, the plan includes visits to 20 Tribes to abstract the data.

A detailed plan for locating case records and abstracting data is included as Attachment K.

Administering the Survey. All sampled respondents will receive an advance letter describing the study and including a toll-free number for directly contacting the Field Interviewer assigned to their program. NORC, the survey subcontractor, provides each Field Interviewer with a dedicated toll-free number to accept calls from respondents. This allows respondents to reach the Field Interviewer directly from his or her own home or cell phone. Telephone surveys may occur at the time of the call or be scheduled for a later date at the respondent’s convenience.

Respondents often have individual preferences and circumstances that necessitate having alternate strategies in place for conducting a telephone or an in-person survey. While the data collection plan anticipates that telephone surveys may occur with the respondent using a home telephone, it also allows for the fact that some respondents may not have a home phone or may not wish to conduct the survey from their home phone for a variety of reasons (e.g., poor connections, lack of privacy, competing demands, etc.). Therefore, alternate approaches will be available for the respondent to contact the Field Interviewer. The research team will ask the ITO to provide a dedicated office phone and a private space that would be available for respondents to use either to call or meet with the Field Interviewer. In the event that the ITO does not have an alternate phone available, NORC will provide a cell phone.

If the respondent prefers to have an in-person survey, as is the case with many elderly respondents, the Field Interviewer will travel to the site and conduct the survey in the respondent’s home or in a designated, private office space (e.g., other Tribal office). In-person surveys may be pre-arranged or may occur on an ad hoc basis on food distribution days, as described below. The Field Interviewer will be onsite at least once a month in order to become known to the community and to be available should any sampled respondent indicate his/her willingness to participate in the survey. In addition to their interviewing responsibilities, each Field Interviewer will serve in the capacity of a “site liaison” in order to fully observe Tribal research protocols, build rapport with the FDPIR staff, and become a recognized presence in the community (Attachment L outlines the onsite responsibilities of Field Interviewers).

Tailgate food distributions pose a logistical challenge, and alternate strategies may need to be arranged in consultation with the ITO and the FDPIR program. The Field Interviewer will observe the food distribution process, which will facilitate meeting potential respondents face-to-face and will build initial rapport, increasing the likelihood that respondents in the sample will agree to a telephone or in-person interview. Because this process may have to occur more than once in order to gain recognition and establish rapport, the data collection plan includes repeated visits.

The procedures for data collection attempt to make the best use of variation in the food distribution process across Tribes and to cultivate an environment where the Field Interviewer becomes a known and trusted presence in the community. While this plan anticipates some of the logistical challenges that each site may present, the research team will also work proactively with each ITO to develop and implement site-specific solutions that facilitate cooperation and goodwill and minimize the inconvenience and burden of data collection over the 20-week period.

Conducting Site Visits. As stated in Part A, the site visits will include three types of activities: 1) interviews with program administrators, staff and service providers; 2) visits to FDPIR programs to observe facilities related to client enrollment, warehouses, and food distribution; and 3) discussion groups with program participants and non-participants.

No survey can fully capture the range of Tribal circumstances, priorities, relationships, and approaches with respect to addressing program operations and features that may be associated with FDPIR participation. Site visits and interviews with program staff provide an opportunity to present a richer and more nuanced description of these issues in 17 Tribal areas. Because budget constraints preclude visiting all local sites included in the nationally representative sample, the most effective sampling strategy is a purposive approach, whereby sites are chosen based on a number of factors in order to give a picture of the variety of local experiences across the country. To capture the greatest diversity within project resources, a purposive sample of 17 sites will be selected, based on consideration of the following factors:

  • Size (average monthly participating households)

  • Region (FNS region)

  • Economy (unemployment rate; per capita payments)

  • Participation change (2001-2011, ideally including sites with increasing as well as decreasing participation)

  • Program administration (Tribal or state-administered; service area, including size of service area and whether the program serves a single Tribe or multiple Tribes; coordination of FDPIR with other Tribal programs)

  • Coordination and access to other food programs (SNAP state or county-administered; ease of access to SNAP for reservation residents; other programs on the reservation such as CSFP and Tribal WIC)

  • Program features (food delivery/pick-up options; warehouse and ordering features; participation in Fresh Produce Program; nutrition education and health promotion activities).

The sites will be selected from the 25 sites that are participating in the case record review and survey. Initial outreach for the site visits will be conducted in tandem with the outreach for the case record review and participant surveys, so that if a site is replaced for the survey, it will no longer be considered for the site visit and program interviews. Discussions with FNS headquarters and regional staff and NAFDPIR representatives, as well as the Tribal consultations, will inform the site selection process. A detailed plan for conducting onsite visits is included as Attachment M.


  3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

The survey procedures described above are designed to maximize response rates. The approach includes: intensive outreach to Tribal leaders, communities, and program administrators; extensive training for interviewers, including scheduling procedures and specific approaches to address barriers to cooperation; incentives for participant survey respondents and discussion group participants; and follow-up of non-response. Field Interviewers will record each attempt to contact a household. Interviewers will vary their contact attempts to the selected households across the most probable times of contact. After making up to 5 contact attempts, the Field Interviewers will shift to more intensive and in-person locating efforts for hard-to-reach cases. Persistent non-contact households will be discussed with Field Managers, and the resulting discussion will generate a new approach. Similarly, the interactions for resistant cases will be discussed and a strategy prepared. During a successful contact at the respondent’s dwelling, the Field Interviewers will introduce themselves and the survey, enable the household member to ask questions about the survey, and attempt to schedule or conduct an interview. For Tribes that issue an invitation onto the reservation (see Attachment J for an invitation template), the interviewer will carry a copy of the invitation.

Providing incentives is beneficial in gaining respondent cooperation, and demonstrates to respondents that their contributions are valued. We will provide an incentive worth $25 to individuals and $100 to ITOs for participating in the study. More details about remuneration to respondents can be found in the response to Part A, question 9.


  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

A pretest of each of the data collection instruments was conducted in November and December 2012. The pretest procedures were designed to determine whether the questionnaires, procedures, and documents work as planned in a field setting. The project team was especially interested in ensuring that time estimates accurately reflected the timing of each data collection component. Table B-2 shows the number of pretests completed for each of the primary data collection instruments.



Table B-2: Summary of Survey Instrument Pretests



Pretest Interview: Interviews/pretests completed

Participant Survey: 6 FDPIR Participants (3 in-person; 3 telephone)

Case Record Review: 6 FDPIR Case Records

Onsite Interview Guide: 2 Tribal Leaders; 2 FDPIR Managers; 1 Other Program Leader (community nurse)

Onsite Discussion Group Guide: 1 FDPIR Participant; 1 SNAP Participant



As this table indicates, in each case, identical questions were asked of fewer than 10 respondents. Three tribes participated in the pretest (Red Cliff Reservation, Bois Forte Reservation, and Ho-Chunk Nation). These tribes have FDPIR Programs but are not in the sample selected for the study.

Overall, the data collection instruments were found to be well designed, easily understood, and comprehensive in covering the research topics. Revisions typically involved rewording introductions and questions to make them clearer for the respondent, reducing redundancy, and streamlining questions to reduce respondent burden. The pretest also yielded important information for improving interviewer instructions and highlighted topics to be addressed in interviewer training.

Pretest of the Household Survey Instrument. NORC conducted one round of pretesting for the Participant Survey component of the Study of FDPIR using the paper-and-pencil instrument. Six interviews were completed: three in person and three by telephone. Both modes were tested because both will be used for the main fielding. The pretest sample was a convenience sample recruited from a population similar to that of the main study, i.e., Tribal members receiving FDPIR commodities. An experienced NORC Survey Director who had relationships with members of Tribal communities recruited the pretest respondents (see Attachment B2b, Household Survey Pretest Outreach Script). As a token of appreciation for their participation, all pretest respondents received $25.

The median administration time for the six interviews completed during the pretest was 45 minutes, exceeding the initial estimate and desired time for the survey. This included the time spent administering the informed consent, estimated to be 8-10 minutes (3-4 minutes to read the informed consent form aloud and 5-6 minutes to discuss respondent questions).

All pretest interviews concluded with a short debriefing section in which respondents were asked whether they had difficulty or felt uncomfortable with any of the questions or response options in the interview (see Attachment B2d, Pretest Debriefing Guide). The pretest interviewers completed a debriefing for each interview in which they noted instructions, questions or response options that presented a problem for respondents.

Based on the pretest, changes have been made to the questionnaire to clarify content and reduce the survey administration time. In order to reduce the length of the survey, one question was changed to request a YES/NO response to a series of items and one question was deleted. Such changes will improve data quality by helping respondents better understand what is being asked of them and reduce the length of the instrument. We estimate that the revised instrument will be administered in 40 minutes.

Case Record Review Pretest. The data abstraction form for the case record review consists of a spreadsheet with 25 data fields used to describe current program participants and their household characteristics. The data abstraction form is modeled on FDPIR program eligibility forms. Topics and data elements abstracted for each household member are: demographics; household composition; participation in SNAP (receiving, applied for, disqualified); earned and unearned income; self-employment income; receipt of student financial aid; and other available resources or public benefits.

To pretest the data abstraction form, we sought participation by tribes/ITOs that were also pretesting the participant survey (see Attachment B1c, Case Record Review Pretest Outreach Script). The pretest sample consisted of six files of differing household size: three single-person households, two three-person households, and one four-person household. The NORC Survey Director who conducted the onsite interviews for the participant survey also conducted the case record reviews.

No quality assurance issues were identified with the records sampled for the pretest. One potential data quality issue in the main study is difficulty deciphering an applicant’s handwriting on the eligibility form. No changes were made to the data abstraction form as a result of the pretest.

Both FDPIR sites were amenable to participating in the case record review and very cooperative in retrieving the files on site. Retrieving and returning each file to the cabinet took about 2 minutes for the FDPIR staff (this time is calculated in the burden estimate). Onsite data abstraction ranged from 5 minutes for a single-person household to 10 minutes for a multi-person household (this time is not calculated in the burden estimate as the data abstraction will be conducted by members of the research team during a site visit).

Pretest of Onsite In-Person Interview Guides. The onsite interview guide consists of semi-structured questions and probes for in-person discussions with a variety of key informants, including Tribal leaders, FDPIR managers and staff, and representatives of other programs that work with FDPIR or FDPIR participants. The instrument is designed in segments so that interviewers can tailor the discussion to a respondent’s role. Each segment focuses on a separate topical area; not every segment is relevant to every person interviewed, though some segments apply to all discussants. This approach allows the discussion to be tailored flexibly for each respondent.

To pretest this guide, we sought several respondent types in order to test the tailoring process as well as the clarity of questions, the flow of the discussions, and the time required to complete the interview. Five interviews were completed during the pretest by experienced interviewers on the study team (see Attachment B4e for the outreach email used to schedule and confirm interviews). One interview was completed in person and four by telephone. Telephone interviews were audio-recorded for pretesting, with the respondents’ consent. The five respondents, from three Tribes, were two Tribal leaders, two FDPIR managers, and one other program leader. Interviews ranged from 35 to 90 minutes, with a median administration time of 57.5 minutes. As expected, interviews with FDPIR and other program managers took more time than interviews with Tribal leaders, because Tribal leaders responded to fewer segments of the discussion guide. Several respondents indicated that it would be helpful to receive questions requiring numerical data (such as jobs, employment, and demographics) in advance, to allow time to prepare accurate answers.

All pretest interviews concluded with a short debriefing section in which respondents were asked about: the flow of the interview; the length of the interview; the clarity of questions and terminology; and whether they felt pressured to respond in a certain way. Respondents were also asked for suggestions to improve the interview guide, including adding or deleting topics (see Attachment B2d). The pretest interviewers also completed a debriefing for each interview in which they noted instructions, questions, or response options that presented a problem for respondents and suggested changes to the guide to address those issues.

Based on the pretest, a number of edits were made to the interview guide: deleting or combining questions to eliminate redundancy; rewording questions for clarity; adding probes and explanations of terms for the interviewer to use as needed; adding an introductory question at the beginning of several segments to confirm that the respondent can answer questions about a topic area before beginning the segment; and adding background about FDPIR and the purpose of the study for respondents not working for FDPIR, in order to encourage them to feel comfortable answering questions. For example, one respondent was hesitant about questions pertaining to gaming and economic development, but was more comfortable responding once it was explained that we are interested in economic conditions because they can affect the need for food assistance.

In addition, certain procedural changes will be made in response to the pretest findings. Interviewers will be trained on how to tailor the guide for different respondent types using the introductory questions and skip patterns. When interviews are scheduled, we will send respondents those questions that would benefit from advance preparation, particularly those that require checking data. The questions sent in advance will vary by type of respondent; for example, Tribal leaders will be asked about the demographics of their reservation/Tribal area, while FDPIR managers will be asked about the demographics of program participants. Our communications will state clearly that respondents are not required to look up or send us this information in advance, but that these questions are provided ahead of time for their use if they feel it will facilitate their interview.
These small changes to the interview guide, interviewer training, and procedures will improve data quality by helping respondents better understand what is being asked of them and will reduce the length of the interview. We estimate that, with these changes, the interviews will take no more than 60 minutes.

Pretest of Discussion Group Guide. The discussion group guide will be used to facilitate group discussions with FDPIR participants and eligible non-participants at each site visited. The pretest was not intended to simulate the discussion groups as planned for the study. Doing so would have been too costly and imposed an additional burden on FDPIR staff and discussion group participants. The pretest for the discussion groups was limited to testing the questions in the guide for clarity and relevance through individual telephone interviews. Two separate telephone interviews were conducted, one with a current FDPIR participant and one with a current SNAP participant who had previously participated in FDPIR (see Attachment B5f, Discussion Group Pretest Outreach Script). Both respondents were members of the same Tribe. As a token of appreciation for their participation, each pretest respondent received $25.

The FDPIR manager identified the participants to be interviewed and provided their contact information. The Urban Institute interviewers then contacted each participant to schedule an interview. Both respondents were willing to participate and agreed to have the interview recorded. The interviews lasted 25 and 20 minutes, respectively, and went smoothly. In actual discussion groups, additional time will be needed to explain the discussion group procedures and to provide opportunities to hear from all participants (about 12 per group). Based on prior experience with similar group discussions, we estimate up to 2 hours per discussion group.

In debriefings, both respondents indicated that the questions were good, to the point, and addressed the key topics that participants could speak to about the program. Minor changes were made to the guide for clarity and to encourage respondent discussion.

B5. Individuals or Contractors Responsible for Statistical Aspects of the Design

The agency responsible for receiving and approving contract deliverables:

Office of Research & Analysis

Food and Nutrition Service, USDA

3101 Park Center Drive

Alexandria, VA 22302

Persons who contribute to or comment on contract deliverables:

Bob Dalrymple, Ph.D., Senior Analyst

Office of Research and Analysis

703-305-2122

[email protected]

Dana Rasmussen, Chief, Policy Branch

Food Distribution Division

703-305-1628

[email protected]


Christy Meyer

Methods Branch

National Agricultural Statistics Service (NASS), USDA

202-720-2555


The organization responsible for designing and administering the household survey and collecting case file data on FDPIR participants:


NORC at the University of Chicago

4350 East West Highway

Bethesda, MD 20814


Persons who contributed to or commented on the survey and case-file components of the study:

Carol Hafford, Ph.D., Principal Research Scientist

301-634-9491

[email protected]


Suzanne Bard, Survey Director

(312) 759-4255

[email protected]


Nancy Pindus, Principal Investigator

202- 261-5523

[email protected]


Diane Levy, Project Manager

202- 261-5642

[email protected]


Walter Hillabrant, Ph.D., President

301-587-9006

[email protected]


The organization responsible for statistical design of data that will be collected:

Urban Institute

2100 M Street, NW

Washington, DC 20037

Persons who contributed to or commented on the statistical design:

Doug Wissoker, Ph.D., Senior Statistician

202- 261-5622

[email protected]

Nancy Pindus, Principal Investigator

202- 261-5523

[email protected]


The organization responsible for designing the analysis plan and analyzing data:

Urban Institute

2100 M Street, NW

Washington, DC 20037

Persons who contributed to or commented on the analysis plan:

Nancy Pindus, Principal Investigator

202- 261-5523

[email protected]


Doug Wissoker, Ph.D., Senior Statistician

202- 261-5622

[email protected]



Diane Levy, Project Manager

202- 261-5642

[email protected]


Carol Hafford, Ph.D., Principal Research Scientist

301-634-9491

[email protected]

Walter Hillabrant, Ph.D., President

301-587-9006

[email protected]

1 For non-certainty sites that are unable to participate in the study, a member of the replacement sample will be selected that is a good fit for the non-participating site with respect to region, size, and change in participation. For certainty sites, a suitable replacement site is unlikely to exist because all very large programs are included in the selected sample. Should a certainty site be unable to participate, the sample size in the remaining certainty sites will be boosted for the survey and case record reviews.

2 Based on average monthly number of participants in FY 2011 (Source: FNS National Data Bank Public Use Data File).

