AFI Nonsubstantive Change Memo


Assets for Independence (AFI) Program Evaluation


OMB: 0970-0414


To: Brenda Aguilar; Office of Information and Regulatory Affairs (OIRA)

From: Erica Zielewski; Office of Planning, Research and Evaluation (OPRE); Administration for Children and Families (ACF)

Date: October 17, 2014

Subject: Nonsubstantive Change – Assets for Independence (AFI) Program Evaluation (OMB #0970-0414)


Background

The initial request for the Assets for Independence (AFI) program evaluation sought clearance for a site-specific evaluation in two AFI grantees – Prosperity Works (New Mexico) and Community Financial Resource Center (Los Angeles, CA). At the time of initial clearance, tokens of appreciation in the amount of $20 were proposed for respondents completing the 12-month follow-up interview. We are now requesting approval to provide larger tokens of appreciation, in the amount of $40.


Statement of the Problem

The baseline recruitment effort among low-income respondents yielded a sample of 814 participants who will be contacted for the 12-month follow-up survey, with a targeted response rate of 85 percent. Recruitment was much more difficult than anticipated, and the enrolled study sample is significantly smaller than planned (almost half of the 1,700 anticipated). As a result, high follow-up response rates are critical to the success of this legislatively mandated study. Failing to preserve the sample will likely jeopardize the study's ability to measure short-term impacts.


Unfortunately, initial telephone and tracing contacts have yielded a relatively low 45 percent response rate. Most contact attempts have resulted in no contact, with very few individuals answering the phone. As noted above, the study’s ability to measure impacts rests on gathering data from at least 85 percent of the sample. Thus, the current response rate is concerning.


If we are not able to raise the response rate to 85 percent, we will not be able to conduct the impact analyses at the core of this study. This will jeopardize the Government's investment in this effort and hamper our ability to provide legislatively mandated research findings.

Proposed Changes and Rationale

To improve follow-up response rates, the evaluation team plans to undertake a two-pronged strategy. First, the team plans to offer the survey – initially planned only as a telephone survey – as both a web-based survey (similar to the baseline survey, which was administered online) and an in-person survey using field-based interviewers. Since the baseline survey was administered online and one of the sites is a community college setting, the research team believes that offering a web-based version of the survey may support increased responses in at least one of the two sites. Similarly, transitioning from phone calls to in-person field locating efforts for hard-to-reach cases is a very common practice in the survey field. The survey research firm believes that using field locators and in-person interviews will also support improved response rates. We have revised Supporting Statement B to reflect these changes.


While we believe that offering different modes of the survey will be helpful, we also think that additional efforts are needed to support sample retention. The second part of this two-pronged strategy therefore focuses on increasing the tokens of appreciation that respondents receive for participating in the 12-month follow-up interview. Specifically, we seek approval to increase the incentive for the follow-up survey from $20 to $40 across all modes (i.e., in-person, telephone, and web surveys), retroactive to those already interviewed.


We propose raising the incentive to $40 for a few reasons. First, it appears that the initial $20 token is too little to motivate participants to complete the survey. We are experiencing a low rate of “direct refusals,” consistent with our experience on other similar surveys of low-income households. However, we have had difficulty establishing phone contact with many sample members, even though their phone numbers are working lines (i.e., calls ring and then go to a voice mailbox, or are answered by someone indicating that the sample member is not present or available). We consider these cases passive refusals. Based on our experience with the earlier-enrolled cohorts, we believe that sample members are more reluctant than anticipated to cooperate with a survey that asks for detailed income, asset, and debt information, for which the respondent may need to refer to financial records. We hypothesize that the pending nonresponse reflects the fact that the $20 incentive (mentioned in the initial consent form and the lead letter) is not high enough to create an intention to cooperate in the individual's mind. We expect the higher incentive to strengthen a sample member's intention to take the time to respond to an interviewer's call.


Second, we believe that $40 is the appropriate amount to offer given the target population and the types of questions they will be asked. This amount reflects experience from prior studies, plus our judgment that sample retention is essential not only for the 12-month impact estimates, but also for potentially estimating impacts at months 24 and 36. Comparable studies that trace and interview similar low-income populations have successfully used incentive amounts at or above $40. For example, in the fourth wave of the American Dream Demonstration (ADD) study, conducted in 2008, low-income program and comparison group participants were offered $50 for completing a telephone or field interview ten years after random assignment, and the final response rate was 80 percent. Similarly, in 2002, the National Survey of Child and Adolescent Well-Being (NSCAW) doubled the incentive offered to respondents from $25 to $50 to reflect the demands of reaching a higher-risk population. In addition, over rounds 1 through 10 of the National Longitudinal Survey of Youth 1997 cohort, incentives offered to respondents ranged from $10 to $50 in an effort to minimize attrition across waves of data collection. Offering $40 is consistent with both the nature of the target population and the level of effort needed to complete the survey.


Taken together, we believe that this two-pronged strategy of adjusting the modes of survey administration and increasing the incentive amount will result in the higher response rates needed to preserve the survey sample and measure the impacts of the AFI program.

Addressing Issues of Nonresponse

Below we address several issues related to nonresponse and nonresponse bias, including how an increased incentive will reduce this risk and how we plan to address bias should it occur.


How will an increased incentive reduce the risk of nonresponse bias? This study involves the analysis of data from an enrolled sample of 814 cases (409 treatment cases and 405 control cases). Nonresponse bias will exist to the extent that the characteristics of survey respondents differ from the characteristics of all enrollees. In the limit, if all 814 cases were to complete their follow-up interviews, as they did at baseline, there would be no nonresponse bias in the follow-up survey data.


In the context of an experimental impact evaluation such as this, where we measure program effects as treatment-control differences in outcomes, we are most concerned about differential bias: differences between the treatment and control groups in their respective patterns of bias, as this can distort the impact estimates. Most problematic is a nonresponse pattern that reflects an unobservable characteristic that is correlated with the outcome of interest. For example, if future-oriented individuals are instinctively more likely than present-oriented individuals to respond to the survey and if the control group respondents include more future-oriented individuals – who also tend to save more – this will bias downward the estimated program effect.        


With the current $20 incentive in place, the initial response rate on the 12-month follow-up survey has been lower than anticipated—below 50 percent for the treatment and control groups combined, with the control group’s rate somewhat lower than the treatment group’s. The risk of nonresponse bias diminishes at higher response rates, as the characteristics of respondents are then less likely to reflect the problematic issues of self-selection noted above. The higher response rate achieved through the increased incentive will limit the extent to which the survey data will reflect the instinctive propensities of individuals to participate in an interview. One cannot eliminate entirely the risk of nonresponse bias, as there will always be some study participants who are unwilling to cooperate with an interviewer, despite having consented to do so at the time they entered the study 12 months earlier.  


How will an analysis of nonresponse bias be conducted? We will conduct an analysis of nonresponse bias by comparing the baseline characteristics of follow-up survey respondents versus nonrespondents, identifying statistically significant differences.  The baseline survey encompassed a wide range of detailed individual and household characteristics, including qualitative items relating to personal attitudes and outlook.  Such an analysis, required by OMB if the survey response rate is below 80 percent, will be conducted separately for the treatment and control groups. Note that program enrollment required completing the baseline survey; thus we have baseline data for all follow-up survey respondents and nonrespondents.


How will the impact analysis adjust for the risk of nonresponse bias? In the statistical estimation of program effects, we will include a substantial number of baseline survey items as covariates in the econometric models, in particular any items that differ between the treatment and control groups. This will adjust for the potential effects of baseline characteristics that may be disproportionately over- or underrepresented among treatment or control cases. Because the baseline survey included such an extensive set of individual and household attributes, this approach will adequately protect against the risk of nonresponse bias, including bias associated with characteristics, such as motivation, that are normally unobservable.
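The covariate-adjusted estimation described above amounts to regressing the outcome on a treatment indicator plus baseline covariates and reading the program effect off the treatment coefficient. The sketch below is a minimal, hypothetical illustration using ordinary least squares; the outcome, effect size, and covariate are simulated assumptions for demonstration only.

```python
import numpy as np

def adjusted_impact(outcome, treat, covariates):
    """Estimate the treatment effect on `outcome` via OLS:
    outcome ~ intercept + treatment indicator + baseline covariates.
    Returns the coefficient on the treatment indicator."""
    X = np.column_stack([np.ones(len(treat)),      # intercept
                         treat.astype(float),       # treatment indicator
                         covariates])               # baseline covariates
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    return beta[1]

# Hypothetical illustration: a true effect of $500 on savings, with a
# baseline covariate (prior savings) that also predicts the outcome.
rng = np.random.default_rng(1)
n = 814
treat = rng.random(n) < 0.5
prior = rng.normal(1000, 300, n)
savings = 500 * treat + 0.8 * prior + rng.normal(0, 200, n)

effect = adjusted_impact(savings, treat, prior.reshape(-1, 1))
```

Including the baseline covariate absorbs outcome variation unrelated to treatment, so the estimated effect recovers the simulated $500 impact with much greater precision than a raw treatment-control difference would.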


Current Supporting Statement A language

A.9.        Explanation of Any Payment or Gift to Respondents

 Our study plan includes tokens of appreciation to respondents in the amount of $20 at the baseline interview and again upon completing the follow-up survey.


At baseline, IDA applicants are asked to complete a 30-minute self-administered baseline questionnaire as part of the program’s intake procedures. The offer of a $20 token of appreciation provided at this time will encourage individuals to enroll in the study and will engender good will among the study enrollees, important to their continued study cooperation as members of either the treatment or control group.


At follow-up, however, study participants are again asked to complete a 30-minute survey, administered to them by telephone. The survey again collects a significant amount of detailed financial information. Individuals may view the need to provide such information as burdensome. This may be especially true of control subjects who are not vested in the program. Thus, to prevent differential nonresponse between treatment and control groups, ACF recommends offering respondents another $20 as a token of appreciation in order to improve cooperation at follow-up. Estimates of program impacts may be biased if the respondents in each group are not comparable due to differential group nonresponse.


 Proposed Revision to Supporting Statement A

 A.9.        Explanation of Any Payment or Gift to Respondents

Our study plan includes tokens of appreciation to respondents in the amount of $20 at the baseline interview, and $40 upon completing the follow-up survey.


At baseline, IDA applicants are asked to complete a 30-minute self-administered baseline questionnaire as part of the program’s intake procedures. The offer of a $20 token of appreciation provided at this time will encourage individuals to enroll in the study and will engender good will among the study enrollees, important to their continued study cooperation as members of either the treatment or control group.


At follow-up, however, study participants are again asked to complete a 30-minute survey, administered to them by telephone. The survey again collects a significant amount of detailed financial information. Individuals may view the need to provide such information as burdensome. This may be especially true of control subjects who are not vested in the program. Thus, to prevent differential nonresponse between treatment and control groups, ACF recommends offering respondents an additional $40 as a token of appreciation in order to improve cooperation at follow-up. Estimates of program impacts may be biased if the respondents in each group are not comparable due to differential group nonresponse.


Current Supporting Statement B Language

B.2 Procedures for Collection of Information

Data Collection. Households will be contacted by telephone approximately one week after the lead letter has been sent. Interviewers will introduce themselves, ask to speak to the selected respondent, and (when applicable) state “You may have received a letter from us,” then inform the potential participant about the study and proceed with the introductory script and informed consents (Attachment I).


B.3 Methods to Maximize Response Rates and Deal with Nonresponse

Respondent Tokens of Appreciation. Sample members who complete the baseline survey and the follow-up survey will receive $20 for their participation at each juncture. This token of appreciation will be mentioned first in the consent form and again in the lead letter (Attachment J) sent to sample members prior to the follow-up survey launch. In each instance, the token of appreciation is intended to encourage, but not obligate, participation. For the baseline survey, the token will be provided in a manner decided by the program agency, either in person or by mail. For the follow-up survey, the token will be mailed to respondents within 2 to 4 weeks after survey completion.


Proposed Revision to Supporting Statement B

B.2 Procedures for Collection of Information

Data Collection. Households will be contacted by telephone approximately one week after the lead letter has been sent. Lead letters (Attachment H) will include an invitation to take the survey via the web using a unique username and password. Interviewers will introduce themselves, ask to speak to the selected respondent, and (when applicable) state “You may have received a letter from us,” then inform the potential participant about the study and proceed with the introductory script and informed consents (Attachment I). The 12-month follow-up self-administered survey will contain the same security settings as the self-administered baseline interview.


B.3 Methods to Maximize Response Rates and Deal with Nonresponse

Respondent Tokens of Appreciation. Sample members who complete the baseline survey will receive $20 for their participation, and will receive $40 for their participation in the 12-month follow-up survey. This token of appreciation will be mentioned first in the consent form and again in the lead letter (Attachment J) sent to sample members prior to the follow-up survey launch. In each instance, the token of appreciation is intended to encourage, but not obligate, participation. For the baseline survey, the token will be provided in a manner decided by the program agency, either in person or by mail. For the follow-up survey, the token will be mailed to respondents within 2 to 4 weeks after survey completion.


Please note that since all participants have already consented to participate in the study, we do not propose any changes to the consent form at this time. However, we do propose some changes to the lead letter (Attachment H), which is attached.



