
Justification for Non-Substantive Change Request for OMB 1205-0506, Green Jobs and Health Care Impact Evaluation: Incentive Increase to Enhance Follow-Up Survey Response Rate


This document justifies ETA’s request to increase incentive payments from $25 to $45 for all remaining sample members in the 18-month and 36-month follow-up surveys for the Green Jobs and Health Care Impact Evaluation. The change will result in a minimal cost increase and will not affect the hour burden for OMB 1205-0506, approved in January 2013. Its aim is to enhance follow-up survey response rates.


Background:

The U.S. Department of Labor’s Employment and Training Administration (ETA) has undertaken the Green Jobs and Health Care (GJ-HC) Impact Evaluation of the Pathways Out of Poverty and Health Care and High Growth Training grant initiatives. The evaluation’s goals are 1) to determine the extent to which enrollees achieve increases in employment, earnings, and career advancement as a result of their participation in the training provided by Pathways and Health Care grantees and 2) to identify promising practices and strategies for replication. ETA has contracted with Abt Associates and its subcontractor, Mathematica Policy Research, to conduct this evaluation.


In July 2011, OMB approved the baseline data collection for this evaluation (OMB 1205-0486), and in March 2012, OMB approved a subsequent request for the process study data collection, which includes site visits and focus group administration (OMB 1205-0487). In January 2013, OMB approved the 18- and 36-month follow-up survey data collection (OMB 1205-0506).


Abt began the 18-month survey of participants in February 2013 and found that some participants in these training programs are very difficult to reach and persuade to provide information. Given that sample sizes for this project are now below projected levels overall, it is critical to achieve as high a response rate as possible on the survey. Several key impact measures, particularly participation in education and training, credential receipt, and the type and quality of employment, are measured through participant responses to the survey. Abt’s ability to detect meaningful impacts on these key measures is contingent on having the maximum number of sample members complete the survey. Therefore, Abt has proposed offering an increased incentive to the remaining sample members who have not yet completed the 18-month follow-up survey and to all sample members for the 36-month survey in order to increase the total response rate.


How The Increased Incentive Offer Would Work:

To improve response rates, Abt proposes to adjust the initial study plan and offer a higher incentive of $45 to the remaining 18-month survey sample members and to the entire 36-month survey sample. The survey fielding effort involves releasing sample members in batches, called “releases.” So far, the first 7 of the 23 planned releases have occurred; these 7 releases represent about 34 percent of the full study sample. Sample members currently in the field will continue to be offered the original $25 incentive until the new $45 incentive is extended to all remaining sample members starting in September 2013. Sample members released in September (release number 8) will be notified of the increased incentive in the advance letter they receive before being contacted by phone to complete the interview. At that time, all remaining sample members who have not yet completed an interview will also be informed of the new incentive through the regular follow-up contact mailings and field locator scripts. This approach ensures that all sample members still active as of September 2013 become eligible for the increased incentive at the same time. In addition to enhancing operational efficiency, it allows the increased incentive to appeal to both newly released sample members and remaining nonrespondents. Moreover, the incentive will be offered to both treatment and control group members, with no distinction between the two groups; thus, the higher incentive has the potential to reduce the likelihood of a problematic treatment-control differential in response rates at the end of the survey fielding period.


The evaluation has a total sample of 2,652. To date, approximately 34 percent of the full study sample has been contacted to complete the survey, based on the 18-month anniversary of random assignment. If Abt offers the increased payment to the remaining 66 percent of the sample members in the 18-month survey and to all sample members for the 36-month survey, the total projected cost of the increased incentive would be about $70,536.
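
For reference, a minimal sketch of the incremental-cost arithmetic is shown below. The per-interview increment is $20 ($45 minus $25); the 80 percent completion-rate assumption used here is purely illustrative and is not part of this request, so the result only approximates the $70,536 projection above.

```python
# Illustrative sketch of the incremental incentive-cost projection.
# The completion-rate assumption is hypothetical; the exact assumptions
# behind the $70,536 figure are not stated in this request.

TOTAL_SAMPLE = 2652                 # full evaluation sample
SHARE_REMAINING_18_MONTH = 0.66     # share of 18-month sample not yet fielded
INCREMENT_PER_COMPLETE = 45 - 25    # extra dollars paid per completed interview
ASSUMED_COMPLETION_RATE = 0.80      # illustrative assumption only

remaining_18_month = TOTAL_SAMPLE * SHARE_REMAINING_18_MONTH
eligible_cases = remaining_18_month + TOTAL_SAMPLE   # plus full 36-month sample
expected_completes = eligible_cases * ASSUMED_COMPLETION_RATE
incremental_cost = expected_completes * INCREMENT_PER_COMPLETE

print(f"Expected completed interviews at $45: {expected_completes:,.0f}")
print(f"Projected incremental incentive cost: ${incremental_cost:,.0f}")
# Prints roughly $70,000, in the neighborhood of the $70,536 cited above.
```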


An Overview of Fielding Efforts and Justification For The Increased Incentive:

Abt and Mathematica’s original fielding plan called for approximately 8 weeks of outbound dialing to sample members, followed by about 4 weeks of field locating. At the 8-week point (ending on May 1, 2013), Mathematica had achieved a response rate of 33 percent for sample release one, lower than the 54 percent projected for that point in the fielding period. This low response rate persisted despite efforts to contact sample members using different approaches and modes. Between the initial fielding date in late February and the 8-week mark, Mathematica sent hand-written reminder postcards, email reminders, refusal conversion letters, and Facebook reminders to “Release One” sample members. In addition, Mathematica extended the field locating effort from the originally planned 4 weeks to 10 weeks, because field efforts significantly boosted response rates once contact was made by prompting respondents to call in to complete the interview. However, response rates remain lower than expected, hence the need for additional efforts to boost the response rate so that accurate impact estimates can be generated. The increased incentive is one of several additional efforts that Abt and Mathematica are exploring to increase the response rate.


The increased incentive is needed to maximize response rates for several reasons:


  • Increasing the response rate and obtaining as many follow-up responses as possible is critical to estimating accurate impacts of these training programs.

  • A higher response rate will likely reduce non-response bias in the impact estimates. While there is some debate in the literature about the relation between response rates and non-response bias, the consensus remains that higher response rates reduce concerns about non-response bias, which in an impact evaluation arises in particular from differential treatment-control response rates. Raising the incentive might plausibly raise the response rate by a few percentage points.

  • The increased incentives do not affect the hour burden calculated for this project but do improve the accuracy of the estimates.

  • The cost of the increased incentives is within the budget allocated for this project.


The literature on the effectiveness of incentives on response rates in phone surveys generally finds that increases in monetary incentives improve response rates. However, the relationship is not strictly linear: the effect on response rates diminishes as the dollar amount of the incentive increases (Gelman, Stevens, and Chan, 2003). Much less is known in the survey literature about converting non-responding sample members by using higher incentives. In a few of Mathematica’s surveys for DOL, however, increased incentive amounts have been used successfully to boost response rates among non-responding sample members. In the National Evaluation of the Trade Adjustment Assistance (TAA) Program, which serves a similar population of unemployed and/or dislocated workers (although in the manufacturing field), an incentive-increase experiment was carried out for different categories of sample members. In the TAA study, which was initially approved by OMB to offer a $25 incentive to all sample members, the incentive was increased to $50 or $75 for some sample members and kept at $25 for others. Both the $50 and $75 incentives significantly increased response rates compared with sample members who received only $25: among prior nonrespondents, the higher incentive increased the response rate from 41 percent to 55 percent, and among new sample members randomly assigned to the $50 incentive group rather than the $25 group, response rates were 53 percent and 49 percent, respectively.




Works Cited


Gelman, Andrew, Matt Stevens, and Valerie Chan. 2003. “Regression Modeling and Meta-Analysis for Decision Making: A Cost-Benefit Analysis of Incentives in Telephone Surveys.” Journal of Business & Economic Statistics 21(2): 213–225.
