Request for Non-Material Change to Information Collection Request (ICR) 201208-1205-012: Workforce Investment Act Adult and Dislocated Worker Programs Gold Standard Evaluation: 15- and 30-Month Follow-Up Surveys
OMB Control No. 1205-0504
The Employment and Training Administration (ETA) of the U.S. Department of Labor is proposing a non-material change to a data collection approved by the Office of Management and Budget (OMB) in January 2013 via ICR 201208-1205-012. The OMB approval included administration of two study participant follow-up surveys for the Workforce Investment Act (WIA) Adult and Dislocated Worker Programs Gold Standard Evaluation (WIA Gold Standard Evaluation) (OMB Control No. 1205-0504). The WIA Gold Standard Evaluation will provide policymakers, program administrators, and service providers with information about the relative effectiveness of WIA-funded services (including training), how that effectiveness varies by target population, and how the services are implemented. The requested change is to increase the incentive offered to study participants who are unresponsive to outreach efforts to complete the surveys, which collect data on customers' service use and outcomes. Specifically, ETA is requesting approval to increase the incentive for these unresponsive study participants from $40 to $75.
Background
In January 2013, OMB approved the WIA Gold Standard Evaluation's study participant follow-up surveys, veterans' supplemental study, and cost analysis data collections under OMB Control Number 1205-0504. This non-material change request pertains only to the two follow-up surveys being administered to 6,204 study participants. Specifically, the increased incentive will be offered to the subset of study participants who are unresponsive to multiple outreach efforts (as defined below).
The study team randomly assigned all study participants to one of three groups: a group eligible for core services only, a group eligible for core and intensive services, and a group eligible for core, intensive, and training services; the last of these is referred to as the full-WIA group. The outcomes for the three groups will be compared to determine the effectiveness of training and the effectiveness of intensive services.
Study participants across all three groups are included in the survey sample: all members of the core and core-and-intensive groups and a subsample of members of the full-WIA group. The first follow-up survey is administered by telephone 15 months after random assignment and the second follow-up survey 30 months after random assignment. All respondents who complete a survey receive an incentive payment. Following the OMB-approved strategy, survey sample members are initially offered $25 to complete a survey. The incentive increases to $40 for sample members who are unresponsive to multiple outreach attempts. Sample members are deemed unresponsive to outreach attempts, and thus eligible for the $40 incentive payment, only if they have not completed an interview: (1) within three months of the first attempt to contact the sample member; and (2) after 15 attempts have been made to call the sample member and three letters or postcards have been sent.
The 15-month follow-up survey has been in the field since April 2013 and, as of December 2014, has a response rate of 78 percent, below the target response rate of 82 percent. Of the sample members who completed the survey, 80 percent received a $25 incentive and 20 percent received $40. Intensive field and locating efforts continue for the remaining 22 percent who have yet to complete the survey. Although these remaining sample members are offered a $40 incentive, reaching them and gaining their cooperation in the field continues to be a challenge. The evaluation has introduced additional outreach attempts and has extended the fielding period by several months. Outreach efforts to increase the response rate include sending additional reminder postcards and emails, employing experienced interviewers to make refusal-conversion calls, conducting locating searches using databases from multiple vendors, and sending refusal-conversion letters. Together, these efforts have not greatly increased the response rate.
The 30-month follow-up survey has been underway since June 2014, with the first 6 of a planned 16 sample releases active, and has a response rate of 20 percent as of December 2014. Of the sample members who completed the survey, 96 percent received a $25 incentive and 4 percent received a $40 incentive. The 80 percent of the sample that has yet to complete the survey is either undergoing intensive field or locating efforts (18 percent) or awaiting a future sample release date (62 percent). Comparing the 30-month survey response rate to the 15-month survey response rate at a similar point in its fielding period shows a lag, despite the increased outreach and reminder efforts described above, which have also been undertaken for the 30-month survey. Projecting this lag forward suggests that the target response rate of 82 percent will be difficult to reach without additional interventions to boost the number of responses.
Proposed Incentive Changes
The proposed change will increase the incentive from $40 to $75 for sample members who are unresponsive to outreach efforts. For the 15-month follow-up survey, all remaining sample members have now been subject to the increased outreach efforts for more than three months and are currently offered $40; this offer would increase to $75. For the 30-month follow-up survey, once sample members are deemed unresponsive to outreach attempts at the $25 incentive level, they will become eligible for a $75 incentive payment. The same eligibility criteria described above will be used.
Reasons to Increase Incentives
Incentives can help achieve high response rates by increasing sample members' propensity to respond (Singer et al. 2000) and to persist through survey completion (Göritz 2006). Meta-analyses show that increasing the amount of the incentive increases the response rate (Singer et al. 1999; Gelman, Stevens, and Chan 2002). Increasing incentives can help achieve higher response rates by improving sample members' cooperation, facilitating contact with sample members, avoiding additional refusals, and helping reach sample members through locating efforts such as postcards or refusal-conversion letters. Higher incentives can also help reach certain subgroups that typically complete the survey at lower rates.
Increasing incentives to boost response rates will benefit the WIA Gold Standard Evaluation's impact analysis. Both of the evaluation's follow-up surveys currently have lower response rates for the limited-service groups (the core and core-and-intensive groups) than for the full-WIA group. This discrepancy was anticipated during the design phase because customers with less connection to the range of WIA services are more likely to refuse or avoid the survey. Increasing the incentive for the remaining sample members should help boost response rates within the limited-service groups. Gaining their cooperation and increasing the number of responses from these groups is critical to achieving the study's goals for minimal attrition between groups and for the minimum detectable impacts on quarterly earnings and other key measures outlined in the Part B Supporting Statement of the information collection approved under OMB Control Number 1205-0504. In addition, nonresponse bias in the impact estimates will likely be reduced.
Determining Incentive Amounts
As part of the National Evaluation of the Trade Adjustment Assistance (TAA) Program (OMB Control No. 1205-0460), Mathematica Policy Research conducted an experiment that offered different incentive levels to sample members who were not responding to outreach attempts. Nonrespondents were randomly assigned to three groups: (1) a group offered an incentive of $25, the same amount paid to respondents; (2) a group offered an incentive of $50; and (3) a group offered an incentive of $75. The experiment found that the response rate was 9.4 percentage points higher with an incentive of $50 than with an incentive of $25, a difference that was statistically significant; the response rate was 15.0 percentage points higher with an incentive of $75 than with an incentive of $25.1 In addition, respondents called in sooner at each higher incentive level. Mathematica and ETA determined that a $50 incentive for sample members who did not respond to initial outreach efforts was cost effective for TAA but that a $75 incentive was not. Upon conclusion of the experiment, the TAA Evaluation immediately began offering $50 incentives to all sample members in the comparison group and to treatment group members with a tenuous connection to the TAA program. For the second follow-up survey, TAA used an initial incentive offer of $50 for three of the four respondent groups; only the TAA treatment group was initially offered $25.
For the 15-month follow-up survey for the WIA Gold Standard Evaluation, a $75 incentive should be more cost effective than in the TAA experience because of the large locating and field costs associated with the small number of remaining sample members. The cost per completed interview in the field is currently around $585 and would be expected to fall with a larger incentive offer.
On the 30-month follow-up survey, locating efforts are very intensive because of the long period between study intake and the survey follow-up. More reminders are being used than for the first follow-up survey, including one advance letter, five postcards, and two emails, in addition to any locating letters or refusal-conversion letters. Up to 30 call attempts are being made on each working phone line.
Unresponsive sample members on the 30-month survey are currently offered an increased incentive of $40 to complete the survey. If the higher incentive offer is unsuccessful, a field locator is dispatched to attempt in-person locating. This intensive locating effort is successful at generating additional responses but has proven costly. The cost per completed interview for cases that require a field locator is around $260 higher than for cases completed by telephone. Increasing the incentive offer for unresponsive study participants from $40 to $75 would add $35 per completed interview in incentive outlays but could reduce overall costs by lowering the number of cases that require costly field locating. Experience with the TAA project and the existing literature (Markesich and Kovac 2003; Gelman, Stevens, and Chan 2002) show that higher incentive amounts spur sample members to complete surveys faster, reducing locating efforts and the number of outbound call attempts. The TAA Evaluation found that a $75 incentive level was not cost effective even though it increased the response rate; however, the TAA project did not include a field locating component, which could greatly affect cost-benefit determinations based on incentive experiment results.
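As a rough, illustrative break-even calculation based only on the approximate per-case figures above (not on the evaluation's detailed budget): the higher incentive adds about $35 per completed interview among unresponsive cases, while each case completed by telephone rather than by a field locator avoids roughly $260 in additional costs. Under these assumptions, the incentive increase pays for itself if it shifts at least about one in seven otherwise field-located cases to telephone completion ($35 ÷ $260 ≈ 0.13).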
Total Costs of Revised Incentive Amounts
Increasing the incentive from $40 to $75 for all remaining sample members on the WIA 15-month follow-up survey would increase total incentive payment outlays by $8,890, assuming the target response rate is reached. Increasing the incentive from $40 to $75 for unresponsive study participants on the WIA 30-month follow-up survey is expected to increase total incentive payment outlays by $64,940. The costs of the additional incentive payments are expected to be partially offset by a reduction in locating and field efforts. Any additional costs would be covered by funding already allocated to the evaluation. There are no changes to burden related to the follow-up surveys.
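As an illustrative check on the 15-month figure (not a number drawn from the evaluation's budget): at $35 of additional incentive per completed interview, $8,890 corresponds to roughly 254 interviews completed at the higher incentive level ($35 × 254 = $8,890).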
Burden
There will be no change to the burden estimates based on this request. The burden estimates provided in the original submission to OMB remain the same and are presented in Table 1 and Table 2, below.
Table 1. Annual Burden Estimates for WIA Evaluation Follow-up Surveys, Cost Data Collection, and Veterans’ Supplemental Study
| Activity | Annualized Number of Respondentsᵃ | Number of Responses per Respondent | Average Burden Hours per Response | Total Annual Burden Hours |
|---|---|---|---|---|
| WIA Evaluation Follow-up Surveys | | | | |
| 15-month follow-up | 2,460 | 1 | 40 minutes | 1,640 |
| 30-month follow-up | 2,460 | 1 | 30 minutes | 1,230 |
| Average annualized burden for follow-up surveysᵃ | 2,460 | - | 35 minutes | 1,435 |
| Cost Data Collection Package | | | | |
| Program costs questionnaire | 28 | 1 | 12 hours | 336 |
| Front-line staff activity log | 336 | 1 | 1.25 hours | 420 |
| Resource room sign-in sheet | 10,000 | 1 | 30 seconds | 83 |
| Annual burden for cost data collectionᵃ | - | - | - | 839 |
| Veterans' Supplemental Study (VSS) | | | | |
| VSS visits - staff preparations | 28 | 1 | 4 hours | 112 |
| Staff interviews - AJC staff | 168 | 1 | 20 minutes | 56 |
| Staff interviews - DVOP/LVER staff | 56 | 1 | 1 hour | 56 |
| Staff interviews - State veteran coordinators | 19 | 1 | 1 hour | 19 |
| Focus groups - staff preparation | 8 | 1 | 1 hour | 8 |
| Focus groups with veterans | 56 | 1 | 1 hour | 56 |
| Annual burden for VSSᵃ | - | - | - | 307 |
| Total Annualized Burdenᵃ | - | - | - | 2,581 |

ᵃ The follow-up surveys each span two years; the cost data collection and VSS data collection occur within one year.
Table 2. Monetized Burden Hours
| Activity/Respondent | Annualized Number of Burden Hours | Type of Respondent | Average Hourly Cost | Annualized Indirect Cost Burden |
|---|---|---|---|---|
| WIA Evaluation Follow-up Surveys | | | | |
| 15-month follow-up | 1,640 | WIA customer | $7.25 | $11,890 |
| 30-month follow-up | 1,230 | WIA customer | $7.25 | $8,918 |
| Annualized cost burden for follow-up surveys | - | - | - | $10,404 |
| Cost Data Collection Package | | | | |
| Program costs questionnaire | 336 | WIA admin staff | $35.18 | $11,820 |
| Front-line staff activity logs | 420 | WIA front-line staff | $22.20 | $9,324 |
| Resource room sign-in sheet | 83 | WIA customer | $7.25 | $602 |
| Annualized cost burden for cost data collectionᵃ | - | - | - | $21,746 |
| Veterans' Supplemental Study (VSS) | | | | |
| VSS visit - staff preparation | 112 | Local staff | $22.20 | $2,486 |
| Staff interviews - AJC staff | 56 | Local staff | $22.20 | $1,243 |
| Staff interviews - DVOP/LVER staff | 56 | Local staff | $22.20 | $1,243 |
| Staff interviews - State veteran coordinators | 19 | State staff | $35.18 | $668 |
| Focus groups - staff preparation | 8 | Local staff | $22.20 | $178 |
| Focus groups with veterans | 56 | WIA customer | $7.25 | $406 |
| Annualized cost burden for VSSᵃ | - | - | - | $6,224 |
| Grand Total - Annualized Cost Burden | | | | |
| Annual Total Costᵃ | - | - | - | $38,374 |

ᵃ The follow-up surveys each span two years; the cost data collection and VSS data collection occur within one year.
References
Gelman, Andrew, Matt Stevens, and Valerie Chan. "Regression Modeling and Meta-Analysis for Decision Making: A Cost-Benefit Analysis in Telephone Surveys." Journal of Business & Economic Statistics, vol. 21, no. 2, 2002, pp. 213-225.
Göritz, Anja S. "Incentives in Web Studies: Methodological Issues and a Review." International Journal of Internet Science, vol. 1, no. 1, 2006, pp. 58-70.
Markesich, Jason, and Martha D. Kovac. "The Effects of Differential Incentives on Completion Rates: A Telephone Survey Experiment with Low-Income Respondents." Presented at the Annual Conference of the American Association for Public Opinion Research, 2003.
Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. “Experiments with Incentives in Telephone Surveys.” Public Opinion Quarterly, vol. 64, no. 2, summer 2000, pp. 171-188.
Singer, Eleanor, John Van Hoewyk, Nancy Gebler, Trivellore Raghunathan, and Katherine McGonagle. “The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys.” Journal of Official Statistics, vol. 15, no. 2, 1999, pp. 217-230.
1 Results conveyed in a memorandum to DOL, "Short-Term Results of the New Survey Procedures for the TAA Evaluation," November 20, 2008.