NonSub Change Request - Parent Incentive


Project LAUNCH Cross-Site Evaluation


OMB: 0970-0373


DATE: September 27, 2017


TO: Josh Brammer

Office of Information and Regulatory Affairs; Office of Management and Budget


FROM: Laura Hoard

Office of Planning, Research, and Evaluation; Administration for Children and Families


SUBJECT: Change request to include incentives for respondents completing the Parent Survey for the Project LAUNCH Multi-Site Evaluation (0970-0373)


The Project LAUNCH Multi-Site Evaluation (MSE) seeks approval to provide incentives to respondents to the Parent Survey in order to address the response bias observed among current respondents. The goal of the MSE is to evaluate the impact of Project LAUNCH interventions on families in the communities where the interventions took place. These communities are primarily lower-income and less educated and, in some states, include a higher percentage of racial and ethnic minorities than the general population. The MSE consists of two parts, referred to as Part A and Part B. Part B includes data collection with parents (the Parent Survey) in 10 communities served by LAUNCH interventions and 10 demographically similar matched comparison communities.


Recruitment of early childhood education centers (ECEs) and parents began in January 2017. We are recruiting parents from schools and ECEs in ZIP codes served by the LAUNCH program and in the comparison ZIP codes. These ECEs were selected randomly from a list of licensed ECEs serving more than 20 children; the list includes ECEs that accept childcare subsidies as well as Head Start/Early Head Start programs. Once a school or ECE agrees to participate, its staff publicize the survey and recruit parents to volunteer to complete it.


To date, we have gained the cooperation of 100 schools/ECEs and have 664 parent volunteers, as shown in Table 1 below. However, the completed surveys show that the respondents are not representative of the communities where they live.




Table 1: Current Status of Project LAUNCH Recruitment

|            | # Communities with 1+ Participating Institution* | Total # of Institutions Recruited | # Institutions That Have Provided Parent Contact Info | # Parent Surveys Sent | # Parent Surveys Completed |
|------------|----|-----|----|-----|-----|
| LAUNCH     | 10 | 55  | 31 | 382 | 149 |
| Comparison | 10 | 45  | 25 | 282 | 118 |
| Total      | 20 | 100 | 56 | 664 | 267 |

* Institution = school or early childhood education program (ECE)


We analyzed the demographic data from the first 224 completed Parent Surveys1 by comparing respondents' reported race/ethnicity, education level, employment status, and income range to the averages for these variables from American Community Survey (ACS) data for these communities. The Parent Survey respondents to date are more likely to be white, college-educated, employed full-time, and in the highest income category than the parents in these communities at large (see Table 2 below). This response bias poses significant risks to the validity of the survey results.




Table 2: Comparison of Demographics of Respondents to Project LAUNCH Parent Survey and Average ACS Results for the Communities Sampled.


|                            | Parent Survey Responses as of 9/6/2017 | Average of ACS for ZIP Codes from Which Schools/ECEs Were Sampled |
|----------------------------|-------|-------|
| Race/Ethnicity             |       |       |
| Non-Hispanic               | 93.4% | 90.2% |
| Hispanic                   | 6.6%  | 9.9%  |
| Black                      | 31.1% | 52.8% |
| White                      | 57.7% | 40.1% |
| Other2                     | 8.9%  | 7.1%  |
| Missing                    | 3.6%  | n/a   |
| Highest Level of Education |       |       |
| <HS                        | 1.2%  | 13.5% |
| HS or GED                  | 10.6% | 33.3% |
| Some college               | 16.7% | 21.4% |
| 2-yr college               | 13.0% | 7.4%  |
| 4-yr college or higher     | 57.7% | 24.5% |
| Missing3                   | 0.8%  | n/a   |
| Employment4                |       |       |
| Full-time                  | 79.7% | 52.2% |
| Part-time                  | 12.2% | 26.2% |
| Not employed               | 6.1%  | 21.5% |
| Missing                    | 0.4%  | n/a   |
| Income Category            |       |       |
| <$10K                      | 6.9%  | 14.4% |
| $10K-$24K                  | 8.1%  | 38.2% |
| $25K-$49K                  | 15.9% | 27.1% |
| $50K+                      | 60.6% | 34.7% |
| Missing                    | 8.5%  | n/a   |
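To make the size of these gaps concrete, the short sketch below computes the percentage-point difference between the Parent Survey respondents and the ACS community averages for the education categories reported in Table 2. It is illustrative only; the dictionaries and variable names are ours and are not part of the MSE survey instrument or analysis code.

```python
# Illustrative only: percentage-point gaps between Parent Survey respondents
# and ACS community averages, using the education figures reported in Table 2.
# These ad hoc data structures are not the MSE analysis code.

survey = {  # Parent Survey responses as of 9/6/2017 (percent)
    "<HS": 1.2,
    "HS or GED": 10.6,
    "Some college": 16.7,
    "2-yr college": 13.0,
    "4-yr college or higher": 57.7,
}

acs = {  # Average of ACS for sampled ZIP codes (percent)
    "<HS": 13.5,
    "HS or GED": 33.3,
    "Some college": 21.4,
    "2-yr college": 7.4,
    "4-yr college or higher": 24.5,
}

for category in survey:
    gap = survey[category] - acs[category]
    direction = "over" if gap > 0 else "under"
    print(f"{category}: {gap:+.1f} percentage points ({direction}-represented)")

# Example output line: "4-yr college or higher: +33.2 percentage points (over-represented)"
```

The same calculation applied to the race/ethnicity, employment, and income rows in Table 2 shows the same pattern: respondents to date over-represent white, full-time employed, and higher-income parents.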


The response bias in the group of parents who have responded to the survey is a significant challenge because capturing representative perspectives is key to our ability to address the research questions. Based on this response bias, we seek approval to provide a $25 gift card to parents who complete the 30-minute survey.


To ensure that we enroll ECEs that serve parents who are reflective of their communities, we established a rigorous recruitment protocol. We make multiple contact attempts, providing background information about the study and answering questions from ECE staff. Where possible, the LAUNCH grantees have also reached out to the ECEs in their area to reinforce the importance of the study. Recruiters have noted that some ECEs serving low-income populations have declined to participate, indicating that their parents would not participate in a survey without an incentive.


Once an ECE does agree to participate, we rely on the ECE coordinator to serve as a liaison between the research team and parents. Our recruiters strategize with the ECE coordinator on how to recruit parent volunteers most effectively so that we obtain a representative mix of parents. These efforts have included asking ECE staff to distribute flyers more than once, making sure staff understand the importance of the survey so they can communicate it to parents, and having them promote the survey at open houses for parents. Several ECE coordinators have described to our recruiters the challenges they face in getting parents to complete other forms, such as applications for subsidies, and have expressed concern that parents will not volunteer to complete a survey they do not perceive as bringing them any immediate benefit. For parents who do volunteer and provide contact information, the survey contractor follows up multiple times, using best practices for survey data collection, including sending reminder emails on different days of the week and at different times of day and varying the email subject line. For those who do volunteer, the survey completion rate (45%) is close to what we expected (50%).


Previous research indicates that incentives improve the representativeness of survey samples. Several studies demonstrate the effectiveness of incentives with hard-to-reach populations similar to those missing from the Project LAUNCH MSE sample, namely lower-income, less-educated, and minority populations. Beebe et al. (2005) showed that an incentive increased response rates across the board in a survey of Medicaid recipients, and specifically among minority populations in the sample. Other studies have shown that incentives increase participation of respondents typically under-represented in surveys, such as those with low education levels (Singer, Van Hoewyk, and Maher, 2000), racial/ethnic minorities, and low-income households (Mack et al., 1998). For example, Mack et al. (1998) found that, while a $10 incentive had little effect on response rates, a $20 incentive (in 1996 dollars, not adjusted for inflation) boosted response rates overall and particularly among low-income individuals and African Americans. Martinez-Ebers (1997) found that incentives significantly increased the proportion of Hispanic respondents at follow-up. Research in the Wisconsin Pregnancy Risk Assessment Monitoring System (PRAMS) found that, compared with a coupon or no incentive, a small cash incentive significantly improved response rates among African Americans (Dykema et al., 2012).


It is imperative that we collect data from a population that is representative of the LAUNCH and comparison communities in order to evaluate the impact of the program. Currently, the sample that has completed the Parent Survey is not representative of these communities. Previous research provides evidence that offering incentives will increase our chances of completing the study with a sample that is representative of the communities served by Project LAUNCH. Recruitment of parents will end in January, and we will continue to prompt parents to complete the surveys through April. We strongly believe that the best strategy for obtaining survey responses from a representative set of parents in that timeframe is to offer parents a modest monetary incentive.


We understand that OMB requires justification for the use of incentives. We look forward to reporting back on the results of using incentives in Project LAUNCH, including a comparison of the demographics of parents who responded before the incentive was offered with those of parents who responded after it was offered, in order to inform future research.




Beebe, T. J., Davern, M. E., McAlpine, D. D., Call, K. T., & Rockwood, T. H. (2005). Increasing response rates in a survey of Medicaid enrollees: The effect of a prepaid monetary incentive and mixed modes (mail and telephone). Medical Care, 43(4), 411-414.

Dykema, J., Stevenson, J., Kniss, C., Kvale, K., González, K., & Cautley, E. (2012). Use of monetary and nonmonetary incentives to increase response rates among African Americans in the Wisconsin Pregnancy Risk Assessment Monitoring System. Maternal and Child Health Journal, 16(4), 785-791. doi:10.1007/s10995-011-0780-2


Mack, S., Huggins, V., Keathley, D., & Sudukehi, M. (1998). Do monetary incentives improve response rates in the Survey of Income and Program Participation? U.S. Bureau of the Census, Demographic Statistical Methods Division, Washington, D.C. Available at: http://www.amstat.org/sections/srms/Proceedings/papers/1998_089.pdf


Martinez-Ebers, V. (1997). Using monetary incentives with hard-to-reach populations in panel surveys. International Journal of Public Opinion Research, 9(1), 77-86.


Singer, E., Van Hoewyk, J., & Maher, M. P. (2000). Experiments with incentives in telephone surveys. Public Opinion Quarterly, 64, 171-188.



1 The 224 responses are from 18 communities in Alabama, Colorado, Delaware, Georgia, Indiana, Maryland, New Hampshire, and Tennessee. Responses from Montana (n=22) were excluded from this analysis because ACS data are not available at the same geographic level to allow a direct comparison.

2 The ‘Other’ category includes: two or more races, American Indian or Alaska Native, Asian Indian, Chinese, Filipino, Japanese, Korean, Vietnamese, Other Asian, Native Hawaiian, Guamanian or Chamorro, and Samoan.

3 There are no “missing” data in the ACS as missing responses are imputed.

4 An additional four respondents were excluded as they were retired or disabled.

