Memo - Results of offering incentives for a parent survey – Project LAUNCH Multi-site Evaluation



To: Office of Information and Regulatory Affairs (OIRA)

Office of Management and Budget (OMB)


From: Laura Hoard, Nicole Deterding

Office of Planning, Research, and Evaluation (OPRE)

Administration for Children and Families (ACF)


Re: Results of offering incentives for a parent survey – Project LAUNCH Multi-site Evaluation (OMB 0970-0373)


Date: January 30, 2019



This memo reports the results of offering monetary incentives to respondents for the Project LAUNCH Multi-site Evaluation (MSE) Parent Survey. After early concerns about the demographic representativeness of survey respondents, we observed that the addition of a $25 incentive substantially improved the representativeness of the final data, increasing the proportion of lower-income, non-Bachelor’s-degree, and non-White survey completers. Despite the addition of an individual-level incentive to the fielding protocol, parents with a high school degree or less remained substantially underrepresented at the conclusion of data collection.1


This memo provides background on the study, a brief review of the literature on the use of incentives in similar studies, a summary of the data collection process, and an analysis of key demographics before and after offering incentives. We conclude with discussion of some of the limitations of the analysis and suggestions for building further evidence about the role of incentives in improving survey data quality.

Background

The purpose of Project LAUNCH (Linking Actions for Unmet Needs in Children’s Health) is to promote the wellness of young children from birth to 8 years of age by addressing the physical, social, emotional, cognitive, and behavioral aspects of their development. The primary goal of Project LAUNCH is to improve the school readiness and life outcomes of all children, in support of SAMHSA’s larger mission of reducing the impact of substance abuse and mental illness on America’s communities.

SAMHSA awarded cooperative agreements to states, communities, tribes, and territories to implement activities that improve coordination across child-serving systems, build infrastructure, and increase access to high-quality prevention and wellness promotion services for young children and their families. Grantees selected one pilot community in their jurisdiction to implement five core prevention and promotion strategies: 1) screenings and assessments of children’s social-emotional health; 2) enhanced home-visiting programs; 3) mental health consultation in early care and education; 4) family strengthening and parenting skills training; and 5) integration of behavioral health care into primary care settings.2

Grantees used LAUNCH resources to provide direct support and training within focal communities and to help build infrastructure to improve communication and coordination across child- and family-serving systems. Examples of direct services include mental health consultation to teachers, families, and young children; training child-care workers on social-emotional curricula; and offering family strengthening interventions. Systems-level strategies include activities related to coalition building, early childhood advocacy/policy change, public information/awareness campaigns, and fundraising/sustainability. Examples of systems-level activities included the development of young child wellness councils, which brought together child-serving groups that might not typically work together (such as pediatricians and child-care providers), and efforts to improve the developmental screening, assessment, and referral resources available to pediatricians and child-care providers when they identify a child or family in need of services.


As part of the Multi-Site Evaluation (MSE) of Project LAUNCH, NORC at the University of Chicago surveyed parents in 10 LAUNCH communities and 10 demographically similar communities to estimate the overall impact of the program on child and family outcomes at the community level.3 The LAUNCH communities selected for the MSE parent survey ranged from selected neighborhoods within large urban areas to entire rural counties. Using demographic data from the American Community Survey (ACS), NORC identified socio-economically similar comparison communities. Comparison communities were defined as a set of contiguous ZIP codes within the same state as the focal LAUNCH site, but far enough from the LAUNCH site to minimize any spillover effects. To select the most appropriate comparisons, NORC used ACS data at the tract level, matching on various socio-economic measures, including percentage below the federal poverty level; percentage of the population with a high-school education; household income; percentage Hispanic and African American; percentage of the adult population uninsured; percentage on Medicaid; percentage of households that are single-parent households; and percentage home ownership.
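The memo does not describe NORC’s exact matching procedure beyond this list of measures. As a minimal sketch of one standard approach consistent with that description, the snippet below scores candidate comparison areas by Euclidean distance to the LAUNCH site on standardized ACS measures; all area names and values are hypothetical.

```python
# Illustrative sketch only: nearest-neighbor matching on standardized ACS
# measures. The memo lists the measures but not the algorithm, and every
# area name and value below is hypothetical.
import numpy as np

# Order of measures: % below poverty, % with HS education, median household
# income ($K), % Hispanic, % African American, % adults uninsured,
# % on Medicaid, % single-parent households, % home ownership.
launch_site = np.array([28.0, 33.0, 41.0, 10.0, 45.0, 12.0, 38.0, 31.0, 44.0])
candidates = {
    "candidate_A": np.array([27.0, 34.0, 43.0, 9.0, 47.0, 11.0, 36.0, 30.0, 46.0]),
    "candidate_B": np.array([12.0, 22.0, 72.0, 4.0, 15.0, 6.0, 14.0, 18.0, 71.0]),
}

# Standardize each measure across all areas so no single scale dominates.
all_areas = np.vstack([launch_site, *candidates.values()])
mean, std = all_areas.mean(axis=0), all_areas.std(axis=0)
standardize = lambda v: (v - mean) / std

# Choose the candidate closest to the LAUNCH site in standardized space.
best = min(candidates,
           key=lambda k: np.linalg.norm(standardize(candidates[k])
                                        - standardize(launch_site)))
print(best)  # candidate_A, the demographically closer area
```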


Within the 20 selected communities, NORC identified eligible elementary schools and early care and education centers (ECEs). Initially, they randomly selected schools and ECEs from lists of eligible institutions, with eligibility defined as being a licensed child care provider serving at least 20 children. The study team did not have access to demographic information about the children served as part of the sample frame. By the end of the study, NORC had exhausted the entire list of eligible ECEs in its attempts to gain participation from six in each community.

The original study design proposed offering a $25 incentive to parents with a child (or children) attending a selected ECE, upon completion of the 30-minute web survey. However, the Office of Management and Budget (OMB) did not approve the request to provide parent incentives. After nine months of recruitment and five months of data collection, NORC compared the reported demographics of 244 survey respondents to the ACS demographics of the selected communities. While this was not a sufficiently large sample to test for statistically significant differences, the raw frequencies differed from the full target population, with less-educated, racial and ethnic minority, and lower-income parents under-represented among survey respondents. This evaluation was designed to examine child and family impacts at the community level, and the early response bias meant that data would not reflect the demographics of families within the community. We provided OMB with information on the response bias (Appendix A), resulting in OMB’s approval for the study to offer a $25 incentive for parents who completed the survey.

Prior Research on Incentives and Non-Response Bias

The target population of the MSE Parent Survey was parents of children 0-8 years of age residing in the communities served by Project LAUNCH and their demographically similar comparison communities. Because of the intervention’s design, residents of these communities disproportionately have lower incomes and less education and, in some communities, are more likely to belong to racial or ethnic minority groups.

These groups are known to be less likely to respond to surveys than Whites and those with higher levels of education. The existing literature suggests that modest incentives can increase response rates among lower-income, less-educated, and racial and ethnic minority respondents. Beebe et al. (2005) showed that an incentive increased response rates across the board in a survey of Medicaid recipients, and specifically among minority populations in the sample. Other studies have shown that incentives increase participation among respondents typically under-represented in surveys, such as those with low education levels (Singer, Van Hoewyk, and Maher, 2000; Petrolia and Bhattacharjee, 2009), racial/ethnic minorities, and low-income households (Mack et al., 1998). For example, Mack et al. (1998) found that, while a $10 incentive had little effect on response rates, a $20 incentive (in 1996 dollars, not adjusted for inflation) boosted response rates overall, and particularly among low-income individuals and African Americans. More recently, research in the Wisconsin Pregnancy Risk Assessment Monitoring System (PRAMS) found that, compared to a coupon or no incentive, a small cash incentive significantly improved response rates among African Americans (Dykema et al., 2012). Baron et al. (2009) found that an incentive “produces the largest increases in response rates precisely amongst those groups which are least likely to respond in the absence of an incentive.”

Data Collection

The study had a three-stage recruitment model. First, NORC sought cooperation from schools and ECEs. They then asked these sites to help the team recruit parents, who voluntarily provided their contact information. Finally, NORC encouraged those providing contact information to complete the web survey. At each stage, study materials emphasized the salience of Project LAUNCH, describing the program’s benefits to young children and the impact such programs could have on local communities. While recruiters were able to offer the schools/ECEs remuneration for their assistance in the study, over 50% of the schools and ECEs NORC approached declined to participate.

Before OMB approved the use of parent incentives, some administrators declined to participate, expressing skepticism that parents would be interested in completing a survey without a monetary incentive. After nine months of study recruitment, only 55% of the target number of schools/ECEs had recruited any parent volunteers (66 of 120), and the average number of volunteers per school/ECE was 11.5, far below the study’s target of 30 per site.

Once OMB approved the use of parent incentives, recruiters re-contacted the ECEs that had previously declined to participate, as well as those that had previously agreed to participate but had not been successful in recruiting parent volunteers, providing them with updated study materials that mentioned the incentive. Additionally, parent volunteers who had not yet responded to the survey were informed by email that they would receive an incentive upon completion.

Findings: Incentives Increased Site Participation and Data Quality

Once the study began offering incentives, we observed an increase in the number of ECEs/schools that successfully recruited parent volunteers, and also an increase in the number of volunteers per ECE/school, as shown in Table 1.

Table 1: School/ECE Recruitment Before and After Incentives

| | Before incentives | After incentives |
|---|---|---|
| # ECEs/schools agreed to participate | 108* | 124 |
| # ECEs/schools that had recruited parent volunteers | 73 | 102 |
| % of ECEs/schools that had recruited parent volunteers | 67.6% | 82.3% |
| # of parent volunteers | 834 | 1832 |
| Average # volunteers per ECE/school | 11.4 | 17.9 |

*Six ECEs that previously agreed to participate, but had not yet recruited parent volunteers, declined to participate further and were replaced.



We also found that incentives induced parent volunteers to complete the survey more quickly, and that incentives substantially increased the response rate among volunteers, as shown in Table 2.

Table 2: Average Time to Completion and Completion Rates

| | Respondents who completed – no incentive offered | Respondents who completed – incentive offered at recruitment |
|---|---|---|
| Average # of days from invitation to completion | 14.9 | 7.0 |
| Completion rate after 4 weeks* | 37.4% | 55.8% |

*Once OMB approved the parent incentives, we offered them to all non-responders. We therefore limited this comparison to the first 4 weeks after invitation, so that the “no-incentive” completion rate reflects the period before any incentives were offered.

In addition to drawing in new parent volunteers, the incentive converted 22.5% (n=108) of the previously non-responding parent volunteers (n=480) into survey completers. While small in number, this group offers the clearest evidence of the impact of incentives on converting a different subset of volunteers to completion. More generally, the incentive may have induced demographically different sites, volunteers, survey completers, or a combination thereof.

NORC reviewed the survey responses for reported household income, the respondent’s education level, and the subject child’s race and ethnicity to determine whether the offer of a $25 incentive resulted in meaningful differences in the composition of these variables. For these analyses, they excluded missing values. NORC used one- and two-tailed tests of proportions to test for significant differences in the selected characteristics (at p<.05) before and after the incentives were offered. Incentives proved successful in obtaining responses from parents who had lower incomes, lower levels of education, and who had African-American children.4 Except for persistent under-representation of parents without a high school degree, the final set of survey respondents more closely approximates the desired inferential population than did the initial pre-incentive responses.
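To make the comparison concrete, the sketch below applies a two-sided test of proportions to the overall counts reported in Appendix B (respondents in households with incomes of $50K and above, excluding missing values). This is a minimal illustration of the kind of test described above, not NORC’s actual code; `proportions_ztest` from statsmodels is a standard implementation of the two-sample z-test of proportions.

```python
# Two-sample test of proportions on Appendix B overall counts:
# $50K+ households among completers before vs. after incentives,
# with missing income responses excluded from the denominators.
from statsmodels.stats.proportion import proportions_ztest

n_no_incentive = 316 - 27    # completers before incentives, minus missing income
n_incentive = 593 - 62       # completers after incentives, minus missing income
count_50k_plus = [188, 182]  # $50K+ households in each condition (65.1% vs. 34.3%)

z, p = proportions_ztest(count=count_50k_plus,
                         nobs=[n_no_incentive, n_incentive])
print(f"z = {z:.2f}, p = {p:.4g}")  # the difference is significant at p < .05
```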

NORC also compared the survey respondents to data from the ACS for the ZIP codes of the schools and ECEs from which the sample was recruited. The ACS data are not an exact benchmark for the survey data, as the LAUNCH intervention (and the parent survey) targeted a subset of the population in these areas, namely families with young children. However, the ACS data provided a useful gauge of the response bias in the sample prior to offering incentives and of the impact once they were offered. Appendix B includes the frequencies and averages for key demographic variables for each of the nine communities and overall (the average of the nine sites).

Household Income. Respondents who completed the survey before the team was approved to offer incentives were more likely to come from higher-income households, whereas with incentives more lower-income parents responded to the survey. We found a significant decrease in the proportion of parents in households with incomes above $50,000 once the survey included an incentive. The final distribution of respondents approximated the distribution in the ACS.

[Figure: Household income of survey respondents, by incentive condition, compared with the ACS benchmark]

Parents’ Education. Before we offered incentives, survey respondents were significantly more likely than the geographic population to have a college degree. Incentives brought in more respondents with lower levels of education. However, incentives did not appear to affect the recruitment of sites, volunteers, or respondents among adults with less than a high school degree. Even with incentives, the two lowest-education groups remain under-represented at the end of data collection.

[Figure: Parents’ education among survey respondents, by incentive condition, compared with the ACS benchmark]

Children’s Race and Ethnicity. Prior to offering incentives, parents responding to the survey were significantly more likely to have a White child, whereas providing incentives generated more completed surveys from parents of African-American children. This is particularly visible in the “converted by incentive” group, suggesting that this subset of volunteers was particularly unlikely to proceed to survey completion absent an incentive. The overall average of children who were Hispanic across all nine states did not significantly vary with incentives, possibly due to the small number of Hispanic children in the target communities overall (9.7% in ACS, Appendix B). However, in Colorado, where 36% of the community is Hispanic, incentives produced a larger increase in Hispanic completers (27.6% to 39.7%, Appendix B).

Readers should note that ACS data are not available at the ZIP-code level for the race/ethnicity of children under age 6, so the data reflect comparisons to the race/ethnicity of the general population in these communities. The over-representation of “Other” race children in the survey compared to the ACS would likely be smaller were it possible to directly compare focal children to the under-6 population.

[Figure: Children’s race and ethnicity among survey respondents, by incentive condition, compared with the ACS benchmark]

Missing Values. While there were too few missing demographic responses to test for statistical significance, it should be noted that respondents who received incentives were slightly more likely to skip demographic questions, as shown in Table 3.

Table 3: Missing values for demographic items

| | No incentive | Incentive |
|---|---|---|
| Education | 0.9% | 1.9% |
| Income | 8.5% | 10.5% |
| Ethnicity | 2.5% | 3.2% |
| Race | 2.5% | 4.7% |


Limitations and Directions for Future Research

The goal of the Project LAUNCH impact study was to assess the impact of grantee activities on families and children in the communities that grantees serve. To accomplish the evaluation’s goals within the study’s budget and timeline, the study was designed to recruit “control” survey respondents through schools and ECE centers. Within ECEs and schools, the team relied on a convenience sample of parents who volunteered their contact information. Finally, they secured completed surveys from these volunteers. Each step in this multi-stage design was an opportunity for survey respondents to diverge from our desired inferential population: all families with young children in the geographic community.
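To see why each stage matters, consider the toy simulation below, in which a lower-education group participates somewhat less at every stage. All participation rates are hypothetical; the point is only that modest stage-level gaps can compound into a much larger gap in the final sample.

```python
# Toy simulation (all rates hypothetical): small participation gaps at each
# stage of a multi-stage design compound into large bias among completers.
import random

random.seed(1)
POP_LOW_ED = 0.45  # assumed share of community parents with lower education
# Assumed participation probabilities, by stage and education group:
P_SITE = {"low": 0.40, "high": 0.55}       # child's site agrees to participate
P_VOLUNTEER = {"low": 0.25, "high": 0.40}  # parent volunteers contact info
P_COMPLETE = {"low": 0.35, "high": 0.55}   # volunteer completes the survey

completers = []
for _ in range(100_000):
    group = "low" if random.random() < POP_LOW_ED else "high"
    if (random.random() < P_SITE[group]
            and random.random() < P_VOLUNTEER[group]
            and random.random() < P_COMPLETE[group]):
        completers.append(group)

share_low = completers.count("low") / len(completers)
print(f"low-education share: population 45.0%, completers {share_low:.1%}")
# With these assumed rates the completer sample is roughly 19% low-education,
# even though no single stage has a dramatic participation gap.
```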

While our analysis documents that monetary incentives improved the community-level demographic representativeness of survey respondents, we are unable to precisely identify whether the improvement is due to changes in the demographic composition of willing ECEs, of parent volunteers within an ECE/school, or of respondents among volunteers. Evidence from the 108 non-respondents directly converted to participation by the incentive suggests that at least some of the bias reduction came from inducing survey responses from a different subset of volunteers within participating ECEs/schools.

The LAUNCH evaluation design is not unique. For reasons of efficiency, program evaluations often use clustered sampling rather than a simple random sample to assess “community impact.” Evaluation research also commonly relies on sites’ voluntary participation, particularly in “comparison” communities without a programmatic connection to the intervention being studied. In future studies, it would be useful to formally examine the effect of incentives on recruitment of both sites and individual volunteers to determine how financial incentives affect potential bias from each of these sources.

Our experience makes it clear that incentives affected the willingness of sites and parents to participate. In the first two months after incentives were approved for our study, sites recruited roughly double the number of parent volunteers (71 per week after incentives vs. 35 before), and we received roughly 2.5 times the number of completed surveys (42 per week vs. 16). However, we do not have the necessary center-level data to directly examine the impact of incentives on the representativeness of participating ECE centers/schools or of the initial volunteers within them.


Finally, the experience with the LAUNCH evaluation indicates that special efforts, beyond individual-level incentives, may be necessary to secure survey participation from parents with the lowest levels of education. In studies where this is a population of particular interest, survey design and site recruitment should pay special attention to—and make efforts to address—barriers to participation for lowest-educated parents.



APPENDICES

APPENDIX A: MEMO TO OMB REQUESTING INCENTIVES


DATE: September 27, 2017


TO: Josh Brammer

Office of Information and Regulatory Affairs; Office of Management and Budget


FROM: Laura Hoard

Office of Planning, Research, and Evaluation; Administration for Children and Families


SUBJECT: Change request to include incentives for respondents completing the Parent Survey for the Project LAUNCH Multi-Site Evaluation (0970-0373)


The Project LAUNCH Multi-Site Evaluation (MSE) is seeking approval to provide incentives to respondents of the Parent Survey to address the response bias observed among current respondents. The goal of the MSE is to evaluate the impact of Project LAUNCH interventions on families in the communities where the interventions took place. These communities are primarily lower-income and less-educated and, in some states, include a higher percentage of racial and ethnic minorities than the general population. The MSE consists of two parts, referred to as Part A and Part B. Part B of the MSE includes data collection with parents (the Parent Survey) in 10 communities served by LAUNCH interventions and 10 demographically similar matched comparison communities.


Recruitment of early childhood education centers (ECEs) and parents began in January 2017. We are recruiting parents from schools and ECEs in ZIP codes served by the LAUNCH program and the comparison ZIP codes. These ECEs were selected randomly from a list of licensed ECEs that serve more than 20 children; the list includes ECEs that accept childcare subsidies as well as Head Start/Early Head Start programs. Once a school or ECE agrees to participate, its staff publicize the survey and recruit parents to volunteer to complete it.


To date, we have gained the cooperation of 100 schools/ECEs and have 664 parent volunteers, as shown in Table 1 below. However, the completed surveys show that the respondents are not representative of the communities where they live.



Table 1: Current Status of Project LAUNCH Recruitment

| | # Communities with 1+ Participating Institution* | Total # of Institutions Recruited | # Institutions that have provided Parent Contact Info | # Parent Surveys Sent | # Parent Surveys Completed |
|---|---|---|---|---|---|
| LAUNCH | 10 | 55 | 31 | 382 | 149 |
| Comparison | 10 | 45 | 25 | 282 | 118 |
| Total | 20 | 100 | 56 | 664 | 267 |

* Institution = school or early childhood education program (ECE)


We analyzed the demographic data from the first 224 completed Parent Surveys5 by comparing the race/ethnicity, education level, employment status, and income range that parent respondents reported to the averages for these variables in American Community Survey (ACS) data for these communities. The Parent Survey respondents to date are more likely to be White, college-educated, employed full-time, and from the highest-income category than the parents in the communities at large (see Table 2 below). This response bias poses significant risks to the validity of the survey results.


Table 2: Comparison of Demographics of Respondents to Project LAUNCH Parent Survey and Average ACS Results for the Communities Sampled

| | Parent Survey Responses as of 9/6/2017 | Average of ACS for ZIP codes from which schools/ECEs were sampled |
|---|---|---|
| Race/Ethnicity | | |
| Non-Hispanic | 93.4% | 90.2% |
| Hispanic | 6.6% | 9.9% |
| Black | 31.1% | 52.8% |
| White | 57.7% | 40.1% |
| Other6 | 8.9% | 7.1% |
| Missing | 3.6% | n/a |
| Highest Level of Education | | |
| <HS | 1.2% | 13.5% |
| HS or GED | 10.6% | 33.3% |
| Some college | 16.7% | 21.4% |
| 2-yr college | 13.0% | 7.4% |
| 4-yr college or higher | 57.7% | 24.5% |
| Missing7 | 0.8% | n/a |
| Employment8 | | |
| Full-time | 79.7% | 52.2% |
| Part-time | 12.2% | 26.2% |
| Not employed | 6.1% | 21.5% |
| Missing | 0.4% | n/a |
| Income Category | | |
| <$10K | 6.9% | 14.4% |
| $10K-$24K | 8.1% | 38.2% |
| $25K-$49K | 15.9% | 27.1% |
| $50K+ | 60.6% | 34.7% |
| Missing | 8.5% | n/a |

The response bias reflected in the group of parents who have responded to the survey is a significant challenge because capturing representative perspectives is key to our ability to address the research questions. Based on this response bias, we seek approval to provide a $25 gift card to parents who complete the 30-minute survey.


To ensure that we enroll ECEs that serve parents who are reflective of their communities, we established a rigorous recruitment protocol. We make multiple contact attempts, providing background information about the study and answering questions from ECE staff. When possible, the LAUNCH grantees have also reached out to the ECEs in their area to reinforce the importance of the study. Recruiters have noted that some ECEs serving low-income populations have declined to participate, indicating that their parents would not complete a survey without an incentive.


Once an ECE does agree to participate, we rely on the ECE coordinator to serve as a liaison between the research team and parents. Our recruiters strategize with the ECE coordinator on how to recruit parent volunteers most effectively to ensure that we obtain a representative mix of parents. These efforts have included asking ECE staff to distribute flyers more than once, making sure the staff understand the importance of the survey so they can communicate it to parents, and having them promote the survey at open houses for parents. Several ECE coordinators have described to our recruiters the challenges they face in trying to get parents to complete other forms, such as applications for subsidies, and have expressed concerns that parents will not volunteer to complete a survey that they do not perceive as bringing them any immediate benefit. For parents who do volunteer and provide contact information, the survey contractor follows up multiple times, using best practices for survey data collection, including sending reminder emails on different days of the week and times of day and varying the email subject line. Among those who do volunteer, the survey completion rate (45%) is close to what we expected (50%).


Previous research indicates that the inclusion of incentives will improve the representativeness of the survey sample. Several studies demonstrate the effectiveness of incentives in hard-to-reach populations similar to those missing from the Project LAUNCH MSE sample, namely lower-income, less-educated, and minority populations. Beebe et al. (2005) showed that an incentive increased response rates across the board in a survey of Medicaid recipients, and specifically among minority populations in the sample. Other studies have shown that incentives increase participation of respondents typically under-represented in surveys, such as those with low education levels (Singer, Van Hoewyk, and Maher, 2000), racial/ethnic minorities, and low-income households (Mack et al., 1998). For example, Mack et al. (1998) found that, while a $10 incentive had little effect on response rates, offering a $20 incentive (in 1996 dollars, not adjusted for inflation) boosted response rates overall, and particularly among low-income individuals and African Americans. Martinez-Ebers (1997) found that incentives significantly increased the proportion of Hispanic respondents at follow-up. In another study, research in the Wisconsin Pregnancy Risk Assessment Monitoring System (PRAMS) found that, compared to a coupon or no incentive, a small cash incentive significantly improved response rates among African Americans (Dykema et al., 2012).


It is imperative that we collect data from a population that is representative of the LAUNCH and comparison communities in order to evaluate the impact of the program. Currently, the sample that has completed the Parent Survey is not representative of the communities. Previous research provides evidence that offering incentives will increase our chances of completing the study with a sample that is representative of the communities served by Project LAUNCH. Recruitment of parents will end in January, and we will continue to prompt the parents to complete the surveys through April. We strongly feel that the best strategy for ensuring that we obtain survey responses from a representative set of parents in that timeframe is to offer a modest monetary incentive.


We understand that OMB requires justification for the use of incentives. We look forward to reporting back on the results of using incentives in Project LAUNCH, including an evaluation of the demographics of parents who responded prior to incentives being offered as compared to those who responded with an incentive, in order to inform future research.





Beebe, T. J., Davern, M. E., McAlpine, D. D., Call, K. T., & Rockwood, T. H. (2005). Increasing response rates in a survey of Medicaid enrollees: The effect of a prepaid monetary incentive and mixed modes (mail and telephone). Medical Care, 43(4), 411-414.

Dykema, J., Stevenson, J., Kniss, C., Kvale, K., González, K., & Cautley, E. (2012). Use of monetary and nonmonetary incentives to increase response rates among African Americans in the Wisconsin Pregnancy Risk Assessment Monitoring System. Maternal and Child Health Journal, 16(4), 785-791. doi:10.1007/s10995-011-0780-2

Mack, S., Huggins, V., Keathley, D., & Sundukchi, M. (1998). Do monetary incentives improve response rates in the Survey of Income and Program Participation? U.S. Bureau of the Census, Demographic Statistical Methods Division, Washington, DC. Available at: http://www.amstat.org/sections/srms/Proceedings/papers/1998_089.pdf

Martinez-Ebers, V. (1997). Using monetary incentives with hard-to-reach populations in panel surveys. International Journal of Public Opinion Research, 9(1), 77-86.

Singer, E., Van Hoewyk, J., & Maher, M. P. (2000). Experiments with incentives in telephone surveys. Public Opinion Quarterly, 64(2), 171-188.










Appendix B: Demographics of Participating Communities and Completed Parent Surveys by Incentive Condition



Each table below reports, for one community: total completed surveys; parent’s education (<HS, HS or equivalent, Some college, 2-yr college, 4-yr college or higher, Missing); household income (<$10K, $10K-$24K, $25K-$49K, $50K+, Missing); child’s ethnicity (Non-Hispanic, Hispanic, Missing); and child’s race (White, Black, Other, Missing). “Miss” = missing.

OVERALL - AVERAGE

| | Total | <HS | HS or equiv. | Some coll. | 2-yr coll. | 4-yr coll.+ | Miss | <$10K | $10K-$24K | $25K-$49K | $50K+ | Miss | Non-Hisp. | Hispanic | Miss | White | Black | Other | Miss |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Community average (ACS) | | 14.8% | 32.5% | 21.8% | 7.3% | 23.6% | | 9.7% | 18.5% | 25.9% | 46.0% | | 90.3% | 9.7% | | 47.1% | 44.3% | 8.6% | |
| No incentives (N) | 316 | 3 | 36 | 54 | 36 | 184 | 3 | 22 | 33 | 46 | 188 | 27 | 286 | 22 | 8 | 174 | 94 | 40 | 8 |
| No incentives (%) | | 1.0% | 11.5% | 17.3% | 11.5% | 58.8% | | 7.6% | 11.4% | 15.9% | 65.1% | | 92.9% | 7.1% | | 56.5% | 30.5% | 13.0% | |
| Incentives (N) | 593 | 11 | 129 | 161 | 81 | 200 | 11 | 82 | 115 | 152 | 182 | 62 | 519 | 55 | 19 | 219 | 277 | 69 | 28 |
| Incentives (%) | | 1.9% | 22.2% | 27.7% | 13.9% | 34.4% | | 15.4% | 21.7% | 28.6% | 34.3% | | 90.4% | 9.6% | | 38.8% | 49.0% | 12.2% | |
| Total (N) | 909 | 14 | 165 | 215 | 117 | 384 | 14 | 104 | 148 | 198 | 370 | 89 | 805 | 77 | 27 | 393 | 371 | 109 | 36 |
| Total (%) | | 1.6% | 18.4% | 24.0% | 13.1% | 42.9% | | 12.7% | 18.0% | 24.1% | 45.1% | | 91.3% | 8.7% | | 45.0% | 42.5% | 12.5% | |

ALABAMA

| | Total | <HS | HS or equiv. | Some coll. | 2-yr coll. | 4-yr coll.+ | Miss | <$10K | $10K-$24K | $25K-$49K | $50K+ | Miss | Non-Hisp. | Hispanic | Miss | White | Black | Other | Miss |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Community average (ACS) | | 15.1% | 31.7% | 22.1% | 7.0% | 24.0% | | 11.6% | 21.5% | 26.2% | 40.7% | | 96.5% | 3.5% | | 57.4% | 39.2% | 3.4% | |
| No incentives (N) | 57 | 1 | 2 | 9 | 8 | 37 | 0 | 4 | 5 | 10 | 32 | 6 | 55 | 1 | 1 | 31 | 22 | 3 | 1 |
| No incentives (%) | | 1.8% | 3.5% | 15.8% | 14.0% | 64.9% | | 7.8% | 9.8% | 19.6% | 62.7% | | 98.2% | 1.8% | | 55.4% | 39.3% | 5.4% | |
| Incentives (N) | 58 | 2 | 11 | 12 | 6 | 27 | 0 | 7 | 9 | 7 | 32 | 3 | 55 | 1 | 2 | 28 | 16 | 12 | 2 |
| Incentives (%) | | 3.4% | 19.0% | 20.7% | 10.3% | 46.6% | | 12.7% | 16.4% | 12.7% | 58.2% | | 98.2% | 1.8% | | 50.0% | 28.6% | 21.4% | |
| Total (N) | 115 | 3 | 13 | 21 | 14 | 64 | 0 | 11 | 14 | 17 | 64 | 9 | 110 | 2 | 3 | 59 | 38 | 15 | 3 |
| Total (%) | | 2.6% | 11.3% | 18.3% | 12.2% | 55.7% | | 10.4% | 13.2% | 16.0% | 60.4% | | 98.2% | 1.8% | | 52.7% | 33.9% | 13.4% | |

COLORADO

| | Total | <HS | HS or equiv. | Some coll. | 2-yr coll. | 4-yr coll.+ | Miss | <$10K | $10K-$24K | $25K-$49K | $50K+ | Miss | Non-Hisp. | Hispanic | Miss | White | Black | Other | Miss |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Community average (ACS) | | 14.8% | 27.8% | 24.6% | 10.1% | 22.6% | | 5.6% | 14.9% | 26.8% | 52.8% | | 63.8% | 36.2% | | 73.8% | 6.3% | 19.9% | |
| No incentives (N) | 29 | 0 | 4 | 4 | 5 | 15 | 1 | 3 | 1 | 3 | 20 | 2 | 21 | 8 | 0 | 22 | 3 | 4 | 0 |
| No incentives (%) | | 0.0% | 14.3% | 14.3% | 17.9% | 53.6% | | 11.1% | 3.7% | 11.1% | 74.1% | | 72.4% | 27.6% | | 75.9% | 10.3% | 13.8% | |
| Incentives (N) | 78 | 2 | 17 | 32 | 15 | 11 | 1 | 14 | 16 | 33 | 9 | 6 | 44 | 29 | 5 | 39 | 16 | 12 | 11 |
| Incentives (%) | | 2.6% | 22.1% | 41.6% | 19.5% | 14.3% | | 19.4% | 22.2% | 45.8% | 12.5% | | 60.3% | 39.7% | | 58.2% | 23.9% | 17.9% | |
| Total (N) | 107 | 2 | 21 | 36 | 20 | 26 | 2 | 17 | 17 | 36 | 29 | 8 | 65 | 37 | 5 | 61 | 19 | 16 | 11 |
| Total (%) | | 1.9% | 20.0% | 34.3% | 19.0% | 24.8% | | 17.2% | 17.2% | 36.4% | 29.3% | | 63.7% | 36.3% | | 63.5% | 19.8% | 16.7% | |

DELAWARE

| | Total | <HS | HS or equiv. | Some coll. | 2-yr coll. | 4-yr coll.+ | Miss | <$10K | $10K-$24K | $25K-$49K | $50K+ | Miss | Non-Hisp. | Hispanic | Miss | White | Black | Other | Miss |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Community average (ACS) | | 15.9% | 33.9% | 21.1% | 6.5% | 22.6% | | 9.2% | 17.2% | 25.3% | 48.3% | | 88.4% | 11.6% | | 51.6% | 38.8% | 9.6% | |
| No incentives (N) | 37 | 0 | 6 | 12 | 4 | 13 | 2 | 4 | 6 | 5 | 19 | 3 | 30 | 6 | 1 | 16 | 10 | 7 | 4 |
| No incentives (%) | | 0.0% | 17.1% | 34.3% | 11.4% | 37.1% | | 11.8% | 17.6% | 14.7% | 55.9% | | 83.3% | 16.7% | | 48.5% | 30.3% | 21.2% | |
| Incentives (N) | 58 | 0 | 24 | 20 | 5 | 7 | 2 | 8 | 19 | 20 | 6 | 5 | 48 | 9 | 1 | 6 | 45 | 6 | 1 |
| Incentives (%) | | 0.0% | 42.9% | 35.7% | 8.9% | 12.5% | | 15.1% | 35.8% | 37.7% | 11.3% | | 84.2% | 15.8% | | 10.5% | 78.9% | 10.5% | |
| Total (N) | 95 | 0 | 30 | 32 | 9 | 20 | 4 | 12 | 25 | 25 | 25 | 8 | 78 | 15 | 2 | 22 | 55 | 13 | 5 |
| Total (%) | | 0.0% | 33.0% | 35.2% | 9.9% | 22.0% | | 13.8% | 28.7% | 28.7% | 28.7% | | 83.9% | 16.1% | | 24.4% | 61.1% | 14.4% | |

GEORGIA

| | Total | <HS | HS or equiv. | Some coll. | 2-yr coll. | 4-yr coll.+ | Miss | <$10K | $10K-$24K | $25K-$49K | $50K+ | Miss | Non-Hisp. | Hispanic | Miss | White | Black | Other | Miss |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Community average (ACS) | | 19.3% | 30.9% | 23.7% | 6.6% | 19.5% | | 12.2% | 23.5% | 29.1% | 35.1% | | 96.6% | 3.4% | | 13.0% | 77.9% | 9.1% | |
| No incentives (N) | 20 | 1 | 4 | 2 | 2 | 11 | 0 | 4 | 2 | 2 | 10 | 2 | 16 | 3 | 1 | 7 | 11 | 2 | 0 |
| No incentives (%) | | 5.0% | 20.0% | 10.0% | 10.0% | 55.0% | | 22.2% | 11.1% | 11.1% | 55.6% | | 84.2% | 15.8% | | 35.0% | 55.0% | 10.0% | |
| Incentives (N) | 84 | 2 | 24 | 24 | 11 | 19 | 4 | 22 | 19 | 19 | 13 | 11 | 79 | 2 | 3 | 9 | 64 | 7 | 4 |
| Incentives (%) | | 2.5% | 30.0% | 30.0% | 13.8% | 23.8% | | 30.1% | 26.0% | 26.0% | 17.8% | | 97.5% | 2.5% | | 11.3% | 80.0% | 8.8% | |
| Total (N) | 104 | 3 | 28 | 26 | 13 | 30 | 4 | 26 | 21 | 21 | 23 | 13 | 95 | 5 | 4 | 16 | 75 | 9 | 4 |
| Total (%) | | 3.0% | 28.0% | 26.0% | 13.0% | 30.0% | | 28.6% | 23.1% | 23.1% | 25.3% | | 95.0% | 5.0% | | 16.0% | 75.0% | 9.0% | |

INDIANA

| | Total | <HS | HS or equiv. | Some coll. | 2-yr coll. | 4-yr coll.+ | Miss | <$10K | $10K-$24K | $25K-$49K | $50K+ | Miss | Non-Hisp. | Hispanic | Miss | White | Black | Other | Miss |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Community average (ACS) | | 9.8% | 36.1% | 19.5% | 8.6% | 26.1% | | 6.3% | 14.7% | 27.4% | 51.5% | | 96.6% | 3.4% | | 90.9% | 4.6% | 4.5% | |
| No incentives (N) | 41 | 0 | 3 | 8 | 5 | 25 | 0 | 0 | 4 | 9 | 26 | 2 | 38 | 2 | 1 | 32 | 0 | 9 | 0 |
| No incentives (%) | | 0.0% | 7.3% | 19.5% | 12.2% | 61.0% | | 0.0% | 10.3% | 23.1% | 66.7% | | 95.0% | 5.0% | | 78.0% | 0.0% | 22.0% | |
| Incentives (N) | 54 | 0 | 7 | 10 | 8 | 29 | 0 | 1 | 10 | 9 | 28 | 6 | 52 | 2 | 0 | 47 | 0 | 7 | 0 |
| Incentives (%) | | 0.0% | 13.0% | 18.5% | 14.8% | 53.7% | | 2.1% | 20.8% | 18.8% | 58.3% | | 96.3% | 3.7% | | 87.0% | 0.0% | 13.0% | |
| Total (N) | 95 | 0 | 10 | 18 | 13 | 54 | 0 | 1 | 14 | 18 | 54 | 8 | 90 | 4 | 1 | 79 | 0 | 16 | 0 |
| Total (%) | | 0.0% | 10.5% | 18.9% | 13.7% | 56.8% | | 1.1% | 16.1% | 20.7% | 62.1% | | 95.7% | 4.3% | | 83.2% | 0.0% | 16.8% | |

MARYLAND

| | Total | <HS | HS or equiv. | Some coll. | 2-yr coll. | 4-yr coll.+ | Miss | <$10K | $10K-$24K | $25K-$49K | $50K+ | Miss | Non-Hisp. | Hispanic | Miss | White | Black | Other | Miss |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Community average (ACS) | | 14.4% | 35.1% | 24.8% | 5.7% | 20.0% | | 8.0% | 13.2% | 23.3% | 55.6% | | 92.1% | 7.9% | | 12.9% | 79.7% | 7.5% | |
| No incentives (N) | 22 | 0 | 4 | 4 | 1 | 13 | 0 | 0 | 2 | 5 | 14 | 1 | 20 | 0 | 2 | 0 | 19 | 2 | 1 |
| No incentives (%) | | 0.0% | 18.2% | 18.2% | 4.5% | 59.1% | | 0.0% | 9.5% | 23.8% | 66.7% | | 100.0% | 0.0% | | 0.0% | 90.5% | 9.5% | |
| Incentives (N) | 44 | 0 | 8 | 14 | 4 | 18 | 0 | 5 | 6 | 16 | 13 | 4 | 42 | 0 | 2 | 0 | 40 | 1 | 3 |
| Incentives (%) | | 0.0% | 18.2% | 31.8% | 9.1% | 40.9% | | 12.5% | 15.0% | 40.0% | 32.5% | | 100.0% | 0.0% | | 0.0% | 97.6% | 2.4% | |
| Total (N) | 66 | 0 | 12 | 18 | 5 | 31 | 0 | 5 | 8 | 21 | 27 | 5 | 62 | 0 | 4 | 0 | 59 | 3 | 4 |
| Total (%) | | 0.0% | 18.2% | 27.3% | 7.6% | 47.0% | | 8.2% | 13.1% | 34.4% | 44.3% | | 100.0% | 0.0% | | 0.0% | 95.2% | 4.8% | |

NEW HAMPSHIRE

| | Total | <HS | HS or equiv. | Some coll. | 2-yr coll. | 4-yr coll.+ | Miss | <$10K | $10K-$24K | $25K-$49K | $50K+ | Miss | Non-Hisp. | Hispanic | Miss | White | Black | Other | Miss |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Community average (ACS) | | 11.3% | 32.0% | 19.8% | 9.8% | 27.1% | | 5.2% | 15.2% | 23.9% | 55.8% | | 93.8% | 6.2% | | 89.1% | 3.7% | 7.2% | |
| No incentives (N) | 37 | 0 | 3 | 6 | 5 | 23 | 0 | 0 | 2 | 7 | 28 | 0 | 34 | 2 | 1 | 34 | 0 | 3 | 0 |
| No incentives (%) | | 0.0% | 8.1% | 16.2% | 13.5% | 62.2% | | 0.0% | 5.4% | 18.9% | 75.7% | | 94.4% | 5.6% | | 91.9% | 0.0% | 8.1% | |
| Incentives (N) | 49 | 0 | 6 | 12 | 6 | 24 | 1 | 2 | 5 | 10 | 24 | 8 | 44 | 3 | 2 | 44 | 0 | 3 | 2 |
| Incentives (%) | | 0.0% | 12.5% | 25.0% | 12.5% | 50.0% | | 4.9% | 12.2% | 24.4% | 58.5% | | 93.6% | 6.4% | | 93.6% | 0.0% | 6.4% | |
| Total (N) | 86 | 0 | 9 | 18 | 11 | 47 | 1 | 2 | 7 | 17 | 52 | 8 | 78 | 5 | 3 | 78 | 0 | 6 | 2 |
| Total (%) | | 0.0% | 10.6% | 21.2% | 12.9% | 55.3% | | 2.6% | 9.0% | 21.8% | 66.7% | | 94.0% | 6.0% | | 92.9% | 0.0% | 7.1% | |

PENNSYLVANIA

| | Total | <HS | HS or equiv. | Some coll. | 2-yr coll. | 4-yr coll.+ | Miss | <$10K | $10K-$24K | $25K-$49K | $50K+ | Miss | Non-Hisp. | Hispanic | Miss | White | Black | Other | Miss |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Community average (ACS) | | 11.2% | 30.0% | 18.0% | 6.7% | 34.0% | | 13.2% | 20.1% | 23.8% | 42.9% | | 97.1% | 2.9% | | 41.8% | 52.1% | 6.1% | |
| No incentives (N) | 37 | 0 | 2 | 1 | 3 | 31 | 0 | 1 | 3 | 2 | 27 | 4 | 37 | 0 | 0 | 25 | 6 | 6 | 0 |
| No incentives (%) | | 0.0% | 5.4% | 2.7% | 8.1% | 83.8% | | 3.0% | 9.1% | 6.1% | 81.8% | | 100.0% | 0.0% | | 67.6% | 16.2% | 16.2% | |
| Incentives (N) | 103 | 2 | 15 | 26 | 19 | 39 | 2 | 12 | 23 | 22 | 34 | 12 | 92 | 8 | 3 | 15 | 64 | 19 | 5 |
| Incentives (%) | | 2.0% | 14.9% | 25.7% | 18.8% | 38.6% | | 13.2% | 25.3% | 24.2% | 37.4% | | 92.0% | 8.0% | | 15.3% | 65.3% | 19.4% | |
| Total (N) | 140 | 2 | 17 | 27 | 22 | 70 | 2 | 13 | 26 | 24 | 61 | 16 | 129 | 8 | 3 | 40 | 70 | 25 | 5 |
| Total (%) | | 1.4% | 12.3% | 19.6% | 15.9% | 50.7% | | 10.5% | 21.0% | 19.4% | 49.2% | | 94.2% | 5.8% | | 29.6% | 51.9% | 18.5% | |

TENNESSEE

| | Total | <HS | HS or equiv. | Some coll. | 2-yr coll. | 4-yr coll.+ | Miss | <$10K | $10K-$24K | $25K-$49K | $50K+ | Miss | Non-Hisp. | Hispanic | Miss | White | Black | Other | Miss |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Community average (ACS) | | 21.0% | 36.3% | 22.2% | 5.5% | 14.9% | | 13.9% | 26.5% | 29.7% | 29.9% | | 90.9% | 9.1% | | 20.5% | 71.0% | 8.6% | |
| No incentives (N) | 36 | 1 | 8 | 8 | 3 | 16 | 0 | 6 | 8 | 3 | 12 | 7 | 35 | 0 | 1 | 7 | 23 | 4 | 2 |
| No incentives (%) | | 2.8% | 22.2% | 22.2% | 8.3% | 44.4% | | 20.7% | 27.6% | 10.3% | 41.4% | | 100.0% | 0.0% | | 20.6% | 67.6% | 11.8% | |
| Incentives (N) | 65 | 3 | 17 | 11 | 7 | 26 | 1 | 11 | 8 | 16 | 23 | 7 | 63 | 1 | 1 | 31 | 32 | 2 | 0 |
| Incentives (%) | | 4.7% | 26.6% | 17.2% | 10.9% | 40.6% | | 19.0% | 13.8% | 27.6% | 39.7% | | 98.4% | 1.6% | | 47.7% | 49.2% | 3.1% | |
| Total (N) | 101 | 4 | 25 | 19 | 10 | 42 | 1 | 17 | 16 | 19 | 35 | 14 | 98 | 1 | 2 | 38 | 55 | 6 | 2 |
| Total (%) | | 4.0% | 25.0% | 19.0% | 10.0% | 42.0% | | 19.5% | 18.4% | 21.8% | 40.2% | | 99.0% | 1.0% | | 38.4% | 55.6% | 6.1% | |

Percentages do not include missing values in the denominator.

Bold indicates statistically significant difference between no-incentive and incentive conditions (p<.05).


1 The observed lack of effect of individual incentives for lower-education respondents may be because these parents were under-represented in participating ECEs, among volunteers generated by participating sites, or in the conversion of volunteers to respondents. Site-level demographic information about families served by participating ECEs was not available to the research team. This important point is discussed in the Data Collection and Limitations sections below.

3 The Parent Survey portion of the MSE was limited to State grantees, and did not include Tribes or Territories.

4 Data from the Montana communities were excluded from these analyses as the 2011-2015 five-year ZCTA level data are not available for the measures of interest in these communities.

5 The 224 responses are from 18 communities: Alabama, Colorado, Delaware, Georgia, Indiana, Maryland, New Hampshire and Tennessee. Responses from Montana (n=22) were excluded from this analysis as ACS data are not available at the same geographic level to make a direct comparison.

6 The ‘Other’ category includes: two or more races, American Indian or Alaska Native, Asian, Indian, Chinese, Filipino, Japanese, Korean, Vietnamese, Other Asian, Native Hawaiian, Guamanian, Chamorro, and Samoan.

7 There are no “missing” data in the ACS as missing responses are imputed.

8 An additional four respondents were excluded as they were retired or disabled.



