Mother and Infant Home Visiting Program Evaluation (MIHOPE): Kindergarten Follow-Up (MIHOPE-K)

Nonsub Change Memo: Early Bird Incentive & Updated Respondent Materials

OMB: 0970-0402

To: Steph Tatham, Office of Information and Regulatory Affairs (OIRA); Office of Management and Budget (OMB)


From: Nancy Geyelin Margie, Office of Planning, Research and Evaluation (OPRE); Administration for Children and Families (ACF)


Date: June 5, 2017


Subject: Experiment results and non-substantive change request for MIHOPE family follow-up incentive structure (Information Collection 0970-0402)



In order to improve response rates, we are requesting non-substantive changes to the follow-up data collection efforts for the legislatively mandated Mother and Infant Home Visiting Program Evaluation (MIHOPE).1 Specifically, we are requesting the ability to use an “early bird” incentive. In addition, we are requesting minor changes to wording of reminder emails and texts.


First, the request to use an early bird incentive is based on results from an experiment on the level and timing of incentives, approved as part of our information collection request for the MIHOPE follow-up data collection (OMB Control No: 0970-0402, approved 08/06/2015). We conducted this experiment to inform future data collection efforts for MIHOPE and for other studies that work with similar populations and have similar expectations of participants. Results of the experiment are described more fully later in this memo.


Second, we are also requesting minor changes to the wording of our outreach emails and texts to participants. As part of our broader effort to raise response rates within the budgetary constraints of the project, we consulted with experts in behavioral economics. They reviewed our materials and recommended the proposed changes, including the addition of multiple versions of these materials so that participants do not receive the same content repeatedly. Based on their research and expertise, they believe that these minor changes in wording will help improve our response rates.


Email reminders have been rewritten with two goals in mind, based on these behavioral economic principles. First, the team wishes to reduce the cognitive load on respondents by cutting the amount of text overall and focusing respondents’ attention on the link and phone number to use for completing the survey, as well as on the incentive for completing it. The team’s second goal is to change the “frame” of the email reminders, emphasizing respondents’ identity as parents and how continued participation in the study can benefit others like them. In that vein, the team included a link to learn more about the study so that respondents feel more connected and valued as an integral part of the research.


Text messages have been rewritten to include the respondent’s name, based on prior behavioral economic research that suggests that personalization of the message can enhance its saliency to the reader. Moreover, we have added the link to the web survey to reduce the hassle factors for respondents who may not recall that there is a web survey option.


Incentive Experiment Results


Highlights


  • The early bird incentive (which provides higher incentive amounts for individuals who respond quickly to the survey) generated higher response rates. This held across data collection cycles, participant characteristics, and program models.


  • The early bird incentive seems to have a slightly larger effect for more vulnerable groups of participants. In particular, we found that the following subgroups had slightly higher response rates when they were offered the early bird incentive: pregnant mothers; families that had moved during the prior year; families where the father figure does not live in the household; and families where the mother is not married to the biological father of the child.


  • Though there are other ways to raise response rates aside from an early bird incentive, the study’s limited resources make this strategy appropriate at this time. We engaged in field locating for a portion of our sample. A comparison between the portion of the sample where field locating was used and the portion where it was not suggests that, when funds are available, field locating can be quite beneficial for increasing response rates. However, because the project cannot afford to continue field locating at this time and the experimental findings show a benefit from the early bird incentive, we are requesting to implement the early bird incentive for future rounds of data collection.


Experiment structure


The experiment was conducted with 1,705 study participants. We tested two commonly used incentive structures:

  1. Prepaid incentives, through which individuals receive a small payment when they are notified of the survey and a larger payment when the survey is completed. Studies indicate that prepaid incentives have the potential to generate increased response rates to surveys (Singer et al. 1999).


  2. “Early bird” incentives, through which individuals receive a larger payment if they complete the survey within a set period of time after being notified of the survey. Early bird incentives have been shown to decrease the number of days to complete a survey, which can lead to a decrease in the total survey field period and potentially result in lower costs (LeClere et al. 2012).


Under the prepay option, $5 of the incentive was provided in an advance letter and the remainder was provided after the survey was completed. So far, response rates have been similar for those who were and were not offered the prepay incentive.


Under the early bird incentive in the MIHOPE experiment, respondents received an additional $10 if they completed the survey within eight weeks. The early bird incentive was designed to be available for the first eight weeks of the data collection period because, after that point, field locating was planned to begin for cases that had not already responded to the survey. We were particularly interested in maximizing response rates before beginning field locating, since data collection efforts become much more difficult and costly at that point in the process. Due to cost constraints, which we will explain in detail later in this memo, only a portion of the sample was able to receive field locating. However, the early bird incentive continued to be offered for the first eight weeks after sample release for all study participants.


As shown in Table 1, the experiment divided individuals at random into four groups: (1) a control group that received neither an early bird incentive nor a prepayment, (2) a group that received a prepayment, (3) a group that received an early bird incentive, and (4) a group that received both a prepayment and an early bird incentive.


Table 1: Experiment Conditions2

|                           | Prepaid incentive: No                                              | Prepaid incentive: Yes                                                                                                             |
| Early bird incentive: No  | Treatment 1: $15 after completing the survey                       | Treatment 2: $5 with advance letter, $10 after completing the survey                                                               |
| Early bird incentive: Yes | Treatment 3: $25 if survey completed within 8 weeks, $15 otherwise | Treatment 4: $5 with advance letter, remainder ($20 if survey completed within 8 weeks, $10 otherwise) after completing the survey |

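To make the payment schedule in Table 1 concrete, the sketch below expresses the four conditions as a simple function. It is illustrative only (the function and parameter names are ours, not part of the study’s survey systems), but it encodes the dollar amounts and the eight-week early bird window exactly as described above.

```python
from typing import Optional, Tuple

EARLY_BIRD_WINDOW_WEEKS = 8  # early bird bonus available for the first eight weeks after sample release


def incentive_payments(prepaid: bool, early_bird: bool,
                       weeks_to_complete: Optional[float]) -> Tuple[int, int]:
    """Return (advance payment, completion payment) in dollars for one respondent.

    weeks_to_complete is None if the survey was never completed.
    Illustrative sketch of the Table 1 schedule, not project code.
    """
    advance = 5 if prepaid else 0            # $5 sent with the advance letter under the prepay conditions
    if weeks_to_complete is None:
        return advance, 0                    # no completion payment if the survey is not completed
    total = 15                               # every completer receives $15 in total...
    if early_bird and weeks_to_complete <= EARLY_BIRD_WINDOW_WEEKS:
        total += 10                          # ...plus $10 if completed within the early bird window
    return advance, total - advance          # completion payment is the remainder after any prepayment


# Examples: a Treatment 1 completer receives (0, 15); a Treatment 4 completer in week 6
# receives (5, 20), and in week 10 receives (5, 10).
```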

Challenges to data collection


We began data collection as described above, attempting to reach participants through calling and other reminder methods for the first eight weeks of eligibility and then through field locating for up to another nine weeks. Data for Cycle 1 and Cycle 2 (688 participants) were collected in this way.

By the time of the third sample release (1,017 participants), the project did not have sufficient funds to continue field locating due to the high costs of field locating in Cycles 1 and 2.


Therefore, for our third round of data collection, we attempted to reach participants only through calling and other reminder methods (as mentioned above), but extended the length of time we attempted to reach participants via these methods to 12 weeks.3 For a visual depiction of data collection for experiment participants, please see Figure 1.


Figure 1: Data collection timeline for experiment participants (figure omitted)
For the remainder of this memorandum, we will present results of the experiment for the overall sample, and then also separately for (1) Cycles 1 and 2 combined (received field locating) and (2) Cycle 3 (did not receive field locating).4 The numbers presented in this memo reflect final response rates for each cycle.


Results


Prepay incentive. As stated earlier, the prepay incentive did not improve response rates, either for the full sample or either of the two subsamples (see Table 2).


Early bird incentive. Overall, 57.9% of those offered the early bird incentive responded to the survey, compared with 51.8% of other sample members. This differential response rate was found in the cycles of data collection for which field locating was possible (66.9% versus 63.7%), as well as the cycle for which field locating was not possible (51.8% versus 43.7%).


Field locating. Our experiment was not intended to measure the effect of field locating. However, given the unexpected but necessary modifications to the data collection process, a comparison of Cycles 1 and 2 with Cycle 3 provides a non-experimental comparison of response rates with and without field locating. Specifically, response rates are consistently higher for those in the group with field locating across all treatment conditions.


Table 2: Treatment groups and overall response rates

|                                               | Treat 1            | Treat 2     | Treat 1+2                  | Treat 3         | Treat 4                | Treat 3+4              | Treat 1+3              | Treat 2+4          |
|                                               | standard incentive | prepay only | did not receive early bird | early bird only | early bird plus prepay | did receive early bird | did not receive prepay | did receive prepay |
| Full experiment sample (N = 1,705)            | 52.9% | 50.6% | 51.8% | 56.6% | 59.2% | 57.9% | 54.8% | 54.9% |
| Cycles 1 and 2 (field locating used; N = 688) | 64.5% | 62.8% | 63.7% | 68.0% | 65.7% | 66.9% | 66.3% | 64.2% |
| Cycle 3 (no field locating; N = 1,017)        | 45.1% | 42.3% | 43.7% | 48.8% | 54.7% | 51.8% | 47.0% | 48.5% |



Characteristics at baseline. For the survey conducted when children are 2.5 years old, we are finding some significant differences in important baseline characteristics between respondents and nonrespondents. For example, nonrespondents are 9.4 percentage points more likely to have entered the study while they were pregnant, which we expect to be an important predictor of the effectiveness of home visiting services. Nonrespondents are 8 percentage points more likely to have moved in the year prior to entering the study, so survey responses might not accurately represent the effects for the most mobile part of the sample. Nonrespondents are 7.3 percentage points less likely to live in a household with their child’s father figure, and 7.4 percentage points less likely to be married to the biological father of their child.


Table 3: Differential response to the 2.5 year old survey: Significant differences at end of fielding period5


| Characteristics (at study entry)                 | Respondents | Nonrespondents | Difference |
| Pregnant                                         | 47.2        | 56.6           | -9.4       |
| Moved in the prior year                          | 17.1        | 25.1           | -8.0       |
| Child’s father figure does not live in household | 54.2        | 61.5           | -7.3       |
| Not married to biological father of child        | 77.9        | 85.3           | -7.4       |



Experiment results show that response rates are higher for each of these groups when they were offered the early bird incentive. This benefit is present when we examine the full sample, those that received field locating, and those that did not receive field locating (see Table 4).


Table 4: Response rates by participant characteristics (at study entry)


|                                                      | Treat 1            | Treat 2     | Treat 1+2                  | Treat 3         | Treat 4                | Treat 3+4           |
|                                                      | standard incentive | prepay only | did not receive early bird | early bird only | early bird plus prepay | received early bird |
| Pregnant                                             |      |      |      |      |      |      |
|   Full experiment sample                             | 46.0 | 45.8 | 45.9 | 57.3 | 52.6 | 54.9 |
|   Cycles 1 and 2 (field locating used)               | 56.7 | 53.9 | 55.3 | 66.1 | 61.3 | 63.7 |
|   Cycle 3 (no field locating)                        | 41.3 | 42.5 | 41.9 | 53.6 | 49.0 | 51.3 |
| Moved in the prior year                              |      |      |      |      |      |      |
|   Full experiment sample                             | 33.3 | 45.8 | 39.2 | 54.3 | 48.9 | 51.5 |
|   Cycles 1 and 2 (field locating used)               | 39.4 | 54.6 | 48.1 | 62.5 | 61.1 | 61.7 |
|   Cycle 3 (no field locating)                        | 30.0 | 35.9 | 32.3 | 50.9 | 41.1 | 46.0 |
| Child’s father figure doesn’t live in the household  |      |      |      |      |      |      |
|   Full experiment sample                             | 49.0 | 47.6 | 48.3 | 56.1 | 55.7 | 55.9 |
|   Cycles 1 and 2 (field locating used)               | 58.3 | 62.8 | 60.4 | 67.4 | 63.6 | 65.4 |
|   Cycle 3 (no field locating)                        | 42.1 | 38.2 | 40.1 | 48.9 | 50.7 | 49.8 |
| Not married to biological father of child            |      |      |      |      |      |      |
|   Full experiment sample                             | 51.6 | 47.8 | 49.6 | 55.5 | 55.8 | 55.7 |
|   Cycles 1 and 2 (field locating used)               | 62.8 | 59.9 | 61.4 | 66.7 | 62.3 | 64.5 |
|   Cycle 3 (no field locating)                        | 43.4 | 40.3 | 41.8 | 48.3 | 51.5 | 49.9 |


Results also suggest that the early bird incentive has a slightly larger effect for the more vulnerable group within each of the participant characteristics we examined (see Table 5). In particular, there was a slightly greater benefit of the early bird incentive for pregnant mothers, families that had moved in the prior year, families where the father figure does not live in the household, and families where the mother is not married to the biological father of the child. This effect is present across the full sample, for the cycles that received field locating as well as the cycle that did not.


This is especially important because we know that a significantly greater percentage of individuals with each of these more vulnerable characteristics are in our nonrespondent group (see Table 3).



Table 5: Effect of the early bird incentive for more and less vulnerable participants (based on participant characteristics measured at study entry)


Full sample

| Participant characteristic at entry             | Non early bird | Early bird | Difference | Participant characteristic at entry      | Non early bird | Early bird | Difference |
| Pregnant                                        | 45.9           | 54.9       | 9.0        | Not pregnant                             | 58.3           | 60.9       | 2.6        |
| Moved in the prior year                         | 39.2           | 51.5       | 12.3       | Didn't move in the prior year            | 55.0           | 59.8       | 4.8        |
| Child's father figure doesn't live in household | 48.3           | 55.9       | 7.6        | Child's father figure lives in household | 57.2           | 61.6       | 4.4        |
| Not married to biological father of child       | 49.7           | 55.7       | 6.0        | Married to biological father of child    | 61.8           | 67.1       | 5.3        |

Cycles 1 + 2 (with field locating)

| Participant characteristic at entry             | Non early bird | Early bird | Difference | Participant characteristic at entry      | Non early bird | Early bird | Difference |
| Pregnant                                        | 55.3           | 63.7       | 8.4        | Not pregnant                             | 68.9           | 68.6       | -0.3       |
| Moved in the prior year                         | 48.1           | 61.7       | 13.6       | Didn't move in the prior year            | 68.6           | 68.8       | 0.2        |
| Child's father figure doesn't live in household | 60.4           | 65.4       | 5.0        | Child's father figure lives in household | 67.6           | 69.4       | 1.8        |
| Not married to biological father of child       | 61.4           | 64.5       | 3.1        | Married to biological father of child    | 75.0           | 76.3       | 1.3        |

Cycle 3 (no field locating)

| Participant characteristic at entry             | Non early bird | Early bird | Difference | Participant characteristic at entry      | Non early bird | Early bird | Difference |
| Pregnant                                        | 41.9           | 51.3       | 9.4        | Not pregnant                             | 46.6           | 52.5       | 5.9        |
| Moved in the prior year                         | 32.3           | 46.0       | 13.7       | Didn't move in the prior year            | 46.2           | 53.4       | 7.2        |
| Child's father figure doesn't live in household | 40.1           | 49.8       | 9.7        | Child's father figure lives in household | 49.8           | 55.7       | 5.9        |
| Not married to biological father of child       | 41.8           | 49.9       | 8.1        | Married to biological father of child    | 52.4           | 59.8       | 7.4        |



Program model. The most recent round of data collection has seen some differential nonresponse by program model (see Table 6). In particular, we see higher response rates for Early Head Start-Home-Based Option and Healthy Families America and lower rates for Nurse-Family Partnership and Parents as Teachers. This is true of response rates for the overall sample and for the cycles that received field locating. However, this pattern does not appear to hold for the cycle without field locating.


Any differences in response rate across program models are particularly concerning because a primary purpose of the study is to compare four national home visiting program models and learn which approaches are most effective. A sufficient number of respondents in each program model is needed to make this comparison. We anticipate the early bird incentive will assist in reaching a sufficient number of respondents, as results suggest that it improves response rates irrespective of program model.



Table 6: Response rates by program model


|                                         | Treat 1            | Treat 2     | Treat 1+2                  | Treat 3         | Treat 4                | Treat 3+4              | Total            |
|                                         | standard incentive | prepay only | did not receive early bird | early bird only | early bird plus prepay | did receive early bird | all participants |
| Early Head Start – Home-Based Option    |      |      |      |      |      |      |      |
|   Full experiment sample                | 63.5 | 54.7 | 59.1 | 68.9 | 55.4 | 61.9 | 60.5 |
|   Cycles 1 and 2 (field locating used)  | 73.5 | 69.4 | 71.4 | 76.5 | 66.7 | 71.4 | 71.4 |
|   Cycle 3 (no field locating)           | 51.7 | 35.7 | 43.9 | 59.3 | 41.4 | 50.0 | 46.9 |
| Healthy Families America                |      |      |      |      |      |      |      |
|   Full experiment sample                | 52.9 | 51.2 | 52.0 | 56.8 | 65.2 | 61.0 | 56.5 |
|   Cycles 1 and 2 (field locating used)  | 73.1 | 64.8 | 68.9 | 66.0 | 80.0 | 73.2 | 71.0 |
|   Cycle 3 (no field locating)           | 42.9 | 44.4 | 43.7 | 52.3 | 57.6 | 54.9 | 49.3 |
| Nurse-Family Partnership                |      |      |      |      |      |      |      |
|   Full experiment sample                | 52.6 | 46.5 | 49.7 | 58.1 | 55.6 | 56.9 | 53.2 |
|   Cycles 1 and 2 (field locating used)  | 57.1 | 75.0 | 64.9 | 66.7 | 63.2 | 65.0 | 64.9 |
|   Cycle 3 (no field locating)           | 50.9 | 38.2 | 44.6 | 54.7 | 52.8 | 53.8 | 49.1 |
| Parents as Teachers                     |      |      |      |      |      |      |      |
|   Full experiment sample                | 48.1 | 50.0 | 49.0 | 49.6 | 55.5 | 52.5 | 50.8 |
|   Cycles 1 and 2 (field locating used)  | 55.4 | 54.6 | 54.9 | 65.6 | 53.3 | 59.5 | 57.2 |
|   Cycle 3 (no field locating)           | 40.9 | 45.2 | 43.0 | 33.9 | 57.6 | 45.8 | 44.4 |




Time to respond and use of study resources. In addition to improving response rates, the early bird incentive also reduced the need to use study resources to call families. That is not only because the early bird incentive resulted in higher response rates, but also because those responses occurred sooner in the fielding period. The following figure compares response rates over the first eight weeks of fielding between those offered the early bird incentive and those not offered it (“other” in the figure below). The comparison is made for the full experiment sample (solid lines), for Cycles 1 and 2 (dashed lines), and for Cycle 3 (dotted lines). Results are shown only through eight weeks because respondents received an additional incentive from the early bird offer only for that period.



The figure shows that the early bird offer generated higher response rates throughout the fielding period for each of the samples displayed. For the full sample, the difference in response rates grew from about 2 percentage points after the first week to about 4 percentage points after Week 4 and 8 percentage points after Week 8 (and the trend is similar for each of the samples compared). In addition, each person who responded early to the early bird incentive did not have to be called by the team to encourage survey completion, allowing effort to be redirected toward increasing the overall response rate and reducing nonresponse bias. The requested wording changes to the email and text reminders may also decrease response time and allow effort to be redirected.


When participants respond quickly, this gives research staff more time and funds to reach the participants who are most difficult to find. This savings of effort and funds facilitates higher response rates and reduces the likelihood of differential nonresponse based on any characteristics that are directly associated with how difficult families are to locate, such as having moved in the year prior to the study.


Given the higher than anticipated costs of locating and reaching the families participating in this study, we may not be able to complete data collection for the full sample with currently available funds. Cost savings through the use of the early bird incentive will increase the likelihood that we will be able to complete data collection for our full sample.


Importance of maximizing response rates


MIHOPE participants are highly mobile: between when families entered the study (either while the participant was pregnant or when she had a child under six months old) and when the child was 15 months old, about 40 percent of the sample had moved. Because of this, it has been a challenge to achieve response rates for follow-up data collection that meet standard criteria for high quality studies (such as those set out by the What Works Clearinghouse; U.S. Department of Education, Institute of Education Sciences) or that will provide enough statistical power to answer some of the study’s primary research questions. Keeping low-income individuals and young adults, like those participating in MIHOPE, engaged in research over the course of many years is very challenging, because they tend to move often or lack stable contact information, and some may have hostile views toward government or social science (Becker, Berry, Orr, & Perlman, 2014; Haan & Ongena, 2014; Tourangeau, 2014).


In a longitudinal study, high response rates are important for several reasons: 1) a high response rate increases the likelihood that survey respondents are representative of the initial sample; 2) the response rate must be sufficient to make comparisons across program models; and 3) we have found a lower response rate for those who were nonresponders in the previous round of data collection, suggesting that a low response rate in the current round may further depress response in future rounds.


Last year, MIHOPE collected extensive information on best practices for follow-up longitudinal data collection, and designed the data collection efforts accordingly. Despite this, we have encountered the differential nonresponse biases cited above as well as lower than expected overall response rates. Because contacting these highly mobile families has been so challenging, extra field-locating efforts were used to obtain the response rates reported above for Cycles 1 and 2 of data collection. These continued extra efforts are not sustainable for the project, which means that future cohorts will likely have greater nonresponse bias, differential attrition, and higher overall attrition rates. Even our current overall and differential attrition are no longer clearly within acceptable ranges as defined by the What Works Clearinghouse (WWC)6 and by HHS’s Home Visiting Evidence of Effectiveness review (HomVEE)7, the primary evidence review for the home visiting field. Our current overall attrition is 45.2%, and our differential attrition (between program and control groups) is 4%.
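For reference, the attrition figures cited above relate to response rates through simple arithmetic (a minimal illustration of the definitions as we use them here, not a restatement of the WWC or HomVEE calculation standards):

\[
\text{overall attrition} = 1 - \text{overall response rate},
\qquad
\text{differential attrition} = \left|\,\text{attrition}_{\text{program}} - \text{attrition}_{\text{control}}\,\right|
\]

Under this arithmetic, the 45.2% overall attrition corresponds to an overall response rate of 54.8%, and the 4% differential attrition means the program and control groups’ response rates currently differ by about 4 percentage points.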


HomVEE considers our current attrition rates “high,” making it impossible for this study to receive a high quality rating from the review and allowing it to receive only a moderate quality rating if the study meets a set of additional criteria. We have completed only three of the six sample releases in the 2.5 year old round of surveys, and we expect overall attrition to increase in each round, which will further narrow the acceptable range of differential attrition. We believe that offering the early bird incentive to all future cohorts will help lower overall attrition rates and reduce nonresponse bias, thereby increasing the quality of the study and raising both its HomVEE and WWC ratings. Though the magnitude of improvement in response rates with an early bird incentive is smaller than the improvement that could be expected from continuing field locating, given funding constraints we believe it is important to use the early bird incentive to maximize the current investment in this data collection effort.







REFERENCES



Becker, Kirsten, Sandra Berry, Nate Orr, and Judy Perlman. 2014. “Finding the Hard-to-Reach and Keeping Them Engaged in Research.” Pages 619-641 in Roger Tourangeau, Brad Edwards, Timothy P. Johnson, Kirk M. Wolter and Nancy Bates (eds.), Hard-to-Survey Populations. Cambridge, UK: Cambridge University Press.


Cantor, D., B. O’Hare, and K. O’Connor. 2007. “The Use of Monetary Incentives to Reduce Non-Response in Random Digit Dial Telephone Surveys.” Pages 471-498 in J. M. Lepkowski, C. Tucker, J. M. Brick, E. De Leeuw, L. Japec, P. J. Lavrakas, M. W. Link, and R. L. Sangster (eds.), Advances in Telephone Survey Methodology. New York: John Wiley and Sons.


Haan, Marieke, and Yfke Ongena. 2014. “Tailored and Targeted Designs for Hard-to-Survey Populations.” Pages 555-574 in Roger Tourangeau, Brad Edwards, Timothy P. Johnson, Kirk M. Wolter and Nancy Bates (eds.), Hard-to-Survey Populations. Cambridge, UK: Cambridge University Press.


LeClere, F., S. Plumme, J. Vanicek, A. Amaya, and K. Carris. 2012. “Household Early Bird Incentives: Leveraging Family Influence to Improve Household Response Rates.” American Statistical Association Joint Statistical Meetings, Section on Survey Research.


Singer, Eleanor, John Van Hoewyk, Nancy Gebler, Trivellore Raghunathan, and Katherine McGonagle. 1999. “The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys.” Journal of Official Statistics 15: 217-230.


Tourangeau, Roger. 2014. “Defining Hard-to-Survey Populations.” Pages 3-20 in Roger Tourangeau, Brad Edwards, Timothy P. Johnson, Kirk M. Wolter and Nancy Bates (eds.), Hard-to-Survey Populations. Cambridge, UK: Cambridge University Press.


U.S. Department of Education, Institute of Education Sciences, What Works Clearinghouse. 2013. “Assessing Attrition Bias.” Retrieved from: https://ies.ed.gov/ncee/wwc/Docs/ReferenceResources/wwc_attrition_v2.1.pdf

1 Initial data collection for the legislatively mandated Mother and Infant Home Visiting Program Evaluation (MIHOPE) was approved in January 2012.

2 Response rates for those who received the early bird incentive include participants from treatment groups 3 and 4. We contrast those response rates with the participants in treatment groups 1 and 2 combined. Response rates for those who received the prepay incentive include participants in treatment groups 2 and 4. We contrast those response rates with participants in treatment groups 1 and 3 combined.

3 We ended data collection for the sample when we stopped seeing gains from additional contact attempts to participants.

4 The previous version of this memorandum provided results that were reflective of either response rates for the entire fielding period for cycles 1 and 2 (through 17 weeks) or reflective of response rates for the full sample through week 12 only.

5 Numbers for this table represent percentages for the full sample of participants. Therefore, it includes both results for Cycle 1 + Cycle 2, which received calls until 8 weeks and field locating until 17 weeks, and results for Cycle 3, which received calls for 12 weeks and no field locating.


