Incentive Increase Responses and Justification

MIHOPE2 responses to OMB comments on incentive memo 05mar15.docx

Mother and Infant Home Visiting Program Evaluation (MIHOPE)

OMB: 0970-0402


Responses to OMB’s comments on our memo requesting to increase the incentive amounts:


1.      The agency notes that it had intended to achieve an 85% response rate at follow-up, and that the observed response rate of ~60% is lower than expected. In my experience, a response rate of ~60% is typical when collecting information from mothers of young children. Although I understand that the agency had planned for 85%, it may not be able to achieve a much higher response rate even with increased incentive amounts.


Response: While it can be challenging to achieve high response rates with a young, highly mobile population, we were aiming for 85% in MIHOPE because we have had success in past studies in getting 75-85% of participants to respond to follow-up surveys. For example, in Building Strong Families, which included unmarried couples who had just had a baby and in which 72% of the mothers were under age 25 at baseline, the 15-month follow-up survey achieved a 78% response rate. In the Supporting Healthy Marriage evaluation, the response rate for mothers was 85% at 12 months following random assignment. In the national impact evaluation of Early Head Start, where participants were about the same age on average as in MIHOPE, 75% of the sample responded to a follow-up survey 15 months after random assignment. Therefore, while an 85% response rate may be ambitious, our experience in past studies with similar populations suggests that a response rate higher than the current 60% is still achievable.


2.      In general, we are not concerned with response rates so much as we are concerned about differential nonresponse. From your note, we do not see evidence of differential nonresponse. If you have this information, it would be useful. Any argument for increases to incentive amounts would have to be made in addressing the observed differential nonresponse.


Response: While we agree that differential response between the program and control groups would be especially problematic, low response rates overall reduce the study’s ability to detect statistically significant effects of home visiting, which is one of its primary goals. For example, using the sample pooled across all sites, MIHOPE is designed to detect effects of 0.069 standard deviations with an 85% response rate; the minimum detectable effect increases slightly to 0.073 standard deviations with a 75% response rate, but more markedly to 0.082 standard deviations with a 60% response rate. The low response rate becomes potentially more problematic for key research questions that do not use all sites. For example, a goal of the study is to estimate the effects of home visiting for each of four national models included in the study. For a national model that contributed one-fourth of the full sample, the study’s minimum detectable effect is 0.109 standard deviations with an 85% response rate but increases to 0.116 with a 75% response rate and to 0.130 with a 60% response rate.
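The scaling behind these figures can be checked with a short calculation. Assuming the minimum detectable effect varies with the inverse square root of the achieved sample (a standard approximation; the memo does not state its power-calculation formula), the numbers above are mutually consistent:

```python
import math

def mde_at_rate(mde_85: float, rate: float) -> float:
    """Scale an MDE computed for an 85% response rate to another
    response rate, assuming MDE is proportional to 1/sqrt(n * rate)."""
    return mde_85 * math.sqrt(0.85 / rate)

# Full pooled sample: 0.069 SD at an 85% response rate
print(f"{mde_at_rate(0.069, 0.75):.3f}")  # 0.073
print(f"{mde_at_rate(0.069, 0.60):.3f}")  # 0.082

# One national model with one-fourth of the sample: 0.109 SD at 85%
print(f"{mde_at_rate(0.109, 0.75):.3f}")  # 0.116
print(f"{mde_at_rate(0.109, 0.60):.3f}")  # 0.130
```

Note that dropping from an 85% to a 60% response rate inflates the minimum detectable effect by about 19% in both samples, since the ratio depends only on the response rates.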


3.      The proposed plan to improve response rates by increasing incentive amounts does not seem to address the higher-than-expected rate of mothers requiring in-person location (those who did not respond by telephone).


Response: We believe, based on the literature cited in the previously submitted memo, that offering higher incentives will encourage families to complete the survey by phone, thereby reducing the need for in-person locating.


As noted in the request for increased incentives, the team is not relying solely on incentive increases to reduce the need for in-person locating but is also undertaking the following activities:


  • conducting batch searches using national address databases to confirm the respondent's address and telephone number before mailing the advance letter and dialing the respondent;

  • conducting additional internet searches as needed to locate respondents using Google pages, white pages, and social media like Facebook and Instagram;

  • calling the contact names the respondent provided at baseline to try to reach respondents who have moved or changed telephone numbers and, if needed, conducting database searches to locate those contact persons if they themselves have moved or changed phone numbers;

  • using paradata to inform the best time of day and day of week to dial respondents, based on optimal patterns of response in completing the baseline and follow-up surveys;

  • assigning cases that completed the baseline survey in Spanish to a Spanish-speaking interviewer for their initial and follow-up dial attempts; and

  • assigning cases that appear to be passive avoiders to the most experienced telephone interviewers with high cooperation rates to try again.


4.      In Federally directed research, we generally do not use incentives at all. When an incentive is used, the amount is intended to offset the burden of respondent participation. In Federally directed research, incentive amounts in the range of $50 are more typically reserved for biospecimen collection or other forms of participation that are particularly burdensome. MIHOPE does not collect this form of information.


Response: We understand and have reduced the requested incentive increase accordingly. We also propose an experiment to examine the effects of different incentive amounts on response and completion rates for the survey and in-home data collection. Please see the response to comment #8 below for more details.


5.      The examples cited of improved response rates (although, see the limitation noted in item 2 above) describe an observed difference between $20 and $35 with “significantly improved response rates.” They do not speak to the difference in response rate that would be expected for $25 to $40 for the survey and $20 to $50 for the in-home assessment; it is not clear how much the response rate would change.


Response: We expect an increase from $25 to $40 to have a similar effect to an increase from $20 to $35 since both are increases of $15. We agree that an increase from $20 to $50 is a larger increase than used in prior studies, and it is not clear whether this will make a larger difference. As discussed below, we now propose a more modest increase and to test the impact of the increase by randomly assigning families to different incentive amounts.


6.      Three of the studies cited on page 4 do not appear to be Federally-directed research.  In YouthBuild, FACES and Baby FACES, the incentive cited was less than proposed here. In STED, the population sought is by definition a very hard to reach population. Re Supporting Healthy Marriage, it is unclear to what extent response rates improved. (Again, note the limitation in item 2.) If anything, I would want to see the results of increased incentive use in Supporting Healthy Marriage and STED before extending that more broadly or curtailing incentive use further.


Response: In SHM, comparing the periods immediately before and after incentives were increased from $30 to $50, response rates increased from 70% to 78% for the survey and from 22% to 52% for the videotaped interactions. This comparison may suggest that increased incentives will be important for the in-home data collection in MIHOPE, which includes a video-recorded interaction between the mother and child and a direct assessment of the child’s receptive language skills.


However, changes seen in SHM were not based on randomization of incentives, and increased incentives were accompanied by other changes to the survey firms’ procedures that may be responsible for part of the response rate gain. That is why we agree with OMB’s suggestion to conduct an experiment to understand more about the independent role of incentives in encouraging study participants to complete data collection.


The incentive increase was implemented too recently in STED to have precise numbers on its effect on response rates.


7.      It is perhaps important to underscore that the concern with giving incentives to participants is the public misperception that an agency is purchasing a particular response. Therefore, the concern is not necessarily outweighed by reduction in overall study costs.


Response: We understand and are sensitive to these concerns. Our participant materials are worded with these concerns in mind, and our justification package makes clear that the purpose of the incentive is to improve response rates and data quality, not to purchase particular responses. This is consistent with OMB’s guidelines on offering incentives to respondents.


8.      If OPRE revises its proposal to a more modest incentive increase that focuses on addressing observed differential nonresponse and higher than expected in person location, and if this proposal were in the form of an experiment, that may be helpful to you.


Response: Given the scarcity of rigorous evidence on the effects of incentives on MIHOPE’s target population, we are open to conducting an experiment with a more modest increase in incentives. Specifically, we would like to propose a factorial design to test three versions of incentives for the survey and two levels of incentives for in-home data collection.


For the survey, families would be randomly assigned to one of three groups: (1) a group that would be offered the current incentive ($25), (2) a group that would be offered a more generous incentive ($35), and (3) a group that would be offered a more generous incentive only if they respond to the survey by calling the Mathematica Survey Operations Center. Comparing groups (1) and (2) would provide information on the impact of a higher incentive. Comparing (1) and (3) would provide a test of a potentially more efficient set of incentive payments. Each individual who is encouraged by incentives to call the Survey Operations Center will save the study considerable resources.


For in-home data collection, families would be randomly divided so that half would be offered the current incentive ($20) and half would be offered a more generous incentive ($40). The incentive is more generous for in-home data collection than for the survey because in-home data collection is expected to take 1.5 hours to complete, while the survey is designed to take one hour.


For the full experiment, 1,200 families would be randomly assigned to six groups (three versions of survey incentives crossed with two levels of in-home incentives). Adjusting for the three comparisons described above, the full experiment would have an 80% chance of detecting statistically significant differences of 10 percentage points in survey response rates (for example, an increase from 65% to 75%) and 8 percentage points for in-home data collection. If results halfway through the experiment show a substantial difference in response rates, we would, in consultation with OMB, stop the experiment and use the more generous incentives with the remaining sample. If there is no substantial difference at the halfway point, the experiment would continue and the results would be used to determine whether to increase incentives for the remainder of the study, which would include an additional 1,500-2,000 families to be followed up, depending on when the experiment begins. In either case, the experiment would provide rigorous evidence that could inform incentive amounts in future studies and in longer-term follow-up with the MIHOPE sample.
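As an illustration of the stated power target (the memo does not show its calculation, and this sketch ignores the multiple-comparison adjustment it mentions), a two-proportion z-test approximation for the survey comparison, with roughly 400 families per group, gives power in this range:

```python
import math

def normal_cdf(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def power_two_proportions(p1: float, p2: float, n_per_group: int) -> float:
    """Approximate power of a two-sided two-proportion z-test
    at alpha = 0.05 (critical value 1.96), normal approximation."""
    se = math.sqrt(p1 * (1 - p1) / n_per_group + p2 * (1 - p2) / n_per_group)
    return normal_cdf(abs(p2 - p1) / se - 1.96)

# 1,200 families split across three survey-incentive groups: ~400 per group.
# Power to detect a 10-point difference (65% vs. 75%), unadjusted:
print(f"{power_two_proportions(0.65, 0.75, 400):.2f}")  # 0.87
```

With the adjustment for three comparisons that the memo alludes to, the critical value rises and the power falls toward the 80% figure cited.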



Author: Charles Michalopoulos