Mother and Infant Home Visiting Program Evaluation (MIHOPE):

MIHOPE Check-in



OMB Information Collection Request

0970-0402



Supporting Statement

Part B

May 2017


Submitted By:

Office of Planning, Research and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


330 C Street SW, 4th Floor
Washington, D.C. 20201


Project Officer:

Nancy Geyelin Margie


Part B. COLLECTION OF INFORMATION USING STATISTICAL METHODS


B1. Sampling


The sampling plan for MIHOPE was described in the supporting statement for Phase 1 (MIHOPE 1) data collection activities (attached here). MIHOPE recruited 4,229 families from approximately 88 local programs (sites) in 12 states. Families were randomly assigned either to a program group, which could enroll in one of the home visiting programs being studied, or to a control group, which was provided with referrals to other services in the community. Families were eligible for the study if the mother was pregnant or the family had a child under six months old when they were recruited for the study, the mother was 15 years of age or older, and the mother was available to complete the baseline family survey. Local sites were chosen that met several criteria: (1) their programs had been operating for at least two years, (2) there was evidence of enough demand for home visiting services to provide a control group, (3) there was no evidence of severe implementation problems that would interfere with the program’s ability to participate in the study, and (4) they contributed to the diversity of sites and families for purposes of estimating effects for important subgroups of families. The OMB supporting statement for the MIHOPE data collection (approved July 12, 2012) indicated that the sample is adequate to detect policy-relevant impacts of home visiting overall, for key subgroups, and for each of the four evidence-based models included in the evaluation (see Attachments 1 and 2).


Families were recruited into the study by Mathematica’s survey research staff, who visited families to obtain informed consent when home visitors determined whether a family was eligible for the study or soon after that determination was made. MIHOPE Check-in will collect additional information from families when the child is approximately 2.5, 3.5, and 4.5 years of age.


Baseline data collection was completed in September 2015. A total of 4,229 participants completed the baseline survey (MIHOPE 1), with 2,111 participants randomized into the treatment group and 2,118 participants randomized into the control group. Follow-up data collection conducted when children were 15 months old (MIHOPE 2) has been completed with the majority of the sample and will end in July 2017. The 15-month data collection has been challenging, with lower response rates and more in-person locating than expected, as discussed in the documentation submitted for the non-substantive change approved on April 22, 2015 (ICR Reference No: 201504-0970-001). As approved by OMB, the study conducted an experiment with incentive amounts on the 15-month data collection effort (MIHOPE 2) to see if they increased response rates. Specifically, the 15-month follow-up randomly assigned potential respondents to one of three versions of incentives for the survey (the current amount; a greater amount; a greater amount only if they called to complete the survey) and one of two levels of incentives for in-home data collection (the current amount; double the current amount). Results will be reviewed with OMB through submission as a non-substantive change.


B2. Procedures for collection of information


This section describes the collection of follow-up data for MIHOPE Check-in. Best practices will be followed for conducting the data collection, including training and certifying staff on data collection procedures and monitoring data collection to ensure that high quality data are collected, high response rates are achieved, and differential response rates are avoided. Our follow-up data collection method builds on the methods being used in MIHOPE to the extent possible. In particular,


  • Sample members will first be encouraged to complete the follow-up survey online. This option is being provided to increase response rates, reduce the level of effort needed to obtain survey responses, reduce respondent burden, and provide convenience and privacy to respondents in completing the survey.

  • For those who do not complete the survey online, computer-assisted telephone interviewing (CATI) will be used to conduct the survey.

  • Tokens of appreciation will be provided to increase families’ willingness to respond to each of the follow-up surveys.


Conducting the Follow-Up Family Survey


The sample will be released for the follow-up family survey every six months. Each survey release will include families in which the focal child reached 2.5, 3.5, or 4.5 years of age during the prior six-month period. Prior to each survey release, Mathematica will send the family an advance letter about the follow-up survey (Attachment 5). Families who provided an email address in a prior round of the survey will also be sent an email (Attachment 5). Both the email and the advance letter will contain information about the study, the web survey URL, the family’s unique username and password, the gift card amount to be provided for completing the survey, and a toll-free number to contact the study team. Families who did not provide an email address will receive a postcard in the mail one week after the advance letter is sent out, while those with email addresses will receive a second email (Attachment 5) after their receipt of the advance letter.

Telephone interviewers at Mathematica’s Survey Operations Center (SOC) will begin trying to contact families who have not yet completed the survey two weeks after the advance letter is mailed. Telephone interviewers will call nonresponding families who do not have an email address for six weeks. If the family does have an email address, Mathematica telephone interviewers will begin trying to reach them four weeks after mailing the advance letter and will continue calling for an additional four weeks. These families will receive an additional email reminder about the survey three weeks after the initial email is sent. Field locators may begin working cases if telephone attempts do not yield successful contact with the family and completion of the survey within eight weeks of the advance letter being mailed. In addition to the methods mentioned above, we will use a variety of methods throughout the data collection period to remind respondents to complete the survey, including additional phone calls, postcards, letters, emails, and text messages (Attachment 5). During each follow-up survey, we will ask families about their preferred method of contact (i.e., mail, phone, email, or text message) and will concentrate our contacting efforts in future survey rounds on each family’s preferred method.


We may develop and maintain a study Facebook profile to maintain contact and relay information about the study to participating families. We plan to use Facebook as a tool for directly contacting study participants who are not reachable through other means, as well as a locating tool. Facebook pages of participating families may contain contact information, such as a phone number, email address, or mailing address, that is not available through other sources. Facebook users tend to monitor and maintain their pages more actively and consistently than other forms of contact, such as phone numbers or email addresses, which may change frequently or be used less often by members of a young, mobile population. During each survey round, we will ask families whether they have a Facebook profile and the email address they use on the profile, and we will obtain their consent to contact them via Facebook. If the family has a Facebook profile and consents to be contacted this way, we will send the family a friend request (if the family is not our “friend” already). Our study Facebook profile will have privacy settings such that families who friend the study cannot see each other, ensuring that the identities of families participating in the study remain private. Periodically, we will post updates about the study to the Facebook profile to keep families informed about what is happening with the study.


Prior to beginning administration of the 2.5-year survey, we will obtain verbal consent for the survey if it will be conducted over the phone. The web version of the survey will contain the same language about the purpose of the survey, the expected time to complete it, and its voluntary nature (Attachment 3).


To help inform future data collection for MIHOPE and other studies, we proposed to conduct an experiment on incentives. We tested two commonly used incentive structures, prepaid incentives and early bird incentives, to examine the best method for maximizing survey response among a low-income, highly mobile population. Researchers have long regarded prepaid incentives as having the potential to generate increased response rates to surveys (Cantor et al. 2007; Singer et al. 1999). Furthermore, early bird incentives have been shown to decrease the number of days to complete a survey, which can shorten the total survey field period and potentially result in lower costs (LeClere et al. 2012).


Individuals were divided into the following four groups, depending on whether they were offered an additional “early bird” incentive for completing the survey quickly and whether they were offered a prepaid incentive, in which $5 of the incentive was included with the advance letter and the remainder was paid after completing the survey:

  • No early bird, no prepaid: $15 after completing the survey.

  • No early bird, prepaid: $5 with the advance letter, $10 after completing the survey.

  • Early bird, no prepaid: $25 if the survey was completed within 8 weeks, $15 otherwise.

  • Early bird, prepaid: $5 with the advance letter, with the remainder ($20 if the survey was completed within 8 weeks, $10 otherwise) paid after completing the survey.

Results from the experiment are included in a memorandum that accompanies this submission.
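
To make these payment rules concrete, the following minimal Python sketch computes the amounts paid under each arm. It is illustrative only; the function name and structure are hypothetical and are not part of the study’s survey systems.

# Illustrative sketch of the four incentive arms described above (hypothetical code).
def incentive_payments(early_bird: bool, prepaid: bool, completed_within_8_weeks: bool):
    """Return (amount sent with the advance letter, amount paid on completion)."""
    # Early bird arms pay $25 total for completion within 8 weeks, $15 otherwise;
    # non-early-bird arms pay $15 total regardless of timing.
    total = 25 if (early_bird and completed_within_8_weeks) else 15
    # Prepaid arms include $5 with the advance letter; the remainder is paid
    # after the survey is completed.
    upfront = 5 if prepaid else 0
    return upfront, total - upfront

# Example: the prepaid, early bird arm with completion within 8 weeks
# yields $5 up front and $20 on completion.
print(incentive_payments(early_bird=True, prepaid=True, completed_within_8_weeks=True))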



B3. Maximizing response rates


Minimizing sample attrition is of utmost importance to any longitudinal study. Many MIHOPE families are likely to be highly mobile, creating a risk of attrition at follow-up.


Several strategies will be adopted to mitigate the risk of attrition at follow-up:


  • Use the detailed information collected in MIHOPE 1 and MIHOPE 2 (including names, dates of birth, Social Security numbers, addresses and phone numbers (home and work), and email addresses for the family, as well as addresses and phone numbers for up to three relatives or friends who will know how to reach the family) and employ Mathematica’s highly effective locating techniques to reach families.

  • Train field staff in how to gain cooperation and avoid refusals.

  • Provide tokens of appreciation for each follow-up survey to encourage participation.

  • Use the email addresses and cellular telephone numbers of participants that have been collected in MIHOPE 1 and MIHOPE 2 to send email and text message reminders about the follow-up survey during each survey field period.

  • Use an early bird incentive structure to encourage participant response.


Additional strategies that may be adopted:


  • Develop and maintain a study Facebook profile to maintain contact and relay information about the study to participating families.

  • Use wording in participant contact materials that reflects best practice from the field of behavioral economics research.


Updating Participant Contact Information. Mathematica’s Sample Management System (SMS) will be the central clearinghouse for all contact information on MIHOPE families and will also be used to track survey response rates and potential sample attrition. Contact between rounds of the survey will increase sample retention and reduce the level of effort needed to locate families. To reduce the loss of families between rounds of the follow-up survey, we plan to send families a study information packet containing a newsletter with updates about the study and a small gift, such as a refrigerator magnet or book of sticky notes, with the study’s name and toll-free number listed on it. We will send these between each survey round so that we contact all sampled families at least once every six months. Additionally, we will send a birthday postcard to each child and mother/caregiver on a yearly basis (Attachment 5). Because the newsletter will contain information about ongoing study activities, it has not yet been developed.


If any letters or birthday postcards are returned to Mathematica from the post office with an updated address, we will document the new address for the family in the SMS and re-mail the letter or birthday card to the updated address.


Locating Participants. Although the outlined strategies to track participants between follow-up rounds will likely result in lower attrition rates, additional techniques will be employed to ensure a high response rate at each follow-up round from this mobile population. Mathematica has extensive experience conducting studies with mobile and hard-to-reach populations and has developed several techniques to locate these populations. Locating can be costly, depending on which methods are used. In general, mailing letters and receiving updated information via returned mail is less expensive than electronic database searches; electronic database searches are less expensive than locators calling neighbors or other contacts; and telephone locating is less expensive than in-person field locating. The least expensive methods (mailing and electronic locating) will be used before moving to more expensive methods (telephone and in-person locating). As preparations to conduct each round of follow-up data collection get underway, the following process for locating participants will be employed: (1) pre-field mailing, (2) in-house locating, and, as needed and as the budget allows, (3) field locating. All materials used for locating participants are included in Attachment 5.


  1. Pre-Field Mailing. Prior to each sample release, families with a focal child who reached 2.5, 3.5, or 4.5 years of age during the previous six months will be sent an advance letter about the follow-up survey. Any letters that are returned to Mathematica with updated information will be re-mailed to the new address, and the new address will be entered into the SMS.

  2. In-House Locating. Custom database searches and telephone calls to contacts provided by the family during prior rounds of the survey will be conducted when the existing contact information we have for a family is no longer valid and pre-field mailing does not yield an updated telephone number or address. Mathematica’s specialized locating staff uses searchable databases, directory assistance services, reverse directories, and contacts with neighbors and community organizations to obtain current contact information. Mathematica’s locating staff will also search the web and social networks such as Facebook, Myspace, and Instagram to find sample members’ contact information.

  3. Field Locating. Some families will not be locatable using in-house locating methods. These families will be assigned to field locators, who will employ proven techniques for finding hard-to-find populations. For instance, field staff may approach neighbors residing in close proximity to a family’s last known address or the contact persons provided during prior survey rounds. They will also rely on neighborhood resources such as local post offices, churches, bars, homeless shelters, or community centers as sources of information. Field staff will be trained not to reveal any private information about the participant to any informants, including the study’s name or unique details about the study. All field staff will be equipped with cellular telephones and will initiate the call to the SOC so that the family, once found, can complete the survey by phone.


Non-response bias analysis. All efforts will be made to obtain information on a high proportion of families, including offering the maximum $25 incentive to respondents and taking the other steps listed in section B2. We will monitor response rates for the program and control groups throughout data collection. The steps listed above for mitigating attrition will also help mitigate the potential for non-response bias by helping to ensure a high response rate. A non-response analysis will nonetheless be conducted to determine whether the results of the study may be biased by non-response. In particular, two types of bias will be assessed: (1) whether estimated effects among survey respondents apply to the full study sample, and (2) whether program group respondents are similar to control group respondents. The first type of bias affects whether results from the study can be generalized to the wider group of families involved in the study, while the second assesses whether the impacts of the programs are being confounded with pre-existing differences between program group and control group respondents.


To assess non-response bias, several tests will be conducted.


  • The proportion of program group and control group respondents will be compared to make sure the response rate is not significantly higher for one research group.


  • A logistic regression will be conducted among respondents. The “left-hand-side” variable will be their assignment (program group or control group), while the explanatory variables will include a range of baseline characteristics. An omnibus test such as a likelihood ratio test will be used to test the hypothesis that the set of baseline characteristics is not significantly related to whether a respondent is in the program group (see the sketch following this list). Not rejecting this null hypothesis will provide evidence that program group and control group respondents are similar.


  • Baseline characteristics of respondents will be compared to baseline characteristics of non-respondents. This will be done using a logistic regression where the outcome variable is whether someone is a respondent and the explanatory variables are baseline characteristics. An omnibus test such as a likelihood ratio test will be used to test the hypothesis that the set of baseline characteristics is not significantly related to whether a sample member responded. Not rejecting this null hypothesis will provide evidence that non-respondents and respondents are similar.


  • Impacts from administrative records sources, which are available for the full sample, will be compared for the full sample and for respondents to determine whether there are substantial differences between the two. This analysis can be done using early impacts from administrative data from MIHOPE 2 or new administrative data collected during MIHOPE Check-in.
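
The sketch below illustrates how the first three tests could be implemented in Python. The covariate names and the synthetic data are hypothetical stand-ins for the MIHOPE analysis file, and statsmodels’ likelihood ratio p-value serves as the omnibus test; this is a sketch under those assumptions, not the study’s actual analysis code.

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.proportion import proportions_ztest

# Synthetic stand-in data; the actual analysis would use the MIHOPE analysis file.
rng = np.random.default_rng(0)
n = 4229
df = pd.DataFrame({
    "program_group": rng.integers(0, 2, n),      # 1 = program group
    "responded": rng.integers(0, 2, n),          # 1 = completed the follow-up survey
    "mother_age": rng.normal(24, 5, n),          # hypothetical baseline covariate
    "first_time_mother": rng.integers(0, 2, n),  # hypothetical baseline covariate
})
covariates = ["mother_age", "first_time_mother"]

# Test 1: compare response rates between the program and control groups.
by_group = df.groupby("program_group")["responded"].agg(["sum", "count"])
z_stat, p_value = proportions_ztest(by_group["sum"].to_numpy(), by_group["count"].to_numpy())
print(f"Response rate difference: z = {z_stat:.2f}, p = {p_value:.3f}")

# Test 2: among respondents, regress group assignment on baseline characteristics.
respondents = df[df["responded"] == 1]
balance_fit = sm.Logit(respondents["program_group"],
                       sm.add_constant(respondents[covariates])).fit(disp=0)
# llr_pvalue is the omnibus likelihood ratio test that all covariate coefficients
# are jointly zero, i.e., that respondent groups are balanced at baseline.
print(f"Omnibus balance test: p = {balance_fit.llr_pvalue:.3f}")

# Test 3: regress response status on baseline characteristics for the full sample.
response_fit = sm.Logit(df["responded"],
                        sm.add_constant(df[covariates])).fit(disp=0)
print(f"Omnibus respondent/non-respondent test: p = {response_fit.llr_pvalue:.3f}")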


If any of these tests indicates that non-response is producing biased impact estimates, a standard technique such as multiple imputation or weighting by the inverse probability of response will be used to determine the sensitivity of impact estimates to non-response.
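
As a minimal sketch of the inverse-probability-of-response weighting, continuing the hypothetical data frame and covariate names from the sketch above, respondents can be up-weighted by the inverse of their predicted probability of response:

# Continues the hypothetical `df`, `covariates`, and `sm` from the previous sketch.
# Fit a response-propensity model on the full randomized sample.
X = sm.add_constant(df[covariates])
propensity_fit = sm.Logit(df["responded"], X).fit(disp=0)
p_respond = propensity_fit.predict(X)

# Each respondent is weighted by 1 / P(respond | baseline characteristics), so the
# weighted respondent sample resembles the full randomized sample at baseline.
is_respondent = df["responded"] == 1
df.loc[is_respondent, "nr_weight"] = 1.0 / p_respond[is_respondent]

# These weights would then be applied when re-estimating impacts among respondents,
# to gauge the sensitivity of the impact estimates to non-response.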


B4. Pre-testing


As part of MIHOPE Check-in, the study team used pretesting to identify revisions to be made to materials, procedures, and instruments for follow-up data collection. We reached out to Early Head Start centers in both California and New Jersey to identify nine families with a child of the relevant age and recruited them to pretest the follow-up survey by phone (six pretests) or by web (three pretests).


The pretests included debriefings to investigate parents’ understanding of the questions, the ease or difficulty of responding, and any confusion they may have had. In addition, we collected information from phone pretesters about their ability and willingness to complete the survey on the web (all six indicated that they would be at least somewhat interested in completing the survey on the web).


No issues requiring modifications were raised during debriefings with pretest participants, and the average time to complete the survey was 21 minutes, which is lower than the estimated burden of 30 minutes. We did not recommend any changes to the survey items based on pretesting.


B5. Consultants on statistical aspects of the design and individuals collecting and/or analyzing data

  • MDRC

  • Mathematica

  • Nancy Geyelin Margie (ACF/OPRE)

  • Laura Nerenberg (ACF/OPRE)



REFERENCES



Cantor, D., B. O’Hare, and K. O’Connor. “The Use of Monetary Incentives to Reduce Non-Response in Random Digit Dial Telephone Surveys.” In Advances in Telephone Survey Methodology, edited by J.M. Lepkowski, C. Tucker, J.M. Brick, E. De Leeuw, L. Japec, P.J. Lavrakas, M.W. Link, and R.L. Sangster, pp. 471–498. New York: John Wiley and Sons, Inc., 2007.

James, T. “Results of the Wave 1 Incentive Experiment in the 1996 Survey of Income and Program Participation.” Proceedings of the Section on Survey Research Methods, American Statistical Association, Alexandria, VA, 2001.

LeClere, F., S. Plumme, J. Vanicek, A. Amaya, and K. Carris. “Household Early Bird Incentives: Leveraging Family Influence to Improve Household Response Rates.” American Statistical Association Joint Statistical Meetings, Section on Survey Research Methods, 2012.

Mack, S., V. Huggins, D. Keathley, and M. Sundukchi. “Do Monetary Incentives Improve Response Rates in the Survey of Income and Program Participation?” Proceedings of the Section on Survey Research Methods, American Statistical Association, Alexandria, VA, 1998.

Martin, E., D. Abreu, and F. Winters. “Money and Motive: Effects of Incentives on Panel Attrition in the Survey of Income and Program Participation.” Journal of Official Statistics, vol. 17, 2001, pp. 267–284.

Singer, Eleanor, John Van Hoewyk, Nancy Gebler, Trivellore Raghunathan, and Katherine McGonagle. “The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys.” Journal of Official Statistics, vol. 15, 1999, pp. 217–230.




