IL change memo


Evaluation of Child Care Subsidy Strategies: Massachusetts, Illinois, and Washington


OMB: 0970-0306


Date

March 11, 2008



To

OMB





From

ACF



Subject

Request for Change for the Evaluation of Child Care Subsidy Strategies: Illinois site (OMB No.: 0970-0306)


The purpose of this memo is to outline our request for a non-substantive change to the currently approved collection for the Child Care Subsidy Strategies Evaluation in order to increase the participant incentive payment in our Illinois site from $20 to $50. ACF is requesting expedited approval of the increase because surveys with study participants must be completed by the end of May 2008. A previous memo sent to OMB as part of our request for a change to the Massachusetts site laid out the initial reasoning for this change. Below we expand on the original memo (see file) by providing a justification for the expedited review and addressing the questions posed by OMB in response to the original memo.

Overview:

As described to OMB in 2005, approximately 1,900 parents in Cook County, Illinois, with income between 50 and 65 percent of state median income (SMI) who applied for child care subsidies were asked to participate in a random assignment study. Without the study, these families would not have been eligible for child care subsidies because their income placed them above state income guidelines. Half of the families who agreed to be in the study were approved to receive subsidies for two years.

In Illinois, we request an increase in the incentive payment in order to reduce non-response bias. Experience and previous respondent feedback indicate that a higher incentive amount will increase the number of respondents. The original data collection plan included a $20 incentive for study participants who completed a 35-minute telephone interview. We request approval to increase the incentive to $50 for the remaining participants who have not yet completed the interview. As described in more detail below, the survey is the only source of information about how the use of child care subsidies affects parents' selection of, satisfaction with, and stability of child care, because such information is not available through administrative records. Thus, reducing any source of potential bias will greatly enhance the quality of the information, the precision of impact estimates, and the ability to answer the most policy-relevant questions.


Due to the nature of the approved study design, it is essential to complete the survey by May 2008. The key features of the study design that necessitate this are:

  • the final group of experimental-group participants will receive child care subsidies under the eligibility cap only through April 2008, the end date of the intervention;

  • the reapplication period for participants in the experimental group begins after April 2008, at which time they will have to apply under the current eligibility criteria in Illinois; and,

  • the study questions, which test the impact of subsidy receipt on the outcomes of interest to ACF, require that both experimental- and control-group participants report on their employment and child care experiences during the two years following random assignment.


Because of the nature of the experimental design, it is essential to complete data collection in a timely manner. Extending the survey past the target deadline would compromise the reliability of the information collected, both because of decreased recall of employment and child care experiences in both study groups and because of actual changes in the experimental group's experiences with the subsidy system once the intervention ends. In order to achieve an appropriate response rate within the current timeframe, we are requesting approval to increase the incentive payment to $50. Below we address the specific questions posed by OMB:


What non-monetary recruitment/refusal conversion approaches can ACF try to increase the RR? What has been tried so far, and what have been the results?


The project has tried several strategies for increasing response rates in Illinois. These have included resending the study information letters, which remind participants about the study and the $20 incentive payment for completing the interview. The letters have been sent by both the survey firm and the agency that administers the local subsidy program, Action for Children. In addition, we have varied the voice mail messages left with participants, again noting that the call is a follow-up related to the study in which they agreed to participate.

In addition, the survey firm, SRBI, has tried calling individuals at their place of employment to set up an appointment to complete the interview at a time when they would be at home. (Sample members provided work phone numbers when they completed their child care subsidy applications.)

The survey was originally planned as a telephone-only effort. At the beginning of June 2007, we began in-person tracking, and response rates have risen more rapidly since then. In May 2007, for example, 35 individuals completed the survey in Illinois; in the eight months since, another 602 surveys have been completed, an average of about 75 per month.


In-person tracking is mentioned as a specific approach ACF has tried with success. The justification says that 80% RR could be reached but it would take 8 months, if not longer. What is the downside of taking 8 months longer? If that would increase contractor costs, by how much would it increase the costs? What non-monetary approaches could ACF undertake, in conjunction with in-person tracking, that might speed up the refusal conversion process?

As noted above, the research design used in Illinois prohibits extending the survey period beyond two years after random assignment to either the experimental or control group. For example, the survey asks about income in the month prior to the interview as well as recent problems with child care. To maintain the integrity of the experimental design, these questions must be answered while the intervention is in place.

In addition, although questions about individuals' current situations are important, the primary purpose of the survey is to obtain a history of employment and child care from the point of random assignment to the point of the survey. We believe that extending the survey period a short time past the end of the program will still yield reliable information on employment and child care for at least the second year of the program period. Thus, we anticipate extending data collection through May 2008. However, data collected after that date have the potential to be significantly affected by recall bias and by the end of the intervention. Therefore, it is necessary to close the survey by May 2008, even if an extension beyond that time could increase the response rate.



What is the RR to date? (the justification states that you haven’t reached 80% but doesn’t state what the RR has been) What is the RR differential between the two groups right now (treatment vs control) and what was it on the last round of interviews?


The most recent estimate of the overall response rate is 58%. If the ratio of response rates between the two groups holds and the $20 incentive payment under the currently approved protocol remains in place, we anticipate final response rates of 65% in the treatment group and 51% in the control group.


What will be the additional cost to the federal government?


The additional direct cost to the federal government would be $12,240 in increased incentive payments.


Given that ACF will have access to rich administrative data, would it be possible to do a more robust non-response bias analysis with the RR you have been able to get?

Our plan is to conduct robust non-response bias analyses regardless of the response rate. However, given the nature of the experimental design and the fact that the control group does not participate in the child care subsidy program, administrative data cannot provide information that can only be gleaned through a survey. For example, critical questions about child care arrangements, access and choices, the proportion of family income paid for care, and other characteristics of care used by low-income families can only be answered through the survey. Thus, administrative data, even combined with non-response bias analyses, would be insufficient to answer the critical policy-relevant questions of interest in this study.


Below we describe our plans for the non-response bias analyses:


Our standard non-response bias analysis consists of several comparisons. First, we would compare baseline characteristics for respondents and non-respondents to see whether there is evidence of a systematic difference between the two groups. This comparison provides the first indication of whether results for respondents can be generalized to the full sample. If we see significant differences between respondents and non-respondents, we would consider weighting survey responses so that they reflect the full sample.
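For illustration only, the following is a minimal sketch of this first comparison and a simple nonresponse-weighting step. The file name (baseline.csv), the response indicator (responded), and the covariate names are hypothetical placeholders, not the project's actual variables or analysis code.

    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical baseline file: one row per sample member, with an indicator
    # for whether the follow-up interview was completed (responded = 1/0).
    df = pd.read_csv("baseline.csv")
    covariates = ["age", "earnings", "num_children"]  # illustrative baseline measures

    # 1. Compare mean baseline characteristics of respondents and non-respondents.
    print(df.groupby("responded")[covariates].mean())

    # 2. If systematic differences appear, model each sample member's probability
    #    of responding and weight respondents by the inverse of that probability
    #    so the respondent sample reflects the full sample.
    X = sm.add_constant(df[covariates])
    response_model = sm.Logit(df["responded"], X).fit()
    p_respond = response_model.predict(X)
    df.loc[df["responded"] == 1, "weight"] = 1.0 / p_respond[df["responded"] == 1]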


More important in a random assignment study is that treatment and control group respondents are similar. The second response bias analysis would therefore compare respondents in the two research groups. Our standard approach is to run a regression in which the treatment group indicator is the dependent variable and various baseline characteristics are the covariates. An F-test or similar test would indicate whether there are significant differences overall between the two groups. If we do see significant differences, we would again consider weighting survey responses to bring the treatment and control groups into balance.
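Again for illustration only, a minimal sketch of this balance check, using the same hypothetical covariate names and a treatment indicator (1 = experimental group, 0 = control); the regression's overall F-statistic serves as the joint test described above.

    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical respondent file: survey respondents only, with the treatment
    # indicator and the same illustrative baseline covariates as above.
    resp = pd.read_csv("respondents.csv")
    covariates = ["age", "earnings", "num_children"]

    # Regress the treatment indicator on baseline characteristics; the model's
    # overall F-test indicates whether respondents in the two research groups
    # differ systematically on these characteristics taken together.
    X = sm.add_constant(resp[covariates])
    balance_model = sm.OLS(resp["treatment"], X).fit()
    print(f"Joint F = {balance_model.fvalue:.2f}, p = {balance_model.f_pvalue:.3g}")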


Why $50? Why not $30, $40, or some other amount?


There is little research evidence available about the specific incentive level needed to maximize response rates. We have chosen a $50 incentive because we believe it would adequately increase the response rate in this study while remaining consistent with incentive payments previously approved by OMB. For example, the Moving to Opportunity (MTO) study conducted by Abt Associates, completed in 2004, paid survey respondents $55 per completion. The population interviewed for MTO is very similar to the sample for this study: both consist of low-income respondents with children living in highly urban areas. Given our success on the MTO project with a $55 incentive, we recommend a similar incentive for this study, in the hope of achieving the same type of gains.


Additionally, our recommendation is informed by conversations between our interviewers in the field and respondents. Respondents have indicated that the current incentive payment is too low, complaining that answering the survey "is not worth my time." This is especially true for respondents in the control group, who have not received a child care subsidy. We believe that this increase appropriately balances the concern about providing an excessive incentive against the need to encourage enough responses to meet our desired response rate.
