Evaluation of Child Care Subsidy Strategies: Massachusetts, Illinois, and Washington

OMB: 0970-0306


Evaluation of Child Care Subsidy Strategies



Request for OMB Clearance: Massachusetts, Illinois, and Washington





Revised: February 2008

(Note: changes to the original are highlighted in yellow.)









Prepared for

Ivelisse Martinez-Beck

Child Care Bureau

Administration for Children and Families, HHS

1250 Maryland Avenue, SW

Portals Building, Suite 800

Washington, DC 20024




Prepared by

Ann Collins

Charles Michalopoulos

Jean I. Layzer

Part B
Collection of Information Using Statistical Methods

B1 Sample Universe, Sampling Method, and Expected Response Rates

a. Massachusetts


Sample Universe

The target population includes family child care providers who are licensed, part of a network, and stable (i.e., in business for at least two years). The study is being conducted with a sample of family child care providers from family child care networks in the state who have indicated interest in and the capacity to implement Learningames. A statewide sample of such family child care homes is desired in order to obtain results that are applicable to the state as a whole.1 Choosing a sample from only part of the state would yield results that are representative of those parts of the state, but substantial differences in the economic and personal circumstances of family child care providers and families in different parts of the state would make the results of less use to the State. However, in order to increase efficiency and reduce costs, we will try to cluster the sample of providers within a relatively small number of regions of the state. The study will include children enrolled in these homes who, at the start of the study, are 36 months of age or younger.


Sampling Method

Within each region included in the sample, we will recruit family child care networks that can contribute at least 10 homes to the study (i.e., at least 10 homes that will volunteer to participate, each of which has been in business for at least two years and cares for at least two children under 36 months of age). The number of networks participating in the study will vary by region. Randomization will occur within each family child care network, so that each participating agency is guaranteed to have half of its homes assigned to the Learningames group.
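
To make the within-network assignment procedure concrete, the short Python sketch below splits each network's volunteering homes evenly between the Learningames and control groups. It is illustrative only; the network and home identifiers are hypothetical, and this is not the study's actual assignment program.

    import random

    def assign_within_networks(homes_by_network, seed=None):
        """Randomly assign half of each network's homes to Learningames, half to control."""
        rng = random.Random(seed)
        assignments = {}
        for network, homes in homes_by_network.items():
            shuffled = list(homes)
            rng.shuffle(shuffled)
            midpoint = len(shuffled) // 2
            for home in shuffled[:midpoint]:
                assignments[home] = "Learningames"
            for home in shuffled[midpoint:]:
                assignments[home] = "control"
        return assignments

    # Hypothetical example: one network contributing ten homes
    example = assign_within_networks({"Network A": [f"home_{i}" for i in range(1, 11)]}, seed=1)

In this sketch, each network serves as a separate randomization stratum, which is what guarantees the even split within every participating network.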

Within these providers’ homes, we will study the development of preschool children. Since the study is longitudinal, following the same providers over two years, we will include children in the homes who are 36 months old or younger and who either (a) are in the homes at the outset of the evaluation or (b) enroll in the homes during the first 18 months of the study. This “rolling” sampling strategy will help increase our chances of having an adequate sample of children for the impact analyses of child outcomes. We will close study enrollment to new children six months before the end of the evaluation period, so that all children evaluated at the final assessment point will have been in the home for at least six months.


The children will be in the family child care homes for differing amounts of time. Some children will enter the home during the study period and others will leave. At the end of the two years of the evaluation, the analyses of child impacts will first examine the average age-standardized score on the measure of language development across all children clustered within the home (an illustrative sketch of this home-level computation follows the list below). Second, we will examine impacts for different age groups of children, assuming the final sample includes sufficient numbers of children in the relevant age categories. We propose to divide the sample into four age groups, based on the age of the child at the completion of the study or at the last testing point before the child leaves the home:


  • under 12 months,

  • 12-23 months,

  • 24-35 months, and

  • 36-60 months.
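
As a hedged illustration of the home-level computation described before the list, the Python sketch below averages children's age-standardized scores within each home and then compares the Learningames and control group means. The data and identifiers are hypothetical, and the actual impact analysis would also adjust for baseline measures and the clustered design.

    from statistics import mean

    def home_level_impact(child_scores, home_group):
        """Average age-standardized scores within each home, then compare group means."""
        home_means = {home: mean(scores) for home, scores in child_scores.items()}
        treatment = [m for home, m in home_means.items() if home_group[home] == "Learningames"]
        control = [m for home, m in home_means.items() if home_group[home] == "control"]
        return mean(treatment) - mean(control)

    # Hypothetical example with two homes per group
    scores = {"h1": [101, 97], "h2": [95, 99], "h3": [104, 100], "h4": [98, 102]}
    groups = {"h1": "Learningames", "h2": "Learningames", "h3": "control", "h4": "control"}
    impact = home_level_impact(scores, groups)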

Sample size is determined by our desire to measure child outcomes as well as provider behavior. We will net approximately 350 providers: 175 treatment and 175 control. We assume that each provider will have at least two children in the sample. This sample size allows us to detect effects on children and on providers of 0.23 standard deviations.2
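
The 0.23 standard deviation figure is consistent with a standard minimum detectable effect calculation under the assumptions stated in footnote 2. The Python sketch below is illustrative only: it assumes 80 percent power, a one-tailed test at α = 0.05, an even treatment-control split of the 350 providers, and baseline covariates explaining 25 percent of outcome variation, and it treats providers as independent observations (a child-level calculation would require an additional intraclass correlation assumption not stated here).

    from scipy.stats import norm

    def minimum_detectable_effect(n, r_squared=0.25, alpha=0.05, power=0.80, p_treatment=0.5):
        """Approximate minimum detectable effect, in standard deviation units."""
        multiplier = norm.ppf(1 - alpha) + norm.ppf(power)  # one-tailed test
        std_error = ((1 - r_squared) / (p_treatment * (1 - p_treatment) * n)) ** 0.5
        return multiplier * std_error

    print(round(minimum_detectable_effect(350), 2))  # prints 0.23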


B2 Data Collection Strategy

a. Massachusetts

Four kinds of measures will be collected for the evaluation: systematic observations of provider behavior; standardized assessments of children’s development; a provider questionnaire; and a home visitor questionnaire.


Exhibit B2.1 shows the categories of data to be collected, the data sources, the time period for collection, and the analyses in which the data will be used.


Exhibit B2.1

Overview of Data Needs and Data Sources

Child characteristics (age, gender, home language, length of time in care setting)
  • Sources of data: Provider records
  • Time collected: June 2006 and as children enter the home
  • Analyses: Impact analysis

Provider characteristics (age, ethnicity, education, training, experience, job motivation)
  • Sources of data: Provider questionnaire
  • Time collected: June 2006; January 2008
  • Analyses: Implementation study; impact analysis

Home environment (health and safety; support for cognitive, language, and social-emotional development; equipment and materials)
  • Sources of data: QUEST Environment Checklist; caregiver rating
  • Time collected: Baseline (July-September 2005)
  • Analyses: Implementation study; impact analysis

Provider behaviors and interactions: level of implementation of Learningames (treatment group only)
  • Sources of data: Fidelity observation; provider log
  • Time collected: June 2006; January 2007; January 2008
  • Analyses: Implementation study; impact analysis

Provider behaviors and interactions
  • Sources of data: QUEST caregiver rating
  • Time collected: June 2006; January 2007; January 2008
  • Analyses: Impact analysis

Child outcomes: child development outcomes
  • Sources of data: Ages and Stages (extant data)
  • Time collected: Baseline (June-September 2005)
  • Analyses: Impact analysis

Child outcomes: child language and pre-literacy skills
  • Sources of data: PLS-4 Auditory Subscale; Bracken School-Readiness Subscale
  • Time collected: June 2006; January 2007; January 2008
  • Analyses: Impact analysis

Home visitor characteristics (education, training, experience, caseload size, frequency and duration of home visits, job responsibilities)
  • Sources of data: Home Visitor Questionnaire
  • Time collected: June 2006
  • Analyses: Implementation study; impact analysis

Observations of Providers

Baseline data will be collected by staff of the family child care networks, trained by Abt staff, using the QUEST form. Study staff hired by Abt Associates will collect similar observation data using the QUEST and FDCRS instruments six months after the intervention begins, and again at 12 and 24 months. At each observation point, providers will be observed for approximately 2.5-3.0 hours. The observations will use a standardized rating system, and all observers will be trained to reliability by Abt staff.


Child Assessments

Baseline information on the developmental status of children in the study will be drawn from extant data collected by the participating family child care systems, both for children who are in the homes at the outset of the study and for children under 36 months who enroll in the homes later, up to six months before the study ends. The evaluation team will collect assessment data at three points over the two years, on the same schedule as the observation data. These assessments will use subscales from two standardized measures, the PLS-4 Auditory Subscale and the Bracken School-Readiness Subscale, described in an earlier section. For children 3 years and older, the assessments will be administered individually by study staff at the family child care homes.


Provider Questionnaire

A provider questionnaire will be administered by Abt study staff in June 2006 and January 2008. The initial questionnaire will obtain information on providers’ backgrounds, educational and training experience, and motivation. The second questionnaire will focus specifically on additional education and training obtained over the two years, beyond that offered by Learningames.


Home Visitor Questionnaire

A questionnaire for home visitors will be distributed by Abt study staff in June 2006. The questionnaire will collect data on education and training, caseload size, job responsibilities, and the frequency and duration of home visits.


b. Illinois

Exhibit B2.2 presents a summary of our data collection strategy. Our main sources of data are extant administrative data and documents, the parent interview as described in the sections above, and unstructured interviews with state officials and child care experts. Because we are using extant data or will be speaking with fewer than nine people in any category for the Implementation Study, and because we are not using a structured format for these interviews, we are not requesting review of the Implementation Study component of the data collection.


Exhibit B2.2

Overview of Data Needs and Data Sources

Family and household characteristics (e.g., family size, number of parents, number and ages of children)
  • Sources of data: Standard application for child care subsidies; parent survey
  • Time collected: Baseline; months 8, 16, and 24
  • Analyses: Impact analysis; cost-benefit analysis; implementation study (baseline only)

Employment and educational characteristics (e.g., number of employers, employment hours and schedules, earnings, school attendance)
  • Sources of data: Standard application for child care subsidies; parent survey; Unemployment Insurance records
  • Time collected: Baseline; months 8, 16, and 24; quarterly, months 0-24
  • Analyses: Impact analysis; cost-benefit analysis; implementation study (baseline only)

Family income (e.g., total household income, child support received, household income from employment)
  • Sources of data: Standard application for child care subsidies; parent survey
  • Time collected: Baseline; months 8, 16, and 24
  • Analyses: Impact analysis; cost-benefit analysis; implementation study (baseline only)

Public assistance use and costs (e.g., use of TANF cash assistance, use of food stamps, administrative costs of subsidy receipt)
  • Sources of data: Administrative records for TANF and food stamps; state and agency budget documents; interviews with IDHS and DCACI staff
  • Time collected: Ongoing; 4 months after random assignment
  • Analyses: Impact analysis; cost-benefit analysis; implementation study

Child care characteristics (e.g., number of children receiving child care, type of subsidized arrangements, schedule of arrangement, child care subsidy costs, administrative costs, family costs)
  • Sources of data: Standard application for child care subsidies; parent survey; administrative records from the child care subsidy system
  • Time collected: Baseline; months 8, 16, and 24; ongoing
  • Analyses: Impact analysis; cost-benefit analysis; implementation study (baseline only)

Planning and start-up (e.g., demonstration design, rationale, target groups, intended impacts, planning and implementation, start-up experiences)
  • Sources of data: Unstructured interviews with informants from IDHS and DCACI; demonstration design plans; Memorandum of Understanding; meeting minutes
  • Time collected: 3 months prior to random assignment through 1 month into random assignment
  • Analyses: Implementation study

Demonstration operations (e.g., client flow through random assignment, levels and patterns of participation)
  • Sources of data: Unstructured interviews with informants from IDHS and DCACI; administrative records from the subsidy intake unit
  • Time collected: Throughout the period of random assignment
  • Analyses: Implementation study

Site-related contextual factors (e.g., local child care market conditions, local economic conditions, expectations about subsidy use among low-income families)
  • Sources of data: Unstructured interviews with informants from IDHS, DCACI, local child care and public interest groups, and families using the subsidy system; local research reports and public interest documents; Bureau of Labor Statistics area employment and earnings data
  • Time collected: Throughout the period of random assignment
  • Analyses: Implementation study

c. Washington


Exhibit B2.3 aligns the categories of data with our data sources and provides the time period during which they will be collected.


Exhibit B2.3

Overview of Data Needs and Data Sources

Family and household characteristics (e.g., family size, number of parents, number and ages of children)
  • Sources of data: Standard application for child care subsidies; parent survey
  • Time collected: Baseline; months 8, 16, and 24
  • Analyses: Impact analysis; cost-benefit analysis; implementation study (baseline only)

Employment and educational characteristics (e.g., number of employers, employment hours and schedules, earnings, school attendance)
  • Sources of data: Standard application for child care subsidies; parent survey; Unemployment Insurance records
  • Time collected: Baseline; months 8, 16, and 24; quarterly, months 0-24
  • Analyses: Impact analysis; cost-benefit analysis; implementation study (baseline only)

Family income (e.g., total household income, child support received, household income from employment)
  • Sources of data: Standard application for child care subsidies; parent survey
  • Time collected: Baseline; months 8, 16, and 24
  • Analyses: Impact analysis; cost-benefit analysis; implementation study (baseline only)

Public assistance use and costs (e.g., use of TANF cash assistance, use of food stamps, administrative costs of subsidy receipt)
  • Sources of data: Administrative records for TANF and food stamps; state and agency budget documents; interviews with state staff
  • Time collected: Ongoing
  • Analyses: Impact analysis; cost-benefit analysis; implementation study (baseline only)

Child care characteristics (e.g., number of children receiving child care, type of subsidized arrangements, schedule of arrangement, child care subsidy costs, administrative costs, family costs)
  • Sources of data: Standard application for child care subsidies; parent survey; administrative records from the child care subsidy system
  • Time collected: Baseline; months 8, 16, and 24; ongoing
  • Analyses: Impact analysis; cost-benefit analysis; implementation study (baseline only)

Data Collection Strategies

The major sources of data include the parent interview, administrative data and other extant information, and interviews with staff at the Washington Department of Social and Health Services (DSHS) and the State’s regional offices. Each of these is discussed briefly below.


Parent Interviews

As stated earlier, parent interviews will be conducted by telephone at 8, 16, and 24 months after random assignment. We do not plan to interview people in their homes. We will attempt to interview about 2,500 families and expect to interview 2,000 families (for an 80 percent response rate). Interviews will be divided about equally between the treatment and control groups. The interview will be a vital source of information for the impact and benefit-cost analyses and parts of it may also be used for the Implementation Study. The interview will provide us with more detailed information about family characteristics than is available in the baseline data, as well as changes that have occurred in some of these characteristics (e.g., the birth of a child, an additional adult moving into the household) since random assignment began. The survey will also be the study’s primary source of information about child care and employment characteristics over the course of the two-year period.


We believe that we can capture changes in employment and child care with sufficient accuracy through a telephone interview. If, at the completion of the first interview at 8 months, it becomes clear that a higher percentage of in-person interviews will be necessary, we would be able to adjust our data collection plan.


Administrative Data

Records from various public assistance programs will be used for the impact and benefit-cost analyses. In general, these records will be used to quantify participants’ use of various forms of public assistance. In addition, information on employment and earnings from Unemployment Insurance records will augment data from parent surveys.


For each automated system, data will be provided for one year prior to and two years following random assignment, subject to data system limitations. If additional funding is obtained, we may seek additional follow-up data. In that case, we would ask for consent for the release of identifying information when families are surveyed at the 24-month point. The automated systems include the following:


  • Child care subsidy amounts and provider information;

  • TANF authorized grant amounts and dates;

  • Food Stamps authorized amounts and dates; and

  • Unemployment Insurance (UI), quarterly wages (earnings), and employer ID numbers.

Data Collection for the Implementation Study

The Implementation Study will rely on information from the baseline, administrative, and survey data (Exhibit B2.4). In addition, this part of the subsidy evaluation will rely on a range of open-ended interviews and document reviews. These are described briefly below.


On-Site Data Collection

Open-ended interviews, as well as the collection of various documents, will take place on-site through two field visits over the course of the demonstration. During the visits, researchers will conduct individual and small-group interviews with State DCCEL and DSHS management and staff, and local DSHS management and staff. Researchers will also use both visits to observe demonstration operations.



Exhibit B2.4

Data Collection Strategies for the Implementation Study

Data source: collection strategy
  • Demonstration providers: small-group open-ended interviews
  • Parents: follow-up surveys (as part of the impact analysis)
  • State DSHS staff: individual and small-group open-ended interviews
  • Local DSHS staff: individual and small-group open-ended interviews
  • DSHS administrative data: periodic files provided by DCCEL (as part of the impact analysis)
  • DSHS statistical reports: periodic requests to DCCEL
  • Demonstration plans and design: requests for DCCEL planning documents; MOAs between Abt Associates and DCCEL
  • Subsidy system policy manuals and eligibility forms: requests to DCCEL
  • Census information: U.S. Census
  • BLS labor market data: BLS publications (hard copy and online)

Individual and Small Group Open-Ended Interviews

Much of the descriptive information about intervention design, planning, and implementation, as well as about the context in which the demonstration will operate, will come from individual and small-group interviews with key informants during the first site visit to Washington. The Implementation Study will include open-ended interviews with informants from the state and local subsidy and public assistance agencies.


Researchers will use interview guides developed for each type of informant. The open-ended interviews will be conducted individually or in small groups of up to three informants. An advantage of small-group interviews is that although one respondent may forget details or answer incorrectly, informants in small groups usually correct one another and can fill in details others leave out. Because we are primarily interested in "getting the story right," we will organize small-group interviews where possible.


The interview guides will be organized by topic area for each type of informant. Within each topic area, the guides will include basic questions and probes designed to stimulate discussion and elicit more complete information. The use of detailed interview guides ensures some level of uniformity across researchers and informants. In addition, the guides, annotated with interview notes, provide a structure for data collection that readily organizes field notes for analysis and reporting.


Another useful practice in conducting open-ended interviews is to ask respondents for the reasons and evidence behind their judgments. First, this encourages informants to think more carefully about their responses and to qualify them in light of the grounds for their opinions. Second, it allows the researcher to weigh an informant's opinion against the strength of the evidence used to support it.


Subsidy Agency Statistical Reports

Extant subsidy agency statistical reports will be used to help characterize the child care subsidy market in the demonstration sites. We expect such reports to provide basic information about subsidy use, including the numbers of families, children, and providers; mean subsidy amounts; and types of care used.


Subsidy System Policy Manuals and Eligibility Forms

We will collect demonstration site subsidy system manuals and eligibility forms as our primary source of information about subsidy eligibility criteria, subsidy levels, and co-payment amounts and collection processes. The manuals and eligibility forms will also allow some insight into the initial eligibility and recertification processes, although information about those operations will also be collected in the open-ended interviews at demonstration sites.


Census Information

Census Bureau information will be used as a primary data source for information about site demographic and socio-economic characteristics. Using data from census tracts that most closely overlap with the demonstration sites, the Implementation Study will summarize information about demonstration site ethnicity, household number and composition, number of families with children, distribution of children by age, and other relevant contextual factors.


Bureau of Labor Statistics Labor Market Data

The BLS is an important source of data about local labor markets, wage rates, industrial mix, employment-to-population ratios, unemployment rates, and other labor market factors. The BLS data are organized by major metropolitan areas and the larger standard metropolitan statistical areas (SMSAs). The BLS data will be important in characterizing the low-income labor market facing many subsidy families.






B3 Methods to Maximize Response Rates


a. Massachusetts

The data collection strategies planned for the study involve observations in the family child care home and direct assessment of children. Early in the study, providers will be asked to complete a brief questionnaire about their educational background, experience, and motivation. Since the response burden for providers is very low (7-10 minutes), and since home visitors will assist Abt staff in collecting any missing questionnaires, we expect a response rate for the questionnaire of better than 90%. There is, however, a burden imposed by the presence of observers and assessors; if not addressed with sensitivity, this could, over time, affect providers’ willingness to allow data collection in their homes.


Using past experience as a guide, we propose several strategies to address this issue. First, in scheduling visits to the home, we will emphasize that the visit will occur on a morning that is convenient for the provider, and that the provider’s schedule and preferences will determine when the visit takes place. The date and length of the visit will be confirmed in a letter, which will also set out expectations for what will happen during the visit. Data collection staff will telephone providers the day before the visit to confirm the schedule since, in any child care setting, unscheduled events can throw off the provider’s schedule. If this occurs, we will reschedule the visit at a time that is convenient for the provider.


Second, at the end of each visit, we will give each provider a $20 gift certificate to compensate her for the disruption in her schedule occasioned by the data collection.


Finally, as part of our validation efforts, we will telephone a sample of providers visited by each data collector to ensure that the visit went as planned, that the data collector explained what she was doing, answered questions, and was respectful and unobtrusive. For all other providers, we will send a thank you card with a toll-free number they can call if they have any concerns about the data collection.


In addition to these strategies, early in the study, each provider will receive a library of 12 children’s books. We will maintain contact with providers through holiday cards and newsletters.


We expect that these efforts will be successful in maintaining providers’ cooperation. However, there are many reasons why we might experience attrition from the study that have to do with providers’ own lives. Providers may leave the study because they have decided not to continue providing care, because of a family or personal emergency, or for other reasons beyond our control. If their reasons for leaving the study have to do with the demands of the study, we will work with home visitors and system staff to negotiate a solution. We have planned for approximately 15% attrition. If attrition increases beyond this rate, we plan to refresh the sample by adding new providers. We would randomly assign these providers within systems to either Learningames or the control group, following the same procedures as those initially used.


We expect children to leave the child care home in the course of the study and will replace these children with new entrants under three years of age. While we hope to obtain two assessments on each child, the design does not call for a longitudinal study of specific children. We will continue to recruit age-eligible children into the study until six months before the study ends. Our plan is to have essentially continuous data collection and to have providers notify us if a child is leaving the home. This will allow us maximum flexibility in assessing children and reduce non-response because of brief absence or permanent attrition. At the same time, we expect to have no more than three measurements of each child, for the purposes of calculating burden.


b. Illinois

Survey data will be collected at three points in time. All families in the treatment and control groups (a total of 2,000 families) will be contacted to be interviewed. Our goal is to achieve an 80% response rate at the first survey wave, conducted approximately 8 months after random assignment (1,600 respondents); 75% of the sample at Wave 2, at 16 months (1,500 respondents); and 70% at Wave 3, at 24 months (1,400 respondents). For each wave, we will attempt to reach the entire study sample, excluding those who ask not to be contacted further. For example, for Wave 2, we will not limit our attempts to the 1,600 respondents who participated in the Wave 1 interview but will attempt to contact the total study group of 2,000, with the exception of those who have refused to be contacted further. While we are estimating a response rate for each wave of the study, we estimate that the overall response rate will be close to 80%; that is, 80% of the sample will respond to at least one of the three survey waves.


In order to increase the likelihood of obtaining this rate, the evaluation team will ensure that the contact information from study participants is accurate and of high quality. The contact information provided by the study participants will include their own address and telephone numbers as well as similar contact information of relatives and friends who are likely to know the participant's whereabouts and do not cohabitate with the respondent. In addition, where it is pertinent, the team will use contact information that it can obtain from public assistance records for those who use TANF, food stamps, or Medicaid over the course of the study period. Contact information will be entered into a centralized sample database that will be used for data tracking and management purposes.


In addition to ensuring that we have high-quality contact information, the evaluation team will use a number of interim tracking methods to ensure that we continue to have up-to-date information. The evaluation team will provide the study participants with a toll-free number to call should they move or get a new phone number. To ensure that the number is on hand, we will print it on both a refrigerator magnet and a coffee mug. We will also give sample members a pre-addressed, postage-paid postcard that they may send with any updated address or telephone information. Finally, sample members will be mailed "tracking" letters at points prior to their scheduled interviews. If these letters are undeliverable, the team will engage in a number of efforts to locate the proper address and telephone number. All respondents who complete an interview will receive a $20 voucher or gift certificate. If, after five attempts, the team is unsuccessful in obtaining a study participant’s agreement to be interviewed, the incentive will be increased to $50.


Using data from UI wage records and other public records, we will be able to gather basic information about the non-respondents. If necessary, we will be able to construct weights to address non-response. We do not expect that there will be differential response rates between the treatment and control groups.
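
As a hedged illustration of one way such non-response weights might be constructed, the Python sketch below assigns each respondent the inverse of the response rate within a cell defined by observed baseline characteristics. The variable names are hypothetical, and the actual weighting approach would be specified during the analysis.

    from collections import defaultdict

    def nonresponse_weights(sample, cell_keys=("treatment", "employed_at_baseline")):
        """Weight each respondent by the inverse of the response rate in its cell."""
        totals, responded = defaultdict(int), defaultdict(int)
        for person in sample:
            cell = tuple(person[key] for key in cell_keys)
            totals[cell] += 1
            responded[cell] += 1 if person["responded"] else 0
        weights = {}
        for index, person in enumerate(sample):
            if person["responded"]:
                cell = tuple(person[key] for key in cell_keys)
                weights[index] = totals[cell] / responded[cell]
        return weights

    # Hypothetical example: the lone respondent in the first cell receives weight 2.0
    sample = [
        {"treatment": 1, "employed_at_baseline": 1, "responded": True},
        {"treatment": 1, "employed_at_baseline": 1, "responded": False},
        {"treatment": 0, "employed_at_baseline": 0, "responded": True},
    ]
    weights = nonresponse_weights(sample)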


c. Washington

Survey data will be collected at three points in time. All 2,500 families who are selected to be in the interview sample (drawn evenly from the treatment and control groups) will be contacted to be interviewed. Our goal is to achieve an 80% response rate at the first survey wave, conducted approximately 8 months after random assignment (2,000 respondents); 75% of the sample at Wave 2, at 16 months (1,875 respondents); and 70% at Wave 3, at 24 months (1,750 respondents). For each wave, we will attempt to reach the entire interview sample, excluding those who ask not to be contacted further. For example, for Wave 2, we will not limit our attempts to the 2,000 respondents who participated in the Wave 1 interview but will attempt to contact the total interview group of 2,500, with the exception of those who have refused to be contacted further. While we are estimating a response rate for each wave of the study, we estimate that the overall response rate will be close to 80%; that is, 80% of the sample will respond to at least one of the three survey waves.


In order to increase the likelihood of obtaining this rate, the evaluation team will ensure that the contact information from study participants is accurate and of high quality. The contact information provided by the study participants will include their own address and telephone numbers as well as similar contact information of relatives and friends who are likely to know the participant's whereabouts and do not cohabitate with the respondent. In addition, where it is pertinent, the team will use contact information that it can obtain from public assistance records for those who use TANF, food stamps, or Medicaid over the course of the study period. Contact information will be entered into a centralized sample database that will be used for data tracking and management purposes.


In addition to ensuring that we have high-quality contact information, the evaluation team will use a number of interim tracking methods to ensure that we continue to have up-to-date information. The evaluation team will provide the study participants with a toll-free number to call should they move or get a new phone number. To ensure that the number is on hand, we will print it on a refrigerator magnet and a coffee mug. We will also give sample members a pre-addressed, postage-paid postcard that they may send with any updated address or telephone information. Finally, sample members will be mailed "tracking" letters at points prior to their scheduled interviews. Families who return these postcards will receive a $5 voucher or gift certificate. If the tracking letters are undeliverable, the team will engage in a number of efforts to locate the proper address and telephone number. All respondents who complete the first interview will receive a $10 voucher or gift certificate; $15 for completion of the second interview; and $20 for completion of the third interview. The incentives will not affect participants’ eligibility for public benefits.


Using data from UI wage records and other public records, we will be able to gather basic information about the non-respondents. If necessary, we will be able to construct weights to address non-response. We do not expect that there will be differential response rates between the treatment and control groups.


B4 Tests of Procedures

a. Massachusetts

The observation measures and provider questionnaire have all been tested and used in other large-scale studies with similar populations and so do not require pretesting. The same is true for the standardized child assessments. However, to ensure that our plan for collecting the data is realistic and does not impose undue burden on the provider, we will pretest the data collection procedures in nine family child care homes early in 2006. The results of the pretest will be sent to OMB, with a description of any recommended changes in procedures.


For the children in the study, we want to obtain permission from the maximum number of parents to allow their child to participate in the standardized assessments. We will work closely with the providers to have them help us contact parents and convince them of the importance of the study and the low risk of negative consequences for their child. We will maintain a hotline that parents and providers can call with questions or concerns at any time during the study.




b. Illinois

We will pre-test the parent telephone interview survey with nine respondents. The results of the pretest will be sent to OMB, with a description of any recommended changes in wording or administration of the survey.


c. Washington

The parent telephone interview survey used in the Illinois study will also be used in the Washington study. The results of the pre-test conducted in the Illinois study apply to the Washington study as well.


B5 Individuals Consulted on the Statistical Aspects of the Design

The information for all three studies is being collected by Abt Associates Inc. and its subcontractor, Moore & Associates, on behalf of the Administration for Children and Families (ACF), U.S. Department of Health and Human Services. With ACF oversight, Abt Associates is responsible for study design, data collection, analysis, and report preparation.


a. Massachusetts

The project staff responsible for the design include the project director (Jean Layzer), the deputy project director (Ann Collins), and the director of analysis (Barbara Goodson).


b. Illinois

The project staff responsible for the design include the project director, Jean Layzer (Abt Associates); the deputy project director, Ann Collins (Abt Associates); and the co-leads for analysis, Nancy Burstein (Abt Associates) and Charles Michalopoulos (MDRC).

c. Washington

The project staff responsible for the design include the project director, Jean Layzer (Abt Associates); the deputy project director, Ann Collins (Abt Associates); and the director of analysis, Charles Michalopoulos (MDRC).


For all three studies, outside consultants reviewed the statistical aspects of the design. These include:


Robinson Hollister

Professor: Econometrics, Labor and Social Economics, Health Economics

Swarthmore College

500 College Avenue

Swarthmore, Pennsylvania 19081


Marcia Meyers

Associate Professor

University of Washington

School of Social Work and

Daniel J. Evans School of Public Affairs

4101 15th Ave NE

Seattle, WA 98105

206-616-4409


Ann D. Witte

Professor of Economics

Wellesley College

Wellesley, MA





1 Not all family child care providers in the state are associated with networks. This limits the generalizability of the findings to providers who are linked to networks and receive the support and monitoring provided by network staff. However, this subset of providers, who receive child care subsidies, is of particular policy interest to the state. We will use additional extant data from the networks, and from Abt’s Cost-Quality Study of Family Child Care in Massachusetts, to investigate the differences between the study sample of providers and the wider universe of providers in Massachusetts.

2 Because we are specifically interested in whether Learningames produces positive changes in provider and child outcomes, we will conduct a one-tailed test with α = 0.05. We have also assumed that the analyses will include baseline measures that explain 25% of the variation in study outcomes.
