
National Survey of Early Care and Education (NSECE)



Pre-testing of Evaluation Surveys

(OMB 0970‑0355)



Supporting Statement Part B







November 2010



Collections of Information Employing Statistical Methods

1. Respondent Universe and Respondent Selection Method

This section summarizes the primary features of the sampling and statistical methods used to collect data for and draw methodological conclusions from the NSECE field test.

To achieve the dual goals of this project, the sample includes both providers and families that use early care and education and (before- and) after-school programs serving children under the age of 13 and their parents. For the purposes of the field test, the target population for the demand side is civilian households with children under age 13 residing in 21 sampling clusters across three states. The field test target population for the provider surveys is all services, centers, programs, facilities, and individuals in the 21 selected test sites that offer early care and education and (before- and) after-school programs.

The primary goals of the field test are to test the performance of the questionnaires (Household Screener, Household, Home-Based Provider, Center-Based Provider, and Workforce Provider); to understand the eligibility rates of household survey respondents and home-based provider respondents from the Household Screener; to explore the yield of eligible center-based and home-based providers from administrative lists in sampled clusters; and to understand the issues surrounding implementation of the Workforce Provider Survey. In balancing the sample sizes across the different components of the field test, we consulted the project team, an expert panel, and other sources to identify the open issues that were most pressing for the design and implementation of the main study and most feasible to address within the resources available for the test. Based on the advice and guidance of these sources, and to maximize the usefulness of the field test within a fixed budget, NORC has constructed a set of samples that will best inform the individual field test objectives and the study as a whole.

NORC will use the field test as an opportunity to test the sampling plans for the five interrelated surveys, to test the effectiveness and efficiency of the hybrid sampling design, to validate and fine-tune various rates assumed in the sample size calculations, to determine the number of providers our current provider cluster sampling design will yield, to refine the sample plans based on the outcome of the field test, and to produce more accurate estimates of costs for the main study.

The NSECE field test will include 21 second-stage sampling units (provider clusters) from three states. Using a hybrid sampling design, provider clusters will be selected so that they vary in characteristics such as population density, area size, urbanicity, and socioeconomic levels of communities. Having diverse clusters will allow us to test the robustness of our sampling protocols (including methods for building provider clusters), data collection methods, and questionnaires in a wide range of situations. Sampled household addresses will be screened for eligibility for the Household and/or Home-Based Provider Questionnaire. Approximately 500 households with children under 13 will be eligible for the Household Interview. From 12 provider clusters, we expect to complete 450 household interviews, 300 center-based provider interviews, 250 home-based provider interviews (combining those identified on administrative lists with those identified through the Household Screener), and 200 workforce interviews.
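The rates implied by these targets can be checked with simple arithmetic. The sketch below is illustrative only: the rates are back-calculated from the counts quoted in this section (the 2,592 screener total appears in the mode table below), not projections stated elsewhere in this document.

```python
# Implied field-test rates from the counts quoted above (illustrative only).
screener_completes = 2_592      # completed Household Screeners, all modes
eligible_households = 500       # approx. households eligible for the Household Interview
household_completes = 450       # expected completed household interviews

eligibility_rate = eligible_households / screener_completes
completion_rate = household_completes / eligible_households

print(f"implied screener eligibility rate: {eligibility_rate:.1%}")  # 19.3%
print(f"implied household completion rate: {completion_rate:.1%}")   # 90.0%
```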

Administrative lists will be used to build a sample frame for center-based providers appearing on licensing and regulatory lists. We will sample providers from the frame in a specified radius around areas selected for the Household Screener.

Stratum                               Method of Frame Construction
Community-based Child Care Centers    Administrative lists
Head Start/Pre-K                      List from Head Start Bureau; CCD/PSS; QED; community-based organizations
School-Age Care                       Telephone contact with public schools, city parks, libraries, United Way, CDBGs, and Statewide Afterschool Networks


A similar frame will be built for home-based providers appearing on licensing and regulatory lists, and those providers too will be sampled from areas surrounding the locations of households sampled for the Household Screener.


The table below indicates expected completed interviews by questionnaire and sample type.


Questionnaire                                      Telephone Center    Mail    Field    Web    Total
Household Screener                                 1,062               487     1,043    n/a    2,592
Home-based Provider (from Household Screener)      52                  n/a     84       14     150
Home-based Provider (from Administrative Lists)    n/a                 n/a     85       15     100
Center-based Provider                              n/a                 n/a     255      45     300
Workforce Provider                                 n/a                 n/a     144      96     240



2. Design and Procedures for the Information Collection

Field test data collection will follow the main study design components. All five questionnaires will be tested: Household Screener, Household Questionnaire, Home-Based Provider Questionnaire, Center-Based Provider Questionnaire, and Workforce Provider Questionnaire.

Household Screener

As discussed above, 2,487 households will be sampled for the field test of the Household Screener. Selected addresses will first go through a telephone number matching process. Addresses for which a telephone number can be matched (an estimated 1,144 sample addresses) will be contacted first by telephone to complete the Household Screener. After six weeks of work in the telephone center, all remaining cases (those that have not yet been screened) will be transferred to the field. Addresses that are not matched with a telephone number, or that are matched with a non-working or inaccurate telephone number (an estimated 1,219 sample addresses), will be sent a brief mail screener two weeks before field data collection begins. This will be followed, one week later, by a postcard reminder and, another two weeks later, by a follow-up screener mailing. All nonrespondents to the mail screener will then be sent to field interviewers to be worked by telephone or in person for Household Screener completion.
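The mailing timeline for unmatched addresses can be sketched relative to a field start date. The function and its field names are illustrative assumptions, not part of the study protocol:

```python
from datetime import date, timedelta

def mail_screener_schedule(field_start):
    """Mailing dates for unmatched addresses, keyed to the field start date:
    initial screener two weeks before field work begins, postcard reminder one
    week after that, follow-up screener mailing two weeks after the postcard."""
    first = field_start - timedelta(weeks=2)
    return {
        "initial mail screener": first,
        "postcard reminder": first + timedelta(weeks=1),
        "follow-up screener mailing": first + timedelta(weeks=3),
        "field screening begins": field_start,
    }
```

For example, with an assumed field start of January 31, 2011, the initial mailing goes out January 17, the postcard January 24, and the follow-up screener February 7.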

The Household Screener will: 1) identify eligible respondents to complete the Household Questionnaire, and 2) identify eligible respondents for the Home-Based Provider Questionnaire. Eligibility criteria for these surveys are as follows:

  • Household Questionnaire criteria: households where a child under age 13 usually lives. The adult most knowledgeable about the youngest such child will be sought for the interview.

  • Home-Based Provider Questionnaire criteria: someone in the household age 18 or older regularly provides care in a home-based setting to a child age 13 or under who is not in the custody of the caregiver, and that person does not appear on administrative lists of regulated and registered home-based providers.
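The two eligibility checks above can be summarized in code. This is a sketch under assumed field names (children_under_13, members, and so on); the actual screener items differ.

```python
def screener_eligibility(household):
    """Return the questionnaires a screened household is eligible for,
    per the two criteria above. Field names are hypothetical."""
    eligible = []
    # Household Questionnaire: a child under age 13 usually lives here.
    if household.get("children_under_13", 0) > 0:
        eligible.append("Household Questionnaire")
    # Home-Based Provider Questionnaire: an adult (18+) regularly provides
    # home-based care to a non-custodial child and is not on admin lists.
    if any(
        m.get("age", 0) >= 18
        and m.get("regularly_provides_home_based_care", False)
        and not m.get("on_admin_provider_list", False)
        for m in household.get("members", [])
    ):
        eligible.append("Home-Based Provider Questionnaire")
    return eligible
```

A household may qualify for one questionnaire, both, or neither, which is why a single screening pass drives both surveys.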



Household Questionnaire and Home-Based Provider Questionnaire (from Household Screener)

Once a household is determined eligible, administration of the appropriate survey will begin. Eligible respondents identified through the mail Household Screener will be transferred to the field for in-person interviewing. Eligible respondents identified by telephone Household Screener will be worked by telephone, but transferred to the field after one refusal or hang-up during introduction, or after five non-contact attempts in the telephone center. Respondents identified through in-person administration of the Household Screener will remain in the field for the Household Questionnaire or Home-Based Provider (from Household Screener) data collection.

The Household Screener may indicate that a household is eligible for one or both of the surveys. Interviewers will attempt to conduct the interview when eligible respondents (for the Household Questionnaire and/or the Home-Based Provider Questionnaire) are identified. If the household is eligible for both surveys, the Household Questionnaire will be administered first. Once that is complete, the interviewer will request to speak to the household member who is eligible to complete the Home-Based Provider Questionnaire (if that is not the same person as the parent survey respondent). If respondents are unable to complete the interview at that time, they will be asked to complete the interview either by telephone or in-person at a later date.

Home-Based Provider Questionnaire (from Administrative Lists) and Center-Based Provider Questionnaire

After the 21 areas are sampled for the Household Screener, NORC will construct provider clusters centered on those areas. Given the Design Phase Feasibility Test results, we will use a two-mile radius for the construction of provider clusters. A sampling frame of center- and home-based providers will be constructed from state licensing and regulatory lists. We will draw samples from the frame and send three mailings to all sampled providers. The first mailing will be an advance letter that explains the purpose of the study and the reason for their selection, and provides the Web survey URL, a login ID, and a password. One week later a second mailing, in different packaging, will be sent, again requesting their participation. The third and final mailing will follow one week later, informing respondents that a field interviewer will contact them in the near future to complete the interview in a different mode. Two weeks later, nonrespondents to the Web survey will be contacted by field interviewers, who will prompt them to complete the Web survey and offer alternative ways to complete it, through face-to-face interviewing or over the telephone.
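The radius test at the heart of cluster construction can be sketched with a great-circle distance calculation. The haversine formula and data layout below are illustrative; NORC's actual GIS procedure is not specified in this document.

```python
import math

EARTH_RADIUS_MILES = 3958.8

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in miles between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

def providers_in_cluster(center, providers, radius_miles=2.0):
    """Keep listed providers within radius_miles of a sampled area's centroid.
    `center` is a (lat, lon) tuple; each provider is a dict with lat/lon keys."""
    return [p for p in providers
            if miles_between(center[0], center[1], p["lat"], p["lon"]) <= radius_miles]
```

In practice a production GIS would also handle geocoding quality and boundary cases, but the filtering logic reduces to a distance threshold like this one.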

Workforce Provider Questionnaire

The Workforce Provider Questionnaire respondent will be an individual assigned to a classroom within a Center-Based Provider that has completed the Center-Based Provider Questionnaire.

The Workforce Provider Questionnaire will be available for completion by web, hard-copy self-administration, or CATI/CAPI with a field interviewer. Upon completion of a Center-Based Provider Interview, the standard automated handling of the interview will trigger random selection of a workforce member for the workforce survey. The sampled workforce member will be identified to the field interviewer immediately upon completion of the interview with the director, so that the field interviewer may alert the director that an additional interview is requested. If possible, the field interviewer will attempt to contact the sampled workforce member at that time. Sampled workforce members who are not immediately available at the time of the director's questionnaire administration will be contacted by mail and asked to participate in the web version of the workforce survey. Following the procedures for the Center-Based Provider Questionnaire, members of the workforce study sample will receive up to three mailings (sent to the address of the provider) to solicit participation. Sample members who have not completed the survey after the three rounds of mailings will be contacted by telephone by a field interviewer.
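The selection step triggered at the end of a center interview can be sketched as equal-probability draws: one classroom (the unit the center questionnaire already samples), then one staff member assigned to it. The data structures and seeding below are illustrative assumptions, not the study's actual case-management system.

```python
import random

def select_workforce_case(classrooms, seed=None):
    """Pick one classroom at random, then one instructional staff member
    within it, each with equal probability. Structures are hypothetical:
    each classroom is a dict with an 'id' and a 'staff' roster."""
    rng = random.Random(seed)          # seed allows a reproducible draw
    classroom = rng.choice(classrooms)
    member = rng.choice(classroom["staff"])
    return classroom["id"], member
```

Seeding the generator (for example, from the provider's case ID) would make the draw reproducible if the case ever needs to be re-issued to an interviewer.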



3. Maximizing Response Rates

Several issues in the NSECE field test data collection make achieving a 75 percent response rate challenging: the data collection period is short, and the rate of eligible households is relatively low for the Household Survey and very low for home-based provider respondents identified through the Household Screener. New phone technologies make reaching potential respondents harder, and concerns about privacy make them less willing to participate. For center-based providers, concerns include reluctance to disclose competitive information such as staff qualifications and prices charged to parents, and burden from various regulatory reporting requirements, site visits and inspections, and other government-supported data collections.

NORC will deploy time-tested, evidence-based strategies to address these obstacles in order to maximize response rates and help the project meet its production goals. Below are some steps that will be taken prior to the start of data collection to facilitate respondent cooperation.

  • Compelling contact materials: in coordination with principal investigators for each survey and project staff, ACF will approve contact materials that foster a successful first encounter with the respondent by communicating the importance of the study for each respondent group and anticipating concerns likely to prevent participation.

  • Strategic interviewer trainings: because the first few seconds of each call or in-person visit are crucial, NORC conducts innovative interviewer trainings designed to produce effective interviewers equipped with skills and information to build rapport with potential respondents and avert refusals.

  • Incentive plan: During the field test, NORC will test a number of incentive plans to find one that increases participation in the study while providing the best value for the government.

  • Certificate of confidentiality: NORC will obtain a certificate of confidentiality in order to allay concerns about privacy and legal issues, two common barriers to respondent cooperation.


During the data collection period, additional steps will be taken to maximize response rates. Data collection progress will be monitored using NORC's proven cost and production mechanisms for tracking sample and oversample targets, including monitoring sample progress and, where necessary, diagnosing problems and developing a plan to reverse negative trends.


ACF understands the broad impact that nonresponse can have on a data collection effort, including the potential for lengthening a field period, lowering response rates, and introducing possible bias in the resulting data. The first line of defense against nonresponse is refusal aversion. During interviewer training, much effort is made to prepare interviewers with both general and project-specific techniques to successfully address respondent concerns. However, despite thorough training, some respondents will refuse to participate. Therefore, NORC has developed a number of procedures for the NSECE that allow us to complete the necessary number of interviews and achieve high response rates.


Refusal aversion and conversion techniques are initially introduced and practiced in training sessions so that interviewers are well-versed in NSECE-related facts and in the household and provider sample frame methodology by the time they start interviewing and can successfully address respondent concerns. Throughout the training process, interviewers are educated as to the relatively short time period they have to gain a respondent’s cooperation and that the process of averting refusals begins as soon as contact is made with a potential respondent. Although the individual answering the telephone or door may not be the person who ultimately completes the household screener or detailed interview, the way he or she is treated can directly impact the willingness of other household members to participate and provide quality data.

NORC knows many of the primary reasons that parents of young children are hesitant to participate, and the points within the questionnaire that are most likely to lead to refusals. Interviewers will be armed with this knowledge so they can encourage parents to supply employment information, early care arrangements and schedules, and provider information. Trainings will use recordings of administered interviews to allow trainees to hear and respond to situations they will encounter during data collection before they conduct their first interview.

4. Testing of Questionnaire Items

The purpose of the NSECE field test is to inform the implementation of the NSECE main study by implementing the full design plan, analyzing the data collection outcomes, and making necessary revisions to the questionnaires, sample design, and data collection procedures. The approach to revising the NSECE questionnaires focuses on three fundamental aspects of that process that are particularly important for the NSECE. First, we establish the study's analytic tasks as the guidance system for questionnaire development or revision, to ensure that the questionnaires developed will optimally serve the study's analytic, and ultimately policy, goals. Second, we use our knowledge of the characteristics of the populations to be surveyed, including, for the NSECE, low-income households and other subgroups, on both the household and provider sides of the study. Third, we are cognizant that all proposed instruments will be administered in multiple modes, and adopt strategies to minimize mode effects throughout the design process.

Questionnaire content. The instruments for the NSECE were designed on the basis of an analysis plan that outlined research priorities for the study. Out of those priorities emerged these instruments:


The Household Screener is designed to determine whether individuals in a household are eligible to complete the Household Questionnaire or the Home-based Provider Questionnaire. It collects information on the number of children in the household and whether any household members look after children who are not their own.


The Household Questionnaire is designed to collect information on a variety of topics, including household characteristics (such as household composition and income), parents' employment (employment schedule, employment history, and so on), the utilization of early care and education services (care schedules, care payments and subsidies, attitudes towards various types of care and caregivers, etc.), and the search for non-parental care. The respondent is an adult in the household who is knowledgeable about the ECE activities of the youngest child in the household. Based on preliminary timing tests, we project that this interview will take just over 30 minutes. To ensure that we do not exceed our burden request, we have estimated an average of 45 minutes per completed interview. One key feature of the Household Questionnaire is a full week's schedule of all non-parental care (including elementary school attendance) for all age-eligible children in the household, and all employment, schooling, and training activity of all parents or regular caregivers in the household.


The Home-Based Provider Questionnaire covers some of the same topics as the Center-Based Provider Questionnaire described below. Additional topics include the household composition of the provider, questions intended to capture the proportion of household income coming from home-based care, and an option to collect a full roster of children cared for in the prior week if the count of such children is small. The feasibility test showed that many home-based providers under our definition do not self-identify as providing care. Cognitive tests and examination of call records suggest that "looking after" is a better phrase than "providing care" and should be used throughout the screener and the questionnaire.


The Center-Based Provider Questionnaire is designed to collect a variety of information about the provider, including structural characteristics of care, revenue sources, enrollment, and admissions and marketing. One feature of the questionnaire is a section collecting information about a randomly selected classroom, including characteristics of all instructional staff associated with that classroom.


Across the three instruments we tended to follow certain guidelines:

1) Behavioral, rather than attitudinal, data are collected wherever possible to support a broad range of analytical methods and minimize response error. For example, rather than asking parents about satisfaction with provider hours of service, we instead collect parental work schedules and use of wrap-around care.

2) We make intensive use of Geographic Information Systems (GIS) data and other externally available data sources to extend the value of the questionnaire data. For example, we use GIS to understand distance-from-home as a factor in ECE provider selection, and catchment areas for providers. We collect names of providers used to improve the accuracy of Head Start and other program identification.

3) We employ best practices in survey methodology to capture the most accurate data, for example, collecting specific information about a random sample of classrooms in center-based programs, rather than asking directors to generally describe all of their classrooms as if they were homogeneous.

4) Our emphasis is on the full spectrum of activity, so that all children under age 13 are included in the Household Interview, and all sectors of ECE providers are included in the Home-Based and Center-Based Provider Interviews.

5) In the questionnaire design, as in the construction of the provider cluster, we have consistently conceived of providers and families as distinct components of a single, closely interrelated ecosystem.

6) New items were developed to measure: parental support by providers, (accurately captured) participation in publicly supported ECE/SA, (employment-related) barriers to ECE/SA usage, and the process of parental search for ECE/SA. In addition, we include questions about market definition of provider catchment areas that are new to this context.


The Workforce Provider Questionnaire is designed to collect information about workers from both center-based providers and home-based providers in four categories: characteristics (demographic, employment-related, education/training); activities (teacher-led/child-led; active/passive); attitudes and orientation toward caregiving (perception of teacher role, parent-teacher relationships, job satisfaction); knowledge of QRIS (quality rating and improvement systems) and QII (quality improvement initiatives).

5. Statistical Consultant

Dr. Kirk Wolter

NORC

55 East Monroe Street

Chicago, IL 60603

(312) 759-4000



The sample design was conducted by NORC, which will also conduct the data collection for the field test.


