
The 2019 National Survey of Early Care and Education:

The Household, Provider, and Workforce Surveys



0970-0391



SUPPORTING STATEMENT



Part B















Version: August 2018, Revision October 2018





Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201

Table of Attachments

Attachment 1: 2019 NSECE Center-based Provider Questionnaire Items - Overview and Comparison
Attachment 2: 2019 NSECE Center-based Provider Screener and Questionnaire
Attachment 3: 2019 NSECE Home-based Provider Questionnaire Items - Overview and Comparison
Attachment 4a: 2019 NSECE Home-based Provider Screener and Questionnaire
Attachment 4b: 2019 NSECE Home-based Provider Screener and Questionnaire (Spanish)
Attachment 5: 2019 NSECE Classroom Staff (Workforce) Questionnaire Items - Overview and Comparison
Attachment 6a: 2019 NSECE Classroom Staff (Workforce) Questionnaire
Attachment 6b: 2019 NSECE Classroom Staff (Workforce) Questionnaire (Spanish)
Attachment 7 (A-I): 2019 NSECE Center-based Provider Survey Contact Materials
Attachment 8 (A-I): 2019 NSECE Listed Home-based Provider Survey Contact Materials
Attachment 9 (A-H): 2019 NSECE Classroom Staff (Workforce) Survey Respondent Contact Materials
Attachment 10 (A-I): 2019 NSECE Listed Home-based Provider Survey Contact Materials (Spanish)
Attachment 11 (A-H): 2019 NSECE Classroom Staff (Workforce) Survey Respondent Contact Materials (Spanish)
Attachment 12 (A-E): 2019 NSECE Unlisted Home-based Provider Survey Contact Materials
Attachment 13 (A-E): 2019 NSECE Unlisted Home-based Provider Survey Contact Materials (Spanish)
Attachment 14: 2019 NSECE Household Questionnaire Items - Overview and Comparison
Attachment 15a: 2019 NSECE Household Screener and Questionnaire
Attachment 15b: 2019 NSECE Household Screener and Questionnaire (Spanish)
Attachment 16 (A-I): 2019 NSECE Household Survey Respondent Contact Materials
Attachment 17 (A-I): 2019 NSECE Household Survey Respondent Contact Materials (Spanish)



Supporting Statement B

for the National Survey of Early Care and Education

  1. Collections of Information Employing Statistical Methods

1. Respondent Universe and Sampling Methods

An important feature of the 2019 National Survey of Early Care and Education (NSECE) survey design is support for analyses of the association between the utilization of early care and education (ECE) services by families and the availability of such services offered by providers. Analysis of this association is critical to the development of a better understanding of how government policy can support parental employment and promote child development. The local nature of ECE usage underscores the importance of collecting and analyzing data in matched geographic areas. In order to strengthen the tie between the utilization and provision of ECE services, we propose a sampling approach in which sampled units from all four surveys are co-located in small geographic areas.

Specifically, we will tightly cluster households within one or at most two census tracts. We will then sample center-based providers and listed home-based providers from a larger cluster of census tracts that include the tracts from which households were sampled. The co-location of households and providers will greatly enhance the value of the data about each. Parents’ search behaviors can be understood in the context of the choices actually available to them, while providers’ decisions about capacity and staffing can be seen as responses to their real audiences. Details on this sampling approach are provided below.

This multistage, cluster sampling design is stratified by the 50 states and the District of Columbia (DC) and includes an oversample of households and providers located in low-income areas. The respondents for this study include individuals living in households as well as ECE providers and their workforce. For the household surveys, the eligible population consists of households with children under age 13 in the 50 states and DC. The eligible provider population consists of the individuals and facilities that provide ECE services in the 50 states and DC. Additional household and provider eligibility criteria are detailed in the following sections for each sample type.

Design of the Household Survey

First Stage. At the first stage of sampling, we propose a stratified probability sample of primary sampling units (PSUs) representative of all geographic areas in the 50 states and DC. We define the PSU as a county or a group of contiguous counties. Counties are generally large enough to represent a bundle of child care markets. Smaller counties will be linked to geographically adjacent counties until a minimum PSU size is met. We will stratify PSUs by state because, historically, ECE policies are set at the state level and vary widely across states. Stratification by state is likely to reduce the sampling variance of 2019 NSECE estimators because of the relative homogeneity of child care policies within states. The number of PSUs will be allocated to each state in proportion to the number of households with age-eligible children, with a minimum of two PSUs per state to support variance estimation. A probability-proportional-to-size (PPS) method will be used to select PSUs within each stratum (or state). Under PPS sampling, the probability of selecting a PSU is proportional to its measure of size (MOS); here, the MOS is the number of households with at least one child under age 13. The 2019 NSECE sampling design will begin from the PSUs selected for the 2012 NSECE, which used the MOS from the most recent five-year data from the American Community Survey (ACS) for the period 2005-2009 (the most recent available at the time of selection in 2011). Some of the largest PSUs may be selected with probability equal to 1.0; such PSUs are typically called certainty or self-representing PSUs. DC will be treated as a self-representing PSU, or a stratum in its own right. The weights associated with each PSU may be adjusted to reflect any changes in the population of households with children using 2011-2015 ACS data. PSU definitions may be altered, or additional PSUs selected, if required.
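To make the PPS mechanics concrete, the following minimal sketch selects PSUs within a single state stratum using systematic PPS sampling. It is illustrative only: the PSU names and MOS values are fabricated, and the production design additionally handles certainty PSUs and the two-PSU-per-state minimum.

```python
import random

def pps_systematic_sample(psus, n_select, seed=2019):
    """Systematic PPS selection: each PSU's chance of selection is
    proportional to its measure of size (MOS)."""
    total_mos = sum(mos for _, mos in psus)
    interval = total_mos / n_select
    rng = random.Random(seed)
    start = rng.uniform(0, interval)
    picks = [start + k * interval for k in range(n_select)]
    selected, cumulative, i = [], 0.0, 0
    for name, mos in psus:
        cumulative += mos
        # A PSU is hit once per pick falling inside its MOS range;
        # a PSU with MOS >= interval would be selected with certainty.
        while i < n_select and picks[i] < cumulative:
            selected.append(name)
            i += 1
    return selected

# Fabricated stratum (one state); MOS = households with a child under 13.
stratum = [("County group A", 120_000), ("County group B", 45_000),
           ("County group C", 30_000), ("County group D", 15_000)]
print(pps_systematic_sample(stratum, n_select=2))
```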

Second Stage. Within the selected PSUs, we will select a sample of census tracts, or second-stage units (SSUs). SSUs will be constructed by combining multiple adjacent tracts in order to meet a minimum MOS requirement. Overall, we will select about 741 SSUs, with about three SSUs per non-certainty PSU and proportionally more per certainty PSU. For the household sample, the MOS is the number of households with at least one child under age 13 in each SSU, determined from five-year ACS data for the period 2011-2015, since only the five-year data contain information at the census-tract level.
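A minimal sketch of one plausible aggregation rule follows; it assumes tracts arrive in a geographic ordering so that consecutive tracts are adjacent, and all tract identifiers and MOS values are fabricated. The production procedure may combine tracts differently.

```python
def build_ssus(tracts, min_mos):
    """Greedily combine geographically ordered tracts into SSUs until
    each SSU meets the minimum measure of size (MOS)."""
    ssus, current, current_mos = [], [], 0
    for tract_id, mos in tracts:
        current.append(tract_id)
        current_mos += mos
        if current_mos >= min_mos:
            ssus.append((tuple(current), current_mos))
            current, current_mos = [], 0
    if current:  # fold any short remainder into the last SSU formed
        if ssus:
            ids, mos = ssus.pop()
            ssus.append((ids + tuple(current), mos + current_mos))
        else:
            ssus.append((tuple(current), current_mos))
    return ssus

# Fabricated tract MOS values (households with a child under 13):
tracts = [("Tract 0101", 420), ("Tract 0102", 180), ("Tract 0201", 90),
          ("Tract 0202", 350), ("Tract 0301", 260)]
print(build_ssus(tracts, min_mos=400))
```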

Third Stage. We will use an extract of the United States Postal Service (USPS) computerized delivery sequence file (CDS or CDSF), acquired from the vendor Valassis, as the sampling frame for housing units (HUs) at the third stage of sampling. We acquire this extract from a vendor because the USPS is not allowed to license the file directly to parties other than the US Census Bureau (AAPOR ABS Task Force 2017). The CDS is a list of residential addresses updated continually by mail carriers and is considered a suitable address frame for national samples (AAPOR ABS Task Force 2017). Direct mail marketers (e.g., InfoUSA, Valassis) license the list from the USPS; NORC, in turn, maintains a nationwide license for the Valassis version of the CDS. NORC has conducted extensive evaluation of the quality of the delivery sequence file as a sampling frame for HUs (O'Muircheartaigh, English, and Eckman, 2007). We have found that the list is of very good quality for non-rural areas.


For the NSECE, we will compare the number of computerized delivery sequence (CDS) addresses (i.e., updated delivery sequence address information) in a selected SSU to the corresponding census count. In 2012 we implemented list-and-go methods in order to adequately sample from areas where the CDS count was unacceptably smaller than the census count. In 2019 we plan to take advantage of more cost-effective methods of dealing with areas with limited CDS coverage. Recent research indicates that supplementing the CDS frame with commercially available address lists as well as the USPS "no-statistics" file can improve coverage in sparsely populated or otherwise unique geographies and is a cost-effective way of addressing this sampling issue. In addition, this approach avoids known drawbacks of list-and-go methods related to unintentional and intentional errors in implementation.

An analytic focus of the NSECE is on the low-income population, and thus the household sample oversamples low-income families. Screening on household income is not a worthwhile option because it would damage response rates and thus undermine the representativeness of the sample. Instead, we will achieve an oversample of low-income households and age-eligible children living in such households by oversampling addresses in high-density low-income tracts.

Overall, we will select approximately 100,000 HUs within the selected tracts to obtain 10,600 completed household interviews. We will use equal-probability systematic sampling to select HUs within a tract. Our goal will be to make the national sample as close to self-weighting (equal probabilities of selection) as possible within each of the high- and low-density strata.
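The within-tract selection step can be sketched as follows; the address list is fabricated, and the production systematic sampler operates on the actual CDS extract.

```python
import random

def systematic_hu_sample(addresses, n, seed=42):
    """Equal-probability systematic sample: every address in the tract
    has selection probability n / len(addresses), so samples drawn at a
    common rate within a stratum are approximately self-weighting."""
    interval = len(addresses) / n
    rng = random.Random(seed)
    start = rng.uniform(0, interval)
    return [addresses[int(start + k * interval)] for k in range(n)]

tract_frame = [f"HU-{i:04d}" for i in range(1, 501)]  # fabricated frame
print(systematic_hu_sample(tract_frame, n=10))
```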

Exhibit 1 below shows the estimated number of sampled households and the number of screeners expected to be completed in each data collection mode in order to achieve the goal of 10,600 completed household questionnaires. Ultimately, we expect to screen 61,500 households (12,000 by mail, 9,000 by web, and 40,500 by telephone or in person) and to complete 10,600 household interviews by telephone or in person.

Exhibit 1. Target Sample Size for the Household Survey

Total sampled households: 100,000
Screening by hard-copy questionnaire: 12,000
Screening by web questionnaire: 9,000
Field interviewer screening: 40,500
Total completed household questionnaires: 10,600

(Counts are estimated household screener completions by data collection mode.)



We propose using multiple modes of data collection (mail, web, telephone, and in person) to complete the household screener for the sampled HUs. We will begin data collection by mailing a screening questionnaire, with an optional web questionnaire, to all sampled households except drop points. Drop points (addresses where mail for multiple units is delivered to a single point) are not suitable for mail or telephone screening, so we will route these cases directly to in-person screening. The household sample will be matched against a commercial telephone database to identify any telephone numbers associated with each sampled household. Selected addresses that screen eligible (i.e., at least one child under 13 years of age) and have a telephone match will be worked by telephone initially, with a subsequent in-person visit as needed. If an eligible household does not have an available telephone number, a field interviewer will visit in person to conduct the household questionnaire.
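The routing logic can be summarized in a small sketch. This is illustrative only; the field names (is_drop_point, phone_match) are hypothetical stand-ins for the actual case-management flags.

```python
def initial_screening_mode(case):
    """First screening mode for a sampled address."""
    if case["is_drop_point"]:
        # Drop points cannot be worked by mail or phone.
        return "in-person screening"
    return "mail screener with web option"

def interview_mode(case):
    """Interview mode for an address that screened eligible."""
    if case["phone_match"]:
        return "telephone first, then in-person follow-up as needed"
    return "in-person visit"

case = {"is_drop_point": False, "phone_match": True}
print(initial_screening_mode(case), "->", interview_mode(case))
```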

Design of the Home-based Provider Survey

Two types of respondents will complete the home-based provider questionnaire: listed and unlisted home-based providers. Listed home-based providers include all sampled licensed, license-exempt, registered, regulated, and otherwise listed home-based ECE providers identified from state or national administrative lists. Unlisted home-based providers are home-based providers who do not appear on any administrative list and are identified through the household screener. Eligible home-based providers are individuals who provide care in a residential setting for one or more children under age 13 (who are not their own) for at least five hours per week. In the event that more than one home-based provider is identified through the household screener, one respondent will be randomly selected for the home-based provider questionnaire.

While both respondent types complete the same home-based provider questionnaire, they come from two different samples: 1) unlisted home-based providers are sampled through the design of the household survey, and 2) listed home-based providers are sampled through the design of the listed provider survey (covering center-based and home-based providers drawn from administrative lists), as detailed in the following section.

Design of the Survey for Listed Providers (center-based and home-based providers from administrative lists)

The target population for this sample type is defined as center-based ECE providers serving children from birth through age five years, not yet in kindergarten; registered, regulated, and otherwise listed home-based ECE providers serving children from birth to age 13; and classroom-assigned center-based workers in ECE centers.

First Stage. The sampling plan for listed providers follows the same initial stratified probability sample of PSUs, representing all geographic areas in the 50 states and DC, used for the household sample. As previously noted, this shared first stage is the foundation for collecting and analyzing data under the clustered sampling approach across surveys.

Second Stage. The SSUs for the provider sample are larger in area but geographically connected to the household SSUs. Provider SSUs are designed to contain the providers likely to serve the households of the associated household SSU. The only distinction is that the MOS for the listed provider sample is the number of providers within the SSU. This number will be based on the sampling frame constructed for the sampling of center-based and listed home-based providers. If these data are not available at the time of SSU selection, the number of households with at least one child under age 18, based on the ACS for the period 2011-2015, will be used instead. This MOS will be used to evaluate how many tracts are required to obtain an adequate number of providers in the cluster from which providers will be sampled. The probability of selection for the provider SSU will be a function of the size of the associated household SSU.

Third Stage. While the sampling frames for the first and second stages come from Census definitions and data, we will construct a sampling frame of providers for the third stage. Specifically, we will gather state-level lists of center-based and home-based providers from the relevant state agencies. This effort will include obtaining publicly available online administrative lists, and having individualized follow-up conversations with state administrators based on the publicly available information. We will supplement these with national lists of ECE providers, for example, lists from the Office of Head Start, the accreditation list from the National Association for the Education of Young Children, and ECE provider lists maintained by federal agencies such as the Department of Defense and the General Services Administration. An important supplemental source will be a commercially available list of all K-8th grade elementary schools in the country; many of these locations potentially have early childhood programs that do not appear on state lists.

Once the sampling frame has been de-duplicated of providers appearing on more than one list, we will construct a provider location sampling frame: a list of physical locations where at least one eligible provider is located. In cases where more than one provider is located at a selected location, an additional stage of sampling will be used to select a single provider from that location, as illustrated in the sketch below. Exhibit 2, which follows the sketch, summarizes the expected number of sampled providers and classroom staff needed to meet the targeted number of completed questionnaires.
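The within-location subsampling step and its effect on the design weight can be sketched as follows; the location probability and provider names are fabricated.

```python
import random

def select_provider_at_location(location_prob, providers, seed=11):
    """Select one provider at random from a multi-provider location.
    The provider's selection probability is location_prob / k, so its
    base weight is k times the location weight (k = providers there)."""
    k = len(providers)
    chosen = random.Random(seed).choice(providers)
    prob = location_prob / k
    return chosen, prob, 1.0 / prob  # provider, inclusion prob, base weight

# Fabricated location housing three co-located providers:
print(select_provider_at_location(0.05, ["Head Start A", "Pre-K B", "Center C"]))
```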

Exhibit 2. Target Sample Sizes for the Center- and Listed Home-based Provider Surveys

Stratum 1: Known Center-based Providers
  Released programs: 10,974
  Not obsolete or unlocatable: 95% (10,370)
  Eligible: 88% (9,076)
  Completed provider questionnaires: 80% (7,215)

Stratum 2: Listed Home-based Providers
  Released programs: 7,076
  Not obsolete or unlocatable: 85% (6,015)
  Eligible: 70% (4,211)
  Completed provider questionnaires: 95% (4,000)

Stratum 3: Possible Center-based Providers
  Released programs: 5,337
  Not obsolete or unlocatable: 95% (5,070)
  Eligible: 15% (735)
  Completed provider questionnaires: 80% (584)

Total
  Released programs: 23,387
  Not obsolete or unlocatable: 21,455
  Eligible: 14,021
  Completed provider questionnaires: 11,800

(Each rate applies to the count from the preceding stage.)


Center-based and listed home-based providers will be sampled from the constructed sampling frame in quantities chosen to meet the specified number of completes for each type of provider. A listed home-based provider is a single entity providing non-parental care to children under age 13 in a home-based setting at least five hours per week and appearing on an administrative list. A center-based provider is a single entity that provides ECE services to children at a single location. To be eligible as a center-based provider, the entity must meet four criteria, as follows:

  • Offer non-parental care, including early education, or supervision of children birth through age five years, not yet in kindergarten;

  • Operate on a regular schedule of at least three days per week and two hours per day (excluding residential care);

  • Offer ECE services above and beyond ad-hoc drop-in care (excluding entities such as shopping malls and YMCA open gym programs); and

  • Offer care during the school year (excluding entities that offer only summer or holiday care).


An entity that fails one or more of these criteria is not considered to be a center-based provider for the purposes of the 2019 NSECE.

All provider locations/addresses on the sampling frame will be assigned to one of three provider location type strata. Each location is assigned to the first stratum for which it qualifies in the order presented below:

  1. Known center-based provider locations – Locations where at least one Head Start, pre-K, child care center, faith-based child care entity, or other ECE entity is listed by state, local, or tribal governments as providing care to children birth through age five years, not yet in kindergarten;

  2. Listed home-based provider locations;

  3. Possible center-based provider locations – Locations indicated in the sampling frame as providing care for older children, such as elementary schools not known to have early childhood programs, or centers listed as providing only school-age care. It has been observed that a percentage of these centers also care for children five years of age or younger, not yet in kindergarten.


We will classify each provider location on the final sampling frame into one and only one of the strata detailed above. Matching techniques (based upon name, address, and other characteristics) will be used to de-duplicate the sampling frame within and across the strata within a sampled PSU. For example, many programs will appear both on a list of Head Start providers and in a state list of licensed ECE facilities; we would want the location to appear just once. We will use advanced matching systems based upon the Fellegi-Sunter matching algorithm supplemented by human review.
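The following minimal sketch illustrates the Fellegi-Sunter scoring idea: each comparison field contributes a log-likelihood ratio based on its agreement (m) and disagreement (u) probabilities, and the summed score is compared against thresholds. The fields, m/u values, and records here are fabricated; production matching would use many more fields, string-similarity comparators, and estimated parameters.

```python
import math

# Illustrative agreement (m) and disagreement (u) probabilities per field;
# real values would be estimated from the frame data.
M = {"name": 0.95, "street": 0.90, "zip": 0.98}
U = {"name": 0.05, "street": 0.10, "zip": 0.40}

def match_weight(record_a, record_b):
    """Sum of per-field log-likelihood ratios (Fellegi-Sunter score).
    Scores above an upper threshold are treated as links; below a lower
    threshold, non-links; in between, candidates for human review."""
    score = 0.0
    for field in M:
        agree = record_a[field].strip().lower() == record_b[field].strip().lower()
        if agree:
            score += math.log2(M[field] / U[field])
        else:
            score += math.log2((1 - M[field]) / (1 - U[field]))
    return score

a = {"name": "Little Sprouts Center", "street": "12 Oak St", "zip": "60603"}
b = {"name": "Little Sprouts Center", "street": "12 Oak Street", "zip": "60603"}
print(round(match_weight(a, b), 2))
```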

An analytic focus of the NSECE is on the low-income population. To ensure an adequate number of providers sampled from communities with higher concentrations of low-income households, two-thirds of sampled providers will be drawn from communities where at least 40 percent of households are at or below 250 percent of the federal poverty level.
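As a rough sketch of the resulting allocation (illustrative only; the actual stratum boundaries and rounding rules are set during frame construction):

```python
def allocate_provider_sample(n_sampled, high_density_share=2/3):
    """Allocate the released provider sample between the high-poverty-
    density stratum (tracts where at least 40% of households are at or
    below 250% of the federal poverty level) and all other tracts."""
    high = round(n_sampled * high_density_share)
    return {"high_poverty_density": high, "other": n_sampled - high}

print(allocate_provider_sample(23_387))  # released programs, Exhibit 2
# -> {'high_poverty_density': 15591, 'other': 7796}
```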

Design of the Classroom Staff (Workforce) Survey

The center-based provider questionnaire includes questions about a randomly selected classroom or group within the program. The respondent is then asked to enumerate all personnel who are primarily assigned to that classroom. For the classroom staff (workforce) survey, we will randomly select at least one staff member from among those enumerated as belonging to the randomly selected classroom. Two classroom staff will be selected from a pre-selected subset of classrooms in order to increase the total number of respondents for the classroom staff (workforce) survey. The selected individuals will be sampled from the following staff roles: Lead Teacher, Instructor, Teacher (possibly including Director/Teacher), Assistant Teacher/Instructor, and Aide. Other roles, for example, specialist personnel, are not eligible for the classroom staff (workforce) survey even if they are enumerated as personnel associated with the randomly selected classroom. Items about the teaching/caregiving workforce in home-based care are included in the home-based provider questionnaire and parallel the constructs that are included in the classroom staff (workforce) questionnaire to be administered to the sampled staff in center-based programs.
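A minimal sketch of the within-center selection step follows; roster contents, role labels, and the random seed are fabricated, and the production selection is built into the center-based questionnaire software.

```python
import random

def select_classroom_staff(classrooms, n_staff, seed=7):
    """Randomly select one classroom serving the sampled age group, then
    n_staff eligible staff members from those enumerated for it."""
    eligible_roles = {"lead teacher", "instructor", "teacher",
                      "director/teacher", "assistant teacher/instructor",
                      "aide"}
    rng = random.Random(seed)
    room = rng.choice(classrooms)
    # Specialist and other non-listed roles are excluded from the pool.
    pool = [s for s in room["staff"] if s["role"].lower() in eligible_roles]
    return room["name"], rng.sample(pool, min(n_staff, len(pool)))

# Fabricated roster for one sampled age group at one center:
rooms = [{"name": "Toddler Room A",
          "staff": [{"name": "Staff 1", "role": "Lead Teacher"},
                    {"name": "Staff 2", "role": "Aide"},
                    {"name": "Staff 3", "role": "Speech Specialist"}]}]
print(select_classroom_staff(rooms, n_staff=1))
```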

Exhibit 3. Target Sample Size for the Classroom Staff (Workforce) Survey

Stratum 1: Known Center-based Providers
  One staff member selected per classroom: 6,571 completed center-based provider questionnaires; completed classroom staff questionnaires: 72%* (4,718)
  Two staff members selected per classroom: 644 completed center-based provider questionnaires; completed classroom staff questionnaires: 72%* (463)
  Stratum 1 total: 7,215 completed center-based provider questionnaires; completed classroom staff questionnaires: 72%* (5,643)

Stratum 3: Possible Center-based Providers
  One staff member selected per classroom: 532 completed center-based provider questionnaires; completed classroom staff questionnaires: 72%* (382)
  Two staff members selected per classroom: 52 completed center-based provider questionnaires; completed classroom staff questionnaires: 72%* (37)
  Stratum 3 total: 584 completed center-based provider questionnaires; completed classroom staff questionnaires: 72%* (457)

Total: 7,800 completed center-based provider questionnaires; 6,100 completed classroom staff questionnaires

* This rate is based on what was realized for the 2012 classroom staff survey and discounted to account for general response rate attrition over the years.

2. Procedures for Collection of Information

The 2019 NSECE data collection effort involves four interrelated surveys (household survey, home-based provider survey, center-based provider survey, and classroom staff [workforce] survey). These surveys comprise seven questionnaires: 1) household screener, 2) household questionnaire, 3) home-based provider screener, 4) home-based provider questionnaire, 5) center-based provider screener, 6) center-based provider questionnaire, and 7) classroom staff (workforce) questionnaire. Two underlying principles shape the data collection approach for all four 2019 NSECE surveys.

First, we use a multi-mode contacting approach that attempts to complete as many questionnaires as possible in the most cost-efficient mode (self-administration by web for all provider surveys and, for the household screener, self-administration by web and paper-and-pencil), resorting to higher-cost modes to address nonresponse in later stages. The multi-mode approach has two primary benefits: 1) it saves costs by exploiting lower-cost modes, and 2) it improves response rates by offering respondents a range of participation options that better accommodate their preferences, including timing, mode, contact with interviewers, and other data collection factors.

Second, the nature of the NSECE sample itself constrains the data collection period somewhat. A central feature of the household questionnaire is the prior-week’s child care schedule, which captures non-parental care for all age-eligible children in the week prior to questionnaire administration, supplemented with the work, school, and training schedules of all adults in the household caring for the age-eligible children (including parents). So that these data can capture non-parental care practices, it is desirable for the data collection period to avoid extended periods when alternative non-parental care arrangements may be in use, for example, the end-of-December vacations and summer breaks. Similarly, center-based and home-based providers are typically inaccessible during major vacation periods, and many programs such as Head Start may close altogether during the summer. For these reasons, we confine the data collection period to the 17 weeks between winter vacations and the start of summer sessions around the middle of May.

Combining these two principles, any given case is slated for at most two data collection phases: an initial phase of self-administration (for example, by web or paper-and-pencil), followed by assignment to a field interviewer. The field phase itself includes variation in mode, encouraging case completion by telephone as well as in person for the household samples, and prompting for web completion as well as telephone or in-person completion for the center-based and listed home-based provider samples.

This data collection approach is rooted in our experiences preparing for and conducting the 2012 NSECE. Leading up to the 2012 data collection, the project verified the need for phone and in-person outreach in a design-phase feasibility test and the 2012 NSECE field test, where we saw that mail data collection with phone prompting alone was inadequate to achieve the desired levels of sample cooperation. When interviewers were able to conduct an in-person visit, we saw increased cooperation across sample types. Interviewers were able to navigate organizational hierarchies more easily and identify the individual who could complete the questionnaire more quickly. They also were able to provide the assistance respondents needed with questions or technical issues. We therefore incorporated field interviewer visits into the data collection protocol for the 2012 NSECE. This protocol was designed so that lower levels of effort were expended in the initial phases of data collection, where mailing and telephone contact were the primary modes of outreach, and in-person visits were reserved for the less cooperative or more complicated cases in the later phases. So that we could make repeat visits without having respondents feel pestered, we incorporated small in-kind tokens relevant to ECE providers and households with young children, such as growth charts and crayons. These tokens do not have incentive value per se, but give interviewers fresh reasons to contact already-contacted respondents.

During the 2012 data collection, we found that this field interviewer outreach played a critical role in facilitating completions by web. Exhibits 4 and 5 below show the pattern of completions across the 2012 NSECE field period for the center-based and listed home-based provider surveys. The top of each graph indicates the type of outreach that was dominant during each phase, and the line across the center shows the number of completed surveys throughout the phase. The graphs show that significant numbers of provider questionnaires were completed through mail and telephone contact alone, but the majority of completed center-based and home-based provider questionnaires in 2012 required some in-person outreach.


Exhibit 4. 2012 NSECE Center-based Provider Response Rates by Phase of Data Collection Outreach



Exhibit 5. 2012 NSECE (Listed) Home-based Provider Response Rates by Phase of Data Collection Outreach

The field interviewer prompting efforts during the 2012 NSECE helped yield a 94 percent screener completion rate and a 74 percent weighted response rate for the center-based provider survey, and an 81 percent weighted response rate for the listed home-based provider survey.

As a result, the 2019 NSECE will employ a similar outreach approach that combines different types of contact and technical support across multiple phases of data collection. These outreach activities will be customized for each survey to account for the unique characteristics of each sample and the challenges we encountered collecting data in 2012.

We discuss our procedures for fielding each questionnaire in turn:

Household Screener and Questionnaire

The household screener (included in Attachment 15a-b) will: 1) identify eligible respondents to complete the household questionnaire, and 2) identify eligible respondents for the home-based provider questionnaire (identified as unlisted home-based providers). Eligibility criteria for these surveys are specified in Exhibit 6 below.

Exhibit 6. Household Screener Eligibility Criteria

Household survey: 100 percent of screened households with at least one child under the age of six; 80 percent of screened households whose youngest child is between the ages of 6 and 13.

Home-based provider survey (for unlisted home-based providers): Individuals who regularly provide care in a residential setting, for at least five hours per week, to one or more children under age 13 who are not their own. (These individuals do not appear on administrative lists of regulated or registered home-based providers.)

Sampled households will be sent a letter introducing the study and providing the option of completing the household screener online or via a brief self-administered mail screener (Attachment 16A/17A). This mailing will be followed, approximately one week later, by a postcard reminder (Attachment 16B/17B). After another two weeks, we will send the first nonresponse follow-up mailing to households that have not yet returned the mail household screener or completed the household screener via web (Attachment 16C/17C). This mailing will include a different follow-up letter and the same web and mail household screener options. A second and final nonresponse mailing, including a third follow-up letter and the web and mail household screener options, will be sent three weeks later (Attachment 16D/17D).

Information from completed mail screeners will be keyed and combined with response data from completed web screeners. All households eligible for the household survey will be mailed a letter alerting them that a field interviewer will contact them soon to complete the household questionnaire (Attachment 16F/17F). The household sample will be matched against a commercial telephone database to identify any telephone numbers associated with each sampled household. Field interviewers will contact households by phone where possible, and otherwise in person, to complete the questionnaire. Unlike the provider and classroom staff (workforce) surveys, the household questionnaire can only be administered by an interviewer because of its complexity. Procedures for households eligible for the unlisted home-based provider survey are described in more detail under the home-based provider survey below.

Those households that do not respond to these initial screening efforts will also be assigned to field interviewers for telephone and in-person follow up. Prior to interviewer outreach these households will receive a letter (Attachment 16E/17E) that reintroduces the study and informs the household that a field interviewer will be contacting them in the near future. Interviewers will attempt to screen these households and administer the household questionnaire if eligible.

The household screener may indicate that a household is eligible for both the household and unlisted home-based provider surveys. If the household is eligible for both surveys, the interviewer will administer the household questionnaire first. Once that is complete, the interviewer will request to speak to the household member who is eligible to complete the unlisted home-based provider questionnaire (if that is not the same individual who completed the household questionnaire). If the unlisted home-based providers are unable to participate in the survey at that time, they will be asked to complete the home-based provider questionnaire either by web, telephone, or in person at a later date.


Home-based Provider Screener

The 2019 NSECE will screen a sample of home-based providers drawn from national and state licensing and administrative lists for participation in the home-based provider survey. Because there is a high ineligibility rate (roughly 23 percent) among this sample group, a screening questionnaire (Attachment 4a/4b) will determine whether home-based providers are currently caring for children. Identifying eligible listed home-based providers was one of the main challenges encountered during the 2012 data collection effort; we have therefore focused our initial contacts on this screening step (please refer to Attachments 8A-C and the corresponding Spanish materials in Attachment 10). The questions included in the screening questionnaire are designed to confirm that the sampled provider cares for at least one child under age 13 who is not their own, in a home-based setting, for at least five hours per week. This threshold was set to allow us to capture the full range of home-based ECE providers while excluding incidental care that is not regular ECE, such as an individual who walks a child home from school five days a week or an occasional 'date night' babysitter. If these criteria are met, the home-based provider is eligible for participation and moves from the screening questions to the beginning of the main questionnaire, where they will be presented with a new informed consent statement and asked additional questions (Attachment 4a/4b). For those who are no longer caring for children, the screener will ask a few additional questions to provide some insight into when and why they stopped caring for children.

In an effort to resolve any uncertain cases in the final stages of data collection, we plan to send a screening postcard (Attachment 8F/10F) to any listed home-based providers whose eligibility remains unknown. Identifying the eligible sample will allow us to better focus project resources at the end of data collection and facilitate the final resolution of cases.

Home-based Provider Questionnaire

Listed home-based providers will be fielded in parallel with the center-based providers as described below, beginning with a series of initial contacts by mail (Attachments 8A-C for English or 10A-C for Spanish). These include an initial letter explaining the study and encouraging the sampled provider to complete the screening questions (Attachment 8A/10A). This initial letter is followed by two more contact attempts (Attachments 8B-C/10B-C). After this initial mail contact, field interviewer outreach will begin initially by phone and then shift to in-person visits as the field period continues. Interviewers will help to determine eligibility, prompt providers to complete the questionnaire by web, gain cooperation, provide technical support, and complete questionnaires with providers as needed.

Although unlisted home-based providers are identified through the household screener, the field approach is similar to that implemented for the listed home-based provider sample. Once identified through the household screener, these home-based providers will receive an initial advance letter (Attachment 12A/13A) inviting them to complete the home-based provider questionnaire by web. Approximately two weeks later, a thank you/reminder postcard (Attachment 12B/13B) will be sent to eligible unlisted home-based providers who have not yet completed the web questionnaire. Two weeks after that, sampled providers who have not yet responded will receive a follow-up letter (Attachment 12C/13C) alerting them that a field interviewer will be contacting them to complete the questionnaire. Field interviewers will attempt outreach by phone where available and then shift to in-person visits to reach sampled providers and complete the questionnaire. If a household is screened by phone or in person, the interviewer will be able to provide web access information by email or, if the respondent is willing, complete the interview at that time.

Center-based Provider Screener

As mentioned earlier, we propose that a single physical location or “establishment” be the ultimate sampling unit for the center-based provider survey since a given physical location may have one or more organizations operating ECE programs within its walls. One ECE-providing organization operating at the sampled location will be randomly selected for the survey. Consequently, every location and every organization in the targeted population will have a known probability of selection. For this approach, a screener will be administered to center-based providers (Attachment 2) that will confirm the eligibility of providers located at the sampled addresses as shown on the sampling frame. The questionnaire also screens for the presence of additional eligible providers that are not covered by the sampling frame.

The 2012 NSECE demonstrated that a multi-tiered approach to screening was an efficient investment of resources and resulted in better data quality. An increase in efficiency planned for 2019 is to screen multi-organization addresses in advance, but not addresses associated with multiple programs operated by a single organization. We found in 2012 that these multi-program/single organization centers, which are a significant portion of the sample, do not require advance screening for smooth administration of the web questionnaire.

Center-based Provider Questionnaire

Sampled locations will receive an initial mailing containing an advance letter that explains the purpose of the study, the reason for their selection, the web questionnaire URL, a login ID, and a password (Attachment 7A). Two weeks later a thank you/reminder postcard will be sent to sampled providers who have not yet responded, again requesting their participation if they have not yet completed the questionnaire (Attachment 7B). The third mailing will follow approximately two weeks later to sampled providers who have not yet responded, informing them that a field interviewer will be contacting them in the near future (Attachment 7C). Interviewers will then facilitate completion of the web questionnaire. The main goals of these contacts are to confirm eligibility, increase awareness of the study, and identify the most knowledgeable individual to complete the questionnaire. Field interviewers will also use this time to collect additional contact information (email, phone) from sampled providers for further follow-up. Interviewers will collect this information using the integrated case management system on their tablets. Although a majority of center-based providers (approximately 87 percent) completed the 2012 NSECE questionnaire by web, a large portion of these cases (approximately 75 percent) required at least one in-person visit by a field interviewer. If needed, field interviewers will be able to complete the questionnaire in person or over the phone.

Classroom Staff (Workforce) Questionnaire

The classroom staff (workforce) survey sample will be selected from a roster of classroom or group staff within a center-based facility where an administrator, director, or other leader has completed the center-based provider questionnaire. The classroom staff sample will be selected by first randomly selecting one age group served by the center that includes children age five and under, not yet in kindergarten. We will then enumerate all of the classrooms at the center that serve that age group and randomly select one of them. The classroom staff (workforce) questionnaire will be available for completion by web, by hard-copy self-administration, or by phone or in person with a field interviewer.

Upon completion of a center-based provider questionnaire, the survey software will automatically spawn a case for a randomly selected classroom/group-assigned staff person for the classroom staff (workforce) survey. A pre-selected subset of centers will have two staff members selected per classroom/group in order to increase the total number of individuals sampled for the classroom staff (workforce) survey. The mode in which the center-based provider questionnaire was completed, and the history of prompting contacts between the field interviewer and the original center-based provider respondent, will determine the initial outreach method for the newly sampled individuals, as sketched below. The center-based questionnaire is programmed to randomly select the sampled staff, who will be identified to the field interviewer immediately upon completion of the questionnaire with the center-based provider respondent, so that the field interviewer may alert the respondent that an additional survey (or two) is requested.

If the center-based provider questionnaire is completed online, the sampled classroom staff will be contacted by mail (Attachments 9A/11A) and asked to complete the web version of the classroom staff (workforce) questionnaire. Following the procedures for the center-based provider questionnaire, members of the classroom staff (workforce) sample will receive up to three mailings (sent to the address of the provider) to solicit participation (Attachments 9A-C/11A-C). A field interviewer will contact sampled staff who have not completed the survey after the initial three mail contacts by telephone to gain cooperation and prompt them to complete the questionnaire by web. If the center-based provider questionnaire is completed in person, the field interviewer will attempt contact with the sampled classroom staff at that time. If sampled individuals are not immediately available when the center-based provider questionnaire is completed, they will be invited to complete the web version of the classroom staff (workforce) questionnaire and will receive the same three mailings described above (Attachments 9A-C/11A-C) before the case is assigned to a field interviewer for prompting.
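The initial outreach decision for a spawned classroom staff case can be summarized in a small sketch; the argument names are hypothetical stand-ins for the actual case-management fields.

```python
def initial_workforce_outreach(center_mode, staff_available_on_site=False):
    """First outreach step for a newly spawned classroom staff case."""
    if center_mode == "web":
        return "mail invitation to the web questionnaire (Attachment 9A/11A)"
    if staff_available_on_site:
        # Center questionnaire completed in person with staff present.
        return "immediate in-person contact by the field interviewer"
    return "web invitation plus up to three mailings, then interviewer prompting"

print(initial_workforce_outreach("in person", staff_available_on_site=True))
```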

3. Methods to Maximize Response Rates and Address Nonresponse

Maximizing Response Rates

Several issues in the 2019 NSECE data collection make it challenging to achieve high response rates: the limited data collection period, the high ineligibility rate of households and home-based providers, and the demanding schedules of all four sample groups. Current phone technologies make reaching potential respondents difficult, and concerns about privacy make them less willing to participate. For center-based providers, concerns include reluctance to disclose competitive information such as staff qualifications and prices charged to parents, burden from various regulatory reporting requirements, site visits and inspections, and other government-supported data collections.

The 2019 NSECE project’s approach to maximizing response rates will emphasize reducing the perceived costs of participation to sampled individuals, and increasing the perceived benefits of doing so. The mixed-mode data collection approach offers respondents a range of methods and therefore locations and times for participating in the 2019 surveys. Clear communication of privacy protections should further reduce the perceived costs of participation. We will also closely monitor data collection production to understand on-going sample representativeness.

Below are some steps that will be taken prior to the start of main data collection to facilitate cooperation across all sample types.

  • Compelling contact materials: Contact materials are designed to foster a successful first encounter with each individual by communicating the importance of the study for the different sample groups and anticipating concerns likely to prevent participation.

  • Strategic interviewer trainings: Because the first few seconds of each call or in-person visit are crucial, the project conducts innovative interviewer trainings designed to produce effective interviewers equipped with skills and information to build rapport with potential respondents and avert refusals.

  • Honorarium plan: ACF includes the use of honoraria as a part of the overall data collection strategy based on a number of experiments conducted during the 2012 NSECE field test and on outcomes of the 2012 NSECE data collection. These strongly suggest that honoraria are an important tool in securing cooperation with professional staff. Please refer to Supporting Statement A for more details.

  • Incentive plan: ACF also incorporates incentives and in-kind gifts, such as crayons, coloring books, or small plants, to help open doors and begin conversations. These small gifts build rapport and trust with respondents who are reluctant to participate, help interviewers obtain additional contact information for follow-up, and give interviewers fresh reasons to re-contact respondents. Please see Supporting Statement A for more information.


In addition to planning in advance of data collection, we will implement a systematic production monitoring effort during data collection that will also contribute to maximizing response rates. Specifically, data collection progress will be monitored using automated cost and production mechanisms for tracking sample and oversample targets, including monitoring sample progress and representativeness, and where necessary, diagnosing issues and developing plans to mitigate unwanted trends. Exhibits 7 and 8 demonstrate for the household survey some types of analyses that we will conduct on an on-going basis to understand sample representativeness and how key subgroups are responding to different components of the data collection approach. We will implement similar monitoring activities for all sample types.

Our household survey data collection approach makes use of phone numbers matched to the sampled addresses using commercially available resources. Because a matched phone number provides additional opportunities to contact respondents, we wanted to first understand how the availability of matched phone numbers was distributed across racial/ethnic groups. Since Exhibit 7 indicated higher rates of phone match among White Non-Hispanic respondents and lower rates among Hispanic respondents, we then hypothesized that Hispanic respondents might require more effort (such as in-person visits) in order to be recruited. Exhibit 8 explores and supports this hypothesis.

Exhibit 7. 2012 NSECE Household Respondents

Race/ethnicity (columns: % of phone-matched HHs; % of HHs with no phone match; % of all households)

White Non-Hispanic: 60%; 48%; 54%
Black Non-Hispanic: 15%; 17%; 16%
Asian Non-Hispanic: 4%; 5%; 4%
Native Hawaiian or Other Pacific Islander Non-Hispanic: 0%; 0%; 0%
American Indian or Alaska Native Non-Hispanic: 0%; 0%; 0%
Other race Non-Hispanic: 1%; 1%; 1%
Two or more races Non-Hispanic: 1%; 2%; 2%
Hispanic or Latino: 18%; 26%; 22%
Total: 100%; 100%; 100%


Exhibit 8 shows the racial/ethnic composition of low-income households requiring different levels of effort. White non-Hispanic cases were 32 percent of the overall sample of low-income households, and 32 percent of those requiring low, medium, and high effort. Black non-Hispanic households were over-represented among low-effort cases and under-represented among high-effort cases (29 and 21 percent, respectively, compared with their 25 percent overall prevalence among low-income households). In contrast, low-income Hispanic respondents were more likely than other groups to need a higher level of effort, as measured by contact attempts and the length of time between screener and survey completion: 42 percent of high-effort low-income households were Hispanic, compared with 38 percent of all low-income households. Associations of demographic characteristics with level of effort among completed interviews may also suggest patterns among non-interviews, for whom demographic data are not available.

Exhibit 8. Level of Effort for Household Survey Completion by Race for Respondents below the Poverty Line

Race/ethnicity (columns: low effort*; medium effort**; high effort***; all effort levels combined)

White Non-Hispanic: 32%; 32%; 32%; 32%
Black Non-Hispanic: 29%; 24%; 21%; 25%
Asian Non-Hispanic: 0%; 2%; 1%; 1%
Native Hawaiian or Other Pacific Islander Non-Hispanic: 0%; 0%; 0%; 0%
American Indian or Alaska Native Non-Hispanic: 1%; 1%; 1%; 1%
Other race Non-Hispanic: 1%; 1%; 1%; 1%
Two or more races Non-Hispanic: 2%; 1%; 2%; 2%
Hispanic or Latino: 35%; 37%; 42%; 38%
Missing: 0%; 1%; 0%; 0%

*Low effort: one contact and fewer than two days between screener and household survey completion
**Medium effort: 2-7 contacts and between two and seven days between screener and household survey completion
***High effort: four or more contacts and eight or more days between screener and household survey completion
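For transparency about how such effort categories can be operationalized, a minimal sketch follows. Note that the published contact-count bands overlap (2-7 versus four or more), so the tie-breaking order below is an assumption rather than the documented coding rule.

```python
def effort_level(n_contacts, days_between):
    """Classify completed household cases into the Exhibit 8 effort bands.
    High is checked first and medium is the default, which resolves the
    overlap and gaps in the published definitions (an assumption)."""
    if n_contacts >= 4 and days_between >= 8:
        return "high"
    if n_contacts == 1 and days_between < 2:
        return "low"
    return "medium"

print(effort_level(1, 0), effort_level(3, 5), effort_level(6, 12))
```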



Addressing Nonresponse

ACF understands the broad impact that nonresponse can have on a data collection effort, including the potential for lengthening a field period, lowering response rates, and introducing possible bias in the resulting data and estimates. The first line of defense against nonresponse is refusal aversion. Throughout the field period, the 2019 NSECE will make the most of compelling contact materials and online study information to inform the NSECE sample groups about the surveys and motivate participation. These materials will include a detailed project description and purpose, tailored to appeal to the interests of the different groups where possible. They will also contain a toll-free telephone number and project email address should respondents have any questions and wish to contact project staff. The online presence reinforces the study’s legitimacy and provides an easily accessible venue to learn more about the project. In addition, during interviewer training much effort is made to prepare interviewers with both general and project-specific techniques to successfully address respondent concerns.

Refusal aversion and conversion techniques are initially introduced and practiced in training sessions so that interviewers are well-versed in NSECE-related facts and in the sample frame methodology by the time they start interviewing and can successfully address respondent concerns. Throughout the training process, interviewers are educated as to the relatively short time period they have to gain an individual’s cooperation and that the process of averting refusals begins as soon as contact is made with a potential respondent. Although the initial contact with an individual at the sampled household or provider location may not be the person who ultimately completes the screener or detailed questionnaire, the way he or she is treated can directly impact the willingness of others to participate and provide quality data.

Analysis of Nonresponse Bias

As part of the final 2019 NSECE data preparation and analysis, the project will conduct a nonresponse bias analysis to evaluate the impact of nonresponse across the samples and across modes of response to the different surveys. The nonresponse bias analysis will examine unit nonresponse across the different surveys by mode, address nonresponse bias overall, and describe the weights constructed for estimation. Finally, the analysis will include an assessment of nonresponse bias in the 2019 NSECE, including the specific types of analyses, if any, that might be affected by nonresponse. We will submit this analysis to OMB as a non-substantive change when it is completed in 2020.
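One standard check that such an analysis can include is comparing a frame covariate, known for respondents and nonrespondents alike, between the full released sample and the respondents. The sketch below is illustrative only; the covariate and all case records are fabricated.

```python
import statistics

def frame_vs_respondents(cases, covariate):
    """Compare the mean of a frame covariate (e.g., a tract poverty-density
    flag) between the full released sample and the respondents; a large
    gap signals potential nonresponse bias on that dimension."""
    all_vals = [c[covariate] for c in cases]
    resp_vals = [c[covariate] for c in cases if c["responded"]]
    return statistics.mean(all_vals), statistics.mean(resp_vals)

sample = [  # fabricated cases: 1 = high-poverty-density tract
    {"responded": True, "high_density": 1},
    {"responded": False, "high_density": 1},
    {"responded": True, "high_density": 0},
    {"responded": True, "high_density": 0},
]
full, resp = frame_vs_respondents(sample, "high_density")
print(f"full sample: {full:.2f}  respondents: {resp:.2f}")
```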

4. Tests of Procedures or Methods to be Undertaken

Testing of Questionnaire Items

Questionnaire items for the 2019 NSECE will include (1) intact versions of items from the 2012 questionnaires; (2) revised items fielded in the 2012 NSECE, and (3) new items developed to support research questions specific to the 2019 NSECE. All questionnaire documentation, including a high-level summary of each 2019 questionnaire, an item-level table comparing the 2012 and 2019 NSECE questionnaire items, and the individual questionnaires can be found in Attachments 1-6b and 14-15b. Please refer to the table of attachments to locate each respective questionnaire by sample type and language.

Items from the 2012 Questionnaires. Since it is an agency priority to maximize comparability of estimates between the 2012 and 2019 NSECE, the majority of questionnaire items for the 2019 NSECE correspond to items from the 2012 NSECE. Items from the 2012 NSECE questionnaires were reviewed and approved by OMB for the 2012 NSECE, which was fielded from November 2011 through June 2012. The 2012 NSECE questionnaires were developed during the Design Phase preceding the project's implementation. The twenty-eight-month Design Phase included a substantial review of the literature, an extensive compendium of measures from ECE-related surveys, multiple rounds of cognitive interviews per respondent type, and a feasibility test in which all questionnaires were administered to eligible samples using the modes and technologies planned for the main study. (The Design Phase cognitive interviews were conducted under a generic OMB clearance package; the Design Phase feasibility test was conducted under OMB # 0970-0363.)

Revised and New Items. Revised and new items have been proposed by the contract team, OPRE, Content Advisory Team (CAT), consultants to the contract team, the Technical Expert Panel (TEP), and other federal agencies. Some revisions to items are non-substantive, such as updates to the definition of last year (from 2011 to 2018). Many of the newly adopted items are new to the NSECE but have been included in other surveys and cognitively tested elsewhere, such as self-rated health or the CES-D depression scale. Some additional new items for 2019 were developed and tested for the 2012 NSECE but not included that year due to administration time constraints. New or revised items that have not exhibited adequate item functioning through one of these mechanisms have undergone cognitive testing with fewer than 10 people as part of the current contract. The 2012 NSECE measured ECE staff well-being using the Kessler Distress Scale (K6). The CES-D is equivalent to the Kessler scale in its content, potential sensitivity level, and burden for respondents. The key advantage of the CES-D is its wide use across multiple subpopulations, providing relevant baseline comparisons to ECE staff included in the NSECE. We do not expect any impacts on response rates due to differential sensitivity or burden of the CES-D in 2019 relative to the Kessler in 2012.



The project team conducted cognitive testing of items for the provider and classroom staff surveys in February and March 2018. (The 2019 household survey contains very limited revisions compared to the 2012 household survey, and most revisions involve changes in skip patterns, so no additional cognitive testing was completed.) For the provider surveys, we recruited three center-based providers, four home-based providers, and five classroom/group staff (workforce) participants via word of mouth, advertisements, and outreach to local child care networks in the Chicago and San Francisco Bay areas. The same question was not asked of more than nine people. The project team screened for eligible provider and classroom staff (workforce) participants, aiming for a diverse pool, and used findings from the cognitive interviews to further revise the wording of new and updated questionnaire items. For more details, please refer to the questionnaire documentation. Item-level tables comparing the 2012 and 2019 NSECE questionnaire items can be found in Attachments 1, 3, 5, and 14. Please refer to the table of attachments to locate questionnaire documentation by sample type.



5. Individual(s) Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Dr. Kirk Wolter

Executive Vice President,

NORC at the University of Chicago

55 East Monroe Street

Chicago, IL 60603

(312) 759-4000

NORC at the University of Chicago conducted the sample design, and will also conduct the data collection for the proposed 2019 surveys.

Reference List

AAPOR Task Force on the Future of U.S. General Population Telephone Survey Research. (2017). The future of U.S. general population telephone survey research. Prepared for the AAPOR Council under the auspices of the AAPOR Standards Committee. Retrieved from https://www.aapor.org/getattachment/Education-Resources/Reports/Future-of-Telephone-Survey-Research-Report.pdf.aspx

O'Muircheartaigh, C., English, E., & Eckman, S. (2007). Predicting the relative quality of alternative sampling frames. Chicago, IL: National Opinion Research Center at the University of Chicago. Retrieved from http://ww2.amstat.org/sections/srms/proceedings/y2007/Files/JSM2007-000575.pdf













