Evaluation of Early Learning Mentor Coaches (ELMC) Grants

OMB: 0970-0399









The Evaluation of Early Learning Mentor Coaches (ELMC) Grants


Supporting Statement for OMB Clearance Request
Part B






June 2011, updated November 4, 2011




B. Statistical Methods

This section provides supporting statements for each of the five points outlined in Part B of the OMB guidelines, for the collection of information in the Evaluation of Early Learning Mentor Coaches (ELMC) Grants.


B1. Respondent Universe and Sampling Methods

To collect information of maximum use in documenting the range of mentor-coaching approaches being used across the ELMC grantees, data collection activities will include online census surveys and telephone interviews with selected sample members. The surveys and interviews will collect information about the grantee and community context; characteristics of mentor-coaches and of the staff who are mentor-coached; the nature of the mentor-coaching approach, including content, structure, process, and quality; mentor-coaching dosage; implementation outcomes; and perceived improvements in the behavior of mentor-coached staff and in grantee programs. Two types of data collection activities will be conducted with three types of individuals, as outlined in Table 1.


Table 1. Overview of Data Collection Activities and Respondents

Data Collection Instrument (estimated time)    | Respondent                                                              | Population (N) / Sample (n)
Grantee Census Online Survey (30 minutes)      | Grantee Director (grantee-level information)                            | Population (N=130)
Mentor-Coach Census Online Survey (30 minutes) | Mentor-Coach                                                            | Population (N=400)
Mentor-Coach Telephone Interview (60 minutes)  | Mentor-Coach                                                            | Sample (n=65)
Staff Telephone Interview (60 minutes)         | Classroom-based teacher, home-based visitor, family child care provider | Sample (n=130)


As can be seen in Table 1, two categories of respondents will complete telephone interviews: mentor-coaches and teaching staff (classroom-based staff, home-based visitors, or family child care providers). The ELMC grantee population consists of 130 Head Start (HS) and Early Head Start (EHS) grantees in 42 states and the District of Columbia. Survey data will be collected from the full population of grantees (Grantee Census Online Survey) and the full population of mentor-coaches at these grantees (Mentor-Coach Census Online Survey). Telephone interview data will be collected from a selection of mentor-coaches and of staff who are mentor-coached. All 65 mentor-coaches selected for an interview will already have completed the mentor-coach census survey; the interview does not repeat survey questions but instead asks more detailed questions about mentor-coaching and about the two teaching staff whom we will interview. For the teaching staff, the telephone interview will be the first and only time they answer questions for the evaluation. The evaluation expectations and incentives will be made clear in initial communications with each grantee, and the expectations will be reiterated when scheduling interviews with individual respondents.


To contact and recruit the respondents, we will follow this recruitment and contact sequence:

  • The Office of Head Start (OHS) will provide the contact list of the 130 ELMC grantees.

  • We will contact the ELMC grantees to complete the online grantee census survey and provide us a contact list of their mentor coaches.

  • We will contact ELMC mentor coaches (estimated n = 400) to complete the online mentor coach census survey. Within each of the 65 selected grantees, we will randomly select one mentor coach to participate in an interview.

  • We will contact the selected ELMC mentor coaches who agree to participate in the telephone interview for a contact list of the staff they work with (n = 130).

  • We will randomly select two teachers from each selected mentor coach's staff list and ask them to participate in the telephone interview.
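The two random-selection steps in the sequence above (one mentor coach per sampled grantee, then two teaching staff per selected mentor coach) can be sketched as follows. The grantee, coach, and staff identifiers are hypothetical placeholders, and the actual evaluation's selection mechanics may differ in detail.

```python
import random

def select_respondents(grantee_coaches, coach_staff, seed=0):
    """Pick one mentor coach per sampled grantee, then two staff per coach.

    grantee_coaches: dict mapping grantee id -> list of mentor-coach ids
    coach_staff:     dict mapping mentor-coach id -> list of staff ids
    Returns (selected_coaches, selected_staff).
    """
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    selected_coaches = {g: rng.choice(coaches)
                        for g, coaches in grantee_coaches.items()}
    selected_staff = {c: rng.sample(coach_staff[c], k=min(2, len(coach_staff[c])))
                      for c in selected_coaches.values()}
    return selected_coaches, selected_staff

# Hypothetical example: two sampled grantees, each with coaches and staff lists.
coaches = {"G1": ["MC1", "MC2"], "G2": ["MC3"]}
staff = {"MC1": ["T1", "T2", "T3"], "MC2": ["T4", "T5"], "MC3": ["T6", "T7"]}
picked_coaches, picked_staff = select_respondents(coaches, staff)
```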


To select a sample of 65 grantees from the population of 130 grantees, we will rely on a 3-by-4 (rural/urban classification by size) stratified sample design (see Exhibit 1). Because we are selecting a stratified random sample of 65 grantees, the sampling fraction is very large (50%), meaning that a simple random sample of the population would likely have produced a sample very similar to the one drawn under this design. It is also important to reemphasize that the purpose of this evaluation is to describe implementation; we will not draw conclusions about the effects of mentor-coaching, either on the whole population or on parts of the population, so concerns about sampling error are lessened. Exhibit 1 shows the population and sample sizes within each stratum of the design. The design selects with certainty all grantees in the four shaded (certainty) strata and all grantees with American Indian/Alaska Native (AIAN) or Migrant and Seasonal Head Start (MSHS) programs in the un-shaded strata, for a total of 15 grantees. The design allocates the remaining sample of 50 grantees across the un-shaded strata in proportion to the number of grantees within each stratum who were not selected with certainty (referred to as "non-certainty" grantees).
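The proportional allocation of the 50 non-certainty selections can be illustrated with a short sketch. A largest-remainder rounding rule is assumed here (the document does not specify one), applied to the eight non-certainty stratum counts shown in Exhibit 1; under that assumption the sketch reproduces the random-selection counts in the exhibit.

```python
import math

def allocate_proportional(noncert_counts, total_sample):
    """Largest-remainder proportional allocation: distribute a fixed sample
    size across strata in proportion to each stratum's non-certainty count."""
    total = sum(noncert_counts)
    quotas = [total_sample * n / total for n in noncert_counts]
    alloc = [math.floor(q) for q in quotas]
    # Hand out the remaining units to the largest fractional remainders.
    shortfall = total_sample - sum(alloc)
    order = sorted(range(len(quotas)), key=lambda i: quotas[i] - alloc[i],
                   reverse=True)
    for i in order[:shortfall]:
        alloc[i] += 1
    return alloc

# Non-certainty counts for the eight non-certainty strata in Exhibit 1,
# reading across the rows: metro 1M+, metro <1M, rural.
noncert = [15, 14, 22, 7, 11, 18, 17, 12]
print(allocate_proportional(noncert, 50))  # -> [6, 6, 10, 3, 5, 8, 7, 5]
```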


Exhibit 1. Sample Design for the Early Learning Mentor Coach Evaluation

Each cell shows the population size, with the number of non-certainty grantees in parentheses, followed by the sample size, with the number selected randomly and the number selected with certainty in brackets: Population (non-certainty) / Sample [random, certainty]. The four certainty strata are those with zero non-certainty grantees.

Rural/Urban Classification                      | Less than 400 funded children | 400 to 1,000 funded children | 1,001 to 5,000 funded children | More than 5,000 funded children | Total
Metro area with one million or more residents   | 17 (15) / 8 [6, 2]            | 15 (14) / 7 [6, 1]           | 22 (22) / 10 [10, 0]           | 3 (0) / 3 [0, 3]                | 57 (51) / 28 [22, 6]
Metro area with less than one million residents | 7 (7) / 3 [3, 0]              | 11 (11) / 5 [5, 0]           | 20 (18) / 10 [8, 2]            | 1 (0) / 1 [0, 1]                | 39 (36) / 19 [16, 3]
Rural area                                      | 20 (17) / 10 [7, 3]           | 12 (12) / 5 [5, 0]           | 2 (0) / 2 [0, 2]               | 1 (0) / 1 [0, 1]                | 35 (29) / 18 [12, 6]
Total                                           | 44 (39) / 21 [16, 5]          | 38 (37) / 17 [16, 1]         | 44 (40) / 22 [18, 4]           | 5 (0) / 5 [0, 5]                | 131 (116) / 65 [50, 15]


In sum, two main variables were used to classify the grantees into strata: rural/urban classification and grantee size. Within each un-shaded stratum, we relied on a systematic sampling procedure to select the sample. Before the sample of non-certainty grantees was drawn, the procedure sorted the records of all non-certainty grantees within the stratum by three variables: (1) program type (Early Head Start, Head Start, EHS/HS, Migrant and Seasonal Head Start, American Indian/Alaska Native); (2) proportion of children who are English Language Learners (50% or more ELL; less than 50% ELL); and (3) program options (whether the program is entirely center-based or offers other options, such as family or home-based child care). For each grantee we then applied the selection probability, that is, the probability of selection into the sample (equal to one for grantees selected with certainty). Lastly, we determined the base weight for each grantee: zero for grantees not selected into the sample, and the inverse of the selection probability for grantees selected into the sample. This sampling procedure ensures that the sample of non-certainty grantees within each un-shaded stratum is balanced across these background characteristics.
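A minimal sketch of the within-stratum procedure described above: sort the non-certainty grantees by the implicit stratification variables, draw a systematic sample, and assign base weights as the inverse of the selection probability (zero for grantees not sampled). The record fields and the six-grantee stratum are illustrative assumptions, not actual grantee data.

```python
import random

def systematic_sample(records, n, sort_keys, seed=0):
    """Sort records by the implicit stratification variables, then take
    every k-th record from a random start (k = N/n), so the sample is
    balanced across the sort variables."""
    frame = sorted(records, key=lambda r: tuple(r[k] for k in sort_keys))
    interval = len(frame) / n
    start = random.Random(seed).uniform(0, interval)
    picks = {int(start + i * interval) for i in range(n)}
    return [frame[i] for i in sorted(picks)]

def base_weight(selected, prob):
    """Base weight: inverse of the selection probability for sampled
    grantees, zero for grantees not selected."""
    return 1.0 / prob if selected else 0.0

# Hypothetical stratum of six non-certainty grantees, sampling three.
stratum = [
    {"id": g, "program_type": t, "pct_ell": e, "center_only": c}
    for g, t, e, c in [
        ("A", "HS", 0.6, True), ("B", "EHS", 0.2, False),
        ("C", "EHS/HS", 0.7, True), ("D", "HS", 0.1, False),
        ("E", "EHS", 0.9, True), ("F", "HS", 0.3, True)]
]
sample = systematic_sample(stratum, 3, ["program_type", "pct_ell", "center_only"])
weights = {g["id"]: base_weight(g in sample, prob=3 / 6) for g in stratum}
```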


B2. Procedures for Collection of Information

The ELMC evaluation will collect information using four data collection protocols: (1) the grantee census online survey; (2) the mentor-coach census online survey; (3) the mentor-coach telephone interview; and (4) the staff telephone interview. The following paragraphs describe each protocol; the mode, frequency, and duration of data collection; and, in the case of the interviews, the identity of the interviewers.



Surveys will collect data from the complete population of grantees and mentor-coaches.

  • Grantee Census Online Survey: This survey will gather descriptive information at the grantee level about the grantee, professional development context, the mentor-coach grant specifically, participant information, challenges to implementation, and efforts towards sustaining the mentor-coaching program. 

  • Mentor-Coach Census Online Survey: This survey will obtain information about the mentor-coach, the mentor-coaching approach, implementation of the grant, goals for mentor-coaching, challenges to implementation, and perceived success of mentor-coaching.


Interviews will collect data from a selection of mentor-coaches and staff who were mentor-coached within the 65 sampled grantees.

  • Mentor-Coach Telephone Interview: This interview will gather more in-depth information about goals for mentor-coaching at the grantee level and the mentor-coach’s own goals, mentor-coaching approach generally and with two specific staff, and overall information about implementation, including support from the grantee and challenges to mentor-coaching. This interview will be conducted via telephone with the selected mentor-coach. The interviewer will be a trained individual from the evaluation team.

  • Staff Telephone Interview: This interview will gather more in-depth information directly from two randomly selected staff who receive mentor-coaching from the randomly selected mentor-coaches within the sampled grantees. The interview will gather participant information; staff perceptions of mentor-coaching, including how they were selected and the goals for mentor-coaching; their experiences with mentor-coaching; their perceptions of their mentor-coach, including their working relationship; and perceived effects of mentor-coaching. This interview will be conducted via telephone with two randomly selected staff who worked with the mentor-coach who participated in a telephone interview. The interviewer will be a trained individual from the evaluation team.


The measures are included as attachments to the supporting statements: Attachment 1 (Grantee Census Online Survey), Attachment 2 (Mentor-Coach Census Online Survey), Attachment 3 (Mentor-Coach Telephone Interview), and Attachment 4 (Staff Telephone Interview).


Quality Control


The evaluation team will institute a variety of methods to ensure the quality of the data collected. All protocols were developed in consultation with OPRE and in collaboration with the technical experts and consultant group, which includes early childhood practitioners and a mentor- coach consultant. Earlier versions of the protocols have also been pilot tested with Head Start and Early Head Start staff.


Evaluation team members will ensure the quality of the survey data collected in the secure electronic data capture system. The team member who programmed the surveys in the Vovici data collection system will be available to answer questions from respondents. All interviewers will attend training in telephone interview techniques and in the questions to be asked in each interview. A reference document will be developed that defines terms used in the interview guides and provides questions and answers about the data collection instruments and procedures. Staff will review strategies for working with diverse populations and for ensuring that data are collected in a culturally sensitive manner.

B3. Methods to Maximize Response Rates and Deal with Nonresponse

We are aiming for at least a 75% response rate on the Grantee Census Online Survey (98 respondents out of a possible 130), a 75% response rate on the Mentor-Coach Census Online Survey (300 respondents out of a possible 400), a 75% response rate on the Mentor-Coach Telephone Interview (49 respondents out of a possible 65), and a 75% response rate on the Staff Telephone Interview (98 respondents out of a possible 130).
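The respondent targets above follow from rounding 75% of each population or sample up to a whole respondent; a quick arithmetic check:

```python
import math

# 75% response-rate targets for the three distinct group sizes
# (130 applies to both the grantee survey and the staff interview).
targets = {pop: math.ceil(0.75 * pop) for pop in (130, 400, 65)}
print(targets)  # -> {130: 98, 400: 300, 65: 49}
```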


Several factors will help ensure a high rate of cooperation and response rates among respondents:

  • Quality of the Protocols: We designed the protocols to be as efficient as possible, obtaining the necessary information without wasting respondents’ time (e.g., topics are tailored to respondent role, and terminology matches that used in the ELMC grant proposals), and respondents will be able to complete the survey at a convenient time, in one or multiple sittings. Consistent with research findings, the web survey will be kept as simple as possible while still being visually appealing; simpler surveys load faster, which can positively affect response rates.1 We will include contact information on the surveys in case respondents require assistance. Interviews will also be as efficient as possible, asking only about information that is essential to addressing the goals of the ELMC initiative.

  • Quality of the Interviewers: Evaluation team interviewers will receive training in telephone interview techniques, ensuring that they know how to establish their credentials and convey the significance of the study at the beginning of the interview.2 They will be fully trained to establish rapport with respondents, provide smooth transitions in the interview, and facilitate complete responses.

  • Prior Notification: As part of regular program communication from the Office of Head Start with the ELMC grantees, all grantees will receive an announcement and letter from OHS in support of the evaluation. All respondents are invested in the issues surrounding mentor-coaching and should be interested in contributing to the knowledge about implementation of these Head Start/Early Head Start mentor-coaching grants. For additional information about notification of respondents, see section B1 of this document.

  • Follow-Up with Non-Respondents: The evaluation team will set a deadline for the web-based survey, which has been found to improve response rates in web-based surveys.3 Follow-up with survey non-respondents, and for scheduling interviews, will be conducted up to three times by email, by phone, and via the grantee director.

  • Special Incentives: The interviewer will offer a social incentive to respondents by stressing the importance of the data collection as part of an evaluation designed to inform future mentor-coaching initiatives and Head Start improvement efforts. Respondents also will be eligible for small incentives for their time in return for participating in data collection activities as described in Section A9 in Supporting Statement A. The evaluation team believes that the use of these incentives in addition to the methods that will be used to increase response rates is justified for a number of reasons:

    • Avoidance of non-response bias – the evaluation team is concerned with maximizing response rates because the purpose of the evaluation is to describe implementation, which requires obtaining a census of all grantees and mentor-coaches and as high a response rate as possible from the selected sample across all of the important stratification variables.

    • Improved coverage of specialized respondents – the evaluation team is collecting data from respondents who work directly with low-income families and children, and some of the programs are migrant or seasonal or serve other low-incidence populations such as American Indian/Alaska Native. These populations are typically excluded from evaluation studies but will be selected with certainty for the ELMC evaluation. The use of a monetary incentive should help to maximize responses from these respondents, especially because the monetary incentive is approximately equivalent to the level of effort required for their response.

    • Decreased amount of time spent doing follow-up with non-respondents – the evaluation team plans to conduct follow-up outreach with non-respondents; however, the use of a monetary incentive may decrease the amount of time spent in follow-up activity, thereby decreasing the overall costs of the information collection.


B4. Tests of Procedures or Methods to be Undertaken

The evaluation team has extensive experience developing survey and interview protocols, particularly around issues related to early childhood professional development and mentor-coaching. Questions related to context were developed by reviewing existing national surveys, and questions related to mentor-coaching and implementation were developed by reviewing previous studies on the same topics. To ensure we developed the most appropriate data collection protocols, they were vetted with the evaluation’s technical experts, the consultant group (which includes practitioners), and Federal staff from OHS and OPRE. Earlier versions of the protocols have also been pilot tested with Head Start and Early Head Start staff.


B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The information for this study is being collected by the American Institutes for Research (AIR) and its subcontractors MEF Associates and NORC, on behalf of the U.S. Department of Health and Human Services (HHS), Office of Planning, Research and Evaluation (OPRE). With HHS-OPRE’s oversight, AIR is responsible for the study design, sampling, data collection, analysis, and report preparation. Key input to the statistical aspects of the design was received from the following individuals:


AIR Team

Eboni Howard, Project Director, AIR

Fiona Helsel, Project Manager, AIR

Mike Fishman, Subcontract Director, MEF Associates

Sharon McGroder, Independent Consultant, MEF Associates

Kirk Wolter, NORC


Department of Health and Human Services

Naomi Goldstein, Director, OPRE

Jennifer Brooks, Senior Social Science Research Analyst, OPRE

Amanda Bryans, Office of Head Start

Wendy DeCourcey, Project Officer, OPRE

Jamie Sheehan, Office of Head Start


Inquiries regarding statistical aspects of the study design should be directed to:

Eboni Howard

American Institutes for Research

1000 Thomas Jefferson, NW

Washington, DC 20007

202-403-5533


The HHS project officer, Wendy DeCourcey, has overseen the design process and can be contacted at:

Wendy DeCourcey

Office of Planning, Research, & Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services

370 L'Enfant Promenade, S.W.

Washington, DC 20447

202-260-2039


1 Dillman, Tortora, Conrad, & Bowker (2001)

2 O’Toole, Sinclair, & Leder (2008)

3 Porter & Whitcomb (2003)


