
Evaluation of the DP18-1801 Healthy Schools Program

OMB: 0920-1302


SUPPORTING STATEMENT FOR THE


EVALUATION OF THE DP18-1801 HEALTHY SCHOOLS PROGRAM


PART B





Submitted by:

Sarah M. Lee, PhD

School Health Branch

Division of Population Health

National Center for Chronic Disease Prevention and Health Promotion

4770 Buford Hwy, NE Mail stop K78

Atlanta, GA 30341
770-488-6162 (voice); 770-488-5964 (fax)

E-mail: [email protected]

Centers for Disease Control and Prevention
Department of Health and Human Services

06/08/2020


List of Attachments

Attachment A: SEA Initial Contact Approval Request Email

Attachment B: LEA Initial Contact Approval Request Email

Attachment C: School Initial Contact Approval Request Email

Attachment D: SEA Survey Respondent First Email

Attachment E: LEA Survey Respondent First Email

Attachment F: School Survey Respondent First Email

Attachment G: SEA Interview Scheduling Email

Attachment H: LEA Interview Scheduling Email

Attachment I: School Interview Scheduling Email

Attachment J: SEA Survey Reminder Email

Attachment K: LEA Survey Reminder Email

Attachment L: School Survey Reminder Email


Section B: Collections of Information Employing Statistical Methods

  1. Respondent Universe and Sampling Methods

The respondent universe for this evaluation varies by data collection method and is described in Exhibit 1 below. For the web-based surveys, the respondent universe consists of 3 State Education Agencies (SEAs) funded by the cooperative agreement DP-18-1801, all targeted Local Educational Agencies (LEAs) within each of the 3 SEAs (up to 30 LEAs), and a subset of schools within the targeted LEAs (up to 210 schools). Only one person per institution will be asked to respond to the web-based survey.

For the key informant interviews, the respondent universe consists of the same 3 SEAs (3 respondents per SEA = 9 SEA staff total), a subsample of two LEAs per SEA with two staff per LEA (6 LEAs total = 12 LEA staff total), and a subsample of three schools from each of the two selected LEAs with three staff per school (18 school staff per state = 54 school staff total).

Exhibit 1. Summary of Primary Data Collection Sample Size per Time Point

Implementation Survey (Web-based)

  LEAs
    Sample size per state: All targeted LEAs in the state (up to 10)
    Total sample per time point: Up to 30 LEAs

  Schools
    Sample size per state: 9 schools per targeted LEA
    Total sample per time point: Up to a maximum of 210 schools¹

Key Informant Interviews

  SEA staff
    Sample size per state: 3 SEA staff
    Total sample per time point: 9 SEA staff

  LEAs
    Sample size per state: 2 LEAs, 2 staff per LEA (4 LEA staff per state)
    Total sample per time point: 6 LEAs, 12 LEA staff

  Schools
    Sample size per state: 3 schools in each of the 2 selected LEAs, 3 school staff per school (18 school staff per state)
    Total sample per time point: 18 schools, 54 school staff

ICF will work with the CDC’s technical monitor and other key CDC staff, including project officers (POs), to select and invite the three identified state grantees to participate in the 1801 intensive evaluation. To aid in the selection of the three SEAs, a matrix was developed with information on each SEA’s implementation activities, implementation readiness, geographic diversity, presence of a state-wide coalition, and accessibility of extant data, among other factors. The aim is for the selected SEAs to represent a continuum of capacity and level of implementation and, to the extent possible, a range of 1801 grantee characteristics to maximize the learning and relevance of the evaluation. Once identified, the SEAs will receive an invitation to participate directly from their PO. If an SEA agrees to participate, the PO will establish a primary point of contact (POC) for the evaluation, and ICF will initiate direct communication with the SEA’s POC. Procedures for selecting and recruiting the LEAs and schools are described below for each type of data collection.



Recruitment Methods for the Implementation [Web-Based] Surveys

ICF will email the POC from each selected SEA to identify one person to complete a web-based Implementation Survey on behalf of the SEA (n=3). ICF will also communicate with the SEAs via e-mail to identify and recruit LEAs for this data collection activity (Attachment A). Up to 10 LEAs within each participating state will be invited (Attachment B) to participate in the web-based Implementation Survey (30 maximum). The e-mail invitation will explain the objectives of the evaluation, explain how the LEA was identified, and describe the requested data collection activities for participating LEAs and associated schools. If the LEA agrees to participate, ICF will work with the district-level administrator to identify the appropriate POC within the LEA to respond to the survey (similar to the SEAs).

Within the targeted LEAs, nine schools per LEA will be randomly selected and invited to participate in the survey portion of the evaluation: three elementary schools, three middle schools, and three high schools, up to a maximum of 210 schools. Once the schools are selected, ICF will send an approval request e-mail to the principal of each school (Attachment C). The e-mail will explain the objectives of the evaluation and describe the requested data collection activities, including next steps for identifying the appropriate POC within the school to respond to the survey.
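
As an illustration of the stratified random selection described above, the sketch below draws three schools from each school level within a single hypothetical LEA. It is not part of the evaluation plan: the school lists, function name, and random seed are assumptions, and ICF’s actual selection procedure may differ.

```python
import random

def select_schools_for_lea(schools_by_level, per_level=3, seed=None):
    """Randomly draw `per_level` schools from each school level within one LEA.

    `schools_by_level` maps a level name ("elementary", "middle", "high")
    to the list of eligible schools in that LEA.
    """
    rng = random.Random(seed)
    selected = {}
    for level, schools in schools_by_level.items():
        # If an LEA has fewer eligible schools than requested at a level,
        # include all of them.
        k = min(per_level, len(schools))
        selected[level] = rng.sample(schools, k)
    return selected

# Hypothetical example for one LEA; school names are placeholders.
example_lea = {
    "elementary": ["Elem A", "Elem B", "Elem C", "Elem D"],
    "middle": ["Middle A", "Middle B", "Middle C", "Middle D"],
    "high": ["High A", "High B", "High C"],
}
print(select_schools_for_lea(example_lea, seed=1))
```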

Once the survey respondents at each level (SEA, LEA, and school) are identified, ICF will communicate with them via e-mail with next steps for accessing the web-based survey (Attachments D, E, and F).

Recruitment Methods for the Key Informant Interviews

ICF will email each SEA’s POC to identify three SEA staff per state (n=9) who work on the implementation of 1801 program activities to participate in key informant interviews (KIIs) (Attachment G). ICF will work closely with each selected SEA to identify two LEAs per state based on factors such as the degree of implementation activities, implementation readiness, overall accessibility, willingness to participate, geographic diversity, or other special circumstances that present valuable learning opportunities for the field. Once the LEAs have been selected, ICF will send an approval request e-mail (Attachment B) to district-level administrators. Two POCs from each of the selected LEAs will then be contacted via e-mail (Attachment H) to participate in the KIIs (n=12). Further, a subsample of three schools (one elementary school, one middle school, and one high school) from each of the two selected LEAs per state will be invited to participate in the case studies, for a total of six schools per state (n=18). This subsample of schools will be selected with LEA input based on implementation readiness and willingness, among other factors. Once the schools are selected, ICF will send an approval request e-mail (Attachment C) to school administrators, including next steps for identifying three interview respondents per school (n=54). Upon approval to participate, the interviewees will be contacted via e-mail (Attachment I) to confirm participation and schedule a day and time for the KII.

We anticipate that response rates will be sufficiently high to produce reliable and valid results. At the organization level, we are targeting participation rates of 100% for SEAs, 90% for LEAs, and 90% for schools; these targets account for the voluntary nature of engagement and are consistent with rates achieved in similar evaluations conducted by the ICF team. At the individual level, we aim for the following response rates based on previous program evaluations conducted by ICF (an illustrative calculation of the completed responses these targets imply follows the list):

  • SEA staff: 95%

  • LEA staff: 90%

  • School staff: 90%
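
For illustration only, the arithmetic below applies these target rates to the maximum sample sizes in Exhibit 1 to show the approximate number of completed responses they would imply. It is an assumption-based sketch, not a projection from the evaluation plan.

```python
# Target individual-level response rates (from the list above).
targets = {"SEA": 0.95, "LEA": 0.90, "School": 0.90}

# Maximum eligible respondents for each data collection (Exhibit 1).
survey_sample = {"SEA": 3, "LEA": 30, "School": 210}
interview_sample = {"SEA": 9, "LEA": 12, "School": 54}

for label, sample in (("survey", survey_sample), ("interview", interview_sample)):
    expected = {level: round(n * targets[level]) for level, n in sample.items()}
    print(label, expected)
# Expected output:
# survey {'SEA': 3, 'LEA': 27, 'School': 189}
# interview {'SEA': 9, 'LEA': 11, 'School': 49}
```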

  2. Procedures for the Collection of Information

No statistical sampling will take place to select the SEAs and LEAs. As noted in the previous section, ICF will work with the CDC’s technical monitor and other key CDC staff to identify a convenience sample of three SEAs based on predetermined factors such as the status of implementation activities, geographic diversity, and the presence of a state-wide coalition. All targeted LEAs within those three states will be invited to participate. Based on the known average number of targeted LEAs per SEA, we do not expect the number of participating schools to exceed 210. Thus, for the web-based survey we anticipate being able to include all available schools in the evaluation. As noted in B1, for the KIIs both the two LEAs per state and the three schools within each selected LEA will be selected on the basis of recommendations from the SEAs and LEAs.

Participants at all three levels (SEA, LEA, and school) will respond to the survey using an online platform (Survey Monkey®). Trained ICF staff will conduct the KIIs via telephone for the initial round of data collection; interviews may be conducted by telephone or in person for the second round. Interviews will be recorded and later professionally transcribed.

  3. Methods to Maximize Response Rates and Deal with Nonresponse

To maximize response rates, the three selected SEAs will be personally invited to participate by their POs. The SEAs will help to identify, invite, and encourage participation at the district (LEA) and school levels. We anticipate that the relationships established between these entities will facilitate response and reduce the likelihood of non-response. By garnering support from superintendents and school administrators, we expect a high initial response rate. We will employ multiple rounds of follow-up contact, including follow-up e-mails, phone calls, and hard copies of instruments as necessary. ICF will send e-mail reminders (Attachments J, K, and L) one week prior to the due date for submission of the web-based survey, and two additional reminders to participants who do not submit a response by the due date. ICF also plans to address non-response by leveraging its relationships with the SEAs. Furthermore, the data collection process is designed to be low-burden, and modest incentives will be offered to LEA- and school-level participants to compensate them for their time and effort in responding to evaluation activities (see incentives in Supporting Statement A, Section A9).

  4. Test of Procedures or Methods to be Undertaken

ICF conducted an informal test of all instruments with ICF staff experienced with schools, districts, and state education agencies. Each instrument was tested by two staff members to gather feedback on content, format, readability, and flow. ICF used the pilot test feedback to refine a few of the survey questions and interviewer instructions, as well as to adjust skip patterns and formatting in Survey Monkey®.

  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The following individuals at ICF are responsible for collecting the data, overseeing the statistical aspects of the project, and analyzing the data, while CDC staff provide oversight:

  • Isabela Ribeiro Lucas, PhD (Contractor), Task Lead

404.592.2155; [email protected]

  • Dana Keener Mast, PhD (Contractor), Project Director

404.592.2206; [email protected]

  • April Carswell, PhD (Contractor), Data Collector and Data Analyst

404.592.2132; [email protected]

  • Syreeta Skelton-Wilson, MPA (Contractor), Data Collector and Data Analyst

404.592.2115; [email protected]

  • Megan Brooks, PhD (Contractor), Lead Data Analyst

651.330.6085; [email protected]

  • Sarah M. Lee, PhD (CDC), Technical Monitor

770.488.6126; [email protected]

  • Seraphine Pitt Barnes, PhD (CDC), Health Scientist

770.488.6115; [email protected]

  • Adina Cooper, PhD (CDC), ORISE Fellow

404.718.6628; [email protected]

¹ Based on the average number of targeted LEAs per SEA, we do not expect the sample to exceed 210 schools.



