Comprehensive Literacy Program Evaluation: Comprehensive Literacy State Development (CLSD) Program Evaluation

OMB: 1850-0945
Office of Management and Budget Clearance Request:

Supporting Statement Part B—Statistical Methods (DRAFT)

Comprehensive Literacy Program Evaluation: Striving Readers Implementation Study




PREPARED BY:

American Institutes for Research®
1000 Thomas Jefferson Street, NW, Suite 200

Washington, DC 20007-3835


PREPARED FOR:

U.S. Department of Education

Institute of Education Sciences



August 2018









202.403.5000 | TTY 877.334.3499

www.air.org

Introduction

This package requests clearance from the Office of Management and Budget to conduct data collection activities associated with the legislatively mandated evaluation of the Striving Readers Comprehensive Literacy (SRCL) program. The purpose of this evaluation is to provide information to policymakers, administrators, and educators regarding the implementation of the SRCL program, including grant award procedures, technical assistance, continuous improvement procedures, and literacy interventions at the school level. Data collection will include interviews with state-level grantees and district, school, and teacher surveys. In addition, the study team will conduct site visits to 50 schools and observe instruction in 100 classrooms using SRCL-funded literacy interventions. The study team also will collect and review grantee and subgrantee applications and comprehensive literacy plans.

Clearance is requested for the grantee interviews, surveys, fidelity site visits, and collection of extant data; this statement describes the purpose, sampling strategy, data collection procedures, and data analysis approach for each.

The complete OMB package contains two sections and a series of appendices, as follows:

  1. OMB Clearance Request: Supporting Statement Part A—Justification

  2. OMB Clearance Request: Supporting Statement Part B—Statistical Methods [this document]

  3. Appendix A—Grantee Interview Protocol and Consent Form

  4. Appendix B—Subgrantee Questionnaire and Consent Form

  5. Appendix C—Principal Questionnaire and Consent Form

  6. Appendix D—Teacher Questionnaire and Consent Form

  7. Appendix E—Principal Interview Protocol and Consent Form

  8. Appendix F—Reading Specialist Interview Protocol and Consent Form

  9. Appendix G—Teacher Pre-/Post-Observation Interview Protocol and Consent Form

  10. Appendix H—Request for Lists of Subgrantee Districts and Schools

  11. Appendix I—Request for Student Achievement Data

  12. Appendix J—Request for Teacher Rosters in Schools Sampled for Survey Administration



Supporting Statement for Paperwork Reduction Act Submission

This package requests clearance from the Office of Management and Budget (OMB) to conduct data collection activities for the legislatively mandated evaluation of the Striving Readers Comprehensive Literacy Program. The Institute of Education Sciences, within the U.S. Department of Education, awarded the “Comprehensive Literacy Program Evaluation” contract to conduct this evaluation to American Institutes for Research (AIR) and its partners Abt Associates, National Opinion Research Center (NORC), and Instructional Research Group (IRG) in May 2018.

In recent years, educational policy has focused on college and career readiness, but many U.S. students still do not acquire even basic literacy skills. Students living in poverty, students with disabilities, and English learners (ELs) are especially at risk. By grade 4, there is a substantial gap in reading achievement between students from high- and low-income families (as measured by eligibility for the national school lunch program). According to the National Assessment of Educational Progress, the average student from a high-income family is at about the 65th percentile of the distribution of reading achievement, whereas the average student from a low-income family is at about the 35th percentile. Gaps by grade 8 are only slightly smaller. Average grade 4 scores for students with disabilities (17th percentile) and ELs (18th percentile) are even lower than for low-income students.1

To narrow the gap in literacy between disadvantaged students and other students, in 2011 the federal government launched the Striving Readers Comprehensive Literacy (SRCL) program. SRCL is a discretionary federal grant program authorized as part of Title III of Division H of the Consolidated Appropriations Act of 2016 (P.L. 114-113) under the Title I demonstration authority (Part E, Section 1502, of the Elementary and Secondary Education Act [ESEA]). The goal of SRCL is to advance the literacy skills, including preliteracy and reading and writing skills, of children from birth through grade 12, with a special focus on improving outcomes for disadvantaged children, including low-income children, ELs, and learners with disabilities. SRCL is designed to achieve these goals by awarding grants to state education agencies (SEAs) that in turn use their funds to support subgrants to local education agencies (LEAs) or other nonprofit early learning providers to implement high-quality literacy instruction in schools and early childhood education programs. Ultimately, this enhanced literacy instruction is the mechanism through which student reading and writing are expected to be improved.

This submission requests clearance to conduct data collection for an implementation evaluation of the SRCL grants given to 11 states in 2017, totaling $364 million. The implementation evaluation will describe the extent to which (a) the activities of the state grantees and of the funded local subgrantees meet the goals of the SRCL program and (b) the literacy instruction in the funded subgrantees’ activities reflects the SRCL grant program’s definition of high-quality, comprehensive literacy programming. Most SEAs are expected to give awards to subgrantees at the end of the first grant year, with LEAs implementing SRCL literacy instruction in a selected set of schools and early childhood centers in the second and third grant years. The implementation study will cover SEA and LEA activities over the entire three-year grant period. The analyses for this evaluation will draw on the following data sources:

  • Grantee interviews

  • Surveys of subgrantees, principals, and teachers

  • Fidelity site visits, including classroom observations and interviews with principals, reading specialists, and teachers

  • Extant data, including documents (grantee applications, state requests for subgrant applications, subgrant applications, and literacy plans) and reading/English language arts standardized test score data. The study team will also ask grantees to provide lists of all subgrantees.

Description of Statistical Methods

1. Sampling Design

This study will include the following two samples, which will provide different types of data for addressing the study’s research questions.

  • Survey samples. The study team will determine the universe of all the Striving Readers Comprehensive Literacy (SRCL) subgrantees, and within these, identify samples of schools and teachers to be surveyed.

  • Sample for fidelity site visits. A nested sample of 100 classrooms in 50 schools within 15 subgrantee districts will be selected to gauge the fidelity with which selected interventions are being implemented.

Survey Samples

The study aims to obtain responses from the full population of about 300 subgrantees (which generally are LEAs but may include some consortia of multiple LEAs) for the subgrantee survey. The study also aims to obtain responses from a sample of about 500 schools for the school survey and from 3,146 teachers for the teacher survey. We estimate that samples of 600 schools and 3,700 teachers will be needed to meet these response goals.

Subgrantee Sample. Grantees will be asked to provide a complete list of all the subgrantees within each of their respective states, including district names, in the event that the subgrantees are consortia. The comprehensive list of all subgrantees in all funded states will constitute the sampling frame for the subgrantee survey. We will include all subgrantees in all 11 grantee states.

School Sample. Grantees will be asked to provide a listing of schools and early childhood centers where funding was allocated. In the event that grantees are not able to provide the names of funded schools, the study team will request this information from subgrantees. The comprehensive list of all funded schools will constitute the sampling frame for schools.

Based on information obtained from the states, the school districts, and the Common Core of Data (CCD), we will classify all schools on the sampling frame into one of five non-overlapping sampling strata, as follows:

  1. Early childhood centers (ECC)

  2. Elementary schools without early childhood programs (ESO)

  3. Elementary schools with early childhood programs (ESW)

  4. Middle schools (MS)

  5. High schools (HS).

For purposes of sampling, schools will be assigned to one and only one stratum as follows:

  • ECC: includes pre-elementary center-based care, formal home-based care, state- and locally funded pre-K, Early Head Start, and Head Start.

  • ESO: lowest grade is between kindergarten and Grade 4; may include any higher grades up to and including Grade 12; and the CCD variable G_PK_OFFERED indicates pre-kindergarten is not offered or not reported.

  • ESW: lowest grade is between pre-kindergarten and Grade 4; may include any higher grades up to and including Grade 12; and the CCD indicates pre-kindergarten is offered.

  • Middle school: lowest grade is between Grade 5 and Grade 8; may include higher grades up to and including Grade 12.

  • High school: lowest grade is between Grade 9 and Grade 12.

For purposes of sampling, the strata must be non-overlapping, and each school in the population must be assigned to a unique stratum. While the strata could, hypothetically, be defined in slightly different ways, we believe the strata we have proposed here will best achieve the analytical objectives of the survey. ESO and ESW are identical except the latter includes pre-kindergarten and the former does not.

For purposes of later analysis of the survey data, schools may be classified in various other ways to meet the objectives of the data analyst.

Based on information obtained from the states, the school districts, and the CCD, we will assign an urban-rural geographic code to each of the schools on the sampling frame based on the county in which the school is located, as follows:

  1. Large central metro counties in a Metropolitan Statistical Area (MSA) of 1 million or more population that (i) contain the entire population of the largest principal city of the MSA, (ii) are completely contained within the largest principal city of the MSA, or (iii) contain at least 250,000 residents of any principal city in the MSA.

  2. Large fringe metro counties in an MSA of 1 million or more population that do not qualify as large central metro counties.

  3. Medium metro counties in an MSA of 250,000-999,999 population.

  4. Small metro counties in an MSA of less than 250,000 population.

  5. Micropolitan counties in micropolitan statistical areas.

  6. Noncore counties in neither metropolitan nor micropolitan statistical areas.

Within each of the five sampling strata, we will sort the sampling frame of schools by state, urban-rural geographic code, and school district. The survey of schools will be designed to achieve 500 completed school questionnaires. The sample size will be allocated to the five sampling strata as shown in Exhibit 3.

Exhibit 3. Sample Size by Sampling Strata

Stratum   Completes   Reserves   Total Sample to be Selected
ECC             100         20                           120
ESO             200         40                           240
ESW              50         10                            60
MS               75         15                            90
HS               75         15                            90
TOTAL           500        100                           600

In addition to the base selection of 500 schools, we will select a “reserve sample” of 100 schools, or in other words, we will select 600 schools in total.

We will sample schools independently within each of the five explicit sampling strata using equal-probability systematic sampling. The resulting sample within each stratum may be viewed as a bundle of six systematic subsamples: one of the six will be designated the reserve sample, and the other five will be released for data collection operations. Systematic sampling within stratum effectively imparts an implicit stratification by state and urban-rural status on the sample of schools, ensuring that the sample is appropriately balanced by school level, state, and urban-rural status.
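The selection scheme described above can be sketched as follows. This is a minimal illustration, not production sampling code; the frame contents and sizes are hypothetical, though the ESO stratum totals match Exhibit 3 (240 selections = 200 completes + 40 reserve).

```python
import random

def systematic_sample(frame, n):
    """Equal-probability systematic sample of n units from an ordered frame."""
    k = len(frame) / n                     # sampling interval
    start = random.uniform(0, k)           # random start in [0, k)
    return [frame[int(start + i * k)] for i in range(n)]

def sample_stratum(frame, n_total, n_subsamples=6):
    """Select n_total units, then view the sample as interleaved systematic
    subsamples; the last subsample is held back as the reserve sample."""
    selected = systematic_sample(frame, n_total)
    subsamples = [selected[i::n_subsamples] for i in range(n_subsamples)]
    released, reserve = subsamples[:-1], subsamples[-1]
    return released, reserve

# Hypothetical sorted ESO frame of 1,200 schools; select 240 in total.
eso_frame = [f"school_{i}" for i in range(1200)]
released, reserve = sample_stratum(eso_frame, 240)
```

Because the frame is sorted by state, urban-rural code, and district before selection, the systematic draw spreads the sample across those sort keys, which is the implicit stratification noted above.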

We will attempt to obtain agreement to participate in the survey from each of the schools in the released sample. If a school in the released sample refuses to cooperate with the school survey, we will release a school from the reserve sample for data collection operations. The replacement school will come from the same sampling stratum as the noncooperating school and be as close as possible to the noncooperating school on the ordered list of schools. Replacements will continue to be released, if necessary, until 500 school responses are completed.

Because state schedules for awarding SRCL grants and compiling information on participating schools differ, and because the evaluation timeline is short, it may not be possible to assemble the combined list of schools before schools are to be sampled. As a contingency plan, in the event the listings of participating schools are still arriving on a flow basis at the sampling deadline, we will determine sampling rates for schools in each of the five explicit sampling strata based on a projection of the total number of participating schools expected to eventually appear on the sampling frame within the stratum. Schools will then be sampled systematically within stratum on a flow basis according to these rates. Under this contingency plan, schools within stratum would effectively be sorted by district prior to sampling, but it may or may not be feasible to incorporate sorting by state and urban-rural geocode prior to sampling.

Teacher Sample. The sample of 3,146 completed teacher surveys is designed to achieve a target precision (95 percent confidence interval) of plus or minus 5 percentage points for a dichotomous item, both for the population of teachers as a whole and for each of five reporting domains:

  1. Pre-kindergarten

  2. Elementary (kindergarten–grade 2)

  3. Elementary (grades 3–5)

  4. Middle school (grades 6–8)

  5. High school (grades 9–12)

After sampling schools for the school survey, we will obtain teacher rosters from each of the cooperating schools; these rosters will constitute the sampling frame for teachers. We will classify each teacher on the sampling frame into one of the aforementioned five reporting domains. If a teacher qualifies for more than one domain, he or she will be classified into the first domain in the order listed above.

There may be some instances in which a teacher in the population teaches at more than one school. Because the survey responses will differ based on the school where the teacher is teaching, we intend to treat the teacher-school pair as the sampling and reporting unit for the survey of teachers. This means that all such teachers will appear on the sampling frame once for each of their schools and will be classified into a domain separately within each school. If such a teacher is selected into the sample more than once, as a member of two or more teacher-school pairs, we will conduct the teacher survey for each of the selected pairs.

The questionnaire content is specific to the school where a teacher is teaching. A teacher who teaches at two schools may respond to this questionnaire differently for each school depending on the unique circumstances of the school. Therefore, the survey should make inferences to the population of all teacher-school pairs. This can be accomplished by surveying teachers for each instance in which they are selected into the sample. The characteristics reported in the survey are characteristics of the teacher-school pair, not simply of the teacher nor of the school. That said, we anticipate it will be rare to find a teacher working in more than one school, and even rarer for a teacher to be selected into the sample more than once.
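The frame construction implied above can be sketched with hypothetical roster data. Each teacher-school pair is one frame unit, so a teacher rostered at two schools contributes two units; school and teacher names here are illustrative only.

```python
# Hypothetical rosters collected from cooperating schools.
rosters = {
    "School A": ["T. Alvarez", "B. Chen"],
    "School B": ["B. Chen", "D. Osei"],  # B. Chen teaches at both schools
}

# Build the sampling frame at the teacher-school pair level.
frame = [(school, teacher)
         for school, teachers in rosters.items()
         for teacher in teachers]
# B. Chen appears twice, once per school, and could be surveyed for each pair.
```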

For planning purposes, we assume 150 of the cooperating schools (= 100 ECC schools plus 50 ESW schools) will contain teachers in the pre-kindergarten grade span; 250 of the cooperating schools (= 200 ESO schools plus 50 ESW schools) will contain teachers in the K-2 and 3-5 grade spans; the 75 cooperating MS schools will contain teachers in the MS grade span, and the 75 cooperating HS schools will contain teachers in the HS grade span.

To achieve the target precision requirements, we will select 4 teachers per school from the pre-kindergarten grade span, 2 teachers per school from the K-2 grade span, 2 teachers per school from the 3-5 grade span, 14 teachers per school from the 6-8 grade span, and 14 teachers per school from the 9-12 grade span. Then, assuming an 85% cooperation rate for teachers, we obtain the numbers of completed teacher surveys shown in Exhibit 4.

Exhibit 4. Number of Schools and Teachers by Grade Span

Teacher Grade Span   Cooperating Schools   Teachers to Select Per School   Total Teachers to Select   Completed Teacher Surveys
PK                                   150                               4                        600                         510
K-2                                  250                               2                        500                         425
3-5                                  250                               2                        500                         425
6-8                                   75                              14                      1,050                         893
9-12                                  75                              14                      1,050                         893
TOTAL                                500                                                      3,700                       3,146

The plan is to select an average of 7.4 (= 3,700/500) teachers per cooperating school and to complete an average of 6.3 (= 3,146/500) teacher surveys per cooperating school.

The sample sizes are driven by the precision requirement to achieve a 95 percent confidence interval (±5%) and to produce estimates for each of five reporting grade spans. The precision is a function of three factors: 1) the number of cooperating schools; 2) the number of teachers surveyed per school; and 3) the intra-class correlation (the correlation between teachers within the same school with respect to the questionnaire items). For example, for the grades 9-12 teacher grade span, there will be 75 cooperating schools in the sample (namely, the high schools). Our sample size calculations tell us 893 completed surveys of grade 9-12 teachers would be required to achieve the precision constraint assuming a correlation of 0.1 between two teachers in the same school, which is typical of teacher questionnaire items. If we select 14 such teachers per school, then, after allowing for nonresponse, the sample of 1,050 (= 14 × 75) teachers should provide the required 893 completed surveys. We carried out similar calculations for each of the other four teacher grade spans, and the resulting sample sizes are displayed in Exhibit 4.2
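The precision calculation for the grades 9-12 domain can be reproduced with the standard cluster design-effect formula. This is a hedged sketch: the intra-class correlation of 0.1 is the value assumed above, and p = 0.5 is taken as the worst case for a dichotomous item.

```python
import math

schools = 75          # cooperating high schools
completes = 893       # completed grade 9-12 teacher surveys
icc = 0.1             # assumed intra-class correlation within schools

m = completes / schools                      # average completes per school
deff = 1 + (m - 1) * icc                     # design effect for clustering
n_eff = completes / deff                     # effective sample size
half_width = 1.96 * math.sqrt(0.25 / n_eff)  # 95% CI half-width at p = 0.5
# half_width comes out just under 0.05, meeting the +/- 5-point target
```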

Working with the sampling frame for teachers, we will select teachers independently within each cooperating school and within each teacher domain represented in the school. We will select simple random samples without replacement of teachers within each school/domain using a system of permanent random numbers. If a given school/domain contains fewer teachers than specified in Exhibit 4, we will select all of its teachers. If a school contains domains not expected according to the school’s classification, we will select teachers from those extra domains as well as from the expected domains. If a school lacks a domain expected according to its classification, no teachers will be selected from that domain.
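Selection with permanent random numbers (PRNs) can be sketched as follows. Field names are illustrative; the essential property is that each teacher’s PRN is assigned once and retained across all sampling operations, so taking the n smallest PRNs in a cell yields a simple random sample without replacement.

```python
import random

def assign_prns(teachers, seed=2018):
    """Assign each teacher a permanent random number exactly once."""
    rng = random.Random(seed)
    for t in teachers:
        t.setdefault("prn", rng.random())   # never overwritten once set
    return teachers

def select_in_cell(teachers, n):
    """SRSWOR within a school/domain cell: take the n smallest PRNs.
    If the cell has fewer than n teachers, all are selected."""
    return sorted(teachers, key=lambda t: t["prn"])[:n]

# A cell with 9 teachers and a target of 14: all 9 are selected.
cell = assign_prns([{"name": f"teacher_{i}"} for i in range(9)])
sample = select_in_cell(cell, 14)
```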

Approximately 3,700 selected teachers will be invited to take the surveys.

Estimation for the Surveys

We will prepare estimation weights for each of the three surveys as follows:

School Districts: For the survey of school districts, the estimation weight for the i-th cooperating district is defined simply by w_i = 1/r_c, where r_c is the response rate for districts in the cell, c, that contains the i-th district. (Because the district survey is a census, each district’s probability of selection is 1.) We will investigate the definition of cells later, but at this early juncture we would expect to consider grantee state and urban-rural status in their definition.

Selected Schools: For the survey of schools, we will maintain records of the probabilities of selection, denoted by π_i for the i-th school. The estimation weight for the i-th cooperating school is defined simply by w_i = 1/(π_i × r_c), where r_c is the response rate for schools in the cell, c, that contains the i-th school. We would expect to consider grantee state, urban-rural status, and other known school characteristics in the definition of cells.

Selected Teachers: For the survey of teachers, we will maintain records of the probabilities of selection, denoted by π_i for the i-th school-teacher pair. The estimation weight for the i-th completed survey is defined simply by w_i = 1/(π_i × r_c), where r_c is the response rate in the cell, c, that contains the i-th school-teacher pair. We would expect to consider grantee state, urban-rural status, and other known school and teacher characteristics in the definition of cells.
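The weighting rule above reduces to a single expression; a small illustration with hypothetical selection probabilities and response rates:

```python
def estimation_weight(selection_prob, cell_response_rate):
    """Base weight (inverse selection probability) adjusted for nonresponse
    within the unit's cell."""
    return 1.0 / (selection_prob * cell_response_rate)

# A district in the census (selection probability 1) in a cell with an
# 80 percent response rate:
w_district = estimation_weight(1.0, 0.80)    # 1.25
# A school drawn with probability 0.25 in a cell with a 90 percent
# response rate:
w_school = estimation_weight(0.25, 0.90)
```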

We will prepare standard errors for key statistics resulting from the surveys of schools and teachers. We do not anticipate preparing standard errors for the school district survey because it is to be a census.

Almost all of the key statistics to be produced from the surveys of schools and teachers will be in the form of ratio estimators or differences of ratio estimators. At this early stage in the planning of the surveys, we anticipate use of the Taylor series method for the estimation of all standard errors.
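As a sketch of the Taylor-series (linearization) approach for a ratio estimator R = Y/X, each unit’s contribution is linearized as z_i = w_i(y_i − R·x_i)/X and the variability of the z values is summed. The function below uses a simplified single-stage, with-replacement approximation and made-up inputs; the production estimator would account for the actual stratified, clustered design.

```python
import math

def ratio_and_se(y, x, w):
    """Ratio estimate and Taylor-series standard error under a simplified
    single-stage, with-replacement approximation (illustrative only)."""
    X = sum(wi * xi for wi, xi in zip(w, x))
    Y = sum(wi * yi for wi, yi in zip(w, y))
    R = Y / X
    n = len(y)
    # Linearized (z) values for each unit
    z = [wi * (yi - R * xi) for wi, xi, yi in zip(w, x, y)]
    z = [zi / X for zi in z]
    zbar = sum(z) / n
    var = n / (n - 1) * sum((zi - zbar) ** 2 for zi in z)
    return R, math.sqrt(var)
```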

Sampling for Fidelity Site Visits

The purpose of the optional site visits is to gauge the fidelity with which selected interventions are being implemented and understand contextual factors that can shape local implementation. We will use the site visits to obtain evidence on the fidelity of implementation of five literacy approaches, curricula, or interventions. Although some information relevant to fidelity may be obtained through surveys or interviews, information on the quality of implementation will require direct observation of teacher instruction and interviews with staff.

To choose the five interventions, we will prioritize those most widely used by subgrantees, observing up to 100 classrooms in 50 schools (10 schools per intervention) in approximately 15 districts. The schools selected for the fidelity site visits will be nested within the survey sample and will be identified only after survey data collection is complete. To identify the interventions, we will rely on data from the subgrantee survey to determine the modal interventions. After identifying the top five interventions among all subgrantees, we will identify the schools that use them, using data from the principal survey, which includes a question regarding the literacy interventions, curricula, or other strategies in use.

Our intent is that these 50 schools be nested within 15 subgrantees. Thus, we will identify the subgrantees that have the highest concentration of schools engaged in these five interventions. From these subgrantees, we will seek a random sample of 50 schools that include high percentages of disadvantaged students, per the SRCL guidance. In each of these 50 schools, we will identify two teachers for observation from among those who responded to the teacher survey.
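The selection logic can be sketched with hypothetical survey responses; the intervention letters and school names below are placeholders, not actual SRCL interventions.

```python
from collections import Counter

# Interventions reported on the (hypothetical) subgrantee surveys.
reported = ["A", "B", "A", "C", "B", "A", "D", "E", "F", "B", "C", "D", "E"]
top_five = [name for name, _ in Counter(reported).most_common(5)]

# Schools and the interventions they report on the principal survey.
school_interventions = {
    "School 1": ["A", "X"],
    "School 2": ["F"],
    "School 3": ["B"],
}
eligible = [s for s, used in school_interventions.items()
            if any(i in top_five for i in used)]
# The fidelity sample would then be drawn from the eligible schools.
```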

2. Procedures for Data Collection

The procedures for carrying out survey and fidelity site visit data collection activities are described in the following section.

Survey Data Collection: Preparation

Pre-field Activities. For successful recruitment and data collection, we will employ a cascading process using a case ownership model for consistent outreach and documentation of the recruitment effort and data collection. In this model, the same staff person (Field Managers, reporting to the research team task leads) will “own” specific subgrantees (and all the sampled schools and respondents within these subgrantees) throughout recruitment and data collection. This consistency facilitates relationship building and helps us quickly identify and mitigate any concerns throughout the data collection cycle. The cascading process will allow us to follow the standard hierarchy of research approvals in each district or governing body.

A toll-free line and project email account will be set up prior to any outreach to grantees, subgrantees, and schools in order to ensure effective communication. Both accounts will be monitored in real time during typical business hours, and all contacts will be returned within 24 hours (business days).

Grantee Notification, Subgrantee and School Recruitment. Grantees will be notified of the evaluation and asked to provide a list of schools where funding was allocated. Each grantee (state SRCL director) will be sent a notification letter via Priority Mail that includes information about the study and what will be requested of subgrantees and schools. It will outline the process for the team to conduct outreach to the districts and schools, as well as the timeline of activities. It will also include contact information for project leadership, in the event of questions.

Subgrantee recruitment. The first step in securing the success of this study will be to acquire research approvals at the district level. Districts (subgrantees) typically have a research clearance protocol, often requiring forms to be completed prior to school outreach, even when studies are mandated. We will prepare succinct outreach materials that outline key aspects of the evaluation, such as the study purpose, sampling approach, timeline, and burden. We will notify participants sequentially, starting with grantees, as previously noted. After contacting grantees, we will notify subgrantees of the requirement to participate in the evaluation.

While all of the subgrantees will receive the subgrantee survey, not all will be required to take part in the school-based surveys. A slightly different subgrantee recruitment protocol will be used for subgrantees that have schools drawn into the school sample than for those that do not, given that subgrantees with sampled schools will be asked to provide more extensive information. Once the sample of 500 schools (plus the 100 reserve sample schools) is identified using the sampling protocol, we will reach out to the subgrantees of the sampled schools, notify them of the requirement to participate in the evaluation, and identify the schools in which we plan to administer surveys. We will inform them of the schools selected for participation and complete any requirements needed for research clearance. We anticipate that about 150 of the 300 subgrantees will have sampled schools. For the other 150, we will request participation in the subgrantee survey only.

As part of the recruitment effort, we will send a notification letter via Priority Mail to the district superintendent or research office. This mailing, as with all outreach mailings, will be done in order to bypass email spam filters, and we will follow up via telephone and email within two business days of expected receipt to ensure that the letter was received and to answer any questions. Prior to mailing, as time permits, NORC Field Managers with expertise in similar district and school recruitment efforts will attempt to identify any district-specific research clearance forms or protocols required. To facilitate district-specific research clearance submissions, we will prepare a “generic” research clearance package that will include the study information and purpose, draft questionnaires, the sampling and study design, school outreach materials, data collection procedures, incentive information, and organization (NORC, AIR, etc.) IRB certifications. Having this information at the ready allows us to quickly communicate the research request, complete district-specific forms, and increase the rate of timely approvals. Approvals will be documented both in the Case Management System (CMS) that will be developed for this study and via written (email or hardcopy) correspondence from the district. This approach to securing district participation has been used effectively in multiple recent NORC studies, most recently the School Effectiveness in Indiana (SEI) study.3

School Recruitment: After securing district participation and assistance, we will contact schools, sharing our prepared outreach materials and any governing body approvals to avoid delays in securing agreement. At this time, we also will identify a school coordinator to assist in providing teacher rosters for sampling and prompting teachers and principals for any outstanding surveys during data collection. We will use the CMS database to manage the outreach, and district and school recruitment processes, documenting each step (including every contact).

As subgrantees provide approval, school recruitment within their districts will begin. Advance letters addressed to the school principal will be sent via Priority Mail, with a follow-up email and phone call from the Field Manager within two days of expected receipt. The goal will be to recruit a total of 500 sampled schools. If a school either refuses or cannot participate, a comparable school will be selected from the reserve sample. This will all be documented within the CMS and conveyed to the sampling team.

During recruitment, there are always instances in which selected schools refuse to cooperate or have a change in status (such as a closure, a change in school type, a change in grade allocation, or a merger with another school). While we will try to have information that is as up to date as possible, we plan to select a reserve sample of 100 schools in order to meet the goal of 500 recruited schools. In the case of teachers, where possible, we will select a new teacher to replace any teacher who will have an extended absence, such as medical or maternity leave, or who, for whatever reason, is no longer at the school post-sampling. The sampling team will be alerted in these instances, the reason for replacement will be documented, and a replacement school or teacher will be selected. Once that occurs, the necessary steps to obtain school approval or to provide an advance survey letter will follow.

Survey Data Collection: Administration

All surveys will be administered via an online platform. The surveys will be programmed using the MRInterview platform and, for security purposes, will require a unique PIN/password to log in. As noted earlier, a toll-free line and project email account will be set up prior to any outreach to subgrantees, principals, and teachers. Should respondents encounter problems with the survey, a member of the survey administration team will respond within 24 hours.

Subgrantee Surveys: An advance letter with information about the study and instructions for how to complete the survey, including the survey web link and login information, will be sent via Priority Mail to the designated subgrantee staff person (identified during recruitment). Field Managers and research staff will monitor completion rates, conducting nonresponse prompting via telephone and email. The status of individual cases within a district will be monitored using custom production reports and the CMS, which provides information in real time. Field Managers will conduct email and telephone outreach within two days of expected receipt of the letter and will follow up with up to two contacts (phone or email) per nonresponding subgrantee.

Principal and Teacher Surveys: Upon OMB approval (expected early 2019), we will reach out to the School Coordinator at sampled schools to collect teacher roster data. School Coordinators will be provided a simple form (spreadsheet template) and asked to provide, within 5 business days, the names and characteristics of active teaching staff as specified in the sample design plan, including grade and subject(s) taught. Upon receipt, the roster information will be reviewed for quality and completeness by the Field Manager and study team, with follow-up contacts to the School Coordinator as needed for clarification. Once the roster is final, it will be imported into a master database from which the sampling team will select teachers and maintain the sampling frame. Field Managers will make up to three contacts (email or phone) per school to acquire a complete teacher roster. After teachers are selected for a school, the principal and teachers will be loaded into the CMS, unique IDs and survey PINs/passwords will be generated, and survey advance letters will be prepared for the school.

Once the sampled teachers are identified, we will generate advance letters with unique login information and send those individual letters to each School Coordinator (SC) via Priority Mail for distribution to the individual school staff. Within two days of the expected receipt of the packet of letters at the school, a Field Manager will contact the SC to ensure receipt and distribution and to respond to any questions or concerns. We will follow up with the SC throughout data collection and ask him/her to assist with prompting teachers and principals who have outstanding incomplete surveys (an average of up to four contacts per SC, via email or phone).

Completed School: A school will be considered complete when the principal and all sampled teachers have completed their surveys, or when the project determines that no additional survey data will be obtained.

Procedures for Grantee Interviews

In the fall of 2018 and 2019, primary research staff will conduct one‑hour telephone interviews with the state-level SRCL director for each of the grantee states: nine interviews in the fall of 2018 and 11 in the fall of 2019. Prior to each interview, the interviewer will carefully review the state's SRCL application and RFA to ensure he or she does not ask any questions that could be answered from extant documents. The interview protocol for state SRCL grantees includes questions phrased in a clear, conversational manner that engages the respondent while ensuring consistency across respondents. This approach allows for standardization of data across states but provides adequate opportunity for respondents to elaborate on their responses and describe any unique features of SRCL implementation in their states. A note‑taker will join each interview, and each interview will be recorded digitally. Notes will be summarized following each interview; in the event that exact quotes or verification is needed, the audio file will be available as a backup.

Procedures for Fidelity Site Visit Interviews

To answer questions related to fidelity of implementation of evidence-based reading interventions, the study team will conduct classroom observations in 50 schools nested within 15 districts. At each school site, research staff will conduct interviews with principals, reading specialists, and the teachers whose classrooms the study team observes. (As noted previously, this OMB submission does not seek clearance for the observation component of the study, which imposes no burden and does not require any special preparation on the part of the teacher.) Two researchers will participate in each site visit to facilitate on-site logistics, take notes during interviews, and enable the team to accommodate any last-minute changes.

Training for site visitors. Prior to the fidelity site visits, all site visit staff will convene in Washington, D.C., or by video-conference for a half‑day training session. The site visit team leads will jointly develop and conduct the training. The purpose of the training is to ensure common understanding of the site visit procedures, including the following: pre-visit activities such as reviewing extant data for the site, tailoring protocols, scheduling the visit, and communicating with site contacts; procedures to be followed during visits; and post-visit procedures such as following up with respondents to obtain greater clarification on a topic that was discussed or to collect additional materials or information that were identified during the visit as important for the study. This training for the site visit team also will focus on ensuring a shared understanding of the purpose of the study, the content of the protocols, and interview procedures. In terms of interview procedures, the training will include discussions and role-playing to learn strategies for avoiding leading questions, promoting consistency in data collection, and conducting interviews that are both conversational and systematic.

Prior to the training, the site visit task leaders will develop a site visit checklist that outlines all tasks the site visitors need to perform before, during, and after each visit. All site visit team members will adhere to this checklist to ensure that visits are conducted efficiently, professionally, and consistently.

Scheduling site visits and interviews. Each site visit team will be responsible for scheduling the visits and interviews for their assigned sites. For each site, the teams will work with district and school staff to identify appropriate respondents prior to the site visit and develop a site visit and interview schedule. Because of the anticipated variation in district and school enrollment (among other variables), the job titles and roles of interviewees will vary across sites.

Conducting interviews. In preparation for each visit and to facilitate and streamline the interviews with respondents, site visit teams will consider the reasons each jurisdiction was selected for inclusion in the purposive sample and review any relevant information from state education agency websites as well as other extant data collected on each site. The teams will then annotate each section of the individual interview protocols accordingly. These annotated notes will be used to tailor the wording of each question, as appropriate. When possible, both members of each site visit team will attend all interviews. In addition, each interview will be audio-recorded with permission from the respondent(s).

Data management. In preparation for the site visits and while on site, the study team will use Microsoft OneNote to organize extant data, interview protocols, and audio files. Prior to each visit, the site visit team will enter extant data into OneNote for reference in the field. With respondent permission, all interviews will be audio-recorded through OneNote, which allows typed text to be linked with audio data. This feature of OneNote facilitates transcription as well as easy retrieval of audio data. During each interview, one team member will conduct the interview while the other takes notes. We will seek to structure each day so that there is time in the field to review and supplement interview notes, which can help streamline and focus subsequent interviews with district and school staff based on the information already gathered.

Quality control. The case study site visit and interview data collection process will make use of the following quality control procedures: (1) weekly site visit debriefings among the team to identify logistical and data collection concerns; (2) a formal tracking system to ensure that we are collecting the required data from each site; and (3) adherence to the timely cleaning and posting of interview notes and written observations, as well as interview audio transcripts, to a secure project website for task leaders to check for completeness and consistency.

3. Methods to Maximize Response Rates

Data collection is a complex process that requires careful planning. The team has developed interview protocols and survey instruments that are appropriately tailored to each respondent group and designed to place as little burden on respondents as possible. The team used cognitive interviews with principals and district coordinators to pilot the survey data collection instruments and ensure that they are user‑friendly and easily understandable, which increases respondents' willingness to participate in the data collection activities and thus increases response rates.

Recruitment materials will include the state education agency and the school district’s approval of the study. The materials will emphasize the social incentive to respondents by stressing the importance of the data collection to provide much‑needed technical assistance and practical information to districts and schools. In addition to carefully wording the recruitment materials, district coordinators and principals will be offered varied and sequenced options for completing and submitting the survey.

As previously mentioned, we will employ a “case ownership” approach to enhance communication and rapport with districts and schools. The role of the School Coordinator is a critical one, as s/he will serve as the point person for all study requests and encourage respondent participation in the study. We plan to offer the School Coordinator a $50 incentive for his/her time and effort, as this approach has been used successfully on past studies, including the School Effectiveness in Indiana study,4 for which survey response rates exceeded 95% across public, private, and charter schools. To ensure efficient outreach, we will continue to use the CMS developed for managing recruitment to provide continuity in staffing and tailored follow-up. This will include regular monitoring by the staff person who “owns” the subgrantees for early identification of nonrespondents among schools or groups of teachers, allowing us to coordinate with school leadership to encourage full participation when necessary. We will also work with schools to accommodate their calendars during busy times, such as assessment testing. We have found that this flexibility increases participation, as it acknowledges the burden on schools and their staff.

In addition, the study team plans to offer $25 electronic gift cards as an incentive to teachers to encourage their participation. Once a completed survey has been verified through the web for the identified respondent, the survey contractor will email the respondent a claim code number in a gift card template. We are aware that teachers are the targets of numerous requests to complete data collection instruments on a wide variety of topics from state and district offices, independent researchers, and ED, and that several decades of survey research support the benefits of offering incentives. Specifically, we propose incentives for the teacher surveys to partially offset respondents’ time and effort in completing the surveys. We propose offering teachers a $25 incentive each time they complete a survey, to acknowledge the 25 minutes required to complete each survey. This proposed amount is within the incentive guidelines outlined in the March 22, 2005 memo, “Guidelines for Incentives for NCEE Evaluation Studies,” prepared for OMB. Incentives are proposed because high response rates are needed to ensure the survey findings are reliable, and data from the teacher survey are essential to understanding literacy instruction and professional supports in SRCL-funded schools.

In addition to the incentives, we will, as needed, develop text to be used in non-response prompting. This text will be consistent in messaging and will encourage participation, remind respondents of the incentives for participation, and, most importantly, underscore the importance of the study. We will also create real-time reports (such as a Percent Complete Report) that highlight the schools with the lowest percentage of surveys completed, so that targeted prompting can occur efficiently and effectively.
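As one illustration, a percent-complete report of the kind described above could be computed as in the following sketch. The input structure (a list of per-respondent survey status records with `school` and `complete` fields) is a hypothetical stand-in for a CMS export, not the project's actual system.

```python
def percent_complete_report(records):
    """Summarize survey completion by school, lowest completion first.

    records: list of dicts with hypothetical keys 'school' and 'complete' (bool).
    Returns (school, percent_complete) tuples sorted ascending, so the
    schools most in need of targeted prompting appear at the top.
    """
    totals, done = {}, {}
    for r in records:
        totals[r["school"]] = totals.get(r["school"], 0) + 1
        if r["complete"]:
            done[r["school"]] = done.get(r["school"], 0) + 1
    report = [
        (school, 100.0 * done.get(school, 0) / totals[school])
        for school in totals
    ]
    return sorted(report, key=lambda row: row[1])

records = [
    {"school": "School A", "complete": True},
    {"school": "School A", "complete": False},
    {"school": "School B", "complete": True},
]
print(percent_complete_report(records))  # School A at 50%, School B at 100%
```

Sorting ascending by completion percentage puts the lowest-performing schools first, matching the prompting priority described in the text.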

4. Expert Review and Piloting Procedures

To ensure the quality of the data collection instruments, the study team conducted an initial set of informational interviews, pilot tested the draft instruments, and will convene a Technical Working Group (TWG). As an important step in the instrument development process, the study team conducted a series of informational interviews with current and former SRCL grantees, respecting the limits on the number of respondents allowable prior to OMB clearance (no more than nine). These informational interviews helped refine the organization and content of the survey questions and ensured that the study team drafted items that will yield high-quality data to address the study’s research questions.

These informational interviews do not replace piloting procedures, which constituted the second phase of the instrument development process. The study team conducted cognitive interviews with a limited set of principals and district officials to pilot the survey items, again respecting limits regarding the number of respondents prior to OMB clearance. In addition to providing an estimate of respondent burden time, the cognitive interviews included a debrief with the respondent about survey items or instructions that were difficult to understand, poorly worded, or had other problems. The cognitive interviews were used to revise and improve the survey.

5. Individuals and Organizations Involved in Project

AIR is the prime contractor for the Comprehensive Literacy Program Evaluation, supported through subcontracts to Abt Associates, NORC, and the Instructional Research Group (IRG). The project director is Dr. Jessica Heppen, and the co-Principal Investigators are Dr. Mike Garet and Dr. Barbara Goodson. The project leaders are supported by an experienced team of researchers leading the major tasks of the project. Contact information for the individuals and organizations involved in the project is presented in Exhibit 5.

Exhibit 5. Organizations and Individuals Involved in Project

Co-Principal Investigator: Dr. Michael Garet, AIR, (202) 403-5345
Co-Principal Investigator: Dr. Barbara Goodson, Abt Associates, (617) 349-2811
Project Director: Dr. Jessica Heppen, AIR, (202) 403-5488
Deputy Project Director: Dr. Eric Isenberg, AIR, (312) 283-2409
Implementation Study Lead: Dr. Kerstin Carlson Le Floch, AIR, (202) 403-5649
Feasibility Study, Protocol Development Task Lead: Tim Silva, AIR, (202) 403-6708
Implementation Study, Survey Recruitment and Collection Task Lead: Cynthia Simko, NORC, (312) 759-4066
Implementation Study, Sampling Statistician: Kirk Wolter, NORC, (312) 759-4206
Implementation Study, Evidence Review Task Lead: Dr. Madhavi Jayanthi, IRG, (714) 826-9600
Implementation Study, Site Visit Task Lead: Dr. Mary Jo Taylor, IRG, (714) 826-9600

In addition, the study team has convened a Technical Working Group (TWG) of researchers and practitioners to provide input on the data collection instruments developed for this study as well as other methodological design issues. The TWG consists of members with expertise in literacy, instruction, grant implementation, and evaluation methods. The study team will consult the TWG throughout the evaluation. TWG members include the following:

  • Kymyona Burk, Mississippi Department of Education

  • Cynthia Coburn, Northwestern University

  • Thomas Cook, George Washington University

  • Barbara Foorman, Florida State University

  • Pam Grossman, University of Pennsylvania

  • Carolyn Hill, MDRC

  • James Kim, Harvard University

  • Susanna Loeb, Brown University

  • Timothy Shanahan, Center for Literacy, University of Illinois at Chicago

  • Sharon Vaughn, University of Texas—Austin



1 Author calculations are based on the Nation’s Report Card (see https://www.nationsreportcard.gov/reading_math_2015/#reading?grade=4).

2 We selected the target number of schools at each grade level in part based on the expected number of teachers per school; in general, middle and high schools have more teachers, and thus fewer schools are needed to achieve the desired teacher sample size.

3 http://www.norc.org/Research/Projects/Pages/school-effectiveness-in-indiana-study.aspx. NORC successfully recruited 577 public, private and charter schools across Indiana, in over 200 districts, yielding principal and teacher survey response rates of over 95% each.

4 http://www.norc.org/Research/Projects/Pages/school-effectiveness-in-indiana-study.aspx
