
Evaluation of the REL West Supporting Early Reading Comprehension through Teacher Study Groups Toolkit

OMB: 1850-0982


REL West Toolkit Efficacy Evaluation


SUPPORTING STATEMENT

FOR PAPERWORK REDUCTION ACT SUBMISSION


PART A: Justification


April 2023



Submitted to:

Institute of Education Sciences

U.S. Department of Education


Submitted by:

RAND Corporation

1200 South Hayes Street

Arlington, VA 22202

(703) 413-1100




























Tracking and OMB Number: (XX) XXXX-XXXX

Revised: XX/XX/XXXX


Overview


The U.S. Department of Education (ED), through its Institute of Education Sciences (IES), requests clearance for the recruitment materials and data collection protocols under the OMB clearance agreement (OMB Number (XX) XXXX-XXXX) for activities related to the Regional Educational Laboratory West (REL West) Program.


Elementary-grade students in U.S. public schools continue to struggle with reading comprehension: only 35 percent of 4th-grade students performed at or above the Proficient level in reading on the National Assessment of Educational Progress (NAEP) (Hussar et al., 2020). To address this problem in the earlier grades, when schools begin reading comprehension instruction, the REL West toolkit development team is developing a toolkit to support teachers in implementing evidence-based instructional strategies to improve reading comprehension among students in grades K–3. The toolkit is based on the IES practice guide Improving Reading Comprehension in Kindergarten Through 3rd Grade (Shanahan et al., 2010) and is being developed in collaboration with state and district partners in Arizona.


The REL West toolkit evaluation team is requesting clearance to conduct an independent evaluation that will assess the efficacy and cost-effectiveness of the school-based professional development resources included in the toolkit. The evaluation will also assess how teachers and facilitators implement the toolkit to provide context for the efficacy findings and guidance to improve the toolkit and its future use. The evaluation will take place in 70 schools across six districts in Arizona and focus on K–3 reading comprehension for all students.


A1. Circumstances Necessitating the Data Collection


As part of the REL solicitation request (Solicitation #91990020R0032), IES required each applicant to develop at least one research-based toolkit to support educators’ use of evidence-based practices, and to conduct an independent efficacy and implementation evaluation of the toolkit.


Per the solicitation:

“IES is invested in developing practitioner-friendly toolkits to help educators use evidence-based practices in classrooms – from preschool through postsecondary settings. Some of the best evidence available is consolidated in the WWC Practice Guides, in which researchers and practitioners review the evidence from the most rigorous studies available, develop recommendations for practice, and create action steps for how to use the recommended practices. To help get this evidence into the hands of stakeholders, RELs shall partner with educators and postsecondary instructors (if relevant) to develop one toolkit based on an assigned WWC Practice Guide, which shall include all materials necessary for effective implementation.”


The toolkit contains the following three parts: 1) Initial Diagnostic and On-going Monitoring Instruments, 2) Professional Development Resources, and 3) Steps for Institutionalizing Supports for Evidence-Based Practice. The solicitation also states that RELs must evaluate the efficacy and implementation of the professional development resources in the finished toolkit. According to the solicitation, “(t)he evaluation shall examine changes in teacher practice and may also include measures of teacher knowledge and/or teacher self-efficacy.”


The purpose of this data collection is to measure the efficacy and implementation of the REL West-developed toolkit designed to improve reading comprehension among students in grades K–3. The toolkit evaluation will produce a report for district and school leaders who are considering strategies to improve reading comprehension in kindergarten through 3rd grade. The report will be designed to help them decide whether and how to use the toolkit to implement the practice guide recommendations. The report will also include information about how to improve the toolkit, even if the efficacy study demonstrates the toolkit had positive effects on teacher and student outcomes, so that the toolkit is as actionable and useful as possible to a wide range of educators.


A2. Purpose and Use of the Data


RAND Corporation is serving as the independent evaluator for the REL West toolkit. The impact, implementation, and cost-effectiveness research questions (RQs) addressed in this study include the following:

  1. What is the impact of the toolkit (including professional development materials and facilitator support) on K-3 students’ reading skills?

  2. What is the impact of the toolkit on Hispanic/Latinx K-3 students’ reading skills?

  3. What is the impact of the toolkit (including professional development materials and facilitator support) on second and third grade students’ reading comprehension?

  4. What is the impact of students’ receipt of reading instruction from a teacher who participated in toolkit-based learning on K-3 reading skills?

  5. What is the impact of the toolkit on teachers’ knowledge and use of the K–3 reading comprehension practices articulated in the practice guide?

  6. To what extent is the impact of the toolkit on students’ reading skills and Hispanic/Latinx students’ reading skills mediated by teacher knowledge and teacher practices after toolkit implementation?

  7. How are the planning, professional learning, and institutionalization activities embedded in the toolkit implemented?

    a. To what extent are the toolkit activities implemented with fidelity to toolkit specification by teachers and facilitators?

    b. How do reading comprehension professional learning experiences differ between teachers in the treatment versus control conditions?

    c. What are the implementation challenges and strategies for addressing these challenges?

  8. What are the costs and cost-effectiveness of implementing the toolkit compared to business as usual?





The impact study will be a school-level, cluster-randomized controlled efficacy trial. The evaluation team will recruit and randomly assign 70 schools across six districts to the treatment condition (toolkit) or business as usual (control) in May 2024, after the collection of consent forms and baseline data (recruitment materials are attached in Appendix A). In schools assigned to the toolkit group, K–3 teachers and their administrators will be invited to use the toolkit materials with the guidance of a school-based facilitator. In control schools, K–3 teachers will not have access to the toolkit until after the study. Both groups will be asked to participate in study data collection using teacher instructional logs, surveys, knowledge tests, school leader and facilitator surveys, and district leader interviews (data collection communication materials are attached in Appendix B). The study will also collect administrative data, including background information on students and teachers, and student reading comprehension assessment scores.
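To make the school-level assignment concrete, the following is a minimal illustrative sketch rather than the study's actual randomization protocol: it assumes assignment is carried out separately (blocked) within each district so that roughly half of each district's participating schools receive the toolkit, and the district labels and school counts shown are hypothetical.

```python
# Illustrative sketch of school-level random assignment within districts.
# Blocking by district and the example data are assumptions for illustration;
# the study's design specifies only that 70 schools across six districts will
# be randomly assigned to the toolkit or control condition.
import random

def assign_schools(schools_by_district, seed=2024):
    """Randomly assign schools to 'toolkit' or 'control' within each district."""
    rng = random.Random(seed)
    assignments = {}
    for district, schools in schools_by_district.items():
        shuffled = list(schools)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        for school in shuffled[:half]:
            assignments[school] = "toolkit"
        for school in shuffled[half:]:
            assignments[school] = "control"
    return assignments

# Hypothetical example: two districts with 12 and 10 participating schools.
example_districts = {
    "District A": [f"School A-{i}" for i in range(1, 13)],
    "District B": [f"School B-{i}" for i in range(1, 11)],
}
print(assign_schools(example_districts))
```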


Table 1 below provides an overview of the data that will be used for this study, including the name of the data set, type of data collection, years covered, observation level, condition (treatment or control group), key measures, and research questions addressed. The primary data collection instruments are attached in Appendix C.

Table 1 – Data Sources, Measures, and Research Question Overview

| Data Source | Data Collection | Dates for Acquiring Data | Observation Level | Condition | Measures | RQ |
| --- | --- | --- | --- | --- | --- | --- |
| Reading Assessment Data | Secondary (district) | February 2024, January 2025, May 2025 | Student | Treatment, control | Reading comprehension scores | RQ1, RQ2 |
| District Administrative Data | Secondary (district) | February 2024, January 2025, May 2025 | Student, teacher, school | Treatment, control | Student characteristics, teacher characteristics, student-teacher links (rosters) | RQ3, RQ4, RQ6, RQ8 |
| Toolkit Platform Statistics | Secondary (IES) | May 2025 | Teacher | Treatment | Teacher access of online toolkit materials | RQ1–RQ8 |
| Training Records and Artifacts | Secondary | May 2024, September 2024, May 2025 | Teacher, facilitator | Treatment | Toolkit implementation | RQ4, RQ7a |
| Teacher Survey | Primary | May 2024, September 2024, May 2025 | Teacher | Treatment, control | Teacher knowledge, professional learning | RQ7 |
| Teacher Practice Log | Primary | May 2024, September 2024, May 2025 | Teacher | Treatment, control | Teacher practices | RQ5, RQ7a, RQ7c |
| School Leader and Facilitator Survey | Primary | May 2024, September 2024, May 2025 | Toolkit facilitator, principal | Treatment, control | Toolkit implementation, cost | RQ5 |
| District Interview | Primary | May 2025 | District administrator | Treatment, control | Toolkit implementation, cost | RQ7a, RQ7c, RQ8 |


Data Collection Activities for Which Clearance is Requested as Part of this Package

District and School Recruitment. The Arizona Department of Education (ADE) will help connect the evaluation team to districts, and the team will leverage REL West's established relationships throughout the state to set up the first meetings. ADE will make initial contact with district leaders, through phone calls or emails, to ask them to review the study communications. Researchers will follow up within a day with informational materials and schedule a time to meet. If district leaders are interested in participating, the evaluation team will ask for their help contacting schools and for their ideas about how the study might fit their schools. Upon district agreement, the team will reach out to school principals. The evaluation team will ask district leaders to hold information meetings for principals. The researchers will then email each school an information package and schedule a school-specific conversation with the principal. If the school principal is interested, the team will hold staff Q&A meetings at each school and informational webinars across schools to provide information directly to teachers and facilitators and to hear their thoughts. Researchers on the team will ask school principals, facilitators, and teachers to review and sign a brief consent statement prior to random assignment indicating that they understand the intervention and the study and will participate to the best of their ability, regardless of the condition to which they are assigned. This is a non-binding agreement. Schools will be included in the study if the principal, the facilitator, and at least one-half of the teachers in each grade make this commitment, and if the district does not require active consent from students/parents for participation in the study.


Teacher Knowledge Assessment and Implementation Survey. The project team will assess the impact of the toolkit on teachers' pedagogical content knowledge of research-based approaches to teaching reading comprehension. The team will use an existing assessment with established reliability and validity, such as the Content Knowledge for Teaching Reading assessment (Phelps, 2009) or the Teacher Knowledge Survey (Jordan, Bratsch-Hines, & Vernon-Feagans, 2018). In addition to the knowledge assessment, the survey will include items measuring implementation of the toolkit and time commitment (for treatment teachers) and teacher background characteristics.


Teacher Logs. Teacher logs will be administered to measure teachers' reported use of the recommended teaching practices. The toolkit evaluation team proposes two rounds of logs, each covering two weeks. Teachers will complete each log with a focus on a randomly selected focal student, who will remain the focal student for all logs in that round. The data in each round of logs should be representative of the classes, but not of the students. The team plans to use the Study of Instructional Improvement Language Arts Log, which was designed for grades 1–5 and which captures the frequency of instructional activities for a sample of students in the class. To ensure data quality, a two-part training session will be conducted with teachers (one hour per segment), and teachers will be asked to study the materials and try the tools between sessions. To help teachers report more accurately, the team plans to use strategies recommended by Rowan and Correnti (2009): ask teachers to record one day at a time, focus on only one student at a time, limit the number of practices covered in the log, and provide training and support on completing the logs. Reliability of the reading comprehension focal unit at the teacher level is 0.74 (Rowan, Camburn, & Correnti, 2004).


School Leader and Facilitator Survey. The school leader and facilitator survey will collect information about implementation, costs, and institutionalization of the toolkit, or on other professional development activities for control schools. Two types of school leaders will be surveyed: those responsible for toolkit facilitation at the school, and those responsible for resource allocation (likely the school principal). Schools may differ in whether the school leader takes some responsibility for toolkit activities, so we propose to ask the same questions of both types of respondents.


District Interview. The district interview will collect information about district supports for toolkit implementation, non-toolkit activities to support reading comprehension instruction, and costs for implementing the toolkit. Items will be developed by the evaluator. The interview will be conducted with two district leaders per district who are the most engaged with supporting instruction and will be conducted in two parts. We will ask the district respondents to complete a pre-interview worksheet using data from records to answer factual questions about district resource allocation for toolkit (or toolkit-like) activities. Following receipt of the worksheet, we will interview district respondents via telephone or video call.


Training Records and Artifacts. To assess implementation, the research team will collect training records (e.g., dates and number of participants for teacher study groups and observations) and artifacts (e.g., action plans). These data provide evidence that toolkit activities were completed as intended and complement the toolkit platform statistics. Researchers will provide facilitators and teachers with a secure email address to submit these materials and will send reminders along with the survey reminders. The research team will also collect artifacts documenting facilitator engagement in technical assistance activities provided by the toolkit developer, to provide insight into the need for additional supports for implementation. The evaluation team plans to collect artifacts from teachers and facilitators three times over the course of the evaluation, immediately prior to each survey administration.


Reading Assessment Data. Arizona’s Move On When Reading (MOWR) policy requires districts to select, and schools to administer, benchmark assessments of student reading skills in grades K–3 three times during the school year. Using these assessment data will limit the need for the study to test students, thus reducing burden.


District Administrative Data. Student-level administrative data will include free or reduced-price lunch (FRPL) eligibility, race, ethnicity, gender, English learner status, IEP status, age, and grade level. Masked student identifiers will be requested to allow the evaluation team to link administrative data over time and across multiple district sources. Teacher-level data will include years of experience, teacher demographics, and teacher access to professional development. Student-teacher links (classroom rosters) will be requested. School-level data will include school characteristics, such as school enrollment and charter status. The evaluation team will also request teacher rosters and email addresses in order to email teachers the invitations to complete the surveys and logs. The data sharing agreements (DSAs) with districts have not been finalized. However, copies of the DSAs can be provided to OMB upon request once they are finalized, and we fully expect the districts to finalize these agreements given the interest they have expressed in the study.


Evaluation Activities for Which Clearance is not Requested as Part of this Package (provided for context)


Toolkit Platform Statistics. If possible, the study will collect individual-level toolkit platform statistics at the end of the implementation period from the website that will host the toolkit materials, which is managed by a separate ED contractor. The data request will include teacher identifying information (email), frequency of login, amount of time spent reviewing materials, and number of teacher learning modules accessed. The study will collect these data through alternative sources in the event that platform statistics cannot be obtained. For example, training records and teacher surveys will provide data on participation in toolkit activities, and teacher logs will provide data on use of the practice guide recommendations.

A3. Use of Technology to Reduce Burden

The data collection plan is designed to obtain information efficiently and to minimize respondent burden. Where feasible, the evaluation team will collect data from administrative sources rather than through primary data collection. District staff will submit information electronically using secure file transfer procedures. An email address to which respondents can direct questions will be included in the materials for preparing the teacher list.


Data that can only be obtained directly from school leaders, facilitators, and teachers will be collected by RAND's Survey Research Group (SRG) through an online survey platform. SRG will manage the entire data collection process, including questionnaire programming, sample management, and fieldwork monitoring. SRG will email study participants a link to the online surveys. To reduce the burden on respondents, the software is flexible and allows survey respondents to participate using a variety of devices, including computers, tablets, and smartphones, and to switch between devices while completing the survey. When requested, questionnaires will be transmitted to and from the respondent by fax. A telephone number for a staffed help desk and an email address will be included in the questionnaire for respondents who have questions. All question types for our data collection will be Section 508 compliant, and individual surveys will go through a Section 508 accessibility review before they are released for use. These procedures are designed to minimize the survey burden on respondents.


We will use secure methods for collecting data. Interviews will be conducted using Microsoft Teams hosted in Microsoft 365 Government (GCC). The GCC cloud offering is a segregated data enclave of the commercial Microsoft 365 environment, with servers residing in regional Azure data centers located within the continental United States (CONUS). REL West will ensure the interviews are secure and that only screened U.S. persons are authorized to access customer content.


A4. Efforts to Avoid Duplication of Effort


To avoid duplication of effort, this study will use extant administrative records where possible to understand the impact of the toolkit. The evaluation team will collect school-level characteristics such as size, level (elementary, middle, high), and accountability status; teacher-level characteristics such as degree earned, race, gender, and job title; principal-level characteristics such as degree earned, race, gender, and job title; and student-level characteristics such as student achievement, in order to minimize the length of surveys administered directly to principals and teachers. The primary data collection that is part of this study includes only information that is not available from other sources.


A5. Methods to Minimize Burden on Small Entities


The use of administrative records will reduce the burden on school educators by ensuring that only the minimum amount of original data needed to meet the objectives of this study is requested from schools. Aside from the requests for administrative records and the survey links emailed directly to participants, the evaluation team will not contact schools to request additional data.


A6. Consequences of Not Collecting Data


The Education Sciences Reform Act of 2002 states that the central mission and primary function of the regional educational laboratories is to support applied research and provide technical assistance to state and local education agencies within their region (ESRA, Part D, section 174[f]). If the proposed data were not collected, REL West would not be fulfilling its central mission to serve the states in the region and provide support for evidence-based research. The systematic collection and analysis of the data described above is required to accomplish the goals of the research project approved by IES. Participation in all data collection activities is voluntary. Information for site recruitment will be collected using the process described in response to question A2. This is a one-time study (i.e., not recurring), and therefore periodicity is not addressed.


A7. Special Circumstances


There are no special circumstances involved with this data collection. Data will be collected in a manner consistent with the guidelines in 5 CFR 1320.5.


A8. Federal Register Announcement and Consultations Outside the Agency


A 60-day Federal Register Notice was published on January 10, 2023. One non-substantive comment was received during the 60-day comment period. A 30-day notice will be published.


In addition, throughout the course of this study, we will draw on the experience and expertise of Dr. Herb Turner, President and Principal Scientist at Analytica, and REL peer reviewers. All REL studies and study proposals undergo rigorous, external peer review as required by the Education Sciences Reform Act.


A9. Payments or Gifts

The evaluation team proposes to provide school leaders and facilitators a $50 gift certificate upon completion of each of two 30-minute surveys (baseline survey before the intervention, and follow-up survey after the intervention), for a total of $100 per person. Teachers completing the survey will receive a gift card of $30 for completing each of two 30-minute surveys (baseline survey before the intervention, and follow-up survey after the intervention) for a total of $60 per person. Teachers will also receive a gift card for $75 for completing each of two rounds of practice logs for a total of $150 per person. These amounts were set based on the hourly wage rate of principals ($47/hour) and teachers ($30/hour) and the estimated length of the survey. The proposed compensation amounts are consistent with current guidance from IES and link dollar amounts to the extent of burden.1

Incentives will be distributed electronically (i.e., a link to a gift card) after respondents complete the data collection instruments. Schools randomly assigned to the control group will receive $2,500 to be used on activities unrelated to the intervention.


A10. Assurances of Confidentiality


The data collection efforts that are the focus of this clearance package will be conducted in accordance with all relevant federal regulations and requirements. REL West will follow the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires “All collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.


We will protect the confidentiality of the information participants provide, to the extent provided by law, and the information will only be used for the purpose of the study. No one at the school, district, or state level will have access to survey responses that include respondents' names, school names, or other information that could potentially be used to identify individuals or schools. The project has been approved by RAND's Human Subjects Protection Committee (Study ID #2022-N0312), which serves as RAND's Institutional Review Board (IRB00000051) to review research involving human subjects. RAND is registered with the Office for Human Research Protections (OHRP) as a research institution (IORG0000034). RAND's Federalwide Assurance for the Protection of Human Subjects (FWA00003425, effective until February 18, 2026) serves as our assurance of compliance with federal regulations.


In addition, for student information, the data collection efforts will ensure that all individually identifiable information about students, their academic achievements, and their families, and information with respect to individual schools, shall remain confidential in accordance with section 552a of Title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act. The study will also adhere to the requirements of subsection (d) of section 183 prohibiting disclosure of individually identifiable information and making the publishing or inappropriate communication of individually identifiable information by employees or staff a felony.


The evaluation team will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released publicly. Information from participating institutions and respondents will be presented at aggregate levels in reports. Information on respondents will be linked to their institution type (e.g., elementary schools vs. middle schools) but not to any individually identifiable information. No individually identifiable information will be maintained by the study team upon study completion.


We will protect the confidentiality of the information respondents provide, to the extent provided by law, and the information will only be used for the purpose of the study. All members of the study team have obtained their certification on the use of human subjects in research. The following safeguards are routinely employed at RAND, the contractor that will execute this study, to carry out confidentiality assurances:

  • All employees at RAND working on this project will sign a confidentiality pledge emphasizing its importance and describing their obligations under it (please see Appendix H for the confidentiality pledge).

  • All research projects that have access to identifiable private or proprietary data need to have a Data Safeguarding Plan reviewed and approved by RAND’s Human Subjects Protection Committee. The Data Safeguarding Plan includes information on who is responsible for data safeguarding, the types of sensitive information to be transferred and stored, the mode of data transfer, client and respondent agreements, disclosure risks, audit and monitoring plans, and the procedures to be employed for data safeguarding.

  • Any electronic transmission and sharing of individually identifiable data will be encrypted. This procedure will prevent anyone without permission from accessing the data system.

  • Access to the data shall be limited to the minimum number of individuals necessary to achieve the approved purpose and to those individuals on a need-to-know basis only.

  • Identifiable data will be stored in a locked container when not in use. We will store original and derivative data files only on disks (e.g., servers, local hard disks) that are not routinely backed up. We will keep all hardcopy materials containing sensitive data in a locked file cabinet when not in use.

  • When no longer needed, we will discard sensitive output in a shredder or sensitive-waste container. We will destroy all individual linkages to data after a respondent ceases participation in the project.


Also, the REL study team has submitted to the NCEE security officer a list of the names of all people who will have access to respondents and data. The contractor, on behalf of ED, will track new staff and staff who have left the study and ensure that signatures will be obtained or clearances revoked, as necessary.


The evaluation team will make certain that all data are held in strict confidentiality, as just described, and that in no instance will responses or data be made available except in aggregate statistical form. The following statement will appear on all data collection letters to respondents:

“Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district, school, or individual. The contractor will not provide information that identifies you or your district to anyone outside the study team, except as required by law. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.”


A11. Justification for Sensitive Questions


There are no personally sensitive questions in this data collection. Teachers completing the survey will be asked questions to measure their pedagogical content knowledge in teaching reading to students in grades K-3, about the type of professional development activities they are enrolled in, their opinion about the quality of the professional development activities offered by the school district, and information about their background characteristics. Teachers completing the practice logs will be asked about the amount of time they spend on specific reading instruction activities. School leaders and facilitators completing the survey will be asked about the professional development activities offered at the school, the number of hours and staff involved in providing professional development, and background characteristics. District staff completing interviews will be asked about the professional development activities offered in the district and their opinions about the toolkit.


A12. Estimates of Hours Burden


There are three components for which the evaluation team has calculated hours of burden for this clearance package: recruitment activities, extant data provided by the districts, and survey data collected from study participants. Table 2 shows the hourly burden overall and for each of the three components. The total burden associated with this study is 2,510 hours, with an annualized burden of 1,255 hours. The recruitment burden is 1,143.2 hours, the extant data collection burden is 105 hours, and the survey data collection burden is 1,262 hours. The annualized number of responses is 6,012 (12,023 responses in total).


Table 2. Estimated Annual Burden and Respondent Costs Table

| Information Activity | Sample Size | Response Rate | Number of Respondents | Responses per Respondent | Number of Responses | Average Burden Hours per Response | Total Annual Burden Hours | Average Hourly Wage | Total Annual Cost |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Recruitment | | | | | | | | | |
| District recruitment contact (e-mail) | 12 | 100% | 12 | 1 | 12 | 0.05 | 0.6 | $50.00 | $30.00 |
| District recruitment follow-up (e-mail) | 12 | 100% | 12 | 1 | 12 | 0.05 | 0.6 | $50.00 | $30.00 |
| District recruitment phone call | 10 | 100% | 10 | 1 | 10 | 1 | 10 | $50.00 | $500.00 |
| First principal contact (e-mail) | 100 | 100% | 100 | 1 | 100 | 0.05 | 5 | $50.00 | $250.00 |
| Recruitment follow-up for nonresponding principal | 100 | 100% | 100 | 1 | 100 | 0.05 | 5 | $50.00 | $250.00 |
| Principal information session meeting | 100 | 80% | 80 | 1 | 80 | 1 | 80 | $50.00 | $4,000.00 |
| Principal recruitment phone call | 100 | 80% | 80 | 1 | 80 | 1 | 80 | $50.00 | $4,000.00 |
| School informational webinar | 100 | 80% | 80 | 1 | 80 | 1 | 80 | $50.00 | $4,000.00 |
| First teacher contact (e-mail) | 800 | 90% | 720 | 1 | 720 | 0.05 | 36 | $30.00 | $1,080.00 |
| Recruitment follow-up for nonresponding teacher | 800 | 90% | 720 | 1 | 720 | 0.05 | 36 | $30.00 | $1,080.00 |
| School staff Q&A meeting | 900 | 90% | 810 | 1 | 810 | 1 | 810 | $30.00 | $24,300.00 |
| Subtotal: Recruitment | | | | | | | 1,143 | | $39,520.00 |
| Extant Data Collection | | | | | | | | | |
| Student assessment data (3 requests for data) | 1 | 100% | 1 | 3 | 3 | 15 | 45 | $30.00 | $1,350.00 |
| Administrative data on schools, teachers, and students (3 requests for data) | 1 | 100% | 1 | 3 | 3 | 20 | 60 | $30.00 | $1,800.00 |
| Subtotal: Extant Data Collection | | | | | | | 105 | | $3,150.00 |
| Survey Data Collection | | | | | | | | | |
| Principal/facilitator survey: request to take survey (4 e-mails per wave) | 105 | 100% | 105 | 4 | 420 | 0.05 | 21 | $50.00 | $1,050.00 |
| Teacher survey: request to take survey (4 e-mails per wave) | 720 | 100% | 720 | 4 | 2,880 | 0.05 | 144 | $30.00 | $4,320.00 |
| Teacher log: request to complete log (4 e-mails per wave) | 720 | 100% | 720 | 4 | 2,880 | 0.05 | 144 | $30.00 | $4,320.00 |
| District interview: request to participate (4 e-mails per wave) | 12 | 100% | 12 | 4 | 48 | 0.05 | 2.4 | $50.00 | $120.00 |
| Principal/facilitator consent | 105 | 100% | 105 | 2 | 210 | 0.15 | 31.5 | $50.00 | $1,575.00 |
| Teacher consent | 720 | 100% | 720 | 2 | 1,440 | 0.15 | 216 | $30.00 | $6,480.00 |
| District consent | 12 | 100% | 12 | 1 | 12 | 0.15 | 1.8 | $50.00 | $90.00 |
| Wave 1 principal/facilitator survey | 105 | 85% | 89.25 | 1 | 89.25 | 0.5 | 44.625 | $50.00 | $2,231.25 |
| Wave 2 principal/facilitator survey | 105 | 85% | 89.25 | 1 | 89.25 | 0.5 | 44.625 | $50.00 | $2,231.25 |
| Wave 1 teacher survey | 720 | 85% | 612 | 1 | 612 | 0.5 | 306 | $30.00 | $9,180.00 |
| Wave 2 teacher survey | 720 | 85% | 612 | 1 | 612 | 0.5 | 306 | $30.00 | $9,180.00 |
| Subtotal: Survey Data Collection | | | | | | | 1,262 | | $40,777.50 |
| Total | | | | | | | 2,510 | | $83,447.50 |
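As an illustration of the arithmetic behind Table 2 (number of respondents = sample size × response rate; number of responses = respondents × responses per respondent; burden hours = responses × hours per response; cost = burden hours × hourly wage), the following is a minimal sketch that reproduces two rows of the table. The helper function is ours and not part of the study's procedures; only the input values come from the table.

```python
# Illustrative sketch reproducing the burden and cost arithmetic behind Table 2.
# The function is a convenience for readers; the input values are taken from the table.

def burden_row(sample_size, response_rate, responses_per_respondent,
               hours_per_response, hourly_wage):
    """Return (respondents, responses, annual burden hours, annual cost) for one row."""
    respondents = sample_size * response_rate
    responses = respondents * responses_per_respondent
    burden_hours = responses * hours_per_response
    cost = burden_hours * hourly_wage
    return respondents, responses, burden_hours, cost

# School staff Q&A meeting: 900 teachers, 90% response, 1 response each,
# 1 hour per response, $30/hour -> 810 responses, 810 hours, $24,300.
print(burden_row(900, 0.90, 1, 1, 30))

# Wave 1 teacher survey: 720 teachers, 85% response, 1 response each,
# 0.5 hours per response, $30/hour -> 612 responses, 306 hours, $9,180.
print(burden_row(720, 0.85, 1, 0.5, 30))
```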




A13. Estimates of Cost Burden to Respondents


There are no additional costs to respondents beyond the hour burden estimated in item A12.


A14. Annualized Cost to the Federal Government


The total cost to the federal government for work conducted over all five years is $2,028,511.24, and the estimated annualized cost to the federal government for each year of the study is $405,702.25.


Funding includes staff time for independent evaluators to recruit participants, collect, clean, and analyze data from the study. Also included are costs incurred by the independent evaluator and REL West staff related to study preparation and submission of the study information to IES (from proposed research design through reporting of results).


A15. Reasons for Program Changes and Adjustments


This is a new study.


A16. Plans for Tabulation and Publication of Results


a. Tabulation Plans


All results for REL rigorous studies will be made available to the public through peer-reviewed evaluation reports that are published by IES. The datasets from these rigorous studies will be turned over to the REL’s IES project officer.


After the study report is finalized, the evaluation team will prepare restricted-use data files in accordance with NCES standards. These files will contain all the primary survey data collected for the study with all personal identifiers removed. Thorough documentation will be provided for each data file, including a detailed codebook and explanations of the unit of observation, weights, and methods for handling missing data. These data will become IES restricted-use data sets requiring a user’s license that is applied for through the same process as NCES restricted-use data sets. Even the evaluation team would be required to obtain a restricted-use license to conduct any work with the data beyond the original evaluation.


b. Publication Plans

All results for REL studies are made available to the public through peer-reviewed reports that are published by IES. The data sets from these studies will be turned over to the REL's IES project officer. These data may become IES restricted-use data sets requiring a user's license that is applied for through the same process as National Center for Education Statistics (NCES) restricted-use data sets (see http://nces.ed.gov/pubs96/96860rev.pdf for procedures related to obtaining and using restricted-use data sets). Restricted-use files will be made available so other researchers can replicate the REL's research or answer additional research questions. Restricted-use files will not include administrative data, but instructions on how to obtain those data and information on how those data were used in the analysis will be made available. All restricted-use files are required to be reviewed by IES' Disclosure Review Board (DRB). The DRB comprises members from each NCES division, representatives from IES' Statistical Standards Program, and a member from each of the IES centers. The DRB will review disclosure risk analyses conducted by the REL contractor to ensure that the data released do not disclose the identity of any individual respondent. The DRB also approves the procedures used to remove direct identifiers from restricted-use data files.


The primary focus of the toolkit evaluation is to determine whether the toolkit improves students' reading comprehension (RQ1). School and district leaders need rigorous evidence of impact to justify investing staff time and district resources in a new initiative. For Arizona and many other states with large Hispanic/Latinx student populations and low reading achievement for that population, it is also important to know whether the toolkit improves reading comprehension for that group of students (RQ2). Answers to RQs 4 through 7 will shine a light on the path through which the toolkit does (or does not) affect student outcomes, so educators can focus on the parts of the process that are most critical. For example, to what extent do students who received instruction from toolkit-trained teachers benefit from the intervention (RQ4)? School and district leaders also need to know whether the toolkit improves teachers' knowledge and use of effective reading comprehension instructional practices (RQ5) and whether that change is necessary to improve students' reading achievement (RQ6). This information can help them assess whether the training is leading to effects on students in the short term. Knowing common implementation challenges and the strategies that resolve them (RQ7) can help educators achieve strong implementation and potentially positive impacts. Finally, school and district leaders make decisions about which interventions to adopt based on costs as well as impact (RQ8), so it is useful to provide information on cost and cost-effectiveness. The first six RQs are the impact analysis questions, with RQ1 designated as confirmatory under the reading comprehension domain and RQ2 through RQ6 designated as exploratory. RQ7 is the implementation analysis question, and RQ8 is the cost analysis question.


No responses or data will be reported for individual staff members, students, or schools. Reported data will contain no fewer than four cases per reported table cell to protect confidentiality and mask individually identifiable data.


Project Time Schedule


The timeline for the activities in this project, including data collection, analyses, and reporting, is presented in Table 3.


Table 3. Project Timeline


| Year | Activity/Milestone |
| --- | --- |
| 2022 | Submit toolkit proposal; toolkit development; submit evaluation proposal; OMB review for evaluation |
| 2023 | Toolkit development; toolkit usability study; toolkit submitted to IES; OMB review for evaluation; recruit evaluation sample |
| 2024 | Recruit evaluation sample; conduct random assignment; collect secondary data (benchmark student assessments); collect primary data (teacher logs, assessments, and surveys; school leader surveys) |
| 2025 | Collect secondary data (student assessments); collect primary data (teacher logs, assessments, and surveys; school leader and facilitator surveys); conduct district leader interviews; analyze data |
| 2026 | Submit evaluation report to IES; publication of evaluation report on IES website; summary of revisions to toolkit based on efficacy evaluation results |

A17. Approval not to Display the Expiration Date for OMB Approval


The Institute of Education Sciences is not requesting a waiver for the display of the OMB approval number and expiration date. The surveys and notification letters will display the expiration date for OMB approval.


A18. Exception to the Certification Statement


This submission does not require an exception to the Certification for Paperwork Reduction Act Submissions (5 CFR 1320.9).



References


Hussar, B., Zhang, J., Hein, S., Wang, K., Roberts, A., Cui, J., & Smith, M. (2020). The Condition of Education 2020 (NCES 2020-144). Washington, DC: National Center for Education Statistics. Retrieved from https://nces.ed.gov/pubs2020/2020144.pdf

Jordan, R. L. P., Bratsch-Hines, M., & Vernon-Feagans, L. (2018). Kindergarten and first grade teachers’ content and pedagogical content knowledge of reading and associations with teacher characteristics at rural low-wealth schools. Teaching and Teacher Education, 74, 190–204. doi: https://doi.org/10.1016/j.tate.2018.05.002

Phelps, G. (2009). Just knowing how to read isn’t enough! Assessing knowledge for teaching reading. Educational Assessment, Evaluation and Accountability, 21, 137–154. doi: 10.1007/s11092-009-9070-6

Rowan, B., Camburn, E., & Correnti, R. (2004). Using teacher logs to measure the enacted curriculum: A study of literacy teaching in third-grade classrooms. The Elementary School Journal, 105(1), 75–101.

Rowan, B., & Correnti, R. (2009). Studying reading instruction with teacher logs: Lessons from the study of instructional improvement. Educational Researcher, 38(2), 120–131. doi:10.3102/0013189X09332375


Shanahan, T., Callison, K., Carriere, C., Duke, N. K., Pearson, P. D., Schatschneider, C., & Torgesen, J. (2010). Improving reading comprehension in kindergarten through 3rd grade: A practice guide (NCEE 2010-4038). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from whatworks.ed.gov/publications/practiceguides


1 IES consulted the Bureau of Labor Statistics (BLS) Occupational Outlook Handbook to identify the most current information (currently from 2021) about educator wages to calculate reasonable incentive amounts. Across classroom educator (teacher) categories, the approximate 2021 annual wage is $61,500. Across principals, the approximate 2021 annual wage is $98,420. By dividing the annual wages by 2,080 hours, IES arrived at a teacher hourly rate of $30/hour and a principal hourly rate of $47/hour.
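For reference, the hourly rates cited in the footnote follow directly from dividing the annual wages by 2,080 working hours per year:

$$
\frac{\$61{,}500}{2{,}080\ \text{hours}} \approx \$29.57 \approx \$30/\text{hour},
\qquad
\frac{\$98{,}420}{2{,}080\ \text{hours}} \approx \$47.32 \approx \$47/\text{hour}.
$$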



