51410 REL CE_OMB Package Part A


Evaluation of the Toolkit to Support Evidence-Based Algebra Instruction in Middle and High School - Recruitment Activities

OMB: 1850-0988



Evaluation of the Toolkit to Support Evidence-Based Algebra Instruction in Middle and High School

Part A: Supporting Statement for Paperwork Reduction Act Submission

June 6, 2023









Evaluation of the Toolkit to Support Evidence-Based Algebra Instruction in Middle and High School

Part A: Supporting Statement for Paperwork Reduction Act Submission

June 6, 2023



Submitted to:

U.S. Department of Education
Institute of Education Sciences
550 12th Street, S.W.
Washington, DC 20202

Project Officer: Amy Johnson
Contract Number: 91990022C0015

Submitted by:

Mathematica
P.O. Box 2393
Princeton, NJ 08543-2393
Telephone: (609) 799-3535
Fax: (609) 799-0005

Project Director: Phillip Herman
Reference Number: 51410



Contents

A. Justification

Introduction

A1. Circumstances making the collection of information necessary

A2. Purposes and use of the information collection

A.2.1 Data collection activities for which clearance is requested as part of this OMB package

A.2.2 Data collection activities for which clearance is not requested as part of this OMB package (provided for context)

A3. Use of information technology to reduce burden

A4. Efforts to identify duplication

A5. Efforts to minimize burden on small businesses

A6. Consequences of not collecting the information

A7. Special circumstances justifying inconsistencies with guidelines in 5 CFR 1320.6

A8. Federal Register announcement and consultation

A9. Payments or gifts

A10. Assurances of confidentiality

A11. Questions of a sensitive nature

A12. Estimates of response burden

A13. Estimate of total capital and startup costs/operation and maintenance costs to respondents or record-keepers

A14. Annualized cost to the federal government

A15. Reasons for program changes or adjustments

A16. Plans for tabulation and publication of results

A17. Approval not to display the expiration date for OMB approval

A18. Exception to the certification statement

References

Appendix A: Study Recruitment Letter

Appendix B: Study Recruitment Flyer

Appendix C: Study Recruitment Screener Tool

Appendix D: Confidentiality Pledge

Exhibits





A. Justification

Introduction

The Institute of Education Sciences (IES) within the U.S. Department of Education (ED) requests clearance for activities to support the recruitment of school districts to participate in an efficacy study of a Toolkit to Support Evidence-Based Algebra Instruction in Middle and High School as part of the Regional Educational Laboratory Central (REL Central) contract. A second OMB package, which will be submitted later this year, will request clearance for data collection instruments and the collection of district administrative data.

This efficacy study will compare the outcomes of teachers and students in grades 8 and 9 Algebra I classrooms in which teachers have access to the toolkit resources and receive the toolkit’s professional development supports (the treatment group) with the outcomes of teachers and students in similar grades 8 and 9 Algebra I classrooms in which teachers receive their business-as-usual professional development supports (the control group).

A1. Circumstances making the collection of information necessary

As part of the REL solicitation (Solicitation #91990020R0032), IES required each applicant to develop at least one research-based toolkit to support educators’ use of evidence-based practices, and to conduct an independent efficacy and implementation evaluation of the toolkit.

Per the solicitation:  

“IES is invested in developing practitioner-friendly toolkits to help educators use evidence-based practices in classrooms—from preschool through postsecondary settings. Some of the best evidence available is consolidated in the WWC Practice Guides, in which researchers and practitioners review the evidence from the most rigorous studies available, develop recommendations for practice, and create action steps for how to use the recommended practices. To help get this evidence into the hands of stakeholders, RELs shall partner with educators and postsecondary instructors (if relevant) to develop one toolkit based on an assigned WWC Practice Guide, which shall include all materials necessary for effective implementation.” 

The toolkit contains the following three parts: (1) Initial Diagnostic and On-going Monitoring Instruments, (2) Professional Development Resources, and (3) Steps for Institutionalizing Supports for Evidence-Based Practice. The solicitation also states that RELs must evaluate the efficacy and implementation of the professional development resources in the finished toolkit. According to the solicitation, “(t)he evaluation shall examine changes in teacher practice and may also include measures of teacher knowledge and/or teacher self-efficacy.”   

REL Central has developed a toolkit of professional learning supports to help Algebra I teachers learn about, make sense of, plan for, and implement three evidence-based Algebra I teaching practices that were identified in the related What Works Clearinghouse (WWC) practice guide Teaching Strategies for Improving Algebra Knowledge in Middle and High School Students. The three recommended practices are as follows: (1) use solved problems to engage students in analyzing algebraic reasoning and strategy; (2) teach students to use the structure of algebraic representations; and (3) teach students to intentionally choose from alternative algebraic strategies when solving problems.

The toolkit guides educators through a series of modules that focus on the recommended practices for teaching algebraic content. Within each module, district or school instructional leaders facilitate professional development sessions for algebra teachers focused on one recommended practice based on a guidebook for facilitators that is included with the toolkit. Teachers then begin using the practice—supported by a suite of online toolkit resources that include videos, sample lessons, worked examples, and interactive activities—to help them apply the practice in their classroom. Using the toolkit’s diagnostic instruments and resources, teachers and instructional leaders will collect data to assess their use of the practice and meet to discuss possible improvements. Finally, a facilitator guide for district and school instructional leaders will provide tips and resources for how they can institutionalize necessary district- or school-wide supports in order to sustain effective use of the recommended practices. All of the professional development materials, diagnostic instruments, and institutionalizing supports resources are contained in the toolkit in a manualized format so that users will have everything needed for implementation.

A small but rigorous evidence base suggests that the practices recommended in the WWC practice guide will lead to improved student performance in algebra when implemented by teachers. However, past studies have not examined the impact of providing a comprehensive resource toolkit for educators to train them on how to implement all three practices. Therefore, a rigorous evaluation of the efficacy and implementation of the toolkit is necessary to gather evidence about this set of resources and determine whether this type of toolkit could serve as a model for implementation support for math teachers more broadly. The toolkit will be made publicly available after the study is conducted, and this study will provide critical evidence to its potential users, which could include algebra teachers across the country. In addition, the study will provide implementation findings that can inform how the toolkit could be improved so that it is as useful as possible to a wide range of districts, schools, and teachers.

A2. Purposes and use of the information collection

This efficacy study will assess whether implementing the Toolkit to Support Evidence-Based Algebra Instruction in Middle and High School improves teacher and student outcomes and will describe the implementation of the toolkit in study schools that use it (the treatment group). Using a school-level randomized controlled trial (RCT) during the 2024–2025 school year, the efficacy study will estimate the impact of the toolkit on teachers’ self-efficacy and their understanding and use of the recommended practices, as well as on students’ algebraic content knowledge, self-efficacy, and mathematical mindsets. With this design, the study team will compare teachers and students in schools implementing the toolkit with teachers and students in control schools that experience the district’s usual professional development supports.

To ensure the toolkit is well implemented, the study team will provide individualized technical assistance to teachers and instructional leaders within the treatment schools to assist their understanding and implementation of the toolkit. Although this additional support will not be available to schools using the toolkit once it is publicly released, it is important to ensure the toolkit is implemented properly for the efficacy study. Exhibit A.1 lists the efficacy study’s research questions.


Exhibit A.1. Research questions

Impact research questions

  1. What is the impact of providing the toolkit to teachers on teachers’ knowledge of the practices recommended in the practice guide and teaching self-efficacy?

  2. What is the impact of providing the toolkit to teachers on teachers’ use of the practices recommended in the practice guide?

  3. What is the impact of providing the toolkit to teachers on students’ understanding of how to solve algebraic problems, including (1) identifying strategies for solving problems based on efficiencies and trade-offs, (2) understanding the structure of a representation and using it to determine a solution strategy, and (3) examining a solved problem to identify errors and alternative solution strategies?

  4. What is the impact of providing the toolkit to teachers on students’ other outcomes, including self-efficacy, mathematical mindset, and algebra achievement as measured by student assessments and course passage rates for students in introductory algebra?

Implementation research questions

  1. Were all the toolkit components implemented as intended?

  2. What were teachers’ and instructional leaders’ perceptions of how their capacity to implement the practice guide recommendations changed after using the toolkit? What aspects were perceived as most useful, and what improvements do they recommend for the toolkit? What challenges did they encounter, and how did they attempt to overcome those challenges?

  3. How did the professional development supports and resources available to algebra teachers differ in treatment and control schools?

After data collection and analysis, the study team will develop a white paper to share findings from the efficacy study. The white paper’s primary audience will be Algebra I teachers and instructional leaders, who will benefit from information on the extent to which the toolkit improves outcomes, the conditions under which the toolkit is perceived to be most useful, and potential challenges that may emerge when implementing the toolkit. IES and the REL Central team that developed the toolkit will be the secondary audience, who will benefit from information on potential refinements to the toolkit.

The efficacy study’s data collection activities are listed below. This package only requests clearance for district recruitment activities. The remaining data collection activities are provided as context. A separate OMB package will request clearance for the survey instruments, focus group and interview protocols, and associated data collection procedures.

A.2.1 Data collection activities for which clearance is requested as part of this OMB package

Study recruitment screener tool. The study team will use the Common Core of Data (CCD) from the 2021–2022 school year to develop a list of candidate districts in the REL Central region (Colorado, Kansas, Missouri, Nebraska, North Dakota, South Dakota, and Wyoming). Candidate districts will be those that have at least six schools that serve students in grades 8 or 9. Leveraging relationships through the REL’s work in the region, the study team will identify which districts might be the best suited to participate in the study and will prioritize recruiting those districts. More details about the types of districts that are most suitable for this study are in Section B1 of Part B under Respondent universe and sampling methods.

After OMB approval, the study team will send the study recruitment letter (Appendix A) via mail and email to the district leaders of the prioritized school districts. The recruitment letter introduces the toolkit and notifies the districts of their candidacy to participate in the study. This letter will be accompanied by the study recruitment flyer (Appendix B), which provides additional details about the toolkit and the study activities. Shortly after disseminating the recruitment letter and flyer, the study team will conduct a recruitment call with the district leaders of prioritized districts using the study recruitment screener tool (Appendix C). The study team will use the recruitment screener tool to gather information on the following topics to determine which districts and schools are suited for participation in the study and how the study would proceed if the district were selected:

  • The Algebra I curricula used in schools across the district, which will help the study team understand if the same curricula are used across all participating schools in a district and ensure the study assesses the impacts of the toolkit, rather than the impacts of different curricula.

  • The timing and format of Algebra I semester and end-of-course assessments used in the schools, which will inform whether districts have comparable and similarly timed Algebra I end-of-course assessments across all participating schools within the district, helping to ensure valid comparisons.

  • The number of Algebra I teachers in middle and high schools, which will inform whether the study will have sufficient statistical power. Ideally, each school in the study will have at least three Algebra I teachers.

  • The types of professional development supports Algebra I teachers receive, as districts that already provide extensive professional development supports to Algebra I teachers may not be well suited for the study because the contrast in supports between treatment and control schools may be limited.

  • The district’s interest in participating in this study, which will help the study team to prioritize ongoing recruitment efforts.

  • The process by which the study can gain district approval to conduct the efficacy study, which will help the study team plan for the study’s implementation should the district be selected.

Using information gathered through the recruitment screener calls, the study team will work closely with district leadership in districts that are most likely to participate in the study to determine which middle or high schools in the district are eligible for participation in the study. See Section B1, Respondent universe and sampling methods, for more details on how the study will use the recruitment data to inform the sampling process.

A.2.2 Data collection activities for which clearance is not requested as part of this OMB package (provided for context)

Note that the activities described in this section are provided strictly for context on the types of information the study will collect after recruitment. Approval for these activities is not being sought as a part of this OMB package. Instead, the study team will submit a second OMB package that provides additional information and data collection instruments for these activities.

Classroom student rosters. The study team will work with districts to collect student rosters for all Algebra I classrooms within the treatment and control schools in fall 2024. These rosters will help the study team identify the sample of students for whom the study will collect outcome data, and they will support distributing and collecting parent/guardian consent forms for the student survey as well as administering that survey.

Teacher survey. The teacher survey (to be administered as a web survey in fall 2024 and spring 2025 to teachers in both treatment and control schools) will include questions to measure teacher self-efficacy, teacher pedagogical content knowledge of the recommended practices, and teacher use of the recommended practices. The survey will also include questions on teacher demographics, the type and amount of district professional development supports that treatment and control teachers receive, and the sustained implementation of the practices after teachers complete the toolkit’s modules (treatment group teachers only). These topics are aligned with the efficacy study’s research questions. Questions regarding teacher self-efficacy and teacher demographics will come from previously validated instruments; questions regarding teacher pedagogical content knowledge, use of the recommended practices, and type and amount of professional development supports will be developed by the study team.

Teacher implementation log. Teachers in the treatment group will be asked to complete an online implementation log at the conclusion of each of the four toolkit modules. These logs will include questions to measure the fidelity of toolkit implementation for each module, usefulness of toolkit components within each module, implementation challenges encountered during each module, and teacher capacity to implement recommended instructional practices in the classroom. These topics are aligned with the efficacy study’s research questions.

Instructional leader implementation log. Only instructional leaders associated with treatment group schools will participate in the study. These instructional leaders will be asked to complete an online implementation log at the conclusion of each of the four toolkit modules. These logs will include questions to measure the fidelity of toolkit implementation, usefulness of toolkit components, implementation challenges, and instructional leader capacity to support teacher implementation of the recommended instructional practices. These topics are aligned with the efficacy study’s research questions.

Instructional leader interview. Treatment school instructional leaders will be asked to participate in a virtual one-on-one interview in spring 2025. The study team will ask instructional leaders to describe the usability and usefulness of the toolkit components, the implementation challenges they experienced and their recommendations for improvement, and the extent of their capacity to support teacher implementation of the recommended practices and the collaborative learning process. The study team will use the qualitative data from the interviews in an illustrative manner to supplement the quantitative implementation findings that emerge from the instructional leader implementation logs. These topics are aligned with the efficacy study’s research questions.

Teacher focus group. All teachers in the treatment group will be invited to participate in virtual focus groups in spring 2025. The study team will ask teachers to describe how useful and usable they found the toolkit components to be, the implementation challenges they experienced and their recommendations for improvement, and their capacity to implement the recommended practices in their classroom. The study team will use the qualitative data from the focus group in an illustrative manner to supplement the quantitative implementation findings that emerge from the teacher survey and implementation logs. These topics are aligned with the efficacy study’s research questions.

Student survey. The study team will work with all teachers in the treatment and control Algebra I classrooms to administer a paper survey to students in spring 2025. This survey will gather data on student self-efficacy and mathematical mindset, students’ understanding of how to solve algebraic problems, and student perceptions of teacher instructional practices in the classroom. These topics are aligned with the efficacy study’s research questions.

District administrative data. The study will gather student-level administrative data covering the 2024–2025 school year from districts participating in the study. The study will request student demographic data, algebra assessment data, and algebra passage rates in summer 2025.

A3. Use of information technology to reduce burden

The data collection plan is designed to obtain information in an efficient way that minimizes respondent burden. The study team will gather recruitment information over the phone, while also allowing districts to provide some of the requested information electronically via email if that is less burdensome to the district.

The other data collection activities, for which the study team will seek approval through a second OMB package, will rely on a web-based survey platform and a web-based teleconference platform such as Zoom or Webex.

  • Web-based survey platform. The study team will use a web-based survey platform to collect data for the teacher survey, teacher implementation logs, and instructional leader implementation logs. Respondents can use the web-based survey platform to complete the data collection instrument at a location and time of their choice, and the platform’s built-in editing checks and programmed skip patterns will reduce the level of response errors.

  • Web-based teleconference platform. The study team will use a web-based teleconference platform, such as Webex, to conduct the teacher focus groups and instructional leader interviews virtually. Using a teleconference platform will enable participants to join the focus group or interview from a location that is convenient to them, reducing the burden of participation.

A4. Efforts to identify duplication

The study team will not collect information that is already available from alternative data sources for this project. No population frame exists that the study team can use to identify districts and schools that are most suitable for the study; the recruitment process is key to identifying suitable districts and schools.

The other data collection activities, for which the study team will seek approval through a second OMB package, will draw on information that is already available from alternative data sources to the extent possible. For example, the study team will collect students’ Algebra I assessment scores through the district administrative data rather than through a separate assessment. However, information obtained from the teacher and instructional leader implementation logs, teacher and student surveys, teacher focus groups, and instructional leader interviews is not available elsewhere. In addition, the data collection plan reflects careful attention to the potential sources of information for this study, particularly to the reliability of the information and the efficiency in gathering it.

A5. Efforts to minimize burden on small businesses

During recruitment, the study team will schedule recruitment calls with district staff at a time convenient to district staff schedules. Before each call, the study team will use publicly available sources, such as the Common Core of Data, to gather information about the districts and schools that can inform recruitment, asking the district only for information that is not otherwise available.

The other data collection activities, for which the study team will seek approval through a second OMB package, will not affect any small businesses, but some of the schools might be small entities. To minimize the burden on these entities, the study team will provide participating schools with a secure, web-based system that teachers can use to complete their survey and that teachers and instructional leaders can use to complete their implementation logs on a computer, laptop, tablet, or phone. The study team will also communicate with schools in advance of the data collection period to ensure they understand the purpose of the study and the information the study will ask them to provide.

A6. Consequences of not collecting the information

This study aims to provide evidence relevant to the types of schools that may benefit most from the toolkit, including high-need schools, schools that reflect the variety of Algebra I classrooms in middle and high school settings, and those that have substantial diversity in students’ racial and ethnic characteristics. If the study does not collect these data during recruitment, the study team will be unable to successfully select and recruit into the study a sample of districts and schools that meet this aim. In addition, the information the study team collects during recruitment will help select only those districts and schools that can feasibly follow the study protocol and implement the toolkit. Not collecting this information will increase the risk of failing to learn about the efficacy of this toolkit in improving teachers’ instructional practices and students’ algebra skills.

The data collection activities for which the study team will seek approval in a second package will contribute to understanding the toolkit’s potential to affect student and teacher outcomes. If this study does not collect data from the surveys, implementation logs, focus groups, interviews, and district administrative records, the study team will be unable to draw conclusions about the toolkit’s effect on student and teacher outcomes.

A7. Special circumstances justifying inconsistencies with guidelines in 5 CFR 1320.6

This data collection has no special circumstances associated with it.

A8. Federal Register announcement and consultation

a. Federal Register announcement

A 60-day notice to solicit public comments was published in the Federal Register (88 FR 43330) on July 7, 2023. No public comments were received.

The 30-day notice will be published to solicit additional public comments.

b. Consultations outside the agency

The study team will draw on its relationships with REL partners and stakeholders, including leaders from state education agencies, local education agencies, and the REL’s Governing Board (Exhibit A.2), to identify promising school districts in the REL Central region to prioritize for recruitment.

In addition, throughout the course of this study, the study team will draw on the experience and expertise of Dr. Hanley Chiang from Mathematica. The study proposal has also gone through external peer review, as required by the Education Sciences Reform Act (ESRA) for all REL studies, and the study proposal will undergo Institutional Review Board (IRB) review from Health Media Lab in summer 2023.


Exhibit A.2. REL Central Governing Board members to consult for district recruitment

Name | Title and affiliation
Kirsten Baesler | State Superintendent, North Dakota Department of Public Instruction (ND)
Matt Bakke | Superintendent, Devils Lake Public Schools (ND)
Lyndsi Engstrom | Director of Research, Design, and Value, State Board of Public Schools and Central Regional Education Association (ND)
Dan Jorgensen | Accountability Support Manager, Colorado Department of Education (CO)
Paul Katnik | Assistant Commissioner, Missouri Department of Elementary and Secondary Education (MO)
Josh Males | K-12 Curriculum Specialist for Mathematics, Lincoln Public Schools (NE)
Laurie Matzke | Assistant Superintendent, North Dakota Department of Public Instruction (ND)
Catherine Palmer | Assessment Supervisor, Wyoming Department of Education (WY)
Debra Smith | Superintendent, Fort Washakie Schools Fremont County #25 (WY)
Kelly Spurgeon | Education Program Consultant, Kansas State Department of Education (KS)
Juliana TakenAlive | Education Program Specialist, Standing Rock Sioux Tribe Tribal Department of Education (ND)
Shirley Vargas | School Improvement Officer, Nebraska Department of Education (NE)

A9. Payments or gifts

Participation in the efficacy study of a Toolkit to Support Evidence-Based Algebra Instruction in Middle and High School will place minimal burden on district staff, teachers, and instructional leaders. The current information request focuses on work with district staff to gather data for recruitment, and there are no incentives associated with the recruitment activities.

As part of the recruitment activities proposed in this information collection, the recruitment screener tool (Appendix C) describes the incentives for upcoming data collection activities. A future information collection request will provide detailed information on the data collection procedures and instruments, including incentives.

A10. Assurances of confidentiality

a. Personally identifiable information

The information districts provide during the recruitment process will include personally identifiable information of school district staff, including district administrators, school principals, and Algebra I teachers, such as their work email addresses and phone numbers. The study team needs this information to conduct outreach to these district staff, school principals, and teachers to inform them about the study and confirm their willingness to participate in the study.

The data collection activities, for which the study team will seek approval through a second OMB package, will collect classroom rosters from the participating Algebra I classrooms. The study team needs this information to obtain parent/guardian consent for students to take the student survey in spring 2025. The study team will also use the classroom rosters to link the students to the district administrative data.

b. Assurances of privacy

The study team has established procedures to protect the confidentiality and security of its data. This approach will comply with all relevant regulations and requirements, in particular the Education Sciences Reform Act of 2002, Title I, Subsection (c) of Section 183, which requires the director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.”

The study team will protect the privacy and confidentiality of all data collected during the recruitment and data collection activities and will use it for research purposes only. The principal investigator will ensure that all personally identifiable information about participants remains confidential. All data will be kept in secured locations, and identifiers will be destroyed as soon as they are no longer required. All members of the study team with access to the data will be trained and certified on the importance of confidentiality and data security. When reporting any results, the study team will present data only in aggregate form so that individuals, schools, and districts are not identifiable.

Mathematica routinely uses the following safeguards to maintain data confidentiality and will apply them consistently to this study:

  • All Mathematica employees are required to sign a confidentiality pledge (Appendix D) that emphasizes the importance of confidentiality and describes employees’ obligations to maintain it.

  • Personally identifiable information is maintained on separate forms and files, which are linked only by random, study-specific identification numbers.

  • Access to hard-copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.

  • Access to computer data files is protected by secure usernames and passwords, which are available only to specific users who need to access the data and who have the appropriate security clearances.

Mathematica’s standard for maintaining confidentiality includes training staff on the meaning of confidentiality, particularly as it relates to handling requests for information and assuring respondents about the protection of their responses. It also includes built-in safeguards on status monitoring and receipt control systems. In addition, all study staff who have access to confidential data must obtain security clearances from ED, which requires completing personnel security forms, providing fingerprints, and undergoing a background check.

A11. Questions of a sensitive nature

The recruitment activities for which approval is requested in this OMB submission will not include questions of a sensitive nature. The data collection activities, for which the study team will seek approval through a second OMB package, will collect race/ethnicity and gender data on teachers and students. The study will collect this information on students through district administrative data and on teachers through a study-administered teacher survey in fall 2024.

Collecting these data will strengthen the study in four important ways. First, the data on the race/ethnicity and gender of teachers and students will be used to describe the samples of students and teachers that participated in the study, providing valuable information about the context of the study and the generalizability of the findings to other districts that may wish to use the toolkit. Second, these data will serve as covariates in the impact analysis, improving precision and helping the study achieve the necessary statistical power. Third, these data will be used to test that random assignment was implemented as intended, a critical aspect of the study’s design. Fourth, the data on students’ race/ethnicity will support a subgroup analysis indicating whether the impacts of the toolkit differed for different types of students, potentially informing how the toolkit can be implemented most effectively in the future.

A12. Estimates of response burden

The preliminary activities for which approval is requested in this submission include outreach to prioritized districts for recruitment. The total response burden for these data collection activities is 30 hours. This is a one-time series of recruitment activities, and there are no plans for follow-up or other recurring collections beyond what is proposed in this package.

Exhibit A.3 shows estimates of time burden for the recruitment screener tool, which will be conducted with district-level staff. To compute the total estimated annual cost, the study team multiplied total burden hours by the average hourly wage for district-level staff, based on median wages from the Bureau of Labor Statistics, Usual Weekly Earnings of Wage and Salary Workers (first quarter of 2023). For district-level staff, the study team used the wage for “Professional and related occupations” (Table 4).

Exhibit A.3. Estimated respondent time burden and cost

| Respondent type and data collection activity | Time per response (hours) | Maximum number of responses per respondent | Number of respondents | Total time burden (hours) | Average hourly wage | Cost per response | Total cost burden |
| --- | --- | --- | --- | --- | --- | --- | --- |
| District-level staff: Recruitment screener tool | 1 | 1 | Up to 30 | 30 | $36.43 | $36.43 | $1,092.90 |
| Total hours and costs across all years | | | | 30 | | | $1,092.90 |

Note: Average hourly wage for district-level staff is the wage for “Professional and related occupations” from the U.S. Bureau of Labor Statistics (1st quarter 2023 – Table 4).

A total of 30 hours assumes that the efficacy study team will reach out to an estimated 30 school district leaders starting in fall 2023. Each response (participating in a recruitment screener call) is expected to take an average of one hour.
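The burden and cost figures in Exhibit A.3 follow directly from these assumptions; a minimal check of the arithmetic (using only the values stated in the exhibit):

```python
# Check of the burden-cost arithmetic in Exhibit A.3; all values are taken
# from the exhibit itself.
respondents = 30          # district leaders contacted
responses_each = 1        # one screener call per district
hours_per_response = 1    # each call is expected to take one hour
hourly_wage = 36.43       # BLS "Professional and related occupations," Q1 2023

total_hours = respondents * responses_each * hours_per_response
total_cost = round(total_hours * hourly_wage, 2)
print(total_hours, total_cost)  # 30 hours, $1,092.90
```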

A13. Estimate of total capital and startup costs/operation and maintenance costs to respondents or record-keepers

This data collection has no direct, startup, or maintenance costs to respondents or record-keepers.

A14. Annualized cost to the federal government

The estimated cost to the federal government for this study, including preparing the recruitment and data collection OMB clearance packages, implementing the intervention, conducting data collection, analyzing the data, preparing reports, and creating data files, is approximately $305,827 per year.

A15. Reasons for program changes or adjustments

This is a request for a new collection of information.

A16. Plans for tabulation and publication of results

a. Analysis plan

The activities for which this first OMB package requests clearance will not be directly tabulated and published, but rather will be used to facilitate the selection of the study sample. The study team will use the information gathered to understand what differences may exist within and across candidate study districts. The study team will aim to recruit a sample of districts that include high-need middle and high schools, with at least half of the schools having a student population in which at least 50 percent of students are eligible for free or reduced-price lunch subsidies.1 This criterion helps ensure the sample reflects the variety of Algebra I classrooms in middle and high school settings that may adopt the toolkit.

Once the sample is selected, the study team will conduct the data collection activities that will be included in the second OMB package, including surveys, implementation logs, focus groups, and interviews. The primary analysis of the data from these instruments will rely on a study design that randomly assigns schools within a district to either implement or not implement the toolkit. The use of random assignment will ensure that differences in mean outcomes between treatment and control schools provide an unbiased estimate of the impact of the toolkit for participating schools. To increase the precision of the estimates, the analysis will control for students’ and teachers’ baseline characteristics that can explain some differences in outcomes. The study will also conduct a descriptive analysis of implementation, which will include both a quantitative analysis of elements of implementation based on the logs and surveys and a qualitative analysis of implementation themes, such as successes and challenges, based on data from the teacher focus group and instructional leader interviews.
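The core of this design, comparing randomly assigned treatment and control groups while adjusting for baseline covariates to improve precision, can be illustrated with a small sketch. This is not the study's actual analysis code; all data below are simulated, and the true effect size and covariate names are arbitrary placeholders:

```python
import numpy as np

# Illustrative sketch of a covariate-adjusted treatment-effect estimate
# under random assignment. Data are simulated; the "true" impact is 2.0.
rng = np.random.default_rng(0)
n = 200                                   # simulated study units
treat = rng.integers(0, 2, n)             # random assignment indicator (0/1)
baseline = rng.normal(0, 1, n)            # baseline characteristic (covariate)
outcome = 2.0 * treat + 1.5 * baseline + rng.normal(0, 1, n)

# OLS with design matrix [intercept, treatment, baseline covariate]
X = np.column_stack([np.ones(n), treat, baseline])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
impact_estimate = coef[1]                 # adjusted treatment-control difference
print(round(impact_estimate, 2))
```

Because assignment is random, the treatment coefficient is an unbiased estimate of the impact even without the covariate; including the baseline measure simply shrinks the residual variance and tightens the estimate.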

b. Publication plan

Although most of the information the study will collect through the recruitment activities will be for internal use only, the study team will report some of the information from the recruitment screener about the professional development that districts offer. The team will use this information to answer the research question about how professional development supports for Algebra I teachers differed for those in the study’s treatment and control groups.

Using the data gathered through the study’s data collection activities that will be included in the second OMB package, the study team will produce and disseminate a white paper on the efficacy study findings with an expected release in 2026. The primary audience for the white paper consists of Algebra I teachers and instructional leaders, as it will provide them with information on the extent the toolkit improved outcomes as well as implementation context and challenges. The secondary audience consists of IES and the REL Central team that developed the toolkit, as the white paper will inform potential refinements to the toolkit.

A17. Approval not to display the expiration date for OMB approval

IES is not requesting a waiver for the display of the OMB approval number and expiration date. The study will display the OMB approval number and expiration date at the top of the study’s recruitment letter, recruitment flyer, and recruitment screener tool. The study staff will offer to read the OMB number and expiration date at the beginning of the recruitment call.

A18. Exception to the certification statement

No exceptions to the certification statement are requested or required.








Mathematica Inc.

Princeton, NJ • Ann Arbor, MI • Cambridge, MA
Chicago, IL • Oakland, CA • Seattle, WA
Woodlawn, MD • Washington, DC

mathematica.org

EDI Global, a Mathematica Company

Operating in Tanzania, Uganda, Kenya, Mozambique, and the United Kingdom

Mathematica, Progress Together, and the “spotlight M” logo are registered trademarks of Mathematica Inc.

1 This definition of high need is that used in some federal programs, such as the Teacher and School Leader Incentive Program (86 FR 18519, 2021).
