Data Collection for the Evaluation of the REL Appalachia Teaching Math to Young Children Toolkit
PART A: Justification
April 2024
Submitted to:
Institute of Education Sciences
U.S. Department of Education
Submitted by:
SRI International
333 Ravenswood Ave
Menlo Park, CA 94025
(650) 859-2000
Tracking and OMB Number: 1850-0991
Revised: April 8, 2024
The U.S. Department of Education (ED), through its Institute of Education Sciences (IES), requests clearance for data-collection instruments and the collection of district administrative data, as a revision to the Office of Management and Budget (OMB) clearance agreement (OMB Number 1850-0991) for activities related to the Regional Educational Laboratory Appalachia (REL AP) program.
Mathematics knowledge acquired in early childhood provides a critical foundation for long-term student success in math as well as reading (Duncan et al., 2007; Watts et al., 2014), but the professional development (PD) and curricular support for preschool teachers often lack specific content and training on high-quality math instruction delivered by math content experts. To address this problem, REL AP is developing a toolkit to support preschool teachers in implementing core teaching practices essential to promoting early math skills and knowledge in children. The toolkit is based on the Teaching Math to Young Children What Works Clearinghouse (WWC) practice guide (Frye et al., 2013) and is being developed in collaboration with state and district partners in Virginia.
REL AP is requesting clearance to conduct an evaluation to assess the efficacy of the professional development resources included in the toolkit. The evaluation will also assess how teachers implement the toolkit to provide context for the efficacy findings as well as guidance to improve the toolkit and its future use. The evaluation will take place in 50 schools across approximately 10 school divisions in Virginia and focus on mathematics teaching practices and student mathematics knowledge and skills in preschool classrooms.
As part of the REL solicitation request (Solicitation #91990020R0032), IES required each applicant to develop at least one research-based toolkit to complement a WWC practice guide in order to support educators’ use of evidence-based practices, and to conduct an independent efficacy and implementation evaluation of the toolkit.
Per the solicitation:
IES is invested in developing practitioner-friendly toolkits to help educators use evidence-based practices in classrooms — from preschool through postsecondary settings. Some of the best evidence available is consolidated in the WWC Practice Guides, in which researchers and practitioners review the evidence from the most rigorous studies available, develop recommendations for practice, and create action steps for how to use the recommended practices. To help get this evidence into the hands of stakeholders, RELs shall partner with educators and postsecondary instructors (if relevant) to develop one toolkit based on an assigned WWC Practice Guide, which shall include all materials necessary for effective implementation (pp. 44–45).
This data collection is consistent with the authorizing legislation of the REL Program, the Education Sciences Reform Act (ESRA) of 2002 (see appendix A). Part D, Section 174(f)(2) of ESRA states that as part of their central mission and primary function, each regional educational laboratory “shall support applied research by. . . developing and widely disseminating, including through Internet-based means, scientifically valid research, information, reports, and publications that are usable for improving academic achievement, closing achievement gaps, and encouraging and sustaining school improvement, to — schools, districts, institutions of higher education, educators (including early childhood educators and librarians), parents, policymakers, and other constituencies, as appropriate, within the region in which the regional educational laboratory is located.”
The toolkit contains the following three parts: (1) Initial Diagnostic and Ongoing Monitoring Instruments, (2) PD Resources, and (3) Steps for Institutionalizing Supports for Evidence-Based Practice. The solicitation also states that RELs must evaluate the efficacy and implementation of the professional development resources in the finished toolkit. According to the solicitation, “[t]he evaluation shall examine changes in teacher practice and may also include measures of teacher knowledge and/or teacher self-efficacy.”
The Early Math Toolkit will address core teaching practices essential to promoting early math skills and knowledge in preschool children. Using the recommendations in the IES Teaching Math to Young Children practice guide (Frye et al., 2013) as a basis, the toolkit developers identified a set of teaching practices that operationalize the recommendations so teachers can focus on a specific set of actions to implement in the classroom. The toolkit addresses Recommendation 1: Teach number and operations using a developmental progression; Recommendation 3: Use progress monitoring to ensure that math instruction builds on what each child knows; Recommendation 4: Teach children to view and describe their world mathematically; and Recommendation 5: Dedicate time each day to teaching math and integrate math instruction throughout the school day.
Preschool teachers assigned to the Early Math Toolkit intervention condition will be invited to complete five PD modules and implement the practice guide recommendations to promote early mathematics learning throughout the school year. The first is an introductory module, expected to take one week to complete, that outlines the recommendations in the practice guide and the associated teacher practices and describes how to use the PD resources and other toolkit components; the remaining four content modules will each take four weeks to complete. The PD modules are designed to increase teachers' knowledge and skills in planning and applying evidence-based math teaching practices; as teachers' knowledge and skills grow, they are expected to use more evidence-based practices and to use them more effectively. All the professional development content, and all the materials teachers need for implementation, are contained in the toolkit.
A small but rigorous evidence base suggests that when teachers implement the practices recommended in the WWC practice guide, early math learning improves. However, past studies have not examined the impact of providing a comprehensive toolkit to train educators on how to implement evidence-based practices for teaching math to young children. Therefore, a rigorous evaluation of the efficacy and implementation of the toolkit is necessary to gather evidence about this set of resources and determine whether this type of toolkit could serve as a model for implementation support for preschool teachers more broadly. The toolkit will be made publicly available after the study is conducted, and this study will provide critical evidence to its potential users, which could include preschool teachers across the country. In addition, the study will provide implementation findings that can inform how the toolkit could be improved to be as useful as possible to a wide range of districts, schools, and teachers.
The impact and implementation research questions addressed in this study include the following:
1. Do teachers in intervention schools (that is, teachers who are offered the toolkit PD resources) report greater confidence in, and positive attitudes toward, using evidence-based practices in math compared to teachers in control-assigned schools?
2. Do teachers in intervention schools implement more math activities, spend more time on math, include more instruction across settings and activities during each day, and teach math on more days than teachers in control-assigned schools?
3. Do teachers in intervention schools demonstrate more frequent use of evidence-based math teaching practices than teachers in control-assigned schools?
4. Do preschool students in intervention schools score higher on measures of math achievement in the spring of preschool than students in control-assigned schools?
5. Did implementation of the toolkit professional development components, classroom activities, and instruction occur as intended?
6. What are different ways that teachers engage with the toolkit PD resources? To what extent does teachers’ use of the PD resources vary? What helps or hinders effective learning from the PD resources?
7. What challenges do teachers face in implementing the toolkit, and how do teachers attempt to overcome those challenges? What additional supports are needed, and what improvements do participants recommend for the toolkit?
The impact study will be a school-level, cluster-randomized controlled efficacy trial. The evaluation team will recruit and randomly assign 50 schools across approximately 10 school divisions to the treatment condition (toolkit) or business as usual (control) in the spring and fall of 2024. Random assignment of schools will occur after the collection of consent forms. In schools assigned to the toolkit group, preschool teachers will be invited to use the toolkit materials. In control schools, preschool teachers will not have access to the toolkit until it is made available to the public after the study is completed.
Both groups will be asked to participate in study data collection using teacher instructional logs, surveys, and observations. The study team will ask the intervention group to participate in additional data collection to address the implementation questions, including completing implementation checklists and a toolkit satisfaction survey. The study team will also collect administrative data, including demographic information on students and teachers and student standardized mathematics assessment scores.
The report’s primary audience includes district staff responsible for professional development for preschool teachers, preschool leaders, and instructional coaches and leaders. These audiences will benefit from information on the extent to which the toolkit improves outcomes, the conditions under which the toolkit is perceived to be most useful, and potential challenges that may emerge when implementing the toolkit. IES and the REL AP team that developed the toolkit will be the secondary audience and will benefit from information on potential refinements to the toolkit.
The efficacy study’s data-collection activities are listed below. Table 1 provides details about the measures, the research questions they will address, and the timing and administration of each. This package requests clearance only for the survey instruments, observation protocols, instructional logs, assessment and administrative data, and associated data-collection procedures. IES has already submitted a separate OMB package requesting clearance for district recruitment activities.
Consent to Participate. The study team will request teacher rosters and email addresses from participating divisions to email invitations to teachers to complete the consent forms, surveys, and logs. Once the teachers have consented, the study team will follow division consent procedures for parents/guardians. The intention is to engage divisions that allow passive consent procedures, under which parents/guardians may opt their child out of participation in the study. If a division does not allow passive consent, we will follow its procedures for active consent. The study team will ask schools to help communicate information about the study and opt-out procedures by emailing families, by sending the information home in student backpacks with other school communications, or both. Because all the student-directed study procedures, including the assessments, will be part of the students’ typical classroom experience, families can opt out only of their child’s data being used in the research study. Families not wanting their child’s data to be used in the research study will be given a website URL and QR code to notify the study team of their decision.
Teacher Instructional Log. Teachers in both the intervention and control groups will complete an instructional log to document time spent on math-focused activities as well as information about the learning goals and format of activities (described in Supporting Statement Part B), capturing a week at pre-intervention, mid-intervention, and post-intervention. The study team decided to collect the log at three time points to balance the desire for more frequent data about teacher use of math-focused activities with the need to limit burden on teachers. Teachers will complete the log at the beginning of the school year, approximately 16 weeks later, and at the end of the school year. The study team will schedule each round of the log so that the timing is consistent across treatment conditions and avoids holidays. We estimate each daily log entry will take less than 5 minutes to complete. The data in each round of logs should be representative of typical classroom instruction. The team developed the instructional log to capture the frequency of math instructional activities in a preschool classroom, as no such log currently exists in the literature. A subject matter expert has reviewed the log, and the study team completed two cognitive interviews with preschool teachers in the summer and fall of 2023 to refine the log and ensure the instructions were clear to teachers. In addition, two preschool teachers piloted use of the log in fall 2023. Data from these logs will address research question 2: Do teachers in intervention schools implement more math activities, spend more time on math, include more instruction across settings and activities during each day, and teach math on more days than teachers in control-assigned schools?
Student Assessment Data. The study will use the Early Mathematics Assessment System (EMAS; Ginsburg & Pappas, 2016), a direct assessment of young children’s math abilities and knowledge that is administered twice a year as part of the Virginia Kindergarten Readiness Program (VKRP). Data from this assessment will address research question 4: Do preschool students in intervention schools score higher on measures of math achievement in the spring of preschool than students in control-assigned schools? The EMAS has demonstrated strong reliability, including interrater reliability over 0.90 for coding scores, and the original assessment from which the VKRP version was adapted had a 0.50 correlation with Woodcock-Johnson, 3rd edition, Broad Math scores for kindergarten students. The study team also plans to request Phonological Awareness Literacy Screening scores for students from participating divisions or the Virginia Department of Education to use as a covariate in the analyses. Using these existing assessment data will limit the need for additional student testing, thus reducing burden on study participants.
Division Administrative Data. Student-level administrative data used in the study will include student characteristics such as free or reduced-price lunch eligibility, gender, English learner status, and Individualized Education Program status. The study team will use these data as covariates in the statistical models to increase the precision of the estimates of the intervention’s effect and allow for analyses to test whether the intervention is more effective for certain groups of students. The study team will also request classroom rosters to identify student-teacher links. School-level data will include school characteristics, such as school enrollment and percentage of students considered economically disadvantaged. All administrative data will be collected for both intervention and control schools.
Teacher Surveys. The team will capture teacher attitudes, beliefs, and confidence about teaching math using an existing instrument with face validity and established reliability, the Attitudes, Beliefs, and Confidence (ABC) survey (Reid & Melgar, 2018). This survey will address research question 1: Do teachers in intervention schools report greater confidence in, and positive attitudes toward, using evidence-based practices in math compared to teachers in control-assigned schools? Reid and Melgar (2018) found the two subscales that the study will use to be internally consistent, with a Cronbach's alpha of .87 for the Confidence Scale and .86 for the Attitudes Scale. Teachers in both the intervention and control groups will complete the ABC survey at the beginning and end of the 2024/25 school year. Teachers in both groups will also complete a survey about their background, education, training, and past professional development experiences once at the beginning of the 2024/25 school year. The study team will combine the background survey with the first ABC survey to minimize the number of separate surveys that teachers are asked to complete. Each survey should take no more than 30 minutes to complete.
Intervention teachers will complete the Self-Assessment of Mathematics Instruction (SAMI) at the beginning and middle of the 2024/25 school year, and again in March 2025 after completing the last PD module. The SAMI will provide a self-reported measure of teachers’ practices and should take approximately 15 minutes to complete. The study team will not analyze teachers’ SAMI responses; instead, it will track SAMI completion as an indicator of toolkit implementation fidelity to help address research questions 5 and 6.
Intervention teachers will also complete a brief Implementation Checklist at the end of each of the four content modules, reporting which toolkit professional learning components and classroom activities they completed. These data will also address research questions 5 and 6. Intervention teachers will complete a Toolkit Satisfaction Survey once at the end of the year to capture their impressions of the PD activities and any obstacles they experienced during toolkit implementation. The study team will combine the Toolkit Satisfaction Survey with the second ABC survey for intervention teachers to minimize the number of separate surveys that teachers are asked to complete. The end-of-year survey for both intervention and control teachers will also include questions about teachers’ professional learning experiences and the curricula and other math instruction resources they used during the 2024/25 school year. This survey will address research questions 6 and 7. Each survey should take no more than 30 minutes to complete.
Teacher Observations. The study team will measure teachers’ use of evidence-based practices in math at the beginning and end of the school year in both the intervention and control groups using the Early Math Teaching Observation Tool (EMTOT). These data will address research question 3: Do teachers in intervention schools demonstrate more frequent use of evidence-based math teaching practices than teachers in control-assigned schools? The EMTOT is an observation-based rubric aligned with the SAMI that will measure preschool teachers’ use of evidence-based math teaching practices. The current version, which the study team has refined through pilot testing, includes observation items rated on a four-point continuum. The study team will collect baseline teacher practice observations for all teachers at the beginning of the 2024/25 school year and again in March 2025 after the intervention teachers have completed the PD modules.
The SAMI and EMTOT were developed by math researchers and methodologists with REL Appalachia and are based on the evidence-based practices identified by authors of the Teaching Math to Young Children Practice Guide. The first author of the practice guide, Art Baroody, served as a subject matter expert and reviewed and provided feedback on both instruments. REL Appalachia conducted cognitive interviews and usability testing with early childhood teachers for the SAMI and early childhood coaches and instructional leaders for the EMTOT. Cognitive interviews and feedback from usability testing observations and interviews informed revisions to both instruments.
If the school or preschool program participates in Virginia’s Quality Rating and Improvement System, we will obtain teachers’ Classroom Assessment Scoring System (CLASS) scores to provide a measure of overall teaching quality in addition to the math-specific practices. CLASS is a widely used tool for measuring the quality of early learning classrooms in the United States. Multiple validation studies have established interrater reliability and internal consistency for both CLASS dimensions and domains. The CLASS domains demonstrate good internal consistency in preschool classroom observations: Emotional Support (α = .91), Classroom Organization (α = .87), and Instructional Support (α = .86) (Pianta et al., 2008).
Table 1. Data Sources Matched to Research Questions

| Research question | Measure | Pre | Mid | Post | Completion time (per administration) |
| --- | --- | --- | --- | --- | --- |
| Teacher measures (intervention and control groups) | | | | | |
| 1 | ABC Survey | X | - | X | 25 minutes |
| 2 | Teacher Instructional Log | X | X | X | 25 minutes |
| 3 | Early Math Teaching Observation Tool (EMTOT) | X | - | X | 30 minutes |
| 3 | Classroom Assessment Scoring System (CLASS) | - | - | X | Administrative data |
| Student outcomes (intervention and control groups) | | | | | |
| 4 | Virginia Kindergarten Readiness Program (VKRP) Early Mathematics Assessment System (EMAS) | X | - | X | Administrative data |
| Covariates (intervention and control groups) | | | | | |
| 1–6 | Teacher Demographics and Experiences Survey (items added to ABC survey) | X | - | - | 5 minutes |
| 4 | Student and school demographics | - | - | X | Administrative data |
| Implementation measures (intervention group only) | | | | | |
| 5, 6 | Implementation Checklist | After each PD content module | | | 10 minutes |
| 5, 6 | Self-Assessment of Math Instruction (SAMI) | X | X | X | 30 minutes |
| 6, 7 | Toolkit Satisfaction Survey (items added to ABC survey) | - | - | X | 5 minutes |

Note: Pre, mid, and post refer to administration time points; X indicates that the measure is administered at that time point.
The data-collection plan is designed to obtain information efficiently, minimize respondent burden, and use extant data whenever possible. REL AP will use Qualtrics, a 508-compliant online survey platform, to collect data directly from school leaders and teachers. REL AP will manage the entire data-collection process, including survey programming, sample management, and monitoring of responses. REL AP will email study participants individual links to the online surveys. To reduce burden, the platform will allow respondents to answer using their preferred device, such as a laptop, smartphone, or tablet, and will save survey progress if a respondent cannot complete the survey in one sitting. The survey will include a telephone number for a staffed help desk and an email address where respondents can send questions.
When possible, the evaluation team will collect data from administrative sources rather than through primary data collection. Division staff will submit information electronically using secure file transfer procedures. The materials for preparing the teacher list will include an email address to which respondents can direct their questions.
The study team will not collect information that is available from alternative data sources. The data-collection activities will draw on information already available from extant administrative records. Specifically, the evaluation team will collect data related to school-level characteristics such as size and percentage of students considered economically disadvantaged, as well as student-level characteristics, such as student achievement, directly from division administrative records to minimize the length of surveys administered directly to teachers and prevent duplication of effort.
The primary data collection for this study will include only information not available from other sources. Information obtained from the instructional logs, surveys, and observations are not available elsewhere.
The data-collection activities will not affect any small businesses, but some of the schools might be small entities. The use of administrative records will reduce the burden on school educators by ensuring that only the minimum amount of primary data will be requested from schools to meet the objectives of this study. Aside from requests for administrative records and the survey links emailed directly to participants, the evaluation team will not contact schools to request additional data. The study team will secure permission from individual schools to share their student-level assessment data and then request these data directly from the Virginia Department of Education (VDOE) or school divisions, to minimize burden on schools.
The Education Sciences Reform Act of 2002, Part D, Section 174 states that the central mission and primary function of the RELs include supporting applied research and providing technical assistance to state and local education agencies within their region (20 U.S.C. 9564). Failure to approve the data-collection activities related to the evaluation of the Early Math Toolkit would jeopardize this study of the toolkit's impact and thereby prevent the REL AP contractor from fulfilling its mission.
The data-collection activities will contribute to understanding the toolkit’s potential to affect teacher and student outcomes. If this study does not collect data from the surveys, instructional logs, observations, and division administrative records, the study team will be unable to draw conclusions about the toolkit’s effects on teacher and student outcomes.
No special circumstances are involved with this data collection. Data will be collected in a manner consistent with the guidelines in 5 CFR 1320.5.
A 60-day Federal Register Notice was published on January 16, 2024, and no comments were received. A 30-day notice will be published.
In addition, throughout the course of this study, we will draw on the experience and expertise of Dr. Todd Grindal, the associate center director and senior principal researcher for the Center for Learning & Development, SRI Education, and the subject matter expert for this study.
The study proposal has also gone through external peer review as required by the Education Sciences Reform Act (ESRA) for all REL studies. The study proposal was approved after institutional review board (IRB) review through SRI in January 2024.
The study team will compensate only educators who participate in the study and complete PD and/or data-collection activities. Teachers in the intervention group will be paid $40 per hour of PD completed outside of working hours (up to $800 total across the life of the study, based on an estimated 20 hours of professional learning activities); alternatively, the study team will provide funds to the school or division to pay for substitute days so that intervention teachers can complete the PD modules during their normally scheduled working hours. Teachers will self-report the modules and activities they complete on the Implementation Checklist, and the study team will calculate stipends using the estimated number of hours for each module. Teachers will not be paid for implementing the activities in the classroom. Teacher compensation will be determined during the initial discussions with division administrators and school leaders and will follow division policy regarding teacher time, fair compensation, and availability of substitute teachers. Intervention teachers will also receive $25 for each completed Implementation Checklist. In addition, all intervention and control teachers will be compensated $25 for each completed survey or Teacher Instructional Log. The study team will also provide each participating division with $400 for its assistance with data exports from administrative data systems.
All data-collection efforts will be conducted in accordance with all relevant federal regulations and requirements. REL AP will be following the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires “[a]ll collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.
Respondents will be assured that the study team will protect and maintain confidentiality so that respondents and their schools will not be identified. All members of the study team have obtained certification on the use of human subjects in research. The following safeguards to carry out confidentiality assurances are routinely employed at SRI International, the contractor executing this study:
All team members will participate in data-collection training that includes a focus on methods to maintain participant confidentiality and data security.
The study team will provide secure environments for housing all data collected for the study. Paper files will be stored in a locked file cabinet and all digital files will be password-protected so that only project researchers can access them.
The study team will immediately deidentify all data collected during the study that can potentially be linked to an individual and will delete temporary files that are stored on encrypted hard drives during on-site data-collection activities.
Only authorized members of the study team will have direct access to deidentified evaluation databases. Study team members will maintain a high level of focus on ensuring the confidentiality of both quantitative and qualitative data.
The team will not share data obtained for this study, which will be deidentified, with any entity or individual other than the Department and will not use the data for purposes other than this evaluation.
The evaluation team will make certain that all data are held in strict confidentiality, as just described. Deidentified data will be shared with IES and made publicly available after completion of the study. The following statement will appear on all letters to respondents on data collection:
Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific division, school, or individual. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.
All survey responses will be kept strictly confidential. No school, district, or state staff member will have access to survey responses that include respondents’ names, school names, or other information that could potentially be used to identify individuals or schools. The project was reviewed by SRI International’s Institutional Review Board (10331), and the review was completed in January 2024.
In addition, for student information, the data-collection efforts will ensure that all individually identifiable information about students, their academic achievements, their families, and information with respect to individual schools, shall remain confidential in accordance with section 552a of Title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act. The study will also adhere to requirements of subsection (d) of section 183 prohibiting disclosure of individually identifiable information as well as making the publishing or inappropriate communication of individually identifiable information by employees or staff a felony.
The evaluation team will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released publicly. Information from participating institutions and respondents will be presented at aggregate levels in reports. No individually identifiable information will be maintained by the study team upon study completion.
We plan to request administrative demographic data on students including gender and race/ethnicity, which we will use as covariates in our analyses. These covariates will increase the precision of the estimates of the intervention’s effect and allow for subgroup analyses to test whether the intervention is more effective for certain subgroups of students. These data are not sensitive primary data.
Table 2 shows the hourly burden for three types of data-collection activities: extant data provided by the divisions, teacher data collected from all study participants, and implementation data collected from intervention teachers. The total burden associated with data collection for this study is 696.6 hours: 76.5 hours for extant data collection and 620.1 hours for teacher and implementation data collection. The total burden for recruitment and data collection is 791.6 hours. The number of responses is 6,720 for data collection and 7,040 for the study as a whole, including recruitment.
Table 2. Estimated Annual Burden and Respondent Costs

| Information activity | Sample size | Respondent response rate | Number of respondents | Responses per respondent | Number of responses | Average burden hours per response | Total annual burden hours | Estimated respondent average hourly wage | Total annual costs |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Extant data collection | | | | | | | | | |
| Student assessment data | 1 | 100% | 1 | 1 | 1 | 4 | 4 | $50.00 | $200.00 |
| Teacher distribution of opt-out form to students/parents | 106 | 85% | 90 | 1 | 90 | 0.25 | 22.5 | $50.00 | $1,125.00 |
| Administrative data on schools and students | 50 | 100% | 50 | 1 | 50 | 1 | 50 | $50.00 | $2,500.00 |
| Subtotal | | | | | 141 | | 76.5 | | $3,825.00 |
| Teacher data collection | | | | | | | | | |
| Teacher consent | 112 | 90% | 100 | 1 | 100 | 0.2 | 20 | $50.00 | $1,000.00 |
| Teacher survey – email request to take survey (3 emails per time point) | 106 | 100% | 106 | 6 | 636 | 0.05 | 31.8 | $50.00 | $1,590.00 |
| Baseline teacher survey | 106 | 85% | 90 | 1 | 90 | 0.5 | 45 | $50.00 | $2,250.00 |
| Endline teacher survey | 106 | 85% | 90 | 1 | 90 | 0.5 | 45 | $50.00 | $2,250.00 |
| Teacher log – email request to complete log (10 emails per weekly log) | 106 | 100% | 106 | 30 | 3,180 | 0.05 | 159 | $50.00 | $7,950.00 |
| Teacher log (daily log for 3 weeks) | 106 | 80% | 84 | 15 | 1,590 | 0.10 | 159 | $50.00 | $7,950.00 |
| EMTOT observation | 106 | 100% | 106 | 2 | 212 | 0.5 | 106 | $50.00 | $5,300.00 |
| Implementation checklist – email request to complete checklist (3 emails per checklist, 4 checklists) | 53 | 100% | 53 | 12 | 636 | 0.05 | 31.8 | $50.00 | $1,590.00 |
| Implementation checklist | 53 | 85% | 45 | 1 | 45 | 0.5 | 22.5 | $50.00 | $1,125.00 |
| Subtotal | | | | | 6,579 | | 620.10 | | $31,005.00 |
| Total data collection burden | | | | | 6,720 | | 696.60 | | $34,830.00 |
| Recruitment (already approved under 1850-0991) | | | | | 320 | | 95.00 | | $4,750.00 |
| Recruitment + data collection grand total | | | | | 7,040 | | 791.60 | | $39,580.00 |
The total respondent cost associated with data collection for this study is approximately $34,830.00: $3,825.00 for extant data collection and $31,005.00 for teacher and implementation data collection.
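As a check, the burden and cost totals reported above and in Table 2 follow directly from the component subtotals:

$$76.5 + 620.1 = 696.6 \text{ hours}, \qquad 696.6 + 95.0 = 791.6 \text{ hours}$$

$$\$3{,}825 + \$31{,}005 = \$34{,}830, \qquad \$34{,}830 + \$4{,}750 = \$39{,}580$$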
This is a one-time series of data collection activities; there are no plans for follow-up studies or other recurring data collections outside of what is being proposed in this package.
The total cost to the federal government for work conducted over all four years is $1,182,058.26, and the estimated annualized cost to the federal government for each year of the study is $295,514.57.
Funding includes staff time to collect, clean, and analyze data from the study. The total also includes costs incurred by the study team related to study preparation and submission of the study information to IES (from proposed research design through reporting of results).
This is a request for a revision in order to add additional collection of information. The burden for this data collection is being added to the burden approved for recruitment under 1850-0991.
To estimate the impact of the toolkit PD resources on teacher and student outcomes (research questions 1 to 4), we will use hierarchical linear modeling (HLM) to account for the clustering of observations within schools and divisions (Raudenbush & Bryk, 2002), controlling for baseline scores on each outcome measure and other relevant covariates. Models estimating teacher-level impacts will account for the nesting of teachers within schools, and models estimating student-level impacts will account for the nesting of students within classrooms and schools. Models will be estimated separately for each outcome; an illustrative specification is shown below. To analyze the implementation fidelity research questions (research questions 5 to 7), we will examine means, standard deviations, and frequencies (percentages) of the implementation measures. Additionally, we will conduct qualitative analyses of the open-ended responses on the Toolkit Satisfaction Survey to examine how teachers used the toolkit resources, how they approached challenges, and what improvements they suggest for the toolkit.
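For illustration only, a simplified two-level model for the student math outcome (omitting the classroom and division levels for readability) could take the following form; the notation is illustrative rather than the final specification:

$$Y_{ij} = \beta_0 + \beta_1 T_j + \beta_2 X_{ij} + \beta_3 W_j + u_j + e_{ij}, \qquad u_j \sim N(0, \tau^2), \quad e_{ij} \sim N(0, \sigma^2)$$

where \(Y_{ij}\) is the spring math score for student \(i\) in school \(j\), \(T_j\) indicates assignment to the toolkit condition, \(X_{ij}\) and \(W_j\) are student- and school-level covariates (including baseline measures), \(u_j\) is a school random effect, and \(e_{ij}\) is the student-level residual. In this sketch, \(\beta_1\) is the estimated average effect of being offered the toolkit.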
The results of this study will be made available to the public through a peer-reviewed report published by IES. The study team will produce and disseminate a report on the efficacy study findings with an expected release in 2026. The primary audience consists of preschool teachers, instructional leaders, and division and school leaders, as it will provide them with information on the extent the toolkit improved outcomes as well as implementation context and challenges. The secondary audience consists of IES and the REL team that developed the toolkit, as the findings will inform potential refinements to the toolkit.
The datasets from these studies will be turned over to the REL’s IES contracting officer’s representative to become IES restricted-use datasets requiring a user’s license (see http://nces.ed.gov/pubs96/96860rev.pdf for procedures related to obtaining and using restricted-use datasets). These files will contain all the primary survey data collected for the study with all personal identifiers removed. All restricted-use files are required to be reviewed by IES’s Disclosure Review Board (DRB), comprising members from each National Center for Education Statistics (NCES) Division, representatives from IES’s Statistical Standards Program, and a member from each of the IES Centers. The DRB will review disclosure risk analyses conducted by the REL contractor to ensure that data released do not disclose the identity of any individual respondent. The DRB approves the procedures used to remove direct identifiers from restricted-use data files. Administrative data will not be included in the data file, but instructions on how to obtain those data, and information on how those data were used in the analysis will be included. These data files are made available so that other researchers can replicate the REL’s research or answer additional research questions.
No responses or data will be reported for individual staff members, students, or schools. Reported data will contain no fewer than four cases per reported table cell to protect confidentiality and mask individually identifiable data.
The Institute of Education Sciences is not requesting a waiver for the display of the OMB approval number and expiration date. The data-collection instruments will display the expiration date for OMB approval.
This submission does not require an exception to the Certificate for Paperwork Reduction Act (5 CFR 1320.9).
Duncan, G. J., Dowsett, C. J., Claessens, A., Magnuson, K., Huston, A. C., & Klebanov, P. (2007). School readiness and later achievement. Developmental Psychology, 43, 1428−1446. https://doi.org/10.1037/0012-1649.43.6.1428
Education Sciences Reform Act, Public Law 107-279. 20 U.S.C. ch. 76 §§9501-9631. (2002).
Frye, D., Baroody, A. J., Burchinal, M., Carver, S. M., Jordan, N. C., & McDowell, J. (2013). Teaching math to young children practice guide (NCEE 2014-4005). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. https://ies.ed.gov/ncee/wwc/practiceguide/18
Ginsburg, H. P., & Pappas, S. (2016). Invitation to the birthday party: Rationale and description. ZDM Mathematics Education, 48, 947–960. https://doi.org/10.1007/s11858-016-0818-4
Pianta, R. C., La Paro, K. M., & Hamre, B. K. (2008). Classroom Assessment Scoring System (CLASS) manual, Pre-K. Paul H. Brookes Publishing.
Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed.). Sage.
Reid, E. E., & Melgar, C. (2018, March). A center-based approach to changing teacher math attitudes in Head Start centers. Poster presented at the annual Erikson Research Symposium.
Virginia Department of Education (VDOE). (n.d.). Fall Membership Build-A-Table. https://p1pe.doe.virginia.gov/apex_captcha/home.do?apexTypeId=304
Watts, T. W., Duncan, G. J., Siegler, R. S., & Davis-Kean, P. (2014). What’s past is prologue: Relations between early mathematics knowledge and high school achievement. Educational Researcher, 43(7), 352–360.