Visual Representations for Proportional Reasoning: Impacts of a Teacher Professional Development Program for Multilingual Learners and Other Students

OMB: 1850-0978


Regional Education Laboratory

Northeast & Islands






Supporting Statement B

Section B. Data Collection Procedures and Statistical Methods

1. Respondent Universe and Sampling Methods

Universe of Schools and Eligibility Requirements

To be eligible to participate in the study, schools must meet the following eligibility criteria:

  • They must serve 7th grade students in general education classroom settings

  • They must be public schools located in Connecticut

Public data available through the Common Core of Data (CCD) indicate that there were approximately 285 public schools within this eligibility pool at the end of the 2021 school year.

Universe of Teachers Within Participating Schools

To be eligible to participate in this study, teachers must meet the following eligibility criteria:

  • They must teach a regular education mathematics class for 7th grade students

Teachers considering participation in the study will indicate their willingness to provide logistical support for the study’s data collection activities. Teacher measures will be administered to all participating teachers. No sampling of teachers will take place.

Universe of Students Within Participating Schools

The evaluation team will include all 7th grade students within participating teachers’ mathematics classes. No sampling of students will take place, and no subgroups will be excluded.

2. Procedures for Collection of Information

All eligible schools with at least one teacher who volunteered for the study (XX schools) will be included. The types of data to be collected are:

  1. Teachers’ responses to four measures administered electronically

  2. Student responses to four measures, three of which are administered electronically and one of which is administered on paper

  3. Administrative student records

  4. Administrative teacher records

The processes for collecting the data are further described in the following sections.

Preliminary Activities

Project partners at the Connecticut State Department of Education (CSDE) will lead and carry out all project recruitment activities. After project recruitment and before collecting data, project staff will conduct two activities:

  1. Communicate data needs to schools. Project staff will share a schedule of data collection activities with teachers and any relevant school or district staff contacts. The project team will then conduct a phone conference with the point of contact to discuss data collection plans and confirm the availability of the necessary student records.

  2. Obtain the necessary school staff and parental consent. The processes for obtaining consent from these two groups are as follows:

    1. Teacher consent: “Active consent” forms will be shared electronically with teachers considering participation in the study. This electronic form (see Attachment #) informs them of the study’s purpose, the data collection activities in which they will participate, the risks and benefits of participation, their freedom to withdraw from the study without repercussions, and contact information for the Institutional Review Board and the principal investigator. Teachers have the choice of indicating “Yes, I agree to participate” or “No, I do not wish to participate.” Teachers will complete these forms electronically, and a copy of the form and their response will be emailed to them automatically upon completion for their records. After all teachers have returned their consent forms, REL staff will request the state-issued unique identifier for teacher records. These unique identifiers will allow REL staff to merge teacher data from administrative records with primary data collected by the research team and to determine response rates for teacher data collections. A separate crosswalk data file containing teachers’ names and their ID numbers will be created. This file will be encrypted and shared only with REL staff who are (1) e-mailing staff to request that they complete a data collection instrument, or (2) processing the data provided by CSDE.

    2. Parental consent: The project team will provide schools with the necessary “opt out” information letters (passive consent documents; see Attachment #) to distribute to all guardians of 7th grade students. These letters give parents all of the information found on a consent form and allow parents to “opt out” by indicating they do not consent to any collection of data from their child for study purposes. Schools will collect and keep the letters that are returned, and in mid-September they will provide the study team with the state-issued unique identifier for students in grade 7 whose guardians did not withhold consent for their child to participate in the study. These unique identifiers will allow REL staff to merge student data from administrative records with primary data collected by the research team and to determine response rates for student data collections (a merge sketch follows this list). A separate crosswalk data file containing student ID numbers and their associated teacher and school will be created. This file will be encrypted and shared only with REL staff who are processing the data provided by CSDE.
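To illustrate how the state-issued identifiers and crosswalk files described above support merging primary data with administrative records and computing response rates, the following is a minimal sketch in Python; the file and column names (for example, state_student_id) are hypothetical assumptions for illustration, not the project’s actual data-processing code.

    # Minimal sketch, with hypothetical file and column names, of merging primary
    # data with CSDE administrative records via the state-issued unique identifier.
    import pandas as pd

    survey = pd.read_csv("student_measures.csv")      # primary data collected by the research team
    admin = pd.read_csv("csde_student_records.csv")   # administrative records provided by CSDE
    crosswalk = pd.read_csv("student_crosswalk.csv")  # student ID linked to teacher and school

    # Merge on the state-issued unique identifier so analysis files contain no student names.
    analysis = (
        survey.merge(admin, on="state_student_id", how="left")
              .merge(crosswalk, on="state_student_id", how="left")
    )

    # Response rate for a student data collection: respondents over consented students.
    response_rate = (
        survey["state_student_id"].nunique() / crosswalk["state_student_id"].nunique()
    )
    print(f"Student response rate: {response_rate:.1%}")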

Collection of Teacher Measures

All participating teachers will complete each teacher measure electronically. The Diagnostic Teacher Assessments in Mathematics and Science (DTAMS) and the pre/post surveys will be completed via Alchemer. Participating teachers will receive an email with a unique link to complete these measures in August 2023 and again between March and May of 2024.

Collection of Student Measures

The REL NEI team will work with participating teachers to coordinate the administration of student measures. The Proportional Problem Solving (PPS), Math Self-Concept, and Math Anxiety measures will all be completed electronically via Alchemer. Students whose guardians did not withhold consent will receive an email with a unique link to complete these measures in September 2023 and again between March and May of 2024. School-issued email addresses will be used to ease the logistical burden on participating teachers of administering these measures.

The visual representation (VR) measure will be administered on paper to all students whose guardians did not withhold consent for their child to participate in the study. Packages with student prompts and response pages will be mailed to each participating school, with pre-paid materials to return the completed measures. All students will complete this measure and all responses will be returned; however, due to budgetary constraints, only a sample of student responses will be scored. This modification is necessary because student responses on the VR measure must be scored according to a rubric and electronic scoring cannot be configured for this specific measure. We will randomly select 25 percent of the student responses for scoring and analysis, stratifying by teacher. To optimize statistical power and emphasize our focus on multilingual learner (MLL) students, we will strategically select VRs drawn by MLL students at a 3:1 ratio for this analysis (a sampling sketch follows this paragraph).
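A minimal sketch of one way to implement this selection follows, assuming a pandas data frame of returned VR responses with hypothetical columns teacher_id and is_mll; treating the 3:1 emphasis on MLL responses as within-teacher sampling weights is one possible operationalization, not the study’s final procedure.

    import numpy as np
    import pandas as pd

    def draw_vr_scoring_sample(responses: pd.DataFrame, frac: float = 0.25,
                               mll_ratio: float = 3.0, seed: int = 2024) -> pd.DataFrame:
        """Draw 25 percent of VR responses within each teacher, weighting MLL responses 3:1."""
        # Weight each response: MLL students' responses receive 3x the selection weight.
        weights = np.where(responses["is_mll"], mll_ratio, 1.0)
        return (
            responses.assign(_weight=weights)
            .groupby("teacher_id", group_keys=False)
            .apply(lambda g: g.sample(frac=frac, weights=g["_weight"], random_state=seed))
            .drop(columns="_weight")
        )

    # Example: scoring_sample = draw_vr_scoring_sample(vr_responses)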

Administrative Data Request

The REL-NEI team will work with the CSDE to execute a data sharing agreement for the administrative records request. It is expected that this data sharing agreement will be executed in December 2023. After this agreement is in place, CSDE staff will assemble two data files, one containing teacher-level records and another containing student-level records, and will then upload the data files to REL-NEI’s secure FTP site.



3. Methods to Maximize Response Rates and To Deal With Non-Response

The U.S. Department of Education (ED) is committed to obtaining complete data for this study. To ensure complete data, project staff will maintain e-mail and telephone contact with the appropriate parties. All data received for the study will be reviewed for completeness. When data contain too many missing values, project staff will contact research staff at schools or state-level educational organizations to better understand the sources of the missing data elements.

High response rates are expected for this study, given that teachers volunteer to participate and incentives provide financial compensation for the data collection burden. Nonresponse follow-up will be performed to ensure adequate response rates of 85 percent or higher, consistent with the National Center for Education Statistics’ Statistical Standards (Seastrom, 2002). Student nonresponse could occur if a student is absent on the day of the student survey and readiness assessment, if the student leaves the school during the study year, or if the student’s parents do not give consent for the student to participate. It is expected that parents will not have significant concerns that would result in withdrawal of their children from the study.

Several additional steps will be taken to maximize response rates. Teacher participants and student guardians will receive advance communication explaining the study. The staff consent form will give teacher participants full knowledge of the data collection activities involving them and of the financial compensation for the data collection burden; it will also provide assurance that the data will remain confidential. Staff members will receive copies of the consent form, which will list contact information for obtaining additional information about the study or resolving any questions. The guardian information letter and consent form will give parents full knowledge of the data collection activities involving students and the assurance that data will remain confidential. Project staff will track teacher and student completion rates and follow up with teachers who have not completed data collection activities, including teacher and student measures, in the allotted timeframes.

If a response rate lower than 85 percent is achieved, researchers will conduct a bias analysis to compare the characteristics of respondents with those of nonrespondents. Because little information about staff characteristics will be available to compare staff members who responded to the survey with those who did not, researchers will instead assess the extent to which schools with high response rates differ from schools with low response rates by testing (using a t-test) for statistical differences in school-level characteristics such as the percentage of multilingual learners. For students, a bias analysis will be conducted if a response rate lower than 85 percent is achieved, comparing the administrative-record characteristics of students who responded to the measure with those of students who did not. Univariate statistical analyses (t-tests) will compare respondents and nonrespondents on individual characteristics (e.g., the test scores of students who did and did not respond), and a multivariate statistical analysis (logistic regression) will consider whether a combination of student characteristics predicts nonresponse.
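A minimal sketch of these planned comparisons is shown below, assuming an analysis file with a responded indicator (1 = responded, 0 = did not respond) and illustrative covariate names; the variable names are assumptions for illustration only, not the study’s actual data elements.

    import pandas as pd
    import statsmodels.api as sm
    from scipy import stats

    def nonresponse_bias_checks(df: pd.DataFrame, covariates: list) -> None:
        respondents = df[df["responded"] == 1]
        nonrespondents = df[df["responded"] == 0]

        # Univariate checks: Welch t-tests comparing respondents and nonrespondents
        # on each characteristic (for example, prior mathematics test scores).
        for col in covariates:
            t_stat, p_val = stats.ttest_ind(
                respondents[col].dropna(), nonrespondents[col].dropna(), equal_var=False
            )
            print(f"{col}: t = {t_stat:.2f}, p = {p_val:.3f}")

        # Multivariate check: logistic regression predicting nonresponse
        # from the full set of characteristics.
        X = sm.add_constant(df[covariates])
        model = sm.Logit(1 - df["responded"], X, missing="drop").fit(disp=0)
        print(model.summary())

    # Example: nonresponse_bias_checks(student_frame, ["prior_math_score", "attendance_rate"])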

Reports will clearly note any differences found between respondents and nonrespondents.

4. Tests of Procedures or Methods to be Undertaken

This study employs measures that have been used previously in empirical research and have existing evidence of reliability and validity (Jitendra et al., 2015; OECD, 2014; Kersting, 2008; Kersting et al., 2012; DePiper et al., 2021; University of Louisville CRIMSTeD, 2020). None of the outcome measures administered in this study were developed specifically for the VAM PD or for the purposes of this study.

The implementation study includes data collection using a pre/post survey for participating teachers that was developed for this study. The survey protocols were developed in collaboration with partners from CSDE to include contextual information that will support the interpretation of results. For example, all teachers across groups will be asked about the curricular programs they use to teach mathematics and about the instructional practices they incorporate in their classrooms. This descriptive information about classroom context will document the service contrast between conditions, helping to explain impact results for students taught by participating teachers; describing how the treatment and control conditions differed is standard, expected practice for a study of this nature.



5. Individuals Consulted on Statistical Aspects of the Design

The following individuals were consulted on the statistical, data collection, and analytic aspects of the VAM PD efficacy study:

  • Ryan Williams, Principal Researcher, American Institutes for Research

  • Jingtong Pan, Senior Researcher, American Institutes for Research

  • Jill Battal, Senior Researcher, American Institutes for Research

  • Elisabeth Davis, Principal Researcher, American Institutes for Research

  • Kathleen Jones, Researcher, American Institutes for Research

  • Deborah Holtzman, Principal Researcher, American Institutes for Research

  • Johannah Nikula, Senior Project Director, Education Development Center

  • Pam Buffington, Senior Project Director, Education Development Center

  • Josephine Louie, Senior Research Scientist, Education Development Center

  • Sarah Ryan, Research Scientist, Education Development Center

Electronic file transfers will be conducted by:

  • Kathleen Jones, Researcher, American Institutes for Research

Online data collection and analysis will be overseen by:

  • Jill Battal, Senior Researcher, American Institutes for Research

  • Jingtong Pan, Senior Researcher, American Institutes for Research



6. References


DePiper, J. N., Nikula, J., Buffington, P., Louie, J., & Tierney-Fife, P. (2021b). Learning to attend to and interpret multilingual learners’ mathematical thinking: A professional development story. Manuscript submitted for publication. 

Jitendra, A. K., Harwell, M. R., Dupuis, D. N., Karl, S. R., Lein, A. E., Simonson, G., & Slater, S. C. (2015). Effects of a research-based mathematics intervention to improve seventh-grade students’ proportional problem solving: A cluster randomized trial. Journal of Educational Psychology, 107, 1019–1034. https://eric.ed.gov/?id=ED572835 

Kersting, N. (2008). Using video clips as item prompts to measure teachers’ knowledge of teaching mathematics. Educational and Psychological Measurement, 68, 845–861.

Kersting, N. B., Givvin, K. B., Thompson, B. J., Santagata, R., & Stigler, J. W. (2012). Measuring usable knowledge: Teachers’ analyses of mathematics classroom videos predict teaching quality and student learning. American Educational Research Journal, 49(3), 568–589.  

OECD [Organisation for Economic Co-operation and Development]. (2014). PISA 2012 technical report. Author.

Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods. Sage.

Seastrom, M. M. (2002). NCES statistical standards. U.S. Department of Education, Institute of Education Sciences.

University of Louisville CRIMSTeD. (2020). Diagnostic teacher assessments in mathematics and science (DTAMS). https://louisville.edu/education/centers/crimsted/dtams/diag-math-assess-middle 






