


THE IMPACT OF PROFESSIONAL DEVELOPMENT IN FRACTIONS FOR FOURTH GRADE TEACHERS ON STUDENT ACHIEVEMENT AND TEACHER KNOWLEDGE IN GEORGIA AND SOUTH CAROLINA



OMB Clearance Request

Supporting Statement Part A



April 18, 2014



Prepared for:

NCEE Contracting Officer’s Representative: Sandra Garcia

U.S. Department of Education

Institute of Education Sciences

555 New Jersey Ave., NW, Rm. 506C

Washington, DC 20208

(202) 219-1597


Submitted by:

Regional Educational Laboratory Southeast at the Florida Center for Reading Research – Florida State University

2010 Levy Avenue, Suite 100

Tallahassee, FL 32310

(850) 644-9352



Lab Director:

Barbara Foorman, Ph.D.

Florida State University

2010 Levy Avenue Suite 100

Tallahassee, FL 32310

(850) 644-9352

[email protected]

http://rel-se.fsu.edu

Principal Investigator:

Russell Gersten, Ph.D.

Instructional Research Group

4821 Katella Avenue Suite 205

Los Alamitos, CA 90720

Phone: (714) 826-9600

Fax: (714) 826-9610

http://www.inresg.org





Contents


Supporting Statement for Paperwork Reduction Act Submission

A. Justification

  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information

  2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection

  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or any other technological collection techniques or forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision of adopting the means of collection. Also describe any consideration of using information technology to reduce burden

  4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use of the purposes described in Item 2 above

  5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-1), describe any methods used to minimize burden

  6. Describe the consequences to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden

  7. Explain any special circumstances that would cause an information collection to be conducted in a manner inconsistent with Section 1320.5(d)(2) of the federal regulations

  8. Federal Register Comments and Persons Consulted Outside of the Agency

  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees

  10. Describe any assurances of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy

  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. The justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent

  12. Provide estimates of the hour burden of the collection of information

  13. Describe any other costs to respondents or record keepers

  14. Provide estimates of annualized cost to the Federal government. Also provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expenses that would not have been incurred without this collection information. Agencies may aggregate cost estimates from Items 12, 13, and 14 in a single table

  15. Describe any changes in the burden from prior approvals

  16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of the report, publication dates, and other actions

  17. Describe arrangements for displaying the number provided by OMB and its expiration date

  18. Exceptions to Certification Statement



References




Appendices


Appendix A: Approved Teacher Consent & Demographic Form

Appendix B: Mathematical Knowledge for Teaching (MKT) Sample Items

Appendix C: Teacher Professional Development Survey

Appendix D: District/School Memorandum of Understanding (MOU)

Appendix E: Frequently Asked Questions

Appendix F: Approved Parent/Guardian Information Letter and Opt-out

Appendix G: Approved Student Assent Form

Appendix H: Process for Selecting DMI from Possible Programs

Appendix I: Placeholder for 60-Day and 30-Day Federal Register Notices

Appendix J: Details for Question A12, Estimates of Burden of DMI PD




SUPPORTING STATEMENT

FOR PAPERWORK REDUCTION ACT SUBMISSION


A. Justification


A1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


The dismal state of student mathematics achievement, especially in fractions, has been well documented in research literature (e.g., Ma, 1999; Yanik, Helding, & Baek, 2006); national reports (e.g., National Mathematics Advisory Panel [NMAP], 2008); the IES practice guide on fractions (Siegler et al., 2010); and the results of national assessments (National Assessment of Educational Progress [NAEP], 2011). The National Mathematics Advisory Panel (NMAP) reported that “difficulty with fractions (including decimals and percent) is pervasive and is a major obstacle to further progress in mathematics, including algebra” (2008, p. xix). In fact, Siegler and his colleagues (2012) recently found that knowledge of fractions at age 10 is predictive of algebra knowledge and overall mathematics achievement in high school, above and beyond the effects of general intellectual ability, other mathematical knowledge, and family background.1 There is, in general, a consensus among researchers, educators, and policy makers that improving American students’ knowledge of fractions should be a major educational priority. Indeed, the National Center for Education Research at IES recently awarded a $10 million grant to fund a five-year research and development center aimed at understanding the difficulties students have with fractions and developing effective interventions for struggling learners in this area (Center for Improving Learning of Fractions, 2013).


Unfortunately, research examining teacher knowledge in mathematics has repeatedly documented that elementary teachers often have limited mathematical knowledge, particularly in the area of fractions (Hill et al., 2008; Hill, Kapitula, & Umland, 2011; Hill, Rowan, & Ball, 2005; Ma, 1999). Providing professional development that addresses teachers’ mathematical content knowledge underlying the Common Core State Standards (CCSS) is therefore a necessary step for full and effective implementation of the CCSS. With the rollout of the CCSS in mathematics scheduled for 2014–2015 in Georgia and South Carolina2, understanding the mathematical ideas and concepts involved with fractions and decimals, as well as computational algorithms, is a critical professional development need in the region and the nation.


There is a small body of high-quality experimental and quasi-experimental research examining the impact of intensive mathematics professional development (PD) programs on student mathematics outcomes. The research team conducted a What Works Clearinghouse-type review of rigorous research evaluating mathematics PD approaches, searching the literature published between January 2006 and July 2012 as well as rigorous studies identified in a previous review by Yoon, Duncan, Lee, Scarloss, and Shapley (2007). All studies that could potentially provide causal evidence of the effectiveness of teacher PD on student achievement were analyzed. Among the five studies of acceptable technical quality, two professional development approaches, Lesson Study and intensive mathematics content courses, either yielded statistically significant outcomes on some mathematics measures or indicated relatively large impacts on mathematics outcomes (effect size greater than .25).


There is clearly a need for additional research conducted under more typical conditions and with a clear topical focus. Thus, the purpose of the proposed study is to assess the impact of a professional development program that focuses on developing teachers’ knowledge of the formal mathematics underlying fractions, as well as pedagogical techniques that promote both student understanding of the mathematical ideas and computational proficiency. The study team, in consultation with experts in the field of mathematics, selected fourth grade as the focus of this study because it is a critical year for fractions instruction, laying the foundation for understanding the meaning of fractions in later grades.


For this project, the study team will use a randomized controlled trial design to study the Developing Mathematical Ideas (DMI; Schifter, Bastable, & Russell, 2010) program to determine if such an approach to professional development is effective in producing positive impacts on student achievement in mathematics. DMI was selected after a comprehensive examination of professional development programs, as it has a robust infrastructure for implementing on a large scale.


DMI was selected in partnership with a research alliance sponsored by REL Southeast. Members of the alliance helped review and select DMI from other candidate programs. This process is presented in more detail in Appendix H.

DMI was created and field-tested by the Education Development Center (EDC) through a grant funded by the National Science Foundation. The primary objectives of the DMI program are for teachers to: a) spend time on the actual teaching of mathematical topics to students in their classes, and b) explore students’ thinking about mathematics, including the misconceptions and hazy conceptions of mathematical ideas that develop during early phases of instruction.


Using a random assignment design for this study helps ensure that, all else equal, the study will yield the strongest, most reliable evidence possible on which to base policy and practice.


The current authorization for the Regional Educational Laboratories program is under the Education Sciences Reform Act of 2002, Part D, Section 174, (20 U.S.C. 9564), administered by the Department of Education (ED), Institute of Education Sciences (IES), National Center for Education Evaluation and Regional Assistance (NCEE).


A2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


Research Questions and Overview of the Project Plan


This study is a large-scale evaluation of a mathematics professional development program for grade 4 teachers: Developing Mathematical Ideas (DMI). The goal of the proposed study is to assess the impact of a professional development program that focuses on developing teachers’ knowledge of the formal mathematics that underlies fractions, in order to promote both student understanding of the mathematical ideas and computational proficiency. Information will be collected by the study team at the Instructional Research Group (IRG) and Florida State University, the prime contractor of the Regional Educational Laboratory Southeast (REL Southeast), and will be analyzed and used to answer the research questions and build upon existing IES initiatives. IES will review all results and release a report describing the study’s findings.


The study team will address the following research questions, the first confirmatory3 and the second exploratory:

RQ1. What is the impact of teacher participation in Developing Mathematical Ideas for one year on students’ proficiency in the area of fractions when compared with those students whose teachers receive existing professional development activities provided by schools and districts?


RQ2. What is the impact of teacher participation in Developing Mathematical Ideas for one year on teacher knowledge of mathematics relevant for teaching fractions to 4th graders when compared with teachers who only participate in existing professional development activities provided by schools and districts?


To answer these questions, the study team has developed an experimental research design wherein data will be collected over the course of one year using a sample of fourth grade teachers from at least 80 schools4 in diverse districts (city/town/rural) in Georgia and South Carolina. Only schools with two or more fourth grade classrooms will be recruited. Schools will be randomly assigned to treatment and control conditions (Shadish, Cook, & Campbell, 2002) and all fourth grade teachers in each participating school will be asked to participate. School-level random assignment will reduce the potential for contamination between intervention and control group teachers. Assuming there are 82 schools participating, this will yield 41 treatment and 41 control schools. The study team will need to collect new information from the participating districts and schools through student and teacher testing, analysis of audio recordings of DMI professional development sessions, and teacher surveys. Data routinely collected and compiled by school districts will also help answer these questions. The data collection measures for each research question are summarized in Table A1.





TABLE A1

PROPOSED DATA COLLECTION PLAN

| Respondent | Data Collection Source | Mode | Timeline | Key Data |
| --- | --- | --- | --- | --- |
| Student | Achievement testing of all students | Post-test only | May–June 2015 | Student achievement in mathematics (RQ1) |
| School/District | Student records | | July 2014 | Student grade 3 state assessment scores to use as a pretest covariate and student demographic information (RQ1) |
| Teacher | Consent form | | May–June 2014 | Teacher demographic information (RQ2) |
| Teacher | Fractions measure (all teachers) | Pretest and post-test | Pretest: Aug.–Sept. 2014; Post-test: May–June 2015 | Fractions measure (RQ2) |
| Teacher | Monthly online surveys of professional development (all teachers) | Online survey (monthly) | September 2014 – April 2015 | Professional development activities in treatment and control schools (RQ2) |
| PD Facilitator | Procedural fidelity checklists to use when analyzing audio tapes of DMI sessions (treatment PD only) | | August 2014 – November 2014 | DMI fidelity (RQ1 & 2) |



Student and Teacher Assessment


Student Assessment: For the student-level confirmatory analysis, grade 4 students will be assessed with the Test for Understanding of Fractions (TUF), which is currently being developed by the IES/NCSER (National Center for Special Education Research) Center for Improving Learning of Fractions. The TUF is being developed in conjunction with the PIs of the Center (Nancy Jordan, Lynn Fuchs, and Robert Siegler) and will include, in part, measures that have been field-tested in their current longitudinal and intervention research with fourth graders. These existing measures focus on both procedural and conceptual knowledge and include, for example, items from NAEP, batteries used in the research of Hecht, Close, and Santisi (2003), and measures specifically developed by Jordan (e.g., Jordan et al., 2013).


In summary, items aligned with the fourth grade objectives for fractions in the CCSS will be selected or adapted. As items from various sources will be incorporated into the measure for use in this study, scales will be constructed and evaluated using both classical test theory and item-response theory.
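
Because the scale will combine items from multiple sources, a simple classical-test-theory check such as Cronbach’s alpha can flag item sets that do not hang together. The sketch below is purely illustrative, using fabricated 0/1 item responses; it is not the study’s actual psychometric plan, which will also include item response theory analyses.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                          # number of items
    item_vars = item_scores.var(axis=0, ddof=1)       # per-item variances
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Fabricated example: 50 examinees answering 10 dichotomously scored items,
# with responses driven by a shared latent proficiency plus noise.
rng = np.random.default_rng(0)
ability = rng.normal(size=(50, 1))
responses = (ability + rng.normal(size=(50, 10)) > 0).astype(int)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```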


Teacher Assessment: The impact of the DMI intervention on teacher knowledge of mathematics relevant for teaching fractions to 4th graders will be assessed using the Mathematical Knowledge for Teaching (MKT) measure (Ball & Hill, 2008). The IRT reliability for the MKT at the elementary level is .94 (Hill, Rowan, & Ball, 2005). The MKT consists of a large pool of items; relevant items on fractions will be selected under the guidance of Kristin Umland and Jim Lewis, two mathematicians. Sample items are included in Appendix B. The final measure is expected to take less than 1 hour to complete.

Note that the MKT teacher knowledge test is strictly an assessment and is not included in the burden estimates.





Data Collection Methods

In each district, data will be collected from both treatment and control schools at the same time, by assessors blind to condition, to guard against bias entering the data collection process.


Collecting Teacher Data: Active consent of individual teachers for participation will be acquired before schools are randomly assigned. Prior to the start of the intervention (August 2014), the teacher demographics form (e.g., teacher experience and education) will be collected and teachers will take the teacher knowledge of mathematics (MKT)5 assessment. The demographic survey is included in our calculations of respondent burden; demographic information will be collected at the outset of the study as part of the teacher consent form, a copy of which is included with this submission as Appendix A. Immediately after the end of the professional development intervention, teachers will complete the MKT measure again (for RQ2). Note that the MKT teacher knowledge test is strictly an assessment and is not included in the burden estimates; sample MKT items are included as Appendix B. A monthly online survey assessing the type and amount of professional development teachers participate in will be sent to all teachers using SurveyMonkey™. This survey is included in our calculations of respondent burden; a copy of the monthly teacher survey is included with this document as Appendix C.


Collecting Student Data: Students’ prior-year pretest data from the third grade state assessment will be collected from school districts when they become available in 2014. Student demographic data will be gathered from school databases upon completion of the parent opt-out phase. The study team has a waiver of active consent from the IRB, as direct engagement of students in the study is limited strictly to the fractions outcome assessment; a 30–40 minute group assessment (very similar to the state assessments) represents very low risk to students.


Post-test data will be collected in the spring after the professional development has ended, approximately 4–6 weeks before the end of the school year. Students will be given the Test for Understanding of Fractions (TUF) measure by trained evaluation staff in large groups during independent seatwork, to maximize teacher instructional time during the school day (RQ1).


The administration of an achievement test like the TUF is not considered an information collection burden under 5 CFR 1320.3(h)(7).


Collecting School Data: School level data (e.g., free and reduced lunch, school achievement scores) will be collected from the prior year school report cards for purposes of matching schools.


Prior-year student achievement data are pre-existing, as they have already been collected by schools and districts. The time district personnel spend locating and organizing these data for us is included in our calculation of burden. Using existing school records helps minimize overall burden because it avoids collecting the same data items from individual respondents at the school level.


Collecting Fidelity of Implementation Data: DMI facilitators at each site will be asked to audiotape each session they hold with participating teachers. Two research staff with training in observation systems will listen to each tape and use a fidelity of implementation instrument, developed in conjunction with the program developers, to measure adherence to critical core components of the DMI program. Discrepancies between the two raters will be resolved by senior staff. It is our understanding that the collection of this audio data is not considered in calculations of paperwork burden.
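
As a hypothetical illustration of how the two raters’ checklist codes might be compared before senior staff adjudicate discrepancies, a chance-corrected agreement statistic such as Cohen’s kappa could be computed. The source does not specify an agreement statistic, so the choice of kappa and the data below are assumptions.

```python
from sklearn.metrics import cohen_kappa_score

# Fabricated checklist codes for one DMI session tape:
# 1 = core component observed, 0 = not observed.
rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")  # disagreements would go to senior staff
```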


A3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or any other technological collection techniques or forms of information technology, e.g. permitting electronic submission of responses, and the basis for the decision of adopting the means of collection. Also describe any consideration of using information technology to reduce burden.


Wherever possible the research team will use information technologies to maximize the efficiency and completeness of the information gathered for this evaluation and to minimize the burden on respondents. In particular, student demographic and prior year assessment data will be gathered from existing electronic school administrative records. Brief monthly teacher surveys regarding the type and amount of professional development teachers participate in will be sent to all teachers using the online application SurveyMonkey™.


A4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use of the purposes described in Item 2 above.


The data collection effort planned for this project will produce data that are unique and specifically targeted to the research questions identified for this project. A rigorous literature review, conducted according to the standards outlined by the What Works Clearinghouse, identified the need for the kind of experimental research that will be conducted in this study. Furthermore, no evidence from randomized controlled studies of the DMI program is currently available.


To complete this research, the study team plans to administer the TUF to all students in the study sample because the data collected with such an instrument are not available from any other source on a national basis; the state-administered assessments vary in what they measure and in how precisely they can measure student achievement in mathematics. Administering our own assessment is therefore necessary because it provides site-to-site consistency and assures a reliable and common mathematics achievement score.


Data collection through the teacher assessment will provide a measure of teacher knowledge of mathematics relevant for teaching fractions to 4th graders as well as the treatment contrast between DMI (treatment) and non-DMI (control) schools. Surveying all of the teachers on a monthly basis as to professional development activities will provide descriptive data to augment the discussion of treatment versus control schools, and taking audio recordings of DMI professional development sessions will provide useful descriptive data regarding fidelity of implementation of the DMI program. In every case, these data do not currently exist.



A5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-1), describe any methods used to minimize burden.


Not applicable. No small businesses will be burdened by this data collection. The focus of this study is on school districts and their schools, including the students, teachers, and administrative staff within them. The study team has reduced burden for respondents using a data collection plan that requests the minimum information needed to successfully execute this study. To minimize burden on respondents, the study design requests information already collected by schools and districts, and any new collection instruments have been designed to ask questions that cannot be answered through any other available sources. In some cases, these data will be collected electronically. The study team will conduct all data collection activities.


A6. Describe the consequences to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

The systematic collection, analysis, and reporting of the data described in this submission are required to accomplish the goals of the research project approved by IES. Participation in all data collection activities is voluntary but the study team will work to assure high participation rates. Multiple data collection strategies are planned and these data are necessary to measure impacts on academic achievement and teacher knowledge.


The TUF will be administered to students at treatment and control schools once as an outcome measure at the end of the 2014–2015 school year. This is the primary outcome measure for mathematics skills and if it is not collected the study would lack a common measure of students’ mathematical ability. There is no other common measure across the two states and multiple districts.


The teacher measure will be administered to teachers at treatment and control schools prior to beginning the DMI program and again after professional development ends at the close of the academic year. This is the principal outcome measure for teacher knowledge; if it is not collected, the study would lack a common measure of teacher knowledge at baseline and after receipt of professional development. Pretesting teacher knowledge is necessary to boost statistical power, and is assumed in the power analysis, because of the inherently smaller number of teachers in the study. If the pretest is not given, the study may not be sensitive enough to detect effects on teacher knowledge that are actually present.
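
To illustrate why a pretest covariate boosts power, the standard minimum detectable effect size (MDES) expression for a two-level cluster-randomized design is shown below. This is the generic formula from the methodological literature on cluster-randomized trials, not the study’s own power computation (which appears in Part B), and the notation is supplied here for illustration only.

```latex
\[
\mathrm{MDES}
  = M_{J-2}\,
    \sqrt{\frac{\rho\,\bigl(1 - R_2^2\bigr)}{P(1-P)\,J}
        + \frac{(1-\rho)\,\bigl(1 - R_1^2\bigr)}{P(1-P)\,J\,n}}
\]
% J = number of schools, n = respondents per school,
% P = proportion of schools assigned to treatment,
% rho = intraclass correlation,
% R_1^2, R_2^2 = variance explained by covariates (e.g., a pretest)
%   at the individual and school levels,
% M_{J-2} = multiplier based on the t distribution.
% A larger R^2 from pretesting shrinks the MDES, i.e., increases sensitivity.
```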


During the course of the study year, treatment and control teachers will be surveyed monthly regarding any professional development in the field of mathematics they received. These surveys will provide the study team with valuable descriptive information about the professional development activities occurring in all of the study schools. If the study team surveyed the teacher sample less frequently, teachers would likely recall the PD they received less accurately.


All DMI professional development sessions with teachers will be audio recorded (24 hours of professional development delivered per site), and these recordings will be analyzed by two independent raters to ensure implementation fidelity of the DMI program. Without these recordings, the study team would have no means of verifying that the program is being implemented correctly.


High response rates are anticipated given that states, districts, and even schools will be selected based on their willingness to fully participate in the project. The study team is confident that the use of multiple measures of data collection will ensure accurate analyses and results.


Finally, if this study is not conducted, REL Southeast may not meet the requirements for conducting rigorous research that meets regional needs. REL (NCEE) contracts are awarded contingent on a plan to partner in research alliances with educational policymakers from the region to identify research needs and to conduct studies that address them. These researcher-practitioner alliances are mandated to conduct rigorous studies (typically RCTs) wherever possible. The proposed study meets this contractual requirement.


A7. Explain any special circumstances that would cause an information collection to be conducted in a manner inconsistent with Section 1320.5(d)(2) of the federal regulations.


The proposed study meets all guidelines listed under A7 except one: respondents will be asked to report information more often than quarterly. As explained in section A6, the study team will survey all teachers about their professional development activities on a monthly basis. This survey is brief (approximately 12 minutes per month) and administered electronically. Surveying the teachers less regularly could result in less valid responses owing to the potential inaccuracy of memory over longer periods.



A8. Federal Register Comments and Persons Consulted Outside of the Agency


A notice soliciting public comments was published in the Federal Register. No comments were submitted.


The following individuals were consulted on the statistical aspects of this proposal:


Dr. Sybilla Beckman, University of Georgia

Expertise: mathematician, mathematics education

Email: [email protected]

Phone: 706-542-2548

Dr. John Deke, Mathematica Policy Research

Expertise: methods for large-scale randomized trials

Email: [email protected]

Phone: 609-275-2230


Dr. Mike Garet, American Institutes for Research

Expertise: teacher professional development, research methods, large-scale randomized trials

Email: [email protected]

Phone: 202-403-5000


Dr. Nathan Jones, Boston University

Expertise: measurement of teacher quality, teacher evaluation, special education

Email: [email protected]

Phone: 617-353-3295

Dr. Yaacov Petscher, Florida State University

Expertise: research design, measurement, and statistical methods

Email: [email protected]

Phone: 850-644-0327


Dr. Mengli Song, American Institutes for Research

Expertise: research design, advanced quantitative methods, and evaluations of educational programs and policy

Email: [email protected]

Phone: 202-403-5000


The first five individuals in the list above attended a full-day meeting in Washington, D.C. on October 12, 2012 to discuss the proposal. Of particular focus were the statistical assumptions informing the power analysis, the advantages and disadvantages of the mathematics PD approaches under consideration at that time, the covariates to be included in the HLM models, and the proposed outcome measures.

The sixth individual, Dr. Song, reviewed the power analyses and HLM models independently.



A9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


All teachers in the PD and the control condition will be compensated for their time spent completing forms for this study (see section A12, Table A2). These forms include the teacher consent/demographics form, the fractions measures (pre- and post-tests of knowledge relevant for teaching fractions to 4th graders), and the monthly professional development activity survey. Therefore, both treatment and control teachers will be remunerated $10 for the consent/demographics form, $25 for each of the two assessments ($50 total), and $10 for each of the 9 monthly PD reports ($90 total), for a total of $150.


Teachers in the experimental group will be paid their typical hourly rate (varies by state and district and often by seniority) for any time they spend attending PD sessions outside of their work day (i.e., on Saturdays). If teachers attend sessions on Saturdays, they will receive their hourly rate for the time they spend attending the session, completing the preparation assignment, and traveling to the PD site. (Note that the cost of paying teachers for the time they attend PD sessions or complete assignments has been included in the cost of the study and is not presented as an additional cost or incentive for this study.)


Budget limitations preclude us from offering the PD to control teachers in a subsequent year.



A10. Describe any assurances of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


REL Southeast will follow the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires "All collection, maintenance, use, and wide dissemination of data by the Institute" to "conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act (20 U.S.C. 1232g, 1232h)." These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.


In addition, for student information, the Act specifies that "The Director shall ensure that all individually identifiable information about students, their academic achievements, their families, and information with respect to individual schools, shall remain confidential in accordance with section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act."


Subsection (c) of section 183 referenced above requires the Director of IES to "develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data".


Subsection (d) of section 183 prohibits disclosure of individually identifiable information and makes the publishing or communicating of such information by employees or staff a felony.


REL Southeast will protect the confidentiality of all information collected for the study, as permitted by law, and will use it for research purposes only. Information collected for this study comes under the confidentiality and data protection requirements of the Institute of Education Sciences (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Responses to this data collection will be used only for statistical purposes. The reports prepared for the study will summarize findings across the sample and will not associate responses with a specific district, school, or individual. We will not provide information that identifies the participants (teachers and their students), the schools or the districts to anyone outside the study team, except as required by law.


No information that identifies any study participant will be released. Information from participating institutions and respondents will be presented at aggregate levels (across schools or groups) in reports. Staff working on the project have signed confidentiality pledges and been screened for data security through the U.S. Department of Education e-QIP system. All paper protocols will be stored in a locked facility, and data stored in digital files will be maintained on a secure server that is backed up daily. Only persons conducting this study and maintaining its records will have access to the records that contain individually identifying information. Three years after the end of the project, the data will be destroyed.



A11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. The justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

None of the questions used to collect data for this study, including any interview questions, concern topics commonly considered private or sensitive, such as religious beliefs or sexual practices.


IRB approval has been granted.



A12. Provide estimates of the hour burden of the collection of information.


The estimated burden on respondents for completing the activities included in the study’s data collection activities is listed in Table A2. Respondents will include teachers at treatment and control schools.


Table A2 summarizes the reporting burden on respondents for the data collection instruments, including total and annualized estimates. The annual burden has been averaged across the 3 years of the clearance. Annualized, the total number of respondents is 4,185 per year, the total number of responses is 4,964 per year, and the total burden across all respondents is expected to be 2,112 hours per year.


The burden estimates do not include the student assessment (administered at the end of the study) or the teacher assessments (pre and post), as these measures are strictly assessments.


A complete explanation of the burden the DMI PD program may place on teachers can be found in Appendix J.

Table A2. Burden Estimates

| Data Collection Activity | Respondents per School | Number of Schools a | Total Respondents | Responses per Respondent | Total Responses | Average Burden Hours per Response | Total Burden Hours |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Teacher Consent Forms with Demographics (Appendix A) | 3 | 82 | 246 teachers | 1 | 246 | .2 | 49.2 |
| DMI Professional Development (sessions, assignments, travel time) b | 3 | 41 | 123 experimental teachers | 1 | 123 | 37 | 4,551 |
| Teacher Professional Development Survey (Appendix C) | 3 | 82 | 246 teachers | 9 (monthly) | 2,214 | .2 | 442.8 |
| Parent Opt-Out Form for Student Participation (Appendix F) | 75 c | 82 | 6,150 parents | 1 | 6,150 | .1 | 615 |
| Student Assent Form for Student Participation (Appendix G) | 75 d | 82 | 6,150 students | 1 | 6,150 | .1 | 615 |
| School records data collection: Student information | na | na | 8 districts | 1 | 8 | 8 | 64 |
| TOTAL ESTIMATES | | | 12,554 | | 14,891 | | 6,337 |
| ANNUAL ESTIMATES e | | | 4,185 | | 4,964 | | 2,112 |

a The number of schools needed to detect an effect for this study is 80; however, we will over-recruit up to 84 schools to account for possible school-level attrition. We expect that 82 schools will remain in the study after attrition.

b A complete explanation of the burden the DMI PD program may place on teachers can be found in Appendix J.

c We estimate 25 parents in each classroom will be asked to read the parent opt-out form.

d We estimate 25 students in each classroom will be asked to complete the assent form; however, on average 19 students (of 25) will participate in the study from each classroom (assuming 10% of parents opt to remove their child from the study and 15% of students attrite).

e The annual figures have been averaged over the 3 years of the clearance.
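
As a quick arithmetic check, a short script can reproduce the response and burden-hour totals in Table A2 and the annualized figures (totals divided by the 3 years of the clearance). The figures below are taken directly from the table; nothing else is assumed.

```python
# (total responses, average burden hours per response) for each Table A2 row
rows = {
    "teacher consent/demographics": (246, 0.2),
    "DMI professional development": (123, 37),
    "monthly PD survey":            (2214, 0.2),
    "parent opt-out form":          (6150, 0.1),
    "student assent form":          (6150, 0.1),
    "district student records":     (8, 8),
}

total_responses = sum(n for n, _ in rows.values())
total_hours = sum(n * h for n, h in rows.values())

print(total_responses)                 # 14891 total responses
print(round(total_hours, 1))           # 6337.0 total burden hours
print(round(total_responses / 3))      # 4964 responses per year
print(round(total_hours / 3))          # 2112 burden hours per year
```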

A13. Describe any other costs to respondents or record keepers.


Not applicable. The information collection activities do not place any capital cost or cost of maintaining capital requirements on respondents.


A14. Provide estimates of annualized cost to the Federal government. Also provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expenses that would not have been incurred without this collection information. Agencies may aggregate cost estimates from Items 12, 13, and 14 in a single table.


The proposed data collection is being supported through Department of Education contract ED-IES-12-C-0011. This project is considered Task 4.1.5 of the NCEE REL Southeast contract awarded to Florida State University.


The annualized cost of data collection for this project is calculated as the planned annual fixed-price allocations for the study during the two calendar years (2014 and 2015) in which the study gears up for data collection, is in the field, and wraps up data collection. The total cost of data collection for this project is estimated to be $1,408,400; averaged over the three years of the clearance, the annual cost is estimated to be $469,467. This represents a fixed-price contract to conduct these activities in the two calendar years that involve data collection.


2014: Budget = $690,400

  • January – April 2014: Approximate date range to receive OMB clearance
  • January – June 2014: After OMB clearance, recruit districts/schools/teachers and sign district/school MOUs
  • January – June 2014: Work with DMI developers to plan/refine program as needed
  • January – June 2014: Collaborate with the Center for Improving Learning of Fractions on development of the student outcome measure
  • May – June 2014: Distribute/acquire teacher consent forms
  • June – August 2014: Recruit & train fractions measure assessors
  • July 2014: Conduct random assignment
  • August – September 2014: Conduct fractions measure pretests
  • August – December 2014: Conduct DMI sessions with 123 teachers in 8-10 districts
  • September – December 2014: Collect monthly teacher PD logs


2015: Budget = $718,000

  • January – February 2015: Continue DMI sessions with 123 teachers in 8-10 districts
  • January – April 2015: Collect monthly teacher PD logs
  • April – May 2015: Recruit/train assessors
  • May – June 2015: Conduct teacher and student post-tests
  • June 2015: Data entry & cleanup
  • July – October 2015: Conduct data analysis
  • September – December 2015: Report preparation
  • December 2015: Submit first draft of technical report to IES


Funding includes paying for REL Southeast staff, as well as covering costs of subcontractors on this study who will assist in study design, data collection, and data analysis. Also included are costs associated with delivering the intervention to treatment schools (including paying teachers for attending sessions, completing assignments, and traveling to sessions on Saturdays, and paying for substitutes if the need arises) and with collecting data from both treatment and control schools. There are no additional costs, other than staff collection time and duplication, for the student and teacher outcome measures.
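
A minimal check of the cost arithmetic above, using the yearly budget figures listed in the 2014 and 2015 schedules:

```python
budget_2014, budget_2015 = 690_400, 718_000
total = budget_2014 + budget_2015
print(total)             # 1408400, i.e., the $1,408,400 total
print(round(total / 3))  # 469467, i.e., ~$469,467 annualized over 3 years
```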


A15. Describe any changes in the burden from prior approvals.


This submission to OMB is a new request for approval of data collection plans.


A16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of the report, publication dates, and other actions.


The project schedule is as follows:

2014:

  • January – April 2014: Approximate date range to receive OMB clearance
  • January – June 2014: After OMB clearance, recruit districts/schools/teachers and sign district/school MOUs
  • January – June 2014: Work with DMI developers to plan/refine program as needed
  • January – June 2014: Collaborate with the Center for Improving Learning of Fractions on development of the student outcome measure
  • May – June 2014: Distribute/acquire teacher consent forms
  • June – August 2014: Recruit & train fractions measure assessors
  • July 2014: Conduct random assignment
  • August – September 2014: Conduct fractions measure pretests
  • August – December 2014: Conduct DMI sessions with 123 teachers in 8-10 districts
  • September – December 2014: Collect monthly teacher PD logs


2015:

  • January – February 2015: Continue DMI sessions with 123 teachers in 8-10 districts
  • January – April 2015: Collect monthly teacher PD logs
  • April – May 2015: Recruit/train assessors
  • May – June 2015: Conduct teacher and student post-tests
  • June 2015: Data entry & cleanup
  • July – October 2015: Conduct data analysis
  • September – December 2015: Report preparation
  • December 2015: Submit first draft of technical report to IES


2016:

  • January – June 2016: Approximate date range to respond iteratively to IES reviews of report drafts
  • September 2016: Estimated date of final report release by IES



The design, data collection, analysis, and reporting for this study are driven by the primary (confirmatory) research question: What is the impact of teacher participation in Developing Mathematical Ideas on students’ proficiency in the area of fractions when compared with those students whose teachers receive existing professional development activities provided by schools and districts?


A secondary (exploratory) research question will also be addressed, although the study was not specifically powered to do so: What is the impact of teacher participation in Developing Mathematical Ideas for one year on teacher knowledge of mathematics relevant for teaching fractions to 4th graders when compared with teachers who only participate in existing professional development activities provided by schools and districts?


Our approach to estimating the effects of DMI has the following core features:


  1. A focus on student-level impacts based directly on the experimental design.

  2. Estimation of impacts in ways that account for clustering of students within classrooms, within schools, and within districts.


The basic logic of our analysis strategy is to compare the schools that are randomly assigned to receive the treatment with those that are not. As random assignment occurs at the school level, schools are the primary unit of analysis. Schools will be paired within districts based on overall demographic characteristics to increase the probability of baseline equivalence, as sketched below.
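
A hypothetical sketch of the within-district pairing and coin-flip assignment just described; the pairing variable 'demo_score' stands in for whatever composite of demographic characteristics is actually used, and the school data are fabricated.

```python
import random
from itertools import groupby

# (district, school id, pairing score) -- fabricated example data
schools = [
    ("D1", "S01", 0.42), ("D1", "S02", 0.45), ("D1", "S03", 0.61), ("D1", "S04", 0.58),
    ("D2", "S05", 0.30), ("D2", "S06", 0.33),
]

random.seed(2014)  # fixed seed so the assignment is reproducible/auditable
assignments = {}
for district, group in groupby(sorted(schools), key=lambda s: s[0]):
    # Order schools within the district by the pairing variable,
    # pair adjacent schools, and flip a coin within each pair.
    ordered = sorted(group, key=lambda s: s[2])
    for a, b in zip(ordered[::2], ordered[1::2]):
        treated = random.choice([a, b])
        assignments[a[1]] = "treatment" if a is treated else "control"
        assignments[b[1]] = "treatment" if b is treated else "control"

print(assignments)  # e.g., {'S01': 'control', 'S02': 'treatment', ...}
```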


The research team will use multi-level models in the impact analysis to account for the clustering of students within classrooms and of teachers within schools. The analytic methods are discussed in further detail in Part B2 and Appendix J; a minimal illustrative model follows.
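
Below is a minimal sketch of a two-level impact model (students nested within schools) of the kind described above, using the statsmodels mixed-effects API. The file name, variable names, and exact specification are hypothetical; the study’s actual models may include additional levels, pair fixed effects, and covariates.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file with one row per student.
df = pd.read_csv("student_level_data.csv")

# posttest  = TUF score (outcome)
# treatment = 1 if the student's school was assigned to DMI, else 0
# pretest   = grade 3 state assessment score (covariate)
# school_id = clustering unit, modeled with a random intercept
model = smf.mixedlm("posttest ~ treatment + pretest", data=df, groups=df["school_id"])
result = model.fit()
print(result.summary())  # the 'treatment' coefficient estimates the impact
```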


All results for REL rigorous studies will be made available to the public through peer-reviewed evaluation reports that are published by IES. The datasets from these rigorous studies will be turned over to the REL’s IES project officer. These data will become IES restricted use datasets requiring a user’s license that is applied for through the same process as NCES restricted use datasets. Even the REL contractor would be required to obtain a restricted use license to conduct any work with the data beyond the original evaluation.


A17. Describe arrangements for displaying the number provided by OMB and its expiration date.


The approval number provided by OMB and its expiration date will appear in the heading on all instruments for this project.


A18. Exceptions to Certification Statement


No exceptions are necessary for this information collection.




References

Ball, D. L., & Hill, H. C. (2008). Mathematical knowledge for teaching (MKT) measures: Mathematics released items 2008. Ann Arbor, MI: University of Michigan. Retrieved from http://sitemaker.umich.edu/lmt/files/LMT_sample_items.pdf

Center for Improving Learning of Fractions. (2013). Center for Improving Learning of Fractions. Retrieved from https://sites.google.com/a/udel.edu/fractions/

Georgia Department of Education. (2012). Common core Georgia performance standards. Retrieved from http://www.doe.k12.ga.us/Curriculum-Instruction-and-Assessment/Curriculum-and-Instruction/Pages/CCGPS.aspx

Hecht, S. A., Close, L., & Santisi, M. (2003). Sources of individual differences in fraction skills. Journal of Experimental Child Psychology, 86(4), 277-302.

Hill, H., Blunk, M., Charalambous, C., Lewis, J., Phelps, G., Sleep, L., & Ball, D. L. (2008). Mathematical knowledge for teaching and the mathematical quality of instruction: An exploratory study. Cognition and Instruction, 26(4), 430-511.

Hill, H., Kapitula, L., & Umland, K. (2011). A validity argument approach to evaluating teacher value-added scores. American Educational Research Journal, 48(3), 794-831.

Hill, H., Rowan, B., & Ball, D. (2005). Effects of teachers' mathematical knowledge for teaching on student achievement. American Educational Research Journal, 42(2), 371-406.

Jordan, N. C., Hansen, N., Fuchs, L. S., Siegler, R. S., Gersten, R., & Micklos, D. (2013). Developmental predictors of fraction concepts and procedures. Journal of Experimental Child Psychology, 116(1), 45-58.

Ma, L. (1999). Knowing and teaching elementary mathematics: Teacher's understanding of fundamental mathematics in China and the United States. New York, NY: Routledge.

National Assessment of Educational Progress. (2011). The nation’s report card: Mathematics 2011 (NCES 2012-458). Washington, DC: Institute of Education Sciences, U.S. Department of Education. Retrieved from http://nces.ed.gov/nationsreportcard/pdf/main2011/2012458.pdf

National Mathematics Advisory Panel. (2008). Foundations for success: The final report of the National Mathematics Advisory Panel. Washington, DC: U.S. Department of Education. Retrieved from http://www2.ed.gov/about/bdscomm/list/mathpanel/report/final-report.pdf

Perry, R. R., & Lewis, C. C. (2011). A randomized trial of lesson study with mathematical resources: Measuring the impact on fractions knowledge. Manuscript submitted for publication, School of Education, Mills College, Oakland, California.

Schifter, D., Bastable, V. & Russell, S. J. (2010). Making meaning for operations in the domains of whole numbers and fractions: Facilitator’s guide. Boston, MA: Pearson Education, Inc.

Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton Mifflin.

Siegler, R., Carpenter, T., Fennell, F., Geary, D., Lewis, J., Okamoto, Y., . . . Wray, J. (2010). Developing effective fractions instruction for kindergarten through 8th grade: A practice guide (NCEE #2010-4039). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://whatworks.ed.gov/publications/practiceguides

Siegler, R. S., Duncan, G. J., Davis-Kean, P. E., Duckworth, K., Claessens, A., Engle, M., . . . Chen, M. (2012). Early predictors of high school mathematics achievement. Psychological Science, 23(7), 691-697. doi: 10.1177/0956797612440101

South Carolina State Department of Education. (2013). Common Core State Standards. Retrieved from http://ed.sc.gov/agency/programs-services/190/

What Works Clearinghouse. (2013). What works clearinghouse: Procedures and standards handbook (Version 3.0). Retrieved from http://ies.ed.gov/ncee/wwc/pdf/reference_resources/wwc_procedures_v3_0_draft_standards_handbook.pdf

Yanik, H. B., Helding, B., & Baek, J. M. (2006). Students’ difficulties in understanding fractions as measures. Problem Solving, 2, 323-325.

Yoon, K. S., Duncan, T., Lee, S. W.-Y., Scarloss, B., & Shapley, K. (2007). Reviewing the evidence on how teacher professional development affects student achievement (Issues & Answers Report, REL 2007–No. 033). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southwest. Retrieved from http://ies.ed.gov/ncee/edlabs



1 Fractions knowledge at age 10-12 was found to be uniquely associated with a 0.17 SD increase (p < .05) in algebra knowledge and a 0.18 SD increase (p < .01) in total math achievement at age 15-17. This is based on a U.S. sample of 599 children who were tested in 1997 as 10-12 year-olds and again in 2002 as 15-17 year-olds. Nearly identical results were found in a UK sample (Siegler et al., 2012).

2 This is based on the information retrieved from the South Carolina State Department of Education and Georgia Department of Education websites (South Carolina State Department of Education, 2013; Georgia Department of Education, 2012).

3 The term confirmatory refers to the primary research question, which the study was specifically designed and powered to address.

4 We will recruit 84 schools in order to account for any school-level attrition. We expect our final sample to include 82 schools.

5 Mathematical Knowledge for Teaching (MKT) (Ball & Hill, 2008). Note that the MKT teacher knowledge test is strictly an assessment and is not included in the burden estimates. A sample of MKT items is provided in Appendix B.


