
An Impact Evaluation of Support for Principals

OMB: 1850-0918


Part A

Impact Evaluation of Support for Principals: OMB Data Collection Package

June 12, 2015



Submitted to:

Institute of Education Sciences
555 New Jersey Ave NW, Suite 502A
Washington, DC 20208

Project Officer: Elizabeth Warner
Contract Number: ED-IES-14-R-0008

Submitted by:

Mathematica Policy Research

P.O. Box 2393
Princeton, NJ 08543-2393
Telephone: (609) 799-3535
Facsimile: (609) 799-0005

Project Director: Susanne James-Burdumy
Reference Number: 40412.492





CONTENTS

PART A. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

A. Justification

1. Circumstances necessitating the collection of information

2. Purposes and uses of the data

3. Use of technology to reduce burden

4. Efforts to avoid duplication of effort

5. Methods to minimize burden on small entities

6. Consequences of not collecting data

7. Special circumstances

8. Federal Register announcement and consultation

9. Payments or gifts

10. Assurances of confidentiality

11. Additional justification for sensitive questions

12. Estimates of hours burden

13. Estimates of cost burden to respondents

14. Estimates of annual costs to the federal government

15. Reasons for program changes or adjustments

16. Plan for tabulation and publication of results

17. Approval not to display the OMB expiration date

18. Explanation of exceptions

References

APPENDIX A: PRINCIPAL SURVEY

APPENDIX B: PRINCIPAL DAILY LOG

APPENDIX C: TEACHER SURVEY

APPENDIX D: DISTRICT-LEVEL DATA REQUEST MEMO

APPENDIX E: ADVANCE LETTERS

APPENDIX F: CONFIDENTIALITY AGREEMENT

TABLES

A.1 Study data collection activities, by data source

A.2 Time line for data collection activities

A.3 Research questions and data sources

A.4 Source, mode, and timing

A.5 Technical Working Group members

A.6 Annual reporting and recordkeeping hour burden

A.7 Principal, teacher, and student outcomes for the impact analysis

A.8 Timetable for project publications

EXHIBITS

A.1 Logic model for the CEL Program



PART A. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

This Office of Management and Budget (OMB) package requests clearance for data collection activities to support the Impact Evaluation of Support for Principals. The evaluation will provide important information on the implementation and impacts of intensive professional development for principals. The Institute of Education Sciences (IES) at the U.S. Department of Education (ED) has contracted with Mathematica Policy Research and its subcontractors, the American Institutes for Research (AIR), Social Policy Research Associates (SPR), and Pemberton Research to conduct this evaluation.

The evaluation will include implementation and impact analyses. The implementation analysis will draw on information on the scope and sequence of the professional development and data from principal surveys to describe principals’ professional development experiences.1 The impact analysis will be based on a random assignment design in which participating schools in each district are randomly assigned to a treatment group whose principals are offered intensive professional development or to a control group whose principals are not. The impact analysis will draw on data from principal and teacher surveys, principal daily logs, and district administrative records to estimate the impacts of the professional development on principal performance, school climate, teacher retention and performance, and student achievement.

A. Justification

1. Circumstances necessitating the collection of information

a. Statement of need for a rigorous evaluation of principal professional development

Principals play an important role in the academic performance of students in their schools (Hallinger and Heck 1998; Harris et al. 2010; Knapp et al. 2006; Leithwood et al. 2004), and there is widespread interest in the potential of intensive principal professional development programs to improve principals’ performance. However, little is known about the effectiveness of these programs and their ability to improve principals’ leadership skills and school quality (Huff et al. 2013). The data collection described in this request will provide essential information for describing the implementation of a principal professional development program and for providing rigorous estimates of its effects on principal performance, school climate, teacher retention and performance, and student achievement.

Legislative authorization for this evaluation is found in Title II, Part A of the Elementary and Secondary Education Act (ESEA), Sections 2121-2123, as amended by the No Child Left Behind Act (NCLB) (20 USC 6621-6623). Title II, Part A of ESEA provides funding to states to carry out professional development activities designed to improve the quality of principals. Part F, Section 9601 of ESEA permits program funds to be used to evaluate activities authorized under the act.

b. Study design and research questions

To learn about the effectiveness of intensive principal professional development, the study team will evaluate an intensive professional development program using a rigorous random assignment design. In fall 2014 we held a competition to select a promising principal professional development program, and, with input from a panel of experts, selected the Center for Educational Leadership (CEL) at the University of Washington. In the coming months we plan to recruit 100 elementary schools from 10 school districts across the country and randomly assign them to a treatment group whose principals will be offered CEL’s principal professional development program or to a control group whose principals will not be offered this program.2 We will support and monitor the program for two school years to ensure high quality implementation, and will collect comprehensive data on program implementation and school, principal, teacher, and student outcomes. We will use these data to compare outcomes between treatment and control group schools to obtain rigorous evidence on the effectiveness of the program.
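To illustrate the blocked random assignment described above (and in footnote 2), the sketch below pairs schools within a district on a single composite baseline index and randomly assigns one school in each pair to treatment. This is a minimal sketch only: the column names and the single “baseline_index” measure are hypothetical stand-ins for the multiple blocking characteristics listed in footnote 2, and the actual study procedures may differ.

```python
# Minimal sketch of pair-matched random assignment: within each district,
# schools are sorted on a (hypothetical) composite baseline index, consecutive
# schools form blocks of two, and one school per block is randomized to
# treatment. Assumes an even number of schools per district.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=40412)

def assign_within_district(schools: pd.DataFrame) -> pd.DataFrame:
    """Pair schools on a baseline index, then randomize within each pair."""
    ordered = schools.sort_values("baseline_index").reset_index(drop=True)
    ordered["block"] = ordered.index // 2          # consecutive schools form a block
    ordered["treatment"] = 0
    for _, block in ordered.groupby("block"):
        treated_row = rng.choice(block.index)       # pick one school per block
        ordered.loc[treated_row, "treatment"] = 1
    return ordered

schools = pd.DataFrame({
    "district": ["A"] * 10,
    "school_id": range(10),
    "baseline_index": rng.normal(size=10),          # e.g., prior mean test score
})
assigned = schools.groupby("district", group_keys=False).apply(assign_within_district)
print(assigned[["district", "school_id", "block", "treatment"]])
```

Blocking before randomization in this way helps ensure that, within each pair, the treatment and control schools start out with similar baseline characteristics.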

The study will address the following research questions about the implementation and impacts of the CEL principal professional development program:

  • Implementation. What are the professional development experiences of principals in the study?

  • Impacts. What are the impacts of intensive principal professional development on (a) school climate and educator behaviors, (b) teacher retention and effectiveness, and (c) student achievement and behavior?

c. The CEL principal professional development program

CEL’s professional development program is based on a theory of action in which principals’ instructionally focused leadership boosts teaching quality and raises achievement. According to the program’s logic model (Exhibit A.1), if principals have a clear understanding of what quality instruction looks like, as well as a common language to describe and promote elements of quality instruction, and if they know how to lead instructional improvement, instructional practice will become more effective and student achievement will improve.

CEL seeks to develop principals’ knowledge of high-quality instruction and to equip them to lead their teachers to deliver quality instruction, by honing their skills in three content areas:

  1. Improving teacher instruction through observation, analysis of data (including information from district evaluation systems), and implementation of “inquiry cycles” that generate useful feedback for teachers (instructional leadership);

  2. Guiding staffing plans and staff development (human capital management); and

  3. Creating a school-wide culture of learning to facilitate improved academic success for all students (organizational leadership).



Exhibit A.1. Logic model for the CEL Program

CEL’s professional development sessions aim to sharpen principals’ skills in observing classrooms and identifying areas in which teachers need to improve. The professional development sessions are tailored to each district’s instructional framework and teacher evaluation process, using crosswalks between the different frameworks. CEL also provides individual coaching and professional learning communities (PLCs) aligned with the group sessions.

While the length of CEL’s program varies across districts, for the purposes of this study CEL will provide its program for 24 months, beginning with a summer institute in 2015.3 During the 4-day summer institute, CEL will introduce principals to core program content; establish relationships between principals and coaches; analyze school data; and help principals plan specific leadership actions. After the summer institute, CEL will support principals in treatment schools throughout the 2015–2016 school year, to help them apply lessons from the summer institute in their schools. This support will include 8 one-day professional development sessions in each district; 10 half-day one-on-one coaching sessions (primarily in-person but some may be virtual); and quarterly online PLCs. In this way, principals will interact at least twice a month with the professional development provider.

The second year of the program will begin with a summer institute in 2016 for any principals who are new to one of the 50 treatment schools in the study. Principal coaching will continue for all principals in treatment group schools throughout the 2016–2017 school year.

d. Data collection needs

This study includes several data collection efforts that are summarized in Table A.1. The purposes and uses of these data are described in Section A.2 below.

Table A.1. Study data collection activities, by data source

Data source: Principal survey (treatment and control groups), annual surveys
Mode, timing, and respondent: 30-minute web-based survey, with telephone option and in-person follow-up, administered spring 2016 and spring 2017 to treatment and control group principals
Key constructs and outcomes:
  • Amount and usefulness of professional development received by the principal during the current school year and the preceding summer
  • Principals’ leadership practices during the school year (human capital management, instructional leadership, and organizational leadership)
  • District and school context (resource availability, district policies affecting principal practice)
  • Principal background characteristics (demographic traits, educational degrees, certification/licensure, and professional experience)

Data source: Principal log (treatment and control groups), daily logs
Mode, timing, and respondent: 15-minute web-based logs completed on 5 consecutive days by treatment and control group principals, during four weeklong periods of the 2015–2016 and 2016–2017 school years
Key constructs and outcomes:
  • Time spent by the principal on professional development activities during the school year
  • Time spent by the principal on various leadership practices (human capital management, instructional leadership, organizational leadership, or other) during the school year

Data source: Teacher survey (treatment and control groups), annual surveys
Mode, timing, and respondent: 30-minute web-based survey, with telephone and hard-copy options and in-person follow-up, administered to treatment and control group teachers in spring 2016 and spring 2017
Key constructs and outcomes:
  • Principals’ use of leadership practices and quality of principals’ instructional and organizational leadership during the school year
  • Amount and usefulness of formal and informal professional development received by teachers during the school year
  • Amount and usefulness of instructional feedback teachers received from the principal or other school leaders during the school year
  • Teachers’ use of student achievement data and changes to instructional practice (teaching methods, student work, student assessment, and topics or materials) during the school year
  • School climate and culture (professional climate, staff collaboration, student engagement and belonging, family engagement, and school safety)
  • Teacher background characteristics (demographic traits, educational degrees, certification/licensure, and professional experience)

Data source: District records (treatment and control groups)
  • Staff records. Electronic grade and school assignment data for prior and current school years, and background characteristics for the prior school year, requested from all districts in fall 2015, fall 2016, and fall 2017. Key constructs and outcomes: principal and teacher mobility, teacher retention, teacher hiring, and principal and teacher background characteristics.
  • Staff performance records. Electronic performance data for prior school years, requested from all districts in fall 2016 and fall 2017 (b). Key constructs and outcomes: teachers’ instructional effectiveness and principals’ leadership effectiveness.
  • Student records. Electronic student records data for prior school years, requested from all districts in fall 2016 and fall 2017 (b). Key constructs and outcomes: student achievement (test scores), behavioral outcomes (attendance and disciplinary records), and background characteristics (demographic traits, grade level, English language learner status, and special education status).

b We will request teacher and principal performance data and student records for the 2014–2015 and 2015–2016 school years in fall 2016 and for the 2016–2017 school year in fall 2017.

e. Time line for data collection activities

The evaluation is expected to be completed in five years, with 2.5 years of data collection. Table A.2 shows the schedule of data collection activities.

2. Purposes and uses of the data

To address the study’s research questions, the evaluation will collect and analyze data from several sources. IES will use these data to better understand principals’ professional development experiences, and to estimate the impact of the professional development on school, teacher, and student outcomes. Table A.3 lists research questions and specific data sources that will be used to answer them—in this table we have broken Research Question 2 into three parts, to correspond to three sets of outcomes we will examine. We describe how the study will use each data source below. Information will be collected by Mathematica Policy Research and its partners AIR, SPR, and Pemberton Research, under contract with ED [contract number ED-IES-14-R-0008].

Table A.2. Time line for data collection activities

Activity | Date

Data for 2015–2016 school year
Collect principal daily logs | 9/2015 through 5/2016
Collect principal and teacher survey data | 3/2016 through 5/2016
Collect district records (a) | 8/2015 through 12/2015; 8/2016 through 12/2016

Data for 2016–2017 school year
Collect principal daily logs | 9/2016 through 5/2017
Collect principal and teacher survey data | 3/2017 through 5/2017
Collect district records (b) | 8/2017 through 12/2017

a In fall 2015, we will only collect data on teacher assignments for fall 2015. In fall 2016 we will collect district records data for the prior (2014–2015) school year, as well as teacher assignment data for the 2015–2016 school year.

b In this data collection, we will also collect district records data on teacher and principal assignments for the 2017–2018 school year.

Table A.3. Research questions and data sources

Research question | Data source(s)
1. Implementation. What are the professional development experiences of principals in the study? | Principal surveys
2a. Impacts on intermediate outcomes. What are the impacts of intensive principal professional development on school climate and educator behaviors? | Principal surveys; principal daily logs; teacher surveys
2b. Impacts on teacher outcomes. What are the impacts of intensive principal professional development on teacher retention and effectiveness? | District records (staff records and staff performance records)
2c. Impacts on student outcomes. What are the impacts of intensive principal professional development on student achievement and behavior? | District records (student records)

Principal Surveys. Surveys administered to treatment- and control-group principals in years 1 and 2 will collect information on their leadership practices and perceptions of school climate. We will use this information to examine the effect of professional development on leadership practices and school climate (research question 2a). We will also ask principals to report the amount, content, and usefulness of their professional development experiences. We will use this information to document treatment and control group principals’ perceptions of their professional development experiences (research question 1). In addition, principals will provide data on their backgrounds, professional experience and educational attainment, and the district context in which they operate (including, for example, barriers to school improvement). We will use this information to document the context of principals’ professional development experiences (research question 1).

Principal Daily Logs. Daily logs will collect information on how principals allocate their time across different categories of leadership practice. Logs will be administered to treatment and control groups on five consecutive days during four weeks of school years 1 and 2. We will use this information to examine the effect of professional development on daily time-use in different domains of leadership (research question 2a). As compared to annual surveys, daily logs have been shown to more accurately capture frequently occurring behaviors, such as day-to-day instructional leadership and managerial responsibilities, because they allow for measurement of daily fluctuations and are less susceptible to recall error (Camburn et al. 2010a, 2010b). Principal logs will complement annual principal surveys by ensuring that the study team can examine impacts on how principals are actually spending their time on leadership behaviors on a given day, rather than relying only on annual recall; and allowing detection of changes in principals’ leadership behaviors within a year.

Teacher Surveys. Surveys administered to teachers in years 1 and 2 will collect information on teachers’ experiences during the school year, including the amount and perceived usefulness of professional development and instructional support they received, instructional improvement and data-use practices, and perceptions of their principal’s leadership practices and school improvement efforts. We will use this information to examine the effect of principal professional development on teacher practices and teacher perceptions of principals’ leadership practices. Teachers will also be asked about their schools’ working conditions, staff collaboration and support, and academic culture during the school year. We will use this information to examine the effect of principal professional development on school climate (research question 2a). Finally, teachers will provide data on their backgrounds, educational attainment, and professional experience. We will use this information to provide context for understanding principals’ professional development experiences (research question 1) and to examine how principal professional development affects the mix of teachers hired and retained in the study schools (research question 2b).

District Records. We will collect data on principals, teachers, and students from the district records listed below. We will use this information to estimate effects of principal professional development on staff mobility, retention, and performance (research question 2b) and on student achievement and behavior (research question 2c), and we will also use district data to draw a sample of teachers to take the teacher survey in years 1 and 2.

  • Staff records. We will collect data on principals’ and teachers’ school assignments and on the grades and subjects that teachers taught. We will use this information to examine effects of principal professional development on principal turnover and teacher retention (research question 2b). We will also use it to draw a random sample of teachers to take the survey in years 1 and 2 of the study. We will collect available information on characteristics of the study sample, such as principal and teacher demographics (age, sex, race, and ethnicity), educational attainment (certifications, degrees, and scores on licensure or certification exams), and years of teaching experience. We will use this information to describe the study context in the implementation analysis (research question 1), and to examine whether impacts of principal professional development vary depending on baseline values of these characteristics, such as principal experience (research question 2).

  • Staff performance records. We will collect principal and teacher performance ratings from district evaluation systems. These may include composite ratings, observation ratings, teacher value-added scores, school value-added scores, or scores on student learning objectives. We will use this information to examine the effect of principal professional development on teacher and principal effectiveness as assessed by their district systems (research question 2b). In conjunction with staff records described above, we will also use this information to examine effects on hiring and retention of effective teachers, dismissal of ineffective teachers, and ways that principals assign teachers of different performance levels to classes (research question 2b).

  • Student records. We will collect information from student records for years 1 and 2, as well as the school year before the program started. We will obtain student standardized test scores in math and English/language arts for all tested grades, which we will use to examine the effect of principal professional development on student achievement (research question 2c). We will also obtain data on student attendance and discipline, which we will use to examine effects on student behavior. Finally, we will collect data on student demographics (age, sex, race, and ethnicity) and participation in other programs (free or reduced-price lunch eligibility, English Language Learner status, and individualized education plan status). We will use this information to describe students in the study sample and use baseline values of these measures and student test scores to improve precision of estimates in the impact analysis (research question 2).

Data collection in year 1. We will use information collected during year 1 to examine principals’ professional development experiences (research question 1) and impacts of principal professional development during the initial year of program implementation (research question 2). We will use information on year 1 educator behaviors and teacher and student outcomes to inform research question 2, by examining whether principal professional development resulted in differences between the treatment and control groups after the first year of the intervention’s implementation.

Data collection in year 2. We will use the information collected during year 2 to examine the impacts of intensive principal professional development one year later, after two years of the intervention’s implementation (research question 2). More specifically, we will use the data collected in year 2 to examine whether principal professional development resulted in differences in school climate, teacher and principal practices, and teacher, principal, and student outcomes after two years of professional development.

3. Use of technology to reduce burden

The data collection plan is designed to obtain information in an efficient way that minimizes respondent burden. When feasible, we will gather information from existing data sources, using the most efficient methods available. Table A.4 provides information on the source, mode, and timing for each data collection activity.

A web-based survey will be the primary mode of data collection for principals and teachers in the study. The web-based surveys and logs will enable respondents to complete each instrument at a location and time of their choice, and their built-in editing checks and programmed skips will reduce response errors. To further reduce burden, we will also offer two additional modes (telephone and in-person) for educators who prefer those methods.

District records will be collected as computer files provided by districts. Whenever possible, districts will upload the files to a secure Mathematica FTP site. While we will specify the required data elements, we will accept any format in which the data are provided, to reduce burden on districts. Regardless of the format in which they are received, these data will be converted to a consistent layout so that they can be combined with data submitted by other districts and used in analysis.
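As one illustration of this conversion step, the sketch below reads district submissions that arrive in different file formats and with different column labels and maps them to a common layout. The file names, column mappings, and target schema are hypothetical; the actual data elements are those specified in the district-level data request memo (Appendix D).

```python
# A minimal, hypothetical sketch of converting district record files received
# in different formats into one consistent layout for analysis. File names,
# column mappings, and the target schema are illustrative assumptions, not the
# study's actual specification.
from pathlib import Path
import pandas as pd

# Each district may label the same data elements differently.
COLUMN_MAPS = {
    "district_a": {"tchr_id": "teacher_id", "bldg": "school_id", "grade_lvl": "grade"},
    "district_b": {"TeacherID": "teacher_id", "SchoolCode": "school_id", "Grade": "grade"},
}

def load_district_file(path: Path, district: str) -> pd.DataFrame:
    """Read a district submission (CSV or Excel) and rename to the common schema."""
    if path.suffix.lower() in {".xls", ".xlsx"}:
        raw = pd.read_excel(path)
    else:
        raw = pd.read_csv(path)
    out = raw.rename(columns=COLUMN_MAPS[district])
    out["district"] = district
    return out[["district", "teacher_id", "school_id", "grade"]]

# Combine all submissions into a single analysis file.
frames = [
    load_district_file(Path("submissions/district_a_staff.csv"), "district_a"),
    load_district_file(Path("submissions/district_b_staff.xlsx"), "district_b"),
]
staff = pd.concat(frames, ignore_index=True)
```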

4. Efforts to avoid duplication of effort

Whenever possible, the study team will use existing data from districts, such as from their records for staff and students. This will reduce respondent burden and minimize duplication of data collection efforts.

5. Methods to minimize burden on small entities

No small businesses or entities will be involved as respondents.

6. Consequences of not collecting data

These data are needed to evaluate intensive principal professional development. Little is known about the ability of principal professional development programs to improve principals’ leadership skills and school quality. Failing to collect these data would miss an opportunity to inform the field with rigorous evidence on the effectiveness of intensive principal professional development.

Table A.4. Source, mode, and timing

Data source | Mode and timing | Respondent group
Principal survey (annual surveys) | 30-minute web-based survey, with telephone option and in-person follow-up, administered in spring 2016 and spring 2017 | Treatment and control
Principal log (daily logs) | 15-minute web-based logs completed on 5 consecutive days, during four weeklong periods of the 2015–2016 and 2016–2017 school years | Treatment and control
Teacher survey (annual surveys) | 30-minute web-based survey, with telephone and hard-copy options and in-person follow-up, administered in spring 2016 and spring 2017 | Treatment and control
District records: staff records | Electronic grade and school assignment data and background characteristics for prior and current school years, requested from districts in fall 2015, fall 2016, and fall 2017 | Treatment and control
District records: staff performance records | Electronic performance data for prior school years, requested from districts in fall 2016 and fall 2017 | Treatment and control
District records: student records | Electronic student records data for prior school years, requested from districts in fall 2016 and fall 2017 | Treatment and control


Consequences of not collecting data from specific sources. Each of the four data sources provides information needed for the evaluation.

Without information from principal surveys, we will be unable to examine the professional development experiences of principals in the study or to compare the experiences of control group principals with those of treatment group principals. In addition, we will be unable to examine intermediate impacts of intensive principal professional development on principals’ leadership practices and school climate. Finally, without information on principals’ demographic backgrounds, professional experience, and district context, we will be unable to describe the sample of principals participating in the study or capture principal and district characteristics that may influence the implementation or effectiveness of the principal professional development.

Without the information from principal daily logs, we will be unable to examine how professional development impacts principals’ daily time-use in different domains of school leadership. Without collecting this information on five consecutive days per observation period, we will be unable to reliably measure principal time-use. Without collecting this information during four weeklong observation periods per school year, we will be unable to assess whether principal professional development induces changes in principals’ time-use over the school year.

Without teacher surveys, we will be unable to examine how intensive principal professional development impacts teachers’ behaviors and their perceptions of professional development opportunities, principals’ leadership practices, and school improvement efforts. Without the information on teacher perceptions of school climate, we will be unable to assess school-wide changes in working conditions, staff collaboration and support, and academic culture. Finally, without the information on teachers’ demographic backgrounds, educational attainment, and professional experience, we will be unable to capture teacher characteristics that may influence the implementation or effectiveness of the principal professional development.

Without district records, we will not be able to analyze the ultimate impact of the principal professional development on teacher or student outcomes. Without the following information, we will not be able to assess changes in principals’ ability to hire and retain effective teachers, determine effective teaching assignments, improve teachers’ performance, or improve student achievement and behavior:

  • Without information from staff records, we will be unable to examine impacts of the principal professional development on staff mobility (principal turnover and teacher turnover), hiring and retention outcomes (hiring and retention of effective teachers and dismissal of ineffective teachers), and teacher assignments to grade levels or subjects. We will also be unable to draw a random sample of teachers for the teacher surveys, preventing us from capturing teacher perceptions about principal practices.

  • Without information from staff performance records, we will be unable to examine impacts of the principal professional development on principal and teacher effectiveness or teacher-effectiveness-based staffing outcomes (e.g., hiring or retaining effective teachers).

  • Without information from student records, we will be unable to examine impacts of the principal professional development on student outcomes (achievement, attendance, and disciplinary problems). Moreover, without information from prior-year student records, we will be unable to assess the validity of random assignment (i.e., establish baseline equivalence between treatment and control groups).

Consequences of not collecting data at the end of years 1 and 2. If we do not collect data at the end of year 1, we will be unable to examine program impacts—such as changes in principal or teacher practices—after the initial year of program implementation. In addition, we will not be able to examine differences across the treatment and control groups in principals’ professional development experiences—such as the amount, focus, or quality of professional development received—during the first year of program implementation. If we do not collect these data at the end of year 2, we will be unable to examine effects of the principal professional development after the second year of the program. Because it may take time for principals to implement leadership improvements learned through professional development, school improvements and changes in teacher behaviors, teacher outcomes, or student outcomes may take time to become apparent. Without the year 2 information, we will be unable to detect any such effects.

7. Special circumstances

There are no special circumstances involved with this data collection.

8. Federal Register announcement and consultation

a. Federal Register announcement

The 60-day notice to solicit public comments was published in volume 80, number 69, pages 19298-19305 of the Federal Register on April 10, 2015. As of June 9, 2015, no public comments had been received. The 30-day notice to solicit public comments was published in [insert volume, number, and page] of the Federal Register on [insert date].

b. Consultations outside the agency

In formulating the evaluation design, the study team sought input from the technical working group (TWG), which includes practitioners and experts in evaluation methods and data analysis, state assessment programs, and education reform. We will continue to consult with the TWG throughout the study on other issues that would benefit from their input. Table A.5 lists the TWG members.

Table A.5. Technical Working Group members

Name | Title and affiliation
Eric Camburn | Associate Professor and Senior Researcher, Consortium for Policy Research in Education (CPRE), University of Wisconsin-Madison
Roger Goddard | Professor and Fawcett Chair, Department of Educational Studies, The Ohio State University
Jason Grissom | Assistant Professor and Faculty Affiliate, Center for the Study of Democratic Institutions, Vanderbilt University
Jason Huff | Educational Consultant, Seattle Public Schools
Carolyn Kelley | Professor, Educational Leadership and Policy Analysis, University of Wisconsin-Madison
Jim Kemple | Executive Director of the Research Alliance for New York City Schools; Research Professor at the Steinhardt School of Culture, Education, and Human Development, New York University
Susanna Loeb | Barnett Family Professor of Education, Stanford University
John Nunnery | Executive Director, Center for Educational Partnerships, Old Dominion University
Jeff Smith | Professor, Economics, University of Michigan

9. Payments or gifts

Burden payments have been proposed for the principal survey, the principal log, and the teacher survey. Payment will be made in the form of a check or gift card. During each round of data collection, principals completing the survey will receive a $30 payment, which is consistent with a high-burden survey. In addition, principals will be given $10 for each day a log is completed, for a possible total of $50 per round of data collection; the $10 payment for each log is consistent with the log’s low burden. While all aspects of this data collection are vital to the success of the study, the principal log will provide data on a particularly critical outcome. Adequately incentivizing principals to regularly (1) log time spent performing activities throughout the day and (2) enter the data into the online database will increase the likelihood that accurate data are collected across the span of the data collection period.

Teachers will receive a payment of $30 for completing the teacher survey. The payments are proposed because high response rates are needed to make the survey findings reliable, and we are aware that principals and teachers are the target of numerous requests to complete surveys on a wide variety of topics from state and district offices, independent researchers, and ED. The proposed amounts are within the incentive guidelines outlined in the March 22, 2005 memo, “Guidelines for Incentives for NCEE Evaluation Studies,” prepared for OMB.

10. Assurances of confidentiality

The study team has established procedures designed to protect the confidentiality and security of its data. This approach will be in accordance with all relevant regulations and requirements, in particular the Education Sciences Reform Act of 2002, Title I, Subsection (c) of Section 183, which requires the Director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.” The study will also adhere to the requirements of Subsection (d) of Section 183, which prohibits disclosure of individually identifiable information and makes the publication or inappropriate communication of individually identifiable information by employees or staff a felony.

The study team will protect the full privacy and confidentiality of all individuals who provide data. The study will not have data associated with personally identifiable information (PII), as study staff will be assigning random ID numbers to all data records and then stripping any PII from the data records. In addition to the data safeguards described here, the study team will ensure that no respondent names, schools, or districts are identified in publicly available reports or findings, and if necessary, the study team will mask distinguishing characteristics. A statement to this effect will be included with all requests for data:

“Mathematica Policy Research and its subcontractors AIR and SPR follow the confidentiality and data protection requirements of IES (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Responses to this data collection will be used only for research purposes. The reports prepared for the study will summarize findings across the sample and will not associate responses with a specific district, school, or individual. We will not provide information that identifies respondents to anyone outside the study team, except as required by law.”

Mathematica employs the following safeguards to protect confidentiality:

  • All Mathematica employees sign a pledge that emphasizes the importance of confidentiality and describes their obligation (see Appendix F).

  • Secure FTP services allow participating school districts to securely transfer encrypted data files to the study team. Internal networks are all protected from unauthorized access utilizing defense-in-depth best practices, which incorporate firewalls and intrusion detection and prevention systems. The networks are configured so that each user has a tailored set of rights, granted by the network administrator, to files approved for access and stored on the LAN. Access to hard-copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.

  • Computer data files are protected with passwords, and access is limited to specific users, who must change their passwords on a regular basis and conform to strong password policies.

  • Especially sensitive data are maintained on removable storage devices that are kept physically secure when not in use.

After the study concludes, the study data may be transmitted to the National Center for Education Statistics (NCES) for safekeeping as a restricted-use file. All other versions of the data will be destroyed. Prior to transmittal, the data will undergo careful screening, and modification if necessary, to ensure that there is no unacceptably high level of disclosure risk for protected respondents. Researchers wishing to access the data for secondary analysis must apply for an NCEE license and agree to the applicable rules and procedures guiding the use of restricted-use files.

11. Additional justification for sensitive questions

No questions of a sensitive nature will be included in this study.

12. Estimates of hours burden

Table A.6 provides an estimate of time burden for the data collections, broken down by instrument and respondent. These estimates are based on our experience collecting administrative data from districts and administering surveys to school principals and teachers. In addition, the table presents annualized estimates of indirect costs to all respondents for each data collection instrument. Details on the time and cost burdens are provided below for each of the separate data collection activities.

The total of 2,235 hours covers all data collection for the five-year evaluation and includes the following efforts: an average of 560 minutes per year, for three years, for each of the 10 districts to collect and assemble administrative records on teachers, principals, and students participating in the evaluation;4 30 minutes, annually for two years, for 85 principals (85 percent of the anticipated 100 principals in the sample) to complete the principal survey, plus 15 minutes, 20 times per year for two years, for 85 principals to complete the principal log; and 30 minutes, annually for two years, for 1,020 teachers (85 percent of the anticipated 1,200 teachers in the sample) to complete the teacher survey. The average annual number of respondents is 372, and the average annual number of responses over the three years of this collection is 1,880. The total annual burden for this collection is 745 hours.

A total burden cost of $31,292.45 was calculated across all data collection activities. The estimated cost burden for district-level data collection is based on an average wage of $44.13 per hour, for a total of $4,104.09. For principals, the same average wage of $44.13 per hour yields a total burden cost of $1,235.64 for the principal survey and $12,532.92 for the principal log. An average wage of $39.47 per hour yields a total burden cost of $13,419.80 for the teacher survey.
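The annualized figures above and in Table A.6 follow from straightforward arithmetic on the per-response assumptions; the short sketch below reproduces them.

```python
# Recomputing the annualized burden hours and labor costs reported above (and
# in Table A.6) from the per-response assumptions stated in this section.
rows = [
    # (label, annual responses, minutes per response, hourly wage)
    ("District records request", 10,   560, 44.13),
    ("Principal survey",         56,    30, 44.13),
    ("Principal log",            1134,  15, 44.13),
    ("Teacher survey",           680,   30, 39.47),
]

total_hours = 0
total_cost = 0.0
for label, responses, minutes, wage in rows:
    hours = round(responses * minutes / 60)   # annualized hours, rounded as in Table A.6
    cost = hours * wage
    total_hours += hours
    total_cost += cost
    print(f"{label}: {hours} hours, ${cost:,.2f}")

print(f"Total annual burden: {total_hours} hours, ${total_cost:,.2f}")
# Expected output: 745 hours and $31,292.45 per year; 3 x 745 = 2,235 hours over
# the three years of data collection.
```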

13. Estimates of cost burden to respondents

There are no additional respondent costs associated with this data collection beyond the burden estimated in item A12.

14. Estimates of annual costs to the federal government

The estimated cost for this five-year study, including selection of the intervention provider, data collection instruments, justification package, district recruitment, provision of professional development, support for the provider, data collection, data analysis, and report preparation, is $12,198,695 or approximately $2,439,739 per year.

15. Reasons for program changes or adjustments

This is a new information collection request.



Table A.6. Annual reporting and recordkeeping hour burden

Respondent/data request | Total number of respondents (target/expected response rate) | Total annual responses | Total amount of time per response (minutes) | Total annual hours | Respondent average hourly wage (a) | Respondent labor cost

Districts
Teacher and principal grade and school assignments, teacher and principal performance measures, and student records request (Appendix D) | 10 | 10 | 560 | 93 | $44.13 | $4,104.09

Principals
Principal survey (Appendix A) | 85 | 56 | 30 | 28 | $44.13 | $1,235.64
Principal log (Appendix B) | 85 | 1,134 | 15 | 284 | $44.13 | $12,532.92

Teachers
Teacher survey (Appendix C) | 1,020 | 680 | 30 | 340 | $39.47 | $13,419.80

Total | 1,115 (b) | 1,880 | | 745 | na | $31,292.45
Average annual | 372 | | | | |

Note: Burden is annualized over the 3 years of data collection. Teacher and principal grade and school assignment data will be collected three times (once annually for three years). Principal logs will be collected daily, during four weeks per year, for two years. All other data will be collected twice (once annually for two years).

a Average wages use the Bureau of Labor Statistics (BLS) May 2014 National Occupational Employment and Wage Estimates. Note that BLS does not provide an average hourly wage for elementary school teachers, because they may not work on a 12-month schedule. Thus, for this category the estimate is based upon an annual salary divided by 1,440 hours (roughly 180 days). BLS also does not provide an average hourly wage for principals; the principal hourly wage is computed assuming 2,080 hours per year.

b The same principals will be administered the survey and log, so the total number of respondents (target/expected response rate) does not include the 85 principal log respondents.



16. Plan for tabulation and publication of results

a. Tabulation plans

Implementation analysis

The implementation analysis will contextualize findings from the impact analysis, detailing the fidelity of program implementation and challenges encountered, identifying mechanisms through which the professional development might improve outcomes, and documenting the extent of the contrast between the intervention and the counterfactual (that is, whether the experiences of principals in the treatment group differed significantly from those of principals in the control group).

Descriptive analysis of professional development. Drawing on information on the intended scope and sequence of the professional development gathered by the study’s technical assistance team, we will first describe the intervention as planned, including its objectives, and the timing, hours of professional development (session-based and coaching), delivery mode, and content of support.5 We will draw on principal surveys to describe the professional development received. We will describe the number of hours and focus of professional development received by principals and principals’ perceptions of the quality or usefulness of the professional development. Using information from the principal surveys, teacher surveys, and district records data, we will also describe the characteristics of the districts in which the intervention was delivered.

Comparison of the experiences of treatment and control group principals. Using data from the year 1 principal survey, we will describe the professional development received by the treatment group principals relative to that received by control group principals. The analysis will detail the content and frequency of professional development both groups received and test for statistically significant differences between the two groups.

Impact analysis

The impact analysis will examine the effects of CEL on the principal, teacher, and student outcomes in Table A.7. Key outcomes of interest include (1) teachers’ receipt of professional development and perceived quality and frequency of feedback from the principal; (2) school climate outcomes, including staff collaboration and perceptions of support; (3) teachers’ perceptions of principals’ leadership practices, school climate, and educator behaviors; (4) teacher effectiveness (as measured by the district’s central office) and teacher retention; and (5) student test scores, attendance, and behavior.

Table A.7. Principal, teacher, and student outcomes for the impact analysis

Principal outcomes
  • District central office performance ratings

Teacher outcomes
  • Perceptions of school climate
  • Staff collaboration
  • Perceived support
  • Perceptions of principals’ leadership practices
  • Teachers’ receipt of professional development and principal feedback (quality and usefulness)
  • Retention
  • Performance ratings
    - All current teachers
    - New teachers
  • Assignment
    - Difference between teacher performance in high-stakes and low-stakes grades
    - Variance of average teacher performance across grades (to examine whether high- and low-performing teachers are grouped within grades)

Student outcomes
  • Achievement
  • Attendance
  • Behavior


We will estimate impacts using the following model:

Y_{ij} = \alpha + \beta T_{j} + \delta' P_{ij} + \gamma' Z_{j} + \varepsilon_{ij},   (1)

where Yij is the outcome of interest for individual (principal, teacher, or student) i in school j; α is an intercept term; Tj is an indicator equal to one if the school is assigned to the treatment group and zero otherwise; Pij is a vector of baseline school- or student-level characteristics, and Zj is a vector of fixed effects corresponding to the study’s random assignment blocks; δ and γ are coefficient vectors; and εij is a random error term, clustered at the school level. The coefficient β represents the average impact of the principal professional development.6 We will apply a weighting scheme that gives an equal weight to each of the schools in the sample (regardless of the number of principals, teachers, or students in the sample in each school).
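A minimal sketch of how equation (1) could be estimated is shown below, assuming hypothetical column names for the outcome, treatment indicator, baseline covariates, block indicators, school identifiers, and weights. It applies the equal-school weighting scheme and school-level clustering described above; the study’s actual estimation code may differ.

```python
# A minimal sketch of estimating equation (1) with school-level clustering and
# equal school weights. Column names (outcome, treatment, x1, x2, block,
# school_id) are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("analysis_file.csv")  # one row per principal, teacher, or student

# Weight each observation so that every school contributes equally overall.
df["weight"] = 1.0 / df.groupby("school_id")["school_id"].transform("size")

model = smf.wls(
    "outcome ~ treatment + x1 + x2 + C(block)",  # C(block) = random assignment block fixed effects
    data=df,
    weights=df["weight"],
)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(result.params["treatment"])    # estimated beta: average impact of the program
print(result.bse["treatment"])       # cluster-robust standard error
```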

For teacher assignment, we will investigate how assignment to the CEL program affects the way principals assign teachers to classes—for instance, whether it leads principals to pair higher-performing teachers with lower-performing teachers in the same grade (to facilitate mentoring), or whether it leads them to reassign low-performing teachers to grades not covered by high-stakes tests. We will also examine principal and teacher performance ratings as measures of effectiveness, but we will interpret these data with caution. Some districts rate principals at least partly based on school value added, and for new principals this rating is affected by the teachers inherited from prior principals (Chiang et al. 2012). Similarly, teacher performance ratings may have two limitations: (1) our experience suggests that nearly all teachers are given ‘satisfactory’ ratings, so there may be little variation in the data; and (2) teacher evaluations typically incorporate ratings by principals, so differences in ratings could reflect changes in how treatment principals rate teachers rather than true changes in teacher effectiveness.

Our main impact estimates will reflect the impact of the professional development on the schools whose principals were offered the professional development, whether or not they actually participated for the full two years. To examine the effects on schools that received the full intended “dosage”—that is, treatment schools whose principals received both years of the CEL professional development—we will estimate local average treatment effects (Imbens and Angrist 1994), using treatment status as an instrumental variable for the proportion of total hours of intended CEL activities the principal attended. Even schools with principal turnover during the study could receive the full intended dosage if both the original and replacement principals attend all offered CEL activities during their time at the school.
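The dosage analysis can be carried out as two-stage least squares with treatment status as the instrument for the share of intended CEL hours attended. The sketch below is a manual two-stage version under hypothetical variable names; it reproduces the point estimate but not the corrected standard errors that a dedicated instrumental-variables routine would report.

```python
# A sketch of the "dosage" analysis: two-stage least squares using random
# assignment (treatment) as an instrument for the share of intended CEL hours
# the principal attended (dosage_share). Variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("analysis_file.csv")

# First stage: predict dosage from random assignment plus the same covariates
# and block fixed effects used in equation (1).
first = smf.ols("dosage_share ~ treatment + x1 + x2 + C(block)", data=df).fit()
df["dosage_hat"] = first.fittedvalues

# Second stage: regress the outcome on predicted dosage.
second = smf.ols("outcome ~ dosage_hat + x1 + x2 + C(block)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school_id"]}
)
print(second.params["dosage_hat"])   # local average treatment effect per unit of dosage
```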

Subgroup analyses. Because new principals may be more in need of and more receptive to coaching and formal group professional development sessions than more experienced principals, we will conduct subgroup analyses separately examining the effects of the professional development for novice and experienced principals. Statistical power for the subgroup analyses will be lower than for the full sample—subgroup findings will thus be less definitive than those for the full sample, but can provide suggestive evidence of impacts for these groups.
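A minimal sketch of the subgroup analysis is shown below: it adds an interaction between treatment status and a hypothetical 0/1 indicator for novice principals to the equation (1) specification, so the interaction coefficient measures how impacts differ for novice principals. Variable names are placeholders, not the study’s actual variables.

```python
# Subgroup analysis sketch: interact treatment with a hypothetical indicator
# for novice principals in the equation (1) specification.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("analysis_file.csv")
df["weight"] = 1.0 / df.groupby("school_id")["school_id"].transform("size")

subgroup = smf.wls(
    "outcome ~ treatment * novice_principal + x1 + x2 + C(block)",
    data=df,
    weights=df["weight"],
).fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})

print(subgroup.params["treatment"])                    # impact when novice_principal = 0 (experienced)
print(subgroup.params["treatment:novice_principal"])   # difference in impact for novice principals
```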

b. Publication plans

The study team will prepare two reports. The first will describe the implementation of the principal professional development program and present initial impacts of the program on intermediate outcomes, including school climate and educator behaviors, in year 1 (the 2015–2016 school year). The second report will examine implementation and the effects of principal professional development on additional intermediate outcomes, including school climate and educator behaviors, in year 2 (the 2016–2017 school year), teacher effectiveness and retention, and student outcomes, including test scores, attendance, and behavior from years 1 and 2. Both reports will be released on the IES website.

17. Approval not to display the OMB expiration date

All data collection instruments will include the OMB expiration date.

18. Explanation of exceptions

No exceptions are requested.

Table A.8. Timetable for project publications

Activity | Date

Year 1 report
First draft | January 2017
Revised draft | March 2017
Final report | December 2017

Year 2 report
First draft | August 2018
Revised draft | October 2018
Final report | June 2019





References

Camburn, E., J. Spillane, and J. Sebastian. “Assessing the Utility of a Daily Log for Measuring Principal Leadership Practice.” Educational Administration Quarterly, vol. 46, 2010a, pp. 707–737.

Camburn, E., J. Huff, E. Goldring, and H. May. “Assessing the Validity of an Annual Survey for Measuring Principal Leadership Practice.” Elementary School Journal, vol. 111, no. 2, 2010b, pp. 314–335.

Camburn, E., E. Goldring, J. Sebastian, H. May, and J. Huff. “Conducting Randomized Experiments with Principals: A Case Study.” Educational Administration Quarterly, forthcoming.

Goddard, R., A. Miller, M. Kim, Y. Goddard, P. Schroeder, and E.S. Kim. “Evaluation of Principals’ Professional Development Learning: Results from A Randomized Control Trial.” Paper presented at the Annual University Council for Educational Administration Convention, Pittsburgh, Pennsylvania, 2011.

Goff, P.J., E. Guthrie, E. Goldring, and L. Bickman. “Changing Principals’ Leadership Through Feedback and Coaching.” Journal of Educational Administration, vol. 52, no. 5, 2014, pp. 682-704.

Kottkamp, R.B., and E.A. Rusch. “The Landscape of Scholarship on the Education of School Leaders, 1985-2006.” In Handbook of Research on the Education of School Leaders, edited by M.D. Young, G.M. Crow, J. Murphy, and R.T. Ogawa. New York: Routledge, 2009, pp. 23-85.

Nunnery, J.A., S.M. Ross, and C. Yen. “An Examination of the Effect of a Pilot of the National Institute for School Leadership’s Executive Development Program on School Performance Trends in Massachusetts.” Norfolk, VA: Old Dominion University, Center for Educational Partnerships, 2010a.

Nunnery, J., S.M. Ross, and C. Yen. “The Effect of a Pilot of the National Institute for School Leadership’s Executive Development Program on School Performance Trends in Pennsylvania.” Norfolk, VA: Old Dominion University, Center for Educational Partnerships, 2010b.

Nunnery, J., S.M. Ross, and C. Yen. “The Effect of a Pilot of the National Institute for School Leadership’s Executive Development Program on School Performance Trends in Pennsylvania: 2006-2010 Pilot Cohort Results.” Norfolk, VA: Old Dominion University, Center for Educational Partnerships, 2011.








1 We are not requesting OMB approval for collection of data on the scope and sequence of the professional development because this information will be collected from the professional development provider and will not impose any burden on study participants.

2 To help increase the chances that treatment and control groups are well balanced in terms of key baseline characteristics, before random assignment we will organize participating schools in each district into blocks, each of which contains two schools with similar baseline characteristics. Characteristics we will consider in the formation of blocks include (1) grade span; (2) school type (charter, magnet, or traditional public school); (3) average 3rd-, 4th-, and 5th-grade reading and math test scores from the 2013–2014 school year; (4) principal experience; (5) school enrollment; (6) percentage of students eligible for free or reduced-price lunch; and (7) percentage of students who are nonwhite.



3 The institute will be offered on two dates, to help ensure all treatment group principals can attend.

4 This average reflects 4 hours in the first year, 12 hours in the second year, and 12 hours in the third year, for a total of 28 hours (equal to 1,680 minutes) across all three years (or 560 minutes per year).

5 These data will be collected by the study’s technical assistance team from the professional development provider, and their collection will not impose burden on study participants.

6 We will not adjust for principal and teacher characteristics in this model. Because the intervention could influence the composition of principals and teachers, through retention or mobility, observed characteristics of principals and teachers could reflect program effects, and including them could lead to an incorrect estimate of program impacts.
