Integrated Evaluation of American Recovery and Reinvestment Act (ARRA) Funding, Implementation and Outcomes

OMB: 1850-0877


Integrated Evaluation of ARRA Funding, Implementation and Outcomes



Statement for Paperwork Reduction Act Submission


PART A: Justification



Contract ED-IES-10-CO-0042







December 2010








Part A: Justification

This package is the second of three for the Integrated Evaluation of ARRA Funding, Implementation, and Outcomes. Our initial request sought approval to execute the sampling plan and recruit the selected sites; that approval was received on January 13, 2011 (see 1850-0877 v.1 (4385)). This package requests approval for an initial round of data collection that will include surveys of all states and a nationally representative sample of districts and schools in spring 2011. The third and final package will request approval for follow-up surveys with the same groups in 2012 and 2013. A fast response from OMB is critical if the study is to field the spring 2011 surveys successfully, since much preparatory work is necessary to ensure a high response rate from sampled school districts and schools.


Introduction


On February 17, 2009, President Obama signed the American Recovery and Reinvestment Act (ARRA) into law (Pub. L. 111-5). ARRA provides an unprecedented $100 billion of additional funding for the U.S. Department of Education (ED) to administer. While the initial goal of this money is to deliver emergency education funding to states, ARRA is also being used as an opportunity to spur innovation and reform at different levels of the U.S. educational system. Specifically, ARRA requires those receiving grant funds to commit to four core reforms: (1) adopting rigorous college- and career-ready standards and high-quality assessments, (2) establishing data systems and using data to improve performance, (3) increasing teacher effectiveness and the equitable distribution of effective teachers, and (4) turning around the lowest-performing schools. Investment in these innovative strategies is intended to lead to improved results for students, long-term gains in school and local education agency (LEA) capacity for success, and increased productivity and effectiveness.


The education component of ARRA consists of several grant programs targeting states and LEAs and, in some cases, consortia led by non-profit organizations. The programs under ARRA fall into three general categories: (1) existing programs that received an infusion of funds (e.g., Individuals with Disabilities Education Act, Parts B & C; Title I; State Educational Technology grants; Statewide Longitudinal Data Systems grants); (2) a new program intended mainly for economic stabilization (i.e., the State Fiscal Stabilization Fund); and (3) newly created programs that are reform-oriented in nature. Due to the number and scope of these programs, a large proportion of districts and schools across the country will receive some ARRA funding. As a result, ARRA represents a unique opportunity to encourage the adoption of school improvement-focused reforms and to learn from reform initiatives as they take place.


Although ARRA funds are being disbursed through different grant programs, their goals and strategies are complementary if not overlapping, as are the likely recipients of the funds. For this reason, an evaluative approach in which data collection and analysis occur across grant programs (i.e., an "integrated" approach), rather than separately for each set of grantees, will not only reduce respondent burden but also provide critical information about the effect of ARRA as a whole.






Overview of the Study


The Integrated Evaluation of ARRA Funding, Implementation and Outcomes is being conducted under the Institute of Education Sciences (IES), ED’s independent research and evaluation arm. The study is one of several that IES will carry out to examine ARRA’s effects on education (see Exhibit A-1).


Exhibit A-1. IES Evaluation of ARRA's Effects on Education


The Integrated Evaluation is designed to assess how ARRA efforts are unfolding over time and is therefore primarily descriptive. While information will be gathered on many of the grant programs, the evaluation will focus primarily on the reform-oriented programs (e.g., Race to the Top (RTT), Title I School Improvement Grants (SIG), Investing in Innovation (i3), and the Teacher Incentive Fund (TIF)), since those are of the greatest policy interest.1 The study will support the various impact evaluations IES is conducting by providing critical context for the strategies being rigorously investigated – e.g., by documenting the relative frequency with which they are being implemented across the country, whether they are unique to the particular grant programs, and how they are being combined with other reform approaches.


To achieve these objectives, the Integrated Evaluation will draw heavily on existing information (grant funding allocations, district and school outcomes databases, performance reporting where available) and administer new surveys to all 50 states and the District of Columbia and to a nationally representative sample of districts and schools. The surveys will be conducted annually for at least three years, in spring 2011, 2012, and 2013.2 In addition, two district polls of a subsample of the sampled districts will be conducted, one between the 2011 and 2012 surveys and one between the 2012 and 2013 surveys, to capture key, evolving issues of interest to ED officials and other policy makers as they consider shifting technical assistance efforts and further legislative action.


The evaluation’s theory of action, which is displayed in Figure A-1, posits that ARRA and the individual programs included in it are appropriately understood as a federal strategy for intervening in ongoing state and local education reform efforts. As the theory of action suggests, states have more or less well-defined reform agendas and priorities, many of which existed prior to ARRA. The arrows from the top and bottom boxes to the box on the left side of the display suggest that state reform priorities and strategies have been and continue to be influenced by the availability of ARRA education funds and the requirements established by the various ARRA programs.


The four ARRA assurances define the core elements of the federal strategy. The theory of action suggests that two of the four assurances, increasing educator effectiveness and equitable distribution and improving low-performing schools, are the primary foci of ARRA expectations and ARRA-supported reforms. The “School Improvement” box on the right side of the model appears as it does to suggest that ARRA has high aims for improving all schools, while at the same time targeting significant resources to improving the lowest-performing schools. Setting new standards, developing new assessments aligned with those standards, and establishing statewide longitudinal student data systems (the other two ARRA assurance areas) are important to the reform equation, but are best understood in terms of how they contribute to reforms in the other two areas.


The location of the “District Reform Priorities and Strategies” box suggests that while states exert considerable leadership in education reform, much of the work is done at the local level as district and school staff work to improve instruction. Nowhere is this more clearly demonstrated than in the implementation of the myriad strategies associated with increasing educator effectiveness and equitable distribution. These strategies include (1) designing educator preparation programs and ongoing professional development aligned with state and local performance standards; (2) designing and implementing quality induction programs for new teachers and principals; (3) designing and implementing new educator evaluation systems that include clear evidence of gains in student achievement as a criterion for effective performance; and (4) designing and implementing new systems of compensation and incentives that recognize and reward quality performance and help to ensure that highly effective educators are assigned to and continue to work in hard-to-staff schools. Together these strategies define an aligned human resource management system, which, in turn, prepares and supports educators’ efforts to improve schools, especially the lowest-performing schools. The ultimate goal of ARRA programs and the reforms they support is to improve student learning.


The left-to-right arrows connecting the boxes through the middle of the diagram, labeled “C3” as shorthand for communication, coordination, and collaboration, suggest the importance of the linkages among state, district, and school reform efforts. Understanding what each of these linking strategies looks like and its contributions to advancing reform efforts is important to understanding the overall effectiveness of the reforms. The “Lessons” arrows connecting the boxes through the middle of the diagram from right to left are intended to convey the idea that lessons learned as implementation proceeds may lead to mid-course corrections in strategies and/or their implementation.


Figure A-1. Integrated Evaluation of ARRA Funding, Implementation, and Outcomes: Theory of Action



[Figure A-1 is a graphic depicting the evaluation’s theory of action. Boxes representing ARRA funding, policy, and requirements (e.g., RTT, i3, TIF, SLDS, Tech Grants, SIG, SFSF) feed into state reform priorities and strategies, which in turn shape district reform priorities and strategies, school improvement (with particular attention to low-performing schools), and, ultimately, student outcomes. Supporting elements include standards and assessments, data systems, educator effectiveness and equitable distribution, and a statewide technology infrastructure. Left-to-right arrows labeled “C3” (communication, coordination, and collaboration) link state, district, and school reform efforts; right-to-left arrows labeled “Lessons” feed implementation lessons back to earlier stages.]

The “Statewide Technology Infrastructure” box is included in the theory of action to underscore the fact that, in many states, developing technology infrastructures are fast becoming part of the glue that holds the reform efforts together and may also be catalysts for increasing the pace of reform. New student data systems rely heavily on technology and on state, district, and school capacity to use the systems. Full implementation of new assessments will also depend heavily on the capacity of data systems to store data and produce timely, user-friendly reports to inform instructional planning. The new data systems are also hypothesized to facilitate reform. Increasingly, states and districts are relying on technology to store instructional materials, especially as new content standards are introduced and applied in classrooms. Finally, states and districts are increasingly relying on technology as the medium for educator professional development.


The theory of action acknowledges that state and local education reforms did not begin with ARRA. The left-to-right progression displayed in the theory of action suggests that some reforms must be completed, or at least make some progress, before others can be completed. At the same time, the theory of action probably does not adequately reflect important time dimensions of the various reforms that are underway. One task for this evaluation is to examine how long it will take to implement the planned/expected reforms and how long it will take to see results. A second task will be to examine how the pace and sequence of individual reform efforts interact in the development and implementation of new policies, programs, and practices. Work driven in substantial ways by ARRA funding and expectations is proceeding on many fronts simultaneously. Yet the reality is that some things must be completed (e.g., implementation of new standards and assessments and new data systems) before others (e.g., full implementation of new educator evaluation systems). For example, implementation of the Common Core State Standards is already underway as states and districts develop new instructional resources and provide professional development to teachers and principals to introduce the standards and explain how they can and should be applied in the classroom. The development of new assessments aligned with the standards is also underway, but new state assessments are not expected to be in place for several years. Thus, teachers are likely to face a situation in which they are teaching to the new standards while being held accountable for assessment results that reflect mastery of a different set of standards. One could therefore argue that, despite early progress, implementation of the new standards will not be complete until the new assessments are implemented.


Similarly, many states and districts are moving quickly on the development of new educator evaluation systems that rely on student learning gains as a primary criterion for evaluating educator effectiveness. Because these systems will ultimately rely on student outcomes defined in terms of mastery of the new standards, the systems cannot be considered fully implemented until the standards have been implemented and the new assessments are in place. Finally, logic dictates that it will not be possible to gauge the full impact of ARRA on student learning and other outcomes until these complex reforms are completed. This assumption, along with the others laid out in the theory of action presented above, will guide this study and, in turn, shape the data collection efforts.



A.1 Explanation of Circumstances That Make Collection of Data Necessary

The Integrated Evaluation of ARRA Funding, Implementation, and Outcomes is a key component of ED’s efforts to learn lessons from the scope and structures of the ARRA funding. By providing the most comprehensive and independent assessment of ARRA implementation and outcomes across funding streams, it responds to taxpayer interest in how ARRA funds were spent. Although other groups and researchers external to ED are examining some of the same issues (see section A.4), only an ED-sponsored contractor will have access to and report on the full set of data. The breadth of policy/research questions that will be addressed by IES’ Integrated Evaluation sets it apart from other ARRA studies:


  1. To what extent did ARRA funds go to the intended recipients?

  • To what extent did high-need states, districts, and schools receive support? And how did this vary by ARRA program?

  • To what extent did funds go to states, districts, and schools that positioned themselves to implement reforms? And how did this vary by ARRA program?


  2. Is ARRA associated with the implementation of the key reform strategies it promoted?

  • Which reform strategies received the most attention in terms of (a) how SEAs and districts allocated their ARRA funds and (b) where the greatest implementation activity took place?

  • Comparing pre-2009 and post-2009 levels of implementation, are more key reform strategies implemented after the ARRA education funds were allocated?

  • Is having more ARRA funding associated with deeper/broader implementation of the reform strategies?


  3. Which implementation supports and challenges are associated with ARRA?

  • What mechanisms are in place at the state and district levels to ensure that the reform efforts are (a) progressing as planned and (b) achieving the intended results?

  • Is alignment of priorities more evident in RTT states, where there is more of a focus on state capacity building?

  • Looking across the four reform areas and related strategies, what implementation challenges do SEAs, districts, and schools report? How do these vary by the characteristics of states, districts, and schools?


  4. Is ARRA associated with improved outcomes?

  • Is ARRA funding associated with improved student outcomes?

  • Is ARRA associated with improved distribution of effective teachers?

  • Is there a relationship between the use of particular reform strategies promoted by ARRA (or bundles of strategies) and outcomes? Are these relationships moderated by key variables such as coordination, communication, and fiscal context?


In addition, the study is designed to provide ongoing, formative feedback to ED through the district polls and feedback to states through state-specific tabulations of the survey data.



A.2 How the Information Will Be Collected, by Whom, and For What Purpose

The Integrated Evaluation will rely on information collected from existing sources, for which there are no respondents or burden, and from a new set of surveys in order to address the research questions described above. See Table A-1 for the linkages between the research questions and the sources of information to answer the questions. We then discuss the extant data sources and the new data collections.


Table A-1. Research Questions, Extant Data, and ARRA Evaluation Survey Data

For each research question, the table lists the extant data (source and description) and the new ARRA evaluation survey data (description) that will address it. The level at which each data element is available appears in brackets.*

To what extent did ARRA funds go to the intended recipients?

1) To what extent did high-need states, districts, and schools receive support? And how did this vary by ARRA program?

Extant data: ED grants database: grantees, award amounts [S, D, Sch]. Recovery.gov: grantees and subgrantees, award and subaward amounts, amounts received, total expenditures, subaward funds disbursed [S, D].

Survey data: Funding amount by ARRA program, for sampled schools [Sch]. Type of funding source (ARRA, non-ARRA, state, other) by reform area [S, D].

2) To what extent did funds go to states, districts, and schools that positioned themselves to implement reforms? And how did this vary by ARRA program?

Extant data: Quality Counts state policy survey: standards/assessments, teacher effectiveness [S]. DQC state survey (annual): development of longitudinal data systems [S]. CEP state surveys (2009, 2010): status of state reform efforts [S]. NCSL: education legislation [S]. Grant applications (RTT, SFSF) [S].

Survey data: Pre-2009 reform activities [S, D, Sch].

Is ARRA associated with the implementation of the key reform strategies it promoted?

3) Which reform strategies received the most attention in terms of (a) how SEAs and districts allocated their ARRA funds and (b) where the greatest implementation activity took place?

Extant data: ED grants database: grantees, award amounts [S, D, Sch]. Recovery.gov: grantees and subgrantees, award and subaward amounts, amounts received, total expenditures, subaward funds disbursed [S, D]. CEP LEA survey (2010): use of funds by strategy; reform strategies by implementation status [D].

Survey data: Type of funding source (ARRA, non-ARRA, state, other) by reform area [S, D]. Use of strategies by reform area (standards/assessments, educator recruitment, support for new educators, educator evaluation systems, educator compensation and incentives, low-performing schools, longitudinal student data systems) [S, D, Sch]. Implementation status (not planned, in planning/development, being provided/made available) for strategies within reform areas [S, D, Sch].

4) Comparing pre-2009 and post-2009 levels of implementation, are more key reform strategies implemented after the ARRA education funds were allocated?

Extant data: Grant applications (RTT, TIF, i3, SIG) [S]. NLS: teacher effectiveness reforms, school improvement strategies (pre-ARRA) [D]. CEP state surveys (2009, 2010): status of state reform efforts [S]. DQC state survey (annual): development of longitudinal data systems [S].

Survey data: Use of strategies by reform area (standards/assessments, educator recruitment, support for new educators, educator evaluation systems, educator compensation and incentives, low-performing schools, longitudinal student data systems); pre-2009 reform activities [S, D, Sch].

5) Is having more ARRA funding associated with deeper/broader implementation of the reform strategies?

Extant data: Performance reports (RTT, TIF, i3, SIG, SFSF): descriptions of efforts implemented, school intervention models, educator evaluation and compensation systems [S, D, Sch]. ED grants database: grantees, award amounts [S, D, Sch]. Recovery.gov: grantees and subgrantees, award and subaward amounts, amounts received, total expenditures, subaward funds disbursed [S, D].

Survey data: Implementation status (not planned, in planning/development, being provided/made available) for strategies within reform areas [S, D, Sch]. School improvement activities, charter and management organizations [Sch]. Participation in professional development, by reform strategy [Sch].

Which implementation supports and challenges are associated with ARRA?

6) What mechanisms are in place at the state and district levels to ensure that the reform efforts are (a) progressing as planned and (b) achieving the intended results?

Extant data: None.

Survey data: SEA communication strategies [S, D]. LEA communication strategies [D, Sch]. Types of oversight, guidance, and technical assistance [D, Sch].

7) Is alignment of priorities more evident in RTT states, where there is more of a focus on state capacity building?

Extant data: None.

Survey data: Priority level by reform area activities [S, D, Sch]. Whether LEA strategies address state requirements (Y/N) [D].

8) Looking across the four reform areas and related strategies, what implementation challenges do SEAs, districts, and schools report? How do these vary by the characteristics of states, districts, and schools?

Extant data: None.

Survey data: Specific challenges by reform area (major, minor, not a problem) [S, D].

Is ARRA associated with improved outcomes?

9) Is ARRA funding associated with improved student outcomes?

Extant data: EDFacts: student proficiency on state assessments, graduation rates [S, D, Sch]. SFSF public reporting (future): college enrollment, completion [S, D, Sch]. SIG performance reports: student advanced coursework [Sch].

Survey data: Funding amount by ARRA program, for sampled schools [Sch].

10) Is ARRA associated with improved distribution of effective teachers?

Extant data: TIF performance reports: teacher/principal effectiveness [D]. RTT performance reports: teacher and principal performance ratings and distribution by school type [S]. SFSF public reporting and SIG performance reports: distribution of principal/teacher performance ratings [D].

Survey data: Funding amount by ARRA program, for sampled schools [Sch].

11) Is there a relationship between the use of particular reform strategies promoted by ARRA (or bundles of strategies) and outcomes? Are these relationships moderated by key variables such as coordination, communication, and fiscal context?

Extant data: See the extant data for questions 9 and 10, above [S, D, Sch]. Pew Center on the States (2009): indicators of state fiscal health [S].

Survey data: School improvement activities, charter and management organizations [Sch]. Participation in professional development, by reform area [Sch]. Communication and coordination.

*Level refers to the organizational unit for which data are available: state (S), district (D), or school (Sch). Relevant survey data may come from one or more of the surveys to be conducted in this study.

Extant Data Sources


  • ED Databases. We will use data from the National Center for Education Statistics’ Common Core of Data (CCD) and ED’s EDFacts to assemble the sampling frame for this study. Data items will include urbanicity, school level, poverty status, improvement status, total enrollment and limited English proficient student enrollment, and adequate yearly progress or corrective action status under the Elementary and Secondary Education Act (ESEA). These same data systems can provide information on outcomes such as academic proficiency rates under ESEA and graduation rates.


  • ED ARRA Program Files. We will use data from ED program files to compile a list of grant recipients for TIF to be used in assembling the school district sampling frame for this study. We will use data from ED program files and state websites to compile the list of persistently lowest-achieving schools to be used in assembling the school sampling frame for this study. We will obtain data from ED program files on the amounts of funding received by states and districts from each formula grant and each discretionary program.


  • ARRA Required Reporting and Information. We are examining the types of information that the statute and ED require to be reported by states, districts, and schools as a condition of program participation. Some data are provided directly to ED, including application materials, performance indicators, and progress reports. Some data must be reported “publicly” but not necessarily to ED. Other than for SFSF, much of the reporting has not begun. Information that is available will be used in reporting and, to the extent possible, will not be duplicated in the surveys to be administered to states, school districts, and schools. However, it is important for analytic purposes to have the same data collected for grantees and non-grantees to allow for comparison – e.g., district reports of state support provided in RTT states versus non-RTT states. It is equally important, from a research perspective, for the data collection modes or mechanisms to be the same. We will balance these research needs with the desire not to add burden to grantees who already have reporting responsibilities.


  • Non-ED Databases. We will obtain data from other, non-ED sources to provide context variables or outcome measures. For example, we are exploring whether the National Longitudinal School-Level State Assessment Score Database or the State Education Data Center provides more comprehensive or historical district and school measures of proficiency rates than does the Department’s EDFacts system. Other organizations, such as the Rockefeller Institute and the Pew Foundation, have developed measures of states’ and, in some cases, large districts’ fiscal conditions that would provide important analytic opportunities for examining the conditions under which ARRA implementation is taking place.


New Data Collections


Because there is currently no reliable source of detailed information on the strategies being implemented under the various ARRA programs, we will administer a set of surveys to obtain this information (see Table A-2).

Table A-2. Description of Information to be Collected

State survey
Respondent: Chief state school officer as point of contact; sections to be distributed to and completed by appropriate state staff.
Mode: Paper and pencil.
Content: State strategies for adopting new standards, establishing aligned assessments, supporting new educators, establishing educator evaluation systems, supporting the improvement of low-performing schools, and using statewide longitudinal data systems; state efforts to encourage district-level and school-level adoption of reform strategies; challenges related to reform efforts; funding sources for reform initiatives; state priorities for future reform.

District survey
Respondent: Superintendent to designate a district liaison; sections to be distributed to and completed by appropriate district staff.
Mode: Web, with hard copy (paper and pencil) if requested.
Content: District strategies for adopting new standards, supporting new educators, establishing educator evaluation systems, supporting the improvement of low-performing schools, and using statewide longitudinal data systems; district efforts to encourage school-level adoption of reform strategies; challenges related to reform efforts; funding sources for reform initiatives; district priorities for future reform.

School survey
Respondent: Principal.
Mode: Web, with hard copy (paper and pencil) if requested.
Content: Specific reform activities taking place at the school level; depth and breadth of professional development related to reform efforts; resources purchased or received to support reform; challenges related to reform efforts.

Poll #1 (fall 2011)
Respondent: Superintendent to designate a district liaison; questions to be distributed to and completed by appropriate district staff.
Mode: Web, with telephone option if requested.
Content: Types of guidance provided by the state; types of assistance needed; district partnerships to support reform.

Poll #2 (fall 2012)
Respondent: Superintendent to designate a district liaison; questions to be distributed to and completed by appropriate district staff.
Mode: Web, with telephone option if requested.
Content: Types of guidance provided by the state; types of assistance needed; district partnerships to support reform.






In developing the surveys, several underlying principles guided what information we ask for and how we ask for it:


  • It is not possible to conduct a general ledger examination to document what states and districts spent the money on. ARRA is too new, and, as a set of primarily temporary programs, states and school districts have not set up formal, permanent bookkeeping on “use.”


  • Schools will not know what state/district money or guidance is ARRA-related, with the possible exception of TIF and i3.


  • Because the ARRA money is integrated with state funds, it is more important and reliable to ask about reform activities being implemented rather than only those implemented with ARRA funds.


State Surveys


The state surveys will be administered as paper and pencil instruments, sent electronically so that respondents have the option of typing responses and returning the instrument electronically, or printing, completing, and mailing it back. We determined that 51 respondents did not warrant the development of a web survey, but we will reconsider the use of a web-based survey in the pilot testing phase. The survey will be sent to the chief state school officer in each of the 50 states and the District of Columbia in spring 2011, 2012, and 2013. Each chief will be responsible for determining whether he or she is in the best position to respond to the questions in the instrument or for requesting that other state education agency (SEA) officials take the lead in responding to individual sections. The survey will be modularized to allow for both options—for example, questions on teacher quality and evaluation could be completed by the state official most responsible for enacting that part of the state plan, while questions about state data systems could be completed by someone else. We will record who the respondent(s) are for each round of the survey and conduct sensitivity testing to examine the influence of shifts between single and multiple respondents over time. The state surveys will be used to examine state priorities for ARRA funding and implementation, shifts in state policy and legislation to support ARRA efforts, and the types of supports and communication provided to districts and schools. The initial state survey is in Appendix A.


District Surveys


Web-based surveys will be administered to the deputy superintendents of districts sampled for the study. We have found that deputy superintendents are typically designated as responsible for research efforts and are best suited to determining whether additional staff are needed to complete sections of the survey. Like the state survey, the district survey will be modularized to allow for completion by one or multiple respondents. The survey will collect information such as district priorities for improvement efforts, the status of implementation, and the supports and technical assistance provided by the state to the district and by the district to the schools in its community. While we will link some questions to ARRA, we anticipate that many districts will not know which specific funds received from the state came from ARRA grants, so the bulk of the questions will simply relate to aspects of the reform strategies they are implementing. The initial district survey is in Appendix B.


School Surveys


Web-based surveys will also be administered to each sampled school principal. We anticipate that the principal will be the sole respondent, as he or she will be in a position to answer questions about the emphasis of the school’s improvement efforts, including how teachers are evaluated, state- and district-provided professional development and other supports, instructional changes, and use of data. The initial school survey is in Appendix C.



District Polls


These short web-based surveys of a small set of district superintendents are intended to provide a snapshot of specific time- and context-sensitive issues that could not be covered in the annual surveys or for which more ongoing information is needed. As “polls,” the surveys will not require the assembly or compilation of information. The polls will focus on a small subset of items that might cover such topics as perceptions of districts’ reform-related needs or perceptions of state support. Because the first poll is intended to address an issue of particular relevance to the period between the first and second annual surveys, this OMB package includes four options, each of equal burden, in Appendix D. Which of these four polls will be used will depend on which topic appears to be most policy relevant prior to the fielding date.



A.3 Use of Improved Information Technology to Reduce Burden

We will administer the district and school surveys via the web so that they are easily accessible to respondents. This will not only save money on postage, coding, keying, and cleaning the survey data but is also, we have found, a preferred method of survey completion among many respondents. Burden will be reduced with the use of skip patterns and prefilled information based on responses to previous items when appropriate.


The web-based surveys will also facilitate completion of the surveys by multiple respondents, so that the most appropriate individuals will be able to access and provide the data in their areas of expertise. This approach will reduce burden for respondents because (a) each individual will have fewer questions to answer and (b) respondents will be asked questions concerning topics in which they are well versed, so answers should be readily available.


For respondents who choose not to use the web-based survey, paper and telephone options will be offered as part of the nonresponse follow-up effort. Thus, if paper and telephone methods are needed to achieve a high response rate, they will be used.



A.4 Efforts to Identify and Avoid Duplication

Our identification and avoidance of duplication falls into two categories: (1) extensive use of extant data in place of new data collection, and (2) analysis of other large-scale surveys about ARRA.


Use of Extant Data


In section A.2, we detailed sources of existing data and described how we plan to use these data for sampling and for reporting.


Analysis of Other Surveys about ARRA

While we are aware of other studies focused on ARRA education funds, they suffer from various limitations, including one-time collection efforts, a focus on a single ARRA grant program, lack of representative sampling, problems in obtaining high response rates, and no linkages across the various educational levels.


The Integrated Evaluation is unique in multiple ways. First, the study stands out as the only evaluation that will survey the universe of states and nationally representative samples of districts and schools that are large enough to detect differences between states (and districts) and relationships between funding and outcomes. For example, the American Association of School Administrators’ August 2009 survey, “Schools and the Stimulus: How America’s Public School Districts Are Using ARRA Funds,” asked interesting questions concerning the use of ARRA funds to fill budget holes. However, only 160 administrators were surveyed in all, and one in four states was not represented in the sample.


Second, the Integrated Evaluation offers a unique opportunity to document the flow of funds and implementation efforts from states, through districts, to schools. As far as we know, there are no other surveys of schools, which is where much of the actual implementation and success of reform efforts will take place. While the Center on Education Policy (CEP) is surveying states and districts, a relatively small number of districts was surveyed (fewer than one-sixth the number that will be surveyed for this study), and not all states were included. Therefore, the CEP surveys do not provide the opportunity for nested analyses or for obtaining as comprehensive a picture of communication as the Integrated Evaluation will. In addition, while the CEP asked only very basic questions concerning the four assurances (focusing instead on jobs saved by ARRA funds), the Integrated Evaluation is the only study we are aware of that will examine in more detail the type and stage of strategies being implemented. We will, however, consider the possibility of repeating some items from other surveys that have already been administered in order to create an earlier baseline for some measures.


Finally, we are reviewing reports from the Government Accountability Office (GAO) regarding ARRA. For example, a recent GAO study, “Recovery Act: Opportunities to Improve Management and Strengthen Accountability over States’ and Localities’ Uses of Funds” (GAO-10-999, September 20, 2010), provides information on the uses of, and accountability for, ARRA funds in selected states and localities. This information will inform survey development.



A.5 Efforts to Minimize Burden on Small Business or Other Entities

No small businesses will be involved as respondents. Every effort will be made to minimize the burden on respondents. As described in section A.3, we will administer the district and school surveys via the web so that they are easily accessible to respondents. Burden will be reduced with the use of skip patterns and prefilled information based on responses to previous items when appropriate. The web-based surveys will also facilitate completion by multiple respondents, so that the most appropriate individuals can provide the data in their areas of expertise. This approach will reduce burden for respondents because (a) each individual will have fewer questions to answer and (b) respondents will be asked questions concerning topics in which they are well versed, so answers should be readily available.



A.6 Consequences of Less-Frequent Data Collection

The data collection plan described in this submission is necessary for ED to conduct a rigorous national evaluation of ARRA funding and implementation progress. Although ED is required to obligate all ARRA funds by September 30, 2010, depending on the specific program, states and districts will have anywhere from one year to several years to use the funds. Moreover, a key question for the study is whether the activities undertaken while additional funding was available continue after those funds disappear. For these reasons, annual surveys until at least 2013 are critical.



A.7 Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations

There are no special circumstances associated with this data collection.



A.8 Federal Register Comments and Persons Consulted Outside the Agency

A notice was published in the Federal Register on October 6, 2010 (Vol. 75, page 61710). One public comment was received, from the State of California; responses to the specific concerns raised appear in Appendix F.


A Technical Working Group (TWG) has been assembled for this study. The current TWG members are listed below. Additional consultation may be sought during later phases of the study (e.g., data analysis).


Thomas Cook, Northwestern University

Margaret Goertz, University of Pennsylvania

Jack Jennings, Center on Education Policy

Sharon Lohr, Arizona State University

Rachel Tompkins, Rural School and Community Trust

Marilyn Troyer, Ohio Department of Education



A.9 Payments to Respondents

There will be no payments with regard to the collection of the survey data.



A.10 Assurance of Confidentiality

Other than the names and contact information of the respondents, which is information typically already available in the public domain (e.g., on state and district websites), no data collected for this survey will contain personally identifiable information. While some basic summary information on funding and implementation is likely to be displayed by state, no names or contact information will be released.


Responses will be used for research or statistical purposes. Participation is voluntary.


The following language will be included on the cover sheet of each survey: Information collected for this study comes under the confidentiality and data protection requirements of the Institute of Education Sciences (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Information that could identify an individual or institution will be separated from the submitted survey responses, kept in secured locations, and destroyed as soon as it is no longer required. Survey responses will be used only for research purposes. The reports prepared for the study will summarize findings across individuals and institutions and will not associate responses with a specific district, school, or person. We will not provide information that identifies district or school respondents to anyone outside the study team, except as required by law.


The Education Sciences Reform Act of 2002, Title I, Part E, Section 183, requires “All collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).” Respondents will be assured that confidentiality will be maintained, except as required by law. Specific steps to guarantee confidentiality include the following:


  • Identifying information about respondents (e.g., respondent name, address, and telephone number) will not be entered into the analysis data file, but will be kept separate from other data and will be password protected. A unique identification number for each respondent will be used for building raw data and analysis files.

  • A fax machine used to send or receive documents that contain confidential information will be kept in a locked field room, accessible only to study team members.

  • Confidential materials will be printed on a printer located in a limited access field room. When printing documents that contain confidential information from shared network printers, authorized study staff will be present and retrieve the documents as soon as printing is complete.

  • In public reports, findings will be presented in aggregate by type of respondent or for subgroups of interest. No reports will identify individual respondents or local agencies.

  • Access to the sample files will be limited to authorized study staff; no others will be granted such access.

  • All members of the study team will be briefed regarding confidentiality of the data.

  • A control system will be in place, beginning at sample selection, to monitor the status and whereabouts of all data collection instruments during transfer, processing, coding, and data entry. This includes sign-in/sign-out sheets and the hand-carrying of documents by authorized project staff only.

  • All data will be stored in secure areas accessible only to authorized staff members. Computer-generated output containing identifiable information will be maintained under the same conditions.

  • When any hard copies containing confidential information are no longer needed, they will be shredded.



A.11 Questions of a Sensitive Nature

Questions of a sensitive nature will not be asked in any of the three surveys or in the polls.



A.12 Estimates of Respondent Burden

We will administer the initial surveys to respondents in:


  • The 50 states and the District of Columbia,

  • 1,700 sampled school districts, and

  • 3,800 sampled schools (within the sampled school districts).


In all, responses will be required in spring 2011 from 5,551 respondents (51 state officials, 1,700 district officials, and 3,800 school officials). Although we expect that at the state and district levels more than one respondent may complete the survey, we are estimating the burden as one respondent per state/district times the number of minutes for the total survey. We estimate that it will take (1) state and district respondents an average of 75 minutes for the surveys, (2) school officials 45 minutes for the survey, and (3) district respondents 10 minutes for the poll, for a total burden of 319,325 minutes, or approximately 5,322 hours (see Table A-3 below).


Table A-3. Estimates of Respondent Burden

Respondent | Anticipated number completed (a) | Minutes per completion (b) | Burden in minutes (c = a × b) | Burden in hours (c/60) | Burden in dollars

State official | 51 | 75 | 3,825 | 63.75 | $2,868.75
District official (survey) | 1,700 | 75 | 127,500 | 2,125.00 | $95,625.00
District official (poll) | 1,700 | 10 | 17,000 | 283.33 | $12,750.00
School official | 3,800 | 45 | 171,000 | 2,850.00 | $128,250.00
Total burden | 5,551 | – | 319,325 | 5,322.08 | $239,493.75

NOTE: Assumes an hourly rate of $45 (from the Bureau of Labor Statistics’ Occupational Employment Statistics for education administrators, May 2009). Hours are rounded to two decimal places; dollar figures are computed from unrounded hours.


The burden from the already approved recruitment package (1,509 burden hours) will be carried over and added to the burden requested (5,322 burden hours) for the baseline data collection package. Therefore, the total annual burden will be 6,831 burden hours.
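As a cross-check, the burden arithmetic in Table A-3 can be reproduced directly from the respondent counts and per-completion times given above. The following is a minimal illustrative sketch (Python is used here purely for illustration; it is not part of the study's systems):

```python
# Reproduces the burden arithmetic in Table A-3 from the counts in the text.
HOURLY_RATE = 45.0  # BLS OES rate for education administrators, May 2009

rows = [
    ("State official",             51,    75),
    ("District official (survey)", 1_700, 75),
    ("District official (poll)",   1_700, 10),
    ("School official",            3_800, 45),
]

total_minutes = total_hours = total_dollars = 0.0
for name, completions, minutes_each in rows:
    minutes = completions * minutes_each
    hours = minutes / 60
    dollars = hours * HOURLY_RATE
    total_minutes += minutes
    total_hours += hours
    total_dollars += dollars
    print(f"{name:27s} {minutes:>8,.0f} min {hours:>9,.2f} h ${dollars:>11,.2f}")

print(f"{'Total':27s} {total_minutes:>8,.0f} min {total_hours:>9,.2f} h ${total_dollars:>11,.2f}")
# Total: 319,325 minutes, 5,322.08 hours, $239,493.75
```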


A.13 Estimates of the Cost Burden to Respondents

There are no annualized capital/startup or ongoing operation and maintenance costs associated with collecting the information.



A.14 Estimates of Annualized Government Costs

The amount for the design, conduct of three annual surveys and two polls, analysis, and reporting for this evaluation is $8,440,922. The annualized cost is $2,110,230.



A.15 Changes in Hour Burden

The first submission reflected the hour burden for recruitment (1,509 burden hours). This second submission reflects the hour burden for conducting a state survey in spring 2011, a district survey in spring 2011, a school survey in spring 2011, and a district poll in fall 2011. So, this submission reflects a program change of 5,322 hours.


A.16 Time Schedule, Publication, and Analysis Plan

We will produce evaluation reports for policy makers and practitioners and generate useful annual tabulations for individual states. In writing reports, we will follow the principles of the Federal Plain Language Action and Information Network and adhere to the requirements of the NCES Statistical Standards (2002), the IES Style Guide (2005), and other IES guidance and requirements for public reporting.

Each evaluation report will answer a clearly established set of questions using both extant sources of data and information from the state, district, and school surveys. Each report will start with an outline of highlights. Then, for each question, the report will include a discussion of the context for understanding the findings, the data sources used and their limitations, the data collection methodology, the analyses conducted, and the findings. Appendices will provide more detailed information about, for example, the purpose of the evaluation and its design, the approaches to data collection, the sampling methodology, and survey response rates.


Table A-4 summarizes plans for tabulating data and publishing reports to address the policy/research questions.


Table A-4. Reporting Schedule

Distribution of funding report (Summer 2011)
Content: Early distribution of ARRA funding, including descriptive information on the characteristics of states and districts that received funds.
Data sources: ARRA grant awards, combined with extant data.

Baseline survey report (Spring 2012)
Content: Overview of pre-ARRA and early ARRA funding and implementation strategies.
Data sources: The 2011 surveys.

State tabulations (Summer 2012)
Content: State-specific reports providing aggregate survey data for the sampled districts in each state; will not be state representative.
Data sources: Distribution of funding report and baseline survey report.

Early implementation report (Spring 2013)
Content: Policy/research questions 1-3, expanding upon state, district, and school strategies implemented under ARRA.
Data sources: Funding applications, performance reports, state websites, 2011 survey.

State tabulations (Summer 2013)
Content: State-specific reports providing aggregate survey data for the sampled districts in each state; will not be state representative.
Data sources: Early implementation report and 2012 survey.

State tabulations (Summer 2014)
Content: State-specific reports providing aggregate survey data for the sampled districts in each state; will not be state representative.
Data sources: 2013 survey.

Final report (August 2014)
Content: Summative report covering all aspects of the evaluation, including baseline, implementation progress, and outcomes.
Data sources: Extant data, annual reports, surveys, funding applications, state websites.


The series of evaluation reports described above will be supported by analyses with two main objectives: (1) the largest effort will be to describe the allocation of ARRA education funds at the state, district, and school levels and the extent to which ARRA reform strategies are being implemented; and (2) a smaller but important effort will link “exposure” to ARRA reforms (direct and indirect) to improvements in student outcomes and other important policy goals. Each set of planned analyses is described below.


It is important to note that the analyses described below primarily focus on our plans for the initial round of state, district, and school surveys. The third and final OMB package, requesting clearance for follow-up surveys, will further detail plans for longitudinal data analyses.


ARRA Funding and Implementation


A primary goal of the evaluation will be to document where the ARRA funds went (i.e., how much money, from individual programs and overall, states and school districts received), how they were used (i.e., which general and specific reform strategies were adopted and what implementation processes took place to enact change), and the extent to which states, districts, and schools are implementing these strategies regardless of funding source. To achieve this goal, in-depth descriptive analyses will be used to answer research questions focused on funding, reform strategies, and implementation (detailed in section A.1).


While simple descriptive statistics such as means and percentages will provide answers to many of our questions, cross-tabulations will be important for providing policy-relevant information. Cross-tabulations will also be important for illustrating the distribution of funds and adopted reform strategies across states and districts with varying characteristics. Our use of stratification (and oversampling when necessary) in the sample design will allow for certain subgroup comparisons, and additional cross-tabulations will be made based on other variables. Comparisons will include the following (an illustrative sketch of such a cross-tabulation appears after this list):


  • States will be stratified on whether the state received RTT funding (RTT) or not (non-RTT), to examine issues of within-state coherence in implementation priorities or the types and extent of state assistance provided to districts.


  • Districts will be stratified on high and low poverty, and on urbanicity (central city, urban fringe, town, and rural). We focus on poverty because of the Federal Government’s traditional focus on helping to mediate the effects of local funding constraints on educational opportunity. We focus on urbanicity because of the relationships between educational opportunity and rural isolation and the concentration of poverty in urban schools.


  • Schools will be stratified on school level (elementary, middle, and high), school performance level (persistently lowest-achieving (PLA) schools, Title I schools in need of improvement (SINI) that are not PLA, and all other schools), and school size (small, medium, and large). Schools with concentrations of low achieving students are a particular focus of the Elementary and Secondary Education Act (ESEA) and ARRA funding, and we expect the strategies used under the ARRA programs and possibly their implementation to differ by grade level. We hypothesize that small schools may need to adopt different reform strategies than do large schools.


  • Other comparisons of interest include the degree of coordination between states and districts, the proportion of state funds devoted to the strategies being used, differences in state governance structure (e.g., top-down governance versus a more locally focused structure), and variations in the average level of ARRA funding (i.e., do states with relatively “higher” per student ARRA funding levels make different choices than relatively lower funded states).
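To make the cross-tabulation approach concrete, here is a minimal sketch using synthetic data; the column names, values, and use of pandas are illustrative assumptions, not study specifications:

```python
import pandas as pd

# Hypothetical district-level analysis file; all names and values are
# illustrative, not actual study data.
districts = pd.DataFrame({
    "poverty_stratum": ["high", "high", "low", "low", "high", "low"],
    "adopted_new_evaluation_system": [1, 0, 1, 1, 0, 0],
    "sampling_weight": [120.5, 98.0, 210.3, 185.7, 101.2, 220.9],
})

# Weighted percentage of districts adopting the strategy, by poverty stratum:
# the weight-sum of adopters divided by the total weight in each stratum.
weighted_pct = districts.groupby("poverty_stratum").apply(
    lambda g: 100.0
    * (g["adopted_new_evaluation_system"] * g["sampling_weight"]).sum()
    / g["sampling_weight"].sum()
)
print(weighted_pct)
```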


With subsequent rounds of data collection after the 2011 surveys (for which we are currently requesting approval), additional types of tabulations will be possible, including those examining implementation change over time.


Because a statistical sample is used, survey data presented for districts and schools will be weighted to national totals (tabulations will therefore provide standard errors for the reported estimates). In addition, the descriptive tables will indicate where differences between subgroups are statistically significant. We will use chi-squared tests to test for significant differences among distributions and t-tests for differences in means. Tabulations will be included in the baseline, early implementation, and final reports where appropriate.
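As a simple illustration of these tests, the sketch below runs a chi-squared test and a t-test on synthetic, unweighted data; real analyses of the study's complex sample would apply design-adjusted procedures (e.g., Rao-Scott corrections) in specialized survey software:

```python
import numpy as np
from scipy import stats

# Illustrative only: chi-squared test of independence on an unweighted 2x3
# contingency table (strategy adoption by urbanicity); counts are synthetic.
table = np.array([
    [120, 85, 60],   # adopted
    [180, 140, 95],  # not adopted
])
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")

# Two-sample t-test for a difference in means (e.g., mean ARRA funding per
# pupil in RTT vs. non-RTT states), again ignoring design effects for brevity.
rtt = np.array([410.0, 395.5, 440.2, 388.1])
non_rtt = np.array([360.3, 372.8, 341.9, 399.0, 355.4])
t, p_t = stats.ttest_ind(rtt, non_rtt, equal_var=False)
print(f"t = {t:.2f}, p = {p_t:.3f}")
```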


The types of data tabulations that we will prepare and report using the baseline survey data are illustrated in Tables A-5 – A-10, provided at the end of this section. Tables A-5 and A-6 are examples of table shells using the SEA baseline survey data. Tables A-7 and A-8 are examples of table shells using the LEA baseline survey. Tables A-9 and A-10 use the school baseline survey.


Link between ARRA Funds, Strategies, and Outcomes


This evaluation, while not an impact study, will examine the relationships between funding, strategies, and outcomes. These analyses clearly cannot support causal conclusions about the effects of ARRA funding or particular strategies, but if a statistically significant association or correlation is observed, the results will provide support for hypotheses about their benefits. It is important to note that, with some ARRA reform grants awarded to states and districts only in summer 2010, it may take time before guidance and funds trickle down to schools, where many key activities are expected to take place. Thus, trends in outcomes that could plausibly be associated with ARRA may not be observed until later in our data collection period or beyond.


We have planned or are considering four different types of relational analyses:


  1. Descriptive. The first is a straightforward set of descriptive tables that will show the relationships between outcomes (see below) and ARRA funding levels, strategies used for reform by reform area, and measures of implementation. Breakdowns by the previously discussed state, district, or school characteristics will further sharpen the interpretation of these relationships. A possible extension will involve the development of common strategy “bundles” (collections of reform strategies that together are targeted toward a particular reform area) and comparisons of outcomes across groups of schools or districts that appear to be using different strategy bundles.

  2. Dose Response. The second will be a “dose response” analysis that will investigate the relationship between the level of ARRA exposure (the “dose”) and variation in student achievement outcomes as measured by state assessments in reading and mathematics. This analysis will use propensity score methods that have recently been developed to examine dose-response functions when the “treatment” is measured as a continuous variable (see Imbens, 2000; Hirano & Imbens, 2004; Imai & van Dyk, 2004).3 Here the dosage would be one of the “intensity” measures described below; a minimal illustrative sketch follows this list.

  3. Interrupted Time Series. An interrupted time series is a common, though less rigorous, way to relate changes in outcomes to the introduction of a new program. Because ARRA funding was disbursed over time rather than at a single point, this type of analysis could be challenging, but we will explore its feasibility.

  4. Modeling. Finally, we will pursue an exploratory analysis to see if we can model the relationship between reform strategies and outcomes, taking into account the multiple levels of ARRA activity (state, district, and school).
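
To make the dose-response analysis in item 2 concrete, the following is a minimal sketch of the generalized propensity score approach of Hirano and Imbens (2004) using simulated data. The linear specifications, variable names, and simulated values are illustrative assumptions, not the study's final model.

    # Sketch of a Hirano-Imbens generalized propensity score (GPS)
    # dose-response analysis on simulated data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    X = rng.normal(size=(n, 3))                         # covariates (hypothetical)
    dose = X @ [0.5, -0.3, 0.2] + rng.normal(size=n)    # ARRA dollars per student (scaled)
    y = 0.4 * dose + X @ [0.2, 0.1, -0.1] + rng.normal(size=n)  # outcome gain

    # Step 1: model the continuous treatment given covariates; the GPS is
    # the conditional density of the observed dose.
    t_model = sm.OLS(dose, sm.add_constant(X)).fit()
    sigma2 = t_model.scale

    def gps(t, mu):
        return np.exp(-(t - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

    R = gps(dose, t_model.fittedvalues)

    # Step 2: model the outcome as a flexible function of dose and GPS.
    D = np.column_stack([dose, dose ** 2, R, R ** 2, dose * R])
    y_model = sm.OLS(y, sm.add_constant(D)).fit()

    # Step 3: average predicted outcomes at fixed dose levels to trace out
    # the dose-response function E[Y(t)].
    for t in np.percentile(dose, [10, 25, 50, 75, 90]):
        Rt = gps(t, t_model.fittedvalues)
        Dt = np.column_stack([np.full(n, t), np.full(n, t ** 2), Rt, Rt ** 2, t * Rt])
        mu_t = y_model.predict(sm.add_constant(Dt, has_constant="add")).mean()
        print(f"dose = {t:+.2f}   E[Y(t)] = {mu_t:+.3f}")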

These analyses will focus on outcomes for which we can readily obtain data across the different funding streams and levels of grant recipients: (1) improvements in test scores (including proficiency levels, achievement gaps, and school improvement status); and (2) improvements in high school graduation rates. We recognize that state tests and proficiency cutoff points vary in rigor. For this reason, we propose a longitudinal analysis that examines improvements in achievement as defined within each state rather than attempting comparisons across states. For particular funding streams (SFSF, RTT, SIG), we may also be able to augment these outcomes with data on rates of college application and/or enrollment and on a measure of the equity of the distribution of effective teachers, depending on the reliability of reported indicator data.
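
As a small illustration of this within-state framing, the sketch below computes year-over-year changes in proficiency separately for each state, so that each state's gains are measured against its own assessment and cutoff definitions; the figures are hypothetical.

    # Within-state year-over-year proficiency gains (hypothetical data).
    import pandas as pd

    scores = pd.DataFrame({
        "state": ["AZ", "AZ", "AZ", "GA", "GA", "GA"],
        "year":  [2009, 2010, 2011] * 2,
        "pct_proficient_math": [61.0, 63.5, 66.0, 70.0, 70.5, 72.0],
    })

    # Differences are taken within state only; no cross-state comparisons.
    scores["gain"] = scores.groupby("state")["pct_proficient_math"].diff()
    print(scores)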


Conceptually, these analyses posit that the greater the “intensity” of ARRA exposure, the larger the observed improvements in key outcomes (e.g., greater gains in students’ academic proficiency rates or in high school graduation rates over time). But what do we mean by “intensity” of exposure? We are considering two possible measures:


  • ARRA funding levels: The most obvious measure of intensity is the amount of federal resources provided to states, districts, and schools (here we refer both to direct grant dollars and to “monetized” staff participation). Some theorize that more resources provided per student (or per school) should be associated with greater relative improvements in student outcomes. A complication of using this type of intensity measure is that ARRA funding may merely substitute for a decline in other (state or district) funding sources, and the degree of substitution may vary across our units of analysis. Thus, what appears to be a high level of ARRA funding support in some locations may not reflect a high level of overall education funding. We will attempt to control for this confounding using both survey-based and extant-data measures of fiscal distress.

  • Breadth and depth of exposure: A second way to think about intensity is to develop an index that captures (within states and districts, and for individual schools) the strength of the reform effort that can be related to ARRA. By breadth we mean the rate of coverage of districts, schools, and individuals (leaders, teachers, and students); by depth, we mean the force with which reform strategies are applied to participants, e.g., the duration of professional development.

Our aim is to assess variation in the strength of ARRA as a driver of reform and then to determine whether that variation is associated with changes in important educational outcomes; a sketch of one possible index construction follows.
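
To illustrate the second measure, the following sketch builds a simple breadth/depth composite from hypothetical district data. The components, their standardization, and the equal weighting are placeholder assumptions; the study's operational index may well differ.

    # Sketch of a breadth/depth "intensity of exposure" index (hypothetical data).
    import pandas as pd

    districts = pd.DataFrame({
        "district_id":          [1, 2, 3],
        "pct_schools_reached":  [0.9, 0.4, 0.7],   # breadth: share of schools covered
        "pct_teachers_in_pd":   [0.8, 0.3, 0.5],   # breadth: share of teachers in PD
        "pd_hours_per_teacher": [40, 10, 25],      # depth: duration of PD
        "arra_per_pupil":       [310.0, 120.0, 205.0],
    })

    # Standardize each component so all contribute on a common scale.
    components = ["pct_schools_reached", "pct_teachers_in_pd",
                  "pd_hours_per_teacher", "arra_per_pupil"]
    z = districts[components].apply(lambda c: (c - c.mean()) / c.std())

    # Equal-weight composite; principal components or separate breadth and
    # depth subindices are plausible alternatives.
    districts["intensity_index"] = z.mean(axis=1)
    print(districts[["district_id", "intensity_index"]])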



Table A-5. Percentage distribution of states by implementation status for strategies related to adopting new standards and aligning assessments during the 2010-2011 school year, by RTT status

(Illustrative table shell. For each strategy below, the table will report, separately for RTT states and non-RTT states, the percentage of states in each implementation status: % Not Planned, % In Planning, and % In Use. Cells will be completed with Spring 2011 SEA Survey data.)

Implementing New State Standards

Professional development for teachers focused on new state standards adopted since January 2009 for:
  Mathematics
  Reading/English language arts
  Science and/or social studies

Professional development for teachers focused on helping:
  English Language Learners (ELL) master new state standards
  Special education students master new state standards

Instructional materials aligned with new state standards (e.g., selection and/or development of curriculum guides, pacing guides, etc.) for:
  Mathematics
  Reading/English language arts
  Science and/or social studies
  English Language Learners (ELL)
  Special education students

Implementing Assessments Aligned with New State Standards

Assessments in core academic subjects aligned with new state standards (e.g., development and adoption of new assessments) in:
  Mathematics
  Reading/English language arts
  Science and/or social studies

Assessments aligned with new state standards (e.g., development and adoption of new assessments) for:
  English Language Learners (ELL)
  Special education students

Professional development to prepare teachers to use data from new assessments to improve instruction

Professional development to prepare principals and other school leaders to use data from new assessments in school improvement planning

Using Assessment Data and Assessment Systems

Professional development focused on improving instruction by using data from:
  State assessments
  District assessments
  Locally developed formative assessments

Facilitate local access to and use of state data systems

Link local data systems to state data systems

Source: U.S. Department of Education, Institute of Education Sciences, Integrated Evaluation of ARRA Funding, Implementation, and Outcomes: Spring 2011 SEA Survey.


Tabulations would also be done for other reform areas and for other state classification variables, such as Census region.

Table A-6. Percentage distribution of states by SEA reform priorities for 2011-2012, by RTT status and level of priority

(Illustrative table shell. For each activity below, the table will report, separately for RTT states and non-RTT states, the percentage of states assigning each priority level: % Highest Priority, % High Priority, % Medium Priority, and % Low Priority. Cells will be completed with Spring 2011 SEA Survey data.)

Development or implementation of:
  New state content standards in reading/English/language arts and/or mathematics
  New content standards in other subjects
  New summative assessments
  New formative assessments
  On-line data systems that provide information on student achievement growth or gains
  Improved ways to recruit and hire effective educators
  Improved educator induction programs
  Evaluation systems that rely in part on value-added or growth models to hold teachers accountable for improved student outcomes
  Performance-based compensation systems for educators
  Incentives or programs to attract and retain highly-qualified educators in the LEA’s low-performing schools
  Programs or strategies to improve the performance of the LEA’s low-performing schools

Source: U.S. Department of Education, Institute of Education Sciences, Integrated Evaluation of ARRA Funding, Implementation, and Outcomes: Spring 2011 SEA Survey.



Within each state, the priority level for each activity identified by the SEA, LEAs, and schools will be compared to determine the level of agreement among these entities on reform priorities. We will compare this level of agreement for RTT versus non-RTT states, for urban versus rural and high- versus low-poverty districts, and, where possible, by ARRA program stream participation.
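
As an illustration of one way such agreement might be quantified, the sketch below computes exact and chance-corrected agreement between hypothetical SEA and LEA priority ratings; the weighted kappa statistic is our illustrative choice, not a method specified by the study.

    # Agreement between state and district priority ratings (hypothetical).
    from sklearn.metrics import cohen_kappa_score

    # Ratings (0 = low ... 3 = highest) for the same list of reform activities.
    sea = [3, 3, 2, 1, 2, 0, 1, 3]
    lea = [3, 2, 2, 1, 1, 0, 2, 3]

    exact = sum(s == d for s, d in zip(sea, lea)) / len(sea)
    kappa = cohen_kappa_score(sea, lea, weights="quadratic")
    print(f"exact agreement = {exact:.2f}, weighted kappa = {kappa:.2f}")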

Table A-7. Percentage distribution of districts by implementation status for strategies related to implementing new standards and assessments during the 2010-2011 school year, by RTT status

(Illustrative table shell. For each strategy below, the table will report, separately for districts in RTT states and districts in non-RTT states, the percentage of districts in each status: % Not Planned, % In Planning/Development, % Available to All Schools, and % Available to Some Schools. Cells will be completed with Spring 2011 LEA Survey data.)

Implementing New State Standards

Instructional materials (e.g., curriculum guides, curriculum frameworks, pacing guides) aligned with new state standards that were developed for:
  The district
  The state

A school-site instructional specialist or coach to support instruction tied to new state standards for:
  Mathematics
  Reading/English/language arts
  Science or social studies
  English Language Learners (ELL)

Criteria for schools to use when selecting a new curriculum aligned with new state standards

On-line access for educators to professional development programs that are aligned with new state standards

Implementing Assessments and Data Systems

District summative assessments in:
  Non-NCLB tested grades
  Non-NCLB tested subjects

Formative student assessments to aid teachers in adapting instruction to students’ needs

Assuring that tests are vertically scaled across grades to better measure student growth

Teachers have on-line access to individual student results from:
  State summative assessments
  District summative assessments
  Formative assessments

Teachers have on-line access to students’ demographic information, attendance, or discipline data linked to student assessment data

Provide teachers and principals with computers for use in accessing district student data systems

Source: U.S. Department of Education, Institute of Education Sciences, Integrated Evaluation of ARRA Funding, Implementation, and Outcomes: Spring 2011 LEA Survey.

Tabulations would also be done for other reform areas and for other district classification variables such as high versus low poverty districts and urban versus rural districts.

Table A-8. Percentage distribution of districts by LEA reform priorities for 2011-2012, by RTT status and level of priority

(Illustrative table shell. For each activity below, the table will report, separately for districts in RTT states and districts in non-RTT states, the percentage of districts assigning each priority level: % Highest Priority, % High Priority, % Medium Priority, and % Low Priority. Cells will be completed with Spring 2011 LEA Survey data.)

Development or implementation of:
  New state content standards in reading/English/language arts and/or mathematics
  New content standards in other subjects
  New summative assessments
  New formative assessments
  On-line data systems that provide information on student achievement growth or gains
  Improved ways to recruit and hire effective educators
  Improved educator induction programs
  Performance evaluation systems that hold educators accountable for improved student outcomes
  Performance-based compensation systems for educators
  Incentives or programs to attract and retain highly-qualified educators in the LEA’s low-performing schools
  Programs or strategies to improve the performance of the LEA’s low-performing schools

Source: U.S. Department of Education, Institute of Education Sciences, Integrated Evaluation of ARRA Funding, Implementation, and Outcomes: Spring 2011 LEA Survey.


Tabulations would also be done for other breakdowns, for example, high versus low poverty districts and urban versus rural districts.





Table A-9. Percentage distribution of schools by implementation status related to adopting new standards and assessments during the 2010-2011 school year, by RTT status

(Illustrative table shell. For each strategy below, the table will report, separately for schools in RTT states and schools in non-RTT states, the percentage of schools in each status: % Not in Use, % Pilot Testing, and % Implementing. Cells will be completed with Spring 2011 School Survey data.)

Implementing New State Standards

A new curriculum aligned with new state standards for:
  Mathematics
  Reading/English/language arts
  Science or social studies

A curriculum specifically focused on English Language Learner (ELL) students’ needs in meeting new state standards

New curricula selected:
  From an approved list provided by the state or district
  Based on state or district guidance

Teachers have instructional materials aligned with new state standards for at least some subjects/grades

An instructional specialist or coach to support instruction tied to new state standards in:
  Mathematics
  Reading/English/language arts
  Science or social studies
  English Language Learners (ELL)

Professional development on the new standards for:
  Teachers, on how to apply them in their classrooms
  Instructional coaches and/or mentors, to develop skills to help teachers with the new standards
  The principal, on how to monitor their classroom application

Educators have on-line access to professional development programs aligned with new state standards

Source: U.S. Department of Education, Institute of Education Sciences, Integrated Evaluation of ARRA Funding, Implementation, and Outcomes: Spring 2011 School Survey.


Tabulations would also be done for other breakdowns, for example schools in high versus low poverty districts, schools in urban versus rural districts, and persistently low performing (PLA) schools versus non-PLA schools.

Table A-10. Percentage distribution of schools by school reform priorities for 2011-2012, by RTT status and level of priority

(Illustrative table shell. For each activity below, the table will report, separately for schools in RTT states and schools in non-RTT states, the percentage of schools assigning each priority level: % Highest Priority, % High Priority, % Medium Priority, and % Low Priority. Cells will be completed with Spring 2011 School Survey data.)

Implementation of:
  New content standards in reading/English/language arts and mathematics
  New content standards in other subjects
  New summative assessments
  New formative assessments
  On-line data systems that provide information on student learning growth or gains
  Improved ways to recruit and hire effective educators
  Improved educator induction programs
  Performance evaluation systems that hold educators accountable for improved student outcomes
  Performance-based compensation systems for educators
  Incentives or programs to attract and retain effective educators
  School restructuring or reorganization
  Strategies for improving instruction or related student services

Source: U.S. Department of Education, Institute of Education Sciences, Integrated Evaluation of ARRA Funding, Implementation, and Outcomes: Spring 2011 School Survey.


Tabulations would also be done for other breakdowns, for example, schools in high versus low poverty districts, schools in urban versus rural districts, and persistently low performing (PLA) schools versus non-PLA schools.




A.17 Display of Expiration Date for OMB Approval

The Institute of Education Sciences is not requesting a waiver for the display of the OMB approval number and expiration date. The recruitment letters will display the expiration date for OMB approval.



A.18 Exceptions to Certification Statement

This submission does not require an exception to the Certificate for Paperwork Reduction Act (5 CFR 1320.9).


1 The degree to which the State Fiscal Stabilization Fund and the bolstering of already established programs (e.g., IDEA) successfully saved and created education jobs is of policy interest as well. However, (a) this topic has been examined in other forums, and (b) by the time this study is fielded, funds tied to job retention and creation are unlikely to still be available to states, districts, and schools.



2 If additional evaluation resources are available, IES may consider an additional round of data collection in 2014 to more fully capture how implementation efforts change after ARRA funds are spent down.

3 Imbens, G. (2000). The role of the propensity score in estimating dose-response functions. Biometrika, 87(3): 706-710. Hirano, K., and Imbens, G. (2004). The propensity score with continuous treatments. In A. Gelman and X-L. Meng (Eds.), Applied Bayesian Modeling and Causal Inference from Incomplete-Data Perspectives. John Wiley and Sons, Ltd. Imai, K., and van Dyk, D. (2004). Causal inference with general treatment regimes: Generalizing the propensity score. Journal of the American Statistical Association, 99(467): 854-866.
