
Integrated Evaluation of American Recovery and Reinvestment Act (ARRA) Funding, Implementation and Outcomes

OMB: 1850-0877


Integrated Evaluation of ARRA Funding, Implementation and Outcomes



Statement for Paperwork Reduction Act Submission


PART A: Justification



Contract ED-IES-10-CO-0042







February 2012








Part A: Justification

This package is the third of three for the Integrated Evaluation of ARRA Funding, Implementation, and Outcomes. Our initial request sought approval for execution of a sampling plan and recruitment of the selected sites; approval for these activities was received on January 13, 2011 (see 1850-0877 v.1 (4385)). Our second request sought approval for an initial round of data collection, including surveys of all states and a nationally representative sample of districts and schools in spring 2011; approval for baseline data collection was received on April 5, 2011 (see 1850-0877). This third and final package requests approval to conduct follow-up surveys in 2012 with the same respondents who were sampled and surveyed in 2011.


Please note that this OMB package is identical to the OMB package that was approved for baseline data collection; the overall purpose of the study and the primary data collection activities remain unchanged, with a few exceptions that are discussed below:


(1) We will no longer conduct the final round of follow-up surveys planned for 2013. This decision is related to a newly awarded IES contract focused on examining the implementation of Title I/II program initiatives and to IES’ interest in coordinating efforts to reduce burden on state, district, and school respondents. Like the ARRA Evaluation, the Title I/II study will involve nationally representative surveys examining the implementation of reform efforts. Therefore, IES is coordinating the two studies so that they are mutually informative and will avoid duplication of effort by fielding only the Title I/II study’s surveys in 2013. This will allow IES to track key reform activities over time, while (a) avoiding the potential difficulty of seeking high response rates to a survey focused on ARRA years after the funds were distributed and (b) avoiding undue burden for respondents.


(2) Because we will not conduct the final round of data collection, a set of outcome analyses described in Part B of the second OMB package will no longer be carried out. Although we will examine relationships between ARRA funding and the implementation of reforms thought to promote achievement, it will not be feasible to examine more direct relationships between funding and achievement.


(3) Finally, we will not conduct polls in fall/winter 2011 and fall/winter 2012. This decision is based on the amount of information captured by the study’s baseline and follow-up surveys and the desire to reduce burden for respondents.


Changes to this third OMB package (as compared to the second submission), all of which stem from the decisions discussed above, are highlighted in yellow.


Introduction


On February 17, 2009, President Obama signed the American Recovery and Reinvestment Act (ARRA) into law (Pub. L. 111-5). ARRA provides an unprecedented $100 billion of additional funding for the U.S. Department of Education (ED) to administer. While the initial goal of this money is to deliver emergency education funding to states, ARRA is also being used as an opportunity to spur innovation and reform at different levels of the U.S. educational system. Specifically, ARRA requires those receiving grant funds to commit to four core reforms: (1) adopting rigorous college- and career-ready standards and high-quality assessments, (2) establishing data systems and using data to improve performance, (3) increasing teacher effectiveness and the equitable distribution of effective teachers, and (4) turning around the lowest-performing schools. Investment in these innovative strategies is intended to lead to improved results for students, long-term gains in school and local education agency (LEA) capacity for success, and increased productivity and effectiveness.


The education component of ARRA consists of several grant programs targeting states and LEAs and, in some cases, consortia led by non-profit organizations. The programs under ARRA fall into three general categories: (1) existing programs that received an infusion of funds (e.g., Individuals with Disabilities Education Act, Parts B & C; Title I; State Educational Technology grants; Statewide Longitudinal Data Systems grants); (2) a new program intended mainly for economic stabilization (i.e., the State Fiscal Stabilization Fund); and (3) newly created programs that are reform-oriented in nature. Given the number and scope of these programs, a large proportion of districts and schools across the country will receive some ARRA funding. In turn, ARRA represents a unique opportunity to encourage the adoption of school improvement focused reforms and to learn from reform initiatives as they take place.


Although ARRA funds are being disbursed through different grant programs, their goals and strategies are complementary, if not overlapping, as are the likely recipients of the funds. For this reason, an evaluative approach in which data collection and analysis occur across grant programs (i.e., an “integrated” approach), rather than separately for each set of grantees, will not only reduce respondent burden but will also provide critical information about the effect of ARRA as a whole.


Participation in the evaluation is required to maintain a benefit. The participation requirement is set out in EDGAR regulations sections 75.591 and 75.592 (see below). This expectation is communicated to states and districts receiving ED grants, letting them know that they have an obligation to cooperate with evaluation studies.


§ 75.591 Federal evaluation—cooperation by a grantee.

A grantee shall cooperate in any evaluation of the program by the Secretary.

(Authority: 20 U.S.C. 1221e–3 and 3474)

[45 FR 86297, Dec. 30, 1980]


§ 75.592 Federal evaluation—satisfying requirement for grantee evaluation.

If a grantee cooperates in a Federal evaluation of a program, the Secretary may determine that the grantee meets the evaluation requirements of the program, including §75.590.

(Authority: 20 U.S.C. 1221e–3 and 3474)


Overview of the Study


The Integrated Evaluation of ARRA Funding, Implementation and Outcomes is being conducted under the Institute of Education Sciences (IES), ED’s independent research and evaluation arm. The study is one of several that IES will carry out to examine ARRA’s effects on education (see Exhibit A-1).



Exhibit A-1. IES Evaluation of ARRA’s Effects on Education


The Integrated Evaluation is designed to assess how ARRA efforts are unfolding over time and is therefore primarily descriptive. While information will be gathered on many of the grant programs, the evaluation will focus primarily on the reform-oriented programs (e.g., Race to the Top (RTT), Title I School Improvement Grants (SIG), Investing in Innovation (i3), and the Teacher Incentive Fund (TIF)), since those are of the greatest policy interest.1 The study will support the various impact evaluations IES is conducting by providing critical context for the strategies being rigorously investigated, e.g., by documenting the relative frequency with which they are being implemented across the country, whether they are unique to the particular grant programs, and how they are being combined with other reform approaches.


To achieve these objectives, the Integrated Evaluation will draw heavily on existing information (grant funding allocations, district and school outcomes databases, and performance reporting where available) and administer new surveys to all 50 states and the District of Columbia and to a nationally representative sample of districts and schools. The first round of this two-year survey was conducted in spring 2011, and the second round will be conducted in spring 2012.


The evaluation’s theory of action, which is displayed in Figure A-2, posits that ARRA and the individual programs included in it are appropriately understood as a federal strategy for intervening in ongoing state and local education reform efforts. As the theory of action suggests, states have more or less well-defined reform agendas and priorities, many of which existed prior to ARRA. The arrows from the top and bottom boxes to the box on the left side of the display suggest that state reform priorities and strategies have been and continue to be influenced by the availability of ARRA education funds and the requirements established by the various ARRA programs.


The four ARRA assurances define the core elements of the federal strategy. The theory of action suggests that two of the four assurances, increasing educator effectiveness and equitable distribution and improving low-performing schools, are the primary foci of ARRA expectations and ARRA-supported reforms. The “School Improvement” box on the right side of the model appears as it does to suggest that ARRA has high aims for improving all schools, while at the same time targeting significant resources to improving the lowest-performing schools. Setting new standards, developing new assessments aligned with those standards, and establishing statewide longitudinal student data systems (the other two ARRA assurance areas) are important to the reform equation, but are best understood in terms of how they contribute to reforms in the other two areas.


The location of the “District Reform Priorities and Strategies” box suggests that while states exert considerable leadership in education reform, much of the work is done at the local level as district and school staff work to improve instruction. Nowhere is this more clearly demonstrated than in the implementation of the myriad strategies associated with increasing educator effectiveness and equitable distribution. These strategies include (1) designing educator preparation programs and ongoing professional development aligned with state and local performance standards; (2) designing and implementing quality induction programs for new teachers and principals; (3) designing and implementing new educator evaluation systems that include clear evidence of gains in student achievement as a criterion for effective performance; and (4) designing and implementing new systems of compensation and incentives that recognize and reward quality performance and help to ensure that highly effective educators are assigned to and continue to work in hard-to-staff schools. Together these strategies define an aligned human resource management system, which, in turn, prepares and supports educators’ efforts to improve schools, especially the lowest-performing schools. The ultimate goal of ARRA programs and the reforms they support is to improve student learning.


The left-to-right arrows connecting the boxes through the middle of the diagram, labeled “C3” as shorthand for communication, coordination, and collaboration, suggest the importance of the linkages among state, district, and school reform efforts. Understanding what each of these linking strategies looks like and how it contributes to advancing reform efforts is important to understanding the overall effectiveness of the reforms. The “Lessons” arrows connecting the boxes through the middle of the diagram from right to left convey the idea that lessons learned as implementation proceeds may lead to mid-course corrections in strategies and/or their implementation.


Figure A-2. Integrated Evaluation of ARRA Funding, Implementation, and Outcomes: Theory of Action



[Figure A-2 is a flow diagram. Boxes for “ARRA Funding, Policy, and Requirements (e.g., RTT, i3, TIF, SLDS, Tech Grants, SIG, SFSF)” and “Statewide Technology Infrastructure” feed into “State Reform Priorities and Strategies,” which connects to “District Reform Priorities and Strategies” and, through the reform areas (“Standards and Assessments,” “Data Systems,” “Educator Effectiveness and Equitable Distribution,” and “School Improvement,” including “Low Performing Schools”), to “Student Outcomes.” Left-to-right arrows labeled “C3” (communication, coordination, and collaboration) link the boxes; right-to-left arrows labeled “Lessons” represent feedback.]

The “Statewide Technology Infrastructure” box is included in the theory of action to underscore the fact that in many states the developing technology infrastructure is fast becoming part of the glue that holds reform efforts together and may also be a catalyst for increasing the pace of reform. New student data systems rely heavily on technology and on state, district, and school capacity to use the systems. Full implementation of new assessments will also depend heavily on the capacity of data systems to store data and produce timely, user-friendly reports to inform instructional planning. The new data systems are also hypothesized to facilitate reform. Increasingly, states and districts are relying on technology to store instructional materials, especially as new content standards are introduced and applied in classrooms. Finally, states and districts are increasingly relying on technology as the medium for educator professional development.


The theory of action acknowledges that state and local education reforms did not begin with ARRA. The left-to-right progression displayed in the theory of action suggests that some reforms must be completed, or at least make some progress, before others can be completed. At the same time, the theory of action probably does not adequately reflect important time dimensions of the various reforms that are underway. One task for this evaluation is to examine how long it will take to implement the planned/expected reforms and how long it will take to see results. A second task will be to examine how the pace and sequence of individual reform efforts interact in the development and implementation of new policies, programs, and practices. Work driven in substantial ways by ARRA funding and expectations is proceeding on many fronts simultaneously. Yet the reality is that some things must be completed (e.g., implementation of new standards and assessments and new data systems) before others (e.g., full implementation of new educator evaluation systems). For example, implementation of the Common Core State Standards is already underway as states and districts develop new instructional resources and provide professional development to teachers and principals to introduce the standards and explain how they can and should be applied in the classroom. The development of new assessments aligned with the standards is also underway, but new state assessments are not expected to be in place for several years. Teachers are therefore likely to face a situation in which they are teaching to the new standards while being held accountable for assessment results that reflect mastery of a different set of standards. Thus, one could argue that, despite early progress, implementation of the new standards will not be complete until the new assessments are implemented.


Similarly, many states and districts are moving quickly on the development of new educator evaluation systems that rely on student learning gains as a primary criterion for evaluating educator effectiveness. Because these systems will ultimately rely on student outcomes defined in terms of mastery of the new standards, the systems cannot be considered fully implemented until the standards have been implemented and the new assessments are in place. Finally, logic dictates that it will not be possible to gauge the full impact of ARRA on student learning and other outcomes until these complex reforms are completed. This assumption, along with the others laid out in the theory of action presented above, will guide this study and in turn shape the data collection efforts.



A.1 Explanation of Circumstances That Make Collection of Data Necessary

The Integrated Evaluation of ARRA Funding, Implementation, and Outcomes is a key component of ED’s efforts to learn lessons from the scope and structures of ARRA funding. By providing the most comprehensive and independent assessment of ARRA implementation and outcomes across funding streams, it responds to taxpayer interest in how ARRA funds were spent. Although other groups and researchers external to ED are examining some of the same issues (see Sec. A.4), only an ED-sponsored contractor will have access to and report on the full set of data. The breadth of policy/research questions that will be addressed by IES’ Integrated Evaluation sets it apart from other ARRA studies:


  1. To what extent did ARRA funds go to the intended recipients?

  • To what extent did high need states, districts, and schools receive support? And how did this vary by ARRA program?

  • To what extent did funds go to states, districts, and schools that positioned themselves to implement reforms? And how did this vary by ARRA program?


  2. Is ARRA associated with the implementation of the key reform strategies it promoted?

  • Which reform strategies received the most attention in terms of (a) how SEAs and districts allocated their ARRA funds and (b) where the greatest implementation activity took place?

  • Comparing pre-2009 and post-2009 levels of implementation, are more key reform strategies implemented after the ARRA education funds were allocated?

  • Is having more ARRA funding associated with deeper/broader implementation of the reform strategies?


  3. Which implementation supports and challenges are associated with ARRA?

  • What mechanisms are in place at the state and district levels to ensure that the reform efforts are (a) progressing as planned and (b) achieving the intended results?

  • Is alignment of priorities more evident in RTT states, where there is more of a focus on state capacity building?

  • Looking across the four reform areas and related strategies, what implementation challenges do SEAs, districts, and schools report? How do these vary by the characteristics of states, districts, and schools?


While relationships between ARRA funding and reform activities thought to impact achievement will be examined, elimination of the final year of data collection will preclude analyses focused on achievement outcomes (i.e., research question 4 presented in the second OMB package has been removed). However, findings from the study will be useful for generating hypotheses for future studies focused on particular reform strategies and achievement outcomes.





A.2 How the Information Will Be Collected, by Whom, and For What Purpose

The Integrated Evaluation will rely on information collected from existing sources, for which there are no respondents or burden, and from a new set of surveys in order to address the research questions described above. Table A-1 shows the linkages between the research questions and the sources of information used to answer them. We then discuss the extant data sources and the new data collections.

Table A-1. Research Questions, Extant Data Sources, and ARRA Evaluation Survey Data

To what extent did ARRA funds go to the intended recipients?

1) To what extent did high need states, districts, and schools receive support? And how did this vary by ARRA program?
   Extant data (level*):
   • ED grants database – identify grantees, award amounts (S, D, Sch)
   • Recovery.gov – identify grantees and subgrantees, award and subaward amounts, amount received, total expenditure, subaward funds disbursed (S, D)
   Survey data (level*):
   • Funding amount by ARRA program, for sampled schools (Sch)
   • Type of funding source (ARRA, non-ARRA, state, other) by reform area (S, D)

2) To what extent did funds go to states, districts, and schools that positioned themselves to implement reforms? And how did this vary by ARRA program?
   Extant data:
   • Quality Counts state policy survey: standards/assessment, teacher effectiveness (S)
   • DQC state survey (annual): development of longitudinal data systems (S)
   • CEP state surveys (2009, 2010): status of state reform efforts (S)
   • NCSL: education legislation (S)
   • Grant applications (RTT, SFSF) (S)
   Survey data:
   • Pre-2009 reform activities (S, D, Sch)

Is ARRA associated with the implementation of the key reform strategies it promoted?

3) Which reform strategies received the most attention in terms of (a) how SEAs and districts allocated their ARRA funds and (b) where the greatest implementation activity took place?
   Extant data:
   • ED grants database – identify grantees, award amounts (S, D, Sch)
   • Recovery.gov – identify grantees and subgrantees, award and subaward amounts, amount received, total expenditure, subaward funds disbursed (S, D)
   • CEP LEA survey (2010): use of funds by strategy; reform strategies by implementation status (D)
   Survey data:
   • Type of funding source (ARRA, non-ARRA, state, other) by reform area (S, D)
   • Use of strategies by reform area: standards/assessment; educator recruitment; support for new educators; educator evaluation systems; educator compensation and incentives; low-performing schools; longitudinal student data systems (S, D, Sch)
   • Implementation status (not planned, in planning/development, being provided/made available) for strategies within reform areas (S, D, Sch)

4) Comparing pre-2009 and post-2009 levels of implementation, are more key reform strategies implemented after the ARRA education funds were allocated?
   Extant data:
   • Grant applications (RTT, TIF, i3, SIG) (S)
   • NLS: teacher effectiveness reforms, school improvement strategies (pre-ARRA) (D)
   • CEP state surveys (2009, 2010): status of state reform efforts (S)
   • DQC state survey (annual): development of longitudinal data systems (S)
   Survey data:
   • Use of strategies by reform area: standards/assessment; educator recruitment; support for new educators; educator evaluation systems; educator compensation and incentives; low-performing schools; longitudinal student data systems (S, D, Sch)
   • Pre-2009 reform activities (S, D, Sch)

5) Is having more ARRA funding associated with deeper/broader implementation of the reform strategies?
   Extant data:
   • Performance reports (RTT, TIF, i3, SIG, SFSF): descriptions of efforts implemented; school intervention models; educator evaluation and compensation systems (S, D, Sch)
   • ED grants database – identify grantees, award amounts (S, D, Sch)
   • Recovery.gov – identify grantees and subgrantees, award and subaward amounts, amount received, total expenditure, subaward funds disbursed (S, D)
   Survey data:
   • Implementation status (not planned, in planning/development, being provided/made available) for strategies within reform areas (S, D, Sch)
   • School improvement activities, charter and management organizations (Sch)
   • Participation in professional development, by reform strategies (Sch)

Which implementation supports and challenges are associated with ARRA?

6) What mechanisms are in place at the state and district levels to ensure that the reform efforts are (a) progressing as planned and (b) achieving the intended results?
   Extant data: none
   Survey data:
   • SEA communication strategies (S, D)
   • LEA communication strategies (D, Sch)
   • Types of oversight, guidance, and technical assistance (D, Sch)

7) Is alignment of priorities more evident in RTT states, where there is more of a focus on state capacity building?
   Extant data: none
   Survey data:
   • Priority level by reform area activities (S, D, Sch)
   • LEA strategies address state requirements (Y/N) (D)

8) Looking across the four reform areas and related strategies, what implementation challenges do SEAs, districts, and schools report? How do these vary by the characteristics of states, districts, and schools?
   Extant data: none
   Survey data:
   • Specific challenges by reform area (major, minor, not a problem) (S, D)

*Level refers to the level of the organizational unit for which data are available (S = state, D = district, Sch = school). Relevant survey data may come from one or more of the surveys to be conducted in this study.


Extant Data Sources


  • ED Databases. We will use data from the National Center for Education Statistics’ Common Core of Data (CCD) and ED’s EDFacts to assemble the sampling frame for this study. Data items will include urbanicity, school level, poverty status, improvement status, total enrollment and limited English proficient student enrollment, and adequate yearly progress or corrective action status under the Elementary and Secondary Education Act (ESEA). (An illustrative sketch of assembling such a frame appears after this list.)


  • ED ARRA Program Files. We will use data from ED program files to compile a list of grant recipients for TIF to be used in assembling the school district sampling frame for this study. We will use data from ED program files and state websites to compile the list of persistently lowest achieving schools to be used in assembling the school sampling frame for this study. We will obtain data from ED program files on the amounts of funding received by states and districts from each formula grant and each discretionary program.


  • ARRA Required Reporting and Information. We are examining the types of information that the statute and ED require to be reported by states, districts, and schools as a condition of program participation. Some data are provided directly to ED, including application materials, performance indicators, and progress reports. Some data must be reported “publicly” but not necessarily to ED. Other than for SFSF, much of the reporting has not begun. Information that is available will be used in reporting and, to the extent possible, will not be duplicated in the surveys to be administered to states, school districts, and schools. However, it is important for analytic purposes to have the same data collected for grantees and non-grantees to allow for comparison – e.g., district reports of state support provided in RTT states versus non-RTT states. It is equally important, from a research perspective, for the data collection modes or mechanisms to be the same. We will balance these research needs with the desire not to add burden for grantees who already have reporting responsibilities.


  • Non-ED Databases. We will obtain data from other, non-ED sources to provide context variables or outcome measures. For example, we are exploring whether the National Longitudinal School-Level State Assessment Score Database or the State Education Data Center provides more comprehensive or historical district and school measures of proficiency rates than does the Department’s EDFacts system. Other organizations, like the Rockefeller Institute and the Pew Foundation, have developed measures of states’ and, in some cases, large districts’ fiscal conditions that would provide important analytic opportunities for examining the conditions under which ARRA implementation is taking place.
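
As an illustration of the first item above, the Python sketch below shows, at a high level, how a school sampling frame might be assembled from CCD- and EDFacts-style extracts. All file names and column names here are hypothetical placeholders, not the actual CCD or EDFacts layouts.

import pandas as pd

# Hypothetical CCD-style school extract; column names are placeholders.
ccd = pd.read_csv("ccd_schools.csv", dtype={"school_id": str})

# Hypothetical EDFacts-style extract carrying improvement/AYP status.
edfacts = pd.read_csv("edfacts_status.csv", dtype={"school_id": str})

# Merge the two sources into a single frame keyed on school ID.
frame = ccd.merge(edfacts, on="school_id", how="left")

# Keep only open schools (assumed status codes).
frame = frame[frame["school_status"] == "open"]

# Derive a size stratum from enrollment for use in the sample design
# (cut points are illustrative only).
frame["size_stratum"] = pd.cut(
    frame["enrollment"],
    bins=[0, 300, 800, float("inf")],
    labels=["small", "medium", "large"],
)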


New Data Collections


Because there is currently no reliable source of detailed information on the strategies being implemented under the various ARRA programs, we will administer a set of surveys to obtain this information (see Table A-2).


Table A-2. Description of Information to be Collected

State survey
  Respondent: Chief state school officer as point of contact; sections to be distributed to and completed by appropriate state staff.
  Mode: Paper and pencil.
  Content: State strategies for adopting new standards, establishing aligned assessments, supporting new educators, establishing educator evaluation systems, supporting the improvement of low-performing schools, and using statewide longitudinal data systems; state efforts to encourage district-level and school-level adoption of reform strategies; challenges related to reform efforts; funding sources for reform initiatives; state priorities for future reform.

District survey
  Respondent: Superintendent to designate a district liaison; sections to be distributed to and completed by appropriate district staff.
  Mode: Web, with hard copy (paper and pencil) if requested.
  Content: District strategies for adopting new standards, supporting new educators, establishing educator evaluation systems, supporting the improvement of low-performing schools, and using statewide longitudinal data systems; district efforts to encourage school-level adoption of reform strategies; challenges related to reform efforts; funding sources for reform initiatives; district priorities for future reform.

School survey
  Respondent: Principal.
  Mode: Web, with hard copy (paper and pencil) if requested.
  Content: Specific reform activities taking place at the school level; depth and breadth of professional development related to reform efforts; resources purchased or received to support reform; challenges related to reform efforts.

Poll #1 (fall 2011) (eliminated by IES)
  Respondent: Superintendent to designate a district liaison; questions to be distributed to and completed by appropriate district staff.
  Mode: Web, with telephone option if requested.
  Content: Types of guidance provided by the state; types of assistance needed; district partnerships to support reform.

Poll #2 (fall 2012) (eliminated by IES)
  Same respondent, mode, and content as Poll #1.






In developing the surveys, several underlying principles guided what information we ask for and how we ask for it:


  • It is not possible to conduct a general ledger examination to document what states and districts spent the money on. ARRA is too new, and, because it is a set of primarily temporary programs, states and school districts have not set up formal, permanent bookkeeping on “use.”


  • Schools will not know what state/district money or guidance is ARRA-related, with the possible exception of TIF and i3.


  • Because the ARRA money is integrated with state funds, it is more important and more reliable to ask about the reform activities being implemented rather than only those implemented with ARRA funds.


State Surveys


The state surveys will be administered as paper-and-pencil instruments, sent electronically so that respondents have the option of typing their responses and returning the instrument electronically, or of printing, completing, and mailing it back. We determined that 51 respondents did not warrant the development of a web survey, but we will reconsider the use of a web-based survey in the pilot testing phase. The survey will be sent to the chief state school officer in each of the 50 states and the District of Columbia in spring 2011 and 2012. Each chief will be responsible for determining whether he or she is in the best position to respond to the questions in the instrument or for requesting that other state education agency (SEA) officials take the lead in responding to individual sections. The survey will be modularized to allow for both options; for example, questions on teacher quality and evaluation could be completed by the state official who is most responsible for enacting that part of a state plan, while questions about state data systems could be completed by someone else. The state surveys will be used to examine state priorities for ARRA funding and implementation, shifts in state policy and legislation to support ARRA efforts, and the types of supports and communication provided to districts and schools. The state survey is in Appendix A.


District Surveys


Web-based surveys will be administered to the deputy superintendents of districts sampled for the study. We have found that deputy superintendents are typically designated as responsible for research efforts and are best suited to determining whether additional staff are needed to complete sections of the survey. Like the state survey, the district survey will be modularized to allow for completion by one or multiple respondents. The survey will collect information such as district priorities for improvement efforts, the status of implementation, and the supports and technical assistance provided by the state to the district and by the district to its schools. While we will link some questions to ARRA, we anticipate that many districts will not know which specific funds received from the state came from ARRA grants, so the bulk of the questions will simply relate to aspects of the reform strategies that they are implementing. The district survey is in Appendix B.


School Surveys


Web-based surveys will also be administered to each sampled school principal. We anticipate that the principal will be the sole respondent, as he/she will be in a position to answer questions about the emphasis of the school’s improvement efforts, including how teachers are evaluated, state- and district-provided professional development and other supports, instructional changes, and the use of data. The school survey is in Appendix C.



A.3 Use of Improved Information Technology to Reduce Burden

We will administer the district and school surveys via the web so that they are easily accessible to respondents. This will not only save money on postage, coding, keying, and cleaning the survey data but also, we have found, is a preferred method of survey completion for many respondents. Burden will be reduced through the use of skip patterns and, where appropriate, information prefilled based on responses to previous items.


The web-based surveys will also facilitate the completion of the surveys by multiple respondents, so that the most appropriate individual will be able to access and provide the data in his or her area of expertise. This approach will reduce burden for respondents because (a) each individual will have fewer questions to answer and (b) respondents will be asked questions concerning topics in which they are well versed, so answers should be readily available.


For respondents who choose not to use the web-based survey, paper and phone options will be offered as part of the nonresponse follow-up effort. Thus, if paper and phone methods are needed to achieve a high response rate, they will be used.



A.4 Efforts to Identify and Avoid Duplication

Our identification and avoidance of duplication falls into two categories: (1) extensive use of extant data in place of new data collection, and (2) analysis of other large-scale surveys about ARRA.


Use of Extant Data


In section A.2, we detailed sources of existing data and described how we plan to use these data for sampling and for reporting.


Analysis of Other Surveys about ARRA

While we are aware of other studies focused on ARRA education funds, they suffer from various limitations, including one-time collection efforts, a focus on a single ARRA grant program, lack of representative sampling, problems in obtaining high response rates, and no linkages across the various educational levels.


The Integrated Evaluation is unique in multiple ways. First, the study stands out as the only evaluation that will survey the universe of states and nationally representative samples of districts and schools large enough to detect differences between states (and districts) and relationships between funding and outcomes. For example, the American Association of School Administrators’ August 2009 survey “Schools and the Stimulus: How America’s Public School Districts Are Using ARRA Funds” asked interesting questions concerning the use of ARRA funds to fill budget holes. However, only 160 administrators were surveyed in all, and one in four states was not included in the sample.


Second, the Integrated Evaluation offers a unique opportunity to document the flow of funds and implementation efforts from states, through districts, to schools. As far as we know, there are no other surveys of schools, where much of the actual implementation and success of reform efforts will take place. While the Center on Education Policy (CEP) is surveying states and districts, it surveyed a relatively small number of districts (less than one-sixth the number that will be surveyed for this study) and not all states. Therefore, the CEP surveys do not provide the opportunity for nested analyses or for the comprehensive picture of communication that the Integrated Evaluation will provide. In addition, while the CEP asked only very basic questions concerning the four assurances (focusing instead on jobs saved by ARRA funds), the Integrated Evaluation is the only study we are aware of that will examine in more detail the type and stage of strategies being implemented. We will, however, consider the possibility of repeating some items from other surveys that have already been administered in order to create an earlier baseline for some measures.


Finally, we reviewed reports from the Government Accountability Office (GAO) regarding ARRA. For example, the GAO study “Recovery Act: Opportunities to Improve Management and Strengthen Accountability over States’ and Localities’ Uses of Funds” (GAO-10-999, September 20, 2010) provided information on the uses of and accountability for ARRA funds in selected states and localities. This information informed survey development.



A.5 Efforts to Minimize Burden on Small Business or Other Entities

No small businesses will be involved as respondents. Every effort will be made to minimize the burden on respondents. As described in section A.3, we will administer the district and school surveys via the web so that they are easily accessible to respondents. Burden will be reduced through the use of skip patterns and, where appropriate, information prefilled based on responses to previous items. The web-based surveys will also facilitate the completion of the surveys by multiple respondents, so that the most appropriate individual will be able to access and provide the data in his or her area of expertise. This approach will reduce burden for respondents because (a) each individual will have fewer questions to answer and (b) respondents will be asked questions concerning topics in which they are well versed, so answers should be readily available.



A.6 Consequences of Less-Frequent Data Collection

The data collection plan described in this submission is necessary for ED to conduct a rigorous national evaluation of ARRA funding and implementation progress. Although ED is required to obligate all ARRA funds by September 30, 2010, states and districts will have anywhere from one year to several years to use the funds, depending on the specific program. Moreover, a key question for the study is whether the activities undertaken while additional funding was available will continue after those funds disappear. For these reasons, a follow-up survey in spring 2012 is critical.



A.7 Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations

There are no special circumstances associated with this data collection.

A.8 Federal Register Comments and Persons Consulted Outside the Agency

The 60-day Federal Register notice was published at 76 FR 76392 on 12/7/2011. No public comments have been received to date.


A Technical Working Group (TWG) has been assembled for this study.


Thomas Cook, Northwestern University

Margaret Goertz, University of Pennsylvania

Jack Jennings, Center on Education Policy

Sharon Lohr, Arizona State University

David Lussier, Austin Independent School District, Austin TX

Philip Price, North Carolina Department of Public Instruction

Rachel Tompkins, Rural School and Community Trust

Marilyn Troyer, New Albany-Plain Public Schools, New Albany OH



A.9 Payments to Respondents

No payments will be made to respondents in connection with the collection of the survey data.



A.10 Assurance of Confidentiality

Other than the names and contact information of the respondents, which is information typically already available in the public domain (i.e., on state and district websites), no data collected for this survey will contain personally identifiable information. While some basic summary information on funding and implementation is likely to be displayed by state, no names or contact information will be released. Responses will be used for research or statistical purposes.


The following language will be included on the cover sheet of each survey: Information collected from these surveys comes under the confidentiality and data protection requirements of the Institute of Education Sciences (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Information that could identify an individual or institution will be separated from the survey responses submitted, kept in secured locations, and destroyed as soon as it is no longer required. Survey responses will be used only for research purposes. The reports prepared for the study will summarize findings across individuals and institutions and will not associate responses with a specific district, school, or person. We will not provide information that identifies district or school respondents to anyone outside the study team, except as required by law.


The Education Sciences Reform Act of 2002, Title I, Part E, Section 183 requires “[a]ll collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).” Respondents will be assured that confidentiality will be maintained, except as required by law. Specific steps to guarantee confidentiality include the following:


  • Identifying information about respondents (e.g., respondent name, address, and telephone number) will not be entered into the analysis data file, but will be kept separate from other data and will be password protected. A unique identification number for each respondent will be used for building raw data and analysis files. (An illustrative sketch of this separation appears after this list.)

  • A fax machine used to send or receive documents that contain confidential information will be kept in a locked field room, accessible only to study team members.

  • Confidential materials will be printed on a printer located in a limited access field room. When printing documents that contain confidential information from shared network printers, authorized study staff will be present and retrieve the documents as soon as printing is complete.

  • In public reports, findings will be presented in aggregate by type of respondent or for subgroups of interest. No reports will identify individual respondents or local agencies.

  • Access to the sample files will be limited to authorized study staff; no others will be granted access.

  • All members of the study team will be briefed regarding confidentiality of the data.

  • A control system will be in place, beginning at sample selection, to monitor the status and whereabouts of all data collection instruments during transfer, processing, coding, and data entry. This includes sign-in/sign-out sheets and the hand-carrying of documents by authorized project staff only.

  • All data will be stored in secure areas accessible only to authorized staff members. Computer-generated output containing identifiable information will be maintained under the same conditions.

  • When any hard copies containing confidential information are no longer needed, they will be shredded.
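
To make the first safeguard above concrete, the following Python sketch shows one way identifying information could be split from survey responses using a unique study ID. The file layout and column names are hypothetical illustrations, not the study’s actual data processing system.

import pandas as pd

# Hypothetical raw file containing both identifiers and responses.
raw = pd.read_csv("survey_responses_raw.csv")

# Assign a unique study ID to each respondent.
raw["study_id"] = range(1, len(raw) + 1)

# Crosswalk of identifiers, stored separately under restricted access.
crosswalk = raw[["study_id", "name", "address", "phone"]]
crosswalk.to_csv("restricted/id_crosswalk.csv", index=False)

# Analysis file keeps only the study ID and the survey responses.
analysis = raw.drop(columns=["name", "address", "phone"])
analysis.to_csv("analysis/survey_responses.csv", index=False)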



A.11 Questions of a Sensitive Nature

Questions of a sensitive nature will not be asked in any of the three surveys.



A.12 Estimates of Respondent Burden

We will administer the follow up surveys to respondents in:


  • The 50 states and the District of Columbia,

  • 1,700 sampled school districts, and

  • 3,800 sampled schools (within the sampled school districts).


In all, responses will be required in spring 2012 from 5,551 respondents (51 state officials; 1,700 district officials; and 3,800 school officials). Although we expect that, at the state and district levels, more than one person may help complete the survey, we estimate the burden as one respondent per state/district times the number of minutes for the total survey. We estimate that (1) state and district respondents will take an average of 75 minutes to complete the surveys and (2) school officials will take 45 minutes, so the total burden is 302,325 minutes, or 5,038.75 hours (see Table A-3 below).




Table A-3. Estimates of Respondent Burden

Respondent                    Anticipated number   Minutes per       Burden in minutes   Burden in hours   Burden in dollars
                              completed (a)        completion (b)    (c = a x b)         (c/60)
State official                51                   75                3,825               63.75             $2,868.75
District official (survey)    1,700                75                127,500             2,125             $95,625.00
School official               3,800                45                171,000             2,850             $128,250.00
Total burden                  5,551                                  302,325             5,038.75          $226,743.75

NOTE: Assumes an hourly rate of $45 per hour (from the Bureau of Labor Statistics’ Occupational Employment Statistics for educational administrators, May 2009).


The burden from the already approved recruitment package (1,509 burden hours) and the already approved baseline data collection package (5,322 burden hours) totaled 6,831 burden hours for the first year of this study. These data collection activities have now been completed. This package requests 5,039 burden hours for the second year of this study (that is, the follow-up surveys for 2012); therefore, the total annual burden for this submission will be 5,039 hours.
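
As a transparency check, the Table A-3 arithmetic can be reproduced directly from the respondent counts, per-completion minutes, and the $45 hourly rate stated above. The short Python sketch below simply recomputes the table; it introduces no figures beyond those already given.

HOURLY_RATE = 45.0  # BLS OES rate for education administrators, May 2009

rows = {
    "State official": (51, 75),
    "District official (survey)": (1_700, 75),
    "School official": (3_800, 45),
}

total_minutes = 0
for respondent, (n, minutes_each) in rows.items():
    burden_min = n * minutes_each
    total_minutes += burden_min
    print(f"{respondent}: {burden_min:,} min = {burden_min / 60:,.2f} hr"
          f" = ${burden_min / 60 * HOURLY_RATE:,.2f}")

print(f"Total: {total_minutes:,} min = {total_minutes / 60:,.2f} hr"
      f" = ${total_minutes / 60 * HOURLY_RATE:,.2f}")
# Prints: Total: 302,325 min = 5,038.75 hr = $226,743.75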



A.13 Estimates of the Cost Burden to Respondents

There are no annualized capital/startup or ongoing operation and maintenance costs associated with collecting the information.



A.14 Estimates of Annualized Government Costs

The amount for the design, conduct of two annual surveys, analysis, and reporting for this evaluation is $6,940,922. The annualized cost is $1,735,230.



A.15 Changes in Hour Burden

The first submission reflected the hour burden for recruitment (1,509 burden hours). The second submission reflected the hour burden for conducting a state survey, a district survey, and a school survey in spring 2011 and a district poll in fall 2011 (5,322 burden hours). This third submission reflects the burden hours for conducting a state survey, a district survey, and a school survey in spring 2012. The request for these 2012 surveys is 5,039 hours. The recruitment process and the 2011 surveys resulted in a total of 6,831 burden hours; these data collection activities are now complete. The difference between the burden hours requested previously and the burden hours being requested for this data collection results in a program change of -1,792 hours.



A.16 Time Schedule, Publication, and Analysis Plan

We will produce evaluation reports for policy makers and practitioners and generate useful annual tabulations for individual states. In writing reports, we will follow the principles of the Federal Plain Language Action and Information Network and adhere to the requirements of the NCES Statistical Standards (2002), the IES Style Guide (2005), and other IES guidance and requirements for public reporting.

Each evaluation report will answer a clearly established set of questions using both extant sources of data and information from the state, district, and school surveys. Each report will start with an outline of highlights. Then, for each question, the report will include a discussion of the context for understanding the findings, the data sources used and their limitations, the data collection methodology, the analyses conducted, and the findings. Appendices will provide more detailed information about, for example, the purpose of the evaluation and its design, the approaches to data collection, the sampling methodology, and survey response rates.


Table A-4 summarizes plans for tabulating data and publishing reports to address the policy/research questions.


Table A-4. Reporting schedule

Distribution of funding report (Winter 2012)
  Content: Early distribution of ARRA funding. Includes descriptive information on the characteristics of states and districts that received funds.
  Data sources: ARRA grant awards, combined with extant data.

Baseline survey report (Summer 2012)
  Content: Overview of pre- and early ARRA funding and implementation strategies.
  Data sources: The 2011 surveys.

State tabulations (Summer 2012)
  Content: State-specific reports providing aggregate survey data for the sampled districts in the state; will not be state representative.
  Data sources: Distribution of funding report and baseline survey report.

Final implementation report (Spring 2013)
  Content: Policy/research questions 1-3, expanding upon state, district, and school strategies implemented under ARRA.
  Data sources: Funding applications, performance reports, state web sites, 2012 survey.

State tabulations (Summer 2013)
  Content: State-specific reports providing aggregate survey data for the sampled districts in the state; will not be state representative.
  Data sources: Final implementation report and 2012 survey.


The series of evaluation reports described above will be supported by analyses to describe the allocation of ARRA education funds at the state, district, and school levels and the extent to which ARRA reform strategies are being implemented.


A primary goal of the evaluation will be to document where the ARRA funds went (i.e., how much money, from individual programs and overall, states and school districts received), how they were used (i.e., which general and specific reform strategies were adopted and what implementation processes took place to enact change), and the extent to which states, districts, and schools are implementing these strategies regardless of funding source. To achieve this goal, in-depth descriptive analyses will be used to answer research questions focused on funding, reform strategies, and implementation (detailed in section A.1).


While simple descriptive statistics such as means and percentages will provide answers to many of our questions, cross-tabulations will be important for providing policy-relevant information. Cross-tabulations will also be important for illustrating the distribution of funds and adopted reform strategies across states and districts with varying characteristics. Our use of stratification (and oversampling when necessary) in the sample design will allow for certain subgroup comparisons, and additional cross-tabulations will be made based on other variables (an illustrative sketch of such a tabulation follows the list below). Comparisons will include the following:


  • States will be stratified on whether the state received RTT funding (RTT) or not (non-RTT), to examine issues of within-state coherence in implementation priorities or the types and extent of state assistance provided to districts.


  • Districts will be stratified on high and low poverty, and on urbanicity (central city, urban fringe, town, and rural). We focus on poverty because of the Federal Government’s traditional focus on helping to mediate the effects of local funding constraints on educational opportunity. We focus on urbanicity because of the relationships between educational opportunity and rural isolation and the concentration of poverty in urban schools.


  • Schools will be stratified on school level (elementary, middle, and high), school performance level (persistently lowest-achieving (PLA) schools, Title I schools in need of improvement (SINI) that are not PLA, and all other schools), and school size (small, medium, and large). Schools with concentrations of low achieving students are a particular focus of the Elementary and Secondary Education Act (ESEA) and ARRA funding, and we expect the strategies used under the ARRA programs and possibly their implementation to differ by grade level. We hypothesize that small schools may need to adopt different reform strategies than do large schools.


  • Other comparisons of interest include the degree of coordination between states and districts, the proportion of state funds devoted to the strategies being used, differences in state governance structure (e.g., top-down governance versus a more locally focused structure), and variations in the average level of ARRA funding (i.e., do states with relatively “higher” per student ARRA funding levels make different choices than relatively lower funded states).
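
The following Python sketch illustrates the kind of weighted cross-tabulation described above, using pandas. The column names (rtt_state, poverty_stratum, strategy_in_use, weight) are hypothetical placeholders for variables that the actual analysis file would define.

import pandas as pd

# Hypothetical district analysis file; all column names are placeholders.
df = pd.read_csv("district_analysis_file.csv")

# Weighted percentage of districts with a strategy in use,
# cross-tabulated by RTT status and poverty stratum.
def weighted_pct(g):
    return 100 * (g["weight"] * g["strategy_in_use"]).sum() / g["weight"].sum()

tab = (
    df.groupby(["rtt_state", "poverty_stratum"])
      .apply(weighted_pct)
      .unstack("poverty_stratum")
)
print(tab.round(1))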


With the follow-up round of data collection after the 2011 surveys, for which we are currently requesting approval, additional types of tabulations will be possible, including those examining implementation change over time.


Because a statistical sample is used, survey data presented for districts and schools will be weighted to national totals (tabulations will therefore provide standard errors for the reported estimates). In addition, the descriptive tables will indicate where differences between subgroups are statistically significant. We will use chi-squared tests for significant differences among distributions and t-tests for differences in means. Tabulations will be included in the baseline and final reports where appropriate.
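
For illustration only, the sketch below applies the two tests named above to made-up numbers: a chi-squared test comparing implementation-status distributions for RTT versus non-RTT groups, and a t-test comparing means. The production analyses would use design-based (weighted) variants that account for the complex sample, which this simple sketch does not.

import numpy as np
from scipy import stats

# Hypothetical counts of districts by implementation status
# (not planned, in planning, in use) for RTT vs. non-RTT states.
counts = np.array([[4, 10, 37],
                   [9, 22, 41]])
chi2, p, dof, _ = stats.chi2_contingency(counts)
print(f"chi-squared = {chi2:.2f}, df = {dof}, p = {p:.3f}")

# Hypothetical mean numbers of strategies implemented per district.
rtt = np.array([5.1, 6.2, 4.8, 7.0, 5.5])
non_rtt = np.array([4.2, 5.0, 4.6, 3.9, 5.1])
t, p = stats.ttest_ind(rtt, non_rtt, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")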


Tables A-5 through A-7 illustrate the types of data tabulations that we will prepare and report using the follow-up survey data. Table A-5 is an example of a table shell using the SEA survey data, Table A-6 is an example using the LEA survey, and Table A-7 uses the school survey.



Table A-5. Percentage distribution of states by implementation status for strategies related to adopting new standards and aligning assessments during the 2011-2012 school year, by RTT status

(Table shell; cells to be completed with survey results.) Columns, shown separately for RTT states and non-RTT states: % Not Planned, % In Planning, % In Use.

Implementing New State Standards
  Professional development for teachers focused on new state standards adopted since January 2009 for:
    • Mathematics
    • Reading/English language arts
    • Science and/or social studies
  Professional development for teachers focused on helping:
    • English Language Learners (ELL) master new state standards
    • Special education students master new state standards
  Instructional materials aligned with new state standards (e.g., selection and/or development of curriculum guides, pacing guides, etc. aligned with new state standards) for:
    • Mathematics
    • Reading/English language arts
    • Science and/or social studies
    • English Language Learners (ELL)
    • Special education students

Implementing Assessments Aligned with New State Standards
  Assessments in core academic subjects aligned with new state standards (e.g., development and adoption of new assessments) in:
    • Mathematics
    • Reading/English language arts
    • Science and/or social studies
  Assessments aligned with new state standards (e.g., development and adoption of new assessments) for:
    • English Language Learners (ELL)
    • Special education students
  Professional development to prepare teachers to use data from new assessments to improve instruction
  Professional development to prepare principals and other school leaders to use data from new assessments in school improvement planning

Using Assessment Data and Assessment Systems
  Professional development focused on improving instruction by using data from:
    • State assessments
    • District assessments
    • Locally developed formative assessments
  Facilitate local access to and use of state data systems
  Link local data systems to state data systems

Source: U.S. Department of Education, Institute of Education Sciences, Integrated Evaluation of ARRA Funding, Implementation, and Outcomes: Spring 2011 SEA Survey.


Tabulations would also be done for other reform areas and for other state classification variables such as Census region.

Table A-6. Percentage distribution of districts by implementation status for strategies related to implementing new standards and assessments during the 2011-2012 school year, by RTT status

(Table shell; cells to be completed with survey results.) Columns, shown separately for districts in RTT states and districts in non-RTT states: % Not Planned, % In Planning/Development, % Available to All Schools, % Available to Some Schools.

Implementing New State Standards
  Instructional materials (e.g., curriculum guides, curriculum frameworks, pacing guides) aligned with new state standards that were developed for:
    • The district
    • The state
  A school-site instructional specialist or coach to support instruction tied to new state standards for:
    • Mathematics
    • Reading/English language arts
    • Science or social studies
    • English Language Learners (ELL)
  Criteria for schools to use when selecting a new curriculum aligned with new state standards
  On-line access for educators to professional development programs that are aligned with new state standards

Implementing Assessments and Data Systems
  District summative assessments in:
    • Non-NCLB tested grades
    • Non-NCLB tested subjects
  Formative student assessments to aid teachers in adapting instruction to students’ needs
  Assuring that tests are vertically scaled across grades to better measure student growth
  Teachers have on-line access to individual student results from:
    • State summative assessments
    • District summative assessments
    • Formative assessments
  Teachers have on-line access to students’ demographic information, attendance, or discipline data linked to student assessment data
  Provide teachers and principals with computers for use in accessing district student data systems

Source: U.S. Department of Education, Institute of Education Sciences, Integrated Evaluation of ARRA Funding, Implementation, and Outcomes: Spring 2011 LEA Survey.

Tabulations would also be done for other reform areas and for other district classification variables such as high versus low poverty districts and urban versus rural districts.

Table A-7. Percentage distribution of schools by implementation status related to adopting new standards and assessments during the 2011-2012 school year, by RTT status

(Table shell; cells to be completed with survey results.) Columns, shown separately for schools in RTT states and schools in non-RTT states: % Not in Use, % Pilot Testing, % Implementing.

Implementing New State Standards
  A new curriculum aligned with new state standards for:
    • Mathematics
    • Reading/English language arts
    • Science or social studies
  A curriculum specifically focused on meeting English Language Learner (ELL) students’ needs to meet new state standards
  New curricula selected:
    • From an approved list provided by state or district
    • Based on state or district guidance
  Teachers have instructional materials aligned with new state standards for at least some subjects/grades
  An instructional specialist or coach to support instruction tied to new state standards in:
    • Mathematics
    • Reading/English language arts
    • Science or social studies
    • English Language Learners (ELL)
  Professional development on the new standards for:
    • Teachers, about how to apply them in their classrooms
    • Instructional coaches and/or mentors, to develop skills to help teachers with the new standards
    • The principal, about how to monitor their classroom application
  Educators have on-line access to professional development programs aligned with new state standards

Source: U.S. Department of Education, Institute of Education Sciences, Integrated Evaluation of ARRA Funding, Implementation, and Outcomes: Spring 2011 School Survey.


Tabulations would also be done for other breakdowns, for example, schools in high versus low poverty districts, schools in urban versus rural districts, and persistently lowest-achieving (PLA) schools versus non-PLA schools.




A.17 Display of Expiration Date for OMB Approval

The Institute of Education Sciences is not requesting a waiver for the display of the OMB approval number and expiration date. The surveys will display the expiration date for OMB approval.



A.18 Exceptions to Certification Statement

This submission does not require an exception to the Certificate for Paperwork Reduction Act (5 CFR 1320.9).


1 The degree to which the State Fiscal Stabilization Fund and the bolstering of already established programs (e.g., IDEA) successfully saved and created education jobs is of policy interest as well. However, (a) this topic has been examined in other forums and (b) by the time this study is fielded, funds tied to job retention and creation are unlikely to still be available to states, districts, and schools.


