
Integrated Evaluation of American Recovery and Reinvestment Act (ARRA) Funding, Implementation, and Outcomes



Statement for Paperwork Reduction Act Submission


PART A: Justification



Contract ED-IES-10-CO-0042

OMB: 1850-0877







November 2010








Part A: Justification

This package is the first of three for the Integrated Evaluation of ARRA Funding, Implementation, and Outcomes. Our initial request seeks approval for execution of a sampling plan and recruitment of the selected sites. Subsequent packages will request approval for: (1) an initial round of data collection that will include surveys of all states and a nationally representative sample of districts and schools in spring 2011, and (2) follow-up surveys with the same groups in 2012 and 2013. A fast response from OMB is critical if the study is to field the spring 2011 surveys successfully, since much preparatory work is necessary to ensure a high response rate from sampled school districts and schools.



Introduction


On February 17, 2009, President Obama signed the American Recovery and Reinvestment Act (ARRA) into law (Pub. L. 111-5). ARRA provides an unprecedented $100 billion of additional funding for the U.S. Department of Education (ED) to administer. While the initial goal of this money is to deliver emergency education funding to states, ARRA is also being used as an opportunity to spur innovation and reform at different levels of the U.S. educational system. Specifically, ARRA requires those receiving grant funds to commit to four core reforms: (1) adopting rigorous college- and career-ready standards and high-quality assessments, (2) establishing data systems and using data to improve performance, (3) increasing teacher effectiveness and the equitable distribution of effective teachers, and (4) turning around the lowest-performing schools. Investment in these innovative strategies is intended to lead to improved results for students, long-term gains in school and local education agency (LEA) capacity for success, and increased productivity and effectiveness.


The education component of ARRA consists of several grant programs targeting states and LEAs and, in some cases, consortia led by non-profit organizations. The programs under ARRA fall into three general categories: (1) existing programs that received an infusion of funds (e.g., Individuals with Disabilities Education Act, Parts B & C; Title I; State Educational Technology grants; Statewide Longitudinal Data Systems grants); (2) a new program intended mainly for economic stabilization (i.e., the State Fiscal Stabilization Fund); and (3) newly created programs that are reform-oriented in nature. Because of the number and scope of these programs, a large proportion of districts and schools across the country will receive some ARRA funding. As a result, ARRA represents a unique opportunity to encourage the adoption of reforms focused on school improvement and to learn from reform initiatives as they take place.


Although ARRA funds are being disbursed through different grant programs, their goals and strategies are complementary if not overlapping, as are the likely recipients of the funds. For this reason, an evaluative approach in which data collection and analysis occur across grant programs (i.e., an “integrated” approach), rather than separately for each set of grantees, will not only reduce respondent burden but also provide critical information about the effect of ARRA as a whole.




Overview of the Study


The Integrated Evaluation of ARRA Funding, Implementation, and Outcomes is being conducted under the Institute of Education Sciences (IES), ED’s independent research and evaluation arm. The study is one of several that IES will carry out to examine ARRA’s effects on education (see Exhibit 1).


Exhibit 1. IES Evaluation of ARRA’s Effects on Education

[Exhibit 1 graphic not reproduced in this text rendering.]


The Integrated Evaluation is designed to assess how ARRA efforts are unfolding over time and is therefore primarily descriptive. While information will be gathered on many of the grant programs, the evaluation will focus primarily on the reform-oriented programs (e.g., Race to the Top (RTT), Title I School Improvement Grants (SIG), Investing in Innovation (i3), and the Teacher Incentive Fund (TIF)), since those are of the greatest policy interest.1 The study will support the various impact evaluations IES is conducting by providing critical context for the strategies being rigorously investigated – e.g., by documenting the relative frequency with which they are being implemented across the country, whether they are unique to particular grant programs, and how they are being combined with other reform approaches.


To achieve these objectives, the Integrated Evaluation will draw heavily on existing information (grant funding allocations, district and school outcomes databases, and performance reporting where available) and administer new surveys to all 50 states and the District of Columbia and to nationally representative samples of districts and schools. The surveys will be conducted annually for at least three years, in spring 2011, 2012, and 2013.2 In addition, two polls of a subsample of the sampled districts will be conducted, one between the 2011 and 2012 surveys and one between the 2012 and 2013 surveys, to capture key, evolving issues of interest to ED officials and other policy makers as they consider shifting technical assistance efforts and further legislative action.


The evaluation’s theory of action, which is displayed in Figure A-1, posits that ARRA and the individual programs included in it are appropriately understood as a federal strategy for intervening in ongoing state and local education reform efforts. As the theory of action suggests, states have more or less well-defined reform agendas and priorities, many of which existed prior to ARRA. The arrows from the top and bottom boxes to the box on the left side of the display suggest that state reform priorities and strategies have been and continue to be influenced by the availability of ARRA education funds and the requirements established by the various ARRA programs.


The four ARRA assurances define the core elements of the federal strategy. The theory of action suggests that two of the four assurances, increasing educator effectiveness and equitable distribution and improving low-performing schools, are the primary foci of ARRA expectations and ARRA-supported reforms. The “School Improvement” box on the right side of the model appears as it does to suggest that ARRA has high aims for improving all schools, while at the same time targeting significant resources to improving the lowest-performing schools. Setting new standards, developing new assessments aligned with those standards, and establishing statewide longitudinal student data systems (the other two ARRA assurance areas) are important to the reform equation, but are best understood in terms of how they contribute to reforms in the other two areas.


The location of the “District Reform Priorities and Strategies” box suggests that while states exert considerable leadership in education reform, much of the work is done at the local level as district and school staff work to improve instruction. Nowhere is this more clearly demonstrated than in the implementation of the myriad strategies associated with increasing educator effectiveness and equitable distribution. These strategies include (1) designing educator preparation programs and ongoing professional development aligned with state and local performance standards; (2) designing and implementing quality induction programs for new teachers and principals; (3) designing and implementing new educator evaluation systems that include clear evidence of gains in student achievement as a criterion for effective performance; and (4) designing and implementing new systems of compensation and incentives that recognize and reward quality performance and help ensure that highly effective educators are assigned to and continue to work in hard-to-staff schools. Together these strategies define an aligned human resource management system, which, in turn, prepares and supports educators’ efforts to improve schools, especially the lowest-performing schools. The ultimate goal of ARRA programs and the reforms they support is to improve student learning.


The left-to-right arrows connecting the boxes through the middle of the diagram, labeled “C3” as shorthand for communication, coordination, and collaboration, suggest the importance of the linkages among state, district, and school reform efforts. Understanding what each of these linking strategies looks like, and how it contributes to advancing reform efforts, is important to understanding the overall effectiveness of the reforms. The “Lessons” arrows connecting the boxes through the middle of the diagram from right to left are intended to convey the idea that lessons learned as implementation proceeds may lead to mid-course corrections in strategies and/or their implementation.



Figure A-1. Integrated Evaluation of ARRA Funding, Implementation, and Outcomes: Theory of Action



[Figure A-1 graphic not reproduced in this text rendering. The diagram connects the following elements: ARRA Funding, Policy, and Requirements (e.g., RTT, i3, TIF, SLDS, Tech Grants, SIG, SFSF); State Reform Priorities and Strategies; District Reform Priorities and Strategies; Standards and Assessments; Data Systems; Statewide Technology Infrastructure; Educator Effectiveness and Equitable Distribution; School Improvement (including Low-Performing Schools); and Student Outcomes. Left-to-right arrows labeled “C3” (communication, coordination, and collaboration) link the state, district, and school boxes; right-to-left arrows labeled “Lessons” represent feedback from implementation.]

The “Statewide Technology Infrastructure” is included in the theory of action to underscore the fact that, in many states, developing technology infrastructures are fast becoming part of the glue that holds reform efforts together and may also act as catalysts for increasing the pace of reform. New student data systems rely heavily on technology and on state, district, and school capacity to use the systems. Full implementation of new assessments will also depend heavily on the capacity of data systems to store data and produce timely, user-friendly reports to inform instructional planning. The new data systems are also hypothesized to facilitate reform. Increasingly, states and districts are relying on technology to store instructional materials, especially as new content standards are introduced and applied in classrooms. Finally, states and districts are increasingly relying on technology as the medium for educator professional development.


The theory of action acknowledges that state and local education reforms did not begin with ARRA. The left-to-right progression displayed in the theory of action suggests that some reforms must be completed, or at least make some progress, before others can be completed. At the same time, the theory of action probably does not adequately reflect important time dimensions of the various reforms that are underway. One task for this evaluation is to examine how long it will take to implement the planned/expected reforms and how long it will take to see results. A second task will be to examine how the pace and sequence of individual reform efforts interact in the development and implementation of new policies, programs, and practices. Work driven in substantial ways by ARRA funding and expectations is proceeding on many fronts simultaneously. Yet the reality is that some things must be completed (e.g., implementation of new standards and assessments and new data systems) before others (e.g., full implementation of new educator evaluation systems). For example, implementation of the Common Core State Standards is already underway as states and districts develop new instructional resources and provide professional development to teachers and principals to introduce the standards and explain how they can and should be applied in the classroom. The development of new assessments aligned with the standards is also underway, but new state assessments are not expected to be in place for several years. Thus, teachers are likely to face a situation in which they are teaching to the new standards while being held accountable for assessment results that reflect mastery of a different set of standards. One could therefore argue that, despite early progress, implementation of the new standards will not be complete until the new assessments are implemented.


Similarly, many states and districts are moving quickly on the development of new educator evaluation systems that rely on student learning gains as a primary criterion for evaluating educator effectiveness. Because these systems will ultimately rely on student outcomes defined in terms of mastery of the new standards, the systems cannot be considered fully implemented until the standards have been implemented and the new assessments are in place. Finally, logic dictates that it will not be possible to gauge the full impact of ARRA on student learning and other outcomes until these complex reforms are completed. This assumption, along with the others laid out in the theory of action presented above, will guide this study and in turn shape the data collection efforts.



A.1 Explanation of Circumstances That Make Collection of Data Necessary

The Integrated Evaluation of ARRA Funding, Implementation, and Outcomes is a key component of ED’s efforts to learn lessons from the scope and structure of ARRA funding. By providing the most comprehensive and independent assessment of ARRA implementation and outcomes across funding streams, it responds to taxpayer interest in how ARRA funds were spent. Although other groups and researchers external to ED are examining some of the same issues (see Section A.4), only an ED-sponsored contractor will have access to, and report on, the full set of data. The breadth of the policy/research questions that will be addressed by IES’ Integrated Evaluation sets it apart from other ARRA studies:


  1. To what extent did ARRA funds go to the intended recipients?

  • To what extent did high need states, districts, and schools receive support? And how did this vary by ARRA program?

  • To what extent did funds go to states, districts, and schools that positioned themselves to implement reforms? And how did this vary by ARRA program?


  2. Is ARRA associated with the implementation of the key reform strategies it promoted?

  • Which reform strategies received the most attention in terms of (a) how SEAs and districts allocated their ARRA funds and (b) where the greatest implementation activity took place?

  • Comparing pre-2009 and post-2009 levels of implementation, are more key reform strategies implemented after the ARRA education funds were allocated?

  • Is having more ARRA funding associated with deeper/broader implementation of the reform strategies?


  3. Which implementation supports and challenges are associated with ARRA?

  • What mechanisms are in place at the state and district levels to ensure that the reform efforts are (a) progressing as planned and (b) achieving the intended results?

  • Is alignment of priorities more evident in RTT states, where there is more of a focus on state capacity building?

  • Looking across the four reform areas and related strategies, what implementation challenges do SEAs, districts and schools report? How do these vary by the characteristics of states, districts, and schools?


  4. Is ARRA associated with improved outcomes?

  • Is ARRA funding associated with improved student outcomes?

  • Is ARRA associated with improved distribution of effective teachers?

  • Is there a relationship between the use of particular reform strategies promoted by ARRA (or bundles of strategies) and outcomes? Are these relationships moderated by key variables such as coordination, communication, or fiscal context?


In addition, the study is designed to provide ongoing, formative feedback to ED through the district polls and feedback to states through state-specific tabulations of the survey data.



A.2 How the Information Will Be Collected, by Whom, and For What Purpose

The Integrated Evaluation will rely on information collected from existing sources, for which there are no respondents or burden, and from a new set of surveys in order to address the policy/research questions described above.



Extant Data Sources


  • ED Databases. We will use data from the National Center for Education Statistics’ Common Core of Data (CCD) and ED’s EDFacts to assemble the sampling frame for this study (a sketch of this frame-assembly step appears at the end of this list). Data items will include urbanicity, school level, poverty status, improvement status, total enrollment and limited English proficient student enrollment, and adequate yearly progress or corrective action status under the Elementary and Secondary Education Act (ESEA). These same data systems can provide information on outcomes such as academic proficiency rates under ESEA and graduation rates. We may use other data obtained from these sources as outcomes or tracking measures, including corrective action status and characteristics of the schools (e.g., for examining efforts to turn around low-performing schools). Our review of these databases will be discussed in further detail in the next forms clearance submission.


  • ED ARRA Program Files. We will use data from ED program files to compile a list of grant recipients for RTT, i3, and TIF to be used to assemble the sampling frame for this study. We will obtain data from ED program files on the amounts of funding received by states and districts from each formula grant and each discretionary program.


  • ARRA Required Reporting and Information. We are examining the types of information that the statute and ED require states, districts, and schools to report as a condition of program participation. Some data are provided directly to ED, including application materials, performance indicators, and progress reports. Some data must be reported “publicly” but not necessarily to ED. Other than for SFSF, much of the reporting has not begun. Information that is available will be used in reporting and, to the extent possible, will not be duplicated in the surveys to be administered to states, school districts, and schools. However, it is important for analytic purposes to have the same data collected for grantees and non-grantees to allow for comparison – e.g., district reports of state support provided in RTT states versus non-RTT states. It is equally important, from a research perspective, for the data collection modes or mechanisms to be the same. We will balance these research needs with the desire not to add burden to grantees who already have reporting responsibilities.


  • Non-ED Databases. We will obtain data from other, non-ED sources to provide context variables or outcome measures. For example, we are exploring whether the National Longitudinal School-Level State Assessment Score Database or the State Education Data Center provides more comprehensive or historical district and school measures of proficiency rates than does the Department’s EDFacts system. Other organizations, such as the Rockefeller Institute and the Pew Foundation, have developed measures of states’ and, in some cases, large districts’ fiscal conditions that would provide important analytic opportunities for examining the conditions under which ARRA implementation is taking place.
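
The first bullet above describes merging CCD and EDFacts extracts into a sampling frame. The following is a minimal Python sketch of that merge under stated assumptions: the flat-file names and field names (e.g., “ncessch”, “urbanicity”) are illustrative placeholders, not the actual CCD or EDFacts variable names.

    import csv
    from collections import Counter

    def load_table(path, key):
        """Read a flat-file extract into a dict keyed by NCES school ID."""
        with open(path, newline="") as f:
            return {row[key]: row for row in csv.DictReader(f)}

    # Hypothetical extract files; column names are placeholders only.
    ccd = load_table("ccd_schools.csv", "ncessch")         # enrollment, level, urbanicity
    edfacts = load_table("edfacts_status.csv", "ncessch")  # improvement status

    # Merge the two sources into one frame record per school.
    frame = []
    for school_id, ccd_row in ccd.items():
        status = edfacts.get(school_id, {})
        frame.append({
            "school_id": school_id,
            "urbanicity": ccd_row.get("urbanicity"),
            "school_level": ccd_row.get("level"),
            "enrollment": int(ccd_row.get("enrollment") or 0),
            "improvement_status": status.get("improvement_status", "unknown"),
        })

    # Tally stratification cells (e.g., urbanicity by school level) for the sample design.
    cells = Counter((s["urbanicity"], s["school_level"]) for s in frame)
    print(cells.most_common(5))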


New Data Collections


Because there is currently no reliable source of detailed information on the strategies being implemented under the various ARRA programs, we will administer a set of surveys to obtain this information (see Table A-1). These surveys will be part of the second OMB package. Final decisions on the specific content and delivery modes are subject to advice from the evaluation’s Technical Working Group and from pre-testing.


  • State Surveys: Surveys will be administered as paper-and-pencil instruments, sent electronically so that respondents have the option of typing their responses and returning the instrument electronically or of printing, completing, and mailing it back. We judged that 51 respondents did not warrant the development of a web survey, but we will reconsider the use of a web-based survey during the pilot testing phase. The survey will be sent to the chief state school officer in each of the 50 states and the District of Columbia in spring 2011, 2012, and 2013. Each chief will be responsible for determining whether he or she is in the best position to respond to the questions in the instrument or for requesting that other state education agency (SEA) officials take the lead in responding to individual sections. The survey will be modularized to allow for both options; for example, questions on teacher quality and evaluation could be completed by the state official most responsible for enacting that part of the state plan, while questions about state data systems could be completed by someone else. We will record who the respondent(s) are for each round of the survey and conduct sensitivity testing to examine the influence of shifts between single and multiple respondents over time. The state surveys will be used to examine state priorities for ARRA funding and implementation, shifts in state policy and legislation to support ARRA efforts, and the types of supports and communication provided to districts and schools.

  • District Surveys: Web-based surveys will be administered to the deputy superintendents of districts sampled for the study. We have found that deputy superintendents are typically designated as responsible for research efforts and are best suited to determining whether additional staff are needed to complete sections of the survey. Like the state survey, the district survey will be modularized to allow for completion by one or multiple respondents. The survey will collect information such as district priorities for improvement efforts, status of implementation, and supports and technical assistance provided by the state to the district and by the district to schools in its community. While we will link some questions to ARRA, we anticipate that many districts will not know which specific funds received from the state came from ARRA grants, so the bulk of the questions will simply relate to aspects of the reform strategies they are implementing.

  • School Surveys: Web-based surveys will also be administered to each sampled school principal. We anticipate that the principal will be the sole respondent, as he or she will be in a position to answer questions about the emphasis of the school’s improvement efforts, including how teachers are evaluated, state- and district-provided professional development and other supports, instructional changes, and the use of data.

  • District Polls: These short telephone surveys of a small set of district superintendents are intended to provide a snapshot of specific issues that could not be covered in the annual surveys or for which more ongoing information is needed. As “polls,” the surveys will not require assembly or compilation of information. The polls will focus on a small subset of items, covering such topics as perceptions of implementation challenges or perceptions of state support.











Table A-1. Description of Information to be Collected


Instrument | Type of Respondent (preliminary) | Data Collection Mode | Content
State survey | Chief state school officer as point of contact; sections to be distributed and completed by appropriate state staff | Paper and pencil | State policies, legislative changes, state priorities for ARRA funding, types of supports/technical assistance provided
District survey | Superintendent to designate a district liaison; sections to be distributed and completed by appropriate district staff | Web, with hard copy (paper and pencil) if requested | District reform priorities; district strategies for adopting new standards and assessments, supporting new educators, establishing educator evaluation systems, supporting the improvement of low-performing schools, and using statewide longitudinal data systems; knowledge of state priorities; types of supports/technical assistance provided by the state; types of supports/technical assistance provided to their schools; district quality assurance strategies; district capacity-building methods
School survey | Principal | Web, with hard copy (paper and pencil) if requested | Reform activities; professional development priorities; resources to support reform; types of supports/technical assistance provided by the state and the district
Poll #1 (fall 2011) | Superintendent to designate a district liaison; questions to be distributed and completed by appropriate district staff | Telephone | Types of guidance provided by the state; types of assistance needed; district partnerships to support reform
Poll #2 (fall 2012) | Superintendent to designate a district liaison; questions to be distributed and completed by appropriate district staff | Telephone | Types of guidance provided by the state; types of assistance needed; district partnerships to support reform

NOTE: Data collection mode decisions are subject to change based on pre-testing.



A.3 Use of Improved Information Technology to Reduce Burden

The current forms clearance request, for sampling and recruitment, involves minimal burden to potential respondents: it will mostly involve reading a letter about the study and responding to a phone call during which study staff will answer any questions respondents might have and collect updated contact information (see Part B of this package for a description of the process). Because of the need for direct personal contact, no information technology is anticipated for these activities. The use of information technology to reduce burden in the administration of the surveys will be described in the second OMB forms clearance package.



A.4 Efforts to Identify and Avoid Duplication

Our identification and avoidance of duplication falls into two categories: (1) extensive use of extant data in place of new data collection, and (2) analysis of other large-scale surveys about ARRA.


Use of Extant Data


In section A.2, we detailed sources of existing data and described how we plan to use these data for sampling and for reporting.


Analysis of Other Surveys about ARRA

While we are aware of other studies focused on ARRA education funds, they suffer from various limitations, including one-time collection efforts, a focus on a single ARRA grant program, lack of representative sampling, problems obtaining high response rates, and no linkages across the various educational levels.


The Integrated Evaluation is unique in multiple ways. First, the study stands out as the only evaluation that will survey the universe of states and nationally representative samples of districts and schools that are large enough to detect differences between states (and districts) and relationships between funding and outcomes. For example, the American Association of School Administrators’ August 2009 survey “Schools and the Stimulus: How America’s Public School Districts Are Using ARRA Funds” asked interesting questions concerning the use of ARRA funds to fill budget holes. However, only 160 administrators were surveyed in all, and one in four states was not represented in the sample.


Second, the Integrated Evaluation offers a unique opportunity to document the flow of funds and implementation efforts from states, through districts, to schools. As far as we know, there are no other surveys of schools, which is where much of the actual implementation and success of reform efforts will take place. While the Center on Education Policy (CEP) is surveying states and districts, it surveyed a relatively small number of districts (less than one-sixth the number that will be surveyed for this study) and not all states. Therefore, the CEP surveys do not provide the opportunity for nested analyses or for obtaining a comprehensive picture of communication, as the Integrated Evaluation will. In addition, while the CEP asked only very basic questions concerning the four assurances (focusing instead on jobs saved by ARRA funds), the Integrated Evaluation is the only study we are aware of that will examine in more detail the type and stage of strategies being implemented. We will, however, consider repeating some items from other surveys that have already been administered in order to create an earlier baseline for some measures.


Finally, we are reviewing reports from the Government Accountability Office (GAO) regarding ARRA. For example, a recent GAO study, “Recovery Act: Opportunities to Improve Management and Strengthen Accountability over States’ and Localities’ Uses of Funds” (GAO-10-999, September 20, 2010), provides information on the uses of, and accountability for, ARRA funds in selected states and localities. This information will inform survey development.


A.5 Efforts to Minimize Burden on Small Business or Other Entities

No small businesses will be involved as respondents. The current forms clearance request, for sampling and recruitment, involves minimal burden to potential respondents: it will mostly involve reading a letter about the study and responding to a phone call during which study staff will answer any questions respondents might have and collect updated contact information.



A.6 Consequences of Less-Frequent Data Collection

The data collection plan described in this submission is necessary for ED to conduct a rigorous national evaluation of ARRA funding and implementation progress. Although ED is required to obligate all ARRA funds by September 30, 2010, depending on the specific program, states and districts will have anywhere from one year to several years to use the funds. Moreover, a key question for the study is whether the activities undertaken while additional funding was available continue after those funds disappear. For these reasons, annual surveys until at least 2013 are critical.



A.7 Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations

There are no special circumstances associated with this data collection.



A.8 Federal Register Comments and Persons Consulted Outside the Agency

The 60-day Federal Register notice was published on September 14, 2010 (Vol. 75, page 55780). No public comments have been received.


A Technical Working Group (TWG) has been assembled for this study. The current TWG members are listed below. Additional consultation may be sought during later phases of the study (e.g., data analysis).


Thomas Cook, Northwestern University

Margaret Goertz, University of Pennsylvania

Jack Jennings, Center on Education Policy

Sharon Lohr, Arizona State University

Rachel Tompkins, Rural School and Community Trust

Marilyn Troyer, Ohio Department of Education



A.9 Payments to Respondents

There will be no payments with regard to recruitment.



A.10 Assurance of Confidentiality

This collection is for recruitment purposes, and no data will be collected during this first phase. Data collection will begin with the second phase, the study phase, which will be submitted to OMB shortly as the second forms clearance package for this study.


Other than the names and contact information of the respondents, which is information typically already available in the public domain (e.g., on state and district websites), no data collected for this study will contain personally identifiable information. No names or contact information will be released.


Responses will be used for research or statistical purposes. Participation is voluntary.


The following language will be included in the recruitment letters: Information collected for this study comes under the confidentiality and data protection requirements of the Institute of Education Sciences (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Information that could identify an individual or institution will be separated from the survey responses submitted, kept in secured locations, and destroyed as soon as it is no longer required. Survey responses will be used only for research purposes. The reports prepared for the study will summarize findings across individuals and institutions and will not associate responses with a specific district, school, or person. We will not provide information that identifies district or school respondents to anyone outside the study team, except as required by law.


The Education Sciences Reform Act of 2002, Title I, Part E, Section 183 of this Act requires, “All collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act (20 U.S.C. 1232g, 1232h).” Respondents will be assured that confidentiality will be maintained, except as required by law. Specific steps to guarantee confidentiality include the following:


  • Identifying information about respondents (e.g., respondent name, address, and telephone number) will not be entered into the analysis data file, but will be kept separate from other data and will be password protected. A unique identification number for each respondent will be used for building raw data and analysis files (an illustrative sketch of this separation follows this list).

  • A fax machine used to send or receive documents that contain confidential information will be kept in a locked field room, accessible only to study team members.

  • Confidential materials will be printed on a printer located in a limited access field room. When printing documents that contain confidential information from shared network printers, authorized study staff will be present and retrieve the documents as soon as printing is complete.

  • In public reports, findings will be presented in aggregate by type of respondent or for subgroups of interest. No reports will identify individual respondents or local agencies.

  • Access to the sample files will be limited to authorized study staff only; no others will be authorized such access.

  • All members of the study team will be briefed regarding confidentiality of the data.

  • A control system will be in place, beginning at sample selection, to monitor the status and whereabouts of all data collection instruments during transfer, processing, coding, and data entry. This includes sign-in/sign-out sheets and the hand-carrying of documents by authorized project staff only.

  • All data will be stored in secure areas accessible only to authorized staff members. Computer-generated output containing identifiable information will be maintained under the same conditions.

  • When any hard copies containing confidential information are no longer needed, they will be shredded.
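
The first bullet above describes separating identifying information from analysis data. As a minimal sketch, assuming a simple Python intake workflow (field names and file handling are illustrative, not the study’s actual systems), the separation might look like this:

    import uuid

    def split_pii(intake_rows):
        """Split intake records into a PII crosswalk and de-identified analysis rows."""
        crosswalk, analysis_rows = [], []
        for row in intake_rows:
            study_id = uuid.uuid4().hex  # unique, non-identifying respondent ID
            # Contact information goes only into the password-protected crosswalk.
            crosswalk.append({"study_id": study_id,
                              "name": row["name"],
                              "phone": row["phone"]})
            # Analysis records carry the study ID but no direct identifiers.
            analysis_rows.append({"study_id": study_id,
                                  "respondent_type": row["respondent_type"],
                                  "state": row["state"]})
        return crosswalk, analysis_rows

    demo = [{"name": "Jane Doe", "phone": "555-0100",
             "respondent_type": "district official", "state": "OH"}]
    pii, deidentified = split_pii(demo)

The crosswalk and the analysis file would then be stored separately, with access to the crosswalk restricted as described above.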



A.11 Questions of a Sensitive Nature

Questions of a sensitive nature will not be asked in any of the three surveys or in the polls.



A.12 Estimates of Respondent Burden

For recruitment, we will send notification letters, followed by a telephone call to secure participation and answer any questions, to:


  • The 50 states and the District of Columbia

  • 1,700 sampled school districts. If the school district has any special approval processes, we will gather that information and undertake the approval process.

  • 3,800 sampled schools (within the sampled school districts).


These notification letters are provided in Appendix A.


In all, responses will be required one time from 5,551 respondents (51 state officials; 1,700 district officials; and 3,800 school officials). We estimate that the recruitment process will take state and district respondents an average of 30 minutes each and school officials 10 minutes, for a total burden of 90,530 minutes, or 1,508.8 hours (see Table A-2 below; a worked check of the arithmetic follows the table).


Table A-2. Estimates of Respondent Burden


Respondent | Anticipated number completed (a) | Minutes per completion (b) | Burden in minutes (c = a × b) | Burden in hours (c/60) | Burden in dollars
State official | 51 | 30 | 1,530 | 25.5 | $1,147.50
District official | 1,700 | 30 | 51,000 | 850.0 | $38,250.00
School official | 3,800 | 10 | 38,000 | 633.3 | $28,498.50
Total burden | 5,551 |  | 90,530 | 1,508.8 | $67,896.00

NOTE: Assumes a rate of $45 per hour (from the Bureau of Labor Statistics’ Occupational Employment Statistics for educational administrators, May 2009).
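
As a worked check of Table A-2, the short Python sketch below reproduces the burden totals from the respondent counts and per-completion minutes. Note that the $67,896.00 total reflects summing the row costs after rounding hours to one decimal place; computing dollars from unrounded total hours would give $67,897.50.

    # Burden figures from Table A-2: (respondents, minutes per completion).
    groups = {"State official": (51, 30),
              "District official": (1_700, 30),
              "School official": (3_800, 10)}
    RATE = 45.0  # dollars per hour, per the note above

    row_hours = {k: round(n * m / 60, 1) for k, (n, m) in groups.items()}
    total_minutes = sum(n * m for n, m in groups.values())               # 90,530
    total_hours = round(total_minutes / 60, 1)                           # 1,508.8
    total_dollars = sum(round(h * RATE, 2) for h in row_hours.values())  # 67,896.00
    print(total_minutes, total_hours, total_dollars)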



A.13 Estimates of the Cost Burden to Respondents

There are no annualized capital/startup or ongoing operation and maintenance costs associated with collecting the information.



A.14 Estimates of Annualized Government Costs

The cost for the design, the conduct of three annual surveys and two polls, and the analysis and reporting for this evaluation is $8,440,922. The annualized cost is $2,110,230.


We estimate that the costs for designing the sample, drawing the sample, and recruiting states, districts, and schools – all in the first year – are about $170,000 (of the total $8,440,922). This estimate is based on Westat’s prior experience with similar, large-scale studies that require designing and drawing a sample and recruiting participants.



A.15 Changes in Hour Burden

The burden hours for this collection are considered a program change because this is a new collection.



A.16 Time Schedule, Publication, and Analysis Plan

We will produce evaluation reports for policy makers and practitioners and generate useful annual tabulations for individual states. In writing reports, we will follow the principles of the Federal Plain Language Action and Information Network and adhere to the requirements of the NCES Statistical Standards (2002), the IES Style Guide (2005), and other IES guidance and requirements for public reporting.


Each evaluation report will answer a clearly established set of questions using both extant sources of data and information from the state, district, and school surveys. Each report will begin with a section of highlights. Then, for each question, the report will discuss the context for understanding the findings, the data sources used and their limitations, the data collection methodology, the analyses conducted, and the findings. Appendices will provide more detailed information about, for example, the purpose of the evaluation and its design, the approaches to data collection, the sampling methodology, and survey response rates.


Table A-3 summarizes plans for tabulating data and publishing reports to address the policy/research questions.


The series of evaluation reports described in Table A-3 will be supported by analyses with two main objectives: (1) the largest effort will be to describe the allocation of ARRA education funds at the state, district, and school levels and the extent to which ARRA reform strategies are being implemented; and (2) a smaller but important effort will be to link “exposure” to ARRA reforms (direct and indirect) to improvements in student outcomes and other important policy goals.
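
As a purely illustrative sketch of the second objective, and not the study’s specified model, the Python snippet below fits an ordinary least-squares regression of a synthetic district outcome on a synthetic per-pupil ARRA funding measure plus a poverty covariate; all data are generated within the code.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 500                                  # illustrative district sample size
    funding = rng.gamma(2.0, 150.0, n)       # per-pupil ARRA dollars (synthetic)
    poverty = rng.uniform(0.0, 1.0, n)       # district poverty rate (synthetic)
    # Synthetic proficiency-rate change with a small positive funding effect.
    outcome = 0.002 * funding - 3.0 * poverty + rng.normal(0.0, 2.0, n)

    # Ordinary least squares via a linear least-squares solver.
    X = np.column_stack([np.ones(n), funding, poverty])
    coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    print(dict(zip(["intercept", "funding", "poverty"], coef.round(4))))

The actual exposure measures, covariates, and model specifications will be laid out in the analysis plan referenced below.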

The analysis plan will be described more fully in the second forms clearance package.


Table A-3. Reporting Schedule


Report | Content | Data sources | Date
Distribution of funding report | Early distribution of ARRA funding, including descriptive information on the characteristics of states and districts that received funds. | ARRA grant awards, combined with extant data. | Summer 2011
Baseline survey report | Overview of pre- and early ARRA funding and implementation strategies. | 2011 surveys. | Spring 2012
State tabulations | State-specific reports providing aggregate survey data for the sampled districts in each state; will not be state representative. | Distribution of funding report and baseline survey report. | Summer 2012
Early implementation report | Research questions 1-3, expanding upon state, district, and school strategies implemented under ARRA. | Funding applications, performance reports, state websites, 2011 survey. | Spring 2013
State tabulations | State-specific reports providing aggregate survey data for the sampled districts in each state; will not be state representative. | Early implementation report and 2012 survey. | Summer 2013
State tabulations | State-specific reports providing aggregate survey data for the sampled districts in each state; will not be state representative. | 2013 survey. | Summer 2014
Final report | Summative report covering all aspects of the evaluation, including baseline, implementation progress, and outcomes. | Extant data, annual reports, surveys, funding applications, state websites. | August 2014



A.17 Display of Expiration Date for OMB Approval

The Institute of Education Sciences is not requesting a waiver for the display of the OMB approval number and expiration date. The recruitment letters will display the expiration date for OMB approval.



A.18 Exceptions to Certification Statement

This submission does not require an exception to the Certificate for Paperwork Reduction Act (5 CFR 1320.9).



1 The degree to which the State Fiscal Stabilization Fund and bolstering of already established programs (e.g., IDEA) successfully saved and created new education jobs is of policy interest as well. However, (a) this topic has been examined in other forums and (b) at the time when this study is fielded, funds tied to job retention and creation are unlikely to still be available to states, districts, and schools.

2 If additional evaluation resources are available, IES may consider an additional round of data collection in 2014 to more fully capture how implementation efforts change after ARRA funds are spent down.
