Supporting Statement A for Paperwork Reduction Act Submission


Equitable Distribution of Effective Teachers: State and Local Response to Federal Initiatives

OMB: 1875-0260


Supporting Statement for Paperwork Reduction Act Submission

Justification (Part A)

  A1. Circumstances Making Collection of Information Necessary

Improving student learning outcomes and reducing gaps in achievement between disadvantaged students and their nondisadvantaged peers are key goals of federal education policy. These gaps in achievement are large and persistent. For example, among fourth-grade public school students in 2009, the percentage of students achieving “below basic” in reading on the National Assessment of Educational Progress was 49 percent among students eligible for free or reduced-price lunch, compared to 21 percent among other students (National Center for Education Statistics, 2009, p. 58).

To address these disparities, particular emphasis has been placed on the quality of the teacher workforce and the equitable distribution of teachers. Across core academic subjects, national data show that 27 percent of classes in high-poverty schools were assigned to a teacher who had neither certification nor a major in the subject taught, compared with 14 percent of classes in low-poverty schools (Education Trust, 2008). National data also reveal disparities in other indicators (Imazeki, 2007).

The most recent reauthorization of the Elementary and Secondary Education Act (ESEA) in 2002 placed substantial policy emphasis on the key role of teachers by requiring that by the end of 2005–06, all core subjects be taught by highly qualified teachers (HQTs). In addition, ESEA required that states provide assurances and develop plans to “ensure that poor and minority children are not taught at higher rates than other children by inexperienced, unqualified, or out-of-field teachers” (Section 1111(b)(8)(C)). In 2009, American Recovery and Reinvestment Act (ARRA) requirements reinforced the focus on equitable distribution of teachers by requiring states applying for education stimulus funds to provide updated assurances and to publicize their most recent “equity plans.” ARRA also established competitive grants to help states build their pool of effective teachers and address inequities in the distribution of teachers through, for example, the Race to the Top (RttT) program, for which one priority area is effective teachers and leaders.

In addition to their focus on the equitable distribution of teacher quality, federal programs also have been promoting shifts in how teacher quality is measured, away from teacher qualifications and toward measures of instructional practice and effectiveness at raising student achievement. Federal programs such as the Teacher Incentive Fund (TIF) and RttT have provided incentives for states and districts to move in this direction, including funds to support some of the technical aspects of development.

Federal policymakers need to know whether the policies and programs they sponsor under these laws contribute to the development of state and district policies related to teacher quality for disadvantaged students. Hence, the U.S. Department of Education (ED) is requesting a study documenting state and local actions to (a) develop new measures of teacher quality, (b) analyze the distribution of teacher quality, and (c) develop and implement plans to ensure teacher quality for disadvantaged students. To inform federal policymakers, the study will examine the implementation of these activities with attention to implementation challenges, the role of state and local context, and the roles of the federal programs designed to foster these activities.1

  A2. Purposes and Uses of Data

The planned data collections will serve four objectives:

  1. To examine how states and districts analyze the distribution of teacher quality, plan actions to address inequities, and monitor progress.

  2. To examine how states and districts are changing their measures of teacher quality, and to understand their experiences in doing so.

  3. To examine state and local actions to improve teacher quality for disadvantaged students (i.e., students in high-poverty or high-minority schools).

  4. To describe the perceived contributions of federal programs to state and local actions aimed at improving the quality of teachers for disadvantaged students, and how state and local contexts mediate these contributions.

To address these objectives, the design includes telephone interviews with states and local education agencies (LEAs).

  A3. Uses of Technology to Reduce Burden

The contractor will use a variety of information technologies to maximize the efficiency and completeness of the information gathered for this evaluation and to minimize the burden that the evaluation places on respondents at the state and LEA levels:

  • To streamline the interview process and reduce burden on state and LEA officials, the research team will use state and LEA Web sites to gather information on interview questions prior to conducting the interviews. (For more information on the use of extant sources, see the Data Collection Procedures section in Part B.)

  • A toll-free number and e-mail address will be available during the data collection process to permit respondents to contact interview staff with questions or requests for assistance. The toll-free number and e-mail address will be included in all communication with respondents.

  A4. Efforts to Identify Duplication

Where possible, the research team will use existing data, including state and LEA equity plans, state monitoring reports, and state and LEA report cards, as well as data from state and LEA Web sites. Using these sources will greatly reduce the number of questions asked during the interviews, thus reducing respondent burden and minimizing duplication of data collection efforts.

The contractor has also conducted a scan to identify related federal studies and identified two with which the contractor, the American Institutes for Research (AIR), will coordinate efforts to avoid duplication. The first is the Teacher Quality Distribution and Measurement (TQDM) study, which was awarded by ED in September 2010. The TQDM study will assess the link between district policies and the equitable distribution of teachers, as measured using value-added analysis of student achievement data in 30 purposively selected districts. By contrast, the Equitable Distribution of Effective Teachers (EDET) study will provide descriptive information only and will describe state and district activity in broad samples, including a non-overlapping pool of 75 leading-edge school districts, as explained further in Part B under Sampling Design. Because the design of the TQDM study requires data on district policies related to the equitable distribution of teachers, the ED project officers for EDET and TQDM have discussed the need for coordination. The project officer for TQDM anticipates that no TQDM districts will be among the 75 leading-edge districts.

The second study with which the study team will coordinate its efforts is the Integrated Evaluation of ARRA Funding, Implementation, and Outcomes. We have communicated with the project officer for the Integrated Evaluation of ARRA and have reviewed that study’s design and instruments. The ARRA evaluation focuses on the role of ARRA in education reform efforts, and its primary data collections are questionnaires. The EDET study uses semi-structured interview protocols. While the instruments from both studies include questions on the topic of teacher quality, the content of the instruments does not overlap. The ARRA evaluation will collect status information (e.g., whether particular policies exist, for each of several years). The EDET will collect explanatory information (e.g., descriptive information about approaches to measuring teacher effectiveness, an explanation of why a particular policy was chosen, or an explanation of the challenges encountered).

  A5. Methods to Minimize Burden on Small Entities

To be considered a small entity by OMB, a school district must have a population of fewer than 50,000 students. We will avoid small entities whenever possible in selecting the district sample.

  A6. Consequences of Not Collecting Data

The level of inequity regarding the distribution of high-quality teachers reaffirms the need to ensure that states and school districts are responding to the federal initiatives to improve teacher quality for disadvantaged students (i.e., students in high-poverty or high-minority schools). Failure to collect the data proposed through this study would prevent policymakers from understanding the adequacy of state and local responses to federal efforts to promote better measurement of teacher quality, state and local analysis of teacher quality for disadvantaged students, and state and local development and implementation of plans to ensure teacher quality for disadvantaged students. Without the information provided by this study, federal policymakers will be less able to develop effective strategies for monitoring the implementation of federal initiatives, and to provide sound guidance to states and districts.

Additionally, the consequences of not collecting the data include an inability to examine how context shapes state and local activities and the unique challenges associated with implementing the particular approaches. Thus, the information gained through this study will inform state and local policymaking to ensure teacher quality for disadvantaged students.

  A7. Special Circumstances

There are no special circumstances that apply to this data collection effort.

  A8. Federal Register Comments and Persons Consulted Outside the Agency

A 60-day notice was published in the Federal Register (75 FR 67057) on November 4, 2010. Comments were received from the California Department of Education. These comments and the research team’s responses can be seen in the accompanying file, Federal Register Public Comments.

Throughout the study, the contractor will draw on the experience and expertise of three expert reviewers who bring a diverse range of experiences and perspectives. The reviewers, their affiliations, and their areas of expertise are listed in Exhibit 4.

Exhibit 4. Expert Reviewers

Proposed Expert Reviewer | Professional Affiliation | Area(s) of Expertise
Jennifer Imazeki | San Diego State University | Equitable distribution of teachers, teacher labor markets
Wesley Williams | Ohio State Department of Education | Teacher recruitment, teacher quality
Dale Ballou | Vanderbilt University | Innovative measures of teacher effectiveness, teacher labor markets


  A9. Payment or Gifts

No payments or gifts will be used during the course of this study.

  A10. Assurances of Confidentiality

The contractor is committed to maintaining the confidentiality and security of its records and will protect the confidentiality of the data to the extent possible through a variety of means. The contractor’s project staff have extensive experience collecting information and maintaining the confidentiality, security, and integrity of interview and survey data. The team has worked with the Institutional Review Board (IRB) at the American Institutes for Research to seek and receive approval of this study and of the measures used to protect confidentiality, including the following:

  • Project team members will be educated about the confidentiality protections given to respondents and about the sensitive nature of materials and data to be handled. Each person assigned to the study will be cautioned not to discuss confidential data.

  • All electronic data will be protected using several methods. AIR’s internal network is protected from unauthorized access by using defense-in-depth best practices, which incorporate firewalls and intrusion detection and prevention systems. The network is configured so that each user has a tailored set of rights, granted by the network administrator, to files approved for access and stored on the local area network (LAN). Access to AIR’s computer systems is password protected, and network passwords must be changed on a regular basis and conform to a strong password policy. All project staff assigned to tasks involving sensitive data will be required to provide specific assurance of confidentiality and obtain any clearances that may be necessary. All staff will sign a statement attesting to the fact that they have read and understood the security plan and ED’s security directives. A copy of this statement is featured in Appendix F.

  • For district and state interviews, respondents’ names and contact information will be used for data collection purposes only and will be disassociated from the data as they are entered into the database. As information is gathered from respondents or from sites, each will be assigned a unique identification number; this number will be used for printout listings on which the data are displayed and for analysis files. The unique identification number also will be used for data linkage. Data analysts will not be aware of any individual’s identity.

  • The contractor will shred all hardcopy documents containing identifiable data as soon as the need for the hardcopy documents no longer exists. They also will destroy any data tapes or disks containing sensitive data.

  • Participants will be informed of the purposes of the data collection and the uses that may be made of the data collected. All respondents will be asked to sign an informed consent form (see drafts in Appendix C). Consent forms will be collected and stored in secure file cabinets at the contractor’s office in Washington, DC.
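The identifier practice described above can be sketched in a few lines; this is a hypothetical illustration (the names, ID format, and helper function are invented for this sketch, not the contractor's actual system), showing how analysis records carry only a study ID while the name-to-ID crosswalk is kept apart from the data files:

```python
# Hypothetical sketch of the disassociation scheme: respondents are mapped
# to study IDs, and only the ID enters the analysis record.
import itertools

_next_id = itertools.count(1)
crosswalk = {}  # respondent name -> study ID; stored separately, restricted access


def study_id(name):
    """Return a stable study ID for a respondent, minting one on first use."""
    if name not in crosswalk:
        crosswalk[name] = "R{:04d}".format(next(_next_id))
    return crosswalk[name]


# An analysis record is linked by ID only; the name never enters the file.
record = {"id": study_id("Respondent A"), "state": "XX", "response": "..."}
```

Because the same ID is returned on every lookup, the ID also supports data linkage across files without exposing any identity to data analysts.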

In informing participants and obtaining their consent, the research team will explain the confidentiality protections, which differ somewhat for LEA and state respondents. For LEA respondents, the team will explain that no names of LEA respondents, their districts, or their schools will appear in reports or presentations. Responses to the LEA data collections will be used primarily to summarize findings in aggregate (e.g., across types of districts) and secondarily to provide examples of program or strategy implementation. When providing such examples, the research team will do so in a manner that does not associate responses with a specific individual or district. The team will not provide information that associates responses or findings with a district-level respondent or with a district to anyone outside of the study team, except as required by law.

The case of state-level respondents is somewhat different. The state-level data collections, by their very nature, focus on policy topics that are in the public domain. Moreover, it would not be difficult to identify policymakers involved in teacher-quality issues in each state and thus determine the identity of the state-level respondents. Therefore, the explanation to state-level respondents will indicate that the research team will avoid using their names in reports or attributing any quotes to specific individuals.

Finally, for both LEA and state-level respondents, the study team will convey that participation in the data collection is expected for recipients of Title II funds. Sections 9304 and 9306 of ESEA require state and local grantees, respectively, to provide assurances that they will cooperate with ED’s evaluations of ESEA programs. Thus, the letters used to introduce the study indicate that participation is expected, and the Paperwork Reduction Act statements on those letters indicate that “the obligation to respond to this collection is required to obtain or retain a benefit.”

  A11. Justification of Sensitive Questions

No questions of a sensitive nature will be included in the study.

  A12. Estimates of Hour Burden

The estimated hour burden for the data collections is 290 hours, or 97 burden hours on an annual basis. Based on average hourly wages for participants, this amounts to an estimated monetary cost of $13,050. Exhibit 5 summarizes the tabulations of respondent burden for study activities. Note that the unit of analysis in the tabulations below is the entity (SEA or LEA). The study anticipates that the largest entities will divide the interview content among multiple individuals, as explained below in Part B under Data Collection Procedures. For both SEAs and LEAs, the assumed burden includes:

  • Gaining cooperation. Time for staff to respond to scheduling requests, receive information, and prepare for the interview (0.5 hour/entity).

  • Conducting interviews. Time for staff to participate in the telephone interview and follow-up communication (2 hours/entity).


Exhibit 5. Summary of Estimates of Hour Burden

Task | Total Sample Size | Estimated Response Rate | Time Estimate (hours) | Total Hour Burden | Hourly Rate | Estimated Monetary Cost of Burden
State Interviews | | | | | |
Gaining cooperation | 52 | 100% | 0.5 | 26 | $45 | $1,170
Conducting interviews | 52 | 100% | 2 | 104 | $45 | $4,680
Total for State Interviews | -- | -- | -- | 130 | -- | $5,850
LEA Interviews | | | | | |
Gaining cooperation | 75 | 85% | 0.5 | 32 | $45 | $1,440
Conducting interviews | 75 | 85% | 2 | 128 | $45 | $5,760
Total for LEA Interviews | -- | -- | -- | 160 | -- | $7,200
STUDY TOTAL | 127 | -- | -- | 290 | -- | $13,050
ANNUAL TOTAL | 42 | -- | -- | 97 | -- | $4,350

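The Exhibit 5 figures follow directly from the sample sizes, expected response rates, and per-entity time estimates; a short illustrative sketch (the helper function and variable names are ours, not part of the submission) reproduces the tabulation:

```python
# Reproduce the Exhibit 5 burden tabulation from its stated inputs.
HOURLY_RATE = 45  # dollars per respondent hour


def burden_hours(sample_size, response_rate, hours_per_entity):
    """Expected respondents (rounded) times the per-entity time estimate."""
    respondents = round(sample_size * response_rate)
    return respondents * hours_per_entity


# State interviews: 52 SEAs, 100% expected response.
state = burden_hours(52, 1.00, 0.5) + burden_hours(52, 1.00, 2)  # 26 + 104 = 130

# LEA interviews: 75 districts, 85% expected response (about 64 respondents).
lea = burden_hours(75, 0.85, 0.5) + burden_hours(75, 0.85, 2)    # 32 + 128 = 160

total_hours = state + lea               # 290 hours over the study
total_cost = total_hours * HOURLY_RATE  # $13,050
annual_hours = round(total_hours / 3)   # about 97 hours per year
```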

  A13. Estimate of Cost Burden to Respondents

There are no additional respondent costs associated with this data collection beyond the hour burden estimated in item A12.

  A14. Estimate of Annual Cost to the Federal Government

The estimated cost for this study, including development of a detailed study design, data collection instruments, justification package, data collection, data analysis, and report preparation, is $523,070 for the three years, or approximately $174,357 per year. For the cost by subtask, please see Exhibit 6.

Exhibit 6. Budget by Subtask

Subtask | Cost
Review of extant data sources | $26,644
Design and instrumentation | $105,823
Data collection | $207,914
Analysis and reporting | $182,689
Total | $523,070


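As an illustrative check of Exhibit 6 (the figures are taken from the exhibit; the variable names are ours), the subtask costs sum to the contract total, and dividing by the three-year period gives the approximate annual cost quoted above:

```python
# Verify that the Exhibit 6 subtask costs reconcile with the stated totals.
subtask_costs = {
    "Review of extant data sources": 26_644,
    "Design and instrumentation": 105_823,
    "Data collection": 207_914,
    "Analysis and reporting": 182_689,
}

total_cost = sum(subtask_costs.values())  # $523,070 over three years
annual_cost = round(total_cost / 3)       # about $174,357 per year
```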

  A15. Program Changes or Adjustments

This request is for a new information collection.

  A16. Plans for Tabulation and Publication of Results

The research team has designed the data collection, data management, and analysis procedures to accommodate the short data collection period of August 15, 2011, to March 31, 2012. The team will develop coding materials for entering the data and preparing them for analysis as they are received and will enter all data into an electronic database. The team will verify the accuracy of the data and will analyze the data as described in the analytic approach. Reports based on these analyses will include disclaimers clarifying that the results are not generalizable beyond the sample from which the data were collected.


Exhibit 7. Key Reporting Dates

Deliverable | Date Due
First draft of report | July 31, 2012
Second draft of report | September 18, 2012
Third draft of report | October 23, 2012
Fourth draft of report | December 17, 2012
Final report | February 27, 2013

For this study, the research team also will communicate and disseminate information to ED and other stakeholders through the following:

  • In-person briefing for ED staff each year of the contract

  • A user-friendly policy brief and fact sheet targeting policymakers, educators, media, and the public

  • Dissemination of the fact sheet and a nontechnical executive summary for each completed report to the study participants

  • Dissemination of the reports, nontechnical executive summaries, policy briefs, and fact sheets to a number of audiences through organizations that focus on the equitable distribution of teachers

Finally, the team will prepare public use data files and submit them to ED no later than the end of the contract, February 27, 2013. Specifically, the research team will produce a CD-ROM that can be formatted to the National Center for Education Statistics’ (NCES’) Electronic Codebook (ECB). The team will ensure that all public use data files are in compliance with all privacy protection laws, maintaining the strictest confidentiality of all individual data collected in this study. They also will submit codebooks, technical reports, and other study materials to ED.

  A17. Approval to Not Display OMB Expiration Date

Both interview instruments will include the OMB expiration date.

  A18. Explanation of Exceptions

No exceptions are requested.



1 For a complete list of the relevant federal programs, see the section “Conceptual Framework” in the introduction to this document.
