






February 11, 2009

Revised Introductory Section 8/4/09


Request for Clearance of Data Collection Instruments for the Teaching American History Evaluation



Evaluation of the Teaching American History Grants Program

ED-04-CO-0027





Prepared for:


Policy and Program Studies Service

Office of the Under Secretary

U.S. Department of Education

400 Maryland Avenue, SW

Washington, D.C. 20202



Submitted by:


Berkeley Policy Associates

440 Grand Avenue, Suite 500

Oakland, California 94610-5085


SRI International
333 Ravenswood Avenue
Menlo Park, CA 94025


Contents



I. INTRODUCTION
     The Teaching American History Grants Program
     The Evaluation of the Teaching American History Grants Program
     Evaluation Questions and Data Sources
     Data Collection Activities
     Instruments to be Cleared Through This Submission

II. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION
     A. Justification for the Evaluation of the Teaching American History Grants Program
          1. Necessity of Information Collection
          2. Use of Information
          3. Use of Information Technology
          4. Efforts to Identify Duplication
          5. Methods to Minimize Burden on Small Entities
          6. Consequences if Information is Not Collected or is Collected Less Frequently
          7. Special Circumstances
          8. Federal Register Comments and Persons Consulted Outside the Agency
          9. Respondent Gifts
          10. Assurances of Confidentiality
          11. Questions of a Sensitive Nature
          12. Estimate of Information Collection Burden
          13. Estimate of Total Annual Cost Burden
          14. Estimates of Annualized Costs
          15. Change in Annual Reporting Burden
          16. Project Time Schedule
          17. OMB Expiration Date
          18. Exceptions to Certification Statement
     B. Collections of Information Employing Statistical Methods
          1. Respondent Universe and Sample Selection
          2. Data Collection
          3. Methods to Maximize Response Rates
          4. Pilot Testing
          5. Contact Information

EXHIBITS
     Exhibit 1: Respondent Burden
     Exhibit 2: Schedule of Data Collection Tasks and Deliverables

APPENDICES

Appendix A: Legislation Authorizing the Teaching American History Program

Appendix B: Statutory Authority for Evaluations


Request for Clearance of Data Collection Instruments for the

Teaching American History Evaluation

I. INTRODUCTION

This document has been prepared to support the clearance of data collection instruments for the evaluation of the Teaching American History (TAH) Grants Program. The Policy and Program Studies Service (PPSS) of the U.S. Department of Education (ED) is conducting this evaluation. The introduction to the supporting statement describes the TAH program, the evaluation questions, and the study design. The remaining sections of this document respond to specific instructions of the Office of Management and Budget (OMB) for the preparation of a supporting statement, including the justification for the study and the statistical methods the evaluation will employ.

The Teaching American History Grants Program

The Teaching American History Grants Program is authorized by Title II, Part C, Subpart 4 of the Elementary and Secondary Education Act (ESEA), as amended by the No Child Left Behind Act of 2001 (see Appendix A). Grants are made by the U.S. Department of Education's TAH program office, located in the Office of Innovation and Improvement. Since Fiscal Year 2001, the Department has awarded over 700 grants, worth over $700 million, to support local projects in all 50 states, the District of Columbia, and Puerto Rico that improve the quality of instruction in traditional American history and thereby raise student achievement in the subject. TAH grants support efforts that enable teachers to improve their knowledge, skills, and teaching practices in traditional American history, as a subject distinct from general social studies education.

Local educational agencies (LEAs) are the lead grant recipients in the program and must work in partnership with one or more of the following entities: institutions of higher education (IHEs); nonprofit history or humanities organizations; and libraries or museums. Partner entities must have extensive content expertise in American history. Grant funds support activities designed to deliver ongoing, intensive professional training, education, and support for teachers of American history.

TAH grantees may target the entire K-12 system within an LEA, or may focus on a specific grade span. Grantees are also required to document, evaluate, and disseminate innovative and cohesive models of activities that provide teachers with the content knowledge they need to teach American history effectively.


The Evaluation of the Teaching American History Grants Program

Berkeley Policy Associates (BPA) and SRI International (SRI) are conducting the evaluation of the Teaching American History (TAH) Grants Program on behalf of the U.S. Department of Education's Policy and Program Studies Service (PPSS). Building upon previous TAH grantee evaluations conducted by SRI International (2001-2005) and BPA (2005), the current evaluation is unique in its focus on the outcomes of the TAH program, particularly the program's contributions to teachers' content knowledge and to student achievement as measured by standardized assessments.

The study includes the following principal components:

  • An analysis of the feasibility of using standardized student history assessment data to document changes in student performance and to associate these changes with teacher participation in the TAH program (Task 2);

  • A state data analysis of the outcomes of TAH grants, using student assessment data from those states able to provide such data (Task 3);

  • A review of annual performance reports (Task 6);

  • A meta-analysis/empirical synthesis of grantee evaluations that meet selection criteria for quality and suitability for the study (Task 7);

  • Case studies of selected sites to document practices related to increasing student achievement and to increasing teacher content knowledge (Tasks 4 and 5). Four high-performing and four typically performing sites will be selected for each category of case studies, with selection based on assessments of performance in teacher content knowledge and student achievement. We are asking for OMB approval for the case study portion of the evaluation.



The Feasibility Study (Task 2) was completed in July 2008 and recommended that the State Data Analysis (Task 3) be conducted with multi-year student history assessment data collected from the states in which such data were available. Based on statistical power analyses and preliminary research on data availability, the study team concluded that an impact analysis of the TAH program would be feasible using either a regression discontinuity design (RDD) or an interrupted time series (ITS) design. Given that these designs rely on the same underlying data, the team proposed to estimate impacts using both models. This approach enables the study to capitalize on the different strengths of the two designs: it can present the more precise estimates from the ITS design while retaining the protection from threats to internal validity afforded by the RDD. The proposal was to include as many states as possible, within the financial and logistical constraints of the project, to increase the external validity of the estimates and the statistical power of the study as a whole. The plan included separate analyses for each state from which adequate data could be obtained, followed by a meta-analysis synthesizing the state-by-state findings.
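To make the two designs concrete, a simplified sketch of the estimating equations is given below. These specifications are illustrative only; they are our rendering of standard ITS and RDD models rather than the study team's final analytic specifications, and they assume school-level mean scores with the grant award year serving as the cutoff in time:

\[
\text{ITS:}\quad Y_{st} = \beta_0 + \beta_1\,(\mathit{Year}_t - c) + \beta_2\,\mathit{Post}_t + \beta_3\,(\mathit{Year}_t - c)\,\mathit{Post}_t + \varepsilon_{st}
\]
\[
\text{RDD:}\quad Y_{st} = \gamma_0 + \gamma_1\, f(\mathit{Year}_t - c) + \gamma_2\,\mathbf{1}[\mathit{Year}_t \ge c] + \nu_{st}
\]

Here \(Y_{st}\) is the mean history assessment score for participating school \(s\) in year \(t\), \(c\) is the award year, \(\mathit{Post}_t = \mathbf{1}[\mathit{Year}_t \ge c]\) indicates post-award years, and \(f(\cdot)\) is a smooth function of time relative to the cutoff. In the ITS model, \(\beta_2\) and \(\beta_3\) capture the deviation of post-award scores from the projected pre-award trend; in the RDD model, \(\gamma_2\) estimates the discontinuity in scores at the award year. Because both models are fit to the same multi-year assessment data, they can be estimated side by side, as proposed above.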

The Department accepted the final recommendations of the Feasibility Study, choosing to exercise the State Data Analysis (Task 3). The Department also recommended that the analysis focus on participating schools within grantee districts, rather than on districts as a whole, to the extent that these schools could be identified. Adequate data have now been received from five states: California, New York, Texas, Georgia, and Virginia. In addition, we have been able to identify participating schools for about two-thirds of the grants in these states. The data matching, merging, and analyses for this task are currently underway, using both analysis designs: the regression discontinuity design and the interrupted time series design. A final report for this analysis will be produced in September 2009.

State data collected for Task 3 have also been used to compare grantee outcomes and to select case study sites for Task 4, which focuses on program practices associated with positive student outcomes.

The Annual Performance Report Review (Task 6) was completed in April 2009 and focused on teacher outcomes of 2006 grants as reported in the 2008 annual performance reports (APRs). The study team reviewed the evaluation methods and outcomes of over one hundred 2006 grantees that had submitted APRs. The results of this review formed the basis for selecting the Task 5 case studies, which focus on identifying teacher practices associated with positive outcomes. The process of site selection for both Task 4 and Task 5 grantees is described in detail in Section B.

A Meta-Analysis (Task 7) is underway and will be completed in September 2009. The purpose of the meta-analysis is to use grantee-level evaluations to estimate the effect of the TAH program on student learning and to estimate the effects of particular components of TAH project activities on student learning. To prepare for the meta-analysis, the study team reviewed the 92 available final evaluation reports of the 2004 grantees, summarized the evaluation methods used in these reports, and identified fourteen reports that were suitable for inclusion in a meta-analysis. Reports were selected for inclusion based on: (1) the presence of American history student test score data; (2) the use of a rigorous research design (quasi-experimental or experimental); and (3) the reporting of quantitative information that can be aggregated in a meta-analysis (i.e., sample sizes, means, and standard deviations of student test scores for each group). The team will weight the studies in the meta-analysis according to: (1) the within-study sample size (i.e., large-sample studies will be weighted more heavily than small studies); (2) the reliability and validity of the learning assessments, based on the available coefficients; and (3) study quality, in particular weighting studies by aspects of design that facilitate causal inference (e.g., random assignment of units to conditions).
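For reference, the quantities listed above map onto a conventional meta-analytic formulation (a textbook sketch, not necessarily the exact estimator the study team will adopt). Each eligible report yields a standardized mean difference computed from the reported group means and standard deviations, and the synthesis is an inverse-variance weighted average, so that larger, more precise studies carry more weight:

\[
g_j = \frac{\bar{Y}_{T,j} - \bar{Y}_{C,j}}{SD_{\mathrm{pooled},j}},
\qquad
\bar{g} = \frac{\sum_j w_j\, g_j}{\sum_j w_j},
\qquad
w_j = \frac{1}{v_j},
\]

where \(\bar{Y}_{T,j}\) and \(\bar{Y}_{C,j}\) are the treatment and comparison group means on the American history assessment in report \(j\), \(SD_{\mathrm{pooled},j}\) is the pooled standard deviation, and \(v_j\) is the sampling variance of \(g_j\), which shrinks as sample size grows. The adjustments for assessment reliability and study quality described above can be incorporated as further modifications to the weights \(w_j\).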


Finally, the evaluation will culminate in a final report that provides a thorough analysis of all data gathered and fully addresses all of the evaluation questions. Overall, the evaluation will use both quantitative and qualitative data sources to assess and explain the outcomes of the TAH grants. The audiences for the report will be both policy-makers and practitioners.

Evaluation Questions and Data Sources

The evaluation questions for this study include the following:



  • What is the association between participation in TAH and teacher content knowledge?

  • What is the association between teacher participation in TAH and student achievement?

  • What are the grantee practices associated with gains in teacher content knowledge?

  • What are the grantee practices associated with gains in student achievement?

  • What is the technical quality and rigor of existing TAH grantee evaluations that use national or state tests to measure teacher content knowledge and student achievement?



Data Collection Activities

In this section, we describe the data collection activities for which we are seeking OMB approval.

Case Studies. We intend to conduct two sets of case studies, comprising up to sixteen cases in total, to collect in-depth information on grantee practices. Eight grantees, including four that have produced larger-than-average gains in teacher content knowledge and four that have produced more "typical" increases in teacher content knowledge, will be selected for two- to three-day site visits. In addition, case studies of another eight grantees will be conducted, including four that have produced larger-than-expected gains in student achievement and four that have produced typical increases in student achievement. In a limited number of cases, a site may have produced larger-than-expected gains in both teacher content knowledge and student achievement; in such cases, both categories of gains will be studied as part of a single site visit.

The purpose of these case study visits is to deepen our understanding of the factors and conditions that support improved outcomes in student achievement and teacher content knowledge as a result of participation in the grant, and to describe practices for various subgroups of teachers. At each grantee site, we will conduct interviews with the project director, professional development providers, history experts and/or mentors, and teachers. The site visits will be conducted from June through October 2009 and will be scheduled, whenever possible, while a TAH summer institute is underway. This will allow us to observe training activities at each site and gain easier access to teachers and professional development providers who are attending the summer institute or another intensive professional development activity.

Sites will be selected using methods described in detail in Section B.

The case study visits will last from two to three days, and teams of two researchers will make the visits. We will gather information from all sites on the following topics:



  • Background information on participants (project director, trainers/partners, teachers).

  • How the project was planned and developed, and how its goals were established.

  • Roles and relationships among partnering organizations.

  • Background information on participating schools, districts and student populations.

  • Nature, relevance, usefulness, and quality of the professional development activities conducted through the TAH grant.

  • Grantee practices associated with gains in teacher content knowledge.

  • Grantee practices associated with gains in student achievement.

  • Other outcomes (qualitative and quantitative) of teacher participation in grant programs.

  • Participant recommendations for program improvement.



The visits will include interviews with the project director, teacher training providers, representatives of partner organizations, participating teachers, administrators, and other individuals identified by TAH project representatives. When possible, the site visits will begin with an interview with the project director and end with a brief wrap-up session with the project director. A purposive sample of up to twenty teachers per site will be interviewed individually before and after sessions, during breaks and at other designated times during the summer institute or professional development activity. The protocols for these face-to-face interviews are included with this OMB package.

In addition to the interviews, we will also conduct observations of professional development activities, such as seminars or training sessions offered during the summer institutes. A Training Observation Summary form is also attached; it will include:

  • A description of the content and nature of the training.

  • A description of the characteristics of the training activity, using an observation checklist that addresses topics such as the training methodology, whether opportunities were present for teachers to engage in active learning (e.g., small-group discussions of complex historical issues), and whether any guidance was given on how to implement new strategies in the classroom.

  • Information about the training provider (followed by an interview with the provider).

  • Information on the number of participating teachers and the grade levels they teach.

  • Description of the level of engagement and participation of teachers.



Site visitors will be trained in using the data collection instruments prior to the visits. After the training, they will use a site visit write-up guide to prepare a summary of the findings from their visits. These summaries will be used for cross-site analysis. Electronic and hard-copy site visit materials will be stored securely according to the guidelines described in the Assurances of Confidentiality section of this document.



Instruments to be Cleared Through This Submission

ED is requesting clearance for the following data collection instruments, which are included in this document:



  1. Project Director Protocol

  2. Participant (Teacher) Protocol

  3. Training Provider Protocol

  4. Training Observation Protocol

II. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

A. Justification for the Evaluation of the Teaching American History Grants Program

1. Necessity of Information Collection

The Teaching American History (TAH) Grants Program is authorized by Title II, Part C, Subpart 4, of the Elementary and Secondary Education Act (ESEA), as amended by the No Child Left Behind Act of 2001 (see Appendix A). The program reflects the serious interest of Congress and the U.S. Department of Education in supporting the teaching and learning of American history. Since Fiscal Year 2001, the Department has awarded over 700 grants worth well over $700 million to support local projects in all 50 states, the District of Columbia, and Puerto Rico.

Evaluations for programs enacted by the No Child Left Behind Act are authorized under Title IX, Part F, Sec. 9601 of the Act (see Appendix B). This evaluation is the first systematic study of the TAH Grants Program that focuses on the relationship between program practices and outcomes. The TAH program represents a substantial federal investment in teacher training and professional development in the teaching of American history. Because this is the first time a federal investment of this magnitude has been made in the curriculum area of American history, this study (of which the proposed data collection is a critical part) is crucial for establishing whether the program is working as intended by Congress, and for identifying which elements of the program are most effective.

2. Use of Information

The U.S. Department of Education will use the results of this data collection to inform a variety of stakeholders regarding the nature and outcomes of the TAH program. More specifically, the information will be used for the following:



  • To evaluate the effectiveness of the TAH program.

  • To inform future reauthorizations of the TAH program.

  • To inform Congress and other policy-makers with a general interest in the improvement of the teaching and learning of American history.

  • To evaluate the association between teacher participation in TAH programs and increased teacher content knowledge.

  • To evaluate the association between teacher participation in TAH programs and increases in student achievement on standardized history assessments.

  • To describe grantee practices associated with gains in teacher content knowledge.

  • To describe grantee practices associated with gains in student achievement.

  • To inform future evaluations of federal programs designed to measure student achievement and the effectiveness of professional development in academic content areas.



The audience for this evaluation includes the U.S. Department of Education, Congress, education policy-makers, K-12 history teachers, college and university history instructors, researchers, assessment specialists and providers of teacher professional development in history and other subjects. A potential dissemination plan has been provided to the Department with recommendations of organizations that would have an interest in the final report after it is officially released by the Department, including the National History Education Clearinghouse (funded by the Department) and other key history education organizations. Presentations at national conferences and TAH grantee annual meetings have also been discussed.

3. Use of Information Technology

No online surveys are being conducted as part of this study. Technology will be used only by project staff in recording, analyzing, and reporting.


4. Efforts to Identify Duplication

As part of this evaluation, we have already undertaken two tasks involving the review of extant information: a review of background information about the TAH program and a literature review on the teaching and learning of American history. Through these reviews, we have found virtually no large-scale, outcome-focused evaluations of professional development programs that focus on improving teachers' knowledge and skills in American history, nor have we found case studies that link practice to outcomes of such programs. Therefore, in addition to being the first evaluation of the outcomes of the TAH program, this evaluation will contribute to the broader knowledge base on the teaching and learning of American history.

While there are some survey data (including the National Center for Education Statistics' Schools and Staffing Survey, or SASS) that address teachers' formal training and ongoing professional development opportunities, these data are more than three years old and predate the establishment of the TAH program. Neither SASS nor any other data source provides systematic information about the TAH program.

This data collection does not duplicate annual performance reporting, which is based on self-report data provided by individual grantees. The case studies are designed to provide cross-site information and will be conducted and analyzed by an independent research team with expertise in large-scale evaluation research.

5. Methods to Minimize Burden on Small Entities

There will be minimal burden on small entities. For the few small entities involved with the evaluation, we will take all possible steps to minimize the burden on respondents and other participants by collecting much of the data concurrently with summer institutes or other pre-existing training events.

6. Consequences if Information is Not Collected or is Collected Less Frequently

Failure to collect this information would prevent Congress and ED from obtaining evaluation data on a federal program that has spent almost $700 million to support the teaching and learning of American history since FY 2001. Failure to collect these data would also impede the reporting of performance indicators for GPRA/PART. All of the activities planned for this evaluation are one-time data collections, and each participant is involved in only one data collection activity, i.e., the interviews conducted during the site visits.

7. Special Circumstances

Not applicable.

8. Federal Register Comments and Persons Consulted Outside the Agency

A 60-day Federal Register notice was published on February 24, 2009 (Vol. 74, No. 35, p. 8240).



Federal Register Comments: No public comments were received.

Consultation Outside the Agency and with Respondent Representatives. We have assembled a Technical Working Group (TWG) of teachers, college and university history professors, research methodologists, and history professional development providers. The TWG is advising the evaluation on all matters related to study design, sample selection, instrumentation, data collection, and data analysis. The members of the TWG, and their titles and professional affiliations, are:

Thomas Adams, Ph.D.

Director, Curriculum Frameworks and Instructional Resources

California Department of Education


Geoffrey Borman, Ph.D.

Professor of Education and Deputy Director, UW Predoctoral Interdisciplinary Research Training

University of Wisconsin


Harold Doran, Ph.D.

Senior Research Scientist

American Institutes for Research


Patrick Duran

Assessment Specialist

Education Consultant


Patricia Muller, Ph.D.

Associate Director and Senior Scientist, Center for Evaluation and Education Policy

Indiana University


James Percoco, MA

High School History Teacher

Springfield High School


Kelly Schrum, Ph.D.

Director of Educational Projects, Center for History and New Media

George Mason University


Clarence Walker, Ph.D.

Professor of American History

University of California, Davis


In addition to the Technical Working Group, we consulted with TAH program officers from the U.S. Department of Education.

9. Respondent Gifts

We will not make any monetary gifts or payments to survey respondents or participants in other data collection activities.

10. Assurances of Confidentiality

We have established a set of standards and procedures to safeguard the privacy of participants and the security of data as they are collected, processed, stored, and reported. In an initial letter of invitation from ED, we will tell respondents that participation is voluntary and that we will assure the following:

  • Responses to this data collection will be used to summarize findings in an aggregate manner (across groups of sites), or will be used to provide examples of program implementation in a manner that does not associate responses with a specific site or individual. In the report, pseudonyms will be used for each site. The study team may refer to the generic title of an individual (e.g., “project director,” or “eighth grade teacher”) but neither the site name nor the individual name will be used. All efforts will be made to keep the description of the site general enough so that a reader would never be able to determine the true name or identity of the site or individuals at the site. The contractor will not provide information that associates responses or findings with a subject or district to anyone outside the study team, except as required by law. We will educate project team members on how to handle respondents’ materials and data. We will caution all persons assigned to the study not to discuss the identities of the sites or the individual respondents from sites.

  • We will reemphasize the need to protect the privacy of respondents during training for interviewers and other data collection personnel. We will caution personnel not to discuss interview data with others outside the evaluation, and emphasize that they must restrict discussion within the project to the essential needs of the data collection activity.

  • We will disassociate names and addresses from the data as they are entered into the SRI or BPA database and will use this information for data collection purposes only. As we gather information on individuals or sites, we will assign each a unique identification number, which we will use for raw data, print-out listings that display the data, and analysis files. We will also use the unique identification number for data linkage. Surveys and/or questionnaires will have only the unique identification number on them. We will not use any names, addresses, or other information that could connect the survey with the individual on instruments, or in the public data files that we turn over to NCES after each round of data collection.

  • We will inform participants of the purposes of the data collection and the potential uses of the data collected.

  • We will store all electronic recordings of interviews, interview notes, and other project-related documents in secure areas that are accessible only to authorized staff members.

  • We will shred all interview protocols, forms, and other hard-copy documents containing identifiable data as soon as the need for this hard copy no longer exists. We will also destroy any data tapes or disks containing sensitive data.

  • We will duplicate all basic computer files on computer-based backup disks to allow for file restoration in the event of unrecoverable loss of the original data. We will store these backup files under secure conditions in an area separate from the location of the original data.

  • We will provide reports to ED or any employee only in the form of aggregate data or we will use the data to provide examples of program implementation in a manner that does not associate responses with a specific site or individual. We will not include any individual or institutional identifiers in these reports. We will not identify the names of any participating sites or participating individuals in the text of any report; pseudonyms will be used.

  • We will aggregate data across respondent types (e.g., project directors, teachers) for most of the study analyses.

  • We will reiterate all of these assurances at the time that data collection begins.


11. Questions of a Sensitive Nature

We have not included any questions or topics of a sensitive nature in the project director, participant, or training provider protocols. We will collect the names of teacher participants only as needed to identify interview respondents internally.

12. Estimate of Information Collection Burden

Respondent burden for these site visits consists of the time spent participating in interviews. Respondents will not incur any equipment, postage, or travel costs. Project directors will spend limited additional time providing lists of teachers and partners and helping to arrange interviews.

Exhibit 1 displays estimates of the total respondent burden in hours and dollars. These time estimates are based on prior experience with site visits of this nature. We will interview one project director per site, up to twenty teachers per site, and up to three training providers or partners per site. Total respondent burden is estimated to be 416 hours; a worked arithmetic check follows Exhibit 1.



Exhibit 1

Respondent Burden



                                 A                 B                    C          D            C x D
                                 Number of         Time per             Total      Hourly       Total
                                 Respondents       Response (hours)     Hours      Wage*        Cost

TAH Project Directors            16                3                    48         $60/hr       $2,880
TAH Teachers                     320 (maximum)     1                    320        $30/hr       $9,600
Training Providers/Partners     48 (maximum)      1                    48         $60/hr       $2,880
Total                            384 (maximum)                          416                     $15,360

* Wage estimates are based on data for teachers and administrators in the Digest of Education Statistics (2007).
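As an arithmetic check (our calculation, derived directly from the figures in Exhibit 1 and the sixteen planned case study sites):

\[
16 \times 3 \;+\; (16 \times 20) \times 1 \;+\; (16 \times 3) \times 1 \;=\; 48 + 320 + 48 \;=\; 416 \text{ hours},
\]
\[
48 \times \$60 \;+\; 320 \times \$30 \;+\; 48 \times \$60 \;=\; \$2{,}880 + \$9{,}600 + \$2{,}880 \;=\; \$15{,}360.
\]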


13. Estimate of Total Annual Cost Burden

There are no additional respondent costs associated with this data collection.

14. Estimates of Annualized Costs

We estimate that the cost to the federal government for the activities covered under this OMB submission is $445,160. The annual cost to the government would be $148,387.
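The annual figure is consistent with spreading the total evenly over a three-year period (the three-year assumption is ours, inferred from the arithmetic rather than stated above):

\[
\$445{,}160 \div 3 \approx \$148{,}387 \text{ per year.}
\]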

15. Change in Annual Reporting Burden

There is a program change of 416 hours since this is a new collection.

16. Project Time Schedule

We will conduct the case study tasks according to the schedule shown in Exhibit 2.

17. OMB Expiration Date

We will inform respondents about the OMB expiration date when they are notified about the study.

18. Exceptions to Certification Statement

We are not requesting any exceptions to the certification statement.

Exhibit 2

Schedule of Data Collection Tasks and Deliverables


Site Visits to Selected Projects to Document Practices in Increasing Student Achievement
  Conduct site visits                                        7/1/09 - 10/30/09
  Draft briefing report on site visit data and analysis      12/31/09
  Draft final report on site visit data and analysis         2/27/10

Site Visits to Selected Projects to Document Practices in Increasing Teacher Knowledge
  Conduct site visits                                        7/1/09 - 10/30/09
  Draft briefing report on site visit data and analysis      12/31/09
  Draft final report on site visit data and analysis         2/27/10



Appendix A


Legislation Authorizing the Teaching American History Program



Subpart 4 — Teaching of Traditional American History

SEC. 2351. Establishment of Program

(a) IN GENERAL- The Secretary may establish and implement a program to be known as the Teaching American History Grant Program, under which the Secretary shall award grants on a competitive basis to local educational agencies —

(1) to carry out activities to promote the teaching of traditional American history in elementary schools and secondary schools as a separate academic subject (not as a component of social studies); and

(2) for the development, implementation, and strengthening of programs to teach traditional American history as a separate academic subject (not as a component of social studies) within elementary school and secondary school curricula, including the implementation of activities —

(A) to improve the quality of instruction; and

(B) to provide professional development and teacher education activities with respect to American history.

(b) REQUIRED PARTNERSHIP- A local educational agency that receives a grant under subsection (a) shall carry out activities under the grant in partnership with one or more of the following:

(1) An institution of higher education.

(2) A nonprofit history or humanities organization.

(3) A library or museum.

(c) APPLICATION- To be eligible to receive a grant under this section, a local educational agency shall submit an application to the Secretary at such time, in such manner, and containing such information as the Secretary may require.

SEC. 2352. Authorization of Appropriations

There are authorized to be appropriated to carry out this subpart such sums as may be necessary for fiscal year 2002 and each of the 5 succeeding fiscal years.













Appendix B


Statutory Authority for Evaluations

Part F — Evaluations

SEC. 9601. EVALUATIONS.

(a) RESERVATION OF FUNDS- Except as provided in subsections (b) and (c), the Secretary may reserve not more than 0.5 percent of the amount appropriated to carry out each categorical program and demonstration project authorized under this Act —

(1) to conduct —

(A) comprehensive evaluations of the program or project; and

(B) studies of the effectiveness of the program or project and its administrative impact on schools and local educational agencies;

(2) to evaluate the aggregate short- and long-term effects and cost efficiencies across Federal programs assisted or authorized under this Act and related Federal preschool, elementary, and secondary programs under any other Federal law; and

(3) to increase the usefulness of evaluations of grant recipients in order to ensure the continuous progress of the program or project by improving the quality, timeliness, efficiency, and use of information relating to performance under the program or project.

(b) TITLES I AND III EXCLUDED- The Secretary may not reserve under subsection (a) funds appropriated to carry out any program authorized under title I or title III.

(c) EVALUATION ACTIVITIES AUTHORIZED ELSEWHERE- If, under any other provision of this Act (other than title I), funds are authorized to be reserved or used for evaluation activities with respect to a program or project, the Secretary may not reserve additional funds under this section for the evaluation of that program or project.


