National Evaluation of the Comprehensive Technical Assistance Centers

OMB: 1850-0823





Statement for Paperwork Reduction Act Submission


PART A: Justification



Contract ED-04-CO-0028







January 22, 2007





Prepared for

Institute of Education Sciences

U.S. Department of Education


Prepared by

Branch Associates, Inc.

Policy Studies Associates, Inc.

Decision Information Resources, Inc.



Part A: Justification

A.1 Explanation of Circumstances That Make Collection of Data Necessary

The No Child Left Behind Act of 2001 (NCLB) mandates that states intervene to turn around low-performing schools more quickly than ever before. NCLB requires that each state “establish a statewide system of intensive and sustained support and improvement” for districts and schools that have been identified for improvement under NCLB, especially for those schools identified for corrective action.

In FY 2005, the U.S. Department of Education established a network of 21 Comprehensive Technical Assistance Centers (CTACs) to provide training, professional development, technical assistance, and information dissemination to improve student achievement, close achievement gaps, and support school improvement. The network comprises 16 Regional Comprehensive Technical Assistance Centers and five Content Comprehensive Technical Assistance Centers. Each Regional Center serves specific states and is expected to provide frontline assistance to state education agencies to increase their capacity to assist districts and schools. The five Content Centers are expected to provide the Regional Centers with in-depth knowledge, expertise, and analyses in five areas: (1) assessment and accountability, (2) innovation and improvement, (3) instruction, (4) teacher quality, and (5) high school reform. Annual funding for the Content and Regional Centers totals over $59 million per year.


The proposed data collection is intended to support the National Evaluation of the Comprehensive Technical Assistance Centers, a multi-component study designed to examine the extent to which the Centers have expanded States’ capacity to address the educational needs of Local Education Agencies (LEAs) and schools. This evaluation is Congressionally mandated under Title II of the Education Technical Assistance Act of 2002 (Section 204), Public Law 107-279, which requires that the National Center for Education Evaluation and Regional Assistance (NCEE), a component of the Department's Institute of Education Sciences, provide for ongoing independent evaluation of the Comprehensive Centers. The statute established the following specific goals for the evaluation:


  • to analyze the services provided by the Centers

  • to determine the extent to which each of the Centers meets the objectives of its respective plan

  • to determine whether the services offered by each Center meet the educational needs of State educational agencies (SEAs), local educational agencies (LEAs), and schools in the region.


A major objective of the evaluation is to assess the extent to which the Centers produce high-quality, relevant, and useful products and services. Definitions of quality, relevance, and usefulness will be developed in consultation with the evaluation’s Technical Work Group and Department of Education staff, and through review of existing measures. Ratings of quality and relevance by independent, qualified review panels and ratings of usefulness from surveys of SEA and Regional Center staff will be used to meet GPRA performance measures for each Comprehensive Center.


This evaluation is an important opportunity to gather information for measuring the performance of the Centers. The performance measures that the Department of Education will apply to the Centers are (1) the percentage of products and services that are deemed to be of high quality; (2) the percentage of products and services that are deemed to be of high relevance to educational policy or practice; and (3) the percentage of the targeted audience of Comprehensive Center clients who find their products and services to be of high usefulness.


A.2 How the Information Will Be Collected, by Whom, and For What Purpose

Research Questions

Data collected from site visits, review panels and surveys will be used to answer the following research questions addressed by the overall Comprehensive Centers evaluation:


  1. What are the objectives of each Comprehensive Center?


  2. What kinds of products and services are provided by each Center?


  3. How do the Centers define their clients’ educational needs and priorities? (“Clients” refers to SEA staff for the Regional Centers and Regional Center staff for the Content Centers.)


  4. To what extent has each of the Comprehensive Centers met the goals in its own plan?


  5. To what extent is the work of each Comprehensive Center of high quality, high relevance and high usefulness?


Exhibit 1 below provides a data collection plan for the evaluation.

Exhibit 1
Data Collection Plan

Regional Center Staff Site Visit Interview Protocol
  Respondent: Regional Center Directors
  Schedule: Spring-Summer 2007
  Key Data: Descriptive information about centers’ goals, structure and operations.

Content Center Staff Site Visit Interview Protocol
  Respondent: Content Center Directors
  Schedule: Spring-Summer 2007
  Key Data: Descriptive information about centers’ goals, structure and operations.

Project Inventory Form
  Respondent: Regional and Content Center Directors
  Schedule: Spring-Summer 2007, 2008, and 2009
  Key Data: A list of the products and services provided by each Center.

Request for Materials for Expert Review Panel
  Respondent: Regional and Content Center Directors
  Schedule: Spring-Summer 2007, 2008, and 2009
  Key Data: Documents and artifacts associated with projects selected for rating by review panel.

Client Surveys*
  Respondent: SEA staff and Regional Center staff
  Schedule: Fall 2007, Fall 2008, and Fall 2009
  Key Data: Ratings from clients on the usefulness of CTAC products and services.

*Client Surveys are not included here, but will be submitted for OMB approval in July 2007.


This proposed information collection activity covers four data collection protocols: 1) a Regional Center Staff Site Visit Interview Protocol for in-depth interviews with key staff of each Regional Center; 2) a Content Center Staff Site Visit Interview Protocol for in-depth interviews with key staff of each Content Center; 3) a Project Inventory Form for Regional and Content Centers to use in describing their products and services; and 4) a Request for Materials for Expert Panel Review (for projects the evaluation team samples from the Inventory Forms) that requests copies of the materials and a cover sheet that together will be the basis upon which the reviewers will make their ratings.1


  1. Regional Center Staff Site Visit Interview Protocol

The evaluation team’s first data collection activity with each CTAC will be a site visit to each Center in Spring/Summer 2007. During this visit, the evaluators will use the Regional Center Site Visit Interview Protocol to guide discussions with the Center directors and other key staff. These interviews will enable us to ascertain the Centers’ goals, objectives, structure and operations.


The information collected via staff interviews at the Centers, in conjunction with a review of Center documents, will be used primarily to address two key research questions:


(1) What are the objectives of each Comprehensive Center?; and

(2) What kinds of products and services are provided by each Center?


The site visit interview protocols address a range of subjects including: the Center’s priorities; the extent to which assistance provided to clients is proactive vs. responsive to specific requests from clients; which aspects of NCLB each Center is most often called upon to address; and evidence that clients are developing increased capacity. To provide context for understanding the work being conducted by each Center, staff will also be asked to describe the Center’s structure and staffing patterns.


These protocols will guide interviews conducted with the Director of each Center as well as a limited number of additional Center staff, where necessary, to address specific topics.


  2. Content Center Staff Site Visit Interview Protocol

A second interview protocol was designed to collect similar information to that being collected in site visits to the Regional Centers. The Content Center Staff Site Visit Interview Protocol is designed to answer questions about the Centers’ goals, objectives, structure and the kinds of products and services provided.


  3. Project Inventory Form

Each Center will be requested to provide a detailed list of its products and services, using the Project Inventory Form, for each of the three years under this collection. Evaluation staff will train Center staff to complete the Project Inventory Form. Center staff will complete the inventory and return the forms directly to the evaluators.


Data from these forms will be the basis by which evaluation staff select a sample of each Center’s projects. Selected projects will be reviewed by expert review panels established by the evaluation team. Up to 10 projects per year for three years (for a total of up to 30 projects per center over the course of the evaluation) will be chosen from the inventory for review by a review panel. Details regarding the sampling plan are provided in Section B.1. of this submission.
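The sampling plan itself is specified in Section B.1. As a minimal illustration of drawing up to 10 projects per Center from an inventory (the project names and the simple random draw below are hypothetical; the evaluation's actual plan combines Center nominations with stratified random sampling), the selection step might be sketched as:

```python
import random

def sample_projects(inventory, n=10, seed=None):
    # Draw up to n projects without replacement from one center's
    # inventory. Illustrative only: the evaluation's actual plan
    # (Section B.1) combines nominations with stratified sampling.
    rng = random.Random(seed)
    return rng.sample(inventory, min(n, len(inventory)))

# Hypothetical center reporting 14 projects on its inventory form
inventory = [f"Project {i}" for i in range(1, 15)]
selected = sample_projects(inventory, n=10, seed=1)
print(len(selected))  # 10
```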


  4. Request for Materials for Expert Panel Review

Projects will be sampled annually by the evaluation team from each CTAC, on the basis of the Project Inventory Form. Sampled projects will be rated by an expert review panel. To gather from Centers all documents and other materials associated with selected projects (e.g., meeting agendas, briefing books, meeting summaries, training materials, white papers, web resources, etc.), the evaluation team will send electronically to each CTAC the Request for Materials for Expert Panel Review.


In addition, the Request for Materials for Expert Panel Review form will request that CTACs complete one cover sheet for each selected project. The cover sheet will provide three key types of information to be used by the expert review panels: (1) background on the purpose of the CTAC project; (2) the project's research basis; and (3) the Center’s role in developing each project (e.g., was the project a joint effort produced with an SEA or other CTAC, or was it produced exclusively by the CTAC?). Centers will send completed cover sheets with project-specific documents and other materials to the evaluation team.


These materials will be assessed by expert review panels to develop ratings in two areas – quality and relevance. These ratings will form the basis upon which the evaluation team will make its ratings in these areas.


A.3 Use of Improved Information Technology to Reduce Burden

The Project Inventory Form and the Request for Materials will be collected via electronic submission. Both will be in standard formats (Word, Excel) that are easily accessible to Center staff, thereby minimizing burden. In addition, we are requesting that Centers send project materials electronically wherever possible.


A.4 Efforts to Identify and Avoid Duplication

The information to be collected by this data collection does not currently exist in a systematic format. Efforts are being made to coordinate and share documents with Centers’ own local evaluations in order to avoid duplication.


A.5 Efforts to Minimize Burden on Small Business or Other Entities

No small businesses will be involved as respondents. Every effort has been and will be made to minimize the burden on Comprehensive Center staff and clients. As noted below, interviews with Center staff will take three hours and will not have a significant economic impact on a substantial number of small entities. Respondent organizations will be notified about the interviews in advance, and interviews will be scheduled at the Centers’ convenience. Center staff will be permitted to complete the Project Inventory and Request for Materials for selected projects at their convenience (within a specified time period).


A.6 Consequences of Less-Frequent Data Collection

This submission includes site visit interview protocols that will be administered once per Center during a single visit. The inventory form will be completed by Center staff once a year for three years. This data collection is necessary to identify a sample of each Center’s products and services to be rated by independent review panels on quality and relevance. The Request for Materials will also be conducted annually for three years. Less frequent collection for these items would make it impossible to rate the quality and relevance of Center products, which is central to the Department of Education’s efforts to measure Centers’ performance and changes in their performance over time.


A.7 Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations

There are no special circumstances associated with this data collection.


A.8 Federal Register Comments and Persons Consulted Outside the Agency

In accordance with the Paperwork Reduction Act of 1995, the Institute of Education Sciences published a notice in the Federal Register announcing the agency’s intention to request an OMB review of data collection activities. The first notice was published on November 21, 2006, in volume 71, number 224, page 67345, and provided a 60-day period for public comments. To date, no public comments have been received in response to the Federal Register notice.


The data collection instruments were developed by the evaluation research team led by Branch Associates, Inc. (BAI) with Decision Information Resources, Inc. (DIR) and Policy Studies Associates, Inc. (PSA). The Site Visit Protocols and Project Inventory Forms were piloted with six of the Comprehensive Centers in November/December 2006.

A.9 Payments to Respondents

There will be no payments made to respondents. Experience on previous studies indicates that payments are not needed for this type of research.

A.10 Assurance of Confidentiality

Every effort will be made to maintain the privacy and/or confidentiality of respondents.


All respondents included in the study will be assured that the information they provide will be used only for the purpose of this research and that the information obtained through this study will be kept confidential to the extent provided by law.


To ensure data security, all individuals hired by Branch Associates, DIR and PSA are required to adhere to strict standards and sign an oath of confidentiality as a condition of employment. Review panel members will also be asked to sign an oath of confidentiality as a condition of serving on the review panel.


All contractor staff and review panel members will sign an agreement stating that:

  • I will not reveal the name, address or other identifying information about any respondent to any person other than those directly connected to the study.

  • I will not reveal the contents or substance of the responses of any identifiable respondent or informant to any person other than a member of the project staff, except for a purpose authorized by the project director or authorized designate.

  • I will not contact any respondent or informant except as authorized by a member of the project staff.

  • I will not release a dataset or findings from this project (including for unrestricted public use or for other, unrestricted, uses) except in accordance with policies and procedures established by the project director or authorized designate.


A.11 Questions of a Sensitive Nature

The questions included on the data collection instruments for this study do not involve sensitive topics. No personal information is requested.


A.12 Estimates of Respondent Burden

Exhibit 2 presents our estimate of the reporting burden for respondents to this collection. Time estimates are based on experience with similar instruments in other studies of comparable organizations.


Exhibit 2: Estimates of Respondent Burden

Regional Center Staff Site Visit Interview Protocol (Regional Center Directors):
  16 responses x 1 round x 3 hours = 48 hours; 48 hours x $57.19/hour(a) = $2,745

Content Center Staff Site Visit Interview Protocol (Content Center Directors):
  5 responses x 1 round x 3 hours = 15 hours; 15 hours x $57.19/hour(a) = $858

Project Inventory Form (Regional and Content Center Directors):
  Summer 2007: 21 responses x 1 round x 16 hours = 336 hours ($19,216)
  Summer 2008: 21 responses x 1 round x 16 hours = 336 hours ($19,216)
  Summer 2009: 21 responses x 1 round x 16 hours = 336 hours ($19,216)
  Subtotal: 1,008 hours ($57,648)

Request for Materials for Expert Panel Review:
  Summer 2007: 168 responses(b) x 1 round x 4 hours(c) = 672 hours ($38,432)
  Summer 2008: 168 responses(b) x 1 round x 4 hours(c) = 672 hours ($38,432)
  Summer 2009: 168 responses(b) x 1 round x 4 hours(c) = 672 hours ($38,432)
  Subtotal: 2,016 hours ($115,295)

Total: 3,087 hours ($176,546)

Notes:

  a. Assumed salary of “Senior Executive Service” of $109,808 annually for Center Directors.

  b. Assumes 8 projects * 21 Centers = 168 responses.

  c. This calculation covers the time spent filling out the cover sheet and collecting all materials.
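The burden figures in Exhibit 2 follow directly from responses x rounds x hours per response, costed at the assumed $57.19 hourly wage. A short sketch (instrument labels abbreviated) reproduces the totals:

```python
WAGE = 57.19  # assumed Senior Executive Service hourly rate (Exhibit 2, note a)

# (responses, rounds, hours per response) for each instrument
rows = {
    "Regional Center interviews": (16, 1, 3),
    "Content Center interviews": (5, 1, 3),
    "Project Inventory (3 annual rounds)": (21, 3, 16),
    "Request for Materials (3 annual rounds)": (168, 3, 4),
}

total_hours = sum(n * r * h for n, r, h in rows.values())
total_cost = total_hours * WAGE
print(total_hours)        # 3087
print(round(total_cost))  # 176546
```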


A.13 Estimates of the Cost Burden to Respondents

There are no annualized capital/startup or ongoing operation and maintenance costs associated with collecting the information. In addition to respondents’ time, which is estimated in Exhibit 2, other direct monetary costs to respondents include the costs of duplicating and mailing copies of materials that are not available electronically. (Based on the pilot test, we anticipate that the vast majority of materials will be available electronically.)


A.14 Estimates of Annualized Government Costs

The total cost to the Federal government for the National Comprehensive Technical Assistance Centers Evaluation is $6,630,085: $887,228 in FY 2007, $1,962,373 in FY 2008, $1,827,046 in FY 2009, and $1,953,439 in FY 2010. Of that total, approximately $486,987 will be used for the data collection activities for which clearance is currently being requested.


A.15 Changes in Hour Burden

A program change of 1,071 burden hours is reflected since this is a new collection.


A.16 Time Schedule, Publication, and Analysis Plan

Time Schedule and Publication of Reports


The schedule shown below in Exhibit 3 displays the sequence of activities required to conduct these information collection activities and includes key dates for activities related to instrument design, data collection, analysis, and reporting.


Exhibit 3
Time Schedule

  • Instrument Design (Regional and Content Center Staff Site Visit Interview Protocols, Project Inventory Form, Request for Materials for Expert Panel Review)2: Fall 2006

  • Site visits with Centers and training on data collection forms: Spring 2007

  • Instrument Design (surveys of SEA staff and Regional Center staff)3: Winter/Spring 2007

  • Sampling and Analysis Plan (description of the sampling plan for documents and services on the inventory that will be selected for review by independent panels; plans for survey sampling and analysis): Winter 2007

  • First meeting of Review Panel: Fall 2007

  • First survey of SEA and Regional Center staff3: Fall/Winter 2007-08

  • First Report: Spring 2008

  • Second meeting of Review Panel: Fall 2008

  • Second survey of SEA and Regional Center staff3: Fall/Winter 2008-09

  • Second Report: Spring 2009

  • Third meeting of Review Panel: Fall 2009

  • Third survey of SEA and Regional Center staff3: Fall/Winter 2009-10

  • Final Report: Summer 2010


Analysis of Data Collected Through Center Staff Site Visit Interview Protocols

The data collected in interviews with center staff will address three of the evaluation’s research questions:


  • What are the objectives of each Comprehensive Center?


  • What products and services do they provide?


  • How do center clients (SEAs or Regional Centers) define their needs and priorities?


Immediately following each site visit, the site visit team will write up its interview notes, combining interview data from multiple respondents into one record for each Regional and each Content Center, following the format of the interview protocol. Senior analysts will then review each center write-up, coding the interview data for evidence of particular characteristics of Center goals or operations.


The evaluation will produce a cross-site report that will address both the generalizations that hold true across all or most Centers and the ways in which the Centers vary. Text tables in the report will display variation in qualitative dimensions such as the priorities addressed, how those priorities are identified (e.g., by formal prospective needs assessment or by responding to requests), the offices within SEAs with which each Regional Center works and the Regional Centers with which each Content Center works, and substantive areas of focus for products and services. Quantitative analysis will provide tables of descriptive statistics on matters such as the size of Center staff and numbers of staff with various backgrounds.


This analysis will also generate narrative description that will illustrate the centers’ goals, structure and operations. These Center-specific narratives will illustrate how a Center prioritizes and organizes its work.


The cross-site analysis will present findings that address: (1) the centers’ missions as they have evolved to date; (2) the number and types of products and services offered by the centers; (3) how the centers are structured (e.g., whether the RCCs organize their work by state, by functional specialization, or a combination of the two); (4) how centers operate (e.g., the mix of response to request vs. independent initiation of products or services).



Analysis of Data Collected Through the Project Inventory Forms

The primary purpose of the project inventory forms will be to generate a sampling frame that can be used to select a sample of projects to be reviewed and rated by expert panels (see section B.1 for a discussion of the evaluation’s sampling approach).

Analysts will tabulate the number of ongoing and completed projects within each major topic area for each center, as well as across centers. Drawing on interview data, analysts will identify those topic areas and projects where each CTAC has invested the largest proportion of its staff time. The analysis will summarize data collected during site visits and draw on these tabulations to describe the number and types of products and services offered by the centers.


Analysis of Data Collected Through the Request for Materials for Expert Review Panels

Centers will be requested to provide materials (e.g., documents produced, conference proceedings, policy briefs, etc.) for each of the projects selected for panel review. Panels of independent experts will review the materials provided by each Center to rate the quality and relevance of each selected project. Each sampled project will be reviewed by three experts to assess quality and by three additional panel members to assess relevance. Reviewers will rate quality and relevance using rubrics developed by the evaluation team in consultation with the Technical Work Group and IES staff.


Results of the panel ratings will be reported for each Center in two metrics – the percentage of projects rated as “high” quality or “high” relevance and mean quality and relevance ratings across projects reviewed for each Center (see Exhibit 4).


Given that projects reviewed will be selected using a combination of Center nominations and stratified random sampling (detailed in Section B.1), the resulting Center-level ratings may represent an upper bound measurement of the percentage of projects for each CTAC that are of high quality and of high relevance. Comparison of average ratings for nominated projects and randomly selected projects will provide an indication of the extent of bias introduced by including projects nominated by Center staff in the review process.


Ratings for nominated and randomly selected projects will be aggregated to estimate overall quality and relevance for each Center. To arrive at these aggregated estimates we propose to weight ratings for nominated and randomly selected projects based on the proportion of the level of effort they represent for each Center.


In addition to Center-level results, ratings for each Center will be aggregated to estimate the percentage of projects that are of high quality and the percentage that are of high relevance for the overall Comprehensive Technical Assistance Center system. The percentage of projects rated as high quality and the percentage rated as having high relevance, at both the Center level and for the overall system, are consistent with GPRA reporting requirements. We propose that the ratings for each Center be weighted in proportion to its annual funding to arrive at an estimate of the overall quality and relevance of the work of the Centers.
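As an illustration of the effort-weighted aggregation described above (the shares and ratings below are hypothetical, and the final weighting method will be determined with the Technical Work Group), a center-level estimate might be computed as:

```python
def weighted_high_share(groups):
    """Combine '% rated high' figures across project groups, weighting
    each group by the share of the center's level of effort it covers.

    groups: list of (share_of_effort, pct_rated_high) pairs.
    Illustrative sketch only, not the evaluation's final method.
    """
    total = sum(share for share, _ in groups)
    return sum(share * pct for share, pct in groups) / total

# Hypothetical center: nominated projects cover 30% of effort with 80%
# rated high quality; randomly selected projects cover 70% with 60%.
estimate = weighted_high_share([(0.3, 80.0), (0.7, 60.0)])
print(round(estimate, 1))  # 66.0
```

The same calculation could be repeated at the system level with each Center's annual funding as the weight.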


The evaluators will consult with the Technical Work Group and IES Staff regarding the best method for developing aggregate estimates for each Center and for the CTAC system.


Exhibit 4. Analysis of Quality

For each Center (Center A through Center U), the table will report:

  • Number of overall projects

  • Projects nominated by Center: number, proportion of level of effort, % rated high quality, mean quality rating

  • Projects selected randomly: number, proportion of level of effort, % rated high quality, mean quality rating

  • Overall: number, % rated high quality, mean quality rating

The Total row will report only the overall columns; the nominated and randomly selected breakdowns are not applicable (NA) at the system level.

Exhibit 5. Analysis of Relevance

For each Center (Center A through Center U), the table will report:

  • Number of overall projects

  • Projects nominated by Center: number, proportion of level of effort, % rated high relevance, mean relevance rating

  • Projects selected randomly: number, proportion of level of effort, % rated high relevance, mean relevance rating

  • Overall: number, % rated high relevance, mean relevance rating

As in Exhibit 4, the Total row will report only the overall columns; the nominated and randomly selected breakdowns are not applicable (NA) at the system level.


A.17 Display of Expiration Date for OMB Approval

The Institute of Education Sciences is not requesting a waiver for the display of the OMB approval number and expiration date on the data collection instruments. All data collection instruments will display the expiration date for OMB approval.


A.18 Exceptions to Certification Statement

This submission does not require an exception to the certification statement for Paperwork Reduction Act submissions (5 CFR 1320.9).

1 Evaluation plans call for a second OMB Clearance Package to be submitted in July 2007 that will include a client survey form to gain information from SEA and Regional Center staff about the usefulness of the products and services provided by the Centers.

2 These are the data collection instruments for which we are currently seeking OMB clearance.

3 These are data collection instruments for which we plan to seek OMB clearance in July 2007.
