National Evaluation of the Technical Assistance and Dissemination (TA&D) Program: Grantee Questionnaire/Interview and State Survey Data Collection

OMB: 1850-0887


Evaluation of the IDEA Technical Assistance and Dissemination Program



Statement for Paperwork Reduction Act Submission


PART A: Justification



Contract ED-04-CO-0059/0032







June 28, 2011






Prepared for

Institute of Education Sciences

U.S. Department of Education


Prepared by

Westat



Part A: Justification

This package requests approval for a data collection for the National Evaluation of the Technical Assistance and Dissemination (TA&D) Program. Data collection will focus on gathering relevant information on the Office of Special Education Programs (OSEP)-funded TA&D Program from program grantees and from state agency staff. A separate package for a subsequent data collection, which will be shaped in part by findings from the first collection, will be submitted for review at a later date.

Introduction


Part D of the Individuals with Disabilities Education Act (IDEA) specifies that the TA&D Program will provide technical assistance, support model demonstration projects, disseminate useful information, and implement activities that are supported by scientifically based research (IDEA 2004, P.L. 108-446 Part D Section 663, 118 Stat. 2781). While the federal government has been funding projects that provide technical assistance related to the education of individuals with disabilities for four decades, the TA&D Program assumed its current structure with the 1997 reauthorization of IDEA. The current TA&D Program (funded at $49.5 million in FY 2010) awards grants in nearly 20 subprogram areas, with grants ranging in size from approximately $65,000 per year to approximately $2.8 million per year. Program grantees are located throughout the U.S., and recipients include institutions of higher education, for-profit organizations, and private nonprofit institutes and organizations.


The IDEA TA&D Program is based on the assumption that outcomes can be improved for children with disabilities when a knowledge base is disseminated to practitioners (and families) through technical assistance. In turn, the Program awards grants to fund a network of organizations staffed by skilled technical assistance providers. Exhibit A-1 depicts the theory of action that describes how the TA&D Program should work. It begins with the assumption that useful evidence-based knowledge exists. This knowledge is drawn on, and added to, by TA&D Program grantees, who determine the best methods for translating the knowledge into forms that can be used by practitioners. For the sake of efficiency, and to ensure that the knowledge reaches the broadest target audience of practitioners, the grantees work through other entities that have direct relationships with practitioners. These entities include state education agencies (SEAs) and Part C lead agencies, institutions of higher education (IHE) faculty and researchers, other relevant stakeholders such as other child-serving agencies, and, in some cases, families and children.


As the diagram illustrates, the focus of the technical assistance that grantees provide is informed by both the knowledge base and the need for technical assistance expressed by the entities served. OSEP has determined that the most powerful course for change is the provision of technical assistance to SEAs and Part C lead agencies, which builds state capacity and enables the state agencies to better support local education agencies and local Part C organizations. This technical assistance ultimately results in changes in local policy and local practice that have a positive impact on children. In sum, the theory of action underpinning the TA&D Program is that OSEP-supported grantees can translate evidence-based knowledge into forms that can be disseminated, through technical assistance activities, to practitioners whose changed practice will lead to improved outcomes for children with disabilities.


[Exhibit A-1. Theory of action for the TA&D Program (diagram)]


While the model depicted in Exhibit A-1 focuses on the TA&D Program, it should be noted that the TA&D Program is part of a larger system of technical assistance funded by OSEP called the TA&D Network. In recent years, OSEP has evolved its TA&D efforts toward this concept of a network of centers that provide complementary services. The TA&D Network is a group of 46 centers (plus the center responsible for coordination), funded from across OSEP Part D programs and organized into 13 categories. These centers are intended to coordinate their efforts so that states and other recipients receive appropriate assistance, without duplication of effort, in working toward the goal of improved outcomes for children with disabilities and their families. Of the current 86 TA&D Program grantees, 27 are also members of the TA&D Network; the remaining TA&D Program grantees are not. While an evaluation of network functioning is an important topic, it is not the focus of this evaluation.


All funded projects provide technical assistance and dissemination services with the broad goals of (1) ensuring that Parts B and C of IDEA are implemented effectively and (2) improving results for children with disabilities. However, projects vary in structural and substantive ways. Most broadly, the currently active grantees can be described as belonging to one of five groups: (1) national centers, which focus on a particular topic area; (2) Regional Resource Centers, which serve and support state needs; (3) the PEPNet program, whose mission is to improve transition services and educational access for students who are deaf or hard of hearing; (4) model demonstration centers, whose goal is to examine a specific practice in a limited number of sites; and (5) state deaf-blind centers, which serve students in this population within their state. Grantees also differ in the way in which they provide their services, including the topic areas of focus; the type of TA provided; intended outcomes; methods of service delivery; the level/intensity of services and activities provided; and the population and number of customers served.


Overview of the Evaluation


The National Evaluation of the TA&D Program is being conducted by the National Center for Education Evaluation (NCEE) in the Institute of Education Sciences (IES) at the U.S. Department of Education (ED). While technical assistance is often hailed as a critically important area for the effective implementation of federal policy, no independent data are currently available on the role that the TA&D Program plays in supporting state agencies in their implementation of IDEA. This evaluation will provide important information on the needs that state agencies have for technical assistance in special education, the products and services provided by the TA&D Program, and the extent to which these are meeting the needs of the state agencies for the particular topic areas on which they focus. This information will set the stage for a more focused examination of how technical assistance might relate to improved state capacity, changes in local policies and practices, and, ultimately, improved child outcomes.


This data collection will focus on gathering specific information on the TA&D Program from program grantees and from officials at SEAs and Part C lead agencies. A TA&D Program grantee questionnaire/interview will yield detailed descriptive information about TA&D Program grantees, including the topic areas, practices, and outcomes on which they focus, as well as the technical assistance products and services they provide and to whom they provide them. State surveys will provide information concerning the needs that SEAs and Part C lead agencies have for technical assistance to support the implementation of IDEA and the improvement of child outcomes, the technical assistance services and products that have been received or accessed at the state level from OSEP TA&D Program centers, and satisfaction with those services and products.


A subsequent data collection for the evaluation is planned to focus on implementation of practices at the local level following the state agency's receipt of technical assistance from TA&D Program-supported centers. The design of the second data collection will be informed by the findings of the first, particularly those findings that provide information about improving state capacity, changing local policies and practices, and, ultimately, improving child outcomes. These data collection plans will be submitted for public comment and OMB review under a separate package at a later date.



A.1 Explanation of Circumstances That Make Collection of Data Necessary

The National Evaluation of the Technical Assistance and Dissemination (TA&D) Program is part of the National Assessment of the Individuals with Disabilities Education Improvement Act of 2004 (hereafter referred to as the National Assessment) being conducted by NCEE. Section 664(b) of IDEA 2004 requires the National Assessment to evaluate "the effectiveness of schools, local educational agencies, States, other recipients of assistance under this title, and the Secretary in achieving the purposes of this title" by: (i) improving the academic achievement of children with disabilities and their performance on regular statewide assessments as compared to nondisabled children, and the performance of children with disabilities on alternate assessments; (ii) improving the participation of children with disabilities in the general education curriculum; (iii) improving the transitions of children with disabilities at natural transition points; (iv) placing and serving children with disabilities, including minority children, in the least restrictive environment appropriate; (v) preventing children with disabilities, especially children with emotional disturbances and specific learning disabilities, from dropping out of school; and (vi) addressing the reading and literacy needs of children with disabilities. To date, NCEE has awarded five other contracts to support studies that are part of the National Assessment. Of the National Assessment studies, this evaluation is the only one focused on the role of the TA&D Program and its relation to implementation of IDEA.

The data collected for the National Evaluation of the TA&D Program will be used by ED to report to Congress as part of the National Assessment. Failure to collect these data may result in ED being unable to adequately report to Congress on the National Assessment. Additionally, if this evaluation were not completed, ED and Congress would not have an accurate understanding of the relationship between the TA&D Program and early intervention/special education policy, practice, and outcomes at the state and local levels. The information from the evaluation will assist Congress in the reauthorization of IDEA and in further improving early intervention and special education services, with the ultimate goal of improving outcomes for children with disabilities.


This data collection will provide unique, detailed data and information on state agency needs for technical assistance to implement IDEA 2004 and the extent to which needs are addressed; the products and services provided by the TA&D Program and intended outcomes; and the technical assistance products and services that state agencies receive from TA&D Program grantees and how satisfied they are with these products and services. These data are not currently available from other sources but are necessary in order to accurately understand and improve upon the role that the TA&D Program plays in supporting state agencies in their implementation of IDEA.



A.2 How the Information Will Be Collected, by Whom, and For What Purpose

As noted earlier, this data collection has two components. First, we will administer a questionnaire followed by a semi-structured interview with the TA&D Program grantees to better understand the activities that are being supported through the TA&D Program. Second, we will administer a state survey to obtain data on states' needs for technical assistance to implement IDEA 2004 effectively and improve outcomes for children with disabilities, and the extent to which these needs are being addressed by the technical assistance they receive. The state survey will have two sections; the first will focus more broadly on states' needs across a range of topic areas, while the second will ask more specific questions about a small number of "focal" topic areas. These two components are described in greater detail in this section, and the research questions being addressed are listed below.


Research questions and sub-questions:


  1. What technical assistance do state agencies (i.e., state educational agencies and Part C lead agencies) need to implement IDEA 2004 effectively and improve outcomes for children with disabilities?

    1. In what topic areas do state agencies identify a need for TA and for what topic areas is this need the greatest?

    2. Within focal topic areas, what are state agencies’ specific types of needs for TA, and for which of these is the need greatest?

  2. To what extent do state agencies receive TA, in areas of need, to implement IDEA 2004 effectively and improve outcomes for children with disabilities?

    1. To what extent are state agencies’ needs for TA addressed?

    2. In which topic areas do state agencies’ needs for TA go unaddressed because TA was not received?

  3. What are the topic areas addressed by TA&D grantees and on which outcomes in particular are grantees focused?

    1. On what topic areas do TA&D grantees provide products and services?

    2. To what extent do the topic areas of TA&D grantee focus align with areas of TA stipulated in the law?

    3. Which outcomes do TA&D grantees aim to affect with the TA they provide?

  4. What technical assistance products and services do TA&D program grantees provide?

    1. What TA products and services do TA&D grantees provide?

    2. For what TA&D grantee products and services does demand for products and services exceed available resources?

    3. To whom do TA&D grantees provide TA products and services?

    4. To whom do TA&D grantees provide their most extensive TA products and services?

  5. What technical assistance products and services do state agencies receive in order to help meet their needs to implement IDEA 2004 effectively and improve outcomes for children with disabilities?

    1. On what topic areas do state agencies receive TA?

    2. For focal topic areas, from which TA&D grantees do state agencies receive TA products and services?

    3. For focal topic areas, what type and level of TA products and services do state agencies receive from TA&D grantees?

    4. For focal topic areas, from what other sources do state agencies receive TA?

  6. For focal topic areas, to what extent are state agencies satisfied with the products and services received from TA&D grantees?

    1. For TA products and services received from TA&D grantees, how does satisfaction vary by focal topic area?

    2. For TA products and services received from TA&D grantees, what factors drive state agency satisfaction?

    3. How does state agency satisfaction with TA products and services vary by characteristics of the TA provider and the provider-state agency relationship?


Data Collection Activities and Instruments


Data collection instruments for the TA&D Program Grantee Questionnaire/Interview and the State surveys are included in Appendices A through C.


TA&D Program Grantee Questionnaire/Interview


As discussed above, the TA&D Program Grantee Questionnaire/Interview will yield detailed descriptive information on TA&D Program grantees' activities concerning: (1) the topic areas addressed by TA&D Program grantees and the outcomes in particular on which grantees are focused and (2) the technical assistance products and services provided by the TA&D Program grantees and to whom they provide them. The specific focus will be on the activities that are designed to help state agencies in their efforts to support local programs in implementing IDEA and, more specifically, improving child outcomes.


To obtain systematic data from across grantees and address the specific research questions and sub-questions, we will administer a short questionnaire to the TA&D Program grantees. This questionnaire will ask about the topic areas that grantees cover with their technical assistance and dissemination activities; the customers that they serve or intend to serve, including those customers to whom they allocate the most time and resources; and the products and services that they provide, including the demand for those products and services. Subsequent to the administration of the questionnaire, two semi-structured interviews will be completed with each project director. The interviews will gather more in-depth information about topics covered by the questionnaire and will include specific questions about the outcomes grantees aim to affect with their technical assistance products and services, the kinds of technical assistance activities they provided to those customers, and the policies, procedures, and outcomes that are expected to change as a result of these activities. The semi-structured interviews will permit some level of flexibility with regard to the diverse nature of the TA&D Program grantee work and ensure the collection of information key to later evaluating relationships between receipt of technical assistance and local-level implementation of practices.


Several extant data sources provide information about the activities of the TA&D Program grantees. Prior to administering the TA&D Program grantee questionnaire/interview, data from these extant data sources will be reviewed to better focus the interviews and to reduce burden on the TA&D Program grantees. While these extant data sources provide relevant information about TA&D Program grantees, they do not provide systematic data that can be used to address the research questions and sub-questions. The extant data sources to be reviewed include:


  • Requests for Applications and cooperative agreements,

  • Grantee midstream (3+2) briefing books and reviews,

  • Grantee continuation reports,

  • State Part B and Part C Annual Performance Reports for FFY 2008 and FFY 2009,

  • Grantee websites, and

  • TA&D Network Survey, which was developed by the Technical Assistance Coordination Center (TACC), the National Dissemination Center for Children with Disabilities (NDC), and the National Parent Technical Assistance Center (NPTAC) to identify and determine collaborative relationships among network members.

State Surveys


As noted above, the state surveys will provide information concerning: (1) the needs that SEAs and Part C lead agencies have for technical assistance to support the implementation of IDEA and support improvement of child outcomes and (2) the technical assistance services and products that have been received by selected staff at the state level from OSEP TA&D Program centers and their satisfaction with those services and products.


There will be two state surveys: (1) a Part B State Education Agency Survey and (2) a Part C State Lead Agency Survey. Part B of IDEA serves children and youth between the ages of 3 and 21, and Part C serves infants and toddlers from birth to age 3. Each survey will be web-based, with the same questions asked across the two instruments.


Each survey will consist of two sections. Section I will broadly ask about state agencies' needs for technical assistance. Questions will cover the areas in which the state agency had a need for technical assistance, the areas where the state agency's needs were greatest, whether technical assistance was received in these areas, and the degree to which the state agency's needs for technical assistance were met. In this section, we will ask about a wide range of topic areas. Section II of the surveys will consist of modules that ask specific questions about different "focal topic" areas. The Part B survey contains 11 modules and the Part C survey contains 5 modules. The same set of questions will be asked in each module. Module questions will ask about state agency needs for specific types of technical assistance (e.g., support on SPP/APR indicators; training; capacity building); whether each type of technical assistance was received, how well it met the state agency's needs, and the reasons any needs were unmet; the TA&D Centers from which the state agency accessed technical assistance, the methods by which technical assistance was accessed, the nature of the state agency's relationship with the TA&D Center, and satisfaction with the technical assistance received; other sources from which the state agency received technical assistance; and whether the state agency provided technical assistance on that focal topic area and to whom.


To develop the list of topic areas related to state agencies' needs for technical assistance for Section I of the survey, we reviewed various sources, including IDEA 2004, staff assignments/staff directories on state special education websites, grantee project descriptions, and the Part B and Part C State Performance Plan/Annual Performance Report indicators. This review resulted in a comprehensive list of topic areas, which was then further refined, resulting in 33 topic areas for Part B and 18 topic areas for Part C. Having a module on each of these topic areas for Section II would be a considerable burden to the state agency respondents. Therefore, we decided to include modules only on those topic areas that met the following inclusion criteria: (1) the topic area was mentioned in Sec. 663 and (2) the topic area was a focus of a TA&D Program grantee that serves states. As noted above, this resulted in 11 focal topic areas for Part B and 5 focal topic areas for Part C being included as modules for Section II.


The Part B and Part C surveys are designed so that the first section of the survey will be completed by the state Part B Director or state Part C Coordinator, respectively. To ensure that the modules in Section II are completed by the most appropriate state agency staff members, the Part B Director or Part C Coordinator will be asked at the end of Section I to identify, for each focal topic area, the staff member most responsible for providing or overseeing technical assistance to local districts/programs. If no one is directly responsible, they will be asked to identify the staff member most knowledgeable about that focal topic area. We expect that, in some instances, the Part B Directors or Part C Coordinators will name themselves as most responsible or most knowledgeable and will complete that Section II module themselves.

Respondents


We will collect data to address the six research questions listed above from three groups of respondents: (1) project directors of the TA&D Program grants; (2) state Part B Directors and state Part C Coordinators; and (3) key supporting staff from the state agencies.


Project Directors of the TA&D Program Grants


Project directors of selected TA&D Program grants will receive a questionnaire and a follow-up interview via telephone. Centers that are active as of August 1, 2011 (including those under a no-cost extension) will be included, with the exception of two groups of centers: state deaf-blind project grantees and model demonstration grantees. Both groups of grantees are under evaluation through other data collection efforts, and we do not want to duplicate effort by obtaining similar information from them at the same time. In addition, these grantees are substantively different from the other grantees in that they do not focus primarily on increasing state or system capacity. Based on the currently active grantees, 27 TA&D Program grantee project directors will be included. Project directors will be invited to incorporate other staff in the interview process.


State Part B Directors and State Part C Coordinators


Two state surveys will be administered: (1) a Part B State Education Agency Survey and (2) a Part C State Lead Agency Survey. The surveys will be sent to the 50 states, the District of Columbia, Puerto Rico, the Virgin Islands, American Samoa, the Commonwealth of the Northern Mariana Islands, and Guam. Section I of the Part B survey will be completed by the state Part B Director and Section I of the Part C survey will be completed by the state Part C Coordinator. It is also expected that, in some instances, the state Part B Directors or state Part C Coordinators will complete some of the Section II modules.


Key State Agency Staff


This third group of respondents will be key state agency staff who will complete the Section II modules of the Part B and Part C surveys. These individuals will be identified by the state Part B Directors and state Part C Coordinators. By targeting those individuals who have had the most responsibility for provision of technical assistance for each of the focal topic areas, we both reduce burden on the state director/coordinator and obtain information from the most knowledgeable individuals at the state level.



A.3 Use of Improved Information Technology to Reduce Burden

The short questionnaire for TA&D Program grantees will be conducted using a fillable PDF. This process is highly similar to the experience of a web-based survey but, for a short survey such as the grantee questionnaire, is a more cost-effective way to collect information. TA&D Program grantees will also immediately have a copy of their responses for reference during the follow-up phone call. The semi-structured interviews of TA&D Program grantees will take place via telephone to eliminate any requirements for participant travel.


We will administer the state agency surveys via the web so they are easily accessible to respondents. Web-based administration reduces burden through complex skip patterns that are invisible to respondents, as well as information prefilled from responses to previous items when appropriate. In addition, web-based surveys allow multiple respondents to easily complete the various modules of the survey. Because respondents enter data as they complete the survey, no additional data entry costs are incurred, data quality is improved, processing costs are reduced, and data collection is faster. Paper and phone survey options will be offered to respondents as part of the nonresponse follow-up effort and will be available to any respondent who prefers a paper mode.



A.4 Efforts to Identify and Avoid Duplication

The detailed information to be collected through these instruments does not currently exist within ED or other agencies in a systematic format. Any relevant data that do exist will be obtained from the Office of Special Education Programs and will be used in addition to the data collected through the instruments. For example, midstream evaluation reports written by grantees, State Performance Plans and Annual Performance Reports, and TA&D grantee websites will all be used to supplement the data collected via the TA&D Program grantee questionnaire/interview and state surveys, as appropriate. However, these sources do not provide the systematic data required to completely address the research questions. It should be noted that two groups of centers will not be included as part of the TA&D Program grantee questionnaire/interviews: the state deaf-blind project grantees and model demonstration grantees. Both groups of grantees are under evaluation through other data collection efforts, and we do not want to duplicate effort by obtaining information from them at the same time. Also, as previously described, this evaluation covers only those centers that are part of the TA&D Program; it does not cover the other centers that are part of the TA&D Network, as doing so would represent a duplication of effort. For example, the centers supported by the IDEA Personnel Development Program that are part of the TA&D Network are not included in this evaluation; these centers are being evaluated by IES as part of the Evaluation of the IDEA Personnel Development Program, which is one of the other National Assessment of IDEA studies authorized under Section 664 of IDEA.



A.5 Efforts to Minimize Burden on Small Business or Other Entities

No small businesses will be involved as respondents. Every effort has been and will be made to minimize the burden on TA&D grantees and state agency staff. For the TA&D grantee questionnaire/interview, in advance of the interview, we will obtain and review relevant extant data such as the Requests for Applications and their cooperative agreements, the midstream briefing books (for relevant centers), and their continuation reports. These extant data sources will give us a better understanding of how each of these unique centers operates, which will allow us to better focus our efforts during the interviews. In addition, conducting two shorter interviews will reduce fatigue and further reduce burden on the TA&D grantees. As discussed in section A.3, the grantee questionnaire will be administered via fillable PDF, a process highly similar to the experience of a web-based survey. Also as described in section A.3, we will administer the state agency surveys via the web, so they will be easily accessible to respondents. Burden will be reduced through the use of information prefilled from responses to previous items when appropriate. We are also attempting to reduce burden on the state Part B Directors and state Part C Coordinators by having the individuals most responsible for providing technical assistance on specific focal topic areas serve as the respondents for the Section II modules, rather than having the state Part B Director or Part C Coordinator complete the entire survey.



A.6 Consequences of Less-Frequent Data Collection

The data collection will occur only once. If the data collection is not completed, OMB, administrators, policymakers, and the public will not know whether the TA&D program is performing effectively.


A.7 Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations

There are no special circumstances associated with this data collection.



A.8 Federal Register Comments and Persons Consulted Outside the Agency

A 60-day notice and a 30-day notice were published in the Federal Register for public comment.


The data collection instruments were developed by the evaluation research team led by Westat, with assistance from senior consultants Debra Price-Ellingstad (Minnesota Department of Education) and Sharon Walsh (Walsh-Taylor Inc.), under the direction of the IES COR. OSEP staff reviewed portions of the data collection instruments. The TA&D Program grantee questionnaire/interview was tested with five project directors of other technical assistance and dissemination centers. The Part B and Part C state surveys were tested with five former state Part B Directors and three former state Part C Coordinators to test the survey items and assess potential burden. These procedures informed our time estimates, and the comments from the pilot test respondents were addressed in the revised instruments.


A Technical Work Group (TWG) met in February 2011 to discuss the evaluation design and the data collection activities and instruments. Members of the TWG include:


  • Elaine Bonner-Tompkins, Montgomery County Office of Legislative Oversight

  • Sandy Christenson, University of Minnesota

  • Larry Gloeckler, Special Education Institute at the International Center for Leadership in Education

  • Jim Hamilton (retired)

  • John Killoran, Western Oregon University

  • Robin McWilliam, Siskin Children’s Institute

  • Stephen Smith, University of Florida


A.9 Payments to Respondents

There will be no payments made to TA&D Program grantee or state respondents.



A.10 Assurance of Confidentiality

Other than the names and contact information for the respondents, which is information typically already available in the public domain (e.g., on state and district websites), no data collected will include information that could identify an individual respondent. In reporting, no TA&D Program grantee staff or state respondent will be named. No names or contact information will be released.


An explicit statement regarding confidentiality will be communicated to all respondents. The following statement will be included in the cover letter of the TA&D Program grantee questionnaire:


Every effort will be made to protect the confidentiality of data collected through this study, while balancing the evaluation’s mandate to report results about the TA&D Program. Reports on this evaluation will not name individuals and will not include any information that could be used to identify individual respondents. We will not provide information that identifies you to anyone outside the study team, except as required by law.


With regard to the state survey, state-level responses may be reported, but only for broadly descriptive variables that are reported in Section I of the state survey. Specific state responses related to need for technical assistance, receipt of technical assistance, and satisfaction with technical assistance will not be tied to specific states or state-level respondents. In addition, ratings of satisfaction will not be reported by individual TA&D Program grantee. Therefore, a state respondent may provide information about their satisfaction with specific centers without concern that they will be identifiable. The following statement will be included in the cover letter of the Part B and Part C surveys:


All information that would permit identification of the individual respondents to this survey will be held in strict confidence, will be used only by persons engaged in and for the purposes of the survey, and will not be disclosed or released to others for any purpose except as required by law.


ED, in the conduct of the study, will follow procedures for ensuring and maintaining participant privacy, consistent with the Education Sciences Reform Act of 2002. Title I, Part E, Section 183 of this Act requires "All collection, maintenance, use, and wide dissemination of data by the Institute" to "conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h)." These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment. Respondents will be assured that confidentiality will be maintained, except as required by law. Specific steps to guarantee confidentiality include the following:


  • Identifying information about respondents (e.g., respondent name, email address, and telephone number) will at no point be stored in the same file as the survey data. Through the web-based system, those data will be automatically extracted into a separate file and will be password protected. A unique identification number for each respondent will be used for building raw data and analysis files.

  • In emails, participants will be referred to by unique identification number. Files containing more information will be password protected.

  • A fax machine used to send or receive documents that contain confidential information will be kept in a locked field room, accessible only to study team members.

  • Confidential materials will be printed on a printer located in a limited access field room. When printing documents that contain confidential information from shared network printers, authorized study staff will be present and retrieve the documents as soon as printing is complete.

  • In public reports, findings will be presented in aggregate by type of respondent (e.g., SEA Part B personnel) and by focal topic area (e.g., behavior, early childhood transitions). No reports will identify individual respondents.

  • Access to the sample files will be limited to authorized study staff only; no others will be authorized such access.

  • All members of the study team will be briefed regarding confidentiality of the data and will sign a statement with the following information:

    • I will not reveal the name, address, or other identifying information about any respondent to any person other than those directly connected to the study.

    • I will not reveal the contents or substance of the responses of any identifiable respondent or informant to any person other than a member of the project staff, except for a purpose authorized by the project director or authorized designate.

    • I will not contact any respondent or informant except as authorized by a member of the project staff.

    • I will not release a dataset or findings from this project (including for unrestricted public use or for other, unrestricted, uses) except in accordance with policies and procedures established by the project director or authorized designate.

  • A control system will be in place, beginning at sample selection, to monitor the status and whereabouts of all data collection instruments during transfer, processing, coding, and data entry. This includes sign-in/sign-out sheets and the hand-carrying of documents by authorized project staff only.

  • All data will be stored in secure areas accessible only to authorized staff members. Computer-generated output containing identifiable information will be maintained under the same conditions.

  • When any hard copies containing confidential information are no longer needed, they will be shredded.



A.11 Questions of a Sensitive Nature

The questions included on the data collection instruments for this study do not involve sensitive topics.



A.12 Estimates of Respondent Burden

In all, responses will be required one time from a maximum total of 1,035 respondents (27 TA&D Program grantees, 112 state Part B directors and Part C coordinators, and up to 896 state agency staff members). We estimate that it will take respondents between 20 and 210 minutes to complete each instrument, so total burden is 61,670 minutes or 1,028 hours (see Exhibit A-2 below for a breakdown of burden by respondent type).


Exhibit A-2. Estimates of Respondent Burden


| Respondent | Anticipated number completed (a) | Minutes per completion (b) | Burden in minutes (c) = a x b | Burden in hours (c/60) | Burden in dollars |
| --- | --- | --- | --- | --- | --- |
| TA&D Program grantee | 27 | 210 | 5,670 | 95 | $4,940 |
| State Part B directors and Part C coordinators | 112 | 20 | 2,240 | 37 | $1,924 |
| State agency staff members | 896 | 60 | 53,760 | 896 | $46,592 |
| Total burden | 1,035 | | 61,670 | 1,028 | $53,456 |

NOTE: Burden in dollars assumes an hourly rate of $52.
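To make the arithmetic behind Exhibit A-2 explicit, the sketch below recomputes each row. It is illustrative only; it simply assumes the exhibit's conventions (minutes divided by 60, rounded half up to whole hours) and the $52 hourly rate stated in the note.

```python
# Illustrative recomputation of Exhibit A-2 (assumed hourly rate: $52, per the note above).
HOURLY_RATE = 52

rows = [
    # (respondent group, anticipated completions (a), minutes per completion (b))
    ("TA&D Program grantee", 27, 210),
    ("State Part B directors and Part C coordinators", 112, 20),
    ("State agency staff members", 896, 60),
]

total_minutes = total_hours = 0
for group, completions, minutes_each in rows:
    burden_minutes = completions * minutes_each            # column (c) = a x b
    burden_hours = int(burden_minutes / 60 + 0.5)          # c/60, rounded half up as in the exhibit
    burden_dollars = burden_hours * HOURLY_RATE
    total_minutes += burden_minutes
    total_hours += burden_hours
    print(f"{group}: {burden_minutes:,} minutes, {burden_hours:,} hours, ${burden_dollars:,}")

print(f"Total burden: {total_minutes:,} minutes, {total_hours:,} hours, ${total_hours * HOURLY_RATE:,}")
```

Running this sketch reproduces the exhibit's row and total values (61,670 minutes, 1,028 hours, $53,456).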



A.13 Estimates of the Cost Burden to Respondents

There are no annualized capital/startup or ongoing operation and maintenance costs associated with collecting the information.



A.14 Estimates of Annualized Government Costs

The total cost to the federal government for the National Evaluation of the Technical Assistance and Dissemination Program for the design, completion, analysis, and reporting of the portion of the evaluation covered by this package is $1,842,453. Annualized costs are $284,855 for FFY 2010, $287,606 for FFY 2011, $1,161,391 for FFY 2012, and $108,601 for FFY 2013. The average annualized government cost for the design, completion, analysis, and reporting of the portion of the evaluation covered by this package is $460,613.


A.15 Changes in Hour Burden

This is a new collection. This submission reflects the hour burden for conducting TA&D Program grantee questionnaires/ interviews and state surveys. This submission reflects a program change of 1,028 hours.



A.16 Time Schedule, Publication, and Analysis Plan

Time Schedule


The schedule shown in Exhibit A-3 below displays the sequence of activities required to conduct the data collection, including key dates for instrument design, data collection, analysis, and reporting.


Exhibit A-3. Time Schedule for TA&D Program Grantee Questionnaire/Interview and State Survey

| Activities | Date |
| --- | --- |
| TA&D Program grantee questionnaire/interview data collection | September-November 2011 |
| State survey data collection | September-November 2011 |
| Analyze data | November 2011-February 2012 |
| Draft report | March 2012 |
| TWG meeting | May 2012 |
| Final report and restricted-use data file | December 2012 |


Publication


For the final report, we will follow the principles of the Federal Plain Language Action and Information Network and adhere to the requirements of the National Center for Education Statistics (NCES) Statistical Standards (2002), the IES Style Guide (2005), and other IES guidance and requirements for public reporting. The final report will address the research questions and sub-questions using information from both the state surveys and the TA&D Program grantee questionnaires/interviews. The report will start with an outline of highlights. Then, for each research question, the report will include a discussion of the context for understanding the findings, the data sources used and their limitations, the data collection methodology, the analyses conducted, and the findings. Appendices will provide more detailed information about, for example, the purpose of the evaluation and its design, the approaches to data collection, and survey response rates.


Analysis Plan


This section describes the anticipated response rate and analysis plans for the data collected through the TA&D Program grantee questionnaires/interviews and the state surveys.


As described in the respondent section, state surveys will be administered to all 50 states, the District of Columbia, Puerto Rico, the Virgin Islands, American Samoa, the Commonwealth of the Northern Mariana Islands, and Guam. No survey non-response is anticipated, for two reasons. First, Westat achieved 100 percent state response rates when collecting data from state agency respondents in the National Assessment Implementation Study (NAIS) (Bradley, Daley, Levin, O'Reilly, Parsad, et al., 2011) and the Study of State and Local Implementation and Impact of the Individuals with Disabilities Education Act (SLIIDEA) (Schiller, Fritts, Bobronnikov, Fiore, O'Reilly, et al., 2006). Westat staff have long-standing working relationships with state administrators that facilitate data collection efforts, and we will follow the same follow-up procedures used in the NAIS. Second, we hope that a letter from ED encouraging respondents to complete the survey and highlighting the importance of these data will also contribute to a high response rate. Therefore, we expect the survey responses will represent a census of the states.


The TA&D Program grantee questionnaire/interview will be administered to the 6 Regional Resource Centers, 4 PEPNet centers, and 17 national TA&D centers. We anticipate a 100 percent response rate among these respondents; because the TA&D Program is a federal grant program, under EDGAR, grantees technically are required to participate in studies such as this one in order to continue receiving funds (20 U.S.C. 1221e-3 and 3474). A letter from ED reminding respondents of this obligation, as well as presentations at relevant meetings, will also facilitate the expected high response rate.


Exhibit A-4 presents the items from each source of data that will be used to address each of the research questions and sub-questions for this data collection, and the level of analysis at which the question will be addressed.


Exhibit A-4. Research Questions and Corresponding Survey/Questionnaire/Interview Items


| Research question and sub-question | Tool | Items | Level of analysis |
| --- | --- | --- | --- |
| 1. What technical assistance do state agencies (i.e., state educational agencies and Part C lead agencies) need to implement IDEA 2004 effectively and improve outcomes for children with disabilities? | | | |
| 1a. In what topic areas do state agencies identify a need for TA and for what topic areas is this need the greatest? | State survey | I-1, I-2, I-3 | Across states |
| 1b. Within focal topic areas, what are state agencies' specific types of needs for TA, and for which of these is the need greatest? | State survey | II-1, II-2 | Within focal topic area |
| 2. To what extent do state agencies receive TA, in areas of need, to implement IDEA 2004 effectively and improve outcomes for children with disabilities? | | | |
| 2a. To what extent are state agencies' needs for TA addressed? | State survey | I-5 | Across states |
| 2b. In which topic areas do state agencies' needs for TA go unaddressed because TA was not received? | State survey | I-4, II-3, II-4 | Across states and within focal topic area |
| 3. What are the topic areas addressed by TA&D grantees and on which outcomes in particular are grantees focused? | | | |
| 3a. On what topic areas do TA&D grantees provide products and services? | Grantee questionnaire/interview | I-1, I-2, II-1 | Across grantees |
| 3b. To what extent do the topic areas of TA&D grantee focus align with areas of TA stipulated in the law? | Grantee questionnaire/interview; IDEA 2004 regulations | I-2, II-2 | Across grantees |
| 3c. Which outcomes do TA&D grantees aim to affect with the TA they provide? | Grantee questionnaire/interview | II-2, II-4 | Across grantees |
| 4. What technical assistance products and services do TA&D program grantees provide? | | | |
| 4a. What TA products and services do TA&D grantees provide? | Grantee questionnaire/interview | I-8, II-5, II-6, II-7, II-8, II-9, II-10 | Across grantees |
| 4b. For what TA&D grantee products and services does demand for products and services exceed available resources? | Grantee questionnaire/interview | II-12 | Across grantees |
| 4c. To whom do TA&D grantees provide TA products and services? | Grantee questionnaire/interview | I-3, I-4 | Across grantees |
| 4d. To whom do TA&D grantees provide their most extensive TA products and services? | Grantee questionnaire/interview | I-5, I-6, I-7 | Across grantees |
| 5. What technical assistance products and services do state agencies receive in order to help meet their needs to implement IDEA 2004 effectively and improve outcomes for children with disabilities? | | | |
| 5a. On what topic areas do state agencies receive TA? | State survey | I-4, II-3 | Across states and within focal topic area |
| 5b. For focal topic areas, from which TA&D grantees do state agencies receive TA products and services? | State survey | II-5 | Within focal topic area |
| 5c. For focal topic areas, what type and level of TA products and services do state agencies receive from TA&D grantees? | State survey | II-6, II-7, II-8 | Within focal topic area |
| 5d. For focal topic areas, from what other sources do state agencies receive TA? | State survey | II-11 | Within focal topic area |
| 6. For focal topic areas, to what extent are state agencies satisfied with the products and services received from TA&D grantees? | | | |
| 6a. For TA products and services received from TA&D grantees, how does satisfaction vary by focal topic area? | State survey | II-10 | Within and across focal topic areas |
| 6b. For TA products and services received from TA&D grantees, what factors are related to overall state agency satisfaction? | State survey | II-10 | Within and across focal topic areas |
| 6c. How does state agency satisfaction with TA products and services vary by characteristics of the provider-state agency relationship? | State survey | II-6, II-10 | Within and across focal topic areas |


In the sections below, we present more detailed information about the anticipated analyses for each of the three types of data obtained (closed-ended items, open-ended items, and scale items), along with examples for each of the three units of analysis (across states, within focal topic areas, and across grantees).


Although we do not anticipate survey non-response, we will treat item non-response and survey non-response in the same manner. First, we will indicate in our tables the number of missing responses. Additionally, we will decrease the denominator by the number of missing responses, so that the denominator represents the number of states responding to each item.
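As a minimal illustration of this denominator adjustment (assuming, hypothetically, that responses are stored in a pandas DataFrame with missing responses coded as missing values; the item and response labels below are invented for the example):

```python
import pandas as pd

# Hypothetical extract: one row per state, one column per survey item,
# with missing (non-)responses stored as None/NaN.
responses = pd.DataFrame({
    "I_4_assistive_technology": ["received_ongoing", "not_sought", None, "received_complete"],
    "I_4_autism": ["sought_not_received", None, None, "received_ongoing"],
})

for item, column in responses.items():
    n_missing = int(column.isna().sum())        # reported in the tables as missing responses
    n_responding = int(column.notna().sum())    # denominator excludes missing responses
    # value_counts(normalize=True) drops missing values by default, so percentages
    # are based only on the states that responded to the item.
    percentages = (column.value_counts(normalize=True) * 100).round(1)
    print(f"{item}: {n_responding} responding, {n_missing} missing")
    print(percentages)
```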


Analysis of closed-ended items

With the exception of research questions 6b and 6c, which are discussed below, all other questions will be addressed through the use of simple descriptive statistics such as means and percentages. This remains true regardless of question type (e.g., select one, check all that apply, ranking) and whether the unit of analysis is all states (e.g., 1a, 2a, 2b, and 5a) or the focal topic area (e.g., 1b, 2b, 5a, 5b, 5c, 5d, 6a). Similarly, the majority of the data obtained through the grantee questionnaire/interview will also be analyzed through simple frequencies and means. Below, we present three examples to illustrate the analyses for the three different units of analysis.


First, an illustrative example of a state-level analysis examining percentages is shown in Exhibit A-5. The data for this example will come from the survey item asking about the areas in which SEAs received TA during 2010-2011:


I-4. For each of the topic areas listed, did your SEA receive TA during 2010-2011?


Respondents indicate, for each topic area where they have listed a need for TA, one of the following options:


  • No, TA was not sought

  • No, TA was sought but not received

  • Yes, TA was received and is ongoing

  • Yes, TA was received and is complete


Because the data are a census rather than a sample, there is no need to calculate standard errors or confidence intervals, which are statistical concepts that apply to sample data. While it is common to present the standard error of an estimate or a 95 percent confidence interval around an estimate, in this case the percentage calculated is not an estimate but the true population value.


Exhibit A-5: Receipt of TA by Part B SEAs (School Year 2010-2011)

| TA topic area | No TA was sought: N | % | TA was sought but not received: N | % | TA was received and is ongoing: N | % | TA was received and is done: N | % |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Assistive technology | | | | | | | | |
| Autism | | | | | | | | |
| Behavior | | | | | | | | |
| Coordinated Early Intervening Services | | | | | | | | |
| Deaf-blind | | | | | | | | |
| Discipline | | | | | | | | |
| Disproportionality | | | | | | | | |
| (etc.) | | | | | | | | |

EXHIBIT READS: XX Part B SEAs (XX percent) reported a need for TA in assistive technology but did not seek TA in this area. XX Part B SEAs (XX percent) reported a need for TA in assistive technology and sought but did not receive TA in this area.

For No TA was sought, N = XX; for TA was sought but not received, N = XX; for TA was received and is ongoing, N = XX; for TA was received and done, N = XX.
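As a minimal sketch of how a tabulation like Exhibit A-5 could be produced (assuming, hypothetically, that item I-4 responses are available in long format with one row per SEA per topic area; the column names and response labels are invented for the example):

```python
import pandas as pd

# Hypothetical long-format extract of item I-4: one row per SEA per topic area.
data = pd.DataFrame({
    "topic_area": ["Assistive technology", "Assistive technology", "Autism", "Behavior"],
    "receipt_status": [
        "TA was received and is ongoing",
        "No TA was sought",
        "TA was sought but not received",
        "TA was received and is done",
    ],
})

# Counts (N) and row percentages (%) by topic area and receipt status.
counts = pd.crosstab(data["topic_area"], data["receipt_status"])
percentages = (pd.crosstab(data["topic_area"], data["receipt_status"], normalize="index") * 100).round(1)

# Interleave the N and % columns to mirror the Exhibit A-5 layout.
table = pd.concat({"N": counts, "%": percentages}, axis=1).swaplevel(axis=1).sort_index(axis=1)
print(table)
```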


Second, we show an example in Exhibit A-6 to illustrate how focal topic related data will be summarized.


II-1. Related to the area of Behavior, check whether your SEA needed TA on each of the following. Check “yes” if your SEA had a need, whether or not TA was received.


Exhibit A-6: State Reported Needs for Specific TA Related to Behavior (School Year 2010-2011)

| Specific need for TA | States having a need for TA in 2010-2011: N | % |
| --- | --- | --- |
| Needs assessment at the state or local level related to behavior | | |
| Support related to SPP/APR indicators related to behavior | | |
| Development or dissemination of materials on effective practices related to behavior | | |
| Training and other personnel development activities (preservice or inservice) related to behavior | | |
| State and local capacity-building to enhance service delivery and scale up effective practice related to behavior | | |
| Support related to finance systems and funding sources related to behavior | | |
| Evaluation of practices or activities related to behavior | | |
| Support related to policies and procedures related to behavior | | |
| Collaboration with other agencies, stakeholders, groups and participation in communities of practice related to behavior | | |
| Work with parents/families or parent-focused organizations related to behavior | | |

EXHIBIT READS: XX SEAs (XX percent) reported having a need for TA in the area of needs assessment at the state or local level related to behavior.


Third, an example of analysis of grantee data using straight descriptive tabulations is presented in Exhibit A-7.

II-1 “In Question 2 of the questionnaire you filled out, you named the topics of [topic1, topic2, topic3] as the ones on which you focus most. What proportion of your overall resources would you estimate are spent on each of these areas?”

Exhibit A-7: Grantee Proportion of Resources Allocated to Top Three Topic Areas of Focus


| Area of focus | Mean proportion of resources: M | SD | Number of centers: N |
| --- | --- | --- | --- |
| Assistive technology | | | |
| Autism | | | |
| Behavior, including positive behavioral support (PBS) | | | |
| Child and family outcomes | | | |
| Coordinated Early Intervening Services (CEIS) | | | |
| Deaf-blind | | | |
| (etc.) | | | |
EXHIBIT READS: TA&D Program grantees report allocating an average of XX percent of their resources in the area of Assistive Technology.


Analysis of open-ended items

In the state survey, several questions allow respondents to indicate an “Other” option. These items will first be examined for possible upcoding. Data that cannot be upcoded will be used descriptively as relevant and as space permits in the report. No open-ended data will be used that would compromise confidentiality of a state respondent.

Similar to the state survey, any open-ended responses that appear as "Other" in the TA&D Program grantee questionnaire/interview will be upcoded if possible. The grantee interview also includes five free-standing open-ended questions. Data from these questions will be coded and used for descriptive purposes.

Analysis of scale items

As noted, most of the research questions in this study are amenable to reporting through frequencies and means. We propose additional analyses to address two of the research questions that are slightly more complex:


6b. For TA products and services received from TA&D grantees, what factors are related to overall state agency satisfaction?

6c. How does state agency satisfaction with TA products and services vary by characteristics of the provider-state agency relationship?


The data for these questions come from two items in the state surveys, II-6 and II-10. To examine satisfaction with services, we will first use factor analysis to determine whether latent satisfaction constructs were captured in the survey. In addition, the individual components of satisfaction in item II-10 will be used in a linear regression analysis to predict overall satisfaction and address research question 6b. This analysis is particularly important because, to the best of our knowledge, there is no previous work that highlights which aspects of special education and early intervention technical assistance are most associated with client satisfaction. Our analysis may provide the first such data.
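A minimal sketch of these two steps is shown below, assuming (hypothetically) that the II-10 component ratings and an overall satisfaction rating are available as numeric columns; the file name, column names, and number of factors are illustrative only, not the study's actual specification.

```python
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

# Hypothetical data: one row per responding state agency within a focal topic area,
# with II-10 satisfaction component ratings and an overall satisfaction rating.
df = pd.read_csv("ii10_satisfaction.csv")  # hypothetical file name
components = ["timeliness", "relevance", "quality", "responsiveness"]  # hypothetical component names

# Step 1: factor analysis to explore whether the components reflect latent satisfaction constructs.
fa = FactorAnalysis(n_components=2, random_state=0).fit(df[components])
loadings = pd.DataFrame(fa.components_, columns=components)
print(loadings)

# Step 2 (research question 6b): regress overall satisfaction on the individual components.
X = sm.add_constant(df[components])
model = sm.OLS(df["overall_satisfaction"], X).fit()
print(model.summary())
```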


To address how state agency satisfaction with TA products and services varies by characteristics of the provider-state agency relationship, we propose to examine the distribution and pattern of products and services received by state respondents, as captured in item II-6. Based on the data, we will use an empirical approach to develop an overall categorization of center usage that incorporates both the individualization of services (represented by the rows) and the frequency of use (represented by the columns). Through this examination, we will create a threshold-of-use variable with either two or three categories of use.
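A minimal sketch of one way such a threshold-of-use variable could be constructed is shown below; the file name, column prefix, scoring, and cut points are illustrative assumptions, since the actual categories will be set empirically from the observed II-6 distribution as described above.

```python
import pandas as pd

# Hypothetical wide-format extract of item II-6: frequency-of-use ratings (0 = never ... 3 = often)
# for each type of product/service a state agency reported receiving from a given center.
ii6 = pd.read_csv("ii6_products_services.csv")                    # hypothetical file name
service_cols = [c for c in ii6.columns if c.startswith("svc_")]   # hypothetical column prefix

# Simple composite usage score: sum of frequency ratings across product/service types.
ii6["usage_score"] = ii6[service_cols].sum(axis=1)

# Illustrative threshold-of-use variable with three categories of use.
ii6["usage_category"] = pd.cut(
    ii6["usage_score"],
    bins=[-1, 2, 6, float("inf")],
    labels=["minimal use", "moderate use", "extensive use"],
)
print(ii6["usage_category"].value_counts())
```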


Depending on the outcome of the analyses focused on satisfaction, we may be able to generate a continuous satisfaction score and then present the average satisfaction with TA&D centers. We will also be able to examine the relationship between satisfaction and characteristics of the provider-recipient relationship, which are captured in item II-6. We note that analyses of data from II-6 in conjunction with the satisfaction rating will be conducted only among those respondents who surpass the minimum threshold of use, in order to avoid examining satisfaction among recipients who have had only minimal contact with a center. However, data from the low-use group can still be examined in conjunction with the survey item assessing overall satisfaction.


The project team will include a detailed description of research methods as well as results in the evaluation report.



A.17 Display of Expiration Date for OMB Approval

The Institute of Education Sciences is not requesting a waiver for the display of the OMB approval number and expiration date on the data collection instruments. All data collection instruments will display the expiration date for OMB approval.



A.18 Exceptions to Certification Statement

This submission does not require an exception to the Certificate for Paperwork Reduction Act (5 CFR 1320.9).


References

Bradley, M. C., Daley, T. C., Levin, M., O'Reilly, F., Parsad, A., Robertson, A., & Werner, A. (2011). IDEA 2004 National Assessment Implementation Study. Cambridge, MA: Abt Associates, Inc.

Schiller, E., Fritts, J., Bobronnikov, E., Fiore, T., O'Reilly, F., & St. Pierre, R. (2006). Volume I: The SLIIDEA Sourcebook Report (1999-2000, 2002-2003, 2003-2004, and 2004-2005 School Years). Cambridge, MA: Abt Associates, Inc.


