Att_IDEA-NAIS.OMB Part A.FINAL 12.08


Individuals with Disabilities Education Act (IDEA) 2004 National Assessment Implementation Study (NAIS)

OMB: 1850-0863












Supporting Statement for Paperwork Reduction Act Submission to OMB:

Part A




Individuals with Disabilities Education Act (IDEA) 2004 National Assessment Implementation Study (NAIS)






June 25, 2008



Prepared for

Lauren Angelo

Project Officer

Institute of Education Sciences

U.S. Department of Education

555 New Jersey Ave., NW

Washington, D.C. 20208



Prepared by

Abt Associates Inc.

55 Wheeler Street

Cambridge, MA 02138-1168




Contents


Appendix A: Copy of Statute

Appendix B: State Part C Coordinator Survey

Appendix C: State 619 Coordinator Survey

Appendix D: State Part B Administrator Survey

Appendix E: District Part B Administrator Survey

Appendix F: Crosswalk of Research Questions and Survey Items

Appendix G: Mathematical Proof of Why the Potential for Bias Resulting from Not Refreshing Our Sample to Include “Births” in 2004 is Small

Appendix H: Calculation of Minimum Detectable Effects for District-Level Proportions Using the IDEA National Implementation Study (IDEA NAIS) District Sample






Part A Submission

Introduction and Overview of the Study

This document presents the Supporting Statement for the Individuals with Disabilities Education Act (IDEA) National Assessment Implementation Study (NAIS). The IDEA NAIS will examine how states and school districts have implemented the 2004 Amendments to IDEA (IDEA 2004). The study focuses on four key interrelated areas: (1) identification of children for early intervention and special education; (2) early intervention service delivery systems and coordination with special education; (3) academic standards and personnel qualifications; and (4) dispute resolution and mediation.


We are requesting OMB approval for the data collection activities associated with the IDEA NAIS at this time. The total data collection consists of three state-level surveys and one district-level survey. Data are to be collected from all states and from a nationally representative sample of school districts. In addition, extant data will be retrieved to examine state policies regarding identification and eligibility of children with disabilities for special education, highly qualified staff definitions, and general demographic information on states and school districts. Part A describes the justification for these data collection activities. Part B, submitted under separate cover, describes the statistical methods for selecting the nationally representative sample of school districts and describes the techniques to be used for analyses of the collected data.


The provision of educational services to children with disabilities has a relatively short history in the context of U.S. education. Enactment of the Education for All Handicapped Children Act (EHA), P.L. 94-142, in 1975 guaranteed that eligible children and youth with disabilities would have available to them a free and appropriate public education (FAPE), a program designed to meet each child’s unique educational needs in the least restrictive environment, and protection of the rights of children and their families through procedural safeguards. Part B of the law provides funds to states to assist them in providing FAPE to both preschool-aged children (3-5) and school-aged children (6-21) with disabilities who are in need of special education and related services. To be eligible for funding under the Preschool Grants program (Part B, Section 619), a state must make FAPE available to all children with disabilities residing in the state, ages 3 through 5. Since P.L. 94-142 was first enacted, the U.S. Department of Education’s (ED) Annual Report to Congress has chronicled the status of the implementation of this landmark legislation. Drawing on these annual progress reports, as well as policy studies and Congressional testimony, Congress has reauthorized EHA eight times since its enactment in 1975. Major modifications made to EHA through these reauthorizations are described below.


Amendments to EHA in 1986 established the Early Intervention Program for Infants and Toddlers with Disabilities under Part H (now Part C). The Part C program assists states in developing and implementing a statewide, comprehensive, coordinated, multidisciplinary, interagency system to make early intervention services available to all children with disabilities from birth through age 2 and their families.


During the 1990 reauthorization of EHA, the legislation was renamed the Individuals with Disabilities Education Act (IDEA). The 1997 reauthorization of IDEA shifted legislative attention toward improving educational results for children and youths with disabilities while maintaining the emphasis on equal access to an education. Standards-based reforms provided the policy framework for this IDEA reauthorization, thus ensuring that students with disabilities could have access to the same challenging curriculum as other students and participate in assessments as a way to mark their progress toward improved results.


The 1997 amendments to IDEA also required a National Assessment “to examine how well schools, local education agencies, states and other recipients of assistance” were making progress toward a number of goals for children with disabilities and their families, in nine areas: (1) improving academic performance; (2) increasing participation in the general curriculum; (3) improving transitions from preschool to school and from school to work; (4) increasing the placement of students with disabilities in the least restrictive environment; (5) decreasing the dropout rate; (6) increasing the use of effective strategies for addressing behavioral problems; (7) improving coordination of services; (8) reducing disputes; and (9) increasing parent involvement.


To complete the requested assessment, ED’s Office of Special Education Programs (OSEP) commissioned seven studies, three of which were topic-specific studies and four of which were longitudinal child outcomes-based studies. The topic-specific studies included: (1) a study of the costs of special education; (2) a study of personnel needs in special education; and (3) the Study of State and Local Implementation and Impact of the Individuals with Disabilities Education Act (SLIIDEA), a longitudinal study of the implementation of IDEA at the state, local and school levels.


The four longitudinal child outcomes studies addressed different populations of students with disabilities, including: (1) the needs of infants and toddlers served under Part C of IDEA; (2) pre-school aged children with disabilities served under Part B, Section 619; (3) elementary- and middle-school aged students with disabilities served under Part B; and (4) high school students and youth with disabilities served under Part B.


The most recent reauthorization of IDEA, which occurred in 2004, again instructs the Department of Education to carry out a National Assessment of the law to measure: (1) progress in the implementation of IDEA 2004; and (2) the relative effectiveness of the law in achieving its purposes. A primary goal of the National Assessment of IDEA 2004 is to determine how IDEA is being implemented, with a focus on the changes resulting from these most recent Amendments and, ultimately, how the implementation of state and local programs in response to IDEA 2004 is associated with academic and developmental outcomes for children with disabilities.


The IDEA NAIS under which the proposed data collection will occur is one of four studies being supported by ED as part of the National Assessment of IDEA 2004.1 The implementation study supported by the data collection will include all states and a nationally representative sample of school districts. The study will use three state-level surveys to gather information on state policies and program implementation as they relate to (1) infants and toddlers with disabilities served under Part C of IDEA, (2) preschool children with disabilities served under Part B, Section 619 of IDEA, and (3) school-aged children and youth served under Part B of IDEA. A fourth, district-level survey will be used to gather local implementation data on children and youth served under Part B of IDEA.


The study will be conducted by Abt Associates Inc. and its subcontractors Westat and the Windwalker Corporation, under a 30-month contract with ED’s Institute of Education Sciences (IES), National Center for Education Evaluation.


A1. Circumstances Requiring the Collection of Data

Collection of information is needed for the IDEA National Assessment Implementation Study (NAIS) to assess state and school district implementation of IDEA 2004. As noted above, the IDEA NAIS is one of a number of studies being conducted for the mandated National Assessment of Progress under IDEA 2004 (see Appendix A for statutory language). The studies carried out as part of the National Assessment of IDEA 2004 will build on and extend the knowledge generated from the studies completed as part of the 1997 National Assessment and will focus on changes resulting from the 2004 Amendments for which no systematic information has yet been collected.


These changes, which addressed primarily the Part B program, continued to align IDEA requirements with reforms underway in general education. The changes are reflected in new Individualized Education Program (IEP) provisions and the establishment of criteria for determining highly qualified special education teachers. Other major changes included new dispute resolution procedures, new special education eligibility determination procedures, and increased flexibility in the use of funds to address disproportionate representation of selected demographic groups in special education.


The IDEA NAIS, a new implementation study, will gather information on state and local progress in implementing the new provisions of IDEA 2004 and will provide a comprehensive, representative national picture of the implementation of early intervention and special education policies and practices at the state and district levels. This information will be useful to Congress in considering the success of the IDEA 2004 changes in meeting desired goals, will provide information for future reauthorizations of the Act, and will be useful to ED, researchers and practitioners concerned with implementing policies and practices that will ultimately lead to improved outcomes for students with disabilities.


A2. Purposes and Uses of the Data

The data collected for the IDEA NAIS will be used by the U.S. Department of Education to report to Congress on the implementation of the 2004 Amendments to IDEA by states and districts. Failure to collect implementation data may leave the Department unable to adequately report to Congress on the status of implementation of the 2004 Amendments. The information can also be used to provide context for impact studies of the 2004 Amendments and to guide future projects on the implementation and impact of IDEA. Additionally, if this study were not completed, ED and Congress would not have an accurate understanding of state and local special education policy and practice. The information from the implementation study will assist Congress in the reauthorization of IDEA and in further improving early intervention and special education services, with the ultimate goal of improving outcomes for children with disabilities.






The data proposed to be collected for the study are needed to address the 12 primary research questions, listed below, organized within the four focal areas for the study:


I. Identification of Children for Early Intervention and Special Education

  1. How do rates of identification of children for early intervention or special education vary according to the different disability definitions used by states?

  2. How does each state determine, under IDEA 2004, significant disproportionalities by race and ethnicity in the identification of students for special education for the districts within the state, and to what extent have districts been so identified?

  3. Which early intervening and/or Response to Intervention (RtI) strategies do districts use at various grade levels prior to the identification of children for special education?

  4. How do rates of identification for special education vary according to the use of different early intervening or RtI strategies? How do rates of identification for special education vary according to whether districts are required to adopt Early Intervening Services because of significant disproportionalities?


II. Early Intervention Service Delivery and Coordination

  1. Across states, what are the different models of service delivery for the Part C program supported through IDEA, and of coordination among the various early intervention and special education services provided through Part C and Part B?


III. Academic Standards and Personnel Qualifications

  1. Do state or district policies explicitly reference state academic standards, and do these policies require that goals and objectives on IEPs reference those standards?

  2. What are states and districts doing in terms of certification, professional development, and compensation for special education teachers to promote compliance with the “highly qualified teacher” provisions of IDEA and No Child Left Behind?

  3. How do the efforts of states and districts to provide a sufficient number of qualified early intervention and special education personnel for children with disabilities vary with state definitions of “highly qualified teacher” under NCLB?


IV. Dispute Resolution and Mediation

  1. Over time, have there been changes in the incidence of disputes between early intervention personnel and parents/guardians on issues related to identification for early intervention?

  2. How does the incidence of disputes between early intervention personnel and parents/guardians regarding identification for early intervention vary with the use of mediation by states and districts?

  3. Over time, have there been changes in the incidence of disputes between special education personnel and parents/guardians on issues related to special education services?

  4. How does the incidence of disputes between special education personnel and parents/guardians regarding special education services vary with the use of mediation by states and districts?


The IDEA NAIS is a descriptive study based primarily on four surveys that will provide a comprehensive picture of state and local implementation of IDEA 2004 across the 0-21 age range. Three state-level surveys will be fielded to collect data from: (1) state special education administrators responsible for programs providing special education services to school-aged children with disabilities (6-21); (2) state 619 coordinators who oversee preschool programs for children with disabilities aged 3-5; and (3) state IDEA Part C coordinators who are responsible for early intervention programs serving infants and toddlers. A fourth survey will be fielded at the district level to collect data from local special education administrators about preschool and school-age programs for children with disabilities aged 3-21. While the state-level surveys will provide information on policies, guidelines, and supports/resources, the district survey focuses more on the strategies and procedures used to implement state policies.


Study Design

To assess the implementation of the 2004 Amendments to IDEA, the study plans to survey state administrators of IDEA Part C and Part B programs (including both preschool and school age programs) and a nationally representative sample of district Part B administrators. In addition, extant data gathered from state websites and taken from pre-existing databases will be utilized to eliminate duplicative efforts and enable longitudinal comparisons.


Data Collection Plan

This forms clearance submission covers the new data collection activities to take place in the IDEA NAIS: (1) a paper survey to be completed by State Part C Coordinators; (2) a paper survey to be completed by State 619 Coordinators; (3) a paper survey to be completed by State Part B Administrators; and (4) a web-based survey to be completed by a nationally representative sample of local education agency (LEA or district) Part B Administrators. These surveys will be conducted once, between September and December 2008.


Surveys will be sent to three state-level administrators, each of whom is responsible for a particular age group of children and students receiving services through IDEA. The State Part B Administrator survey will be completed by the individual in the state who is responsible for, and thus knowledgeable of, services to students between the ages of 6 and 21. The individual responsible for, and knowledgeable of, services to children between the ages of 3 and 5 will complete the State 619 Coordinator survey. The State Part C Coordinator survey will be completed by the individual in charge of services to children from birth to age 3. This group of three surveys is referred to as the “state surveys.”


The district-level survey will be sent to the district Part B administrator, typically the special education director, who is responsible for services to children between the ages of 6 and 21 in the district. This survey is referred to as the “district survey.” Following is a brief description of the state- and district-level surveys. Copies of the surveys can be found in Appendices B–E.


State Surveys

As noted, each state survey will ask the respondent questions about the particular IDEA funded program for which he/she is responsible. Information on policies, procedures, strategies and supports will be gathered for a variety of topics that are common to each of the state surveys, including:


  • Identification, eligibility and referral;

  • Early intervening services and response to intervention;

  • Personnel standards, and recruitment and retention issues;

  • State academic and developmental standards and their use in IEPs and IFSPs;

  • Parent/family involvement; and,

  • Frequency of disputes and uses of mediation and other dispute resolution strategies.


The state survey for Part C coordinators also asks about coordination with Part B preschool programs, program organization and structure, and funding and finance issues.


District Survey

A web-based questionnaire will be used to administer the district Part B administrator survey to a nationally representative sample of 1,200 district Part B administrators. The district survey will collect information on a series of topics that parallel those for the state surveys, including:


  • Identification, eligibility and referral;

  • Early intervening and response to intervention;

  • Personnel standards, and recruitment and retention issues;

  • State academic and developmental standards and their use in IEPs and IFSPs;

  • Parent/family involvement; and,

  • Frequency of disputes and uses of mediation and other dispute resolution strategies.


Extant Data

In addition to the survey collection for which forms clearance is requested, extant data will be used to take advantage of information that is already available and to enable longitudinal comparisons. Extant data retrieval will be conducted between January 2008 and December 2008.


A review of state websites will be conducted to provide information on State policies regarding: (1) disability and eligibility definitions for early intervention and special education; and (2) highly qualified teachers and other personnel providing early intervention and special education services.


Extant databases will be used to provide information to address the research areas. Publicly accessible databases, including National Early Childhood Technical Assistance Center (NECTAC) Notes, Child Count Files from Ideadata.org, the Part B Annual Performance Review, and the National Association of State Directors of Special Education (NASDSE) Summary of Part B, will be used to provide information on identification. The Part C Update, NECTAC Summary, Center to Inform Personnel Preparation Policy and Practice in Early Identification and Preschool Education Data Report, and the Part B and C Annual Performance Reviews will be used to provide information on service delivery and coordination. Part B Annual Performance Reviews, National Center on Educational Outcomes (NCEO) Participation/Accommodation Data and AYP data, the Education Commission of the States Child Outcomes Summary Form (2006), the Schools and Staffing Survey (SASS), and the Integrated Postsecondary Education Data System (IPEDS) will be used to provide information on academic standards and personnel qualifications. Additionally, information from the Department’s EDFacts will be used to provide contextual information. Along with public use and EDFacts data, data from the Study of State and Local Implementation and Impact of the Individuals with Disabilities Education Act (SLIIDEA)2 will be used to explore whether there have been any changes over time in the frequency of, reasons for, or mechanisms used to resolve disputes. SLIIDEA was completed as part of the National Assessment of the 1997 Amendments to IDEA.

Appendix F contains a crosswalk that shows which survey items and extant data sources will be utilized to address the 12 research questions guiding the IDEA NAIS.


A3. Use of Information Technology to Reduce Burden

The data collection plan for the IDEA NAIS reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Where feasible, information will be gathered from existing sources, such as state education agency websites and data warehouses, including EDFacts. The data collection plan utilizes both paper and web-based surveys for primary data collection.

Web-based surveys are gaining in popularity as increasing numbers of respondents have e-mail and/or internet access. Advantages associated with web-based surveys include potentially substantial cost savings, as no duplication or mailing costs are incurred. Advances in the development and administration of web-based surveys enable complex skip patterns to be invisible to respondents. Because respondents enter data while completing the survey, additional data entry costs are not incurred and data quality is improved. The nature of data entry in web-based surveys also reduces processing costs and increases data collection speed. Disadvantages associated with web-based surveys include possible technical issues associated with the design of the survey or connection to the internet, and differential use of and familiarity with computer technology across groups of respondents.


The IDEA NAIS study team has extensive experience with both paper and web-based surveys. Both methods were selected for the IDEA NAIS (i.e., paper for the state surveys, web for the district survey) as the most efficient and appropriate for the specific respondent groups. The use of different survey methods is driven primarily by sample size. Each state survey will be administered to 51 respondents; the district survey will be administered to 1,200 respondents. Additionally, the district survey includes many more complex skip patterns than the state surveys. As noted earlier, programming the skip patterns into the web-based survey will increase the chances that individuals respond only to those questions that are pertinent to their district. Thus, a web-based survey was selected as the method most likely to result in accurate study data for the district survey. Programming costs associated with a web-based approach did not seem warranted for the state surveys, which have only 51 respondents each.


A4. Efforts to Avoid Duplication

There are two areas where duplication could arise in this type of study. First, the research questions may have been addressed in other studies. Second, the study may collect data that are already available through other sources. To address the first area of potential duplication, we reviewed previous and ongoing studies of IDEA implementation to determine whether they could yield data sufficient for addressing the research questions for the IDEA NAIS. While a number of studies examined similar issues as part of the 1997 National Assessment (e.g., the Pre-Elementary Education Longitudinal Study, the Study of Personnel Needs in Special Education), none can provide data on implementation issues reflecting IDEA 2004. Moreover, the 2004 Amendments to IDEA changed service provision sufficiently to render many findings of previous studies inapplicable to current regulations. Additionally, the 2004 Amendments added new provisions that have not yet been studied. Thus, we are confident that currently available data on IDEA implementation are not sufficient to address the research questions.


We addressed the second potential area of duplication through careful review of existing publicly accessible databases that provide national estimates related to special education programs. This resulted in identification of data elements which can be used in conjunction with survey items to answer the research questions. For example, we will be using data compiled by the Education Commission of the States to examine state definitions of highly qualified personnel for special education teachers. We will also use existing demographic data available from databases maintained by the National Center for Education Statistics, from the website www.ideadata.org and from EDFacts, to gather state and district background and contextual information.


A5. Economic Impact on Small Businesses or Entities

No small businesses will be impacted by this project. The primary entities for this study are school districts and state agencies. Burden is minimized for all respondents by requesting only the minimum information required to achieve the study objectives. Study contractors will carefully specify information needs; questions to districts and states will be restricted to generally available information maintained in district and state administrative records.


A6. Consequences of Not Collecting the Information

Failure to collect the information proposed in this request will result in non-compliance with IDEA 2004, as the study was mandated by Congress in Section 664 of the law (see Appendix A). Furthermore, not collecting the data will prevent the U.S. Department of Education from assessing the progress of states and school districts in implementing the 2004 Amendments to IDEA, reporting to Congress on this progress, and making improvements to policies and services for children with disabilities. Additionally, there would be a lack of comprehensive and representative information on early intervention and special education state policies and implementation available to researchers and practitioners.


A7. Special Circumstances

There are no special circumstances required for the collection of this information.


A8. Consultation Outside the Department of Education

Public comments were received and ED responded; these comments and responses are attached. A team of researchers and analysts at Abt Associates and Westat, led by Dr. Fran O’Reilly, developed the survey instruments and data collection procedures. In addition, the Abt team utilized three consultants: Dr. Beth Rous (University of Kentucky), Dr. Margaret McLaughlin (University of Maryland), and Ms. Sharon Walsh (Walsh Taylor Inc.).


The study employs a technical work group (TWG), consisting of eight recognized experts on special education services to children from birth to three years (Part C services), children between three and five years of age (Part B, Section 619 services), and children between six and 21 years of age (Part B services). The following individuals serve on the TWG:


Ms. Diana Allen, Cascade Regional Services

Dr. Mary Brownell, University of Florida

Dr. Mary-Beth Bruder, University of Connecticut Health Center

Dr. Pia Durkin, Attleboro School District

Dr. Mary-Beth Fafard, Brown University

Dr. Douglas Fuchs, Vanderbilt University

Dr. Jane Rhyne, Charlotte-Mecklenburg Public Schools

Dr. Patricia Snyder, University of Florida


A9. Payment or Gifts to Respondents

No payment or gifts to respondents will be provided.


A10. Assurances of Data Privacy

None of the information collected will be reported or published in a manner that would identify individual respondents.


Abt Associates and its subcontractors follow the confidentiality and data protection requirements of IES (the Education Sciences Reform Act of 2002, Title I, Part E, Section 183), which requires that all collection, maintenance, use and wide dissemination of data conform to the requirements of the Privacy Act of 1974 (5 U.S.C. 552a), the Family Educational Rights and Privacy Act of 1974 (20 U.S.C. 1232g), and the Protection of Pupil Rights Amendment (20 U.S.C. 1232h). The Institutional Review Boards at Abt Associates and Westat have certified that all members of the study team with access to the data have received training in the importance of confidentiality and data security.


Abt Associates and its subcontractors will protect the confidentiality of all information collected for the study and will use it for research purposes only. The privacy procedures adopted for this study for all data collection, data processing, and analysis activities include the following:


  • All study respondents will be assured that strict rules will be followed to protect their privacy. The report prepared for this study will summarize district findings across the sample and will not associate responses with a specific district or individual. We will not provide information that identifies districts or individuals to anyone outside the study team, except as required by law. However, policies and resources/supports may be reported by state.

  • Paper surveys will be sent to respondents and returned to Abt Associates via Federal Express. Using Federal Express ensures both rapid transmission of the material and allows for the tracking of surveys should problems arise.

  • To ensure data security, all individuals hired by Abt Associates Inc. are required to adhere to strict standards and sign an oath of confidentiality as a condition of employment. Abt’s subcontractors will be held to the same standards.

  • Hard-copy data collection forms will be delivered to a locked area for receipt and processing. Abt Associates and Westat maintain restricted access to all data preparation areas (i.e., receipt, coding, and data entry). All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only.

  • Individual identifying information will be maintained separately from completed data collection forms and from computerized data files used for analysis. No respondent identifiers will be contained in public use files made available from the study, and no data will be released in a form that identifies individual respondents, districts or states.

  • All study staff involved in data collection or analysis will go through security clearance as mandated by the Department of Education.


  • Identifiable data will be kept for three years after the project end date and then destroyed. Written records will be shredded and electronic records will be purged.


A11. Questions of a Sensitive Nature

None of the four surveys include any questions of a sensitive nature.


A12. Estimate of Respondent Burden

Exhibit 1 presents our estimates of response burden for the data collection activities requested for approval in this submission: three state surveys (Part B, Part C and 619) and one district survey (Part B). Each data collection activity will be administered once. The time required to complete each survey is based on cognitive interviews of the instruments conducted with approximately six retired or former respondents for each survey.


Assuming all 1,353 respondents complete the survey, the total reporting burden associated with this data collection is 2,553 burden hours (see Exhibit 1). The study will be carried out over roughly two and a half years (30 months), from September 2007 to March 2010, with an annual burden of 1,021 hours for 541 respondents.
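The burden totals above are simple arithmetic; the following sketch reproduces the Exhibit 1 figures and the annualized values from the counts stated in the text:

```python
# Check of the burden arithmetic: three state surveys of 51 respondents at
# 1 hour each, plus 1,200 district surveys at 2 hours each, annualized over
# the 2.5-year study period.
state_hours = 3 * 51 * 1          # Part B, Part C, and 619 state surveys
district_hours = 1200 * 2         # District Part B Administrator Survey
total_hours = state_hours + district_hours
total_respondents = 3 * 51 + 1200

print(total_hours, total_respondents)                            # -> 2553 1353
print(round(total_hours / 2.5), round(total_respondents / 2.5))  # -> 1021 541
```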


Exhibit 1: Respondent Burden Estimates

Informant/Instrument                      Number of      Mean Time per       Total Respondent
                                          Responsesa     Response (Hours)    Time (Hours)
State Part B Coordinator Survey               51                1                   51
State Part C Coordinator Survey               51                1                   51
State 619 Coordinator Survey                  51                1                   51
District Part B Administrator Survey       1,200                2                2,400
Total                                      1,353                                 2,553



a Totals reflect the 50 states plus the District of Columbia for each of the three state surveys, and a random sample of 1,200 school districts for the district-level survey.




A13. Estimate of Total Capital and Start-up Costs and Maintenance Costs to Respondents or Record-Keepers

There are no annualized capital/start-up or ongoing operation and maintenance costs involved in collecting the information.


A14. Estimate of Costs to the Federal Government

The estimated cost to the federal government to carry out the IDEA NAIS is $2,271,022. As the study will be carried out over roughly two and a half years (30 months), the annual cost of the data collection described in this Request for OMB Approval of Data Collection Instruments, and of the analysis of these data, is $908,409.


A15. Changes in Burden

There is a program change of 1,021 annual burden hours, as this is a new collection.


A16. Plans for Analysis, Tabulation, Publication, and Schedule

Analysis and Tabulation

This section describes our approach to empirical analyses of data for the IDEA NAIS. The data being collected through the surveys and extant sources, as discussed earlier, will primarily provide descriptive information on the processes and strategies in place at the state and district levels regarding the implementation of IDEA 2004 in the four broad areas targeted for this study: (1) identification of children for early intervention and special education; (2) Part C service delivery systems and coordination with the Part B program; (3) academic standards and personnel qualifications; and (4) dispute resolution and mediation.


Most of the research questions can be addressed with simple descriptive statistics such as means and percentages, along with cross-tabulations that illustrate the distribution of policies and procedures across states and districts with varying characteristics. We will also have the opportunity to examine several outcomes (identification rates and incidence of disputes) and to contrast outcomes across populations and time frames. For example, some of the research questions require that we examine relationships between various inputs (e.g., disability definitions, Early Intervening Services (EIS)) and variation in these outcomes. Our proposed analytic methods and the methodological issues associated with the required analyses are described below. The discussion focuses on methodological issues as they relate to:


  • type of inference (simple descriptive, change over time, differences among groups);

  • unit of analysis (state, district);

  • time frame covered by analysis (single time point, longitudinal); and

  • missing data (survey non-response, item non-response).


In the course of conducting the analyses to answer the research questions, practically every combination of these four topics will come into play for both the state- and district-level data. As will be evident from the examples, the analytic methods described below cut across all 12 research questions. We discuss in turn our approach to each type of analysis.


Descriptive Analyses: Single Time Point

State-level Data

The state-level surveys (Part B, Part C and 619) will be administered to all 50 states and the District of Columbia. No survey non-response is anticipated for two reasons. First, our experience with collecting data from state agency respondents demonstrates that it is reasonable to expect there will be no survey non-response. In the SLIIDEA study, we received 100 percent state response rates for all waves of data collection (Schiller et al., 2006, p. 11). Westat has had similarly successful response rates in their data collection efforts (e.g., Markovitz et al., 2006, p. 15). Both Abt Associates and Westat staff have long-standing working relationships with state administrators that facilitate data collection efforts. Second, as IDEA is a Federal grant program, states are technically required to participate in studies such as this one in order to continue receiving funds (Education Department General Administrative Regulations (Sec. 75.591, 20 U.S.C. 1221e-3 and 3474)). A letter from ED reminding respondents of this obligation will also facilitate the expected high response rate. Therefore, we expect the questionnaire responses will represent a census of the states.


State-level analyses will generally be presented in the form “the percent of states that....” Consider, for example, the survey item we intend to include on the State Part B Administrator survey:


  1. What best describes the status of your state’s progress in defining significant disproportionality? Select one.

     a. Our state’s definition of significant disproportionality for 2008-09 is finalized and no changes are anticipated

     b. Our state’s definition of significant disproportionality for 2008-09 is finalized but we are planning modifications or revisions in the coming year

     c. Our state’s definition of significant disproportionality for 2008-09 is in the process of being developed


In the case of no missing data, the calculation of the percent of states reporting that their state’s definition of disproportionality for the 2008-09 school year is finalized with no changes anticipated (response option a) is simply 100 multiplied by the number of states that selected response option a, divided by 51. We represent this calculation algebraically as:

P = 100 × (Σ Xi) / 51,

where Xi = 1 if state i’s response is a, Xi = 0 if it is not, and Σ represents summation over the 51 responses (50 states and the District of Columbia).
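To make the calculation concrete, here is a minimal sketch; the 30/15/6 split of responses is hypothetical:

```python
# Census percentage for the state surveys (50 states + DC = 51 responses).
# The response list below is hypothetical, for illustration only.
responses = ["a"] * 30 + ["b"] * 15 + ["c"] * 6
x = [1 if r == "a" else 0 for r in responses]   # X = 1 if response option a
percent_a = 100 * sum(x) / 51                   # true population value, not an estimate
print(round(percent_a, 1))  # -> 58.8
```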


Since the data are a census rather than a sample, there is no need to calculate standard errors or confidence intervals, as these statistical concepts apply to sample data. While it is common to present the standard error of an estimate or a 95% confidence interval around it, in this case the percent calculated is not an estimate but the true population value. An illustrative table for this type of state-level analysis is shown in Exhibit 2.


Exhibit 2: Percent and Number of States Reporting Status of Definition for Significant Disproportionality (2008-2009 School Year)

As of 2008-09, our state’s definition of                                      Yes            No
significant disproportionality is:                                          N     %       N     %     Missing    Totala

Finalized and no changes are anticipated
Finalized but modifications or revisions are planned for the coming year
In the process of being developed

Source: State Part B Administrator Questionnaire – Item 1
a Total refers to the number of states that answered the question.


Although we do not anticipate survey non-response, we will treat item non-response and survey non-response in the same manner. First, we will indicate in our tables the number of missing responses (see Exhibit 2). Additionally, we will decrease the denominator by the number of missing responses, so that the summation runs over only the states that responded to the item.
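A short sketch of this missing-data convention, using hypothetical counts:

```python
# Hypothetical: 2 of the 51 states skip an item. Percentages are computed
# over the 49 responding states; the missing count is reported separately.
n_total, n_missing, n_option_a = 51, 2, 28
percent_a = 100 * n_option_a / (n_total - n_missing)
print(n_missing, round(percent_a, 1))  # -> 2 57.1
```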


District-level Data

Unlike the state-level data, where we have a census of respondents, we are collecting data from a sample of school districts. That is, the District Part B Administrator Survey will be administered to a nationally representative sample of school districts that includes a planned overlap sample with SLIIDEA to support longitudinal analyses. By “overlapping samples” we mean that a proportion of the districts in the IDEA NAIS district sample were also in the 2004-2005 SLIIDEA sample, although each of the two study samples also contains districts not in the other. The analyses described above for states are not fully applicable to the district-level data because we will be using estimates to represent all school districts in the nation; therefore, different analytic techniques are required. As IDEA is a Federal grant program, districts are technically required to participate in studies such as this one in order to continue receiving funds (Education Department General Administrative Regulations (Sec. 75.591, 20 U.S.C. 1221e-3 and 3474)).


In reporting our results for school districts, we will often make statements that begin with, “the percent of districts that ....” We will design our analyses such that “percent of districts” means the percent of all school districts in the country, not just the school districts that happen to be in the sample. In order to calculate statistics that are nationally representative, the sampling design must be taken into account. We provide the calculation algorithm below. Note that if the survey item is dichotomous (0/1), the process described below to estimate a mean actually estimates a proportion; multiplying the proportion by 100 gives a percentage.


Let:

y_hi be the response on a survey item for district i in stratum h,

w_hi be the sampling weight for district i in stratum h,

p̂ be the estimator of the population percentage,

ȳ be the estimator of the population mean,

Ŷ be the estimator of the population total,

M̂ be the estimator of the number of elements (districts) in the population,

h = 1, ..., L enumerate the strata (for the current design, L = 12), and

i = 1, ..., n_h enumerate the sampled districts in stratum h, where the districts are the primary sampling units (PSUs) and n_h is the number of sampled districts in stratum h.


Then:

ȳ = (Σ_h Σ_i w_hi · y_hi) / (Σ_h Σ_i w_hi),

Ŷ = Σ_h Σ_i w_hi · y_hi,

M̂ = Σ_h Σ_i w_hi,

and, for a dichotomous (0/1) item, p̂ = 100 · ȳ.
The estimator given above for ȳ is known as a combined ratio estimator. We note that the sum of the sample weights, M̂ = Σ_h Σ_i w_hi, is an estimate of the number of school districts in the population. When we know the true population value of M, as we do in the current example where we know the number of districts in each stratum, we also have the option of using a separate ratio estimator. The separate ratio estimator is defined as:


ȳ_s = (Σ_h M_h · ȳ_h) / (Σ_h M_h), with ȳ_h = (Σ_i w_hi · y_hi) / (Σ_i w_hi),

where M_h is the known number of school districts in stratum h, Σ_h M_h = M is the sum of the known stratum sizes, and L is the number of strata.


The separate ratio estimator forms the ratio within each stratum and then combines these stratum estimates into a weighted average, while the combined ratio estimator pools over all strata before forming the ratio. When there is substantial variation among stratum means, the separate ratio estimator is frequently chosen because it exploits the extra efficiency provided by the stratification and yields a more precise estimate. We will choose between the two estimators based on the amount of variation among the stratum-by-stratum ratio estimates: if they are nearly equal, we will use the combined ratio estimator; if they differ markedly, we will use the separate ratio estimator.
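The two estimators can be illustrated with a small sketch; the weights, responses, and stratum sizes below are hypothetical (the actual design has L = 12 strata):

```python
# Combined vs. separate ratio estimators for a 0/1 survey item, using three
# hypothetical strata. w[h][i]: sampling weights; y[h][i]: item responses;
# M[h]: known number of districts in stratum h.
w = [[10.0, 8.0], [5.0, 4.0], [2.0, 2.0]]
y = [[1, 0], [1, 1], [0, 0]]
M = [20, 10, 4]
L = len(w)

# Combined: pool weighted sums over all strata, then form the ratio.
num = sum(w[h][i] * y[h][i] for h in range(L) for i in range(len(w[h])))
den = sum(w[h][i] for h in range(L) for i in range(len(w[h])))
p_combined = num / den

# Separate: form the ratio within each stratum, then average with known M_h.
ybar_h = [
    sum(wh[i] * yh[i] for i in range(len(wh))) / sum(wh)
    for wh, yh in zip(w, y)
]
p_separate = sum(M[h] * ybar_h[h] for h in range(L)) / sum(M)

print(round(p_combined, 4), round(p_separate, 4))  # -> 0.6129 0.6209
```

The two estimates differ here because the weight sums within strata do not match the known stratum sizes M_h; the separate estimator corrects each stratum to its known size.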


With a sample of 1,200 LEAs and an expected 80 percent response rate (previously achieved in Schiller et al., 2006, p. 11), the expected precision of the estimates is plus or minus 3.9 percentage points, assuming a design effect of 1.6 due to the unequal sampling weights. A letter from ED reminding respondents of their obligation to participate will also facilitate the expected high response rate.
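The stated half-width can be reproduced with a back-of-envelope calculation; the z = 1.96 multiplier and worst-case p = 0.5 are our assumptions:

```python
import math

# Precision check: 1,200 sampled LEAs, 80% response, design effect 1.6,
# worst-case p = 0.5. Assumed z = 1.96 for a 95% interval.
n = 1200 * 0.80                          # expected completes
se_srs = math.sqrt(0.5 * 0.5 / n)        # worst-case SRS standard error
half_width = 1.96 * se_srs * math.sqrt(1.6)
print(round(100 * half_width, 1))        # -> 4.0
```

The result, about 4.0 percentage points, is consistent with the reported plus or minus 3.9 up to rounding in intermediate steps.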


Statistical Software for Calculating Parameter Estimates and Standard Errors. The estimator of the population mean, ȳ, shown above, can be calculated in statistical software packages designed for the analysis of complex survey data, which produce both the point estimates and their variances. We will use the variance estimates to produce standard errors and 95% confidence intervals around the estimates of the population means for the district-level data.


Exhibit 3 provides an example of this type of analysis at the district level.


Exhibit 3: Percent of Districts That Are Required to Provide EIS Due to Significant Disproportionality (2008-2009 School Year)

                               Yes            No
                             % (SE)         % (SE)       Missing N     Total Na

Required to provide EIS

Source: District Part B Questionnaire – Item 1
a Total refers to the number of districts that answered the question.
Note: Percentages and standard errors are population estimates calculated from weighted data. The numbers shown for missing and total Ns were calculated from unweighted data.


Descriptive Analyses: Change Over Time Analyses

By “change over time analyses,” we mean simple descriptions of change in a variable over time. This is distinct from a model where we try to assess the relationship between some predictor variable(s) and the change in the outcome variable over time. For the IDEA NAIS, we will be able to examine change over time at both the state and district levels where we have repeated items from SLIIDEA.


State-level Data

State-level change over time analyses will utilize extant SLIIDEA data, and data from a new survey of state Part B administrators. Since the state data at each time point represent a census rather than a sample of states, the summary tables will not include standard errors, confidence intervals or p-values. Any differences observed between time points will represent a difference in the true mean. There will be no sampling uncertainty associated with any of the numbers in the table. Exhibit 4 provides an example of this type of analysis at the state level.


Exhibit 4: State Completed Formal Mediations During the Last School Year

                     2002-2003            2003-2004            2004-2005            2007-2008           Difference
                     School Year          School Year          School Year          School Year         (2007-2008 –
                     (SLIIDEA)            (SLIIDEA)            (SLIIDEA)            (IDEA NAIS)         2004-2005)
                     Yes%  YesN  TotalN   Yes%  YesN  TotalN   Yes%  YesNa TotalN   Yes%  YesN  TotalN  Percentage Points
Formal mediations    100.0   50    50      98.0   49    50      96.0   48    50

a N includes one state that did not provide the number of formal mediations completed, but whose follow-up responses implied at least one such mediation was completed.
Sources: SLIIDEA Wave 2 State Questionnaire – Item 16; SLIIDEA Wave 3 State Questionnaire – Item 18; SLIIDEA Wave 4 State Questionnaire – Item 17; IDEA NAIS State Part B – Item 42


District-level Data

District-level change over time analyses will utilize extant SLIIDEA data and new data from the IDEA NAIS of a nationally representative sample of 1200 districts. An example of a change over time analysis at the district level is given in Exhibit 5. It shows the percent of districts that conducted or participated in various types of dispute resolution procedures as estimated from three separate school years. Data from the first two years (columns labeled 2002-2003 and 2004-2005, respectively) will come from the extant SLIIDEA data set and will be transcribed from previous SLIIDEA reports. Data for the third year (column labeled 2007-2008) will be obtained from administration of the IDEA NAIS survey of district Part B administrators to the nationally representative sample of 1200 districts and will be calculated as described earlier. Examination of this table will provide descriptive data on whether the percentages of districts that conducted or participated in various dispute resolution procedures fluctuated over time. The example table shows means and standard errors. The final column of the table will show the difference between the means for the 2007-2008 school year and the 2004-2005 school year and the standard error of the difference. A mean difference that is greater in absolute value than two times the standard error of the difference is a statistically significant difference at the .05 level.


Exhibit 5: District Conducted or Participated in Dispute Resolution Procedures

                                                   % (SE) of Districts That Participated in Procedures
Type of Procedure                                  2002-2003      2004-2005      2007-2008      Difference
                                                   School Year    School Year    School Year    (2007-2008 –
                                                   (SLIIDEA)      (SLIIDEA)      (IDEA NAIS)    2004-2005)
Informal dispute resolution procedure              25.4 (2.4)     25.6 (2.3)
Mediation following a due process request          11.9 (1.5)     12.3 (1.5)
Impartial due process hearing                      10.5 (1.3)     10.4 (1.3)
State administrative review of hearing decision     3.7 (1.2)      3.2 (0.9)
State/Federal court review of hearing decision      1.8 (0.5)      1.3 (0.3)
Litigation                                          3.3 (0.7)      3.6 (0.7)

Sources: SLIIDEA Wave 1 District Questionnaire – Items K1a-K1d, K2; SLIIDEA Wave 2 District Questionnaire – Items 25, 26a-26f; SLIIDEA Wave 4 District Questionnaire – Items 27, 28a-28f; IDEA NAIS District Part B – Items 61, 62a-62g


The analytically challenging task for this type of analysis is the calculation of the standard error of the difference (i.e., the last column in Exhibit 5). The challenges arise from the need to account for the fact that the 2007-2008 IDEA NAIS survey and the 2004-2005 SLIIDEA survey are based on overlapping samples, and that the estimates at each time point are derived from different samples with different sampling weights and potentially different design effects. The method proposed below for calculating the standard error of the difference in overlapping samples is adapted from Kish (1965).


We are interested in estimating the difference between two population proportions from two points in time. In addition to reporting the estimate, the precision of the estimate needs to be reported by computing the standard error of the estimated difference. The simple case has two independent samples at the two time points, thus, the variance of the difference between the two sample proportions is simply the sum of the variance of the first proportion and the variance of the second proportion.


The IDEA NAIS study, however, is not an example of the simple case. Rather, there is a planned overlap of some school districts between the two samples.  Therefore, the variance of the difference is no longer a sum of the variances; rather, it also includes a covariance. The variance of the difference is smaller than if there were two independent samples. The degree to which the variance in the estimate for the overlapping samples is reduced depends on the amount of overlap and the correlation coefficient between the estimates at two time periods. Kish (1965) provides a formula that computes the variance of the difference taking into account the amount of overlap and the correlation between two time periods as described below.

Let p̂1 denote the estimated proportion from the 2007-2008 sample, of size n1. Let p̂2 denote the estimated proportion from the 2004-2005 sample, of size n2. Let n3 denote the number of districts in both samples (the overlap). We are interested in obtaining the standard error of the difference between the two sample proportions. We can write the estimated variance of the difference between the two sample proportions as:

v(p̂1 - p̂2) = v(p̂1) + v(p̂2) - 2·cov(p̂1, p̂2),

where v(p̂1) is the estimated variance of the 2007-2008 proportion based on a sample of n1 units and v(p̂2) is the estimated variance of the 2004-2005 proportion based on n2 units.


Under simple random sampling, the estimated variance of the difference in the two sample proportions becomes

v(p̂1 - p̂2) = p̂1(1 - p̂1)/n1 + p̂2(1 - p̂2)/n2 - 2·(n3/(n1·n2))·(p̂12 - p̂1·p̂2),

where p̂12 is the proportion having the attribute in both samples, based on the n3 overlapping units. The district-level data from the two samples must be merged to calculate p̂12.
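With hypothetical values for the proportions, sample sizes, and joint proportion, the formula can be evaluated directly:

```python
# Overlap-adjusted variance of a difference in proportions under SRS.
# All values below are hypothetical; p12 is the joint proportion over the
# n3 overlapping units.
p1, n1 = 0.14, 960     # 2007-2008 estimate and sample size
p2, n2 = 0.12, 900     # 2004-2005 estimate and sample size
p12, n3 = 0.05, 400    # joint proportion and overlap size

cov = (n3 / (n1 * n2)) * (p12 - p1 * p2)
var_diff = p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2 - 2 * cov
print(var_diff)
```

Because the covariance term is positive here, var_diff is smaller than it would be for two independent samples, as the text notes.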


For estimating the variance under the complex design used in SLIIDEA and proposed for the current study, we can first estimate the variance under simple random sampling using the formula given above but with weighted proportions. Then, we multiply the variance by the design effect, or the average of the design effects from the two studies.


To implement this method, we will obtain the variances under the complex design for the two samples using SUDAAN as the values for v(p̂1) and v(p̂2), and estimate the covariance as cov(p̂1, p̂2) = (n3/√(n1·n2))·r·√(v(p̂1)·v(p̂2)), where the correlation r is calculated between the 2007-2008 and 2004-2005 measurements for the districts that were measured at both time points. Merging the district-level data is required to calculate this quantity.


The square root of the variance gives the standard error of the difference in the two proportions recognizing that we have overlapping samples and thus that the samples are not independent. The standard error can be reported in a table as shown above in Exhibit 5, or can be used to create 95% confidence intervals for the difference, or used in a statistical test of the null hypothesis of equivalent proportions in the two groups.
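The full procedure can be sketched end to end. The design-based variances below stand in for SUDAAN output, the overlap-adjusted covariance uses the Kish-style form assumed above, and all numbers are hypothetical:

```python
import math

# Design-based variances for the two survey estimates (would come from
# SUDAAN in practice) plus the overlap information; all values hypothetical.
p1, n1, v1 = 0.14, 960, 0.00015   # 2007-2008
p2, n2, v2 = 0.12, 900, 0.00014   # 2004-2005
n3, rho = 400, 0.35               # overlap size; correlation over the overlap

# Kish-style covariance adjustment for partial overlap (assumed form).
cov = (n3 / math.sqrt(n1 * n2)) * rho * math.sqrt(v1 * v2)
var_diff = v1 + v2 - 2 * cov
se_diff = math.sqrt(var_diff)

# The two-standard-error screen used for Exhibit 5 (approximately .05 level).
significant = abs(p1 - p2) > 2 * se_diff
print(round(se_diff, 4), significant)  # -> 0.0157 False
```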


A Design Limitation for Longitudinal Analyses

The comparison of results from the IDEA NAIS survey to the results from the 2004-2005 SLIIDEA survey would ideally represent the differences between the national populations at those two time points. However, a limitation of the current design arises from the fact that the SLIIDEA sample of districts was selected to be representative of the national population of school districts in school year 1997-1998. Those districts were followed over time, with the final wave of data collection occurring during the 2004-2005 school year. Since the SLIIDEA sample was not refreshed each year to reflect any new districts that came into existence after 1997, it may be a biased estimator of the national population of districts in 2004-2005. It would not be a biased estimator if the causal mechanism of new births of districts that occurred between 1997 and 2004 were completely uncorrelated with the outcomes measured. We cannot know that correlation. However, we do know that the potential for bias is small because less than 2% of the districts that existed in 2004 did not exist in 1997.


Because we did not update the sample to include “new births,” the SLIIDEA estimate is essentially an estimate of what the 1997 districts were doing in 2004. The number of new births (under 2%) is so small that even if we had included them all, they would change our estimate very little (see Appendix G for the proof supporting this claim).
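The bound on the bias follows from simple mixture arithmetic; the 30 percent baseline and the worst-case gap below are hypothetical illustrations:

```python
# Worst-case bias from omitting post-1997 "births": the 2004 population mean
# is a mixture of 1997-cohort districts (at least 98%) and new districts
# (under 2%), so the bias is bounded by share_new * |p_new - p_old|.
share_new = 0.02             # upper bound on the share of new districts
p_old, p_new = 0.30, 1.00    # hypothetical proportions; maximal gap
p_population = (1 - share_new) * p_old + share_new * p_new
bias = p_population - p_old  # cannot exceed 0.02 (2 percentage points)
print(round(bias, 3))  # -> 0.014
```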

Analyses of Differences Among Groups

The main purpose of the study is to describe state and district policies and practices related to early intervention and special education services. However, states and districts differ in how they choose to implement early intervention and special education services for children with disabilities. Thus, the study will also examine differences among groups on selected policies and practices.


State-level Data

The twelfth research question of the study asks, “How does the incidence of disputes between special education personnel and parents/guardians regarding special education services vary with the use of mediation by states?” Answering this question will require classifying states into groups defined by how often they use mediation, and comparing dispute rates between those groups. The comparisons would likely show the mean, median, minimum, and maximum. As in the previously described analytic approaches, because the state data represent a census rather than a sample of states, the summary tables will not include standard errors, confidence intervals, or p-values. Any differences observed between groups represent true differences; there is no sampling uncertainty associated with the state data. An illustrative example is provided in Exhibit 6.


Exhibit 6: Rate of Impartial Due Process Hearings Completed, by Number of Formal Mediations Completed by States

                                      Rate of Impartial Due Process Hearings
                                                                      Range
Number of Formal Mediationsa       N       Mean      Median      Low       High
0
1-20
20-50
50+

a Categories will be created based on the distribution of responses to this item and may include number of mediations per 10,000 students with disabilities.
Source: IDEA NAIS Extant State Policy Data and Part B Survey – Items 42, 47, 48


District-level Data

We also anticipate comparing outcomes among groups of districts in the nationally representative district sample. An example of a research question requiring this type of analysis is research question 4a, which asks, “How do the rates of identification for special education vary according to use of different early intervening strategies?” This analysis will involve classifying districts into mutually exclusive and exhaustive categories based on the distribution of early intervening services (EIS) activities and resources. Summary tables will show the mean and standard error (or 95% confidence interval) of identification rates for each group of districts (e.g., mandatory; voluntary) (see Exhibit 7 for an example of such a table). Other statistics, such as the minimum, maximum, and median for each group, may also be deemed informative and included in the tables.


Exhibit 7: Rate of Identification for Special Education Students, by Distribution of Early Intervening Services Activities or Resources

                                                        Rate of Identification of Special Education Students
                                                                                     Range
Early Intervening Services Activities                   N     Mean (SE)   Median    Low    High    P-value
EIS activities or resources are not distributed
  to individual schools
EIS activities or resources are distributed only
  to schools with evidence of significant
  disproportionality
EIS activities or resources are distributed to all
  schools, regardless of whether they show
  significant disproportionality

Source: IDEA NAIS District Part B – Item 3


Publication and Study Schedule

The schedule for key study activities and published reports is presented below.


Study Schedule

Key Activity or Report                       Timing
Draft Data Collection and Analysis Plan      April 2008
Final Data Collection and Analysis Plan      May 2008
OMB Data Collection Package Submitted        May 2008
Survey Data Collection                       September 2008 – December 2008
Extant Data Retrieval                        January 2008 – December 2008
Report on Completion of Data Collection      February 2009
First Draft Study Report                     June 2009
Final Study Report                           July 2009
Draft Data File and Documentation            December 2009
Final Data File and Documentation            March 2010



A17. Approval To Not Display Expiration Date

The expiration date for OMB approval will be displayed on the survey forms. No exemption is requested.


A18. Exceptions to the Certificate Statement

This submission requires no exceptions to the Certification Statement for Paperwork Reduction Act submissions (5 CFR 1320.9).



References

Dillman, D. A. (2000). Mail and internet surveys: The tailored design method. New York, NY: J. Wiley.

Edwards, P., Roberts, I., Clarke, M., DiGuiseppi, C., Pratap, S., Wentz, R., et al. (2002). Increasing response rates to postal questionnaires: Systematic review. British Medical Journal, 324, 1184-1191.

Kish, L. (1965). Survey Sampling. New York, NY: John Wiley & Sons, Inc.

Markovitz, J., Cason, E., Frey, W., Riley, J., Shimshak, A., Heinzen, H., et al. (2006). Preschoolers' Characteristics, Services and Results: Wave 1 Overview Report from the Pre-Elementary Education Longitudinal Study (PEELS). Rockville, MD: Westat.

Schiller, E., Fritts, J., Bobronnikov, E., Fiore, T., O'Reilly, F., & St. Pierre, R. (2006). Volume I: The SLIIDEA Sourcebook Report (1999-2000, 2002-2003, 2003-2004, and 2004-2005 School Years). Cambridge, MA: Abt Associates, Inc.

U.S. Department of Labor. (2005). The National Compensation Survey. Retrieved March 14, 2008, from www.bls.gov/ncs/home.htm#overview.

1 The other three studies include: (1) analyses of extant data on special education; (2) a study of the impact of NCLB requirements on schools identified as in need of improvement because students with disabilities did not meet AYP requirements; and (3) an impact study of the implementation of Response to Intervention policies included in IDEA 2004.

2 The Study of State and Local Implementation and Impact of IDEA (SLIIDEA) was a longitudinal study collecting data from states in 1999-2000, 2002-2003, 2003-2004, and 2004-2005. District surveys were administered in 1999-2000, 2002-2003, and 2004-2005.
