Att_IDEA NAIS OMB Part B.FINAL 12.08


Individuals with Disabilities Education Act (IDEA) 2004 National Assessment Implementation Study (NAIS)

OMB: 1850-0863










Cambridge, MA

Bethesda, MD

Chicago, IL

Durham, NC

Hadley, MA

Lexington, MA


Supporting Statement for Paperwork Reduction Act Submission to OMB:

Part B




Individuals with Disabilities Education Act (IDEA) 2004 National Assessment Implementation Study (NAIS)


Final




June 25, 2008



Prepared for

Lauren Angelo

IES/NCEE

U.S. Department of Education

555 New Jersey Ave., NW

Washington, D.C. 20208



Prepared by

Abt Associates Inc.

55 Wheeler Street

Cambridge, MA 02138-1168





Contents

B. Collection of Information Employing Statistical Methods

Introduction

B1. Respondent Universe

B2. Procedures for the Collection of Information and Limitations of the Study

B3. Methods to Maximize Response Rates and Deal with Issues of Non-response

B4. Tests of Procedures or Methods

B5. Names and Telephone Numbers of Individuals Consulted

References


Appendix A: Copy of Statute

Appendix B: State Part C Coordinator Survey

Appendix C: State 619 Coordinator Survey

Appendix D: State Part B Administrator Survey

Appendix E: District Part B Administrator Survey

Appendix F: Crosswalk of Research Questions and Survey Items

Appendix G: Mathematical Proof of Why the Potential for Bias Resulting from Not Refreshing Our Sample to Include “Births” in 2004 is Small

Appendix H: Calculation of Minimum Detectable Effects for District-Level Proportions Using the IDEA National Implementation Study (IDEA NAIS) District Sample





B. Collection of Information Employing Statistical Methods

Introduction

The IDEA NAIS is one of a number of studies being conducted by ED to address the congressionally mandated study of IDEA 2004. The IDEA NAIS will examine how states and districts have implemented the 2004 Amendments to the Individuals with Disabilities Education Act (IDEA 2004). The study focuses on four interrelated areas: (1) identification of children for early intervention and special education; (2) early intervention service delivery systems and coordination with special education; (3) academic standards and personnel qualifications; and (4) dispute resolution and mediation.


The IDEA NAIS is a descriptive study based primarily on four surveys that will provide a comprehensive picture of State and local implementation of IDEA across the age range 0-21. Three state-level surveys will be fielded to collect data from: (1) State special education administrators responsible for programs providing special education services to school-aged children with disabilities (ages 6-21) (State Part B administrators); (2) State 619 coordinators who oversee preschool programs for children with disabilities aged 3-5; and (3) State IDEA Part C coordinators who are responsible for early intervention programs serving infants and toddlers from birth to age three. A fourth survey will be fielded at the district level to collect data from local special education administrators about preschool and school-age programs for children with disabilities aged 3-21 (district Part B administrators).


B1. Respondent Universe

In this section, we describe the proposed target population for each State survey and the proposed sampling design for the nationally representative sample of school districts that will receive the district-level survey.


State Surveys

State Part B Administrators Survey

Implementation questions spanning all four of the major topic areas for the implementation study require state-level data on special education programs, policies, and services to provide a comprehensive picture of how state education agencies are implementing IDEA, with a particular focus on changes resulting from the 2004 IDEA Amendments. Our experience collecting data on special education issues suggests that the state special education director is the most appropriate respondent for the majority of items to be addressed, given this official's experience, knowledge, and daily responsibility for the provision of special education services in the state (Schiller et al., 2006). IDEA Part B administrators from all 50 states and the District of Columbia will be administered this questionnaire; thus, there are no sampling considerations.


State 619 Coordinator Survey

In most states, there is an administrator specifically for preschool programs serving children with disabilities who is not the same administrator responsible for school-age programs for students with disabilities referenced above. This administrator, commonly referred to as the Section 619 Coordinator,1 may even be located in a separate agency (e.g., early care) or office from the State Part B administrator. To obtain information on the study topics as they relate to preschool special education programs, policies, and practices, the State 619 coordinator would be the appropriate respondent for this survey. A census of State 619 coordinators will be administered this survey. Thus, there are no sampling considerations.


State Part C Coordinator Survey

The IDEA NAIS will collect state-level information about Part C programs, including the organization and structure of Part C in each state, as well as policies and procedures related to eligibility and identification, coordination with the Part B program, funding and financing, staffing and personnel requirements, early learning standards, family involvement, and dispute resolution and mediation. The Part C coordinator at the lead agency for each state's Part C program would be the respondent most knowledgeable about these topics and how IDEA is being implemented for this population of children. Similar to the other state-level surveys, we will administer the Part C survey to the census of state Part C coordinators. Thus, there are no sampling considerations for this survey.


District Survey

Implementation questions spanning all of the implementation study areas require data on district special education policies and practices. Our experience collecting data on local special education issues suggests that the local special education administrator is the most appropriate respondent for the majority of items to be addressed due to their expertise and role in the district. Unlike the state surveys where we will administer questionnaires to the population of respondents, here we will administer the survey to a sample of 1,200 local school districts, selected in accordance with the sampling plan provided below.


B2. Procedures for the Collection of Information and Limitations of the Study

The following nine steps will be followed in the course of survey administration and collection of information.


Step 1: Collect, and confirm or update, contact information (name, address, telephone, e-mail address) of survey respondents through publicly accessible databases and websites and by follow-up calls to state and district offices.


Step 2: Mail an advance letter from ED to inform respondents of the forthcoming survey, to explain the purpose of the study and its importance, and to ask for their cooperation.


Step 3: Separate procedures will be used for the collection of survey data from states and school districts. For state surveys, we will mail the survey packet with cover letter, questionnaire, and return envelope. The letter will be personalized and will explain the survey and what participation entails, and provide assurance of confidentiality. This packet will be mailed one week after the advance letter.


For the district survey, we will mail a personalized cover letter that explains the survey and what participation entails, provides assurance of confidentiality, and provides the web address for the on-line survey with a separate set of instructions for completing the on-line survey. This packet will be mailed within a week after the advance letter.


Step 4: Send thank you/reminder postcard with a toll free number to call to ask questions regarding survey completion (i.e., request another survey, address technical difficulties with the web-based survey) or about specific items. The postcard will be mailed one week after the survey packet.


Step 5: Make first follow-up contact by telephone or email to confirm receipt of the survey information packet by the correct person and to answer questions. Calls to non-respondents will be made about one week after the reminder postcard.


Step 6: Send a letter and replacement questionnaire to state non-respondents two weeks after the postcard. The letter will stress the importance of participation. Send a reminder letter to district non-respondents, again stressing the importance of participation.


Step 7: Make second follow-up call/fax/e-mail to non-respondents one week after the follow-up mailing to answer any questions and try to overcome any objections.


Step 8: Continue reminder and refusal-conversion telephone calls as needed.


Step 9: Close out data collection.


B2.1 Statistical Methodology for Stratification and Sample Selection

Sampling Plan


State Surveys

The full population of State Part B, Part C, and 619 administrators (i.e., administrators from all 50 states and the District of Columbia) will be asked to complete the appropriate state-level questionnaire; thus, there are no sampling considerations.


District Surveys

It is not feasible for burden and cost reasons to survey the full population of 13,988 school districts in the United States (U. S. Department of Education National Center for Education Statistics, 2008). Thus, a stratified sample of school districts will be drawn as discussed below. The sample of districts for IDEA NAIS will be chosen with two goals in mind: (1) to be nationally representative so as to enable a national description of IDEA implementation; and (2) to have sufficient overlap with the Year 4 Study of State and Local Implementation and Impact of the Individuals with Disabilities Education Act (SLIIDEA) respondents to allow for longitudinal analyses. To meet these two goals, a sample of 400 districts (S1 districts) will be selected from the 849 districts that responded to the Year 4 SLIIDEA district survey. Additionally, an independent sample of 800 additional districts (S2 districts) will be selected from the current national population of school districts. This approach will yield a total sample of 1,200 school districts. The expected precision (95 percent confidence interval) of an estimated proportion assuming a sample of 1,200 LEAs and an 80 percent response rate, which was previously achieved (Schiller et al., 2006, p. 11), would be plus or minus 3.9 percentage points (see Appendix H for details).


Region and Urbanicity Classifications

The population of school districts in both sampling frames will be stratified by four Census regions and three categories of urbanicity, creating 12 strata for sample selection. Urbanicity will be defined according to the metro status code variable on the corresponding Common Core of Data (CCD) files; region will be defined according to the four Census regions.
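One common way to spread a fixed sample across strata such as these is proportional (largest-remainder) allocation. The sketch below is illustrative only: the per-stratum frame counts are invented placeholders, not CCD figures, and the study's actual allocation rules are not specified in this section.

```python
from itertools import product

# The 12 strata: four Census regions crossed with three urbanicity
# categories. Frame counts below are hypothetical, not CCD data.
REGIONS = ["Northeast", "Midwest", "South", "West"]
URBANICITY = ["Urban", "Suburban", "Rural"]


def allocate_sample(stratum_counts, total_sample):
    """Allocate total_sample across strata in proportion to frame size,
    using largest-remainder rounding so the allocation sums exactly."""
    frame_total = sum(stratum_counts.values())
    quotas = {s: total_sample * n / frame_total for s, n in stratum_counts.items()}
    alloc = {s: int(q) for s, q in quotas.items()}  # integer parts
    shortfall = total_sample - sum(alloc.values())
    # hand remaining slots to the strata with the largest fractional parts
    for s in sorted(quotas, key=lambda k: quotas[k] - int(quotas[k]), reverse=True)[:shortfall]:
        alloc[s] += 1
    return alloc


frame = {s: 1000 for s in product(REGIONS, URBANICITY)}  # hypothetical counts
allocation = allocate_sample(frame, 800)  # e.g., the 800 S2 districts
```

With equal frame counts each stratum receives 66 or 67 districts, summing to exactly 800; with unequal counts the allocation tracks each stratum's share of the frame.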


B2.2 Estimation Procedure

The plans for the statistical analysis of the data are presented in Part A, Section A.16.


B2.3 Degree of Accuracy Needed for the Purpose Described in the Justification (MDEs)

The study design utilizes the full population for the three state surveys, thus no minimum detectable effect needs to be calculated as no sampling is involved.


Responding to the research questions for the study requires a nationally representative sample of school districts that can also support longitudinal analysis. Thus, the IDEA NAIS district sample must be nationally representative and must overlap with the districts that responded to the Year 4 SLIIDEA district survey. To meet these two parameters, a sample of 400 districts (S1 districts) will be selected from the 849 districts that responded to the Year 4 SLIIDEA district survey, and an independent sample of 800 additional districts (S2 districts) will be selected from the current national population of school districts, yielding a total sample of 1,200 school districts. At the conventional 95 percent confidence level, the expected precision of an estimated proportion, assuming a sample of 1,200 LEAs and an 80 percent response rate (Schiller et al., 2006, p. 11), would be plus or minus 3.9 percentage points (see Appendix H for details).
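The precision figure can be reproduced with the standard half-width formula for a 95 percent confidence interval on a proportion, 1.96 × sqrt(deff × p(1 − p)/n), where n is the expected number of respondents and deff is a design effect reflecting the stratified, weighted design. A minimal sketch, with the design effect of roughly 1.5 being our assumption for illustration (the study's actual calculation is in Appendix H, not reproduced here):

```python
import math


def moe_95(p, n_sampled, response_rate, deff=1.0):
    """Half-width of a 95% confidence interval for a proportion p,
    given the fielded sample size, the expected response rate, and
    an optional design effect (deff=1.0 is simple random sampling)."""
    n = n_sampled * response_rate  # expected number of respondents
    return 1.96 * math.sqrt(deff * p * (1 - p) / n)


# Worst-case p = 0.5, 1,200 districts fielded, 80% response rate:
srs = 100 * moe_95(0.5, 1200, 0.8)            # about 3.2 points under SRS
adj = 100 * moe_95(0.5, 1200, 0.8, deff=1.5)  # about 3.9 points with an
                                              # assumed design effect of 1.5
```

Under simple random sampling the half-width is about 3.2 percentage points; a design effect near 1.5 brings it to the stated 3.9.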


B2.4 Unusual Problems Requiring Specialized Sampling Procedures

We do not anticipate any unusual problems requiring specialized sampling procedures.


B2.5 Use of Periodic (Less Frequent Than Annual) Data Collection Cycles.

The proposed surveys are one-time data collection efforts.


B3. Methods to Maximize Response Rates and Deal with Issues of Non-response

Section B2 describes the steps that we will take to implement the state- and district-level surveys. These procedures were developed to encourage cooperation and completion of the survey within the data collection period. Exhibit 1 highlights these steps and other specific strategies we will employ to maximize response rates and deal with issues of non-response.




Exhibit 1: Strategies to Maximize Response Rates

Advance notification of survey

  • Gain support and cooperation of district and state administrators by providing advance notice of the survey

Provide clear instructions and user-friendly materials

  • For state-level surveys: send individually labeled survey packets with: 1) an introductory letter from ED; 2) the survey and a cover page that includes the purpose of the study, provisions to protect respondents’ privacy and confidentiality, and a toll-free telephone number to call with questions; and 3) a postage-paid return envelope

  • For district-level surveys: send introductory letter from ED along with a personalized cover letter that explains the survey and what participation entails, provides assurance of confidentiality, and provides the web address for the on-line survey along with instructions for completing the on-line survey.

Offer technical assistance for survey respondents

  • Provide toll-free technical assistance telephone number

  • Provide study website with instructions for web-based survey completion

Monitor progress regularly

  • Produce weekly data collection report of completed surveys

  • Maintain regular contact between study team members to monitor response rates, identify non-respondents, and resolve problems

  • Use follow-up and reminder calls and e-mails to non-respondents


We expect a response rate in the high 80-percent range for the district survey based on our prior success (Schiller et al., 2006, p. 11) and the successful use of the strategies described above. We expect a 100 percent response rate for each of the state-level surveys based on previous work (Schiller et al., 2006, p. 11), the strategies above, and our established relationships with state special education administrators.


B4. Tests of Procedures or Methods

In designing the survey instruments, we drew on questions and survey items that respondents were able to complete in previous studies such as the Study of State and Local Implementation and Impact of the Individuals with Disabilities Education Act (SLIIDEA) and the Pre-Elementary Education Longitudinal Study (PEELS).


Consequently, many of the survey questions have been tested on large samples with prior OMB approval and were found useful for obtaining the targeted information. In addition, we will conduct cognitive testing of each survey with up to nine recruited respondents. The cognitive testing will involve sending a paper copy of the survey to individuals who currently hold, or previously held, the target position for each survey. These respondents will be asked to complete the survey instrument so that we can determine what problems respondents might face in providing the requested information, such as whether the information is readily available, the clarity of items, and the appropriateness of response categories. Study staff who are experts in cognitive testing will follow up by telephone to review the respondent’s feedback on an item-by-item basis. The cognitive testing will also provide estimates of the time needed to complete each survey. The results of the cognitive testing will be used to revise the instruments prior to final submission.


For the district survey, questionnaire items will be tested using the process described above with a paper copy of the survey. Once the items are finalized, usability testing of the web-based version will be undertaken with project staff to ensure that all programming aspects of the survey, such as skip patterns, are working appropriately.


B5. Names and Telephone Numbers of Individuals Consulted

The following people were consulted on the statistical aspects of IDEA NAIS.


Name                 Title                           Telephone
Fran O’Reilly        Project Director (Abt)          617-349-2756
Amanda Parsad        Director of Analysis (Abt)      301-634-1791
Cristofer Price      Project Quality Advisor (Abt)   301-534-1852
Thomas Fiore         Subcontract Director (Westat)   919-474-0349
Kadaba P. Srinath    Senior Sampling Statistician    301-634-1836

References

Lohr, S. L. (1999). Sampling: Design and Analysis. Pacific Grove, CA: Brooks/Cole Publishing Company.

Schiller, E., Fritts, J., Bobronnikov, E., Fiore, T., O'Reilly, F., & St. Pierre, R. (2006). Volume I: The SLIIDEA Sourcebook Report (1999 - 2000, 2002 - 2003, 2003 - 2004, and 2004 - 2005 School Years). Cambridge, MA: Abt Associates, Inc.

U.S. Department of Education, National Center for Education Statistics. (2008). Common Core of Data (CCD). Retrieved March 1, 2008, from nces.ed.gov/ccd




1 Section 619 of Part B of IDEA addresses preschool programs for children with disabilities.

File Type: application/msword
File Title: Supporting Statement for Paperwork Reduction Act Submission to OMB:
Author: Cay Bradley
Last Modified By: DoED User
File Modified: 2008-12-15
File Created: 2008-12-15
