OMB Responses


National Educational Study of Transition


OMB: 1850-0882



1. RQ1 deals with the characteristics of youth with disabilities who have IEPs – we thought that the study would also compare characteristics over time (with cohorts from previous transition studies) and the characteristics of youth with plans under Section 504 of the Rehabilitation Act of 1973 and of youth without IEPs or Section 504 plans. That was the original intent as we understood it. Are these analyses still planned? If not, why are they no longer part of the study?

OMB’s original assumptions are correct. As stated on page 4 of Supporting Statement Part A, the study will draw a sample of youth without IEPs and from that group there will be a subsample of students who have 504 plans. Questions #4 and #5 in Table A.1 indicate that students with IEPs, students with Section 504 plans, and students with neither Section 504 plans nor IEPs will be compared in terms of their school experiences and outcomes (and their family and personal characteristics, although those factors were inadvertently left out of the question stem). In addition, research Question #6 demonstrates our intention to compare the experiences and outcomes (and characteristics) of students with IEPs in 2012 with those of students with IEPs in the two earlier NLTS cohorts.

2. RQ3 and RQ7 deal with academic, social, and economic outcomes for youth with disabilities (I assume with IEPs).  We also thought that the study would compare these outcomes over time from cohorts in previous studies.  Are these analyses still planned and if not, why are they no longer part of the study?

The reviewer’s assumption about RQ3 and RQ7 (Supporting Statement Part A, Table A.1, p. 4) is correct: these questions were intended to pertain to students with IEPs, and the questions will be rephrased to make that clearer. As noted above, Question #6 is where we indicate that we will compare the experiences and outcomes (academic, social, and economic) of youth with IEPs in 2012 with the experiences and outcomes of youth with IEPs in the two earlier NLTS cohorts.

3. How does IES manage sample overlap among its various studies?  Given the size and national scope of this one, along with several others just beginning that are national in scope, this seems like a potentially larger issue now than ever.

IES has not so far explicitly addressed sample overlap among studies, because until recently NCEE primarily conducted smaller, focused intervention evaluations while NCES conducted larger, nationally representative surveys. Within NCEE, we made efforts to share site recruitment contacts and successes among contractors working on different evaluations, but we had little overlap in terms of schools, teachers, or students. We recognize that, now that NCEE is conducting program evaluations that require nationally representative samples, some larger urban districts may be sampled with certainty in a number of studies within both NCEE and NCES. However, we believe that this possibility requires greater sensitivity in recruitment rather than a change in sampling or estimation procedures, for two reasons:

  1. The distinct sample requirements for each study should minimize the overlap of students between studies within those districts.

  2. Removing districts from possible sampling because they are included in another study would have a negative impact on the representativeness and precision of the various studies.

4. Is IES characterizing (in SS A1 and elsewhere) as “federal disability categories” those listed in IDEA?  Please clarify.

The references to “federal disability categories” as a stratifying variable (p. 3, Part A of the Supporting Statement) and elsewhere in the document refer to the thirteen federal disability categories in IDEA.

5. Will students who receive district funds to attend private schools be in scope or out for this study?

The Respondent Universe (Part B, section 1, p. 2) will include all students enrolled in public school districts, including students receiving special education services whose districts determine that the least restrictive placement for the student is in a private school.

6. The approach of interviewing students solely outside of school is quite unusual for a school-based study.  Besides a desire to minimize burden on the schools, what factors led to IES selecting this approach?

As noted in section 5 of Supporting Statement Part A (p. 7), students will be interviewed by telephone, not in school. This approach was selected to limit both school burden and project costs, and it reflects the expected dispersion of the sample across many schools. The challenges of sampling for representation of the major federal disability groups mean that, in the vast majority of districts, the sample will not be clustered by school and relatively few students in each school will be included. This is quite a different design from most NCEE and NCES studies. However, a recent modification to the NLTS 2012 design will most likely warrant some data collection in the schools.

Subsequent to the submission of the recruitment package, for which clearance is being requested at this time, IES decided to exercise an option in the contract to conduct an academic assessment. Originally, IES designed the study to rely solely on student records (attendance, state assessment scores, GPA) for measures of academic outcomes. This strategy was intended to limit costs, reduce burden on districts and schools, and potentially improve their cooperation. In addition, evidence from 12th grade NAEP and other sources suggests that many high school students do not take low-stakes tests seriously, so the measures of achievement may carry significant “noise” or even be unreliable. We continue to believe that a study-administered assessment may suffer from these problems, but after meeting with our Technical Working Group and the Office of Special Education Programs (OSEP), we now assign higher priority to certain analytic and policy objectives. For example, relying only on state test scores would not allow us to examine trends in the academic achievement of students with disabilities over time; too many state assessments have changed since the NLTS data collection. We were also concerned that students within disability categories may be unevenly distributed across states with different types of assessments, making comparisons of average achievement across categories difficult.

This change may necessitate in-school administration of the academic assessment, the Woodcock-Johnson III Normative Update. Chosen to allow comparison with NLTS 2, the assessment requires one-on-one administration. We will likely conduct the assessment on school campuses but will remain open to using other public spaces (e.g., libraries, community centers) as alternatives. We remain committed to conducting the student survey by telephone at the time we obtain consent and complete the parent survey. Our reasoning is that 1) the academic assessment can take a class period or more for many students and, like schools and parents, we want students to miss as little class time as possible; and 2) because the assessment will be given only to students who are 16 years or older, we would not be able to interview the entire sample at baseline if the interview were linked to the assessment. However, if we have not completed a student’s telephone interview by the time the assessment is administered, we will ask the student to complete the interview survey after completing the assessment. See our response to Q12.1 for more information about administration of the academic assessment.

7. Who are the IES staff responsible for this study?

The Contracting Officer’s Representative for this study is Dr. Amanda DeGraff. Dr. DeGraff works closely with Marsha Silverberg, the team leader assigned to this study, who oversees NCEE’s agenda of studies on high school reforms and postsecondary transitions.

8. Why does the sampling plan cluster students in those schools for which no district level lists are available but not other students?   How will the combining of these two approaches affect the variance estimation and analysis plans for the study?

As noted in Part A, section 5 (pp. 5-6), our preferred approach to sample selection is to obtain lists of all students in the district, with information about federal disability category and Section 504 status, and to select the student sample from across all district schools without stratifying by school. We believe that sampling at the district rather than the school level is the best way to obtain precise estimates for students with IEPs overall and for each of the federally defined disability categories. We expect that the vast majority of districts will provide a list of all students; in these cases we will sample directly from the lists and will not need to cluster the sample by school. However, we have also developed a plan for districts that either do not have such a list or are not willing to provide one for the study. In those cases, we will select schools within the district with probability proportional to size, using information on the numbers of students with and without IEPs; after sampling schools, we will sample students within them. In addition, we plan to sample some special schools that serve only youth with disabilities.

The sampling will have two stages (if the districts provide lists of all students) or three stages (if a school-level sample is required). The samples at all stages will be selected without replacement. In the analysis, estimates of the sampling variance can be computed using formulas that assume the samples are selected without replacement. However, when samples are selected with probability proportional to size and without replacement, some of the factors required by these equations to account for the without-replacement sampling are computationally difficult. Instead of computing these factors, the common survey data analysis practice is to assume that the first stage of sampling (the district, in this case) is selected with replacement. Under this assumption, sampling theory shows that the squared differences among the PSU-level totals account for the sampling variance at this and all subsequent stages of sampling.1 This very powerful result is used in many of the national surveys sponsored by the federal government.

This variance estimation approach will provide a reasonably precise, albeit somewhat conservative, estimate of sampling variance: the with-replacement assumption yields somewhat larger standard error estimates than would the exact without-replacement equations. In addition to the difficulty of computing the required factors (the joint inclusion probabilities), only a few software packages can compute sampling variances using the exact equations with joint inclusion probabilities. Because we will assume sampling with replacement at the first stage, our variance estimation will not be affected by the need to use two-stage sampling in most districts and three-stage sampling in some districts.
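To make the with-replacement approximation concrete, below is a minimal sketch in Python (illustrative data and a hypothetical function name, not the study’s production code) of the standard estimator: compute the weighted total contributed by each PSU (district unit), then use the dispersion of those totals.

  import numpy as np

  def wr_variance_of_total(psu_ids, weights, values):
      # Estimate Var(T-hat) for a weighted total, treating first-stage
      # PSUs (district units) as if they were sampled with replacement.
      psu_ids = np.asarray(psu_ids)
      wy = np.asarray(weights, dtype=float) * np.asarray(values, dtype=float)
      # Weighted total contributed by each PSU; the dispersion of these
      # totals captures the variance from this and all later stages.
      totals = np.array([wy[psu_ids == p].sum() for p in np.unique(psu_ids)])
      n = len(totals)
      return n / (n - 1) * ((totals - totals.mean()) ** 2).sum()

  # Toy example: three district units, full design weights, a 0/1 outcome
  var_hat = wr_variance_of_total(psu_ids=[1, 1, 2, 2, 3, 3],
                                 weights=[10, 10, 12, 12, 8, 8],
                                 values=[1, 0, 1, 1, 0, 1])
  standard_error = var_hat ** 0.5

This is essentially the simplification implemented in the major survey analysis packages when a design is declared with first-stage PSUs only.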

9. There are a number of details about sample stratification written as though they had not yet been decided in SS B2.  Please update with actual plans.


Below is an updated version of section B2 of the Supporting Statement (Part B, pp. 3-7), which reflects actual plans.

B2. Statistical Methods for Sample Selection and Degree of Accuracy Needed

Two-stage sampling will be used to select approximately 15,000 youth ages 13 to 21 as of December 2011. Of these youth, approximately 12,000 (80 percent) are expected to respond. The respondents will include approximately 9,600 students with IEPs and 2,400 students without IEPs. Of the 2,400 students without IEPs, approximately 600 will be students with Section 504 plans and 1,800 will be students with neither an IEP nor a Section 504 plan.

The sampling design balances several objectives but places the highest priority on obtaining precise overall estimates for all students with IEPs and precise estimates for each of the federally defined disability categories. Other priorities are to obtain estimates for the Section 504 students and students with no IEP and no Section 504 plan.

The sampling design for this study was developed to support survey estimates with the precision needed for policy analysis for the 13 categories of students with disabilities specified in the Individuals with Disabilities Education Act. Prevalence varies substantially among these categories, with some (such as students with learning disabilities and students with intellectual disabilities) far more prevalent than others (such as students who are deaf, blind, or both). For most students in the more prevalent categories, the school district is an efficient vehicle for identifying and selecting a sample, and we will use a two-stage sampling design to select these students. A large share of students who are deaf and/or blind are educated in state-sponsored schools for the deaf and/or blind; we will use these schools as a primary source of students in this disability category and supplement that sample with students selected through the district-based sample.

The primary sample will be selected in two stages. In the first stage, the study team will form primary sampling units and randomly select approximately 450 district units using ED’s Common Core of Data (CCD), with the expectation that approximately 300 district units (approximately 375 districts) will agree to participate in the study. We expect the 450 district units to comprise approximately 560 individual districts (including charter school districts). Additional district units will be randomly selected as a reserve, in case they are needed to recruit 300 participating district units. In the second stage, the study team will obtain lists of students with IEPs, students with a Section 504 plan, and students with neither an IEP nor a Section 504 plan from the approximately 375 participating districts. We will then allocate the sample among these strata (the disability categories, students with a Section 504 plan, and other students) and select the student samples. Details of the proposed sample selection are described below.

a. District Sampling Frame

The sampling frame for the districts in the study comes from the NCES Common Core of Data (CCD). Approximately 14,200 Local Education Agencies (LEAs) nationwide serve students with IEPs in grades 7-12 or between the ages of 13 and 21. To achieve a sufficient sample in the least prevalent disability categories, we estimate that the primary district-level sampling unit will need to serve at least 375 students with IEPs. This number of students with IEPs per primary sampling unit is necessary to ensure that the sample includes adequate numbers of students in low-incidence disability categories to support descriptions of these key groups; the estimate is inflated to account for missing or suspect IEP counts in the CCD data file and for loss of sample due to nonresponse. To support efficient data collection, the study team will combine some nearby districts into district units: in particular, districts with more than 30 and fewer than 375 students with IEPs will be combined so that each unit contains at least 375 students with IEPs. Larger districts will not need to be combined and will serve as their own district units. The 5,140 districts serving fewer than 30 students with IEPs, which in the aggregate serve less than 3 percent of all students with IEPs, will be excluded from the study. For district units consisting of multiple districts, the study team will compile sample information from all component LEAs and sample from the combined student populations. Within district size strata, the sample of districts and district units will be selected with probability proportional to a composite size measure based on the IEP and non-IEP populations in the districts. This measure increases the selection of districts with more students with IEPs and can provide nearly self-weighting samples of students within the federal disability categories in each district size stratum.

b. Stratification of the District Sample

The study team will stratify the district units before sample selection. The primary explicit stratification of the district sample will be by district size. Approximately 61 percent of students with IEPs attend school in districts with 375 or more students with IEPs (large districts); 16 percent attend districts with 200-375 such students (medium districts); and 23 percent attend districts with between 30 and 200 such students (small districts). To keep the total number of districts to be recruited and the costs of data collection at reasonable levels, students attending the smallest districts will be sampled at 50 percent of their proportion of the total population, and students in the large districts will be sampled at about 118 percent of their proportion of the total population (see Table 1). In addition, schools serving deaf and blind students will form a separate stratum.


TABLE 1

ALLOCATION OF NLTS 2012 SAMPLE TO SMALL, MEDIUM, AND LARGE DISTRICT UNITS

District Unit Size     Percent of    Proportional   District   Revised      Percent of   Sampling   Students with     Approx. Districts
                       Students      Allocation     Units      Allocation   Sample       Rate       IEPs / District   Recruited
                       with IEPs

Total                  100%          9,600          275        9,600        100%         --         --                376
Large (375 or more)    61.4%         5,893          165        6,972        72.6%        1.18       41                165
Medium (200 to 374)    16.1%         1,548          44         1,548        18.1%        1.00       35                74
Small (30 to 200)      22.5%         2,159          62         1,080        11.2%        0.50       17                137


The team will use implicit stratification on the following variables to ensure that the sample reflects the nationwide distribution of students along these dimensions: geographic region, degree of urbanicity, percentage of students living in families with income below the federal poverty level, and extent of minority enrollment. In implicit stratification, the sampling frame within a stratum is ordered by a factor such as region of the country; a sequential selection procedure then yields a sample that is approximately proportionally allocated across regions (see the sketch below).
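As a concrete illustration, the Python sketch below (a hypothetical frame with equal selection probabilities, for simplicity; the actual design combines this ordering with the PPS size measure described in the next subsection) shows how sorting the frame before systematic selection spreads the sample across regions without explicit region strata.

  import random

  # Hypothetical frame: 150 district units tagged with a census region
  frame = [("unit%03d" % i, region)
           for i, region in enumerate(["Northeast"] * 30 + ["Midwest"] * 25 +
                                      ["South"] * 60 + ["West"] * 35)]
  frame.sort(key=lambda rec: rec[1])   # ordering induces the implicit strata

  n = 15
  interval = len(frame) / n            # sampling interval
  start = random.uniform(0, interval)  # random start
  sample = [frame[int(start + k * interval)] for k in range(n)]
  # Each region's share of the sample now approximates its share of the frame.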

c. Size Measure for District Selection

The study team will use a composite size measure to select the sample of districts and district units within a given stratum.2 The composite size measure will be based on the district-level counts of students with IEPs, N(students with IEPs in district i), and students without an IEP, N(students without IEPs in district i), together with global sampling rates for students with IEPs, f(IEP), and for those without an IEP, f(W/O IEP), using data available from the CCD. The size measure for the ith district will be of the form

Si = f(IEP) * N(students with IEPs in district i) + f(W/O IEP) * N(students without IEPs in district i)

We expect that some districts (such as New York City, Los Angeles, and Chicago) with large student populations will be selected with certainty, and the study team will use this size measure to identify them. The remaining districts within a stratum will be selected with probability proportional to the composite size measure and without replacement. This composite size measure can result in nearly self-weighting samples of students within the disability categories in each size stratum (see the sketch below).
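A minimal sketch of how the composite size measure and certainty identification might work (the rates and district counts below are purely illustrative; the actual rates would be derived from the target sample sizes and CCD counts):

  # Illustrative global sampling rates, not the study's actual values
  f_iep, f_no_iep = 0.01, 0.0005

  districts = [
      # (name, students with IEPs, students without IEPs) -- hypothetical
      ("Very large district", 80000, 900000),
      ("Mid-size district", 2000, 20000),
      ("Smaller district", 400, 5000),
  ]

  def size_measure(n_iep, n_no_iep):
      # S_i = f(IEP) * N(IEP students) + f(W/O IEP) * N(non-IEP students)
      return f_iep * n_iep + f_no_iep * n_no_iep

  n_to_select = 2
  total = sum(size_measure(a, b) for _, a, b in districts)
  for name, a, b in districts:
      prob = n_to_select * size_measure(a, b) / total  # PPS inclusion prob.
      # Units whose probability reaches 1 are taken with certainty; in
      # practice they are set aside and the rest are rescaled and sampled.
      print(name, "certainty" if prob >= 1 else "p = %.3f" % prob)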

To enable the undersampling of students in districts with 30-200 students with IEPs, the study team will create “half-units” in the small district stratum, each containing half of the target 375 students with IEPs. In this way, students in these districts will be selected into the sample at a rate that is 50 percent of their incidence in the population, and each of these district units will contribute half the number of sample members that medium districts contribute.

d. Student Selection

Based on federal reporting requirements and on the experience of NLTS 2, we anticipate that all districts will maintain lists of students by federal disability category. Based on information from the ED Office for Civil Rights, we anticipate that most districts will also maintain a list of non-IEP students with Section 504 plans. Using these lists, the study team will assign each student age 13 to 21 to one of the strata (one of the IEP disability categories, the stratum of non-IEP students with Section 504 plans, or the stratum of non-IEP students without Section 504 plans). The study team will then draw a random sample from each stratum (controlling implicitly by grade level and school) at a rate designed to yield the target number of students in each stratum. The team will also select a reserve sample to account for students who prove ineligible or choose not to respond.

It is anticipated that some districts will neither maintain lists by disability category nor have lists of students with Section 504 plans.3 In these districts, the study team will first select schools and then obtain the lists from the selected schools. The schools will be selected with probability proportional to size (using a size measure such as the number of non-IEP students).

The study team expects to interview approximately 32 IEP students/parents and 8 non-IEP students/parents in each of the districts or district units. To obtain this many respondents from each district, the study team will select samples of approximately 40 students with an IEP and 10 students without an IEP, based on an anticipated response rate of 80 percent.

e. Precision and Minimum Detectable Differences

Table 2 presents target sample sizes and estimates of precision for a set of disability category subgroups and the non-IEP sample (divided into Section 504 students and all other students). All of the sample sizes in this table represent the estimated number of youth (or parents) responding to the surveys. This sample allocation is designed to allow meaningful precision for survey estimates and minimum detectable differences of approximately 0.10 for proportions near 0.50 (for a two-sided test with alpha of 0.05 and 80 percent power) for most of the disability categories. The precision estimates are based on an allocation of 600 respondents with Section 504 plans. The table presents estimates of minimum detectable differences (MDDs) for comparisons between the subpopulations and two larger populations: all students with IEPs and all students without IEPs.
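The half-widths and MDDs in Table 2 follow from standard formulas; the sketch below reproduces the arithmetic under a simple-random-sampling assumption (the published values also reflect design effects from the multistage design and weighting, so they run somewhat larger than these).

  from math import sqrt

  Z_ALPHA = 1.96  # two-sided 95 percent confidence
  Z_POWER = 0.84  # 80 percent power

  def half_width(p, n):
      # Half-width of a 95% confidence interval for a proportion p
      return Z_ALPHA * sqrt(p * (1 - p) / n)

  def mdd(p, n1, n2):
      # Minimum detectable difference between two groups for a proportion
      # near p, alpha = .05 (two-sided), 80 percent power
      return (Z_ALPHA + Z_POWER) * sqrt(p * (1 - p) * (1 / n1 + 1 / n2))

  # Example: a category with 1,600 respondents vs. the other 8,000
  # students with IEPs, for an attribute held by about half the population
  hw = half_width(0.50, 1600)   # about 0.025 under SRS
  d = mdd(0.50, 1600, 8000)     # about 0.038 under SRS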

Three categories of disabilities (traumatic brain injury, visual impairments, and deaf-blind) are too rare to support reliable estimates individually without shifting too much sample from much larger categories. For some of the analysis these categories may be combined with others to provide more reliable estimates.


TABLE 2

SAMPLE SIZES, PRECISION, AND MINIMUM DETECTABLE DIFFERENCES FOR SUBPOPULATIONS DEFINED BY DISABILITY CATEGORY

                                       Proposed/     Half-Width of 95%        Minimum Detectable
                                       Estimated     Confidence Interval at   Differences (MDD) vs.
                                       Sample Size   Selected Proportions
                                                     p = .50     p = .10      With IEPs   Without IEPs

All Students without IEPs              2,400         0.022       0.079        0.076       --
  Without Section 504 Plans            1,800         0.025       0.087        0.080       --
  With Section 504 Plans               600           0.043       0.131        0.148       0.137
All Students with IEPs                 9,600         0.018       0.011        --          0.079
  Specific Learning Disabilities       1,600         0.027       0.090        0.096       0.098
  Other Health Impairments             1,200         0.031       0.010        0.101       0.107
  Intellectual Disabilities            1,200         0.031       0.010        0.101       0.107
  Emotional Disturbance                1,200         0.031       0.010        0.101       0.107
  Speech or Language Impairments       1,000         0.033       0.107        0.106       0.114
  Autism                               1,000         0.033       0.107        0.106       0.114
  Multiple Disabilities                900           0.035       0.111        0.110       0.118
  Hearing Impairments                  600           0.043       0.131        0.126       0.137
  Orthopedic Impairments               450           0.049       0.149        0.142       0.154
  Combined (three categories below)    450           0.049       0.030        0.149       0.154
    Traumatic Brain Injury             233           0.068       0.041        0.201       0.204
    Visual Impairments                 204           0.073       0.044        0.214       0.217
    Deaf-Blindness                     20            0.287       0.172        0.822       0.823

Note: MDDs apply to comparisons between the row subpopulation and either all students with IEPs (excluding those students in the specific row subpopulation) or all students without IEPs. The MDDs are computed for detecting a difference in a proportion near 0.50 for a test with alpha of 0.05 and 80 percent power.


10. What is the basis for estimating an 80 percent response rate of students at baseline? 

As noted in Supporting Statement Part B (p. 6), we anticipate that data will be available at baseline for 80 percent of the student sample. Achieving high response rates and retaining a youth sample over the life of a multi-year longitudinal study is challenging, as the experience of NLTS 2 demonstrates (see the discussion of sample attrition under question 12 below for a description of NLTS 2 attrition rates). For this study, the challenge of securing parental consent for students to participate and completing data collection led us to set a response rate target of 80 percent of the sample for baseline data collection. Below we outline Mathematica’s experience interviewing similar populations and describe why we believe the 80 percent target is ambitious but realistic.

Mathematica has achieved response rates of 80 percent or higher on several surveys of disadvantaged youth in transition, and we have estimated our baseline and follow-up response rates from those studies. For the Youth Transition Demonstration, the largest demonstration funded by the Social Security Administration to help young people with disabilities make successful transitions, Mathematica is collecting data from parents and youth at baseline and at 12 and 36 months after random assignment. For the baseline and 12-month surveys, Mathematica achieved response rates of approximately 87 percent; for the 36-month follow-up, the response rate was 82 percent. For the National Job Corps Study, sponsored by the U.S. Department of Labor, Mathematica followed more than 15,000 youth over four years. At baseline the response rate was 93 percent for the full research sample; after four years, Mathematica was able to locate and interview 78 percent of the sample.

Achieving high response rates to baseline and follow-up surveys will require a combination of techniques that Mathematica has refined over the past 40 years, including:

  • Compelling advance materials, including brochures about the study, FAQs, and endorsements from leading organizations.

  • Assurance to sample members that the information they provide will be secure, treated confidentially, and used only for research purposes.

  • Well-designed questionnaires, with cognitively tested and easy-to-answer questions.

  • A toll-free help line for sample members to call with concerns or to schedule an appointment, and well-trained interviewers able to address sample members’ concerns.

  • Multiple attempts to reach respondents at various times of the day and week.

  • Specialized refusal conversion and training as needed.

  • Providing a monetary thank-you (as determined by OMB) to show appreciation for participants’ time and effort.

We note that the surveys referred to above achieved approximately 80 percent response rates at follow-up, whereas we are assuming a response rate of approximately 80 percent at baseline for NLTS 2012. We believe these more conservative planning assumptions are appropriate for NLTS 2012 for two reasons. First, we anticipate that some districts will not be willing to provide contact information without prior consent from parents; we expect this to reduce the percentage of the initial sample for whom we acquire consent and baseline data, relative to our basic plan of securing recorded verbal parental consent by telephone. Second, the Job Corps study included an attempt to conduct in-person interviews with sample members who could be located but did not complete the interview by telephone; NLTS 2012 does not include in-person follow-up.

11. What is the estimated response rate at the time of follow-up?

We estimate that first follow-up data collection will be completed for approximately 75 percent of the original sample two years after the study baseline data collection. At first follow-up, we will attempt to locate and interview all sample members, including both those who completed a baseline and those who did not.4 Based on similar studies conducted by Mathematica, including the two mentioned above, we believe keeping sample attrition to approximately 6 percent over a two-year interval is feasible. An important technique for retaining sample will be to collect a substantial amount of contact information from families at baseline. In addition to the names, addresses, and telephone numbers (landline and cellular) of sample members and of close friends or relatives who do not live with them, we will ask for e-mail addresses to which we can send reminders, and we will ask permission to send text messages to cellular phones. We will also take advantage of social media. According to a 2009 survey from the Pew Research Center’s Internet & American Life Project, 73 percent of online American teens ages 12 to 17 used an online social network website, a share that has climbed steadily from 55 percent in November 2006 and 65 percent in February 2008; older online teens are more likely than younger teens to report using online social networks. Mathematica will therefore ask sample members and parents for their social network screen names and permission to use social networks to contact them with reminders about the follow-up survey. By collecting all of these kinds of contact information, we believe Mathematica can achieve a response rate of approximately 75 percent in 2014.

12. Although we understand that IES is not seeking approval at this time for data collection from schools, parents, and students, we believe that we need significantly more detail than has been provided thus far about later stages of the study. For example,

12.1. What is the data collection strategy, including mode, for interviewing students at baseline?

This section and the next provide additional detail on the study’s data collection design beyond that provided in the introductions to Parts A and B and in section 6 of Part A (p. 8). At baseline, both parents and students will be interviewed by telephone. Mathematica will obtain oral consent from parents first and, depending on the student’s age, assent or consent from the students; written documentation of the oral consent will be sent to participants for their records. The parent interview is expected to take 40 minutes and the student interview about 30 minutes. If a student’s disability prohibits self-response, interviewers will ask the parent or guardian to serve as a proxy for non-subjective questions. All questions will be designed to be answerable by persons with disabilities: questions will avoid high-frequency sounds, offer simple probes if the respondent does not understand the main question, and accept ranges if exact response categories are unknown. Interviewers will assess the respondent’s emotional and physical state and offer breaks if necessary.

In addition, direct assessments of academic proficiency will be conducted with all sample members, using the Woodcock-Johnson III Normative Update, at the point they are approximately 16 years old or older. The assessment will be conducted one-on-one by an assessor hired and trained by Mathematica. Accordingly, sample members who are 16-21 at baseline will be asked to complete the direct assessment in spring 2012; direct assessment of sample members who are 13-15 at baseline will be conducted at the first follow-up in spring 2014, when most will be 16-17 years old.

Finally, we note that data will be obtained in spring 2012 from the following staff at the school each sample member attends: 1) the principal or designee, who will furnish data on school programs, policies, and resources; and 2) the teacher who teaches the sample member math or language arts. For students with IEPs, the special education teacher most familiar with the sample member’s program will provide additional information about the student’s school program and transition services. Each of these school staff will be asked to complete a web survey, with telephone follow-up of those who do not complete the self-administered web survey.

12.2. What is the follow up interval from baseline and will this vary by age and grade? Will there be any intermediate contact?

Follow-up data collection will be conducted in spring 2014. Follow-up surveys will be administered to youth, their parents, and, for youth still in school, the special education staff member most familiar with each student’s school program. The follow-up interval from baseline will be two years and will not vary by age or grade, except for the student direct assessment, as stated in 12.1.

Mathematica is not planning an interim contact because of the large amount of contact information collected at baseline. We have assumed that one-third of the cases will need to be located for the follow-up interviews. Mathematica will begin sending reminders by text message and social media about one month before interviewing begins, and will send an advance letter to sample members not reachable through electronic media about one week prior to the interview. Students who reach the age of consent between the baseline and follow-up interviews will be asked to consent for themselves prior to answering follow-up questions.


12.3. How will IES locate students who have left the district for any reason, including graduation?

At each data collection point, parent interviews are conducted prior to student interviews. If a youth no longer lives with his or her parents, we will ask the parent how we can reach the youth; in most cases, parents will know where the youth has relocated and will provide contact information. If a family has moved, various locating methods will be used, starting with searches of publicly available databases, contacts provided at baseline, and mail returned as undeliverable with forwarding addresses. Again, we expect that one-third of the sample will need some kind of locating prior to follow-up interviewing.

12.4. What levels of attrition does IES project and based on what? Do you expect those to vary differentially by IEP and non-IEP students?

Given the response rates for similar types of studies, including NLTS 2, we expect that 20 percent of the students selected for the sample will not respond at baseline. We will try to contact the entire selected sample for the follow-up interview regardless of whether the parent/child was reached for the baseline interview. We expect that we will not be able to interview 25 percent at follow-up, but this group will not necessarily include all of the 20 percent who did not respond at baseline.


Our expectations about attrition are also consistent with a recent analysis of response rates conducted for us by our colleagues at NCES. The Education Longitudinal Study (ELS), a longitudinal survey of 10th graders in 2002, found just over a 6 percent loss between the baseline and first follow-up interviews. While ELS experienced differential attrition for students with disabilities (about 10 percent) relative to those without disabilities (about 6 percent), we do not expect such a gap. Because ELS was a study of primarily general education students, little extra or targeted effort was devoted to retaining students with disabilities and their parents. In contrast, NLTS 2012 has the support of the special education community and its federal leaders. We continue to present at conferences and meetings about the study and are in the process of obtaining letters of support from stakeholder groups (e.g., state special education coordinators) in addition to the Assistant Secretary of OSERS. We believe this extra backing will enable us to achieve higher response rates for students with disabilities at follow-up than NLTS 2 achieved. Our plan to rely heavily on administrative records and third-party data for key outcomes also mitigates the consequences of differential attrition in survey responses.


However, if a differential were to persist across multiple interview waves, it would become a much larger problem and a serious threat to the study. To address this possibility, IES plans to monitor survey completion rates closely with Mathematica, overall and by IEP status. The differences noted in ELS underscore the need for 1) careful discussion in our analysis planning of what level of overall or differential attrition should trigger additional survey efforts to minimize adverse effects, and 2) planning of specific strategies to bring additional resources to bear should this become necessary.


12.5. What MDEs does IES expect to have at the end of the study versus at baseline?  Are these meaningful for the questions that IES wants to answer?

We believe the study is well designed to address important questions about the experiences of youth who were between the ages of 13 and 21 in fall 2011 as they move from school to adulthood, and about differences in the transition-related experiences of youth with and without IEPs. The MDEs for the baseline data of Phase I are shown in Table 2 (p. 9). For the major comparisons of interest, we will be able to detect differences of between .08 standard deviations (e.g., students with IEPs vs. students without IEPs) and .14 standard deviations (students in an individual disability category vs. all other categories).

Phase II of the project is expected to continue following the NLTS 2012 sample after 2014. However, the schedule for data collection beyond 2014 has not been set. To address the reviewer’s question about later points in the study, the accompanying Table 3 presents precision estimates for the second data collection point in spring 2014 and for a not yet planned, and therefore hypothetical, data collection point in spring 2020. We have selected 2020 for purposes of this discussion because it creates an 8-year follow-up period that will support comparisons between students in the NLTS 2012 sample who were 13-16 at baseline and students in the NLTS 2 sample.

The first set of columns in Table 3 shows the half-width of 95 percent confidence intervals at selected follow-up data collection points for selected subgroups of the full NLTS 2012 sample. The second set of columns shows minimum detectable differences (MDDs) for each data collection point: the first column in that set shows the MDD between the row subgroup and all IEP students, and the next column shows the MDD between the row subgroup and all students with no IEP. The response rate assumptions are that 80 percent of the sample provides data at baseline, that data collection occurs every two years, and that the sample available after each round is 94 percent of the sample available at the previous round.
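The r values in the table follow directly from these assumptions, as the one-function sketch below shows.

  def retention(follow_up_round, baseline_rate=0.80, per_round=0.94):
      # Fraction of the original sample with data after a given round
      return baseline_rate * per_round ** follow_up_round

  print(retention(0))  # baseline: 0.80
  print(retention(1))  # FU 1 (spring 2014): 0.752
  print(retention(4))  # FU 4 (hypothetical spring 2020): about 0.625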

The estimates in Table 3 suggest the study will be able to estimate attributes of all students with IEPs and all students without IEPs to within about +/- 2 percentage points at both the beginning and end of the 8-year period. Furthermore, the sample will be able to detect differences in the attributes of IEP and non-IEP students of about 9 percentage points two years after baseline (spring 2014) and about 10 percentage points eight years after baseline (spring 2020).

Table 3 also shows the precision of estimates for the other subgroups including the specific federal disability categories and students with Section 504 plans. The precision of subgroup estimates is less than that for all IEP and non-IEP students. The precision for the rarest disability categories is the lowest.

TABLE 3

PRECISION AND MINIMUM DETECTABLE DIFFERENCES AT VARIOUS DATA COLLECTION POINTS

                               Half-Width of 95 Percent          Minimum Detectable Difference Between
                               Confidence Intervals (a)          Column and Row Groups (b)

Data collection point          Base       FU 1 (c)   FU 4 (d)    Base                  FU 1                  FU 4 (d, e)
Age of sample at follow-up     13-21      15-23      21-29       13-21                 15-23                 21-29
Proportion of sample w/ data   r = .8     r = .752   r = .625    r = .8                r = .752              r = .625
                                                                 vs. IEP  vs. No IEP   vs. IEP  vs. No IEP   vs. IEP  vs. No IEP

All Students Without IEPs      0.022      0.025      0.028       0.079    --           0.091    --           0.100    --
  With Section 504 Plan        0.043      0.049      0.054       0.131    0.137        0.151    0.158        0.166    0.173
  Without Section 504 Plan     0.025      0.029      0.032       0.087    0.095        0.100    0.109        0.110    0.120
All Students with IEPs         0.017      0.020      0.022       --       0.079        --       0.091        --       0.100
  Other health impairments     0.030      0.035      0.038       0.100    0.107        0.115    0.123        0.126    0.135
  Autism                       0.033      0.038      0.042       0.107    0.113        0.123    0.131        0.135    0.144
  Orthopedic impairments       0.049      0.057      0.062       0.149    0.154        0.171    0.177        0.188    0.194

(a) Shows the CI for an attribute held by approximately half the population (p = .5).

(b) Shows the minimum detectable difference for a contrast between the subgroup in the row head and the subgroup in the column head, for an attribute held by half the population, using a 95 percent confidence interval and 80 percent power. Thus, if the difference between groups in the population exceeds the value shown, the study will have an 80 percent chance of correctly rejecting the null hypothesis of no difference using a two-tailed test at the 95 percent confidence level.

(c) FU 1 refers to the first follow-up, planned for spring 2014.

(d) FU 4 refers to a hypothetical, not yet planned fourth follow-up in spring 2020.

(e) The hypothetical fourth follow-up assumes a second follow-up (also not yet planned) would be conducted in spring 2016, a third in spring 2018, and a fourth in spring 2020, and that at each follow-up point data are available for 94 percent of the number of cases available at the previous round of interviewing. These assumptions are used solely to respond to the OMB reviewer’s question about sample precision at later stages of the study.

12.6. What is the anticipated burden on each of the respondents? What is the justification for incentivizing them, especially principals, whom we almost never incentivize? What is the justification for the incentive levels for parents and students? What is the proposed incentive level for teachers, and why? Please note that the letter to the superintendents indicates that every participant will be incentivized. Also, the proposed consent form to parents indicates that both parents and students will receive $25. We cannot approve these communications as written without a full discussion of planned incentives.

Although we acknowledge that incentives have not always been necessary to achieve high response rates with principals, we believe it is very important to offer a small incentive ($25) to the principal or the person he or she designates to complete the school characteristics questionnaire for NLTS 2012. This survey will take approximately 30 minutes and will ask principals to describe resources, programs and policies of the school. It is vitally important that we obtain an adequate response from principals because the survey will provide both school context data and the means by which we will estimate students’ “access” to various programs and resources, not simply their participation.

We think paying for principals’ time to complete the survey is justified for several reasons. First, although it has historically been OMB’s perspective that principals should complete ED-sponsored surveys as part of their regular duties, we believe there are circumstances when that rationale is more or less compelling. For example, payment for completion of a principal survey may be less necessary in studies where there is otherwise a benefit to the school from participating; that benefit may include a “school” payment to offset study burden, a study-provided intervention (e.g., a promising curriculum or induction program), or a strong presence of the study team in the school (e.g., a significant set of participating students, classroom observation, conducting professional development, etc.). We might expect that these components of the study could affect the principal’s motivation to complete the survey either extrinsically (the school is receiving benefits for participation) or intrinsically (principal agreed to participate in larger study and is therefore more likely to fully participate).


However, in the case of NLTS 2012, the principal is unlikely to have similar motivation because:


a) The school, as a whole, is not receiving any direct benefit for participation, thus not providing extrinsic motivation;

b) The prescriptive sample design required to obtain appropriate counts in each disability category makes it possible that a principal may have only one or two students in the school selected to participate in the study; that level of exposure to the study is unlikely to provide sufficient extrinsic motivation to complete the survey;

c) There will be no or very limited face-to-face contact with members of the study team, contact that could otherwise provide some social motivation for completing the survey (most of the NLTS 2012 surveys will be completed via the web or telephone); and

d) There is no separate stage at which principals agree to participate in the study (the youth is the targeted sample); therefore, we cannot count on principals’ intrinsic motivation to complete the survey.

Second, IES does have a history of providing incentives to principals in cases where the principal/school is not tied in a meaningful way to the study. Some relevant examples include:

  1. Impact of Charter School Strategies (1850-0799, NOA 3/10/06): paid $10 for a 15-minute survey. The circumstances are similar to those in NLTS 2012 in that no intervention or treatment was provided to the schools. In addition to the 37 charter schools in the study sample that did have significant contact with the evaluation team, the survey included the principals of hundreds of traditional public schools (wherever the control group went) as well as the other 500+ charter middle schools in the country. Neither of the latter two groups had any connection to the study or derived any benefit from completing the survey.

  2. Impact Evaluation of the DC School Choice Program (1850-0800, NOA 4/15/05 and 12/22/08): This was a 12-minute hard-copy survey that went to principals of all private and public schools in DC and paid them $10 for completion. None of them was connected to the study, although about 66 of the 102 private schools were receiving vouchers from participating students. Due to poor response rates, OMB approved increasing the incentive to $20 in 2008.

Third, given the budget and staffing shortages many schools face, this payment will partially compensate participating school leaders for time spent completing the survey, which will almost certainly be done outside regular school hours. This small expression of appreciation for their time and effort can only serve to provide a positive experience in working with ED, which may result in more cooperation with future studies.

We are also proposing to provide incentives to parents and students because substantial incentives are widely considered necessary to maintain sample in longitudinal studies. Laurie and Flynn (2008) review a large number of longitudinal surveys in the U.S. and Europe; their summary indicates that incentives are widely used in U.S. longitudinal studies. For example, in 2005-2006, the Panel Study of Income Dynamics, the Survey of Income and Program Participation, the National Longitudinal Survey of Youth, and the U.S. Health and Retirement Survey offered adult respondents incentives ranging from $40 to $60, with additional incentives in NLSY if respondents called in to complete a telephone interview. Laurie and Flynn note that 1) higher incentives tend to produce higher response rates, 2) there is some evidence that effects on wave-to-wave retention are more pronounced than effects on response rates at a particular wave, and 3) there is some, though not consistent, evidence that incentives are most effective with the sample members least likely to respond. We therefore propose $20 and $10 incentives for the baseline interviews with parents and students, respectively.

The anticipated time required of each type of survey respondent, and the proposed incentive for each, are summarized below for the spring 2012 baseline data collection of the National Longitudinal Transition Study 2012.

Baseline Surveys and Youth Assessments—Spring 2012

Respondent                      Interview Length   Incentive
Principal                       30 minutes         $25
Math or Language Arts Teacher   25 minutes         $25 per student
Special Education Teacher       35 minutes         $25 per student
Parent (Interview)              40 minutes         $20
Youth (Interview)               30 minutes         Cash value card worth $10
Youth (Academic Assessment)     15-45 minutes      No incentive




Math or Language Arts Teacher Survey. Each sample member’s math or language arts teacher will be asked to complete a web survey (with telephone follow-up) about that student’s math or language arts class (program of study, participation in class, supports, instructional strategy). Given that the sample is not clustered by school, we anticipate that most teachers will be asked to respond for one student; however, we will offer an incentive of $25 for each student for whom the teacher completes a questionnaire.

Special Education Teacher of Sample Members with IEPs. The special education teacher most familiar with the student’s overall program will be asked to complete the school program survey, which covers the characteristics of the student’s instructional programs, services, supports and accommodations, and transition planning activities. For the reasons outlined above, we believe it is important to offer an incentive of $25 for each sample member about whom the teacher provides information.

Parent and Youth Data Collection

Ensuring parent and student commitment to the study is of paramount importance. Our first contact will occur in one of two ways: (1) when we call parents and students directly for consent and baseline interviewing, or (2) when they receive a consent form from the district asking to release contact information to the study team (see Attachment A). In either case, we propose to offer parents and students incentives of $20 and $10, respectively, to encourage their participation. The parent incentive would be a check; the student incentive would be a cash value card worth $10.

These amounts differ somewhat from the $25 that the OMB reviewer noted was proposed in the consent form to release contact information included in the clearance submission. The reduction responds to OMB’s clear disinclination toward parent and student incentive payments and brings the amounts more in line with those approved for other studies. However, we believe the unique circumstances of NLTS 2012 call for appropriate incentives. First, there is a significant burden on parents, who will in many cases serve as proxies for their children, responding to both the student and parent surveys. Second, the study will need to contend with historical concerns about providing researchers with access to students with disabilities.

12.7. How will the payment levels to districts be set?

There is not a “set” procedure for determining a payment level for districts. The intent of the payments is to secure high rates of district participation by minimizing the financial burden on districts, particularly at a time when school and staff budgets are very tight. While some districts may not ask for any compensation, others may need to pay overtime or hire staff to fulfill our data requests. Mathematica will compensate districts directly for the extra staff time or add district staff to its on-call payroll, with the payment level a function of the number of hours district staff require to complete our request. Given historical experience collecting both rosters and administrative records, we expect that this work will require skilled staff, such as special education coordinators and district data management staff, who could spend 80 or more hours gathering this information. As an alternative, Mathematica can send its own staff to assist districts in collecting this information. Recruiters will discuss options with districts and make decisions based on discussions with supervisors and district preferences.

12.8. How do all of these items compare to the last NLTS administration?

There are two main changes in the structure of the data collection from NLTS 2 to NLTS 2012. First, the youth baseline survey for NLTS 2012 will be conducted as part of the baseline data collection in the spring immediately following sample selection; by contrast, the first youth survey in NLTS 2 was conducted one year after sample selection. We felt it was more useful to conduct the youth baseline immediately after sample selection in order to (1) achieve high baseline completion rates and (2) capture early youth experiences and expectations. As in NLTS 2, however, the second round of youth data collection for NLTS 2012 will occur two years after the youth baseline.

Second, NLTS 2 interviewed the first general education teacher with whom a special education student had class during the week. For NLTS 2012, we plan to conduct the teacher interview with either the student’s math teacher or language arts teacher because of the significantly greater interest in understanding teachers’ inclusionary instructional approaches and students’ experiences in these core academic classes; this focus also allows for linkage with student test scores. In the great majority of cases, the teacher reporting on math or language arts instruction in 2012 is expected to be a general education teacher, even for special education students. We will be able to make inter-cohort comparisons of the responses of the general education math and language arts teachers participating in both the NLTS 2 and NLTS 2012 surveys. In addition, the School Program Survey will include questions that support cross-cohort comparisons of the percentages of IEP students who receive math and language arts instruction in special education and general education classrooms. The sections below describe in more detail the population coverage, sample, data collection design, and sample retention of NLTS 2 and NLTS 2012. Table 4 summarizes the differences between the two studies on these features.

a. Population coverage. NLTS 2 included only students with IEPs who were between 13 and 16 years old and in seventh grade or above in December 2000. By contrast, NLTS 2012 youth will be 13 to 21 when they are sampled and will include youth with IEPs, youth with Section 504 plans, and youth with neither. IES decided to include the broader age range because it provides data on all ages within the group for whom transition is an important issue. Like the original NLTS, which focused on the cohort sampled in 1985, the current study will produce information about the personal and family characteristics, expectations, school experiences, and outcomes of both the younger and older groups of IEP students, including those ages 19-21, who were not sampled by NLTS 2. This provides an opportunity to secure information early in the study on the experiences and outcomes of a group for whom the transition from school is especially challenging and for whom coordination of support services with agencies outside the school is likely to be especially important.

b. Baseline data collection strategy. NLTS 2 baseline data collection occurred over a two-year period. In spring 2001 (study year 1), a parent interview was conducted by telephone, with mail follow-up of sample members who could not be interviewed by telephone. In spring 2002 (study year 2), NLTS 2 collected baseline data from school staff through 1) a principal questionnaire (completed by the student’s principal and covering school programs, policies, and resources), 2) a general education teacher questionnaire (completed by one academic teacher of the student), and 3) a school program questionnaire (completed by the special education staff member most familiar with the student’s program). In addition, a direct student assessment and an in-person interview were conducted with students ages 16-18 in spring 2002 and with younger members of the sample in spring 2004.

By contrast, all of the NLTS 2012 baseline data collection will occur in spring 2012. For all sampled youth, this baseline data collection will include parent and youth surveys, a principal survey, and a survey of the youth's math or language arts teacher (Mathematica will randomly determine whether to interview the math or the language arts teacher). For youth with IEPs, Mathematica will conduct a school program survey with special education staff familiar with the sample member's program. The reasons for this change in study design between NLTS 2 and NLTS 2012 are noted above.
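
To illustrate the random designation of the teacher respondent, here is a minimal sketch in Python; the function name, youth identifiers, and fixed seed are hypothetical illustrations, not a description of Mathematica's actual procedure:

    import random

    def assign_teacher_subject(youth_ids, seed=2012):
        """For each sampled youth, randomly designate whether the math
        or the language arts teacher will receive the teacher survey.
        A fixed seed keeps the designation reproducible."""
        rng = random.Random(seed)
        return {yid: rng.choice(["math", "language arts"]) for yid in youth_ids}

    # Example usage with hypothetical youth IDs.
    assignments = assign_teacher_subject(["Y001", "Y002", "Y003"])
    print(assignments)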

c. Follow-up data collection. The first NLTS 2 follow-up also occurred over two years. Telephone interviews with both parents and youth were conducted in spring 2003 (two years after the baseline parent survey). The general education teacher and student school program surveys were conducted in spring 2004 (two years after the baseline school staff data collection).

By contrast, the NLTS 2012 follow-up surveys with all respondents will be conducted in spring 2014, two years after baseline data collection in each case. We believe that conducting all of the follow-ups in a single year will provide a clearer picture of the experiences and challenges encountered by youth and make the data from the various follow-up surveys easier to compare and analyze together.

Like the NLTS 2 direct student assessments, the NLTS 2012 assessments will be conducted once for each sample member, when the sample member is 16 or older.5 As noted previously, the direct assessments administered in NLTS 2012 will allow for comparisons with the assessments conducted under the earlier study.

NLTS 2 conducted additional follow-up interviews with parents and youth in spring 2005, spring 2007, and spring 2009. For NLTS 2012, IES has not yet established a schedule for additional follow-up interviews.

d. Levels of attrition. NLTS 2 used a two-stage sample design, as is planned for NLTS 2012. We describe NLTS 2 attrition and expectations for NLTS 2012 at each sampling stage below.

NLTS 2 selected approximately 3,634 LEAs from the population of approximately 12,000 LEAs operating in fall 2000. In addition, the study invited 77 state schools serving students with vision and hearing impairments and multiple disabilities. A total of 501 LEAs and 38 state schools agreed to participate. Accordingly, the response rates at the first stage of sample selection were approximately 14 percent for LEAs and 50 percent for special state schools.

We anticipate that NLTS 2012 will have markedly higher response rates at various points in the study than did NLTS 2. At the district level, our planning target is that between 66 and 70 percent of districts selected will agree to be part of the study and provide sampling lists, student data, and access to district staff. The anticipated 66-70 percent response rate for NLTS 2012 contrasts with the 14 percent district response rate implicit in the NLTS 2 design but is on par with the district response rate achieved by NCES' High School Longitudinal Study (HSLS), whose staff we have been consulting. ED has set this target based on Mathematica's experience securing the consent of national samples of districts to participate in school-based studies and on the experience of other organizations conducting longitudinal studies in schools. Also critical are our plans to make the study known to district personnel and to encourage district participation, which are outlined in section 2 of Part B (pp. 7-9) of the Supporting Statement dated January 28, 2011. The strategies for this effort include 1) engaging stakeholder groups to make them aware of the study and asking them to make their constituents aware of it and encourage participation; 2) a well-organized effort, staffed by experienced recruiters, to contact selected districts, explain study requirements, and respond flexibly to concerns (including concerns about burden); and 3) well-thought-out responses to common concerns and explanations of the benefits of district participation, which will include access to resources on transition that might be of interest to district special educators as well as survey tools designed to help districts collect data on transition issues from students and parents for their own use.

For both studies, participating LEAs provide lists of students receiving special education. Available documentation from NLTS 2 indicates that approximately 12,000 students were selected (a precise number is not provided) and that sample selection yielded 11,276 students who were eligible to participate (eligible students were those with a good address and a working telephone). Unweighted response rates to the parent and/or student interviews were 82 percent for the baseline (spring 2001; 9,230 completes), 56 percent for the first parent follow-up (spring 2003; 6,322 completes), and 48 percent for the second parent follow-up (spring 2005; 5,368 completes).6

NLTS 2012 will have approximately the same student-level response rate at baseline as NLTS 2. However, we believe higher sample retention will be achieved in NLTS 2012. In contrast to NLTS 2, we plan to gather extensive contact information and to conduct extensive locating and follow-up to obtain telephone numbers of parents or youth who move. In particular, we will attempt to locate at subsequent data collection points sample members who were not interviewed at one or more previous data collection points. For planning purposes, we have assumed that sample retention at each round of data collection will be 94 percent of the sample available at the prior round. This assumption is consistent with Mathematica's experience on the Youth Transition Demonstration projects and the National Job Corps Study, as described in our response to Question 10 above. As described in Table 4, we anticipate that data would be available for just over 60 percent of the sample eight years after baseline. In contrast, NLTS 2 analyzed data for 48 percent of its sample in spring 2005, four years after baseline data collection in spring 2001.
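
As a rough check on this projection, the arithmetic can be reproduced directly; the sketch below assumes (as stated above) a baseline completion rate of about 82 percent, matching NLTS 2, and biennial follow-up rounds with 94 percent retention each:

    # Projected share of the selected sample with data at each round,
    # assuming an 82 percent baseline completion rate and 94 percent
    # retention at each biennial follow-up round.
    baseline_rate = 0.82
    retention = 0.94
    for round_num in range(5):  # round 0 = baseline; rounds 1-4 = years 2-8
        share = baseline_rate * retention ** round_num
        print(f"Year {2 * round_num}: {share:.0%}")
    # The year-8 figure works out to about 64 percent, consistent with
    # "just over 60 percent of the sample at eight years after baseline."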

Completion rates for the third and fourth parent/youth follow-up interviews for NLTS 2 are not currently available; no reports covering those rounds have been published.

TABLE 4

COMPARISON OF DATA COLLECTION DESIGN OF NLTS 2012 AND NLTS 2

Study Population

NLTS 2012: Ages 13-21 in fall 2011
  • Students with an IEP
  • Students with a Section 504 plan
  • Students with no IEP and no Section 504 plan

NLTS 2: Ages 13-16 in fall 2000
  • Students with an IEP

Baseline Data Collection

NLTS 2012: Spring 2012
  • Parent interview, telephone
  • Youth interview, telephone
  • Youth direct assessment, in-person
  • Principal survey, web/phone
  • Math or language arts teacher survey, web/phone
  • School program survey, special education teacher, web/phone

NLTS 2 Round 1: Spring 2001
  • Parent interview, telephone/mail

NLTS 2 Round 2: Spring 2002
  • Youth direct assessment and in-person interview
  • School background survey, principal, mail
  • General education teacher survey, mail
  • School program survey, special education teacher, mail

First Follow-Up Data Collection

NLTS 2012: Spring 2014
  • Parent interview, telephone
  • Youth interview, telephone
  • Youth direct assessment, in-person
  • School program survey, special education teacher, web/phone

NLTS 2 Round 3: Spring 2003
  • Parent interview, telephone/mail
  • Youth interview, telephone/mail

NLTS 2 Round 4: Spring 2004
  • School background survey, mail
  • General education teacher survey, mail
  • School program survey, mail

Future Follow-Up Data Collection

NLTS 2012: Not yet planned

NLTS 2 Rounds 5-7: Spring 2005, Spring 2007, Spring 2009
  • Parent interview, telephone
  • Youth interview, telephone


13. To what extent will the NLTS content be reused in this study?  For any new areas in particular, we’d like to suggest the addition of someone from NCES associated with the High School Longitudinal Study to the technical review panel, or at least consultations with them on parental and study questionnaire content.

The design of the baseline questionnaires for NLTS 2012 has relied heavily on the content of the NLTS 2 baseline data collection instruments, to allow for estimation of trend lines. The study team also contacted NCES staff working on the HSLS early in the design phase to discuss lessons learned in recruitment and survey items. The IES team leader for NLTS 2012 is currently a member of the HSLS Technical Work Group and so has early access to questionnaires in development and to results of pre-testing. NLTS 2012 has made use of the HSLS baseline and first follow-up instruments, both to guide the design of specific survey questions and to ensure that the NLTS 2012 questionnaires cover all appropriate constructs. For example, we plan to use or adapt questions from the HSLS 2009 baseline on the following topics: youth attitudes toward the value of studying, student academic support programs, programs to support student persistence in high school, and transition support. We will also use one of the HSLS questions about perceived barriers to postsecondary success.

Differences from NLTS 2 in Topics Covered

We intend to preserve the most important variables collected by NLTS 2 while adding items relating to important, evolving policy-relevant issues. The next clearance package will provide the instruments and more detail on each of these issues, but the new areas include:

  • Planning for Postsecondary Education or Work. NLTS 2 included relatively few questions about planning for postsecondary education and work, and most of these focused on development of the IEP transition plan, which would not be appropriate for sample members with no IEP. We plan to add questions to the parent and youth surveys about the guidance youth receive about careers and postsecondary education options, support in selecting and applying to postsecondary education, assistance in searching for jobs, challenges youth or parents face in planning for postsecondary education or work, and whether postsecondary education is affordable. In addition, we plan to add a question for parents of IEP students about challenges posed by the IEP transition planning process. These additional survey items will document the transition challenges parents and youth perceive and, through correlational analysis, will allow us to examine the apparent relationship between preparation and planning activities and outcomes, informing policy-relevant hypotheses that can be tested more rigorously in future demonstrations and evaluations of interventions.

  • Social Skills, Problem Behaviors, and Social Adjustment. The NLTS 2 teacher survey had relatively few questions relating to students' social skills and problem behaviors, yet secondary school behavior has become increasingly important in school policy and in studies that identify factors associated with school and later success. We propose to add questions on this topic. We plan to ask all of the items in selected subscales of the Social Skills Improvement System-Rating Scale (SSIS-RS), as well as selected questions on social adjustment that were asked in NLTS 2. Use of validated SSIS-RS subscales with all sample members will support description of subgroups of interest and, ultimately, exploration of the relationship between social skills and students' high school and post-transition outcomes. These analyses can inform hypotheses about the types of social skills that are most important to cultivate; again, these are hypotheses that can inform future research, demonstrations, and program development.

  • Self-Determination. Self-determination encompasses a set of characteristics and capabilities that enable youth with significant cognitive disabilities to be as independent as possible, to make decisions about their lives, and to take responsibility for those decisions. Special educators and researchers have sought ways of improving self-determination in the belief that it will promote more positive post-school outcomes. NLTS 2 included questions designed to tap the relevant characteristics and capabilities, but no measures capturing the key constructs were reported. We are considering including in the youth survey a somewhat larger set of items comprising empirically validated self-determination subscales. These measures will allow descriptive analyses of the relationship between self-determination and post-school outcomes.

  • Other Barriers and Challenges. An important goal of the study is to identify the challenges youth face in preparing for postsecondary education, work, and independent living, as well as the experiences or other factors that can facilitate positive outcomes. We will pursue this objective in two ways. First, we will examine the relationship between outcomes and participation (or lack of it) in various services or activities, to gauge which services or activities may promote specific outcomes. Second, we will ask respondents directly about the challenges youth confront in preparing for life after high school. We plan to include in the parent survey questions about the challenges and barriers parents and youth encounter in selecting and applying to postsecondary education programs, defining career goals, and making employment plans. We also plan to add questions about challenges youth and parents face in developing social and extracurricular activities and, for IEP students, challenges in planning for future living arrangements. In addition, we plan to retain questions from the NLTS 2 survey on the student's health conditions, parental expectations and resources, and the extent to which youth receive various supports, services, and accommodations. (More information about the types of barrier questions we plan to include in the surveys appears in our response to question 17 below.) These two types of analyses will complement each other and together will provide a more nuanced picture of the issues youth confront.

To make space for the questions we are adding to the survey instruments without extending the length of the surveys, we identified items in NLTS 2 that could be dropped or shortened. While some of these items have potential value, we believe they are a lower priority than the items we propose to add or retain. We currently plan to delete NLTS 2 questions on several topics, including the following:

  • Vocational Teacher's Experience of the Student. The NLTS 2 School Program Survey included detailed questions about a vocational class taken by each youth and the vocational teacher's perceptions of the youth's performance and behavior. Many of these questions are similar to those posed to academic teachers. Securing this information required input from a separate vocational teacher, increasing the burden on the school. Given the decline in vocational course taking and the difficulty of contacting another respondent, we believe the burden and costs associated with these items exceed the value of the information. Nonetheless, we plan to retain the NLTS 2 School Program Survey items that secure basic information on whether each sample member completed a vocational course and, if so, whether it was a general education or special education course.

  • Special Education Class Experience of the Student. The NLTS 2 School Program Survey also included questions about the student's behavior and performance in a special education class. The special education coordinator would often need to consult with a different teacher to secure this information, which increases burden and may make these questions less reliable. Instead, we plan to rely on the information provided by the language arts or math teachers on the student's performance and behavior; this will provide a comparable perspective for IEP, Section 504, and non-IEP students. We anticipate that most students (including IEP students) will receive math and language arts instruction in general education classes that include both IEP and non-IEP students. However, some students with an IEP will receive math and language arts instruction from special education teachers in classes with only IEP students.

  • Teacher Questions on the Basis of Grades. The NLTS 2 general education teacher survey included a set of questions about the criteria for grading students: the importance placed on specific kinds of student behaviors and performance in determining grades for the class as a whole and for the sampled youth in particular. While these items are of some interest, we believe their analytic value is low because they are imprecise and cannot be used to adjust measures of student performance.

  • Similar Parent and Youth Questions about Extracurricular Activities and Risky Behaviors. The NLTS 2 parent survey included questions about after-school and extracurricular activities and risky behaviors that were also asked of youth in the following year. Since, in contrast to NLTS 2, we will survey both parents and youth at baseline, we propose to retain the items on these topics asked of youth and to drop most of the duplicate items that NLTS 2 asked of parents. However, we are considering obtaining from parents rather than from youth information about some risky behaviors, particularly problems with alcohol and drug use and arrests. (Final decisions will be made before preparing the clearance request for follow-up data collection.)

14. The SS indicates that pretesting will be done before instruments are provided for clearance.  Under what clearance does IES plan to conduct those pretests?  Should IES request clearance in this package for those pretests?

Section 4 of Part B indicates that pretests will be conducted on the individual instruments that will be used to collect data from school staff, parents, and youth. The questionnaires for baseline and first follow-up data collection, for which clearance will be requested in subsequent submissions, will draw heavily on extensively used items. Therefore, the pretests of these instruments are expected to focus on ensuring that the question flow works well, that respondents understand the questions, and that the time required to complete each instrument is accurately estimated. Based on these considerations, each instrument will be administered to nine or fewer individuals and therefore will not require prior OMB approval.

15. The table in SS B3 indicates that the IES will reassure districts about privacy concerns by indicating that the collection is covered by FERPA.  The key point is not that it is covered by FERPA but that the department has an exemption from FERPA informed consent requirements that permits access to these data.

We will ensure that all study recruitment materials include the following language:

“The collection of personally identifiable information from students’ education records on behalf of the study is permissible under the Family Educational Rights and Privacy Act (FERPA). FERPA provides for the nonconsensual disclosure of education records to “authorized representatives” of the Secretary provided that the disclosure is in connection with “an audit or evaluation of Federal or state supported education programs” (34 CFR 99.35(a)).”

16. Please use the standard ESRA pledge language in all instances when talking about confidentiality.  This includes in the Study Summary document, letters and consent form. 

We will use the language below in the Study Summary document, letters and consent form:

“Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. We will not provide information that identifies you or your district to anyone outside the study team, except as required by law. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.”

We will use the following language, borrowed from the HSLS 2009, in any documents used with youth:

“The information we collect is used only for statistical purposes and may not be disclosed or used, in identifiable form for any other purpose except as required by law (Public Law 107-279, Section 183).”

We will eliminate the following text from the study summary: “Responses to all data collection activities will be kept confidential and be used only for research purposes. The reports prepared for the study will summarize findings across large groups of participants and will not associate responses with a specific district, school, or individual.”

We will eliminate the following text from the consent form: “Study information will be kept confidential and will only be reported in statistics without the names of people or schools.”

We will eliminate the following text from the letters: “Any student-level data provided to Mathematica by the district in support of this study will be kept strictly confidential, except as may be required by law or regulation, and will be used only for research purposes. Any identifying information on students will be replaced with randomly generated, anonymous identifiers prior to analysis. Access to individual level data will be restricted to the study team and researchers directly authorized by ED. Data related to individual students, their parents, schools, their staff member, or districts will only be publicly reported at an aggregated level and not identified in study findings. A restricted-use data file will be created for ED for potential further analysis. This file will contain individual level data, but no names or identifiers that would allow data to be attributed to a specific student, school, or district.”

17. RQ2 gets at barriers and challenges youth with disabilities encounter. Can you clarify what that question will be getting at, i.e., barriers to what or in what? Will you be looking at barriers to entering and succeeding in college and beyond?

Research question 2 in Table A.1 on page 6 of Part A of the Supporting Statement includes a question about the barriers students encounter. The analysis of barriers will focus on factors that either impede or facilitate 1) entry to and completion of postsecondary education, 2) employment, or 3) independent living. The structure of the study will not support rigorous inference about causal relationships between potential obstacles or facilitators and post-school outcomes; no credible method of establishing a counterfactual is available in the context of a longitudinal study. However, the rich base of information about personal and family characteristics; academic, social, and self-determination skills; and school experiences and high school completion status, in conjunction with measures of entry to and completion of postsecondary education and of postsecondary employment and earnings, will support an exploratory effort to identify potential barriers and facilitators. In this context, measures of academic skills, social skills, self-determination skills, and transition-related experiences in school will receive special focus because all of these factors can be influenced by educators. These analyses will include multivariate analyses that examine which factors and experiences are good predictors of positive outcomes, controlling for other youth background characteristics.
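
To make the planned multivariate approach concrete, the sketch below fits a logistic regression predicting a postsecondary outcome from skill measures while controlling for a background characteristic. All variable names and the synthetic data are hypothetical illustrations; the actual model specification will be developed during analysis planning:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic stand-in for the analysis file (one row per youth).
    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "math_score": rng.normal(0, 1, n),
        "social_skills": rng.normal(0, 1, n),
        "self_determination": rng.normal(0, 1, n),
        "family_income": rng.normal(0, 1, n),  # background control
    })
    # Outcome loosely related to the predictors, for demonstration only.
    index = 0.5 * df["math_score"] + 0.3 * df["social_skills"]
    df["enrolled_postsec"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-index))).astype(int)

    # Which factors and experiences predict a positive outcome,
    # controlling for background characteristics?
    model = smf.logit(
        "enrolled_postsec ~ math_score + social_skills"
        " + self_determination + family_income",
        data=df,
    ).fit()
    print(model.summary())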

In addition, the baseline and follow-up surveys will include questions relating to the challenges perceived by parents, youth, and their teachers with respect to the postsecondary success of youth. We believe this line of questioning is particularly important because the HSLS is addressing the issue of barriers and challenges in only a limited way. We will use this information to describe specific barriers and compare the barriers perceived by each of these three groups. This descriptive analysis may suggest additional measures to include in the multivariate analysis described above. We plan to design the surveys to allow an analysis of the following kinds of perceived barriers and challenges:

  • The challenges youth face in deciding what they might do after high school, according to parents and youth. We will ask parents and youth whether they confront specific difficulties in making post-high school plans, such as lack of awareness of career options (including the education and training needed for specific jobs); lack of information about specific colleges or other education or training institutions; lack of information on available financial aid for tuition and other costs; lack of information on specific jobs; and lack of support from school staff in developing education or career plans.

  • The challenges youth face in securing postsecondary education. Parents, teaching staff, and students will be asked whether individual youth are likely to confront specific issues in securing postsecondary education, including lack of interest or motivation to continue school, inadequate preparation for postsecondary education, or inadequate postsecondary accommodations.

  • The challenges students face in securing paid jobs after high school. Parents, teaching staff, and students will be asked whether youth are likely to face specific employment challenges. These will include the youth's lack of interest or motivation to work, limited job skills, fear of losing SSI benefits, parental expectations that the youth cannot work, and the likelihood that employers will perceive the youth as too physically or sensorially impaired.

  • The challenges parents identify for their child's participation in social, recreational, and community activities. We will ask parents whether specific factors affect their child's involvement in these activities, such as whether their child is not accepted by other youth or not welcomed by activity leaders, does not want to be the only participant with a disability or special need in the group, has difficulty arranging transportation, or has a medical or other condition that prevents participation.

  • The challenges youth perceive in planning where they will live in the future. Youth will be asked whether they are likely to have difficulty planning where they will live because of specific factors, such as fears about living independently, lack of transportation that may limit their ability to get around, lack of information on available housing options, or lack of affordable housing in the community.

18. RQ3 deals with key outcomes. How do you determine what is a key outcome? What are the particular academic, social, and economic outcomes you intend to study?

We have identified key outcomes based on both current transition research and consultation with the research team and the Technical Working Group. The table below provides our current list of key outcome measures and identifies the data source for each outcome.


KEY ACADEMIC, SOCIAL, AND ECONOMIC OUTCOMES FOR NLTS 2012

(Each outcome for youth is listed with its data source.)

Academic Skills and School Engagement
  • Math and language arts competencies: Academic assessment
  • Scores on state academic assessments: Transcript
  • Grade point average: Transcript
  • Whether math/language arts class is at, above, or below grade level: Math/language arts teacher survey
  • High school credits by subject and by level for math: Transcript
  • How often youth completes homework on time, takes part in group discussion, stays focused on class work, and works to best of ability: Math/language arts teacher survey
  • Typical homework hours per week: Youth
  • Repeating current grade level: Parent
  • Ever expelled or suspended: Parent
  • Out-of-school suspension: Transcript, school program survey
  • Rate of absences: Transcript

Social Skills, Self-Determination Skills, and Problem Behaviors
  • Social Skills Improvement System subscales (communication, engagement, responsibility, externalizing): Youth
  • Self-determination scales (autonomous functioning, psychological empowerment, self-realization): Youth
  • Takes part in social activities (school, out-of-school group, volunteer): Youth
  • Days per week gets together with friends: Youth
  • Means of communicating with friends (phone, text, IM, email, social media): Youth
  • Has or expects to get driver's license: Youth, Parent
  • Drinking and drug use (times in last month); arrests in last 2 years; jail or detention in last 2 years: Parent

High School Completion
  • Whether obtained diploma and type (regular diploma, GED, certificate of completion): Youth, Parent

Postsecondary Education Enrollment
  • Enrollment by type of program (2-year, 4-year, vocational certificate) and degree completion by type of credential: Youth, Parent

Employment
  • Unpaid employment: Youth
  • Paid employment: Youth
  • Wages: Youth
  • Hours: Youth
  • Type of job: Youth
  • Fired from a job in last 2 years: Youth

Independence
  • Living arrangement (independent living, with family members, supervised setting): Youth
  • Has health insurance: Youth
  • Has an allowance or money to spend; has checking/savings accounts; has credit/debit card in own name; gets bills in own name: Youth
  • Registered to vote: Youth


1Williams, R.L. “A Note on Robust Variance Estimation for Cluster-Correlated Data.” Biometrics, vol. 56, June 2000, pp. 645–646.

2Folsom, Ralph E., Francis J. Potter, and Steven R. Williams. “Notes on a Composite Size Measure for Self-weighting Samples in Multiple Domains.” In Proceedings of the American Statistical Association, Section on Survey Research Methods. Alexandria, VA: American Statistical Association, 1987, pp. 792–796.

3During the initial district recruiting phase, the study team will be able to determine more clearly the number of such districts.

4For sample members who did not provide digitally recorded verbal consent at baseline, we will follow baseline protocols of interviewing the parents first to secure consent.

5Students aged 16 to 18 in spring 2002 were assessed at that point, and those younger than 16 at sample selection were assessed at ages 16 or 17 in spring 2004.

6See Lynn Newman et al., April 2009 (NCSER 2009-3017), Table A-8, for interview completion data; Appendix A of that report provides a description of the LEA and student sampling process.
