Supporting Statement Part B

National Longitudinal Transition Study-2 (NLTS2)

OMB: 1850-0815


Supporting Statement for Paperwork Reduction Act Submission

Part B: Collections of Information Employing Statistical Methods

1, 2. Population, Sampling Plan and Procedures

The National Longitudinal Transition Study-2 (NLTS2) must meet the information needs of a wide variety of audiences using a variety of data collection and analytic approaches. The NLTS2 sample must meet the following requirements in order to serve its multiple purposes:

  • Focus on students—NLTS2 data must enable accurate estimates about the characteristics, programs, and achievements of secondary school students receiving special education as they transition to young adulthood. Because no universe list of all students receiving special education existed from which to draw the NLTS2 sample, a sample of LEAs was drawn, from whose rosters students were selected. However, the LEA sample was only a vehicle for obtaining a sample of students; it is too small to make highly precise national estimates about LEA practices.

  • Generalize to each disability category and age cohort—Not only must the NLTS2 sample enable reasonably precise estimates for the full special education student population ages 13 through 16 at the outset, it must also generalize to each special education disability category and to each of the single-year age cohorts within the age range. This requirement has important implications for the size of the student sample, which must include enough students in each disability category. A sample with sufficient numbers of students per category also will be large enough to generalize to the four single-year age cohorts within the sample.

  • Longitudinal—NLTS2 data are collected repeatedly over a 9-year period to obtain information on postschool achievements, including postsecondary education degree attainment. The initial sample must be large enough to support estimates of reasonable precision in the ninth year of data collection (assuming that 8% of youth who are in the sample each year will be lost the following year because of mobility).1

  • Multiple data sources—Multiple data sources are needed to obtain the breadth of information specified in the NLTS2 conceptual framework. Many analyses employ information from more than one source. Given reasonable assumptions about response rates to the various data collection efforts, some youth will not have information from a source, reducing the sample for analyses using that data source. Even more will be missing information when several sources are combined. The sample must be large enough to accommodate missing information from multiple data sources.

  • Multiple analytic purposes—The richness of the NLTS2 database supports a variety of analyses, with implications for the sample. For example, subgroup analyses will examine experiences and outcomes of students receiving special education who are differentiated by particular characteristics (other than age and disability category, as mentioned above), such as gender, ethnicity, or functional abilities. The NLTS2 sample must be large enough to support these kinds of subgroup analyses.

  • Comparable to NLTS—The sample must permit comparisons with the original NLTS in order to determine changes in the experiences and achievements of students in transition over the past decade or more.

The NLTS2 design process considered in detail the options for meeting these sample requirements within the funding constraints OSEP projected, and selected the approach that best balanced the sample requirements and resource constraints. That sampling approach was described in detail in the initial OMB clearance request, including numerous tabular presentations of the size of the LEA and student universe and sample.

To summarize, NLTS2 used a two-stage process to generate a nationally representative sample of students receiving special education who were ages 13 to 16 and in at least 7th grade (or in an ungraded program with similar-age students) on December 1, 2000. A power analysis indicated that a total of 497 local education agencies (LEAs), stratified by region, district size (student enrollment), and community wealth (Orshansky percentile), was the appropriate sample for NLTS2. To generate that sample, 3,634 LEAs were selected from the universe of those serving students with disabilities in the NLTS2 grade range and invited to participate. A total of 501 LEAs (13.8% of the number invited) provided rosters from which to select students for the second-stage sample, meeting both the requirements of LEA sample size and distribution across the sampling grid.

Multiple analysis approaches were used to confirm that the NLTS2 LEA sample is an unbiased representation of the universe from which it was selected. Analyses comparing the universe of LEAs and the LEA sample, both weighted and unweighted, on variables used in stratification revealed that the weighted LEA sample closely resembled the LEA universe with respect to those variables. To further confirm the representativeness of the NLTS2 LEA sample, OMB directed the Office of Special Education Programs to complete a nonresponse bias study; it was conducted in two stages. The first stage involved analyses of extant databases to determine whether variations in LEA characteristics contribute meaningfully to explaining variations in student-level experiences and outcomes. The second stage involved selecting a nationally representative sample of LEAs and conducting a telephone survey of those LEAs and of LEAs participating in NLTS2 to compare various aspects of their special education policies and procedures. The results of both stages found no evidence to suggest any bias in the NLTS2 LEA sample.

NLTS2 drew a stratified random sample of students receiving special education in this nationally representative sample of LEAs and a sample of state-supported special schools.2 The student sample was stratified by disability category to ensure that the sample was nationally representative of each disability group. Thus, the LEA is the primary sampling unit, and the student with a disability is the secondary or final unit.

A total sample of 24,517 students was selected from 501 participating LEAs and 38 state-supported special schools. This number intentionally included approximately 200% of the number of students actually needed for the sample because, at the time students were selected from each participating LEA, it was not known how many LEAs ultimately would agree to participate. We also estimated that 10% of students would not have accurate location information and could not be contacted. Thus, each LEA was oversampled to ensure that enough students would be available in each disability category, even if the final number of participating LEAs was small.
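To illustrate the mechanics of this second-stage selection, the sketch below shows stratified sampling with oversampling within each disability category. The category names, roster sizes, and target counts are invented for illustration; they are not the actual NLTS2 sampling parameters.

    import random

    # Illustrative second-stage selection: sample students within each
    # disability category (stratum), drawing roughly 200% of the analytic
    # target to allow for LEA nonparticipation and bad contact information.
    # All category names, roster sizes, and targets below are invented.

    def select_students(rosters, targets, oversample=2.0):
        """rosters: {category: student ids}; targets: {category: analytic n}."""
        sample = {}
        for category, students in rosters.items():
            # Oversample, capped at the number of students on the roster.
            n = min(len(students), round(targets[category] * oversample))
            sample[category] = random.sample(students, n)
        return sample

    rosters = {
        "learning disability": [f"LD-{i}" for i in range(500)],
        "autism": [f"AU-{i}" for i in range(40)],
    }
    targets = {"learning disability": 100, "autism": 25}

    for category, chosen in select_students(rosters, targets).items():
        print(f"{category}: {len(chosen)} students selected")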

Of these 24,517 students, 24,361 (99.36%) had some location information (an address or a telephone number) provided by the LEA or special school from which they were sampled, which could be used to obtain a parent telephone interview or mailed family survey. However, in the process of sending notification letters to sample members and conducting computerized checks of their location information, the location information for 2,433 students was found to be incorrect, and updated information could not be obtained. Subtracting these students (along with those for whom no location information was obtained at the outset) left 21,928 students with accurate contact information, a “location rate” of 90.01% of the 24,361 students for whom location information was initially provided.
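The accounting behind these counts can be reproduced directly from the figures above; the following sketch restates that arithmetic using only numbers reported in the text.

    # Sample accounting for the NLTS2 student sample, using only the counts
    # reported in the text above.

    initial_sample = 24_517       # students selected from 501 LEAs and 38 special schools
    with_location_info = 24_361   # had an address or telephone number from the LEA/school
    bad_location_info = 2_433     # location information later found to be incorrect

    located = with_location_info - bad_location_info   # 21,928 students
    print(f"Share with any location information: {with_location_info / initial_sample:.2%}")
    print(f"Students with accurate contact information: {located:,}")
    print(f"Location rate: {located / with_location_info:.2%}")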

As mentioned, the total number of sampled students was considerably greater than the number deemed needed to meet the objectives of the study. A total of 11,270 students ultimately were selected for the NLTS2 “attempted sample”—those for whom data collection actually was attempted (other sample members were retained in a “reserve sample”).

In evaluating the quality of a survey sample, there are two primary considerations: statistical precision and the potential for bias. The survey response rate is pertinent to both, in that an unexpectedly low response rate can leave a study with insufficient statistical precision, and it can (although does not necessarily) produce a biased sample—i.e., one that does not accurately represent the universe from which the sample was selected. Below we present the number of respondents for the fourth wave of the National Longitudinal Transition Study-2 (NLTS2) and a response rate calculated using the maximum eligible sample. We then discuss the implications for statistical precision and for the potential for sample bias.

Wave 4 NLTS2 Instrument Response Rate

The Terms of Clearance for the previous OMB Notice of Action for NLTS2 (9/22/2005) stated that, “ED will continue to carefully monitor response rates and report to OMB on any further analysis of attrition and response bias.” Accordingly, non-response bias reports have been submitted to OMB after each wave of data collection. The most recent such report was submitted in May 2008. Information from that report is presented below.

Table 1 specifies the number of respondents for the Wave 4 Parent/Youth interview/survey and the associated response rates, calculated using the maximum appropriate eligible population within responding LEAs, as indicated in the table notes. Response rates for Waves 1, 2, and 3 of NLTS2 are also provided for comparison purposes. In particular, these calculations count youth as eligible even when they could not possibly be reached for an interview or survey because no location information was available. Note that the sample obtained for each instrument will be weighted so that it accurately represents the universe of students, defined by age and disability category, from which the NLTS2 sample was selected, regardless of response rate.

Table 1. Response Rates for the Parent/Youth NLTS2 Instruments

                                                     Eligible     Number with completed   Response
                                                     students     instrument              rate
Wave 1 Parent interviews/mail survey                 11,244 a     9,230                   82.09%
Wave 2 Parent interview/youth interview/survey       11,226 b     6,859                   61.10%
Wave 3 Parent interview/youth interview/survey       11,225 c     5,657                   50.40%
Wave 4 Parent/youth interview/survey                 11,128 d     5,570                   50.06%

a 26 deceased youth were eliminated from the pool of eligible sample members, reducing that pool from 11,270 originally selected members to 11,244.

b 44 deceased youth were eliminated from the pool of eligible sample members in Wave 2, reducing that pool from 11,270 originally selected members to 11,226.

c 45 deceased youth were eliminated from the pool of eligible sample members in Wave 3, reducing that pool from 11,270 originally selected members to 11,225.

d 142 deceased youth were eliminated from the pool of eligible sample members in Wave 4, reducing that pool from 11,270 originally selected members to 11,128.
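Each response rate in Table 1 is the ratio of completed instruments to eligible sample members. The sketch below recomputes the rates from the counts in the table; the results match the reported figures to within 0.01 percentage point.

    # Response rates from Table 1: completed instruments / eligible students.
    waves = {
        "Wave 1": (9_230, 11_244),
        "Wave 2": (6_859, 11_226),
        "Wave 3": (5_657, 11_225),
        "Wave 4": (5_570, 11_128),
    }
    for wave, (completed, eligible) in waves.items():
        print(f"{wave}: {completed:,} / {eligible:,} = {completed / eligible:.2%}")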

Implications for Statistical Precision

The NLTS2 sampling plan (available at http://www.nlts2.org/pdfs/final_sampling_plan.pdf) estimated the needed student sample using the following assumptions:

  • Estimates in Year 9 (the fifth and final wave of data collection for the parent/youth interviews) should have standard errors of no more than 3.6% for the largest categories of disability (learning disabilities, speech impairments, emotional disturbances, mental retardation, hearing impairments, and other health impairments). Standard errors for other categories are expected to range from 3.8% (visual impairments) up to 8.2% and 10.1% for the very small categories of traumatic brain injuries and deaf-blindness, respectively.

  • Ten percent of the initial sample would not have good contact information and, thus, would have no data from any instrument.

  • Attrition would be 8% per year (i.e., sample members lost due to out-of-date contact information) of those with initial contact information.

  • The parent/youth interview response rate would be 70% of the available sample (i.e., sample remaining after attrition) in a given wave.3

With a starting sample of 11,270, these assumptions produce the available sample indicated in column A of Table 2 for each year of the study and the number of completed parent/youth interviews indicated in column B; 3,643 parent/youth interviews would be needed in Year 9 (Wave 5) to achieve the desired precision levels. Column C indicates the actual number of parent interviews completed in Waves 1 through 4. The sketch below reproduces this projection.
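The expected figures in columns A and B of Table 2 follow mechanically from these assumptions. Because the sampling plan's intermediate rounding rules are not documented, the computed values may differ from Table 2 by one unit in the last place.

    # Project the expected "live" sample and completed interviews per study
    # year from the sampling-plan assumptions stated above.

    starting_sample = 11_270
    live = round(starting_sample * 0.90)   # 10% assumed to lack good contact info
    interview_waves = {1: "Wave 1", 3: "Wave 2", 5: "Wave 3", 7: "Wave 4", 9: "Wave 5"}

    for year in range(1, 10):
        label = f" ({interview_waves[year]})" if year in interview_waves else ""
        line = f"Year {year}{label}: expected live sample {live:,}"
        if year in interview_waves:
            line += f", expected completes {round(live * 0.70):,}"  # 70% response
        print(line)
        live = round(live * 0.92)          # 8% annual attrition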

Table 2. Expected and actual number of parent and youth interview/surveys to date

                       (A)                (B)                         (C)
                       Expected "live"    Expected number of          Actual number of
Study year/wave        sample             completed parent and        completed parent and
                                          youth interview/surveys     youth interview/surveys
1 (Wave 1)             10,143             7,100                       9,230
2                       9,332
3 (Wave 2)              8,585             6,010                       6,859
4                       7,898
5 (Wave 3)              7,266             5,086                       5,657
6                       6,685
7 (Wave 4)              6,150             4,305                       5,570
8                       5,658
9 (Wave 5)              5,205             3,643

The number of Wave 1 parent interviews exceeded the expected number by 30%; the actual numbers of completes exceeded expectations by 14% in Wave 2, by 11.2% in Wave 3, and by 29.4% in Wave 4. Thus, the study enters Wave 5 with more sample members with completed interviews than was expected to be needed to reach the desired precision level in Wave 5. Because a completed interview increases the chances of completing a subsequent interview (it yields current location information and third-party contacts through whom a youth can be traced), there is a high likelihood that subsequent waves of interviewing will continue to yield more than the expected number of completed interviews. This likelihood is further increased by the incentive plan approved by OMB (which was not available in Waves 1 and 2), which permits payments of $20 for each completed parent and youth interview. These factors should help achieve or exceed the response rates required to reach the required number of completed interviews in Wave 5, suggesting that the statistical precision requirements of the study will be met.

Implications for Potential Bias

Although, as noted above, response rate and response bias are conceptually independent (i.e., it is possible to generate an unbiased, representative sample even with a relatively low response rate), the risk of bias increases as the response rate decreases. To reduce the likelihood of bias, the NLTS2 sample for each instrument in each wave is weighted to represent the distribution of students with disabilities in the universe on the key factors of disability category, age, and race/ethnicity, as reported by states to OSEP for their entire special education population. No other items in the limited dataset on the universe of students receiving special education are common to NLTS2, so there are no additional factors from a known universe that could be compared to test for bias or to develop or adjust weights.

Other than the variables in the OSEP report to the states, the closest approximation to the universe that can be used to assess potential bias in the Wave 4 Parent/Youth interview/survey are the responses to the NLTS2 Wave 1 parent interview. We previously used Wave 1 Parent data to examine Wave 3 Parent/Youth responses for possible biases. On the basis of those results we modified our weighting algorithm to include consideration of household income, parental volunteering, and being held back a grade. That is, our weighting algorithm for Wave 4 attempts to replicate the national distribution of disability category, age, and race/ethnicity and the distribution of household income, parental volunteering and being held back a grade that we found in our Wave 1 data.

The objective of this analysis is to compare our Wave 4 respondents with our Wave 4 eligibles, on both a weighted and an unweighted basis, with respect to their Wave 1 responses. The preliminary step in performing this analysis is to identify key variables from the NLTS2 Wave 1 parent interview that reflect or help shape students’ school experiences and outcomes. Those variables include disability category; age; gender; household income; race/ethnicity; school type; school experiences; parental involvement, satisfaction, and expectations; students’ schoolwork quality; family support score; and social skills score.

The second step is to categorize the NLTS2 participant population according to whether or not a student (1) had a Wave 1 parent interview, (2) was eligible for the Wave 4 Parent/Youth interview/survey, and (3) was a respondent to that survey. Ineligibility is narrowly defined as being deceased. Table 3 shows the six mutually exclusive categories into which a student could be classified (excluding 26 students who were ineligible for the Wave 1 interview). Cells are labeled G1 (for Group 1) through G6, and the table shows the number of students in each cell.

Each student in Table 3 represents a set of students in the universe; if both instruments had been administered to every student in the universe, the universe also would be divided into the categories in Table 3. The original weights for the Wave 4 Parent/Youth Survey projected all students in groups G3 and G4 to represent all students in the universe in groups G3 through G6.

Table 3. Distribution of students by response status on the Wave 1 parent interview/survey and the Wave 4 parent and youth interview/survey a

                                                  Respondents to Wave 1   Nonrespondents to Wave 1
                                                  parent interview        parent interview
Ineligible for Wave 4 Parent and Youth
  Interview/Survey but alive at Wave 1            G1 = 142                G2 = 0
Respondents to Wave 4 Parent and Youth
  Interview/Survey                                G3 = 5,359              G4 = 211
Nonrespondents to Wave 4 Parent and Youth
  Interview/Survey                                G5 = 3,740              G6 = 1,820

a Excludes students who were ineligible for Wave 1.


For purposes of this nonresponse weighting analysis, two alternative weights were developed for participants in the Wave 4 Parent/Youth interview/survey. One set of weights (denoted the G3 weights) projects students in G3 to the portion of the universe represented by G3 through G6. The second set (denoted the G35 weights) projects the students in G3 and G5 to the portion of the universe represented by G3 through G6.

Responses to the key questions from the Wave 1 Parent survey have been tabulated in four ways: using (1) the G3 group without weights, (2) the combination of the G3 and G5 group without weights, (3) the G3 group and the G3 weights, and (4) the combination of the G3 and G5 group and the G35 weights. The comparison of tabulations 1 and 2 (i.e., unweighted comparisons) can be used to assess the extent to which there is nonresponse bias before any weighting adjustments are made. The comparison of tabulations 3 and 4 (i.e., weighted comparisons) can be used to assess the extent to which there is nonresponse bias after weighting adjustments are made. For example, if Hispanic parents are disproportionately nonrespondents to the Wave 1 Parent Survey, this would be reflected in differences between tabulations 1 and 2, but not in differences between tabulations 3 and 4 because race/ethnicity is one of the variables considered in the weighting process.

The amount of bias caused by nonresponse in G5 can be estimated using the formula:

Bias = M_G35 - M_G3

where M_G35 is the mean value of the key variable computed using the G35 weights and M_G3 is the mean value of the key variable computed using the G3 weights.
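Concretely, the bias estimate is the difference between two weighted means of the same Wave 1 variable, computed over different respondent groups with their respective weights. The sketch below illustrates the computation; all responses and weights shown are invented for illustration only.

    # Illustrative computation of Bias = M_G35 - M_G3 for one key variable.
    # All responses and weights below are invented for illustration only.

    def weighted_mean(values, weights):
        """Weighted mean of one Wave 1 variable under a given set of weights."""
        return sum(v * w for v, w in zip(values, weights)) / sum(weights)

    # Hypothetical 0/1 responses (e.g., 1 = suspended or expelled by Wave 1).
    # G3: Wave 4 respondents who had a Wave 1 interview, with their G3 weights.
    g3_values = [1, 0, 0, 1, 0]
    g3_weights = [1.2, 0.8, 1.1, 0.9, 1.0]

    # G3 and G5 combined (all Wave 1 respondents eligible for Wave 4), with
    # their G35 weights.
    g35_values = [1, 0, 0, 1, 0, 0, 1, 0]
    g35_weights = [0.9, 0.7, 0.8, 0.7, 0.8, 1.0, 1.1, 0.9]

    bias = weighted_mean(g35_values, g35_weights) - weighted_mean(g3_values, g3_weights)
    print(f"Estimated nonresponse bias: {bias:+.4f}")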

When we examined the weighted results using our Wave 3 weighting algorithm, we found two remaining biases after weighting (differences of 3% or more in weighted percentages between all eligibles for Wave 4 and all respondents to Wave 4). Comparing Wave 4 respondents and eligibles on a weighted basis, we found that in Wave 4 we had: (a) fewer students who had been suspended or expelled by Wave 1 (a difference of 3.0%) and (b) more students whose parents strongly agreed at Wave 1 that their child was getting the supports from the school that he/she needed (a difference of 3.3%).

To eliminate these two residual differences, we adjusted the weights to equalize the distribution of parental opinions concerning whether their child was getting the supports from the school that he/she needed at Wave 1. We considered "did not provide an answer" to be a legitimate category, in the sense that if these questions had been asked of all students with disabilities in the universe, a certain percentage would not have provided an answer. We then recalculated weights using Deming’s algorithm (iterative proportional fitting) so that the marginal totals for the weighted data approximated the estimated totals for the universe; a sketch of this procedure follows. Table 4 shows both the unweighted results and the results using the extended weights.
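As referenced above, a minimal sketch of this reweighting step follows. Deming’s algorithm (iterative proportional fitting, or raking) repeatedly rescales the weights so that the weighted marginal totals match the target totals for each adjustment variable in turn. The two adjustment variables and all values below are invented for illustration; the actual NLTS2 weighting uses the variables named earlier.

    # Minimal raking sketch (iterative proportional fitting, i.e., Deming's
    # algorithm). Weights are rescaled until the weighted marginal totals
    # match target totals for each adjustment variable. All data are invented.

    records = [
        {"income": "low",  "suspended": "yes", "w": 1.0},
        {"income": "low",  "suspended": "no",  "w": 1.0},
        {"income": "high", "suspended": "yes", "w": 1.0},
        {"income": "high", "suspended": "no",  "w": 1.0},
        {"income": "high", "suspended": "no",  "w": 1.0},
    ]

    # Target (universe) totals for each margin; margins share a grand total.
    targets = {
        "income":    {"low": 2.4, "high": 2.6},
        "suspended": {"yes": 1.6, "no": 3.4},
    }

    def margin(var, cat):
        """Current weighted total for one category of one variable."""
        return sum(r["w"] for r in records if r[var] == cat)

    for _ in range(50):  # a fixed number of sweeps suffices for this tiny example
        for var, cats in targets.items():
            for cat, target in cats.items():
                current = margin(var, cat)
                if current > 0:
                    ratio = target / current
                    for r in records:
                        if r[var] == cat:
                            r["w"] *= ratio

    for var, cats in targets.items():
        for cat, target in cats.items():
            print(f"{var}={cat}: weighted total {margin(var, cat):.3f} (target {target})")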


Table 4. Comparison of Wave 4 parent and youth interview/survey eligible population and Wave 4 parent and youth interview/survey respondents on their Wave 1 parent interview/survey responses

                                                         Unweighted              Weighted*
                                                     Wave 4      Wave 4      Wave 4      Wave 4
                                                     respondents eligible    respondents eligible
Disability category
  Learning disability                                  8.3         9.6        62.1        62.1
  Speech/language impairment                           8.8         9.5         4.0         4.0
  Mental retardation                                   8.8         9.4        12.2        12.2
  Emotional disturbance                                7.3         9.1        11.3        11.3
  Hearing impairment                                   9.3         9.4         1.3         1.3
  Visual impairment                                    8.0         7.4         0.5         0.5
  Orthopedic impairment                               10.6         9.8         1.2         1.2
  Other health impairment                             10.4         9.9         4.6         4.6
  Autism                                              12.2        10.1         0.7         0.7
  Traumatic brain injury                               4.1         4.1         0.3         0.3
  Multiple disabilities                               10.2         9.9         1.8         1.8
  Deaf-blindness                                       2.1         1.8         0.2         0.2
Gender = male                                         64.8        64.8        65.3        66.9
Age on 7/15/2001
  13 or 14                                            33.8        34.2        31.6        30.5
  15                                                  25.5        24.8        21.7        23.9
  16                                                  25.0        25.1        27.4        26.5
  17                                                  15.7        15.9        19.3        19.0
Household income
  $25,000 or less                                     27.8        31.6        33.2        33.3
  $25,001 to $50,000                                  27.2        27.3        27.3        26.8
  More than $50,000                                   37.5        31.3        30.3        30.3
  No value provided by survey respondent               7.6         9.8         9.3         9.6
Race/ethnicity
  White                                               66.4        62.5        62.1        61.8
  African-American                                    18.2        20.7        20.5        21.0
  Hispanic                                            12.5        13.5        14.4        14.3
School type
  Attends regular school for general population       81.6        81.3        91.6        91.6
  Attends neighborhood school                         61.0        61.8        71.4        71.7
School experiences
  Has ever been held back a grade                     32.1        32.8        35.8        36.2
  Has ever been suspended or expelled                 23.8        27.2        30.3        32.7
  Parent has been through mediation over
    special education services                        13.0        12.7         8.5        10.6
  In Wave 1 school year, parent belongs to a group
    for parents of students with disabilities         19.1        16.5         9.3         9.1
Parent’s agreement that student is getting supports from school he/she needs
  Strongly agree                                      30.1        29.5        27.1        27.1
  Disagree/strongly disagree                          19.5        19.9        21.0        20.9
Parent’s satisfaction with child’s school
  Very satisfied                                      45.4        43.6        39.0        37.0
  Somewhat/very dissatisfied                          18.5        19.4        19.7        20.5
In Wave 1 school year, parent:
  Attended general school meeting                     78.3        76.6        77.3        77.0
  Volunteered at school                               28.5        25.6        24.1        23.8
  Went to IEP meeting                                 91.8        90.6        86.4        87.8
  Parent wanted to be more involved in
    decisionmaking at IEP meeting                     30.5        32.9        33.8        33.7
Expectations of student’s postsecondary education
  Definitely will                                     25.9        25.5        24.1        25.5
  Probably/definitely won’t                           43.1        41.9        36.6        37.5
Youth’s overall academic achievement
  Excellent or above average                          40.0        37.9        28.1        28.0
  Average                                             38.0        38.3        42.4        43.2
  Below average or failing                            22.0        23.8        29.6        28.8
Family support score [scale of 2-8]
  2-5 (low)                                           26.0        27.2        28.6        30.9
  6                                                   27.8        27.0        32.9        30.2
  7                                                   21.5        21.3        20.0        19.0
  8 (high)                                            24.7        24.5        18.6        19.9
Social skills score [scale of 0-22]
  0-10 (low)                                          22.0        21.6        17.0        17.4
  11-16 (medium)                                      54.8        55.4        61.3        59.9
  17-22 (high)                                        23.2        23.0        21.7        22.7

Note: In the original formatted table, shaded comparisons in bold indicate differences of 3.0 or more percentage points.

* Weights include adjustment for level of parental agreement to the statement that their child is receiving the services and support from the school that he or she needs.



As seen in Table 4, prior to weighting there were modest differences between Wave 4 respondents and the full Wave 4 eligible population on their Wave 1 responses. Compared with all Wave 4 eligibles, Wave 4 respondents were: (1) less often from low-income ($25,000 or less) households (27.8% vs. 31.6%), (2) more often from high-income (more than $50,000) households (37.5% vs. 31.3%), (3) more often White (66.4% vs. 62.5%), and (4) less often from families of students who had been suspended or expelled by Wave 1 (23.8% vs. 27.2%).

Weighting on disability category, age, race/ethnicity, household income, parental volunteering, being held back a grade, and adequacy of school services reduced all differences to 2.7 percentage points or less.

Overall, the bias analysis is encouraging. Prior to weighting, the differences between Wave 4 respondents and the eligible population, although large enough to require attention, were not so large as to invalidate survey results. Weighting on characteristics known for the universe and on critical characteristics of the Wave 1 respondents reduces all differences to less than 3 percentage points.

Cumulative Response Rate

As described above, a power analysis indicated that a total of 497 LEAs, stratified by region, district size (student enrollment), and community wealth (Orshansky percentile), was the appropriate first-stage sample for NLTS2. A total of 3,634 LEAs were selected from the universe of those serving students with disabilities in the NLTS2 grade range and invited to participate, and 501 LEAs (13.8% of the number invited) provided rosters from which to select students for the second-stage sample. Using 13.8% as the first-stage response rate, the cumulative response rate for the Wave 4 Parent/Youth instrument is 13.8% x 50.0% = 6.9%.

As was discussed earlier, it is possible to generate an unbiased, representative sample even with a relatively low response rate. Analyses comparing the universe of LEAs and the LEA sample, both weighted and unweighted, on variables used in stratification revealed that the weighted LEA sample closely resembled the LEA universe with respect to those variables. To further confirm the representativeness of the NLTS2 LEA sample, OMB directed the Office of Special Education Programs to complete a nonresponse bias study; it was conducted in two stages. The first stage involved analyses of extant databases to determine whether variations in LEA characteristics contribute meaningfully to explaining variations in student-level experiences and outcomes. The second stage involved selecting a nationally representative sample of LEAs and conducting a telephone survey of those LEAs and of LEAs participating in NLTS2 to compare various aspects of their special education policies and procedures. The results of both stages support the conclusion that bias in the NLTS2 LEA sample is not a significant issue. It appears to be a nationally representative sample of LEAs from which a nationally representative sample of students was selected, meeting the goals and technical requirements of the NLTS2 sampling plan.

3. Maximizing Response Rates

There are two key aspects to maximizing the number of sample members for whom data are collected: minimizing the number of sample members lost through attrition, and completing data collection with the maximum number of sample members who are retained in the sample.

To minimize sample attrition over the years of data collection, SRI is employing aggressive tracking mechanisms to maintain accurate and up-to-date contact information for sample members. To aid in this task, the parent and youth interviews ask for information that will facilitate tracking of parents/guardians and youth, including additional work and home telephone numbers for the respondents, location information for one or more friends or relatives who would know where the family/youth had moved, and e-mail addresses.

Newsletters and other study-update information are mailed to all NLTS2 sample members every 6 months; these mailings include a postcard that respondents can use to notify the study of address changes. In addition, mailings are marked “address service requested,” which signals the Postal Service to provide the forwarding address of anyone who has moved.

Maximizing the number of sample members for whom data are collected is being achieved in several ways. Procedures identified below are being employed by SRI to help maximize response rates. Regarding the Parent and Youth Interviews, which are administered through computer-assisted telephone interviewing (CATI), the following procedures are being employed to maximize the completion rate for interviews:

  • Conduct electronic database searches to verify and update respondent contact information.

  • When attempting to locate respondents who have moved, contact the third parties named by parents and youth as persons who will know their whereabouts.

  • Provide a $20 “thank you” incentive for completed interviews to all parent respondents to Part 1, as well as youth respondents to Part 2. Notification of the incentive is included in a lead letter mailed to potential respondents prior to the interview period. Personally address lead letters to respondents. (See Letters to Respondents included in this ICR submission.)

  • Provide a toll-free number for respondents to call to verify the study’s legitimacy or to ask other questions about the study. Those without phones in their homes also can call this number from any location and have the interview conducted at that time.

  • Require at least 25 unsuccessful call attempts to a number, spaced across days and times, before considering whether to treat the case as “unable to contact.”

  • Draw a core of interviewers with experience working on telephone surveys of households, particularly interviewers who have proven their ability to obtain cooperation from a high proportion of sample members.

  • Require all interviewers to successfully complete training specific to this study, including discussions of how to avoid inviting a refusal, approaches that will help in addressing questions respondents are likely to ask, how to counter objections, and how to convert refusals.

  • Use call scheduling procedures that are designed to call numbers at different times of the day and week, to improve the chances of finding a respondent at home.

  • Make every reasonable effort to obtain an interview at the initial contact, but allow respondents flexibility in scheduling appointments to be interviewed.

  • Closely supervise interviewers during data collection.

  • Implement refusal conversion efforts for first-time refusals and use interviewers who are skilled at refusal conversion.

  • Conduct silent monitoring of interviews to identify and promptly correct behaviors that could be inviting refusals or otherwise contributing to low cooperation rates.

  • Leave a message on answering machines when such machines have been repeatedly encountered in order to let the respondent know the call is not a marketing effort but a research study.

For mailed instruments, SRI conducts follow-up mailings and reminder telephone calls at reasonable intervals after sending the initial instruments to encourage respondents to complete and return forms. Postage-paid, preaddressed envelopes are included with all mailings to facilitate return of completed forms. Providing a $20 “thank you” incentive for completed parent and young adult mail questionnaires and creating individualized versions of the young adult mail survey, so that each young adult receives only the questionnaire sections appropriate to his/her circumstances, also are expected to contribute to improved response rates.

4. Testing of Instrumentation

The pilot testing of instruments conducted in the NLTS2 design phase, the debriefing interviews conducted with parents and youth who had recently completed the Wave 3 interview, and the debriefing interviews with parents and youth who had recently completed the Wave 4 interview are described in Item 8, Section A, Justification Statement.

5. Individuals Consulted on Statistical Issues

Persons involved in statistical aspects of the design and analysis include staff of the government’s design and study implementation contractor, SRI International. These staff and their telephone numbers are listed below.


Dr. Harold Javitz, Senior Statistician

Center for Health Sciences

650 859-5274


Dr. Mary Wagner, Principal Investigator

Center for Education and Human Services

650 859-2867


Dr. Lynn Newman, Co-Director

Center for Education and Human Services

650 859-3703


Dr. Renee Cameto, Co-Director

Center for Education and Human Services

650 859-6451


Dr. Jose Blackorby, Senior Education Researcher

Center for Education and Human Services

650 859-4210


In addition, all aspects of the design, sampling plan, and instrumentation were reviewed by the NLTS2 TWG, listed in Table 3 of Section A, Justification.


1 The assumption of 8% attrition reflects experience with the National Longitudinal Transition Study, in which aggressive tracking efforts kept sample attrition to about 6% per year. Changing demographics and the younger age of this sample relative to the NLTS suggest that a higher attrition rate may be experienced in NLTS2.

2 The first-stage sample of LEAs has been selected as part of the design process and is complete.

3 This rate assumes either the parent or youth interview is completed.


