Volume I
Fast Response Survey System (FRSS) 108: Career and Technical Education (CTE) Programs in Public School Districts
OMB# 1850-0733 v.32
October 2016
National Center for Education Statistics
U.S. Department of Education
Justification
The National Center for Education Statistics (NCES), within the U.S. Department of Education (ED), requests OMB approval under the NCES system clearance for the Quick Response Information System (QRIS) (OMB# 1850-0733) to conduct data collection for the Fast Response Survey System (FRSS) survey #108 on career and technical education (CTE) programs offered to high school students in public school districts. The survey will provide nationally representative data, with a First Look report on the results to be released in December 2017. The Office of Career, Technical, and Adult Education (OCTAE) requested that NCES conduct this FRSS survey.
Nearly all public high school students (95 percent of ninth-grade students in 2009) attended a school that offered Career and Technical Education (CTE) instruction, either on campus or at a partnering school. In 2009, 85 percent of public high school graduates had completed one or more occupational CTE courses, 76 percent had earned at least one full credit in occupational CTE, and 19 percent were CTE concentrators who had earned at least three credits in the same CTE field.1
Effective, high-quality CTE programs are aligned with college- and career-readiness standards as well as the needs of employers, industry, and labor. They provide students with a curriculum that combines integrated academic and technical content and strong employability skills. They also provide work-based learning opportunities that enable students to connect what they are learning to real-life career scenarios and choices. The students participating in effective CTE programs graduate with industry certifications or licenses and postsecondary certificates or degrees that prepare them for in-demand careers within high-growth industry sectors.2
NCES is authorized to conduct the FRSS survey by the Education Sciences Reform Act of 2002 (ESRA, 20 U.S.C. §9543). NCES has contracted with Westat to carry out all stages of this survey.
Design
Overview of Survey Development
FRSS has established procedures for developing short surveys on a wide variety of topics. The techniques used to shape the design of FRSS 108 include literature reviews on CTE programs, input from the NCES Quality Review Board (QRB), three rounds of feasibility calls, and a pretest. Each of these steps is described below.
The current survey reflects topics and issues identified through the literature review, refined through three rounds of feasibility calls and a pretest with the public school district personnel most knowledgeable about high school CTE programs. The first round of feasibility calls was conducted with 8 respondents in October and November 2015 (OMB# 1850-0803 v.144). Because this is a new survey topic, the first round used an open-ended interview guide to learn about the CTE programs that districts offer to high school students, the terminology districts use for these programs, and the characteristics of the programs offered. The second round of feasibility calls was conducted with 11 respondents in January and February 2016, during which respondents provided feedback on draft survey questions. The third round was conducted with 15 respondents in June and July 2016, during which respondents reviewed draft survey questions, instructions, and definitions revised on the basis of the earlier rounds. The resulting draft questionnaire was then reviewed by the NCES QRB and revised accordingly to prepare it for the pretest.
Pretest calls with 12 districts were conducted in September 2016 (OMB# 1850-0803 v.166). For the pretest, respondents were asked to complete the questionnaire and fax it to Westat, and then participate in a telephone debriefing with Westat to provide feedback on the questionnaire. The purpose of the pretest was to verify that all questions and corresponding instructions were clear and unambiguous, to determine if the information would be readily accessible to respondents, and to determine whether the burden on respondents could be reduced further. Changes to the questionnaire were made based on the feedback received from the pretest, and documented in a memorandum summarizing the pretest results (Attachment 1). The revised questionnaire (Attachment 2) is being submitted with this request for OMB clearance.
Procedures and Data Collection Instrument
A questionnaire package, containing a paper copy of the FRSS 108 questionnaire, a cover letter (Attachment 3), and a web information sheet (Attachment 4), will be mailed to each sampled district. The cover letter requests the participation of the district and introduces the purpose and content of the survey. It also notes that the survey should be completed by the person in the district who is most knowledgeable about career and technical education (CTE) programs for high school students. The cover letter includes instructions on how to complete and return the survey, as well as contact information in case of questions. The web information sheet is included in the mailing to describe the option of completing a web version of the survey. On the cover of the survey and in the cover letter, respondents are assured that their participation is voluntary and that their answers may not be disclosed, or used, in identifiable form for any other purpose except as required by law (Education Sciences Reform Act of 2002, 20 U.S.C. § 9573).
If a completed survey is not received from a sampled district within 3 weeks after the initial mailing, the district will be sent a nonresponse follow-up letter (Attachment 5) with another copy of its web information sheet, and will receive a brief, scripted telephone call (Attachment 6) prompting the respondent to return a completed survey via the web, fax, or mail.
NCES Review and Consultations Outside of Agency
The NCES QRB members reviewed a draft list of questionnaire topics prior to the submission of the OMB package for the feasibility calls. Revisions were made to the list of topics based on input from the reviewers, and the list was used to develop an interview guide for the feasibility calls. After the second round of feasibility calls, a draft questionnaire was developed with input from OCTAE. During the third round of the feasibility calls, revisions were made to the draft questionnaire with input from OCTAE. Following the last round of feasibility calls, the QRB reviewed the draft questionnaire, and revisions were made based on their input. The revised version was used for the pretest.
In addition to staff from each of the three Divisions at NCES, the QRB also included staff from OCTAE and the Office of Planning, Evaluation, and Policy Development (OPEPD). The QRB members for this survey are listed below:
Braden Goetz, OCTAE
Sharon Lee Miller, OCTAE
Lul Tesfai, OCTAE
John Haigh, OCTAE
Heidi Silver-Pacuilla, OCTAE
Kelly Fitzpatrick, OPEPD
Michael Fong, OPEPD (Policy and Program Studies Service)
Milagros Lanauze, OPEPD (Budget Service)
Jing Chen, NCES (Assessment Division)
Chris Chapman, NCES (Sample Surveys Division)
Sharon Boivin, NCES (Sample Surveys Division)
Lisa Hudson, NCES (Sample Surveys Division)
Elise Christopher, NCES (Sample Surveys Division)
Gigi Jones, NCES (Administrative Data Division)
Joseph Murphy, NCES (Administrative Data Division)
Kashka Kubzdela, NCES (Statistical Standards and Data Confidentiality)
Assurance of Confidentiality
Data to be collected will not be released to the public with institutional or personal identifiers attached. Data will be presented in aggregate statistical form only. In addition, each data file undergoes extensive disclosure risk analysis and is reviewed by the NCES/IES Disclosure Review Board before use in generating report analyses and before release as a public use data file. Respondents will be assured that their participation in the survey is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (Education Sciences Reform Act of 2002, 20 U.S.C. § 9573).
Description of Sample and Burden
The sample design calls for a nationally representative sample of approximately 1,800 public school districts with high school students, drawn from the 2013–14 (or most recent) NCES Common Core of Data (CCD) Local Education Agency Universe File. The questionnaire is limited to three pages of items readily available to respondents and can be completed by most respondents in about 20 minutes.
Any special requirements that districts have for approval of surveys will be met before those districts are contacted. Based on previous FRSS studies, approximately 15 sampled districts are expected to have special clearance procedures, each typically with unique requirements for obtaining approval. The materials sent to these districts will be tailored to meet each district's specific requirements, consistent with the materials included in this OMB package. For example, most such districts request information on survey justification, confidentiality, sample size, and data collection procedures, which will be copied from the appropriate sections of the OMB package after its approval.
Questionnaire packages will be mailed to the superintendent of each sampled district in January 2017. The cover letter and questionnaire will include a description of the most appropriate respondent. Follow-up for nonresponse will be conducted by mail, email, and telephone and will begin about 3 weeks after the questionnaires have been mailed to the districts. Experienced telephone interviewers will be trained to conduct the nonresponse follow-up and will be monitored by Westat supervisory personnel. The telephone nonresponse follow-up call is expected to take about 5 minutes and is used to prompt respondents to complete the survey on the web or on the paper form (the latter to be mailed or faxed to Westat upon completion).
The respondent burden is estimated to average 2 hours for each of the 15 special clearance districts (table 1). The estimated burden for each of the 1,800 sampled districts to review the introductory letter requesting participation (initial contact) is 5 minutes per district. Assuming a response rate of 90 percent, the initial sample of 1,800 districts will yield about 1,620 completed questionnaires, with a response burden of approximately 20 minutes per completed questionnaire.3 It is also anticipated that about 75 percent of the sampled districts will receive a nonresponse follow-up call lasting about 5 minutes. A brief computational check of the burden arithmetic follows table 1.
Table 1. Estimated burden for data collection and nonresponse follow-up for 1,800 districts: FRSS 108
Type of collection | Sample size | Estimated response rate (percent) | Estimated number of respondents | Estimated number of responses | Burden hours per respondent | Total respondent burden hours
Special clearance district review | 15 | 100 | 15 | 15 | 2.00 | 30
Initial district contact | 1,800 | 100 | 1,800 | 1,800 | 0.083 | 149
Questionnaire | 1,800 | 90 | 1,620 | 1,620 | 0.33 | 535
Nonresponse follow-up call to district | 1,800 | 75 | 1,350 | 1,350 | 0.083 | 112
Total burden | - | - | 1,815 | 4,785 | - | 826
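The figures in table 1 follow directly from the sample sizes, expected response rates, and the rounded per-respondent times (5 minutes ≈ 0.083 hours; 20 minutes ≈ 0.33 hours). As an illustrative check of the arithmetic (not part of the survey materials), the short Python sketch below reproduces the row and total burden hours:

```python
# Reproduce the table 1 burden arithmetic (illustrative check only).
rows = [
    # (type of collection, sample size, response rate, burden hours per respondent)
    ("Special clearance district review", 15, 1.00, 2.00),
    ("Initial district contact", 1800, 1.00, 0.083),
    ("Questionnaire", 1800, 0.90, 0.33),
    ("Nonresponse follow-up call", 1800, 0.75, 0.083),
]

total_responses = 0
total_hours = 0
for name, sample, rate, hours in rows:
    responses = round(sample * rate)    # expected number of responses
    burden = round(responses * hours)   # total burden hours for the row
    total_responses += responses
    total_hours += burden
    print(f"{name}: {responses:,} responses, {burden} hours")

print(f"Total: {total_responses:,} responses, {total_hours} hours")  # 4,785 responses, 826 hours
```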
Questionnaire
The purpose of this Fast Response Survey System (FRSS) survey is to collect nationally representative data from public school districts about CTE programs offered by the districts. These programs may be offered at district facilities or in a partnering off-site location, such as area CTE facilities or postsecondary institutions. The sample will focus on school districts with high schools because CTE programs are generally designed for high school students.
Questions in the survey are based on criteria that OCTAE identified, in its blueprint specifications for the most recent reauthorization of the Perkins Act, as characteristics that define high-quality CTE programs. The questionnaire covers topics that focus on the characteristics of the CTE programs offered by districts, work-based learning activities, involvement of employers in CTE programs, barriers to offering programs and to participation in CTE, and the factors districts consider when adding or phasing out CTE programs.
The instructions and definitions page provides the definition of career and technical education (CTE) programs as “a sequence of courses at the high school level that provides students with the academic and technical knowledge and skills needed to prepare for further education and careers in current or emerging professions.” On this page, respondents are instructed to include all CTE programs that the district offers to high school students, including programs provided by the district or by other entities; to report only for CTE programs offered to high school students; and to report for the 2016–17 school year and the summer of 2016.
The box above question 1 provides the following instruction to respondents: “For this survey, include all CTE programs that your district offers to high school students, including programs provided by your district or by other entities (such as an area/regional CTE center, a consortium of districts, or a community or technical college).”
Question 1 asks whether the district offers CTE programs to students at the high school level. Respondents who answer “no” are directed to skip to question 15 because the intervening questions only apply to districts that offer CTE programs to their high school students.
Question 2 asks respondents to indicate (yes or no) whether each of the listed entities provides any of the CTE programs that the district offers to its high school students. The entities include the following: an area/regional CTE center or a group/consortium of school districts, your district individually (not as part of a consortium), 2-year community or technical college(s), and 4-year college(s) or universities. Respondents can also specify “other.”
Question 3 asks respondents to indicate (yes or no) whether the district offers CTE programs to high school students at each of the listed locations. The locations include the following: at some or all of your district’s regular (comprehensive) high schools; at another district’s regular (comprehensive) high school; at a CTE-focused high school that students attend full time; at a CTE center that students attend part time (for example, students spend half the day at the CTE center and half at the regular high school); and at a 2-year community or technical college, or a 4-year college or university. Respondents can also specify “other.”
Question 4 asks about how many of the CTE programs offered by the district to high school students are structured as career pathways that align with related postsecondary programs. The response options are none, few, some, most, and all.
Question 5 is a yes/no question that asks whether the district offers any CTE courses in which students may earn high school credits in math, science, English/language arts, or social studies.
Question 6 is a yes/no question that asks whether the district offers any CTE courses for which students can earn both high school and postsecondary credits for the same course.
Question 7 is a yes/no question that asks whether the district offers any CTE courses online, including courses in a blended/hybrid format.
Question 8 asks respondents to indicate (yes or no) whether each of the listed items is included in any of the CTE programs offered by the district to high school students. The items include the following: student-run enterprises or services (for example, school store or restaurant, cosmetology services, automotive or construction services, child development facility); mentoring by local employers; on-the-job training, internships, practicums, clinical experiences, or cooperative education (co-op); and apprenticeships or pre-apprenticeship programs (such as youth apprenticeships). Respondents can also specify “other work-based learning.”
Question 9 asks about how many of the CTE programs offered by the district to high school students require work-based learning activities (such as those listed in Question 8) for completion of the program. The response options are none, few, some, most, and all.
Question 10 asks respondents to indicate the extent to which employers are involved with the CTE programs offered by the district to high school students in each of the listed ways. The listed ways include the following: provide work-based learning opportunities, serve on your district’s CTE advisory council, advise about which occupations are in demand, provide advice on CTE programs to add or eliminate, review CTE program curriculum, provide guidance on industry standards, provide guidance about equipment or facilities, donate equipment, host student field trips, serve as guest speakers to CTE students, provide guidance for student CTE projects, judge student CTE competitions, and provide training opportunities for CTE teachers. Respondents can also specify “other.” The response options are not at all, small extent, moderate extent, large extent, and very large extent.
Question 11 asks respondents to indicate how much of a barrier each of the listed items is to the district in offering CTE programs to high school students. The items include the following: lack of funding or high cost of programs (for example, cost of infrastructure or equipment); facilities or space limitations; finding or keeping teachers for in-demand industries and occupations; limited availability of professional development in technical fields; difficulty keeping CTE teachers’ technical skills up to date; CTE teachers who move into teaching from other occupations have difficulty obtaining a regular or standard state teaching certificate; and difficulty developing partnerships with employers for work-based learning. Respondents can also specify “other.” The response options are not a barrier, small barrier, moderate barrier, large barrier, and very large barrier.
Question 12 asks respondents to indicate how much of a barrier each of the listed items is to student participation in the CTE programs offered by the district to high school students. The items include the following: lack of time in students’ schedules for CTE courses; students’ or parents’ negative perceptions of CTE; teachers’ or guidance counselors’ negative perceptions of CTE; transportation to CTE programs outside of the high school campus; transportation for work-based learning; students’ costs for supplies, uniforms, or materials; students’ difficulty finding work-based learning opportunities; lack of student support services for special populations. Respondents can also specify “other.” The response options are not a barrier, small barrier, moderate barrier, large barrier, and very large barrier.
The box above question 13 provides the following instruction to respondents: “Questions 13 and 14 ask about adding or phasing out CTE programs. Please answer these questions about CTE programs for which your district has a role in making these decisions. Check here and skip to question 15 if your district does not have a decision-making role in adding or phasing out CTE programs.” This instruction and checkbox are included because, during feasibility calls, some districts reported that they were not involved in the decisions for adding or phasing out CTE programs that were provided by entities other than the district, such as a community college or an area/regional CTE center.
Question 13 asks respondents to indicate the extent to which each of the listed factors influences the district’s decision on whether to add a new CTE program for high school students. The factors include the following: student interest; facilities/space considerations (for example, whether appropriate space is available); costs for new program; availability of qualified teachers; information on which industries and occupations are in demand; employer (business/industry) recommendations; postsecondary institution recommendations; recommendations from your state department of education; career pathways from the high school to the postsecondary level (for example, to structure new pathways or better align existing pathways). Respondents can also specify “other.” The response options are not at all, small extent, moderate extent, large extent, and very large extent.
Question 14 asks respondents to indicate the extent to which each of the listed factors influences the district’s decision on whether to phase out a CTE program for high school students. The factors include the following: enrollment or student interest; facilities/space considerations (for example, facilities are outdated, space is needed for other purposes); cost of program; availability of qualified teachers (for example, a teacher leaves and is difficult to replace); information on which industries and occupations are in demand; employer (business/industry) recommendations; postsecondary institution recommendations; recommendations from your state department of education; career pathways from the high school to the postsecondary level (for example, if a program does not align with a career pathway). Respondents can also specify “other.” The response options are not at all, small extent, moderate extent, large extent, and very large extent.
Question 15 asks whether high school students within the responding district’s enrollment area have the option of enrolling in a separate CTE district instead of enrolling in the responding district. Because some respondents may not be familiar with this situation, the question first explains that some states have CTE school districts that provide only CTE programs, and that students have the option of enrolling in the CTE district instead of enrolling in their home district. All districts (both those that offer CTE programs and those that do not) are asked to answer Q15. For districts that do not offer CTE programs to their enrolled students (question 1 = no), it is especially important to know whether those students have the option of enrolling in a CTE district.
Survey Cost and Time Schedule
The survey is estimated to cost the federal government about $750,000, including about $700,000 for contractual costs and $50,000 for salaries and expenses. Contractual costs include the costs for survey preparation, data collection, data analysis, and report preparation.
Mailing of the survey will begin in January 2017, and about 3 weeks later, telephone follow-up for nonresponse will begin. Data collection is scheduled to end about 20 weeks after initial mailout.
Plan for Tabulation and Publication
The First Look report will be released on the NCES website in December 2017 and will include explanatory text and tables. Participating districts will be notified when NCES releases the report. A public use data file will also be released on the NCES website. Survey responses will be weighted to produce national estimates. Tabulations will be produced for each data item. Cross-tabulations of data items will be made with selected classification variables, such as district enrollment size, community type (locale), and geographic region.
Statistical Methodology
Reviewing Statisticians
Chris Chapman, of NCES, is the Project Officer for this survey. Adam Chu, Senior Statistician, Westat, was consulted about the statistical aspects of the design.
Respondent Universe
FRSS 108 will collect data from a nationally representative sample of public school districts. All types of districts, except those that are federally operated, are in scope for the CTE survey if they meet the following conditions (an illustrative eligibility check follows the list):
The district contains at least one school that provides instruction in grade 11 or 12.
The district is located within the 50 States or the District of Columbia.
The district does not have zero or missing enrollment.
The district is either:
a regular school district (i.e., a local school district including those that are part of a supervisory union), or
a non-regular district (i.e., a supervisory union administrative center, regional educational services agency, state-operated district, charter school agency, or other non-regular district) that has at least one operating vocational education school that has greater than zero enrollment and is not a shared-time school.4
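For concreteness, a minimal Python sketch of this eligibility screen is shown below. The record field names are hypothetical stand-ins rather than actual CCD variable codes; only the district type-code values (regular districts are coded 1 or 2, per the note to Table 2) come from this document.

```python
# Illustrative eligibility filter for the in-scope conditions above.
# Field names are hypothetical stand-ins, not actual CCD variable codes.
def in_scope(district: dict) -> bool:
    if district["federally_operated"]:       # federally operated districts are excluded
        return False
    if not district["has_grade_11_or_12"]:   # at least one school teaching grade 11 or 12
        return False
    if not district["in_50_states_or_dc"]:   # 50 states or the District of Columbia
        return False
    if not district.get("enrollment"):       # enrollment must be nonzero and nonmissing
        return False
    if district["type_code"] in (1, 2):      # regular district (CCD type 1 or 2)
        return True
    # Non-regular district: needs at least one operating, non-shared-time
    # vocational education school with greater-than-zero enrollment.
    return any(s["operating"] and s["vocational"] and not s["shared_time"]
               and s["enrollment"] > 0
               for s in district.get("schools", []))
```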
Sampling Frame
The sampling frame (i.e., universe list) from which the district sample will be drawn will be constructed from the 2013–14 (or most current available) Common Core of Data (CCD) Local Education Agency (LEA) Universe Survey file maintained by the National Center for Education Statistics (NCES).5 The CCD file contains a record for all known public school districts along with selected characteristics such as type of district, enrollment size by grades offered, urbanicity (type of locale), region of country, and others. As summarized in Table 2, there are 11,394 school districts with grades 11–12 that meet the conditions for inclusion in the CTE survey.
Table 2. Distribution of eligible districts with grades 11–12 in the 2013–14 CCD LEA Universe File, by district type, enrollment size class, and type of locale
District type* | Enrollment size class | City | Suburban | Town | Rural | Total
Regular | Less than 1,000 | 21 | 205 | 431 | 4,139 | 4,796
Regular | 1,000 to 2,499 | 34 | 622 | 1,061 | 1,218 | 2,935
Regular | 2,500 to 9,999 | 243 | 1,261 | 732 | 525 | 2,761
Regular | 10,000 or more | 359 | 414 | 21 | 54 | 848
Non-regular | All | 6 | 35 | 4 | 9 | 54
Total | All | 663 | 2,537 | 2,249 | 5,945 | 11,394
*Regular school districts are coded as type 1 or 2 in the CCD file. All other type codes are non-regular districts.
Sample Design and Stratification
Traditionally, surveys conducted under the FRSS have employed stratified samples ranging in size from 1,200 to 1,800 districts, depending on analytic goals and available resources. Since FRSS is designed to provide estimates for broadly defined subgroups of interest as well as overall national estimates, a stratified sample design with primary strata defined by size class and other characteristics has generally been found effective in meeting study objectives. Specifying explicit strata for sampling purposes allows districts to be selected at varying rates to (a) ensure that key subgroups are adequately represented in the sample and (b) improve sampling precision for selected subgroup estimates. Moreover, using enrollment size as the primary stratifier helps ensure reasonable levels of precision for sample-based estimates that are correlated with the size of the district.
In view of the above considerations, we plan to select a stratified sample of 1,800 districts for the FRSS survey, with strata defined by classifying districts in the sampling frame by (a) enrollment size class (six size classes: [1] under 1,000 students; [2] 1,000 to 2,499; [3] 2,500 to 9,999; [4] 10,000 to 24,999; [5] 25,000 to 99,999; and [6] 100,000 or more) and (b) type of locale (city, suburb, town, and rural). The total sample size will be allocated to the strata in proportion to the aggregate square root of the enrollment of the districts in each stratum. Since most estimates derived from the survey will be categorical (e.g., the estimated proportion of districts with a specified characteristic), using the square root of enrollment rather than enrollment itself as the measure of size will limit the design effects (and associated increased variances) arising from the varying sampling rates. Other variables, such as region and poverty status, will be used to sort districts in the sampling frame prior to sample selection. This sorting is a form of “implicit” stratification that helps ensure that districts with the given characteristics are appropriately represented in the sample. Within each sampling stratum, districts will be selected systematically and with equal probability from the sorted list of districts. Note that under this stratification, all districts with 100,000 or more students will be included in the sample with certainty. Table 3 summarizes the allocation of the sample of 1,800 districts to strata; these are the numbers of districts to be selected. Assuming a 90 percent response rate, the expected number of responding districts is 1,620. A computational sketch of the allocation and systematic selection follows Table 3.
Table 3. Distribution of district sample by enrollment size class and type of locale
Enrollment size class | City | Suburb | Town | Rural | Total
Less than 1,000 | 2 | 18 | 38 | 286 | 344
1,000 to 2,499 | 5 | 88 | 142 | 159 | 394
2,500 to 9,999 | 63 | 294 | 156 | 115 | 628
10,000 or more* | 191 | 213 | 8 | 22 | 434
All | 261 | 613 | 344 | 582 | 1,800
* Districts with enrollment of 100,000 or greater will be included in the sample with certainty.
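The following Python sketch illustrates the allocation rule and selection method described above: sample allocated in proportion to the aggregate square root of enrollment, a certainty stratum for the largest districts, and equal-probability systematic selection from a sorted list. The strata, district counts, and enrollments below are hypothetical placeholders, not the actual CCD frame.

```python
import math
import random

random.seed(108)

# Hypothetical frame: stratum label -> list of district enrollments.
# Counts and enrollments are illustrative placeholders, not the actual CCD frame.
frame = {
    "under_1000/rural": [random.randint(100, 999) for _ in range(4139)],
    "1000_2499/town": [random.randint(1000, 2499) for _ in range(1061)],
    "2500_9999/suburb": [random.randint(2500, 9999) for _ in range(1261)],
    "100k_plus": [random.randint(100_000, 600_000) for _ in range(25)],
}

TOTAL_SAMPLE = 1800
CERTAINTY = {"100k_plus"}  # districts with 100,000+ students are taken with certainty
n_certainty = sum(len(frame[h]) for h in CERTAINTY)

# Measure of size per non-certainty stratum: aggregate square root of enrollment.
measure = {h: sum(math.sqrt(e) for e in enrollments)
           for h, enrollments in frame.items() if h not in CERTAINTY}
total_measure = sum(measure.values())

allocation = {h: len(frame[h]) for h in CERTAINTY}
for h, m in measure.items():
    allocation[h] = round((TOTAL_SAMPLE - n_certainty) * m / total_measure)

def systematic_sample(units, n):
    """Equal-probability systematic selection from a sorted list. In practice the
    frame would be sorted by region and poverty status (implicit stratification);
    here the units are simply enrollment values sorted numerically."""
    k = len(units) / n
    start = random.random() * k
    return [units[int(start + i * k)] for i in range(n)]

for h, n_h in allocation.items():
    sample = systematic_sample(sorted(frame[h]), n_h)
    print(f"{h}: selected {len(sample)} of {len(frame[h])} districts")
```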
Expected Levels of Precision
Table 4 summarizes the expected sample sizes and levels of precision for selected subgroup estimates derived from the proposed sample design. The numbers of responding districts shown in the table are calculated assuming an overall response rate of 90 percent. Also shown are 95% confidence bounds around an estimated percentage derived from the respondent samples. The confidence bounds given in the table are for reported respondent characteristics ranging from a 20% characteristic to a 50% characteristic. As the table shows, for subgroups with at least 340 respondents, estimates are expected to be relatively precise, with 95% confidence bounds ranging from about ±4.1% to ±6.1% for an estimated 50 percent characteristic. Moreover, under the proposed sample design, the minimum detectable difference (MDD) in estimated percentages between subgroups of approximately 300 respondents each would range from about 10% to 12% (e.g., using a t-test for significance). A sketch of the confidence-bound calculation follows Table 4.
Table 4. Expected sample sizes (number of completed interviews) and 95% confidence bounds around an estimated proportion by selected subgroups under proposed design
Subgroup | Number selected | Number of responding districts | P = 20% | P = 33% | P = 50%
Size class: <1,000 | 344 | 309 | ±4.46% | ±5.25% | ±5.58%
Size class: 1,000 to 2,499 | 394 | 355 | ±4.16% | ±4.89% | ±5.20%
Size class: 2,500 to 9,999 | 628 | 565 | ±3.30% | ±3.88% | ±4.12%
Size class: 10,000+ | 434 | 391 | ±4.11% | ±4.83% | ±5.13%
Locale: City | 252 | 227 | ±6.06% | ±7.12% | ±7.57%
Locale: Suburb | 600 | 540 | ±3.90% | ±4.58% | ±4.87%
Locale: Town | 348 | 313 | ±4.83% | ±5.68% | ±6.04%
Locale: Rural | 600 | 540 | ±3.67% | ±4.32% | ±4.59%
Region: Northeast | 380 | 342 | ±4.88% | ±5.73% | ±6.10%
Region: Southeast | 342 | 308 | ±5.23% | ±6.15% | ±6.54%
Region: Central | 553 | 498 | ±3.99% | ±4.69% | ±4.98%
Region: West | 525 | 472 | ±4.65% | ±5.46% | ±5.81%
Poverty level*: <10% | 342 | 308 | ±5.29% | ±6.22% | ±6.61%
Poverty level*: 10 to 19.9% | 690 | 621 | ±3.74% | ±4.39% | ±4.67%
Poverty level*: 20 to 29.9% | 512 | 461 | ±4.44% | ±5.22% | ±5.55%
Poverty level*: 30%+ | 256 | 230 | ±6.29% | ±7.40% | ±7.87%
All districts | 1,800 | 1,620 | ±2.34% | ±2.75% | ±2.92%
* Based on district-level estimates from the Small Area Income and Poverty Estimates (SAIPE) program of the U.S. Census Bureau (http://www.census.gov/did/www/saipe/methods/index.html).
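The bounds in Table 4 are consistent with the standard normal-approximation half-width, 1.96·sqrt(deff × p(1−p)/n), where deff is a design effect reflecting the varying sampling rates. A design effect of roughly 1.44 reproduces the “All districts” row; that value is inferred here for illustration and is not stated in the survey plan. A minimal Python sketch:

```python
import math

def ci_halfwidth(p, n, deff=1.0, z=1.96):
    """95% confidence bound (half-width) for an estimated proportion p
    based on n respondents, inflated by a design effect deff."""
    return z * math.sqrt(deff * p * (1 - p) / n)

# 'All districts' row of Table 4: n = 1,620 respondents. A design effect of
# ~1.44 reproduces the published bounds (inferred, not from the survey plan).
for p in (0.20, 0.33, 0.50):
    print(f"P = {p:.0%}: ±{ci_halfwidth(p, n=1620, deff=1.44):.2%}")
# Output: P = 20%: ±2.34%, P = 33%: ±2.75%, P = 50%: ±2.92%
```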
Estimation and Calculation of Sampling Errors
For estimation purposes, sampling weights reflecting the overall probabilities of selection and adjustments for nonresponse will be attached to each data record. To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, 50–100 subsamples or “replicates” will be formed in a way that preserves the basic features of the full sample design. A set of estimation weights (referred to as “replicate weights”) will then be constructed for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and each of the jackknife replicates. The variability of the replicate estimates is used to obtain a measure of the variance (standard error) of the survey statistic. Previous surveys, using similar sample designs, have yielded relative standard errors (i.e., coefficients of variation) in the range of 2 to 10 percent for most national estimates. Similar results are expected for this survey.
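The replicate-weight logic described above can be sketched in a few lines of Python. The sketch below uses one common variant, the delete-one-group (JK1) jackknife, on hypothetical data; a production system would form the 50–100 replicates to mirror the stratified design rather than assigning groups at random.

```python
import numpy as np

rng = np.random.default_rng(108)

# Hypothetical respondent data: full-sample weights and a 0/1 survey item.
n = 1620
weights = rng.uniform(3.0, 12.0, size=n)   # nonresponse-adjusted full-sample weights
y = rng.binomial(1, 0.5, size=n)           # e.g., an indicator item from the questionnaire

def weighted_pct(w, y):
    """Weighted estimate of a percentage."""
    return 100 * np.sum(w * y) / np.sum(w)

theta_full = weighted_pct(weights, y)

# Delete-one-group JK1 jackknife: drop each replicate group in turn,
# reweight the remaining units by R/(R-1), and re-estimate.
R = 50                                     # number of replicates (plan: 50-100)
groups = rng.integers(0, R, size=n)        # random groups for illustration only

theta_reps = np.empty(R)
for r in range(R):
    rep_w = np.where(groups == r, 0.0, weights * R / (R - 1))
    theta_reps[r] = weighted_pct(rep_w, y)

# JK1 variance: (R-1)/R times the sum of squared deviations from the full-sample estimate.
variance = (R - 1) / R * np.sum((theta_reps - theta_full) ** 2)
print(f"Estimate: {theta_full:.1f}%, jackknife standard error: {np.sqrt(variance):.2f}")
```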
1 U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service (2014). National Assessment of Career and Technical Education: Final Report to Congress. Washington, DC.
2 U.S. Department of Education, Office of Career, Technical, and Adult Education (2012). Investing in America’s Future: A Blueprint for Transforming Career and Technical Education (Summary). Washington, DC.
3 This estimate is the average amount of time district staff respondents reported the questionnaire took to complete during the pretest.
4 A shared-time school is typically a school offering vocational/technical education or other education services, in which some or all students are enrolled at a separate “home” school and attend the shared-time school on a part-day basis.
5 Glander, M. (2015). Documentation to the NCES Common Core of Data Local Education Agency Universe Survey: School Year 2013–14 Provisional Version 1a (NCES 2015-147). U.S. Department of Education. Washington, DC: National Center for Education Statistics. Retrieved [date] from http://nces.ed.gov/pubsearch.