U.S. DEPARTMENT OF EDUCATION

INSTITUTE OF EDUCATION SCIENCES


NATIONAL CENTER FOR EDUCATION STATISTICS

Date: July 15, 2011

To: Rochelle W. Martinez, OMB

Through: Kashka Kubzdela, NCES

From: Jared Coopersmith, NCES

Subject: Fast Response Survey System (FRSS) 104: Dual Credit and Exam-Based Courses in Public High Schools: 2010–11, and Postsecondary Education Quick Information System (PEQIS) 18: Dual Enrollment Programs and Courses for High School Students: 2010–11 (OMB# 1850-0733 v.25)


Justification

The National Center for Education Statistics (NCES), U.S. Department of Education (ED) proposes to use the Fast Response Survey System (FRSS) and the Postsecondary Education Quick Information System (PEQIS) to conduct two complementary surveys on dual credit and dual enrollment programs offered to high school students. The proposed surveys were requested by the Office of Elementary and Secondary Education (OESE), ED.

Dual enrollment courses that offer high school students the opportunity to earn high school and postsecondary credit simultaneously have attracted considerable interest among policymakers in recent years. These courses are appealing to policymakers because they hold the promise of both increasing the rigor of the high school curriculum and accelerating the completion of postsecondary education by participating students.1 Some policy analysts also contend that dual enrollment is an effective strategy for promoting enrollment and success in postsecondary education by not only high-achieving students, but middle- and low-achieving students as well.2

Policymakers in 46 states have established statewide policies to support and regulate dual enrollment course offerings. Twelve of these states require all of their school districts or public postsecondary institutions or both to provide dual enrollment opportunities to high school students.3 There also is emerging interest in dual enrollment among federal policymakers. The President’s federal Fiscal Year 2012 budget proposal would create a new College Pathways and Accelerated Learning program that would support, among other activities, efforts to provide dual enrollment programs to students attending high-poverty high schools.

There are no current national data on the prevalence and characteristics of dual enrollment courses. The proposed surveys will help fill this knowledge gap and provide policymakers and educators with comprehensive information about the extent and characteristics of dual enrollment programs in the United States.

The most recent national data on dual enrollment programs are for the 2002–03 school year. These data were collected by NCES through two complementary surveys, including an FRSS secondary school survey and a PEQIS postsecondary institution survey. Together, they provided comprehensive data on the extent of dual enrollment participation and the characteristics of these courses during the 2002–03 school year. These data have been cited widely in the policy and research literature on dual enrollment.4 In the eight years since these data were collected, there have been significant developments in the field that have likely affected the extent and characteristics of dual enrollment programs. A number of states have established new mandates or invested significant funding in expanding dual enrollment opportunities. For example, in 2003, North Carolina enacted new legislation to support partnerships between school districts and institutions of higher education to create innovative high school programs, particularly programs that offer dual enrollment opportunities. In 2006, Texas established a new policy that requires school districts to provide students with the opportunity to earn at least 12 college credits while in high school. Pennsylvania established a new program in 2007 that provides $10 million annually to support dual enrollment programs.5 In addition, since 2002, the Gates Foundation and several other philanthropies have supported the creation of 208 “early college high schools” in 28 states and the District of Columbia. These schools principally serve low- and middle-achieving students and are designed to enable students to earn a high school diploma and an associate’s degree concurrently.6 Given the extensive interest among policymakers and educators in dual enrollment and developments in the field since 2002, NCES and OESE believe the collection of new national data on dual enrollment is warranted.

The current surveys, under OMB clearance #1850-0733, are authorized under the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. § 9573), which authorizes NCES to collect and report statistical data related to education in the United States.

Design

Overview of Survey Development

Westat will collect the information for the Early Childhood, International, and Crosscutting Studies Division, NCES, ED, using the FRSS and PEQIS. Westat is responsible for questionnaire development; sample design and selection; data collection by mail and web; telephone follow-up; editing, coding, keying, and verification of data; and production of tabulations and the reports detailing the results of the surveys.

Both surveys were previously conducted by NCES to collect data for the 2002–03 school year: Dual Credit and Exam-Based Courses (FRSS 85) and Dual Enrollment Programs and Courses for High School Students (PEQIS 14). The development work for the current surveys (to collect data for the 2010–11 school year) is based on the previous versions, with modifications based on literature searches, feasibility calls, and pretest calls with the school and institution staff most knowledgeable about dual credit and dual enrollment. Two rounds of feasibility calls were conducted to gather feedback from respondents on draft surveys (OMB# 1850-0803 v.39). Respondents were asked to review, but not complete, the questionnaire and then participate in a short telephone interview with Westat to provide feedback on the questionnaire. Based on respondent feedback, the surveys were revised in consultation with OESE and NCES prior to pretesting. After the second round of feasibility testing, the surveys were revised and submitted to the NCES Quality Review Board (QRB) for additional feedback.

Based on QRB comments and suggestions, the surveys were revised and pretests were conducted with public secondary schools and postsecondary institutions to identify problems respondents might have in providing the requested information (OMB# 1850-0803 v.50). Respondents were asked to complete the questionnaire and participate in a telephone debriefing with Westat to provide feedback on the questionnaire. Completed questionnaires were collected by fax prior to the debriefing with each respondent. One round of pretesting was conducted for the FRSS survey, and two rounds of pretesting were conducted for the PEQIS survey. The purpose of the pretests was to verify that all questions and corresponding instructions were clear and unambiguous, to determine if the information would be readily accessible to respondents, and to determine whether the burden on respondents could be reduced further. Changes to the questionnaires were made based on the feedback received from the pretests, and documented in memorandums summarizing the pretest results. OESE, the data requester for this survey, reviewed and approved all questionnaire changes. The revised questionnaires (Attachment 1 for FRSS 104 and Attachment 2 for PEQIS 18) are being submitted with this request for OMB clearance.

Assurance of Confidentiality

Data to be collected will not be released to the public with institutional or personal identifiers attached. Data will be presented in aggregate statistical form only. In addition, each data file undergoes extensive disclosure risk analysis and is reviewed by the NCES/IES Disclosure Review Board before use in generating report analyses and before release as a public use data file. Respondents will be assured that all information identifying them or their schools or institutions will be kept confidential in compliance with the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. § 9573).

Description of Sample and Burden

The proposed sample design for FRSS is a nationally representative sample of 1,500 regular public secondary schools from the NCES Common Core of Data (CCD) Public Elementary/Secondary School Universe Survey: School Year 2009–10 file. The proposed PEQIS survey will collect data from a nationally representative sample of 1,600 4-year and 2-year Title IV degree-granting postsecondary institutions in the 50 states and the District of Columbia. The FRSS and PEQIS data collection will be accomplished by means of a self-administered survey. Each respondent will have the option of completing the survey using a paper questionnaire or a Web version of the questionnaire. The questionnaires are limited to three pages of items readily available to respondents and can be completed by most respondents in 30 minutes or less. These procedures are typical for FRSS and PEQIS surveys and result in minimum burden on respondents.

Prior to contacting schools for the FRSS survey, a courtesy information packet consisting of a cover letter (Attachment 3) and copy of the questionnaire will be mailed to the superintendent of each district with sampled schools. The packet will also include a list of the sampled schools within the district. Any special requirements that districts have for approval of surveys will be met before schools in those districts are contacted. Each of the approximately 100 special clearance districts has unique requirements for obtaining approval. The materials sent to special districts will be tailored to meet the specific requirements of each district, based on information from the OMB package. For example, most districts request information on survey justification, confidentiality, sample size, and survey collection procedures, which will be copied from the appropriate sections of the OMB package.

Questionnaires and information needed to access the Web surveys will be mailed in September 2011 to the principals of the sampled schools (for FRSS) and to the PEQIS coordinator of institutions in the PEQIS panel. The cover letter and questionnaire will include a description of the most appropriate respondent. Follow-up for nonresponse will be conducted both by mail and telephone and will begin about 3 weeks after the questionnaires have been mailed to the schools and institutions. Experienced telephone interviewers will be trained to conduct the nonresponse follow-up and will be monitored by Westat supervisory personnel. Telephone nonresponse follow-up is used to prompt respondents to complete the survey by web or mail and is expected to take about 5 minutes.



For FRSS 104, notification to the estimated 1,340 districts is expected to take approximately 5 minutes per district, for a total of 111 respondent burden hours (Table 1). It is anticipated that approximately 100 districts with special clearance procedures will be contacted. The respondent burden will be approximately 2 hours per special district, for a total of 200 respondent burden hours. The estimated burden time for sampled schools to review the introductory letter requesting their participation (initial contact) is 5 minutes per school, for a total burden of 125 hours. The response rates for FRSS and PEQIS surveys of schools and institutions typically have been 90 percent or greater. At a response rate of 90 percent, the initial sample of 1,500 schools for FRSS 104 will yield about 1,350 completed questionnaires. Based on a response burden of approximately 30 minutes per completed questionnaire, the response burden to complete the FRSS 104 survey is estimated to be about 675 hours.7 It is anticipated that about 25 percent of the sample will have returned the completed survey before nonresponse follow-up begins and that about 75 percent of the sample (i.e., 1,125 schools) will receive a nonresponse follow-up call taking about 5 minutes. The total estimated burden time for nonresponse follow-up is about 93 hours. The total number of burden hours for data collection and nonresponse follow-up is about 1,204 hours.

Table 1. Estimated burden for data collection and nonresponse follow-up: FRSS

Type of collection                | Sample size | Estimated response rate (percent) | Estimated number of respondents | Estimated number of responses | Total burden hours per respondent | Respondent burden hours
District notification             | 1,340       | 100                               | 1,340                           | 1,340                         | .083                              | 111
Special clearance district review | 100         | 100                               | 100                             | 100                           | 2.00                              | 200
Initial school contact            | 1,500       | 100                               | 1,500                           | 1,500                         | .083                              | 125
Questionnaire                     | 1,500       | 90                                | 1,350                           | 1,350                         | .50                               | 675
Nonresponse follow-up call        | 1,500       | 75                                | 1,125                           | 1,125                         | .083                              | 93
Total burden                      | -           | -                                 | 2,840                           | 5,415                         | -                                 | 1,204



For PEQIS 18, the estimated burden time for sampled institutions to review the introductory letter requesting their participation (initial contact) is 5 minutes per institution, for a total of 133 respondent burden hours (Table 2). The initial sample of 1,600 institutions will yield about 1,440 completed questionnaires, assuming a response rate of 90 percent. Based on a response burden of approximately 30 minutes per completed questionnaire, the response burden to complete the PEQIS 18 survey is estimated to be about 720 hours.8 It is anticipated that about 25 percent of the sample will have returned the completed survey before nonresponse follow-up begins and that about 75 percent of the sample (i.e., 1,200 institutions) will receive a nonresponse follow-up call taking about 5 minutes. The total estimated burden time for nonresponse follow-up is about 100 hours. The total number of burden hours for data collection and nonresponse follow-up is about 953 hours.

Table 2. Estimated burden for data collection and nonresponse follow-up: PEQIS

Type of collection          | Sample size | Estimated response rate (percent) | Estimated number of respondents | Estimated number of responses | Total burden hours per respondent | Respondent burden hours
Initial institution contact | 1,600       | 100                               | 1,600                           | 1,600                         | .083                              | 133
Questionnaire               | 1,600       | 90                                | 1,440                           | 1,440                         | .50                               | 720
Nonresponse follow-up call  | 1,600       | 75                                | 1,200                           | 1,200                         | .083                              | 100
Total burden                | -           | -                                 | 1,600                           | 4,240                         | -                                 | 953
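The burden totals in Tables 1 and 2 are simple products of response counts and per-response times, with 5 minutes expressed as .083 hour. As an illustrative check only (a sketch, not part of the collection procedures; the row values are copied from Table 1), the arithmetic can be reproduced as follows, and the same calculation on the Table 2 rows yields the PEQIS total of about 953 hours:

    # Illustrative check of the FRSS 104 burden arithmetic in Table 1.
    # Each row: (type of collection, number of responses, burden hours per response).
    rows = [
        ("District notification",             1340, 0.083),
        ("Special clearance district review",  100, 2.00),
        ("Initial school contact",            1500, 0.083),
        ("Questionnaire",                     1350, 0.50),
        ("Nonresponse follow-up call",        1125, 0.083),
    ]

    total = 0.0
    for label, responses, hours_per_response in rows:
        burden = responses * hours_per_response
        total += burden
        print(f"{label:34s} {burden:8.1f} hours")

    # After rounding, the rows match the tabled 111, 200, 125, 675, and 93 hours.
    print(f"{'Total burden':34s} {total:8.1f} hours")  # about 1,204 hours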



Procedures and Data Collection Instruments

FRSS 104 Secondary School Survey

A questionnaire, cover letter (Attachment 4), and Web information sheet (Attachment 5) will be mailed to each sampled school. The cover letter requests the participation of the school and introduces the purpose and content of the survey. It also notes that the survey should be completed by the person most knowledgeable about dual credit and exam-based courses at the school. The cover letter also includes instructions on how to complete and return the survey, as well as contact information in case of questions. The Web information sheet is included in the mailing to provide information about the option to complete a Web version of the survey. On the cover of the survey, respondents are assured that their participation is voluntary and their answers may not be disclosed or used in identifiable form for any other purpose unless compelled by law (Education Sciences Reform Act of 2002, 20 U.S.C. § 9573).

All sampled schools that do not complete a survey within 3 weeks after the initial mailing of the survey will receive a nonresponse follow-up letter (Attachment 6), another copy of the Web information sheet, and a brief, scripted telephone call (Attachment 7) prompting the respondent to return a completed survey via the Web, fax, or mail.

The survey is designed to collect general information on dual credit and exam-based courses for public secondary school students. The instructions and definitions page includes descriptions of key survey terminology, such as Advanced Placement (AP) courses, dual credit, credit in escrow, and duplicated counts of students. Below is a description of each question on the survey.

The first four questions ask about types of exam-based courses, including Advanced Placement (AP) and International Baccalaureate (IB) courses. The first question asks whether students took AP or IB courses and is included to set up the skip pattern for Question 2. The second question asks for the number of enrollments in both types of exam-based courses. Questions 1 and 2 were both asked in the 2002–03 survey. During survey development calls, we found that students sometimes receive dual credit for passing an AP course without having to take the AP exam. Some respondents were counting these courses in both the AP subsection and the dual credit subsection of the questionnaire. As a result, the next set of questions was added prior to the pretest to address the potential overlap between AP and dual credit courses. Question 3 asks whether any students have taken any AP courses for which they could earn dual credit without taking the AP exam. Question 4 asks for the number of enrollments in such courses. We also added specific instructions which indicate that AP courses should be excluded from the dual credit section unless students receive dual credit without taking the AP exam (item 4 on the instructions and definitions page).

Questions 5–7 ask about dual credit courses and requirements related to taking such courses. Question 5 is provided to ‘screen out’ schools that do not have students taking dual credit courses. Question 6 asks whether the school has established any requirements that students must meet for participation in dual credit courses, and is included to set up the skip pattern for Question 7. Question 7 asks about specific student requirements for enrolling in dual credit courses. Questions 5 through 7 were asked in the 2002–03 survey.

Question 8 was included in the survey to address an NCES policy interest in whether any students earned postsecondary credentials by taking dual credit courses.

Question 9 asks whether students took any dual credit courses with an academic focus or a career and technical/vocational focus. It is used as a screening question for the remaining questions in the grid for each course focus. The wording of the section heading and instructions above question 9 is included to emphasize that only dual credit courses (not AP courses) should be included. In addition, we referenced the specific item on the definitions page for course focus. Question 9 was asked in the 2002–03 survey.

Question 10 asks respondents for the number of enrollments for courses taught primarily through distance education, taught at locations for secondary school students, and taught on the campus of a postsecondary institution.

Questions 11–12 focus on dual credit courses taught on high school and postsecondary campuses. Question 11 asks about the instructors who taught the dual credit courses on the high school campus. Question 12 asks about the student composition of courses taught at the postsecondary institution. Questions 11 and 12 were both asked in the 2002–03 survey.

Question 13 asks whether students were awarded postsecondary credit immediately upon course completion or were offered credit in escrow. Question 13 was asked in the 2002–03 survey.

Questions 14–16 are included in the survey to collect information about the costs of enrollment in dual credit courses. Question 14 asks whether students or their parents generally paid out of pocket for tuition, fees, and books related to dual credit courses. Question 15 asks whether the school or district paid any of these expenses for students taking dual credit courses. Question 16 asks who is responsible for transportation costs associated with participation in dual credit courses.

PEQIS 18 Postsecondary Institution Survey

A questionnaire, cover letter (Attachment 8), and Web information sheet (Attachment 9) will be mailed to each institution in the PEQIS panel. The cover letter requests the participation of the institution and introduces the purpose and content of the survey. It also notes that the survey should be completed by the person most knowledgeable about dual enrollment at the institution. The cover letter also includes instructions on how to complete and return the survey, as well as contact information in case of questions. The Web information sheet is included in the mailing to provide information about the option to complete a Web version of the survey. On the cover of the survey, respondents are assured that their participation is voluntary and their answers may not be disclosed or used in identifiable form for any other purpose unless compelled by law (Education Sciences Reform Act of 2002, 20 U.S.C. § 9573).

All sampled institutions that do not complete a survey within 3 weeks after the initial mailing of the survey will receive a nonresponse follow-up letter (Attachment 10), another copy of the Web information sheet, and a brief, scripted telephone call (Attachment 11) prompting the respondent to return a completed survey via the Web, fax, or mail.

The survey is designed to collect basic information about dual enrollment courses offered to high school students by postsecondary institutions. This includes courses within and outside of a formal dual enrollment program. The instructions and definitions page of the questionnaire includes instructions for completing the survey, including the reporting time period. Definitions of key terms and concepts are also included on this page, including the definition of a dual enrollment program and what is meant by “within” and “outside” of a dual enrollment program. Below is a description of each question on the survey.

The first question is asked to determine if any high school students took courses during the previous academic year for college credit through the institution. Respondents that answer “no” are asked to return the survey without answering any additional questions. This question was also asked in the 2002–03 survey on Dual Enrollment Programs and Courses for High School Students.

Questions 2–3 are asked to gather information about courses offered to high school students outside of a dual enrollment program. Question 2 asks if any high school students took such a course, and Question 3 asks how many students took these courses. Questions 2 and 3 were both asked in the 2002–03 survey.

Question 4 is included to determine if high school students took courses for college credit through the institution that were part of a dual enrollment program. For the remaining survey questions, respondents are asked to consider only courses within a dual enrollment program. These questions are designed to collect characteristics of the dual enrollment courses (e.g., location and type of teacher) that would not be meaningful for courses outside an organized program (e.g., high school students taking courses on their own who are treated as regular college students). Question 5 asks for the unduplicated head count of high school students taking these courses. Questions 4 and 5 were asked in the 2002–03 survey.

Question 6 asks whether the dual enrollment courses were taught on the college campus, on a high school campus, through distance education, or at some other location. A similar question was asked in the 2002–03 survey. Prior to the pretest, this question was edited to account for courses taught via distance education.

Questions 7 and 8 collect information about the dual enrollment program instructors, and instructor qualifications for dual enrollment courses taught on a high school campus. Question 7 asks who the instructors were, and Question 8 asks how the minimum qualifications for high school instructors who taught college-level courses compare to those required for college instructors. Questions 7 and 8 were asked in the 2002–03 survey.

Questions 9–11 ask about coursetaking patterns for high school students participating in dual enrollment program(s) at the institution. Question 9 asks about the typical number of courses taken per academic term, and Question 10 asks for the maximum number of courses allowed per academic term. Question 11 asks when high school students were generally awarded college credit for courses taken through the dual enrollment program(s). Questions 9–11 were asked in the 2002–03 survey.

Questions 12–14 are included in the survey to gather information about the eligibility requirements for high school students to participate in dual enrollment programs at the institution. These questions were asked in the 2002–03 survey. Question 12 collects information about grade level requirements. Question 13 asks about other eligibility requirements for students to participate in the dual enrollment program(s). Question 14 asks if the academic requirements for high school students to participate in the dual enrollment program(s) are the same or different from the admission standards for regular college students.

Question 15 asks whether the curriculum for courses in the dual enrollment program(s) is specially designed for high school students or the same as the curriculum for regular college students. Question 15 was also asked during the 2002–03 data collection.

Questions 16–18 are included in the survey to collect information about the costs of participation in dual enrollment programs. Questions 17 and 18 were asked during the 2002–03 data collection, and question 16 is a new question added to the survey based on respondent feedback during survey development. Several respondents noted their institution discounts the rate of tuition for high school students participating in dual enrollment programs. Because the cost of dual enrollment program(s) is important to OESE, Question 16 was added to explicitly gather information about tuition discounts. Question 17 asks which sources paid tuition for the college-level courses taken in the dual enrollment programs. Question 18 asks whether high school students or their parents generally paid out of pocket for tuition, fees, or books related to these courses.

Question 19 is a new question added to address an NCES policy interest. This question is asked to gather information about awards high school students may receive through participation in the dual enrollment program(s).

Question 20 is a new question added at the request of OESE because OESE has a policy interest in the prevalence of comprehensive dual enrollment programs.

Questions 21–24 are included in the survey to gather information about dual enrollment programs that are geared specifically toward high school students at risk of educational failure. Question 21 asks if the institution has such a program. Questions 22–24 are asked only of institutions responding yes to Question 21. Question 22 asks for the number of students enrolled in these programs, and Question 23 gathers information about the typical pattern of enrollments for these students. Question 24 collects information about support services available to students enrolled in these types of dual enrollment programs.

Consultations Outside of Agency

The FRSS and PEQIS surveys were developed in close consultation with the data requester (OESE). General topics and questions were identified through literature reviews and review of the previous surveys conducted in 2002–03 and the resulting data. For the FRSS survey, two rounds of feasibility calls and one round of pretest calls were conducted to generate feedback from respondents on the relevance and respondent burden of survey items. For the PEQIS survey, two rounds of feasibility calls and two rounds of pretest calls were conducted.

Survey Cost and Time Schedule

The FRSS 104 survey is estimated to cost the federal government about $615,000, including about $575,000 for contractual costs and $40,000 for salaries and expenses. The PEQIS 18 survey is estimated to cost the federal government about $540,000, including about $500,000 for contractual costs and $40,000 for salaries and expenses. Contractual costs include the costs for survey preparation, data collection, data analysis, and report preparation and dissemination.

Mailing of the FRSS 104 and PEQIS 18 surveys is planned for September 2011. About 3 weeks after mail out of the surveys, Westat will begin telephone follow-up for nonresponse. Data collection is scheduled for completion about 16 weeks after initial mail out.

Plan for Tabulation and Publication

The First Look report will be released on the NCES website and will include explanatory text and tables. Printed copies of the First Look report will be sent to respondents that participated in the survey, and the FRSS 104 report will be sent to the districts with participating schools. A public use data file will also be released on the NCES website. Survey responses will be weighted to produce national estimates. Tabulations will be produced for each data item.

For FRSS 104, cross tabulations of data items will be made with selected classification variables, such as the following.

School enrollment (less than 500, 500–1,199, and 1,200 or more);

Community type (city, suburban, town, rural);

Geographical region (Northeast, Southeast, Central, and West);

Percent minority enrollment (less than 6 percent, 6–20 percent, 21–49 percent, 50 percent or more).

For PEQIS 18, cross tabulations of data items will be made with selected classification variables, such as the following.

Type of institution (2-year public, 2-year private, 4-year public, 4-year private); and

Size of institution (less than 3,000; 3,000–9,999; 10,000 or more).

Statistical Methodology

Reviewing Statisticians

Jared Coopersmith, of NCES, is the Project Officer for this survey. Adam Chu, Senior Statistician, Westat, was consulted about the statistical aspects of the design. Westat is the contractor currently conducting the Quick Response Information System (QRIS) surveys for NCES.

FRSS 104 Secondary School Survey

Respondent Universe and Statistical Methodology

The respondent universe for the FRSS survey on dual credit courses will be restricted to regular public secondary schools in the United States. For the purposes of this survey, “secondary schools” are defined as those with at least an 11th or 12th grade. Vocational education, special education, and alternative/other non-regular schools will be excluded from the study, along with ungraded schools and schools in the outlying U.S. territories. As described in the following section, a stratified sample of approximately 1,500 secondary schools will be selected for the survey from the 2009–10 NCES Common Core of Data (CCD) Public School Universe File. As indicated in Table 3, approximately 19,000 regular secondary schools are included in the CCD universe file.

A stratified sample design will be used to select 1,500 secondary schools for the proposed survey. The primary sampling strata will be defined by enrollment size class (less than 300, 300 to 499, 500 to 999, 1,000 to 1,499, and 1,500+) and minority status based on the percent of students in the school who are nonwhite (i.e., the categories listed in Table 3).

Initially, the sample will be allocated to each primary stratum in rough proportion to the aggregate square root of the enrollment of the schools within the stratum. Such an allocation (referred to as a “probability-proportionate-to-square root of size” design) has been used in prior FRSS surveys and is efficient for jointly estimating school-level characteristics and quantitative measures correlated with school enrollment. Prior to sample selection, schools in the frame will be sorted by broad categories of urban-centric locale code (city, suburban, town, rural) and OE region within the primary strata defined above. The sorting is designed to induce additional implicit substratification within the primary strata. The sample of 1,500 schools will then be selected systematically from the sorted frame at rates that are determined by the sample allocation described above. Table 4 summarizes the expected numbers of schools to be sampled under the proposed design.
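To make the selection steps concrete, below is a minimal sketch of the “probability-proportionate-to-square root of size” allocation, the locale-by-region sort, and the systematic draw described above. It assumes the frame is a list of records carrying stratum, enrollment, locale, and region fields; the field names and helper functions are illustrative, not the contractor's actual selection program.

    import math
    import random
    from collections import defaultdict

    def allocate_sqrt(frame, n_total):
        # Allocate the total sample to strata in proportion to the aggregate
        # square root of enrollment within each stratum.
        agg = defaultdict(float)
        for school in frame:
            agg[school["stratum"]] += math.sqrt(school["enrollment"])
        grand_total = sum(agg.values())
        # Rounding may leave the overall total a school or two off n_total.
        return {h: round(n_total * s / grand_total) for h, s in agg.items()}

    def systematic_sample(units, n):
        # Equal-probability systematic selection of n units from an ordered list.
        interval = len(units) / n
        start = random.uniform(0, interval)
        return [units[int(start + k * interval)] for k in range(n)]

    def select_schools(frame, n_total=1500):
        sample = []
        for stratum, n_h in allocate_sqrt(frame, n_total).items():
            if n_h == 0:
                continue
            # Sorting by locale and region induces the implicit substratification.
            units = sorted((s for s in frame if s["stratum"] == stratum),
                           key=lambda s: (s["locale"], s["region"]))
            sample.extend(systematic_sample(units, n_h))
        return sample

Dividing the frame counts in Table 3 (below) by the Table 4 allocations approximately reproduces the tabled sampling rates; for example, the 6,179 schools with enrollment under 300 divided by the 238 selections gives the “1 in 25.96” rate. The same square-root allocation logic applies to the PEQIS panel design described in the final section.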

Table 3. Number and enrollment of secondary schools in the 2009–10 CCD public school frame, by enrollment size class and minority status

Instructional level | Enrollment size class | Percent minority enrollment1 | Number of schools | Enrollment
Secondary2          | Less than 300         | Missing                      | 246               | 36,900
                    |                       | Less than 6 percent          | 1,714             | 257,100
                    |                       | 6 to 20.9 percent            | 1,460             | 219,000
                    |                       | 21 to 49.9 percent           | 1,106             | 165,900
                    |                       | 50 percent or more           | 1,653             | 247,950
                    | 300 to 499            | Missing                      | 0                 | 0
                    |                       | Less than 6 percent          | 1,044             | 417,600
                    |                       | 6 to 20.9 percent            | 688               | 275,200
                    |                       | 21 to 49.9 percent           | 477               | 190,800
                    |                       | 50 percent or more           | 899               | 359,600
                    | 500 to 999            | Missing                      | 0                 | 0
                    |                       | Less than 6 percent          | 996               | 747,000
                    |                       | 6 to 20.9 percent            | 1,119             | 839,250
                    |                       | 21 to 49.9 percent           | 769               | 576,750
                    |                       | 50 percent or more           | 1,066             | 799,500
                    | 1,000 to 1,499        | Missing                      | 0                 | 0
                    |                       | Less than 6 percent          | 225               | 281,250
                    |                       | 6 to 20.9 percent            | 760               | 950,000
                    |                       | 21 to 49.9 percent           | 724               | 905,000
                    |                       | 50 percent or more           | 779               | 973,750
                    | 1,500 or more         | Missing                      | 0                 | 0
                    |                       | Less than 6 percent          | 72                | 144,000
                    |                       | 6 to 20.9 percent            | 611               | 1,222,000
                    |                       | 21 to 49.9 percent           | 1,032             | 2,064,000
                    |                       | 50 percent or more           | 1,481             | 2,962,000
Total               |                       |                              | 18,921            | 14,634,550

1 Percent of nonwhite students in the school as reported in the CCD file.

2 Regular schools in the CCD public school universe file with a high grade of 11 or 12.



Table 4. Proposed allocation of the secondary school sample by enrollment size class and corresponding sampling rate

Enrollment size class | Number of schools to be selected | Sampling rate (1 in ...)
1. Under 300          | 238                              | 25.96
2. 300 to 499         | 196                              | 15.90
3. 500 to 999         | 340                              | 11.61
4. 1,000 to 1,499     | 277                              | 8.99
5. 1,500 or more      | 450                              | 7.11
Total                 | 1,500                            | -



Expected Levels of Precision

Table 5 summarizes the approximate sample sizes and standard errors to be expected under the proposed design for selected analytic domains. Note that the standard errors in Table 5 reflect design effects ranging from 1.10 to 1.30. The design effects (i.e., unequal weighting effects) are a consequence of the fact that large schools will be sampled at relatively higher rates (i.e., have smaller sampling weights) than small schools. Since the sample sizes in Table 5 are based on preliminary tabulations of the CCD file, the actual sample sizes may differ from those shown. Also, note that the sample sizes represent the expected numbers of completed questionnaires, and not the initial number of schools to be selected. The 2002–03 FRSS survey on dual credit achieved an unweighted response rate of over 90 percent. The standard errors in Table 5 can be converted to 95 percent confidence bounds by multiplying the entries by 2. For example, as can be seen in Table 5, an estimated proportion of the order of 20 percent (P = 0.20) for city schools would be subject to a margin of error of ±0.050 (±5.0 percent) at the 95 percent confidence level. Similarly, an estimated proportion of the order of 50 percent (P = 0.50) for the total sample would be subject to a margin of error of ±0.032 (±3.2 percent) at the 95 percent confidence level.
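The entries in Table 5 are consistent with the usual approximation SE = sqrt(deff × p(1 − p)/n). As a minimal sketch (the design effect of 1.3 used here is an assumption drawn from the stated 1.10–1.30 range, chosen because it reproduces the tabled values for these two domains after rounding):

    import math

    def proportion_se(p, n, deff=1.3):
        # Approximate standard error of an estimated proportion p based on n
        # completed questionnaires, inflated by the design effect deff.
        return math.sqrt(deff * p * (1 - p) / n)

    # Total sample (n = 1,350) at P = 0.50: SE rounds to 0.016; doubling gives
    # the roughly +/-3.2 percent margin of error cited above.
    print(round(proportion_se(0.50, 1350), 3))   # 0.016

    # City schools (n = 325) at P = 0.20: SE rounds to 0.025, i.e., a margin
    # of error of roughly +/-5.0 percent.
    print(round(proportion_se(0.20, 325), 3))    # 0.025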

Table 5. Expected sample sizes (number of responding schools) and corresponding standard errors for estimates of proportions for selected analytic domains

Domain                                         | Expected sample size1 | Standard error2 (P = 0.20) | Standard error2 (P = 0.33) | Standard error2 (P = 0.50)
Total sample                                   | 1,350                 | 0.012                      | 0.015                      | 0.016
Type of locale                                 |                       |                            |                            |
  City                                         | 325                   | 0.025                      | 0.030                      | 0.032
  Suburban                                     | 362                   | 0.024                      | 0.028                      | 0.030
  Town                                         | 186                   | 0.033                      | 0.039                      | 0.042
  Rural                                        | 477                   | 0.021                      | 0.025                      | 0.026
Percent of students in school who are nonwhite |                       |                            |                            |
  Less than 6 percent                          | 229                   | 0.030                      | 0.035                      | 0.038
  6 to 20.9 percent                            | 331                   | 0.025                      | 0.029                      | 0.031
  21 to 49.9 percent                           | 330                   | 0.025                      | 0.029                      | 0.031
  50 percent or more                           | 459                   | 0.021                      | 0.025                      | 0.027
Region                                         |                       |                            |                            |
  Northeast                                    | 255                   | 0.029                      | 0.034                      | 0.036
  Southeast                                    | 324                   | 0.025                      | 0.030                      | 0.032
  Central                                      | 351                   | 0.024                      | 0.029                      | 0.030
  West                                         | 420                   | 0.022                      | 0.026                      | 0.028
Enrollment size class                          |                       |                            |                            |
  Less than 300                                | 214                   | 0.029                      | 0.034                      | 0.036
  300 to 499                                   | 176                   | 0.032                      | 0.037                      | 0.039
  500 to 999                                   | 306                   | 0.024                      | 0.028                      | 0.030
  1,000 to 1,499                               | 249                   | 0.027                      | 0.031                      | 0.033
  1,500 or more                                | 405                   | 0.021                      | 0.025                      | 0.026

1 Expected number of responding schools assuming a 90 percent response rate.

2 Assumes design effects ranging from 1.1 to 1.3 depending on the analytic domain.



Estimation and Calculation of Sampling Errors

For estimation purposes, sampling weights reflecting the overall probabilities of selection will be attached to each data record. These weights will also include upward adjustments for unit nonresponse. To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, 50 subsamples or “replicates” will be formed in a way that preserves the basic features of the full sample design. A set of estimation weights (referred to as “replicate weights”) will then be generated for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and each of the 50 jackknife replicates. The mean square error of the replicate estimates then provides a measure of the variance (standard error) of the survey statistic. Previous surveys, using similar sample designs, have yielded relative standard errors (i.e., coefficients of variation) in the range of 2 to 10 percent for most national estimates. Similar results are expected for this survey.
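As a schematic illustration of the replication approach, the sketch below assumes the 50 sets of replicate weights have already been formed and attached to each record; the scaling shown is the common JK1 form, and the actual replicate formation and scaling follow the sample design.

    import numpy as np

    def weighted_estimate(y, weights):
        # Weighted mean (or proportion, for a 0/1 item) of survey item y.
        return np.sum(weights * y) / np.sum(weights)

    def jackknife_se(y, full_weights, replicate_weights):
        # Standard error via jackknife replication: the scaled sum of squared
        # deviations of the replicate estimates from the full-sample estimate.
        # replicate_weights has shape (R, n); R = 50 for this survey.
        theta = weighted_estimate(y, full_weights)
        reps = np.array([weighted_estimate(y, w) for w in replicate_weights])
        R = len(reps)
        variance = (R - 1) / R * np.sum((reps - theta) ** 2)
        return np.sqrt(variance)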

PEQIS 18 Postsecondary Institution Survey

Respondent Universe and Statistical Methodology

This survey will be sent to approximately 1,600 postsecondary institutions in the PEQIS panel. The sampling frame for the PEQIS panel was constructed from the Integrated Postsecondary Education Data System (IPEDS) “Institutional Characteristics” file. Institutions eligible for the PEQIS frame included 2-year and 4-year (including graduate-level) Title IV-eligible degree-granting institutions located in the 50 states and the District of Columbia: a total of approximately 4,300 institutions.

The PEQIS sampling frame was stratified by instructional level (4-year, 2-year), control (public, private not-for-profit), highest level of offering (doctor’s/first professional, master’s, bachelor’s, less than bachelor’s), and total enrollment. Within each stratum, institutions were sorted by region (Northeast, Southeast, Central, West) and by whether the institution had a relatively high minority enrollment. The sample of institutions was allocated to the strata in proportion to the aggregate square root of total enrollment. Institutions within a stratum were sampled with equal probabilities of selection.



1 Florida Department of Education. (2004). Impact of Dual Enrollment on High Performing Students. Tallahassee, FL: Florida Department of Education.

2 Hoffman, N. (2005). Add and Subtract: Dual Enrollment as a State Strategy to Increase Postsecondary Success for Underrepresented Students. Boston, MA: Jobs for the Future.

3 Dounay, J. (2008). Dual Enrollment. Denver, CO: Education Commission of the States.

4 Karp, M., Calcagno, J.C., et al. (2007). The Postsecondary Achievement of Participants in Dual Enrollment: An Analysis of Student Outcomes in Two States. St. Paul, MN: National Research Center for Career and Technical Education.

5 Hoffman, N., Vargas, J., and Santos, J. (2008). On Ramp to College: A State Policymaker’s Guide to Dual Enrollment. Boston, MA: Jobs for the Future.

6 Jobs for the Future. (2010). Early College High School: A Portrait in Numbers. Boston, MA: Jobs for the Future.

7 This estimate is the average amount of time secondary school staff respondents reported the questionnaire took them to complete during the pretest.

8 This estimate is the average amount of time postsecondary respondents reported the questionnaire took to complete during the two pretest rounds.


