
Customer Survey, Evaluation of the Regional Educational Laboratories


Statement for Paperwork Reduction Act Submission


PART B: Collection of Information Employing
Statistical Methods


Contract ED-04-CO-0059/0031







November 2010

Updated March and June 2011





Prepared for

Institute of Education Sciences

U.S. Department of Education


Prepared by

Westat


Contents


Part B: Collection of Information Employing Statistical Methods

    B.1 Respondent Universe and Sampling Methods
    B.2 Information Collection Procedures
    B.3 Methods to Maximize Response Rates
    B.4 Test of Procedures
    B.5 Individuals Consulted on Statistical Aspects of Design


Part B: Collection of Information Employing Statistical Methods


B.1 Respondent Universe and Sampling Methods

The primary customers of the RELs, and thus the two groups of survey respondents, are employees of state educational agencies (SEAs) and local educational agencies (LEAs) in the RELs' respective geographic regions. These employees include both users and potential users of REL services and products. The potential respondent universes (which include current users) and the sampling methods for each group are described below.

SEA Respondent Universe and Sampling Methods

The RELs serve 56 SEAs: those of the 50 states, the District of Columbia (DC), Puerto Rico (PR), the Virgin Islands (VI), and the three territories of American Samoa (AS), the Commonwealth of the Northern Mariana Islands (MP), and Guam (GU). All 56 will be included in the SEA stratum.



The REL Pacific serves one state, Hawaii (HI); three territories, AS, GU, and MP; and three nations in free association with the United States: the Federated States of Micronesia, the Republic of the Marshall Islands, and the Republic of Palau. These three nations have neither state nor local educational agencies. However, because they are served by a REL, potential users from these nations are eligible for the survey and should be covered by it. The study team will cover the three nations in the SEA stratum, combining them into one unit referred to as the Pacific Nations. Another distinctive feature of the REL Pacific is that each SEA it serves has only one school district, so the SEAs and LEAs coincide; once these entities are covered in the SEA stratum, they do not need to be covered in the LEA stratum. The SEA for PR also has only one regular district and will likewise be covered only by the SEA stratum.


The study team will construct a sample frame of potential users for the SEA stratum using association lists and, if necessary, association websites. The potential users targeted for the survey include the chief state school officers; special education directors; Title I directors; bilingual education directors; and administrators in charge of certification and licensure, assessment, and professional development. The intent is not to cover every conceivable potential user but to cover the majority of them; some rare categories of potential users may be excluded if they are too difficult to identify.

The number of administrators in these positions across all SEAs served by the RELs is not expected to be large, so a census will be taken if the total does not exceed 300. If the total number of potential users exceeds 300, simple random samples will be drawn from the larger SEAs to keep the sample size at 300.

LEA Respondent Universe and Sampling Methods

The approach used to select the SEA sample cannot be used for the LEA sample because local association lists do not exist in some areas and, where they do exist, do not include all potential users. To select the sample of LEA respondents, the study team will use a two-stage design: LEAs are sampled first, and potential user lists for the sampled LEAs are then compiled from association lists, district websites, and direct contact with the sampled LEAs to identify individuals appropriate for the survey. This approach requires cluster sampling, which makes the design less efficient. The potential users in the LEA stratum include superintendents; assistant superintendents (e.g., for instruction); special education directors; Title I directors; bilingual education directors; and administrators in charge of certification and licensure, assessment, and professional development.


To build a sample frame of LEAs from which a sample of potential users can be selected, the study team will use the Common Core of Data (CCD) produced annually by the National Center for Education Statistics (NCES). Most of the 18,130 LEAs in the most recent CCD will be included in the frame. Excluded will be supervisory union administrative centers, which do not have any local school districts; Department of Defense (DOD) dependents schools located overseas; and agencies with no students in pre-kindergarten (PK) through grade 12. LEAs in HI, AS, GU, MP, and PR that are covered by the SEA stratum will also be excluded, as will districts serving very few students (fewer than 10). The frame will consist of all other types of LEAs, including those designated as domestic DOD and Bureau of Indian Education (BIE) agencies. After these exclusions, the LEA frame includes about 16,200 units.


To select the sample of potential users, list frames of potential users will be assembled for the LEAs selected in the first-stage sample using district websites, local association lists, and direct contact with the LEAs; the universe of such potential users across the full LEA frame is estimated at about 129,600. Exhibit B-1 shows the estimated number of potential respondents in the universe and in the sample for both the SEA and LEA strata.


Exhibit B-1. Stratum and corresponding estimated number of entities in the universe and in the sample


Stratum     Estimated number of entities in universe     Number in the sample

SEA         300 - 400                                     300

LEA         1st stage: 16,200 LEAs                        1,800 LEAs
            2nd stage: 129,600 LEA officials              14,400 LEA officials

B.2 Information Collection Procedures

SEA sample design


If the total number of potential users exceeds 300, the study team will draw simple random samples from the larger SEAs to keep the sample size at 300, using the following scheme based on the SEA size M, defined as the number of potential users in the SEA (a brief sketch of this allocation rule follows the list):

  • If M < 5, take all;

  • If 5 ≤ M < 10, take a simple random sample of 5;

  • If 10 ≤ M < 15, take a simple random sample of 7;

  • If M ≥ 15, take a simple random sample of 10.
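
The following is a minimal sketch of this allocation rule; the function names and the example list of officials are illustrative, not part of the study design.

```python
import random

def sea_sample_size(m):
    """Within-SEA allocation rule described above: take all potential users
    in very small SEAs, otherwise a simple random sample whose size depends
    on the SEA size M (the number of potential users)."""
    if m < 5:
        return m        # take all
    elif m < 10:
        return 5
    elif m < 15:
        return 7
    return 10

def sample_sea_users(users):
    """Draw the within-SEA sample of potential users."""
    n = sea_sample_size(len(users))
    return list(users) if n == len(users) else random.sample(users, n)

# Example: an SEA with 12 potential users yields a simple random sample of 7.
officials = ["official_%d" % i for i in range(12)]
print(len(sample_sea_users(officials)))   # prints 7
```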


LEA sample design


LEA size varies widely, so the study team will select the sample of potential users in two stages: 1,800 LEAs will first be selected with probability proportional to size (PPS) using an appropriate size measure, and a fixed number of potential users will then be selected from each sampled LEA by an equal-probability method such as simple random or systematic sampling. Because a reliable count of potential users in each LEA, the ideal size measure, is not available on the sample frame, the study team will use a proxy size measure believed to be highly correlated with the number of potential users. Selecting a fixed number of potential users per LEA stabilizes the sampling weights and thus improves sampling efficiency, provided that the proxy size measure is highly correlated with the number of potential users and that each sampled LEA has enough potential users to fill the fixed within-LEA sample size. The second condition will not always be met, because many small districts have only one or a few potential users. To address this, the study team will form three or four size strata within each REL region, depending on the size distribution of LEAs in the region; each size stratum will be allocated a sample proportional to its total size (student enrollment), and the number of LEAs selected in each stratum will be set so that the fixed number of potential users can be selected from the sampled LEAs. The size stratification makes this more achievable. A sketch of this two-stage selection appears after this paragraph.
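
Below is a minimal sketch of the two-stage selection, assuming a generic proxy size measure on the frame. The systematic PPS routine and the fixed within-LEA size of 8 (implied by 14,400 officials spread over 1,800 LEAs) are illustrative choices rather than the study team's specified procedures.

```python
import random

def pps_systematic(frame, n):
    """First stage: systematic probability-proportional-to-size (PPS)
    selection. `frame` is a list of (lea_id, size) pairs, where `size` is
    the proxy size measure; `n` is the number of LEAs to select. An LEA
    whose size exceeds the sampling interval can be hit more than once;
    such certainty units would be set aside in a real implementation."""
    total = sum(size for _, size in frame)
    interval = total / n
    start = random.uniform(0, interval)
    points = [start + k * interval for k in range(n)]
    selected, cum, i = [], 0.0, 0
    for lea_id, size in frame:
        cum += size
        while i < n and points[i] <= cum:
            selected.append(lea_id)
            i += 1
    return selected

def sample_officials(officials, k=8):
    """Second stage: equal-probability selection of a fixed number of
    potential users within a sampled LEA (k = 8 is illustrative)."""
    return list(officials) if len(officials) <= k else random.sample(officials, k)
```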


The self-reported number of administrators in the CCD is not a reliable size measure because LEAs follow different reporting practices, so a good proxy size measure is needed. Total student enrollment is reported reliably for regular districts and components of supervisory unions (type codes 1, 2, and 3 in the CCD),1 and the study team believes it is a good proxy size measure. Supervisory unions themselves do not report total enrollment on the CCD, but their component districts do; each supervisory union and its components will be treated as one unit whose size measure is the sum of the components' enrollments. For several other types of districts (type codes 4 through 8), student enrollment is not a good indicator of the number of potential users, and no other good proxy size measure is available on the frame. The study team will therefore use a different sampling strategy for districts with type codes 4-8. First, the LEA universe will be stratified into two super-strata: Super-stratum A for type codes 1-3 and Super-stratum B for type codes 4-8. In Super-stratum A, the study team will use the design described above. For Super-stratum B, the study team will use a two-phase design in which a larger-than-needed sample of LEAs is selected in the first phase, the number of potential users in each first-phase LEA is obtained through an internet search, and this actual size measure is used to stratify the first-phase sample and select the second-phase sample of the needed size. A sketch of this two-phase selection follows.
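
The sketch below illustrates the two-phase idea for Super-stratum B. The lookup function standing in for the internet search, the use of three phase-one size strata with equal allocation, and the example numbers are assumptions made for illustration only.

```python
import random

def two_phase_select(frame_b, n_phase1, n_phase2, lookup_user_count):
    """Two-phase selection for Super-stratum B (type codes 4-8).
    Phase 1: draw a larger-than-needed simple random sample of LEAs.
    Then obtain the actual number of potential users for each phase-1 LEA
    (`lookup_user_count` stands in for the internet search), sort by that
    count, form three size strata, and draw the phase-2 sample with
    (roughly) equal allocation across the strata."""
    phase1 = random.sample(frame_b, n_phase1)
    ranked = sorted(phase1, key=lookup_user_count)
    third = len(ranked) // 3
    strata = [ranked[:third], ranked[third:2 * third], ranked[2 * third:]]
    takes = [n_phase2 // 3] * 3
    takes[-1] += n_phase2 - sum(takes)   # put any remainder in the last stratum
    phase2 = []
    for stratum, k in zip(strata, takes):
        phase2.extend(random.sample(stratum, min(k, len(stratum))))
    return phase2

# Illustration: 80 phase-1 LEAs reduced to 40 in phase 2 (the per-REL targets
# in the text), using a made-up frame and user-count lookup.
frame_b = ["lea_%d" % i for i in range(300)]
counts = {lea: random.randint(1, 12) for lea in frame_b}
print(len(two_phase_select(frame_b, 80, 40, counts.get)))   # prints 40
```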

LEAs in Super-stratum B tend to have a smaller number of potential users, except for state- or federally operated special education institutions, and the number of such institutions is small. In terms of the number of LEAs, Super-stratum B comprises about 15 percent of the universe, so a strictly proportional allocation of the 1,800-LEA sample would give it 270 LEAs (30 per REL). For the reason explained above, the study team will instead increase this to 360 (40 per REL), which is the target for the second-phase sample; this will be doubled to 720 for the first-phase sample.2 Super-stratum A will therefore be given a sample size of 1,530, while Super-stratum B will be assigned 720 as the first-phase sample size, from which 360 LEAs will be selected in the second phase. The total initial sample will be 2,250 LEAs, which will be reduced to 1,800 when the second-phase sampling is finished.

For both Super-strata, the study team plans to establish three or four size strata within each REL as follows:

  • Large (Very Large and Large if four strata are used)

  • Medium

  • Small

The study team will set the cut points within each REL so that the size strata receive equal sample allocation; one way to compute such cut points is sketched below.
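
The sketch below assumes that "equal sample allocation," combined with allocation proportional to total stratum enrollment, means splitting each REL's LEAs at equal shares of cumulative enrollment; this interpretation and the example enrollment counts are assumptions, not the study team's stated procedure.

```python
def size_stratum_cutpoints(enrollments, n_strata=3):
    """Return enrollment cut points that divide a REL's LEAs into size strata
    with roughly equal total enrollment. Because each stratum's sample is
    allocated in proportion to its total size, roughly equal cumulative
    enrollment implies roughly equal sample allocation."""
    sizes = sorted(enrollments)
    total = sum(sizes)
    target = total / n_strata
    cuts, cum, next_share = [], 0, target
    for s in sizes:
        cum += s
        while len(cuts) < n_strata - 1 and cum >= next_share:
            cuts.append(s)
            next_share += target
    return cuts   # upper enrollment bounds of the Small and Medium strata

# Illustration with made-up enrollment counts for one REL.
print(size_stratum_cutpoints([50, 120, 300, 800, 1500, 2500, 4000, 6000, 9000, 12000]))
# prints [6000, 9000]
```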


Analysis will be focused at the REL level rather than the national level, so the precision requirement of the survey is set at the REL level. To determine the sample size per REL, the study team makes the following assumptions: (1) the target precision is 3 percentage points for an estimated population proportion of 50 percent for a potential-user characteristic, so the half-length of the 95 percent confidence interval is 6 percentage points; (2) the expected response rate is 80 percent; (3) the REL use rate (the percentage of actual users among potential users) is 40 percent, the assumption made by Mathematica Policy Research for the IES Analytic and Technical Support survey; and (4) the design effect under the final nonresponse-adjusted weights is 1.8.3


The determination of the sample size that meets the precision requirement starts with the calculation of the effective sample size,4 which is 278 (ignoring the finite population correction, to be conservative). Applying the assumed response rate and design effect yields a sample size of 626 per REL, and a grand total of 5,634 for the nine RELs covered by the LEA stratum (the REL Pacific is covered by the SEA stratum). To achieve this precision level for user characteristics as well, the use rate (40 percent) must be applied, increasing the needed sample size to 1,565 per REL and 14,085 overall; this is rounded up to 14,400 so that each REL receives 1,600 sampled LEA officials. Exhibit B-2 shows the sample sizes and corresponding precisions.5
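
Written out, and treating the 3-percentage-point precision target as the target standard error, the arithmetic behind these figures is:

```latex
n_{\mathrm{eff}} = \frac{p(1-p)}{SE^{2}} = \frac{0.5 \times 0.5}{0.03^{2}} \approx 278,
\qquad
n_{\mathrm{REL}} = \frac{n_{\mathrm{eff}} \times \mathrm{deff}}{\text{response rate}}
                 = \frac{278 \times 1.8}{0.80} \approx 626,
\qquad
9 \times 626 = 5{,}634;
\qquad
n_{\mathrm{REL,\,users}} = \frac{626}{0.40} \approx 1{,}565,
\qquad
9 \times 1{,}565 = 14{,}085 \;\longrightarrow\; 14{,}400 = 9 \times 1{,}600.
```

With 1,600 sampled officials per REL, the expected effective number of respondents is 1,600 x 0.80 / 1.8, or about 711, which gives a standard error of about 1.9 percentage points for potential-user estimates; the expected user yield is 1,600 x 0.80 x 0.40 = 512, which gives about 3.0 percentage points, consistent with Exhibit B-2.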


Exhibit B-2. Sample sizes and corresponding precisions for the two-stage design for the LEA stratum


REL sample size     Overall sample size        REL precision for        REL yield      REL precision
                    (nine RELs, LEA stratum)   potential users (%)a     for users      for users (%)a

1,600               14,400                     1.9                      512            3.0

a The margin of error (one half-length of the 95 percent confidence interval) is about 2 times the precision.



B.3 Methods to Maximize Response Rates

To help ensure a high response rate, the study team will first send the sampled potential respondents a letter on ED letterhead, signed by an official of the National Center for Education Evaluation and Regional Assistance (NCEE), that explains the nature of the evaluation and provides OMB clearance information and ED/Westat contact information (see attachment B). A letter from Westat will include the URL for the web survey and a username. The study team will send reminder emails or letters after 2 weeks and again after 4 weeks. Phone follow-up will be used for individuals who do not respond after the second email reminder or letter; at the same time, paper copies of the survey will be mailed to help ensure an adequate response rate. The study team will use a management database to track response rates, including partial completions, and to identify patterns of nonresponse.


Exhibit B-3 summarizes the strategies the study team will use to maximize response rates. With these strategies in place, the study team expects a response rate of at least 80 percent.



Exhibit B-3. Strategies to Maximize Response Rates


Provide clear instructions and user-friendly materials

  • Send personalized introductory letters from ED and Westat that explain the survey and what participation entails, provide assurance of confidentiality, and provide the web address and username along with instructions for completing the online survey.

Stress the brevity of the survey in initial correspondence and follow-up

  • Keep the survey very short and stress its brevity in communications with respondents.

Conduct non-response follow-up

  • Send reminder emails or letters after 2 weeks and again after 4 weeks.

  • Call and mail paper copies to individuals who do not respond after the second email or reminder letter.

Offer technical assistance for survey respondents

  • Provide toll-free technical assistance telephone number and email address.


Monitor progress regularly

  • Produce weekly data collection report of completed surveys

  • Maintain regular contact between study team members to monitor response rates, identify non-respondents, and resolve problems



B.4 Test of Procedures

The survey was tested by members of the Technical Work Group, and cognitive testing was conducted internally by Westat's Instrument Design, Evaluation and Analysis team. Information from this testing was used to refine the survey; refinements included rewording response options and adding a topic to items A1 and B6. The study team will conduct a pilot test with up to nine SEA and LEA respondents before finalizing the web version of the survey.


B.5 Individuals Consulted on Statistical Aspects of Design


These data collection plans were developed by Westat. The research team is led by Babette Gutmann, project director. Other members of the evaluation team who worked on the design include Elaine Carlson and Hyunshik Lee of Westat. The NCEE project officer, Jonathan Jacobson, also played a central role in developing the data collection plans. Contact information for these individuals is provided below.


Jonathan Jacobson

Institute of Education Sciences, U.S. Department of Education

202-208-3876


Babette Gutmann

Westat

301-738-3626


Elaine Carlson

Westat

301-251-4277


Hyunshik Lee

Westat

301-610-5112


1 Definition of CCD type code: 1=(Regular) Local school district that is not a component of a supervisory union; 2=Local school district component of a supervisory union sharing a superintendent and administrative services with other local school districts; 3=Supervisory administrative center or county superintendent serving the same purpose; 4=Regional education services agency or a county superintendent serving the same purpose; 5=State-operated institution charged, at least in part, with providing elementary and/or secondary instruction or services to a special-needs population; 6=Federally operated institution charged, at least in part, with providing elementary and/or secondary instruction or services to a special-needs population; 7=Agencies for which all associated schools are charter schools; 8=Other educational agencies that do not fit into the first seven categories.

2 This doubling factor for the first-phase sample size is obtained from the formulae given in Cochran (1975, pp. 330-331) for estimation of a population proportion under some simplifying assumptions, including a first-phase to second-phase unit cost ratio of 1:10.

3 This design effect is based on the assumption that seven potential users per LEA will respond, the intra-class correlation is at most 0.1, and the effect of unequal weights is mild.
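
For reference, one common approximation consistent with these assumptions (a sketch rather than the study team's stated calculation, with the unequal-weighting factor of about 1.125 assumed as "mild") is:

```latex
\mathrm{deff} \approx \bigl(1 + (\bar{m}-1)\rho\bigr)\,\bigl(1 + \mathrm{cv}^{2}(w)\bigr)
             \approx \bigl(1 + 6 \times 0.1\bigr) \times 1.125 = 1.8,
```

where \bar{m} = 7 is the average number of respondents per LEA, \rho = 0.1 is the intra-class correlation, and \mathrm{cv}^{2}(w) is the relative variance of the weights.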

4 The effective sample size is the sample size needed to meet a given precision requirement under simple random sampling. Even when the sample is selected by simple random sampling, the final weights may be unequal because of nonresponse adjustments, so the effective sample size may differ from the nominal sample size even for a simple random sample.

5 The study team recognizes that this is a conservative sample size for estimating user characteristics because the design effect will be smaller for them, and the precision could be better than 3 percent.
