ICRAS ICR ID and OMB Number: (1974.01) XXXX-XXXX
Revised: XX/XX/XXXX
RIN Number: XXXX-XXXX (if applicable)
FOR PAPERWORK REDUCTION ACT SUBMISSION
Mandatory Civil Rights Data Collection
June 2013
Describe the potential respondent universe (including a numerical estimate) and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, state and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the proposed sample and for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
The 2013-14 CRDC will be a universe collection of public schools and school districts. Therefore, no sampling procedures (stratification, estimation, etc.) will be used in the 2013-14 CRDC administration. For the 2013-14 CRDC, the National Center for Education Statistics (NCES), in collaboration with OCR, will redesign the collection tool and process to both reduce the burden on school districts and improve data accuracy, reliability, and quality.
OCR intends the 2015-16 collection to also be a universe of public schools and school districts. However, if budget constraints are such that a universe collection in 2015-16 is not possible, OCR will conduct a sample of public school districts. In the event that a sample will be selected for the 2015-16 CRDC, OCR will submit an updated sample selection plan to OMB for review and approval. OCR is providing information regarding a potential sampling frame and methodology for the 2015-16 CRDC, below.
Respondent Universe
The respondent universe of the 2013-14 CRDC and the sampling frame for the 2015-16 CRDC will be the most recently available data from the Common Core of Data (CCD) National Public Education Survey of Local Educational Agencies (LEA). The CCD is designed to be the Department of Education’s comprehensive statistical database of all public schools and school districts. Most of the data are obtained from administrative reports maintained by state education agencies (SEAs). The CCD survey is collected annually by NCES. The frame for the CCD survey includes regular, non-regular (special education, alternative, vocational, or technical), and public charter schools. The sample selection for the 2015-16 SY CRDC must occur before the beginning of the 2015-16 school year in order to notify school districts of their selection. As a result, OCR will use the most recently available CCD data (possibly from the 2014-15 school year) as the basis for the CRDC sampling frame.
For the CRDC and the CCD, an eligible school is defined as an institution that provides educational services and:
has one or more grade groups (PK through 12 grade) or is ungraded,
has one or more teachers,
is located in one or more buildings,
has assigned administrator(s),
receives public funds as its primary support, and
is operated by an educational agency.
Note - For purposes of this definition, “public funds” includes federal, state, and local public funds. “Located in a building” does not preclude virtual schools, since the administrators and teachers are located in a building somewhere. An “education agency” is not limited to the state or local educational agency, but can include other agencies (e.g., corrections or health and human services) charged with providing public education services.
Frame Additions and Deletions
While the CRDC definition of a school matches that used by the CCD, there are a few operational differences. In some instances, schools in the CCD are essentially administrative units that may oversee entities that provide classroom instruction, or the school in the CCD may provide funding or oversight only. The CRDC is primarily designed to collect data from public school districts about educational entities where students receive educational services for at least 50 percent of the school day, regardless of whether students are reported elsewhere for funding, accountability, or other reporting purposes. To be eligible to participate in the CRDC, schools must serve students at the site for at least 50 percent of the school year. Since the CCD and CRDC differ slightly in scope, some records are deleted, added or modified in order to provide better coverage and a more efficient sample design for the CRDC. The following types of school records are deleted from the CCD during the creation of the sampling frame:
District boundary type 2: Agency has closed with no effect on another agency’s boundaries
School status 2: School has closed since the time of the last report
Nonoperational school districts: School districts that do not operate a school
Schools and school districts with a Federal Information Processing Standards (FIPS) state code of 58 (overseas DoD), 60 (American Samoa), 66 (Guam), 69 (Northern Marianas), 72 (Puerto Rico), or 78 (U.S. Virgin Islands).
Districts with no membership or missing membership at the district level are generally excluded, except in some special cases, such as where membership data were available for the associated schools.
Note that if the most recent CCD data for use in the sampling frame are from the 2014-15 school year, the CRDC will continue to include schools that are temporarily closed and may reopen in 3 years (status 6) and schools that may be operational within 2 years (status 7).
Additionally, OCR augments the CRDC frame with justice facilities, which may not be under the purview of the SEA and therefore may not appear in the CCD. In collaboration with the Department of Justice Office of Juvenile Justice and Delinquency Prevention (OJJDP), OCR adds these facilities to ensure coverage of all youth in pre- or post-adjudication facilities that receive educational services. Also, state-operated programs for special populations of students (such as schools for the deaf and schools for the blind) are added to the universe if they are not already included in the CCD list.
It is estimated that the sampling frame for the 2015-16 school year CRDC will contain approximately 18,000 school districts. Below is the previous sample selection table used in planning for the 2009-10 CRDC:
Enrollment stratum | Total in Universe | Total in Sample
1 – 300* | 3,044 | 610
301 – 3,000 | 7,763 | 2,887
3,001 – 5,000 | 1,456 | 1,456
5,001 – 25,000 | 1,701 | 1,701
25,001+ | 275 | 275
Response Rate
The CRDC traditionally has a high response rate, typically between 98 and 100 percent, due to the mandatory nature of the data collection. In 2009-10, 100 percent of participating school districts provided data for the CRDC, representing 48 percent of all school districts serving students in 2009-10. For the 2011-12 CRDC, OCR anticipates a response rate of at least 98 percent, and the anticipated response rate for the 2015-16 CRDC is likewise no less than 98 percent.
Describe the procedures for the collection of information, including:
Statistical methodology for stratification and sample selection
Unusual problems requiring specialized sampling procedures
The 2013-14 CRDC will include a universe of all public schools and school districts. Therefore, no sampling procedures (stratification, estimation, etc.) will be used in the 2013-14 CRDC administration.
Statistical Methodology for Stratification and Sample Selection
The CRDC, which has collected data from school districts (LEAs) since 1968, has generally included a sample of approximately 6,000 school districts. In 1976, 2000, and 2011-12, data were collected from a universe of school districts. In 2009-10, the sample size was increased to approximately 7,000 school districts, expanding the coverage to include all school districts with enrollments of over 3,000 students, long-term secure juvenile justice facilities, and state-operated programs. The increased sample size provided a significantly more comprehensive picture of the access to equal educational opportunities for all students.
In CRDC samples since 1998, districts were selected using a rolling stratified sampling method that ensured a representative group from each state was included in the data collection. Under this design, each subsequent sample was selected in a way that minimized overlap with the preceding samples to the extent feasible. The procedures used to accomplish this objective are designed to avoid introducing biases and inefficiencies that would otherwise occur by simply excluding prior sample selections from the current sampling process.
This approach used strata defined by district enrollment size, with sub-strata of high/low minority enrollment. For the 1998, 2002, 2004, and 2006 samples, strata were defined by the following school district enrollment sizes: 1-300 students, 301-3,000 students, 3,001-5,000 students, 5,001-25,000 students, and 25,001 or more students. The percentage of LEAs selected varied by state, inversely related to a factor based on the number of LEAs and the enrollment (i.e., states with fewer LEAs and students generally had a larger percentage selected to ensure adequate representation for statistical reliability). For the 2009-10 CRDC, OCR included all districts with more than 3,000 students. Additionally, some districts are always in the sample: all districts in states with fewer than 25 regular public school districts, school districts with more than 25,000 students, and districts subject to federal court orders that are monitored by the U.S. Department of Justice. These “certainty districts” were removed from the sampling frame, and the sample was drawn from the reduced frame. All schools within selected districts were included in the CRDC sample.
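The certainty-plus-stratified design described above can be sketched in a few lines of Python. This is a minimal illustration, not the actual CRDC selection program: the field names (`enrollment`, `court_order`), the per-stratum rates, and the omission of state-level and minority sub-stratification are all simplifying assumptions.

```python
import random

# Hypothetical frame: each record is a district with an enrollment count
# and a flag for court-ordered monitoring. Field names and rates are
# illustrative, not the actual CRDC design parameters.
def select_sample(frame, rate_by_stratum, seed=2016):
    def stratum(d):
        n = d["enrollment"]
        if n <= 300:
            return "1-300"
        if n <= 3000:
            return "301-3,000"
        if n <= 5000:
            return "3,001-5,000"
        if n <= 25000:
            return "5,001-25,000"
        return "25,001+"

    # Certainty districts are removed from the frame and always included
    # (the real design also takes all districts in small states).
    certainty = [d for d in frame
                 if d["enrollment"] > 25000 or d["court_order"]]
    remainder = [d for d in frame if d not in certainty]

    # Sample the remainder within each enrollment stratum at its rate.
    rng = random.Random(seed)
    sample = list(certainty)
    for name, rate in rate_by_stratum.items():
        members = [d for d in remainder if stratum(d) == name]
        sample.extend(rng.sample(members, round(len(members) * rate)))
    return sample
```

The point of removing the certainty districts before drawing the stratified sample is that their selection probability is exactly 1, so they should not dilute the within-stratum rates applied to the rest of the frame.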
In collaboration with NCES, OCR will review the past sample design to determine its adequacy in meeting the needs for state and national estimations of school and student counts. For the 2015-16 sample, OCR and NCES may consider alternative stratum thresholds and certainty-sampling criteria necessary to achieve reliable state and national estimations. Additionally, estimated percentages of particular student and school characteristics are an important feature of the CRDC. OCR and NCES will also collaborate on whether there is a need to oversample specific student populations or school types, such as charter schools, which are of interest to ED. The goal of the sample design will be to ensure sufficient numbers for precise state and national estimates. OCR anticipates that the 2015-16 CRDC sample data will be used extensively by its enforcement offices and by other offices in ED for programmatic and policy development purposes.
Based on currently available data for the 2011-12 SY, OCR estimates that the final 2015-16 CRDC sample will include no more than:
9,500 school districts, and
86,000 schools.
Procedures for the Collection of Information:
Verification and updates of school lists by selected LEAs: Fall 2015
Once selected for the CRDC, school districts are notified by standard mail of their required participation in the collection and asked to verify and update their list of schools and to provide a primary point of contact for the survey (previous letters are available at http://crdc.ed.gov/LEA/AdditionalResources.aspx). Links to survey questionnaires and supporting documents are also included in the initial mailing. A Web-based system is available for school districts to provide contact information for a principal point of contact. School districts may also verify their school list, add new schools that opened at the beginning of the 2015-16 school year, collapse schools that have merged, or remove schools that have closed. In past surveys, OCR has encountered discrepancies between the definition of a school as held by the school districts themselves and as reported by state departments of education to the CCD. This issue occurs most often in rural areas or in schools that offer grades K-12 in one building with one head principal. Such schools often consider themselves one cohesive unit, while the state, for accounting or other administrative purposes, may artificially split them by grade level and report them as two or three separate schools. For the purposes of the CRDC, a K-12 school with one principal can be collapsed into one reporting entity that reports all students enrolled in grades K-12.
Follow-up for non-responding school districts begins approximately three weeks after the initial mailing. Telephone follow-up and reminder letters will be sent to school districts that do not provide a principal point of contact or verify their school list.
Data collection by selected LEAs: Fall 2015-Summer 2016
After identifying a principal point of contact and a verified list of participating schools, OCR provides frequent training opportunities for school districts to understand the data elements collected on the CRDC and the survey submission process. Webinars, frequently asked questions and short tip sheets are circulated via email and posted to the CRDC informational website (crdc.ed.gov). This correspondence to school districts also serves as a reminder of their obligation to collect the required data over the course of the school year. A support center is available to school districts to call or email questions regarding the content of the data to be collected. During this time, training on preparing files for submission occurs. OCR has also provided pre-collection tools for school districts to gather and prepare flat files of the required data to prepare for the survey submission opening.
Survey Submission: Summer 2016
The survey submission window opens with email notification to all participating school districts. School districts are typically given a minimum of 3 months to submit their data to OCR. In anticipation of the survey submission system opening, OCR and its contractors provide webinar training about using the survey submission website beginning two weeks prior to opening and continuing three weeks after the opening of the survey submission system. These trainings are also posted on the CRDC informational website for school districts to access at any point during the data collection cycle. During the survey submission period, frequent communication occurs with participating school districts to offer technical assistance and, as the survey due date approaches, reminders are sent to school districts that have not yet certified their CRDC submission.
Estimation procedure.
If a sample is used for the 2015-16 CRDC, OCR will develop state and national estimations. As has been done in the past, after the SY 2015-16 collection, statistical estimations for most items will be calculated for the nation and for each state. OCR will base the calculations on the methodology used for selecting the sample and on in-depth analysis of non-respondent LEAs and/or schools and of missing items within surveys, to yield statistically sound conclusions for the universe. Data will be weighted to compensate for schools that did not provide usable data, and values will be imputed on an item-by-item basis to compensate for item non-response. Final weighting and estimation procedures will depend on the sample selection plan for 2015-16.
Calculation of weights
In past CRDC samples, school districts were weighted by the inverse of the probability of selection. School and school district weights were then adjusted based on final item and survey response rates and any other sampling considerations that arose after the sample was drawn. For the 2015-16 CRDC, replicate weights will be attached to each surveyed school so that the weighted data will represent state and national totals. The final, adjusted weights used to estimate state and national student count totals will ensure that the sum of the weighted membership data (DG979) matches the number of students derived from the 2015-16 CCD public school universe file.
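The two weighting steps described above, inverse-probability base weights followed by a ratio adjustment to an external control total, can be sketched as follows. This is a simplified illustration under assumed record layouts (the `weight` and `membership` fields are hypothetical), not the final replicate-weighting procedure.

```python
# Base weight: the inverse of a district's selection probability, so a
# district drawn with probability 1/4 represents four districts.
def base_weight(selection_prob):
    return 1.0 / selection_prob

# Ratio adjustment: scale school weights so that weighted membership
# reproduces an external control total (standing in here for the count
# derived from the CCD public school universe file).
def ratio_adjust(schools, control_total):
    weighted = sum(s["weight"] * s["membership"] for s in schools)
    factor = control_total / weighted
    return [{**s, "weight": s["weight"] * factor} for s in schools]
```

After the adjustment, the sum of weight times membership over all schools equals the control total by construction, which is the property the supporting statement requires of the final weights.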
Degree of accuracy needed for the purpose described in the justification.
Accuracy
The CRDC is intended to collect information about educational equity and excellence in public elementary and secondary education. Although respondents are experts in the educational opportunities and participation in their school districts, there are opportunities for potential error, either through technical mishap or misinterpretation of the intent of a survey item. Over the course of several CRDC administrations, OCR has developed and continues to develop a series of checks designed to flag these errors for review by the respondent and OCR. These edits rely on internal logic checks, consistency within specific tolerances, and comparisons to similar data collected by other program offices within ED.
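An internal logic check of the kind described above can be sketched as a small validation function: flag a record when subgroup counts fail to match the reported total within a tolerance. The field names and the tolerance convention are illustrative assumptions, not the actual CRDC business rules.

```python
# Hypothetical edit check: enrollment reported by race/ethnicity should
# sum to the school's reported total enrollment, within a relative
# tolerance (tolerance=0.0 flags any mismatch).
def check_enrollment_consistency(record, tolerance=0.0):
    subgroup_sum = sum(record["enrollment_by_race"].values())
    total = record["total_enrollment"]
    if abs(subgroup_sum - total) > tolerance * max(total, 1):
        return [f"Subgroup enrollment ({subgroup_sum}) does not match "
                f"reported total ({total})"]
    return []  # no flags: record passes this edit
```

In a submission system, flags like these would be surfaced to the respondent for review and correction before the district certifies its data, rather than silently rejecting the record.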
Imputations
Because the CRDC is a mandatory collection, respondents are required to provide data for each applicable question (see attachment A-4 for more details about how directional indicators are used to determine question applicability). In rare cases, a school district may not be able to respond with complete and accurate data to a specified data group. When there is item nonresponse, the item may be imputed for the 2013-14 and 2015-16 reporting of state and national totals.
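One simple form of item imputation consistent with the paragraph above is to fill a missing item with the mean of reporting records in the same stratum. This is a hedged sketch under assumed field names (`stratum`, and a missing value encoded as `None`); the actual imputation procedure for the 2013-14 and 2015-16 reporting may differ.

```python
from collections import defaultdict

# Stratum-mean imputation for a single item: compute the mean of the
# item among reporting records in each stratum, then substitute that
# mean wherever the item is missing, marking the record as imputed.
def impute_item(records, item, stratum_key="stratum"):
    sums, counts = defaultdict(float), defaultdict(int)
    for r in records:
        if r.get(item) is not None:
            sums[r[stratum_key]] += r[item]
            counts[r[stratum_key]] += 1
    for r in records:
        if r.get(item) is None:
            s = r[stratum_key]
            r[item] = sums[s] / counts[s] if counts[s] else 0
            r["imputed"] = True  # flag so imputed values are traceable
    return records
```

Flagging imputed values is important for downstream estimation, since imputed items contribute to totals but should be distinguishable from reported data.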
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The Department has historically conducted this survey biennially to reduce burden.
Describe methods to maximize response and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.
Historically the civil rights survey has had a very high response rate. In 2000, the predecessor Elementary and Secondary School Civil Rights Compliance Report (E&S Survey) was sent to a universe of all school districts and schools in the United States. The overall response rates were 97% of all school districts and 99% of all schools. The overall response rates for the 2002 E&S Survey were 98% for school districts and 98% for schools. For the 2004 CRDC, the response rates, including partial respondents to the data collection, were approximately 97% of all districts and 97% of all schools. The 2006 CRDC achieved an unprecedented 100% response rate for school districts and a 99.6% response rate for schools. The 2009-10 CRDC achieved a response rate of 100% of school districts and 100% of schools.
Methods to maximize response rates
Frequent communication occurs with participating school districts over the course of the data collection to ensure compliance with this statutorily mandated collection. School districts are notified by standard mail of their obligation to report. Frequent email correspondence with the school district’s primary point of contact covers the technical assistance available to support districts in submitting the required data, reminders of upcoming deadlines, and notifications if the CRDC was not submitted by the due date. If school districts fail to respond in a timely manner, the data collection contractor, with assistance from OCR and its field offices as necessary, provides extensive outreach and assistance until the districts respond or the final deadline for accepting data has passed. The superintendents of non-responding school districts are also contacted by phone, email, and standard mail. This approach has proven very successful in past years.
Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
OCR plans to test the data collection procedures and data items described in this submission in a number of ways. Most of the data elements requested have already been collected in the previous 2009-10 and 2011-12 CRDCs. However, data quality is an overriding concern that OCR continues to assess and evaluate. OCR and NCES are assessing relevant data from CRDC survey years to evaluate the internal and external consistency and reliability of the reported data to continuously improve the business rules and edit checks used in the survey submission system. Edit checks currently help to identify potential problems and provide opportunities for school districts to correct possible mistakes before certifying the accuracy of their submission.
To support continued improvement of data quality in school districts and ensure the procedures for the survey are designed to minimize the burden on sampled schools and districts, OCR and NCES are partnering in the development of an enhanced survey submission system for the 2013-14 and 2015-16 collection. The redesigned submission system will build on improvements made in the 2009-10 and 2011-12 collections that enhanced the ability of school districts to provide partial flat files and improved data accuracy. OCR and NCES plan to test and pilot the survey submission system in a number of ways.
Content.
ED intends to conduct recordkeeping visits with school districts to determine whether and how they presently collect data for the new data groups. The sites will reflect a diverse set of districts in terms of size, urbanicity, level of sophistication of SEA and LEA data systems, and programs offered. These site visits will be done in conjunction with NCES and under a separate OMB clearance package. These visits will also gather information on ways in which the survey collection tool can assist in improving data quality through edit checks and other analytical procedures. Additionally, ED is exploring the feasibility of conducting up to 20 one-on-one cognitive interviews with SEA, LEA, and/or school respondents to validate proposed item wording. These interviews may be conducted by phone and in person, as appropriate.
Survey Tool.
OCR and NCES will be developing an enhanced survey collection tool for the 2013-14 and 2015-16 CRDC. Testing plans for the tool closely mirror previous CRDC protocols. For the 2009-10 CRDC, ED invited several members of the CRDC technical working group to participate in some of the initial discussions with the contractor about layout and features of the tool. ED also pilot tested the online survey tool with LEAs prior to opening the survey tool for submissions by LEAs. The pilot LEAs had access to the online survey tool for five business days, during which the LEAs responded to the CRDC, entering previous-year or fictitious data, at their choice.
For the 2013-14 CRDC, ED plans to expand the pilot to include approximately 40-50 LEAs who will have access to the survey software for a period of no less than ten business days. ED will work with its selected contractor to compile LEA suggestions, categorized by level of criticality and feasibility. Based on these results, ED will identify changes, such as text refinements, and any critical technical issues for correction and resolution prior to the opening of the survey tool for all participating LEAs.
Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other persons who will actually collect and/or analyze the information for the agency.
OCR has consulted with the National Center for Education Statistics (NCES) regarding the preliminary plans for a sample for the 2015-16 CRDC. However, OCR intends the 2015-16 collection to also be a universe of public schools and school districts. If budget constraints are such that a universe collection in 2015-16 is not possible, OCR will work closely with NCES to finalize the sample selection plan to OMB for review and approval.
File Title | SUPPORTING STATEMENT |
Author | Kenneth Smith |
File Created | 2021-01-28 |