Mandatory Civil Rights Data Collection

OMB: 1870-0504




FOR PAPERWORK REDUCTION ACT SUBMISSION

Mandatory Civil Rights Data Collection

May 2015



Supporting Statement, Part B: Collections of Information Employing Statistical Methods

ICRAS ICR ID and OMB Number: (XXXX.XX) 1870-0504

Revised XX/XX/XXXX

RIN Number: XXXX-XXXX (if applicable)

  1. Describe the potential respondent universe (including a numerical estimate) and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, state and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

The 2015–16 Civil Rights Data Collection (CRDC) will be a universe collection of all public schools and school districts. Therefore, no sampling procedures (e.g., stratification, estimation) are proposed for the 2015–16 CRDC administration.

However, if budget constraints are such that a universe collection is not possible, the Office for Civil Rights (OCR) will select a sample of public school districts. In the event that a sample will be selected for the 2015–16 CRDC, OCR will submit an updated sample selection plan to OMB for review and approval. OCR is providing information regarding a potential sampling frame and methodology for the 2015–16 CRDC, below.

Respondent Universe

The respondent universe for the 2015–16 CRDC, and the potential sampling frame, will be the most recently available data from the Common Core of Data (CCD) National Public Education Survey of Local Educational Agencies (LEAs). The CCD is designed to be the Department of Education’s (ED) comprehensive statistical database of all public schools and school districts. Most of the data are obtained from administrative reports maintained by state educational agencies (SEAs). The CCD survey is conducted annually by the National Center for Education Statistics (NCES). The frame for the CCD survey includes regular, non-regular (special education, alternative, vocational, or technical), and public charter schools. The sample selection for the 2015–16 CRDC, if any, should occur before the beginning of the 2015–16 school year in order to notify school districts of their selection. As a result, OCR will use the most recently available CCD data (possibly from the 2014–15 school year) as the basis for the CRDC sampling frame.

For the CRDC and the CCD, an eligible school is defined as an institution that provides educational services and:

  • has one or more grade groups (PK through grade 12) or is ungraded,

  • has one or more teachers,

  • is located in one or more buildings,

  • has assigned administrator(s),

  • receives public funds as its primary support, and

  • is operated by an educational agency.

Note: For purposes of this definition, “public funds” includes federal, state, and local public funds. “Located in a building” does not preclude virtual schools, since the administrators and teachers are located in a building somewhere. An “educational agency” is not limited to the state or local educational agency; it can include other agencies (e.g., corrections or health and human services) charged with providing public education services.

Frame Additions and Deletions

While the CRDC definition of a school matches that used by the CCD, there are a few operational differences. In some instances, schools in the CCD are essentially administrative units that may oversee entities that provide classroom instruction, or the school in the CCD may provide funding or oversight only. The CRDC is primarily designed to collect data from public school districts about educational entities where students receive educational services for at least 50 percent of the school day, regardless of whether students are reported elsewhere for funding, accountability, or other reporting purposes. To be eligible to participate in the CRDC, schools must serve students at the site for at least 50 percent of the school year. Since the CCD and CRDC differ slightly in scope, some records are deleted, added or modified in order to provide better coverage and a more efficient sample design for the CRDC. The following types of school records are deleted from the CCD during the creation of the sampling frame:

  • District boundary type 2: Agency has closed with no effect on another agency’s boundaries

  • School status 2: School has closed since the time of the last report

  • Nonoperational school districts: School districts that do not operate a school

  • Schools and school districts with a Federal Information Processing Standards (FIPS) state code of 58 (overseas DoD), 60 (American Samoa), 66 (Guam), 69 (Northern Marianas), 72 (Puerto Rico), or 78 (U.S. Virgin Islands).

Districts with no membership or missing membership at the district level are generally excluded, except in some special cases, such as where membership data were available for the associated schools.

Note that if the most recent CCD data for use in the sampling frame are from the 2014–15 school year, the CRDC will continue to include schools that are temporarily closed and may reopen in 3 years (status 6) and schools that may be operational within 2 years (status 7).
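The frame-cleaning rules above amount to a record-by-record filter over the CCD files. The sketch below is purely illustrative: the field names (boundary_type, school_status, fips) are hypothetical stand-ins, not the actual CCD layout, and it covers only the deletion rules listed in this section.

```python
# Hypothetical sketch of the CCD frame-cleaning rules described above.
# Field names (boundary_type, school_status, fips) are illustrative;
# the actual CCD file layouts differ.

EXCLUDED_FIPS = {58, 60, 66, 69, 72, 78}  # overseas DoD and territories

def eligible_for_frame(record):
    """Return True if a CCD record survives the deletions listed above."""
    if record.get("boundary_type") == 2:     # agency closed, no boundary effect
        return False
    if record.get("school_status") == 2:     # school closed since last report
        return False
    if record.get("fips") in EXCLUDED_FIPS:  # out-of-scope jurisdictions
        return False
    # Statuses 6 (temporarily closed) and 7 (future) are retained.
    return True

records = [
    {"id": "A", "boundary_type": 1, "school_status": 1, "fips": 36},
    {"id": "B", "boundary_type": 2, "school_status": 1, "fips": 36},
    {"id": "C", "boundary_type": 1, "school_status": 2, "fips": 36},
    {"id": "D", "boundary_type": 1, "school_status": 6, "fips": 72},
]
frame = [r for r in records if eligible_for_frame(r)]  # only record "A" survives
```

In this toy input, record B fails the boundary-type rule, C fails the school-status rule, and D carries an excluded FIPS code, so only A enters the frame.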

Additionally, OCR augments the CRDC frame with justice facilities, which may not be under the purview of the SEA or an LEA. In collaboration with the Department of Justice Office for Juvenile Justice and Delinquency Prevention (OJJDP), OCR adds justice facilities which may not have been otherwise included in the CCD to ensure coverage of all youth in pre- or post-adjudication facilities that receive educational services. For the 2015–16 CRDC, OCR plans to include the 50 largest juvenile justice detention facilities to further ensure the inclusion of these youth. Also, state-operated programs for special populations of students (such as schools for the deaf and schools for the blind) are added to the universe, if they are not already included in the CCD list.

It is estimated that the sampling frame for the 2015–16 CRDC will contain approximately 18,000 school districts. Below is the previous sample selection table used in planning for the 2013–14 CRDC:



Enrollment stratum     2011–12 Universe Total
1–300                  4,158
301–3,000              8,969
3,001–5,000            1,416
5,001–25,000           1,678
25,001+                282

Response Rate

The CRDC traditionally has a high response rate due to the mandatory nature of the data collection; the typical response rate is between 98 and 100 percent. For the 2009–10 CRDC, 100 percent of sampled school districts provided data, representing 48 percent of all school districts serving students in 2009–10. For the 2011–12 CRDC, a universe collection, 98 percent of all school districts and 99 percent of all schools provided data. The response rate for the 2015–16 CRDC is expected to be no less than 98 percent of all participating school districts.

  2. Describe the procedures for the collection of information, including:

  • Statistical methodology for stratification and sample selection

  • Unusual problems requiring specialized sampling procedures


The 2015–16 CRDC will include a universe of all public schools and school districts. Therefore, no sampling procedures (e.g., stratification, estimation) are proposed for the 2015–16 CRDC administration. However, if budget constraints are such that a universe collection in 2015–16 is not possible, OCR will proceed as follows:

Statistical Methodology for Stratification and Sample Selection

The CRDC, which has collected data from school districts since 1968, has generally included a sample of approximately 6,000 school districts. For the 1976, 2000, and 2011–12 CRDCs, data were collected from a universe of school districts. In 2009–10, the sample size was increased to approximately 7,000 school districts, expanding the coverage to include all school districts with enrollments of over 3,000 students, long-term secure juvenile justice facilities, and state-operated programs. The increased sample size provided a significantly more comprehensive picture of the access to equal educational opportunities for all students.

For CRDC sample years since the 1998 CRDC, districts were selected using a rolling stratified sampling method that ensured a representative group from each state was included in the data collection. Under this design, each subsequent sample was selected in a way that minimized overlap with the preceding samples to the extent feasible. The procedures used to accomplish this objective are designed to avoid introducing biases and inefficiencies that would otherwise occur by simply excluding prior sample selections from the current sampling process.

This approach used strata divided by size of district, with sub-strata of high/low minority enrollment. For the 1998, 2002, 2004, and 2006 CRDC samples, strata were defined by the following school district enrollment sizes: 1–300 students, 301–3,000 students, 3,001–5,000 students, 5,001–25,000 students, and 25,001 or more students. The percentage of districts selected varied by state, inversely related to a factor based on the number of districts and the enrollment (i.e., states with fewer districts and students generally had a larger percentage selected to ensure adequate representation for statistical reliability). For the 2009–10 CRDC, OCR included all districts with more than 3,000 students. Additionally, some districts are always in the sample, such as all districts in states with fewer than 25 regular public school districts, school districts with more than 25,000 students, and districts subject to federal court orders that are monitored by the U.S. Department of Justice. These “certainty districts” were removed from the sampling frame, and the sample was drawn from the reduced sampling frame. All schools within selected districts were included in the CRDC sample.
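The selection logic described above (certainty districts pulled first, then per-stratum selection from the reduced frame) can be sketched as follows. This is an illustrative simplification, not OCR's actual procedure: the per-stratum sampling rates, the district fields, and the omission of the high/low-minority sub-strata and state-level allocation are all assumptions for the sake of a short example.

```python
import random

# Illustrative sketch (not OCR's actual procedure) of stratified selection
# with certainty districts removed from the frame before sampling.

STRATA = [(1, 300), (301, 3000), (3001, 5000), (5001, 25000), (25001, float("inf"))]

def stratum_of(enrollment):
    """Return the index of the enrollment stratum, or None if out of range."""
    for i, (lo, hi) in enumerate(STRATA):
        if lo <= enrollment <= hi:
            return i
    return None

def select_sample(districts, rates, seed=0):
    """districts: dicts with 'enrollment' and a 'certainty' flag.
    rates: hypothetical per-stratum sampling fractions."""
    rng = random.Random(seed)
    # Certainty districts (e.g., enrollment above 25,000 or court-ordered)
    # are taken with probability 1 and removed from the frame.
    certainty = [d for d in districts if d["certainty"] or d["enrollment"] > 25000]
    frame = [d for d in districts if d not in certainty]
    sample = list(certainty)
    for i in range(len(STRATA)):
        members = [d for d in frame if stratum_of(d["enrollment"]) == i]
        k = round(rates[i] * len(members))
        sample.extend(rng.sample(members, k))
    return sample

districts = (
    [{"id": f"c{i}", "enrollment": 30000, "certainty": False} for i in range(2)]
    + [{"id": f"s{i}", "enrollment": 1000, "certainty": False} for i in range(10)]
)
sample = select_sample(districts, rates=[0.5, 0.5, 0.5, 0.5, 1.0])
# Both large districts enter with certainty; half of the ten small
# districts are drawn from their stratum, for a sample of 7.
```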

In collaboration with NCES, OCR will review the past sample design to determine its adequacy in meeting the needs for state and national estimations of school and student counts. For the 2015–16 CRDC sample, if any, OCR and NCES may consider alternative thresholds for stratum and sampling with certainty that are necessary to achieve reliable state and national estimations. Additionally, estimated percentages of particular student and school characteristics are an important feature of the CRDC. OCR and NCES will also collaborate on whether there is a need to oversample specific student populations or school types, such as charter schools, which are of interest to ED. The goal of the sample design will be to ensure sufficient numbers for precise state and national estimates.

Based on currently available data for the 2011–12 SY, OCR estimates that the final 2015–16 CRDC sample, if any, will include no more than:

  • 9,500 school districts, and

  • 86,000 schools.

Procedures for the Collection of Information:

Verification and updates of school lists by selected LEAs: Fall 2015

Once selected for the CRDC, school districts are notified by standard mail of their required participation in the collection and asked to verify and update their list of schools and to provide a primary point of contact for the survey (previous letters are available at http://crdc.ed.gov/LEA/AdditionalResources.aspx). Links to survey questionnaires and supporting documents are also included in the initial mailing to school districts. A Web-based system is available for school districts to provide contact information for a principal point of contact. School districts may also verify their school list, add new schools that opened at the beginning of the 2015–16 school year, collapse schools that have merged, or remove schools that have closed.

In past surveys, OCR has encountered discrepancies between the definition of a school as held by the school district itself and as reported by state departments of education to the CCD. This issue occurs most often in rural areas or in schools that offer grades K-12 in one building with one head principal. The schools often consider themselves one cohesive unit while the state does not; for accounting or other administrative purposes, the state may artificially split these schools by grade level and report them as two or three separate schools. For purposes of the CRDC, the K-12 school with one principal can be collapsed into one reporting entity and report all students enrolled in grades K-12.

Follow-up for non-responding school districts begins approximately three weeks after the initial mailing. Telephone follow-up and reminder letters will be sent to school districts that do not provide a principal point of contact or verify their school list.

Data collection by selected LEAs: Fall 2015-Summer 2016

After identifying a principal point of contact and a verified list of participating schools, OCR provides frequent training opportunities for school districts to understand the data elements collected on the CRDC and the survey submission process. Webinars, frequently asked questions and short tip sheets are circulated via email and posted to the CRDC informational website (crdc.ed.gov). This correspondence to school districts also serves as a reminder of their obligation to collect the required data over the course of the school year. A support center is available to school districts to call or email questions regarding the content of the data to be collected. During this time, training on preparing files for submission occurs. OCR has also provided pre-collection tools for school districts to gather and prepare flat files of the required data to prepare for the survey submission opening.

Survey Submission: Summer 2016

The survey submission window opens with email notification to all participating school districts. School districts are typically given a minimum of 3 months to submit their data to OCR. In anticipation of the survey submission system opening, OCR and its contractors provide webinar training on using the survey submission website beginning two weeks prior to opening and continuing three weeks after the opening of the survey submission system. These trainings are also posted on the CRDC informational website for school districts to access at any point during the data collection cycle. During the survey submission period, frequent communication occurs with participating school districts to offer technical assistance and, as the survey due date approaches, reminders are sent to school districts that have not yet certified their CRDC submission.

  • Estimation procedure.

If a sample is used for the 2015–16 CRDC, OCR will develop state and national estimations. As in the past, after the 2015–16 collection, statistical estimates for most items will be calculated for the nation and for each state. To yield statistically sound conclusions for the universe, OCR will base the calculations on the methodology used for selecting the sample, on in-depth analysis of non-responding school districts and/or schools, and on missing items within surveys. Data will be weighted to compensate for schools that did not provide usable data, and values will be imputed on an item-by-item basis to compensate for item non-response. Final weighting and estimation procedures will depend on the sample selection plan for the 2015–16 CRDC.

Calculation of weights

In past CRDC samples, school districts were weighted by the inverse of the probability of selection. School and school district weights were then adjusted based on final item and survey response rates and any other sampling considerations that arose after the sample was drawn. For the 2015–16 CRDC, if a sample is used, replicate weights will be attached to each surveyed school so that the weighted data will represent state and national totals. The final, adjusted weights used to estimate state and national student count totals will ensure that the sum of the weighted membership data (DG979) matches the number of students derived from the 2015–16 CCD public school universe file.
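The two weighting steps above (base weight as the inverse of the selection probability, then an adjustment so weighted membership matches a control total such as the CCD universe count) can be sketched as a minimal example. The numbers and field names here are illustrative only; the actual CRDC adjustment also accounts for response rates and uses replicate weights.

```python
# Hedged sketch of base weighting plus a ratio adjustment so that weighted
# membership matches a control total (e.g., the CCD universe count).
# All values and field names are illustrative.

def base_weight(selection_probability):
    """Base weight is the inverse of the probability of selection."""
    return 1.0 / selection_probability

def ratio_adjust(schools, control_total):
    """Scale weights so sum(weight * membership) equals the control total."""
    weighted = sum(s["weight"] * s["membership"] for s in schools)
    factor = control_total / weighted
    for s in schools:
        s["weight"] *= factor
    return schools

schools = [
    {"membership": 500, "weight": base_weight(0.25)},  # base weight 4.0
    {"membership": 300, "weight": base_weight(0.50)},  # base weight 2.0
]
# Unadjusted weighted membership is 4*500 + 2*300 = 2,600 students.
# Suppose the control total from the CCD universe file is 3,900 students:
ratio_adjust(schools, 3900)
# Each weight is scaled by 3900/2600 = 1.5, so weighted membership now
# sums exactly to the control total.
```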

  • Degree of accuracy needed for the purpose described in the justification.

Accuracy

The CRDC is intended to collect information about educational equity and excellence in public elementary and secondary education. Although respondents are experts in the educational opportunities and participation in their school districts, there are opportunities for potential error, either through technical mishap or misinterpretation of the intent of a survey item. Over the course of several CRDC administrations, OCR has developed and continues to develop a series of checks designed to flag these errors for review by the respondent and OCR. These edits rely on internal logic checks, consistency within specific tolerances, and comparisons to similar data collected by other program offices within ED.

Imputations

Because the CRDC is a mandatory collection, respondents are required to provide data for each applicable item. (See Attachment A-4 for more details about how directional indicators are used to determine item applicability.) In rare cases, a school district may not be able to respond with complete and accurate data to a specified item. When there is item nonresponse, the item may be imputed for the 2015–16 CRDC reporting of state and national totals.
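The document does not specify the imputation method; as one common illustration of item-level imputation, a missing value can be filled with the mean reported by similar respondents (here, respondents in the same enrollment stratum). The function and field names below are hypothetical.

```python
# Hypothetical item-imputation sketch: fill a missing item with the mean
# reported by similar respondents (same group, e.g. enrollment stratum).
# The actual CRDC imputation rules are not specified in this document.

def impute_item(records, item, group_key):
    """Replace missing values of `item` with the group mean of respondents."""
    groups = {}
    for r in records:
        if r.get(item) is not None:
            groups.setdefault(r[group_key], []).append(r[item])
    means = {g: sum(v) / len(v) for g, v in groups.items()}
    for r in records:
        if r.get(item) is None:
            # Assumes at least one respondent exists in each group.
            r[item] = means[r[group_key]]
    return records

records = [
    {"stratum": 1, "x": 10},
    {"stratum": 1, "x": None},  # item nonresponse
    {"stratum": 2, "x": 8},
    {"stratum": 1, "x": 20},
]
impute_item(records, "x", "stratum")
# The missing value is filled with the stratum-1 mean, (10 + 20) / 2 = 15.0.
```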

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

The Department has historically conducted this survey biennially to reduce burden.

  3. Describe methods to maximize response and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


Historically, the civil rights survey has had a very high response rate. In 2000, the predecessor Elementary and Secondary School Civil Rights Compliance Report (E&S Survey) was sent to a universe of all school districts and schools in the United States. The overall response rates were 97 percent of all school districts and 99 percent of all schools. The overall response rates for the 2002 E&S Survey were 98 percent for school districts and 98 percent for schools. For the 2004 CRDC, the response rates, including partial respondents to the data collection, were approximately 97 percent of all districts, and 97 percent of all schools. The 2006 CRDC achieved an unprecedented 100 percent response rate for school districts and a 99.6 percent response rate for schools. The 2009–10 CRDC achieved a response rate of 100 percent of school districts, and 100 percent of schools. The 2011–12 CRDC achieved a response rate of 98 percent of school districts and 99 percent of schools.

Methods to maximize response rates

Frequent communications occur with participating school districts over the course of the data collection to ensure compliance with this statutorily mandated collection. School districts are notified by standard mail of their obligation to report. In addition, frequent email correspondence occurs with the school district’s primary point of contact regarding technical assistance available to support the school districts submitting the required data, reminders of upcoming deadlines, and notifications if the CRDC was not submitted by the due date. If school districts fail to respond in a timely manner, the contractor for the data collection, with assistance from OCR and its field offices as necessary, provides extensive outreach and assistance to the greatest extent possible until the districts respond, or the final deadline for accepting data has passed. The superintendents of non-responding school districts are also contacted by phone, email, and standard mail. This has proven to be very successful in past years.

  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

OCR plans to test the data collection procedures and data items described in this submission in a number of ways. Many of the data elements requested have already been collected in the previous 2011–12 CRDC and 2013–14 CRDC. However, data quality is an overriding concern that OCR continues to assess and evaluate. OCR and NCES are assessing relevant data from CRDC survey years to evaluate the internal and external consistency and reliability of the reported data to continuously improve the business rules and edit checks used in the survey submission system. Edit checks currently help to identify potential problems and provide opportunities for school districts to correct possible mistakes before certifying the accuracy of their submission.

To support continued improvement of data quality in school districts and ensure the procedures for the survey are designed to minimize the burden on sampled schools and districts, OCR and NCES partnered in the development of an enhanced survey submission system for the 2013–14 and 2015–16 data collections. The redesigned submission system will build on improvements made in the 2009–10 and 2011–12 collections that enhanced the ability of school districts to provide partial flat files and improved data accuracy. OCR and NCES plan to test and pilot the survey submission system in a number of ways.

Content.

The proposed data elements include: 1) items that were required for the 2011–12 CRDC and 2013–14 CRDC, and 2) new items that were previously approved as optional items for the 2013–14 CRDC and are mandatory items for the 2015–16 CRDC. In preparation for building the redesigned submission system, ED conducted recordkeeping visits with school districts to determine whether and how they presently collect data for the new mandatory and optional data groups. These sites reflected a diverse set of districts in terms of size, urbanicity, level of sophistication of SEA and LEA data systems, and programs offered. These site visits were done in conjunction with NCES and under a separate OMB clearance package. The visits also gathered information on ways in which the survey collection tool assisted in improving data quality through edit checks and other analytical procedures. Additionally, ED explored the feasibility of conducting up to 20 one-on-one cognitive interviews with SEA, LEA, and/or school respondents to validate proposed item wording. These interviews were conducted by phone and in person, as appropriate.

Survey Tool.

OCR and NCES developed an enhanced survey collection tool for the 2013–14 CRDC and 2015–16 CRDC. Testing plans for the tool closely mirror previous CRDC protocols. For the 2009–10 CRDC, ED invited several members of the CRDC technical working group to participate in some of the initial discussions with the contractor about layout and features of the tool. ED also pilot tested the online survey tool with LEAs prior to opening the survey tool for submissions by LEAs. The pilot LEAs had access to the online survey tool for five business days, during which the LEAs responded to the CRDC, entering previous-year or fictitious data, at their choice. For the 2013–14 CRDC, ED invited school districts that participated in site visits, several members of the CRDC technical working group, and SEAs that planned to pre-populate data elements for LEAs to participate in some initial discussions with the contractor about various features of the tool. ED also pilot tested the online survey tool for data submission with LEAs and SEAs. The pilot period lasted about two months.

For the 2015–16 CRDC, ED plans to expand the pilot to include approximately 40–50 LEAs, who will have access to the survey software for a period of no less than ten business days. ED will work with its selected contractor to compile LEA suggestions, categorized by level of criticality and feasibility. Based on these results, ED will identify changes, such as text refinements, and any critical technical issues for correction and resolution prior to the opening of the survey tool for all participating LEAs.

  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other persons who will actually collect and/or analyze the information for the agency.


OCR has consulted with NCES regarding the preliminary plans for a sample for the 2015–16 CRDC. However, OCR intends the 2015–16 collection to be a universe of public schools and school districts. If budget constraints are such that a universe collection for the 2015–16 CRDC is not possible, OCR will work closely with NCES to finalize the sample selection plan and submit it to OMB for review and approval.
