2013-14 Civil Rights Data Collection (CRDC) Data Submission System Process Improvement and Feasibility Study

NCES Cognitive, Pilot, and Field Test Studies System

Attachments B-H CRDC 2013-14 Improvement Study Interview Protocols

OMB: 1850-0803

National Center for Education Statistics





Attachments B – G







2013-14 Civil Rights Data Collection (CRDC)

Data Submission System

Process Improvement and Feasibility Study

Protocols



OMB# 1850-0803 v.90











November 22, 2013




Attachment B: Introduction and Conclusion Text

Introduction



Thank you for taking the time to speak with us today. My name is __________, and my colleague, ___________, and I are members of the Sanametrix and American Institutes for Research (AIR) project team that is conducting interviews to evaluate and improve the Civil Rights Data Collection (CRDC) process. Sanametrix, located in Washington, DC, provides customized software development and technical consulting. AIR, also located in Washington, DC, is a research firm. We have been hired by the National Center for Education Statistics (NCES) to learn about your experiences with the CRDC so that we can improve the data collection tool and processes for future CRDC efforts. NCES has recently assumed responsibility for designing a new data collection tool for the CRDC in coordination with the Office for Civil Rights.

You were chosen to participate in this interview because we value your opinion and would like to hear your perspective about the CRDC. We recognize that everyone has different experiences with the CRDC and want to stress that there are no right or wrong answers. In order for us to advise NCES on how to improve the CRDC process, we need to hear your thoughtful and honest feedback.

I will be asking the questions, and _______________ will be taking notes. We would like to record our conversation to make sure that we capture all of the important information you share with us; the recording will serve as a backup to the notes that _______________ takes today. Is it okay for me to record you? Do you have any questions before we get started?


Conclusion



That concludes our questions at this time.

[If interview is complete and no follow-up is anticipated:

“Thank you so much for speaking with us today!”]

[If follow-up is needed:

“Thank you so much for speaking with us today. I will be sending you a follow-up email with the questions you weren’t able to answer at this time. Once you’ve had an opportunity to look up the information, we can schedule a phone call or you can provide the responses via email.”]

Attachment C: Site Visit Protocol – SEA respondent

Name:

Title:

Organization:

Phone:

Email:

Address:

[Confirm above respondent contact information, if necessary]



  1. Before we discuss any technical issues, we’d like to discuss a few background items.

    1. How would you describe your overall role in the SEA?

    2. What are your day-to-day activities?

    3. How would you describe your role for the CRDC?

    4. [If speaking to an SEA POC] How did you get selected as the CRDC point-of-contact? [If not POC, continue with sub-question (i)]

      1. Did you complete any special training or certification for CRDC data collection and entry?

      2. Are there any contractors or vendors you work with on the CRDC?

      3. What are the particular challenges you face in coordinating data collection?

    5. What current CRDC tools are useful to you or LEAs in your state (e.g., webinars, forms, Partner Support Center (PSC), templates, online tools)?

      1. Can you show us what tools you use or modified? Are tools bookmarked? Downloaded? In hardcopy? Posted on the SEA website?

The following questions are designed to assess the SEA’s ability to collect data. These questions are not evaluating the quality of the data; they are simply allowing us to gain an understanding of the data collection process.

  1. We would like to ask some general questions about CRDC data collection. Please describe the extent to which your agency is able to provide data for the CRDC, noting, in particular, any data elements that you cannot provide or have trouble providing.

[Provide the following prompts as needed]

    1. Which data elements are easy to provide?

    2. Do you encounter any specific problems with particular elements of data collection?

      1. Can you give us some examples of these problems?

    3. How would you describe your interaction with LEAs? (only for States that coordinate the collection of data with LEAs)

      1. Can you describe the process of communication with LEAs?

        1. Obtain any letters or email text that is used.

      2. What challenges, if any, do you experience with LEAs, including charter schools, state-operated programs, and juvenile justice facilities?

        1. What is the most significant challenge?

        2. Are there things that have worked well for you?



The following questions are designed to collect data on the process and cycle of data collection. Here we are interested in finding out how the SEA goes about collecting data for the CRDC reports. This information will allow us to understand the ways in which data collection cycles may affect data quality.



  1. One of the areas NCES is interested in is the extent to which there is variation in the CRDC activities across agencies. Can you explain, in your own words, what the data collection process is for you, from start to finish? As you walk me through this, please show me the forms and tools you use in the process.

[Collect copies and examples of forms and documents that you are shown]

[Provide the following prompts as needed]

    1. Who in the LEA contributes to the process?

      1. What are their roles?

    2. Are subject matter experts consulted for specific elements of the CRDC that are also collected by your state? For example, for athletics or school safety?

    3. For the CRDC data that is also collected by your state, are you collecting it just for the purpose of the CRDC or are there other reasons why the data are collected?

      1. For what reasons are these data collected?

      2. Is it part of the State Longitudinal Data System (SLDS)?

      3. For the school-level counts of children served by IDEA by disability category that your state submits to EDFacts and that become part of the CRDC, how do you ensure LEAs and schools in your state report students at the school they physically attend for more than 50% of the school day?

  1. Please tell us when and how often your agency collects CRDC-related data.

    1. What drives the decision about when to collect the data?

    2. Are the data collected in stages or in phases?

      1. Why is it done that way?

    3. In your opinion, when would be the best time for you to report your data for the CRDC?

      1. Would this differ depending on the data you need to report?

  2. We are also visiting __________ school district. Who is your data point-of-contact there? Is this the same as the CRDC contact? (only for states that have a district-level point of contact)

      1. How are district-level POCs assigned in {state name here}?

      2. What do you need from your POC to ensure LEA data collection is accurate and on time?

  3. We understand there are different methods for CRDC data submission, and that each requires different activities on the parts of both SEAs and LEAs. From your perspective, how would you say activities differ between large and small school districts? Please provide examples.

  4. In our earlier communication, we mentioned that we were interested in hearing about the activities you undertake for the CRDC and any forms you use to collect and report data. Let us shift to that now.

[Give the respondent a few minutes to make this transition]

    1. We’d like to see what data collection tools you use to collect and store data.

    2. What tools, if any, are provided by vendors?

[Be sure to collect examples of these tools]

    1. [For SEAs submitting data on behalf of their LEAs] Please walk us through your process for getting information from an LEA.

      1. How do LEAs report data to you?

        1. Who in the LEA is responsible for reporting data?

        2. What format do LEAs use to submit their data to you?

    2. What other outreach do you need to do to collect the required data?



[As the respondent walks through the process of data collection, capture thorough notes on each step, including when and with whom they make contact. Make a flowchart for or with the respondent.]

Confirm with the respondent: Does this flowchart accurately summarize your process?

Thank you for sharing your process with us. We have just a few more questions related to data.

  1. From your perspective, which data elements have poor data quality?

    1. Why are they poor quality? Can you provide an example?

    2. What suggestions do you have for their improvement?

  2. Thinking about CRDC data elements:

    1. For data you are collecting that is also needed for the CRDC, how are the data being used?

      1. Which data elements are most useful to SEAs? What makes them useful?

      2. Can you show me an example of how these data are used?

      3. Are there CRDC data elements that your state is collecting which are not used often? Why are these data not used?

    2. What kind of data would you say is most helpful to an SEA?

    3. How can we get you your CRDC data in a way that will be most useful to you?

  3. We are coming down to our final questions. If you had the opportunity to speak directly to someone from NCES, what would you recommend as an improvement in the CRDC communication process with POCs and other leadership?

    1. Is there anything else you would like to add to your comments today—anything else you would like to share with NCES?



[Use conclusion comments]



Attachment D: Site Visit Protocol – LEA respondent

Name:

Title:

Organization:

Phone:

Email:

Address:

[Confirm above respondent contact information, if necessary]



  1. Before we discuss any technical issues, we’d like to discuss a few background items.

    1. How would you describe your overall role in the district?

    2. What are your day-to-day activities?

    3. How would you describe your responsibilities for the CRDC?

    4. [If speaking to POC] How did you get selected as the CRDC point-of-contact for your district? [If not POC, continue with sub-question (i)]

      1. Did you complete any special training or certification for CRDC data collection and entry?

      2. Are there any contractors or vendors you work with on the CRDC?

      3. What are the particular challenges you face in coordinating data collection?

    5. What current CRDC tools were useful to you (e.g., webinars, forms, Partner Support Center (PSC), templates, online tools)?

      1. Can you show us what tools you use? Did you bookmark any online tools? (These won’t be available now, but I’d just like to know if you accessed them.) Did you download them? Keep them in hardcopy?

The following questions are designed to assess the LEA’s ability to collect data. These questions are not evaluating the quality of the data; they are simply allowing us to gain an understanding of the data collection at this agency level.

  1. We would like to ask some general questions about CRDC data collection. Please describe the extent to which your agency is able to provide data for the CRDC, noting, in particular, any data elements that you cannot provide or have trouble providing.

[Provide the following prompts as needed]

    1. Which data elements are easy to provide?

    2. Do you encounter any specific problems with particular elements of data collection?

      1. Can you give us some examples of these problems?

    3. Do you interact with the SEA during the CRDC data collection?

      1. How would you describe your interaction with the SEA?

      2. What problems, if any, do you experience with the SEA?

    4. What about your interactions with schools? Are there data elements that are not collected by your LEA and only reported by your schools?

The following questions are designed to collect data on the process and cycle of data collection. Here we are interested in finding out how the LEA goes about collecting data for the CRDC reports. This information will allow us to understand the ways in which data collection cycles may affect data quality.



  1. One of the areas NCES is interested in is the extent to which there is variation in the CRDC activities across districts. Can you explain, in your own words, what the data collection process looks like in your district, from start to finish?

[Provide the following prompts as needed]

    1. What is the process for outreach to schools in your LEA?

[Request and collect examples of outreach materials (letters, e-mails, and any other outreach materials – artifacts)]

    1. Is there a formal process in place?

    2. Who else in the LEA contributes to the process?

      1. What are their roles?

    3. Are subject matter experts consulted for specific elements of the CRDC? For example, do you consult with athletic directors? academic department administrators? administrators responsible for school safety?

    4. Are the data you collect for the CRDC just for the purpose of the CRDC or are there other reasons why they are collected?

      1. For what reasons are these data collected?

      2. Are these data part of the State Longitudinal Data System (SLDS)?

  1. Please tell us when and how often your agency collects CRDC-related data.

    1. What drives the decision about when to collect the data?

    2. Are the data collected in stages or in phases?

      1. Why is it done that way?

    3. In your opinion, when would be the best time for you to report your data for the CRDC?

      1. Would this differ depending on the data you need to report?

  2. In our earlier communication, we mentioned that we were interested in hearing about the activities you undertake for the CRDC and any forms you use to collect and report data. Let us shift to that now.

[Give the respondent a few minutes to make this transition]

    1. We’d like to see what data collection tools you use to collect and store data.

    2. What tools, if any, are provided by vendors?

[Be sure to collect examples of these tools]

    1. Please walk us through your process for getting information from a school.

      1. How do schools report data to you?

        1. Who in the school is responsible for reporting data?

        2. What format do schools use to submit their data to you?

        3. Do the data come directly to you from the school, or do you get them from someone else?

    2. What other outreach do you need to do to collect the required data?

[As the respondent walks through the process of data collection, capture thorough notes on each step, including when and with whom they make contact. Make a flowchart for or with the respondent.]

Confirm with the respondent: Does this flowchart accurately summarize your process?

  1. [Focused questioning on data elements. Show respondent list of data elements.] This is a list of several data elements NCES would like some feedback on. We’d like to go down this list and discuss these with you.

    1. Which of these are fairly easy to report? What is easy about reporting these data to CRDC?

    2. Which of these are problematic to report? What is difficult about reporting these data to CRDC?

  2. Who is responsible for verifying and certifying your data?

    1. Can you describe that process?

    2. What is difficult about the process?

We have just a few more questions related to the data.

  1. From your perspective, which data elements have poor data quality?

    1. Why are they poor quality? Can you provide an example?

    2. What suggestions do you have for their improvement?

  2. Thinking about all CRDC data elements:

    1. Which data elements are most useful to your district? What makes them useful?

      1. How do you use these data?

      2. How often do you use them?

    2. Which data elements are least useful to your district? Why?

    3. How can we get you your CRDC data in a way that will be most useful to you?

  3. We are coming down to our final questions. If you had the opportunity to speak directly to someone from NCES, what would you recommend as an improvement in the CRDC communication process with POCs and other leadership?

  4. Is there anything else you would like to comment on today—anything else you would like us to share with NCES?

[Use conclusion comments]



Attachment E: Site Visit Protocol – School administrator respondent

Name:

Title:

Organization:

Phone:

Email:

Address:

[Confirm above respondent contact information, if necessary]



  1. Before we discuss any technical issues, we’d like to discuss a few background items.

    1. [If unknown] What is your role in the school?

    2. What are your day-to-day activities?

    3. How would you describe your responsibilities for the CRDC [collecting and reporting data on education access and equity (such as student enrollment, teachers and staff, course participation and offerings, and discipline)]?

    4. How did you get selected as the CRDC [or data] point-of-contact for your school? [If not POC, continue with sub-question (i)]

      1. Did you complete any special training or certification for CRDC [education] data collection and entry?

      2. What are the particular challenges you face in coordinating data collection?

The following questions are designed to assess the school’s ability to collect and report data. These questions are not evaluating the quality of the data; they are simply allowing us to gain an understanding of the data collection at this agency level.

  1. We would like to ask some general questions about data collection at your school. Please describe how your school collects, stores, and provides data, noting, in particular, any data elements that you cannot provide or have trouble providing.

[Provide the following prompts as needed]

    1. How is information stored or collected at your school?

    2. Do you interact with the LEA during the data collection or reporting?

      1. How would you describe your interaction with the LEA?

      2. What problems, if any, do you experience with the LEA?

    3. Who do you report your data to? Another school-level person? An LEA person? Could you walk us through the steps you would take to report the data?

    4. Who else in the school contributes to the process?

      1. What are their roles?

    5. Are subject matter experts consulted for specific data elements? For example, do you consult with athletic directors? academic department administrators? administrators responsible for school safety?

    6. How does your school go about collecting and storing newly requested data?

    7. Do you encounter any specific problems with particular elements of data collection?

      1. Can you give us some examples of these problems?

The following questions are designed to collect data on the process and cycle of data collection. Here we are interested in finding out how the school goes about collecting data for the CRDC reports. This information will allow us to understand the ways in which data collection cycles may affect data quality.


  1. In our earlier communication, we mentioned that we were interested in hearing about the activities you undertake for the CRDC and any forms you use to collect and report data. Let us shift to that now.

[Give the respondent a few minutes to make this transition]

    1. We’d like to see what data collection tools you use to collect and store data.

    2. Please walk us through your process for collecting data. Who in the school is responsible for reporting data?

    3. What other outreach do you need to do to collect the required data?

  1. [Focused questioning on selected data elements. Data elements revealed through the LEA interview that are directly reported by the school will also be included. Show respondent list of data elements]

  2. This is a list of several data elements NCES would like some feedback on. We’d like to go down this list and discuss these with you.

    1. Can you walk us through the information collection process for some general categories of CRDC data?

      1. Discipline:

        1. When a student is subject to a disciplinary action, such as a suspension or expulsion, what happens? Where is the information about the event stored?

        2. What happens when a student is referred to law enforcement? Where is the information stored? How would you find out if the referral resulted in an arrest?

      2. Athletics

        1. How is information about students participating in interscholastic athletics collected and stored? Can you show us how you would respond to a request for data on the number of single-sex teams?

      3. Course/Program Enrollments

        1. How would you respond to a request for information on the number of Algebra I classes?

        2. When a student enrolls in an advanced math or science course (or gifted and talented education program), how is that information tracked?

        3. How would you respond to requests for data on the number of students retained in grade xx the previous year?

      4. School finance

        1. How is information on teacher salary expenditures collected and stored at your school? Can you show us how you would respond to requests for data on expenditures for teacher salaries and instructional aide salaries?

      5. Teachers and Staff:

        1. What happens when a school counselor or school psychologist visits the school? How would you respond to a request for data on the full-time equivalent (FTE) of school counselors or school psychologists for your school?

    2. Which of the items on this list are fairly easy to report? What is easy about reporting these data?

    3. Which of these are problematic to report? What is difficult about reporting these data?

    4. Are these data collected just for the purpose of the CRDC reporting or are there other reasons why they are collected?

      1. For what reasons are these data collected?

  3. Please tell us when and how often your school collects these data.

    1. What drives the decision about when to collect the data?

    2. Are the data collected in stages or in phases?

      1. Why is it done that way?

    3. In your opinion, when would be the best time or times of the year for your school to report these data?

      1. Would this differ depending on the data you need to report?

We have just a few more questions related to the data.

  1. From your perspective, which data elements are not routinely collected and stored, such that you might have to guess when reporting them?

    1. Can you provide an example?

    2. What suggestions do you have for their improvement?

  2. We are coming down to our final questions. If you had the opportunity to speak directly to someone from NCES, what would you recommend as an improvement in the CRDC communication process with your school?

  3. Is there anything else you would like to comment on today—anything else you would like us to share with NCES?



[Use conclusion comments]



Attachment F: Site Visit Protocol – OCR office respondent

Name:

Title:

Organization:

Phone:

Email:

Address:

[Confirm above respondent contact information, if necessary]



We would like to ask some general questions about CRDC data elements before we give you an opportunity to share with us any reflections you think would be useful for NCES and OCR to know.

  1. Thinking about CRDC data elements:

    1. Which data elements are most useful to your office? What makes them useful?

    2. Which data elements are least useful to your office? Why?

    3. What kinds of reports do you produce with data from the CRDC?

    4. Are there requests for particular data that you would like to be able to meet but are unable to at the moment?

    5. Who makes these requests for data?

  2. From your perspective, which data elements have poor data quality?

    1. Why are they poor quality?

    2. What suggestions do you have for their improvement?

  3. What suggestions do you have for ways in which NCES can provide more helpful data to your office? Would it be helpful to have:

    1. A database?

    2. A report?

    3. Access to preliminary data?

[Probe on why these things would be useful]

  1. Can you tell us about how involved your office is in ensuring that districts complete the CRDC?

    1. Do you think your office could assist in making the CRDC completion more accurate and timely? In what ways?

  2. If you had the opportunity to speak directly to someone from NCES, what would you recommend as an improvement in the CRDC communication process with your office?

  3. Is there anything else you would like to add to your comments today—anything else you would like to share with NCES?



Attachment G: Site Visit Supplemental Materials – Frequently Asked Questions

Frequently Asked Questions Regarding Proposed Changes to the U.S. Department of Education’s Civil Rights Data Collection for School Years 2013–14 and 2015–16

What is the Civil Rights Data Collection?

The Civil Rights Data Collection (CRDC) is a biennial (i.e., every other school year) survey required by the U.S. Department of Education’s (Department) Office for Civil Rights (OCR) since 1968. The 2013–14 and 2015–16 CRDC proposes to collect data from a universe of all public local educational agencies (LEAs) and schools, including juvenile justice facilities, charter schools, alternative schools, and schools serving students with disabilities.

What is the purpose of the CRDC?

The CRDC is a longstanding and critical aspect of the overall enforcement and monitoring strategy used by OCR to ensure that recipients of the Department’s Federal financial assistance do not discriminate on the basis of race, color, national origin, sex, or disability. OCR relies on the CRDC data it receives from public school districts as it investigates complaints alleging discrimination, determines whether the federal civil rights laws it enforces have been violated, initiates proactive compliance reviews to focus on particularly acute or nationwide civil rights compliance problems, and provides policy guidance and technical assistance to educational institutions, parents, students, and others.

The CRDC’s utility reaches far beyond OCR to the entire Department, to other agencies, and to researchers and policymakers across the nation. For example, CRDC data have been used by other Department offices for monitoring compliance with requirements for federal professional development funding, monitoring states under ESEA flexibility waivers, defining program requirements on discipline disparities in the Race to the Top district competition, and evaluating the Office of English Language Acquisition’s (OELA) programs and activities. Additionally, numerous NCES studies, including the Schools and Staffing Survey and the School Crime Survey, are planning to supplement their data collections with data from the CRDC. The alignment of the CRDC with the Department’s priorities and other collections, and the widespread use of CRDC data across the Department and other agencies, minimize duplicate data collections and lessen the reporting burden on state and local agencies.

State and federal agencies, policymakers, researchers, and many others outside of the Department also use the CRDC data, which the Department makes available to the public via the Web in a privacy-protected format. For each of these constituencies, the CRDC is an invaluable source of information about access to educational opportunities in our nation’s public schools. Researchers, advocacy organizations, and news media have used CRDC data to identify possible civil rights concerns in our nation’s schools and to find models of success. State legislatures and state boards of education have relied on CRDC data in crafting and revising educational policies, and for LEAs and schools across the country, the CRDC data are a critical tool for self-analysis and a mechanism for highlighting and correcting areas of educational concern.

Under what authority does the Department conduct the CRDC?

Section 203(c)(1) of the 1979 Department of Education Organization Act conveys to the Assistant Secretary for Civil Rights the authority to “collect or coordinate the collection of data necessary to ensure compliance with civil rights laws within the jurisdiction of the Office for Civil Rights” [20 U.S.C. § 3413(c)(1)].

The civil rights laws enforced by OCR include: Title VI of the Civil Rights Act of 1964, which prohibits discrimination based on race, color, and national origin; Title IX of the Education Amendments of 1972, which prohibits discrimination based on sex; and Section 504 of the Rehabilitation Act of 1973, which prohibits discrimination on the basis of disability. OCR’s implementing regulations for each of these statutes require recipients of the Department’s federal financial assistance to submit to OCR “complete and accurate compliance reports at such times, and in such form and containing such information” as OCR “may determine to be necessary to enable [OCR] to ascertain whether the recipient has complied or is complying” with these laws and implementing regulations (34 CFR 100.6(b), 34 CFR 106.71, and 34 CFR 104.61, located at http://www2.ed.gov/policy/rights/reg/ocr/index.html). In addition, pursuant to a delegation by the Attorney General of the United States, OCR shares in the enforcement of Title II of the Americans with Disabilities Act of 1990, which prohibits discrimination based on disability. Any data collection that OCR has determined to be necessary to ascertain or ensure compliance with these laws is mandatory.

OCR also works with Department offices to help them effectively carry out programs of Federal financial assistance that the Secretary of Education is responsible for administering. [See Sections 201, 202(g), 411(a), and 412 of the Department of Education Organization Act (20 U.S.C. §§ 3411, 3412(g), 3471(a), and 3472)]. OCR works with the Department’s Office of Elementary and Secondary Education, which is responsible for administering the Elementary and Secondary Education Act of 1965 (ESEA). Section 9533 of the ESEA (20 U.S.C. § 7913) prohibits discrimination in the administration of the ESEA in violation of the Fifth or Fourteenth Amendments to the Constitution. In addition, Section 9534 of the ESEA (20 U.S.C. § 7914) prohibits discrimination in funded programs on the basis of race, color, religion, sex (except as otherwise permitted under Title IX), national origin, or disability. Thus, in addition to OCR's authority described above, the ESEA provides authority for the Department to mandate that LEAs respond to this data collection.

Why is the Department revising the CRDC?

The Office of Management and Budget (OMB) approved the CRDC under the Paperwork Reduction Act for 3 years, which allowed OCR to collect data for the 2009–10 and 2011–12 school years. To collect CRDC data for the 2013–14 and 2015–16 school years, OCR is required to clear the CRDC collection through OMB’s notice and comment process. In doing so, OCR is proposing changes that reflect new learning about the areas where opportunity gaps exist.

What are the major changes to the data collected in the 2013–14 and 2015–16 CRDC that are being proposed by the Department?

The proposed additions and changes to the CRDC for school years 2013–14 and 2015–16 that the Department is posting for public comment reflect the need for a deeper understanding of and accurate data about the educational opportunities and school context for our nation’s public school students. The following information summarizes the changes proposed for some general areas of information collected in the CRDC. For a more detailed list of what is proposed for 2013–14 and 2015–16 school years, including retained elements and proposed additions, see the Appendix in this FAQ.

School and District Characteristics

The CRDC continues to cover such topics as the number of magnet and alternative schools, districts operating under desegregation orders or plans, and student membership disaggregated by race, ethnicity, sex, disability, and English learner status. To provide greater context to deepen OCR’s understanding of the schools and districts in which students receive their education, the proposed changes to the CRDC include adding:

  • Items about civil rights coordinators in each district, which measure compliance with civil rights regulations and permit OCR to communicate with coordinators.

  • Items on the educational programs in justice facilities, which provide a more accurate account of the educational opportunities available to students.


Discipline

The 2009–10 CRDC made public indicators of school culture, including: numbers of students, broken down by demographic characteristics, who were suspended once and multiple times, expelled, and arrested in school; and new information about the use of restraint and seclusion in the classroom. To provide additional information about discipline practices in our schools, proposed changes to the CRDC include adding:

  • Items on the length of time out of class due to suspension, reflecting the findings of OCR enforcement actions and other reports that have shown that disproportionality in discipline extends beyond the type of punishment (in- or out-of-school suspensions);

  • Items on preschool discipline practices and corporal punishment;

  • Refinements to existing expulsion items, which are intended to make clear that such involuntary removals from a student's school for discipline reasons constitute expulsion, regardless of label, and to track where such students are sent to receive educational services (regular or alternative school); and

  • Critical data elements taken from the National Center for Education Statistics (NCES) School Survey on Crime and Safety to obtain school violence and safety information.


Harassment and Bullying

Safe environments are critical to learning. Since the 2009–10 school year, the CRDC has provided a lens on school climate and the bullying and harassment that students too often endure on the basis of race, sex, and disability. The proposed changes to the CRDC include adding:

  • Items on sexual orientation and religion to the types of allegations of harassment that need to be counted. It is important to note, however, that the proposal does not extend the reporting requirements about demographic data of the alleged complainant or harasser. (Note that the proposed changes do not authorize schools to inquire about the sexual orientation or religion of students. In classifying the allegations of harassment or bullying, respondents are directed to look to the likely motives of the alleged harasser/bully, and not the actual status of the alleged victim.)


Early Childhood Education

The CRDC continues to cover such topics as children’s access to and participation in early childhood education programs operated by LEAs. To deepen OCR’s understanding of services provided to our youngest students, OCR is proposing to expand the collection of early childhood experience elements to learn more about how programs serve the youngest children.


Pathways to College and Career

The CRDC continues to cover topics such as students’ participation in Algebra and other college-preparatory subjects, grade-level retention, and access to Advanced Placement (AP) courses. Because the previous collection masked, or failed to capture, information on some important trends, the proposed changes to the CRDC include:

  • A more accurate account of course taking and passing opportunities;

  • Items on whether students have access to dual enrollment and/or credit recovery;

  • Items to refine information on middle school math opportunities; and

  • Items on which schools have high and low chronic absenteeism rates. (Note: Chronic absenteeism can be a sign of serious school climate issues that are driving children out of school. It can also be a warning sign of serious problems that children may experience in the future, e.g., dropping out).


Proposed changes also reflect efforts to reduce burden (and better focus on science/technology/engineering/mathematics (STEM) opportunities) by streamlining AP course-taking information.


School finance (funded with state and local funds)

Since the 2009–10 school year, the CRDC has been a source of information for exploring resource equity among schools within a district. The proposed changes to the CRDC include adding:

  • Items that provide finer-grained data on total personnel expenditures among schools for teachers and instructional aides, administrators, and those who provide critical support services for students and instruction. (Please note that OCR is soliciting comments regarding the collection of non-personnel expenditures in attachment A-5 of the OMB clearance package.)


Teachers, Support Staff, and Security Staff (funded with federal, state, and/or local funds)

Since the 2009–10 school year, the CRDC has been a resource for data on the number of first- and second-year teachers in schools, the number of school counselors in schools, and teacher absenteeism. The proposed changes will deepen OCR’s understanding of how to make schools and communities safer and allow OCR to compare resources available to schools of different populations. They include:

  • Items on school support and security staffing data for every school, most of which were adopted from the NCES Schools and Staffing Survey.



Appendix: List of Proposed Continuing and New CRDC Data Elements

for School Years 2013–14 and 2015–16

(Based on 60-day public comment period)

All data elements that OCR has proposed to continue or add for the 2013–14 and 2015–16 CRDC are presented below. New proposed data elements are underlined. Unless otherwise indicated, all student data are disaggregated by race/ethnicity, sex, disability, and LEP status.

School Level


School & District Characteristics

  • School characteristics, such as grades offered; whether the school is a special education, magnet, alternative, or charter school; and the number of single-sex academic courses by content area (not disaggregated).

  • Enrollment, including disaggregated data for total enrollment, limited English proficient (LEP), LEP students enrolled in English language instruction educational programs, disability (Individuals with Disabilities Education Act (IDEA)), and Section 504 only.

  • Whether an ungraded school has mainly elementary school age students; middle school age students; and/or high school age students.

  • For justice facility only:

    • Type of facility (pre- or post-adjudication/conviction or both);

    • Number of days that make up the justice facility’s regular school year;

    • Total number of hours per week that educational program is offered during regular school year;

    • Number of students who participated in educational program for less than 15 calendar days; 15-30 calendar days; 31-90 calendar days; 91-180 calendar days; more than 180 calendar days.


Discipline

  • Number of preschool students (3-5 years old) who received corporal punishment, one out-of-school suspension, more than one out-of-school suspension, or were expelled.

  • Number of K-12 students who received the following disciplinary actions (disaggregated separately for students with disabilities and students without disabilities):

    • one or more in-school suspension;

    • one out-of-school suspension;

    • more than one out-of-school suspension;

    • expulsion with and without educational services;

    • expulsion under zero-tolerance policies;

    • removed for disciplinary reasons (to alternative school; to regular school);

    • referred to law enforcement agency or official;

    • arrested for school-related activity;

    • corporal punishment.

  • Number of instances of corporal punishment that students (preschool through grade 12) received.

  • Number of school days missed by students who received out-of-school suspensions.

  • Documented number of incidents that occurred at the school:

    • robbery with a weapon;

    • robbery with a firearm or explosive device;

    • robbery without a weapon;

    • physical attack or fight with a weapon;

    • physical attack or fight with a firearm or explosive device;

    • physical attack or fight without a weapon;

    • threats of physical attack with a weapon;

    • threats of physical attack with a firearm or explosive device;

    • threats of physical attack without a weapon;

    • rape or attempted rape;

    • sexual battery (other than rape);

    • possession of a firearm or explosive device;

    • Whether any of the school’s students, faculty, or staff died as a result of a homicide committed at the school; and

    • Whether there has been at least one incident at the school that involved a shooting (regardless of whether anyone was hurt).


Harassment and bullying

  • Number of reported allegations of harassment or bullying of K-12 students on the basis of: sex; race, color, or national origin; disability; sexual orientation; religion.

  • Number of K-12 students reported as harassed or bullied and the number of students disciplined for engaging in harassment or bullying on the basis of: sex; race, color, or national origin; disability.


Restraint and Seclusion

  • Students subjected to the following forms of restraint and seclusion (disaggregated separately for IDEA and non-IDEA students): mechanical restraint, physical restraint, seclusion.

  • Number of instances of mechanical restraint, physical restraint, seclusion.


Single-Sex Interscholastic Athletics

  • Number of high school sports, teams, and participants (disaggregated by sex).


Early Childhood Education

  • Number of students ages 3-5 enrolled in preschool.

  • Whether the school’s preschool program serves non-IDEA students (disaggregated by age—3, 4, 5)


Pathways to College and Career

  • Number of students enrolled in

    • gifted & talented programs; and

    • at least one dual enrollment/dual credit program.

  • Number of students enrolled in Algebra I in grade 7 (total count).

  • Number of students who passed Algebra I in grade 7 (total count).

  • Number of students enrolled in Algebra I in grades: 8; 9&10; 11&12.

  • Number of students who passed Algebra I in grades: 8; 9&10; 11&12.

  • Number of students enrolled in Geometry in grade 8 (total count).

  • Number of classes in math and science (Algebra I, Geometry, Algebra II, Advanced Math, Calculus, Biology, Chemistry, Physics).

  • Number of students (grades 9-12) enrolled in other math and science courses, by subject (Geometry, Algebra II, Advanced Math, Calculus, Biology, Chemistry, Physics).

  • Whether the school has any students who participate in at least one credit recovery program that allows them to earn missed credit to graduate from high school.

  • Number of students absent 15 or more school days.

  • Number of students enrolled in the International Baccalaureate (IB) Diploma Programme.

  • Data on Advanced Placement (AP) courses:

    • Number of different AP courses provided; whether students are allowed to self-select for participation in AP courses; and

    • Number of students enrolled in at least one AP course.

  • Number of students enrolled in at least one AP course in specific subject area: AP math of any kind; AP science of any kind; and Other AP subjects of any kind (including foreign language).

  • Number of students who:

    • took one or more AP exams for one or more (which may include all) AP courses enrolled in;

    • were enrolled in one or more AP courses but who did not take any AP exams;

    • passed one or more AP exams for one or more (which may include all) AP courses enrolled in; and

    • did not pass any AP exams for the one or more AP courses enrolled in.

  • Number of students who took the SAT, ACT, or both, any time during the school year.

  • Number of students retained in specified grade, by grade (K-12).


School finance (funded with state and local funds)

  • K-12 personnel FTEs and salaries at the school level (funded with state and/or local funds):

    • FTE teachers and amount of their salaries;

    • FTE instructional staff (teachers & aides) and amount of their salaries;

    • FTE support services staff for pupils and amount of their salaries;

    • FTE support services staff for instructional staff and amount of their salaries;

    • FTE school administration staff and amount of their salaries; and

    • FTE non-instructional staff and amount of their salaries.

  • Total amount of non-personnel expenditures at the school level.



Teachers funded with federal, state, and/or local funds (not disaggregated)

  • FTE teachers (preschool through grade 12).

  • FTE first-year teachers (preschool through grade 12).

  • FTE second-year teachers (preschool through grade 12).

  • FTE teachers absent more than 10 school days (excluding professional development) (preschool through grade 12).

  • FTE school counselors (preschool through grade 12).

  • FTE psychologists (preschool through grade 12).

  • FTE social workers (preschool through grade 12).

  • FTE security guards (preschool through grade 12).

  • FTE school resource officers (preschool through grade 12).

  • FTE sworn law enforcement officers (preschool through grade 12).



Local Educational Agency Level

  • Number of public schools.

  • Number of students (preschool through grade 12) served by LEA in LEA and non-LEA facilities.

  • Whether LEA has civil rights coordinators for discrimination against students on basis of sex, race, and disability (and contact information).

  • Whether LEA is covered by desegregation order or plan.

  • Number of students enrolled in distance education courses.


  • Whether LEA’s early childhood program(s) serve non-IDEA children age birth-2.

  • Preschool length offered (full-day, part-day) and cost (free, partial charge, full charge).

  • Number of students served by LEA in preschool programs in LEA and non-LEA facilities (disaggregated by age – 3, 4, 5).

  • Whether preschool is provided to: all students, students with disabilities (IDEA), students in Title I schools, students from low income families.

  • Whether preschool serves non-IDEA students age 3; age 4; age 5.

  • Whether LEA is required to provide full-day or part-day kindergarten by state statute or regulation.

  • Kindergarten length offered (full-day, part-day) and cost (free, partial charge, full charge).


  • Whether LEA has policy that allows retention of third grade students who are not proficient in reading.

  • GED preparation program:

    • Number of students ages 16-19 who participated in LEA-operated GED prep program; and

    • Number of students ages 16-19 who participated in LEA-operated GED prep program, succeeded on GED test, and received high school equivalency credential.
