
National Center for Education Statistics



Volume I



2015-16 Civil Rights Data Collection (CRDC)

Research & Evaluation



OMB# 1850-0803 v.158



May 2016




  1. Submittal Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB# 1850-0803), which provides for NCES to conduct various procedures (e.g., exploratory, cognitive, usability, or follow-up interviews; focus groups; feasibility studies; pilot tests) to develop, test, and improve its data collection instruments and methodologies.

  2. Background and Study Rationale

The purpose of the U.S. Department of Education (ED) Civil Rights Data Collection (CRDC) is to obtain data related to the obligation of the nation's public school districts and elementary and secondary schools to provide equal educational opportunity to their students. To fulfill this goal, the CRDC collects a variety of information, including student enrollment and education programs and services data that are disaggregated by race/ethnicity, sex, limited English proficiency, and disability. The CRDC is a longstanding and critical part of the overall enforcement and monitoring strategy used by ED's Office for Civil Rights (OCR) to ensure that recipients of ED's federal financial assistance do not discriminate on the basis of race, color, national origin, sex, or disability. The CRDC data are also used by other ED offices, as well as by policymakers and researchers outside of ED, for research and statistical purposes.

ED has collected CRDC data on school characteristics, programs, services, and student outcomes directly from local education agencies (LEAs) on a biennial basis since 1968. For many years, the collection operated as the Elementary and Secondary School Civil Rights Compliance Report (OMB# 1870-0500). Since 2004, the CRDC has been conducted primarily online (with flat file submissions and paper surveys also allowed up until the 2013-14 CRDC).

Feedback from prior CRDC administrations indicates that some LEAs experience difficulty responding to the CRDC. For example, respondents report that some of the requested data are already submitted to their respective state education agencies (SEAs), although those data items are not currently submitted by the SEAs to ED, and that other requested data are not maintained by schools or LEAs at the level of granularity the CRDC requires. Other issues include a lack of clarity around definitions, available resources, timelines, and collection procedures. In light of these and other concerns, the National Center for Education Statistics (NCES), within ED, has partnered with OCR to improve the CRDC data collection process. This partnership builds on successful coordination within ED: since 2004, the CRDC has been part of EDFacts, an ED initiative to better coordinate and consolidate data reporting from SEAs and LEAs across ED. Under EDFacts, the CRDC transitioned from a primarily paper form-based data collection to a primarily web-based collection system designed and operated by EDFacts as part of its data collection contract. The partnership between the CRDC and EDFacts continues with the transition of EDFacts to the Administrative Data Division of NCES in the fall of 2013.

Numerous NCES studies, including the National Teacher and Principal Survey (NTPS) and the School Survey on Crime and Safety (SSOCS), plan to supplement their data collections with data from the CRDC. Therefore, in partnership with OCR, NCES redesigned all aspects of the CRDC data submission process to improve the alignment of CRDC data with other surveys. In the initial stage, NCES developed a new data collection tool for the 2013-14 CRDC to improve data quality, reduce burden on respondents, and improve the usability of the data collection tool and data feedback reports.

This year's research and evaluation project will evaluate the 2013-14 data collection and identify where the data collection instruments, resources, and procedures are succeeding, where they are falling short, and where they require further improvement.

The research will be conducted in three phases: Phase 1 will consist of site visits to LEAs and SEAs, Phase 2 of in-depth cognitive interviews with CRDC respondents, and Phase 3 of a usability pilot test. We are seeking clearance for all three phases now so that the research can run continuously throughout the short 5-month research and evaluation period. Recruitment for the various research components takes time and must be arranged around summer vacations. It is therefore critical that we be able to start recruitment as soon as possible and to offer participants who are unavailable for one phase of research participation in a later phase. The next CRDC data collection is scheduled to begin in late fall 2016, and all recommendations from the research will need to be implemented before that collection begins.

  3. Purpose

The CRDC Research & Evaluation project will evaluate the redesigned data collection and identify where the data collection instruments, resources, and procedures are succeeding, where they are falling short, and where they require further improvement. The research team will recommend improvements to be evaluated and implemented by the data collection contractor in late summer 2016.

For the site visits, we first seek to understand how well the new data collection tool and procedures align with the way LEAs access and house the data they must report during the CRDC data collection period, the procedures they have in place for reporting data, and the problems they face in reporting.

During site visits, ED will be able to ask for and go through hard-copy materials with on-site staff, examine examples of particular problems, and discuss those problems in more depth, in order to improve future survey formats as well as the online tool. Understanding where there is misalignment in these areas will help us develop tools and procedures that better facilitate reporting and improve data quality. To research these issues, we will conduct 10 in-person site visits with various LEA and SEA staff associated with the CRDC.

While site visits allow us to focus on issues of data access and reporting, we also need to identify the specific data elements, resources, information, and definitions that confuse CRDC respondents when they are responding to the data request. To research respondents' understanding and interpretation of the CRDC data elements and definitions, we will conduct cognitive interviews with staff at 20 LEAs that have responded to, or may in the future respond to, the CRDC.

Finally, we will use the information gathered in the site visits and the cognitive interviews to develop online resources and refine instructions and technical assistance for the 2015-16 CRDC. The final part of the research and evaluation project will be a usability pilot test to determine what modifications and changes are needed to the online survey tool in preparation for the 2015-16 data collection. We will invite up to 50 CRDC users to test and evaluate the instruments and resources. Respondents will test the online tool from their workplace using their own computers. The 2013-14 and 2015-16 CRDC survey instruments are nearly identical, so for the pilot test we will use the 2013-14 CRDC tool that was used in the last data collection and is already fully programmed (approved under OMB# 1870-0504). The online tool can be made available to OMB officers upon request. Screen shots of the research focus areas of the online tool are provided in Attachment C.

  4. Design

The sites proposed for each phase will reflect a diverse set of districts (and, in Phase 1, states), selected based on the criteria listed below as recorded during the 2013-14 CRDC. The criteria will help prioritize the types of districts asked to participate. A single district will typically fall under multiple criteria (e.g., a large urban district with charter schools that certified late in 2013-14 and submitted using the online submission tool). ED will recruit districts so that, across all participating districts, each criterion is represented by at least one district, and preferably by several (see the illustrative sketch after the list below).

  • Small (e.g., single-school or two-school districts)

  • Large (e.g., many schools and students)

  • Location (e.g., rural, suburban, urban)

  • Did not certify their 2013-14 submission

  • Did not certify in 2011-12 but did certify in 2013-14

  • Certified late in 2013-14

  • Had an action plan created in 2013-14

  • Districts with charter schools

  • Submitted using the flat file submission tool (level of sophistication of data systems)

  • Submitted using the online submission tool (level of sophistication of data systems)

  • Districts in states that played a larger role in the submission in 2013-14 than prior years
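
The coverage logic above can be illustrated with a brief sketch. This is hypothetical Python written for this justification only; it is not part of any CRDC system, and the district names and criterion tags are invented. Given candidate districts tagged with the criteria they satisfy, it reports any criteria that a proposed recruitment set leaves unrepresented:

    # Hypothetical sketch of the recruitment coverage check described above.
    # District names and criterion tags are illustrative, not actual CRDC data.
    CRITERIA = {
        "small", "large", "rural", "suburban", "urban",
        "did_not_certify_2013_14", "certified_2013_14_not_2011_12",
        "certified_late_2013_14", "action_plan_2013_14", "charter_schools",
        "flat_file_submission", "online_submission", "state_assisted",
    }

    # Each candidate district falls under multiple criteria.
    candidates = {
        "District A": {"large", "urban", "charter_schools",
                       "certified_late_2013_14", "online_submission"},
        "District B": {"small", "rural", "did_not_certify_2013_14",
                       "flat_file_submission"},
    }

    def uncovered(selected):
        """Return the criteria not represented by any selected district."""
        covered = set().union(*(candidates[d] for d in selected))
        return CRITERIA - covered

    # Criteria still needing coverage if only these two districts are recruited:
    print(sorted(uncovered(["District A", "District B"])))

Recruitment would continue until a call like the one above reports no uncovered criteria.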

Sites will be recruited by the study contractors, Sanametrix and AIR, in consultation with NCES and OCR. Recruitment will be conducted by letter, phone call, or email, depending on the contact information available for superintendents, CRDC points of contact, or other key personnel (as provided in the 2013-14 CRDC or by ED). The recruitment process is expected to take up to 20 minutes per entity. Data collection will also be carried out by Sanametrix and AIR.

Phase 1. Site Visits

Members of the study team will travel in pairs to conduct a 2-day visit to each of the 10 selected sites (each within 75 miles of a commercial airport) across 5 states. Site visits will include an in-person interview with an SEA data coordinator and on-site interviews with LEA officials, such as the assistant superintendent and the data coordinator, depending on the size of the district. Each site visit will take up to 4 hours in person at a school district or state education agency. Where possible, we will visit a district one day and the SEA the next. Based on similar work conducted in 2013-14, some sites may request a meeting with all involved participants, while others may request that we meet with each participant individually. We will accommodate the format that is most convenient for the district. There will be a lead interviewer and a note-taker. Site visit interviews will be audio-recorded (with participant approval), and detailed notes will be taken. After each site visit, a summary of the visit will be written and provided to NCES; we will also provide the summary to the site, if requested. Throughout the site visit data collection process, regular debriefing meetings will be held with the contractor and ED research team.

Recruitment

We anticipate contacting approximately 20 sites in order to recruit the target of 10 sites. Additionally, two OCR field offices will be contacted for recruitment; these offices are expected to participate upon request. OCR field offices are federal offices and therefore do not fall under this OMB clearance. We will recruit districts in at least 5 different states, with at least one of these states being proactive in assisting districts with their submission.

Training

All data collection staff will participate in a site visit training to ensure familiarity with and understanding of the purpose of the study, content of the protocols, and site visit procedures. The study team will be trained in what to look for on-site and, using objective and non-biased interview techniques, how to motivate respondents to provide accurate and complete information during interviews. The team will also be trained in key procedural issues such as guidelines for ensuring privacy during interviews, taking high-quality interview and site visit notes, and providing required follow-up communications.

Materials

The advance letter/email to be sent to participants, phone recruitment scripts, and the protocols to guide the site visits are provided in the Attachments document.

Phase 2. Cognitive Interviews

Separate from the site visits, members of the study team will conduct a 90-minute telephone interview with SEA or LEA staff who are involved in reporting CRDC data elements. The interviewer will walk the respondent through key areas of the online instrument and associated resources (see Attachments C and D) and will ask the respondent to “think aloud” while browsing and entering information. This will provide real-time feedback to the research team about which tasks are problematic. Additionally, the interviewer will probe the respondent to gather more detail and explanation and to ensure the respondent provides feedback on all areas of interest to the research team. All interviews will be audio-recorded with respondents’ permission. As with the site visits, some sites may request a group call while others may request individual interviews; we will accommodate whichever format is preferred. There will be a lead interviewer and a note-taker on every call. A report will be prepared detailing problems and recommending solutions.

Recruitment

We anticipate contacting approximately 30 sites in order to recruit the target of 20. Individuals responsible for data collection and entry will be recruited; depending on the LEA/SEA, one person or several individuals may fulfill these responsibilities. We expect an average of 3 participants per SEA/LEA, with all participants on the call at the same time and each respondent answering as needed depending on the questions being asked.

Training

All interview staff will participate in interviewer training to ensure familiarity with and understanding of the purpose of the study, content of the protocols, and interview procedures. The study team will be trained in effective probing using objective and non-biased interview techniques and how to motivate respondents to provide accurate and complete information during interviews. The team will also be trained in key procedural issues such as guidelines for ensuring privacy during interviews, taking high-quality interview notes, and providing required follow-up communications.

Materials

The advance email to be sent to participants, phone recruitment scripts, protocols, and screen shots of the portions of the online tool that are the main focus of the interviews are provided in the Attachments document.

Phase 3. Pilot Test

Fifty sites will be invited to test the usability of the online data collection tool and resources and to provide feedback. Participants will be able to call, email, or otherwise provide written feedback to the study team throughout a 10-day pilot test period. The pilot test will be targeted to the areas of the instruments and study materials that were found to be problematic in the summary of known issues and any other areas that are discovered during the site visits or cognitive interviews.

During the pilot test, CRDC participants will be granted access to the online tool and submission system and asked to review and navigate through specific areas of the tool, which are expected to be the areas shown in the screen shots provided. For most of the testing, participants will use a blank testing form into which they will enter real or dummy data, depending on their preference. All entered data will be deleted after the pilot test is completed, on or before December 31, 2016. For testing of errors and certifications, users will be given either dummy data or data from their own 2013-14 submission, depending on the 2013-14 data available for their district. Users will never have access to, or see, any data from outside their own school, district, or state: users must log in to the system, and each user's credentials restrict their access to information about their own school, district, or state. This is an existing feature of the online submission system to prevent data disclosure, and it will remain in place in the pilot and 2015-16 systems.
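
As an illustration of the credential-based scoping described above, the following minimal sketch (hypothetical Python with invented names and records; it is not the actual submission system code) shows how a request for another district's data would be refused:

    # Minimal illustration of credential-based access scoping.
    # All names, identifiers, and records are hypothetical.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class User:
        username: str
        lea_id: str  # the single district these credentials are tied to

    # Hypothetical submission records, keyed by district (LEA) identifier.
    SUBMISSIONS = {
        "LEA-001": {"enrollment": 450, "schools": 2},
        "LEA-002": {"enrollment": 12000, "schools": 18},
    }

    def get_submission(user, lea_id):
        """Return a submission only if it belongs to the requesting user's LEA."""
        if lea_id != user.lea_id:
            raise PermissionError(f"{user.username} may not view {lea_id}")
        return SUBMISSIONS[lea_id]

    respondent = User(username="jsmith", lea_id="LEA-001")
    print(get_submission(respondent, "LEA-001"))  # permitted: own district
    # get_submission(respondent, "LEA-002")       # refused: PermissionError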

For the pilot test, each respondent will be asked to review and complete each designated section of the online tool and provide written feedback about those sections. We anticipate a maximum of 8 hours for each respondent to complete these tasks over a period of up to 10 days. The burden will be significantly lower for participants with a good understanding of the tool and for those with little written feedback.

Recruitment

We anticipate contacting approximately 50 LEAs/SEAs in order to recruit the target of 40-45 participating sites.

Training

All pilot testing staff will participate in training to ensure familiarity with and understanding of the purpose of the pilot test and the tool and resources being tested. The study team will be trained to handle questions and feedback from the participants and how to use a software system to log and track comments.

Materials

The pilot test invitation letter to be sent to participants and phone recruitment scripts are provided in the Attachments document.

  5. Consultations outside the Agency

The proposed design and content of the 2015-16 CRDC were sent out for public comment and approved under OMB# 1870-0504. OCR also maintains a CRDC working group that provides feedback on design and content. As part of the 2015-16 and 2017-18 development, a summary of known issues with the 2013-14 data collection is being prepared from a review of feedback and Partner Support Center tickets, as well as from meetings and discussions among the data collection contractor, NCES, and OCR. This summary will be used to guide the research and evaluation tasks. Additionally, in early 2014, NCES conducted site visits and cognitive interviews with LEAs, SEAs, and schools to gather information about the CRDC data elements, submission process, and data collection procedures. Results from that prior research informed the redesign of the 2013-14 CRDC.

  6. Payments to Respondents

Respondents will not receive payments.

  7. Assurance of Confidentiality

The study will ask about the data holdings and processes of public institutions and will not collect personal data about respondents; therefore, no confidentiality provision will be cited. Participation in the study is voluntary, and the study recruitment phone calls and emails will notify potential participants of the study's sponsorship and voluntary nature. Interviewers will also remind participants of these points prior to the start of each interview, as noted in the interview protocols.

  8. Estimate of Hour Burden

We expect to spend 4 hours per site interviewing staff in Phase 1, and participants may need an additional hour of preparation time. In Phase 2, we will spend 90 minutes interviewing staff, plus an additional 45 minutes of participant preparation time. In Phase 3, we anticipate 8 hours of usability testing and debriefing per person, per site, with no separate preparation time. Based on our prior experience with similar research, Phase 1 site visits are expected to involve 4 people per site, each spending 1 hour in the interview; Phase 2 cognitive interviews are expected to involve an average of 3 people per interview, each participating for the full 90 minutes; and Phase 3 testing is expected to involve 2 people per site, each spending approximately 8 hours reviewing advance materials, testing, and providing feedback. Including approximately 20 minutes (0.3 hours) per site for recruitment, the total burden is estimated at 965 hours, as shown in the burden table below (the totals are recomputed in the sketch that follows the table).

Phase                               | Estimated Number of Respondents | Estimated Number of Responses | Estimated Burden Time Per Respondent (Prep + Participation) [hours] | Total Estimated Burden Time [hours]
------------------------------------|---------------------------------|-------------------------------|---------------------------------------------------------------------|------------------------------------
Phase 1 Recruitment                 | 20                              | 20                            | 0.3                                                                 | 6
Phase 1 Participation (at 10 sites) | 40                              | 40                            | 2 (1 + 1)                                                           | 80
Phase 2 Recruitment                 | 30                              | 30                            | 0.3                                                                 | 9
Phase 2 Participation (at 20 sites) | 60                              | 60                            | 2.25 (0.75 + 1.5)                                                   | 135
Phase 3 Recruitment                 | 50                              | 50                            | 0.3                                                                 | 15
Phase 3 Participation (at 45 sites) | 90                              | 90                            | 8                                                                   | 720
Total                               | 290                             | 290                           | -                                                                   | 965
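
As a check on the arithmetic, the table's totals can be recomputed from its per-row figures. This is a simple verification sketch in Python, with the figures copied from the table above:

    # Recompute the burden table totals from the per-row figures above.
    rows = [
        # (phase,                  respondents, burden hours per respondent)
        ("Phase 1 Recruitment",    20, 0.3),
        ("Phase 1 Participation",  40, 1 + 1),       # 1 h prep + 1 h interview
        ("Phase 2 Recruitment",    30, 0.3),
        ("Phase 2 Participation",  60, 0.75 + 1.5),  # 45 min prep + 90 min interview
        ("Phase 3 Recruitment",    50, 0.3),
        ("Phase 3 Participation",  90, 8),
    ]

    total_respondents = sum(n for _, n, _ in rows)
    total_hours = sum(n * hours for _, n, hours in rows)
    print(total_respondents, total_hours)  # 290 965.0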


  9. Estimate of Cost Burden

There is no direct cost to respondents. The total cost of this research to the federal government, including planning, data collection, and reporting, is approximately $330,000 (Site Visits $140,000; Cognitive Interviews $90,000; and Pilot Test $100,000).

  10. Reporting Plans

An internal summary report will be prepared identifying problems and suggesting improvements that could be made to reduce burden and improve data quality. Phase 1 site visits will also have site-specific summaries, which may be shared with the participating site, if requested.

  11. Project Schedule

Recruitment will begin upon receipt of OMB approval. Visits and interviews will begin as soon as possible thereafter. All research is expected to be completed by November 1, 2016.
