Generic Clearance for Cognitive, Pilot and Field Studies for Bureau of Justice Statistics Data Collection Activities

OMB: 1121-0339


  1. JUSTIFICATION

1. Importance of Information

The Bureau of Justice Statistics (BJS) is requesting a 3-year generic clearance to continue its efforts to develop, test, and improve its survey design and data collection instruments (e.g., questionnaires and paper and online surveys) and methodologies for new and ongoing projects. BJS will use different procedures to complete various statistical design and developmental activities, including (but not limited to): pretesting of different types of survey and data collection methodologies; focus groups; cognitive laboratory activities; pilot testing; field testing; exploratory interviews; experiments with questionnaire design; and usability testing of electronic data collection instruments. BJS requests this clearance to conduct statistical activities associated with various projects, which are listed and described in Question #2.


BJS collects self-reported data directly from respondents through paper and online surveys, via phone, and through in-person interviews and focus groups. BJS respondents include individuals, agencies, and other establishments operating in the law enforcement, corrections (including community corrections), and victimization fields. BJS also acquires and uses administrative records that are maintained by criminal justice agencies as a routine part of their operations.


BJS initiated its first generic clearance in 2013 to support its statistical work and received subsequent approval from OMB in 2016 to continue using the clearance to complete developmental activities associated with evaluating and improving its survey design and data collection instruments and methodologies. Prior to using a generic clearance, BJS relied on convenience samples of 9 or fewer persons to provide input and feedback on survey design and data collection methodologies, or requested full OMB clearance for such methodological work. BJS determined that these approaches resulted in lengthy delays to data collection implementation and involved duplicative or additional resources, e.g., staff time and project costs. Additionally, convenience samples do not provide a sufficient basis for conducting any type of test and do not provide reliable generalizability. The information collected via these methods, while helpful, is inadequate in many situations and limited in its ability to detect and diagnose problems with the instruments and the procedures being tested. This lack of rigor, low reliability, and minimal generalizability would be unacceptable for a federal statistical agency.


The generic clearance is a more effective approach that will continue allowing BJS to take advantage of a variety of statistical methods that have been proven useful and effective for identifying data collection and procedural problems, informing solutions, improving the quality of BJS’s data collection instruments and methodologies, and measuring the usability and reliability of the survey design. Additionally, pretesting enables BJS to assess respondent burden and identify how and where it can be reduced through improved data collection procedures and survey design. This clearance is similar to the testing clearances held by other federal statistical agencies, including the Census Bureau, the Bureau of Labor Statistics, the National Center for Education Statistics, and the National Center for Science and Engineering Statistics.


BJS will use various means to collect information to complete a range of statistical activities and methods to support the projects included in this generic clearance. These activities include, but are not limited to –


  • Behavior coding – involves applying a standardized coding scheme to the completion of an interview or questionnaire, either by a coder using a tape-recording of the interview or by an in-person observer at the time of the interview. The coding scheme is designed to identify situations that occur during the interview that reflect problems with the questionnaire. For example, if respondents frequently interrupt the interviewer before the question is completed, the question may be too long. If respondents frequently give inadequate answers, this suggests there are some other problems with the question. Quantitative data derived from this type of standardized coding scheme can provide valuable information to identify problem areas in a questionnaire, and can be used as a substitute for or as a complement to the traditional interviewer debriefing.

  • Cognitive and usability interviews – involve intensive, one-on-one interviews in which the respondent is typically asked to "think aloud" as he or she answers survey questions. A number of different techniques may be involved, including asking respondents to paraphrase questions, probing questions asked to determine how respondents came up with their answers, and so on. The objective is to identify problems of ambiguity or misunderstanding, or other difficulties respondents have answering questions. This is frequently one of the early stages of revising a questionnaire.

  • Exploratory interviews – may be conducted with individuals in the very early stages of survey development to better understand a topic area. These interviews may cover discussions related to administrative records (e.g., what types of records, where, and in what format), subject matter, definitions, etc. Exploratory interviews may also be used to investigate whether sufficient issues are present related to an existing data collection to consider a redesign.

  • Focus groups – involve group sessions guided by a moderator, who follows a topical outline containing questions or topics focused on a particular issue, rather than adhering to a standardized questionnaire. Focus groups are useful for surfacing and exploring issues (e.g., confidentiality concerns) that people may feel some hesitation about discussing.

  • Follow-up interviews or re-interviews – involve re-interviewing or re-assessing a sample of respondents after the completion of a survey or assessment. Responses given in the re-interview are compared with the respondents’ initial responses for consistency; re-interviews provide data for studies of test–retest reliability and other measures of data quality (a small illustrative consistency check appears after this list). In turn, this information aids in the development of improved, more reliable measures.

  • Interviewer debriefings – employ the knowledge of the employees who have the closest contact with the respondents. BJS will use this method in conjunction with other methods in its field tests to collect information about how interviewers react to the survey instruments, as well as to explore problems and issues encountered by interviewers during the interview.

  • Pilot testing – defined, for purposes of this clearance, as data collection efforts that are conducted among purposive or statistically representative samples for evaluative purposes. BJS conducts pilot testing to evaluate its data collection instruments and/or procedures. Pilot tests are an essential component of this clearance package because they serve as the vehicle for investigating basic item properties for new or redesigned data collection efforts, such as reliability, validity, and difficulty. Pilot tests can also be used to assess the feasibility of methods for standardizing the administration of data collection instruments and to test procedures regarding data procurement as well as comparability of data across sites. BJS will use results from pilot tests to publish research and development (R&D) and methodological reports, but will not publish statistical reports or data sets based on the findings.

  • Respondent debriefing questionnaires – are administered at the end of the data collection instrument to respondents who have participated in a field test. The debriefing form contains probing questions to determine how respondents interpret the questions and whether they have problems completing the data collection instrument. This structured approach to debriefing enables quantitative analysis of data from a representative sample of respondents, to learn whether respondents can answer the questions, and whether they interpret them in the manner intended by the questionnaire designers.

  • Split-sample experiments – involve testing alternative versions of questionnaires, and other collection methods, at least some of which have been designed to address problems identified in draft questionnaires or questionnaires from previous surveys. The use of multiple questionnaires, randomly assigned to permit statistical comparisons, is the critical component here. Data collection can include mail, telephone, Internet, or personal visit interviews or group sessions at which self-administered questionnaires are completed. Comparison of revised questionnaires against a control version, preferably, or against each other, facilitates statistical evaluation of the performance of alternative versions of the questionnaire. Split-sample tests that incorporate questionnaire design experiments are likely to have a larger maximum sample size than field tests using other methodologies. Larger sample sizes will enable the detection of statistically significant differences and facilitate methodological experiments that can extend questionnaire design knowledge more generally for use in a variety of BJS data collection instruments.
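
For illustration only, the sketch below (in Python, with invented sample sizes and counts) shows the kind of comparison a split-sample experiment supports: a two-proportion z-test of item response rates for a control versus a revised questionnaire version. The specific test, figures, and variable names are assumptions for this example, not a description of any particular BJS test plan.

    # Hypothetical split-sample comparison: two-proportion z-test on item
    # response rates for a control vs. revised questionnaire. Figures invented.
    from math import sqrt, erf

    def two_proportion_z(successes_a, n_a, successes_b, n_b):
        """Return the z statistic and two-sided p-value for H0: p_a == p_b."""
        p_a, p_b = successes_a / n_a, successes_b / n_b
        p_pool = (successes_a + successes_b) / (n_a + n_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
        return z, p_value

    # Suppose 412 of 500 respondents answered the item on the control form and
    # 451 of 500 answered it on the revised form.
    z, p = two_proportion_z(412, 500, 451, 500)
    print(f"z = {z:.2f}, p = {p:.4f}")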
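
Similarly, for the re-interview method described earlier in this list, the following minimal Python sketch computes simple agreement and Cohen's kappa between initial and re-interview answers to a single yes/no item; all responses are invented for illustration.

    # Hypothetical test-retest consistency check for one yes/no survey item.
    initial = ["yes", "no", "yes", "yes", "no", "no", "yes", "no", "yes", "yes"]
    reinterview = ["yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes", "yes"]

    n = len(initial)
    p_observed = sum(a == b for a, b in zip(initial, reinterview)) / n

    # Chance agreement for Cohen's kappa, from the marginal "yes" proportions.
    p_yes_1 = initial.count("yes") / n
    p_yes_2 = reinterview.count("yes") / n
    p_chance = p_yes_1 * p_yes_2 + (1 - p_yes_1) * (1 - p_yes_2)

    kappa = (p_observed - p_chance) / (1 - p_chance)
    print(f"observed agreement = {p_observed:.2f}, kappa = {kappa:.2f}")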


Procedures for Clearance


Prior to initiating any methodological testing described in this clearance, BJS will provide individual clearance packages for each project to OMB that describe the planned work and statistical activities that will be conducted. The package will include a project description, methodological overview, project timeline, the number of anticipated respondents and burden, and cost estimates in addition to copies of the data collection instruments and debriefing materials. These materials may include a set of prototype items showing each item type to be used; the range of topics to be covered by the data collection instrument; and an interview script. BJS will also provide other relevant documentation to support or describe the proposed statistical activities, e.g., different versions of the data collection instrument that will be used for split-sample experiments (either for small group sessions or as part of a field test) or the description and rationale for procedures when conducting a test of alternative procedures.


BJS requests that OMB provide and return to BJS written comments on substantive issues within 10 working days of receipt to allow BJS to respond in a timely manner and avoid delays to its data collections schedules.


2. Needs and Uses


BJS anticipates conducting data collection design and methodological work under this generic clearance to support various BJS projects. The projects listed below (grouped by major topic area) are those BJS is considering conducting, but the list is not intended to be exhaustive of all the projects that may be conducted during the active period of the clearance:


BJS Corrections data

  • Annual Survey of Probation (ASP) – currently, BJS obtains population, admissions, releases, and some criminal justice and demographic counts from probation agencies that supervise at least one felon. These same agencies are reporting for all of their probationers – felons and misdemeanants – and BJS would like to know whether the respondents can distinguish between these two types of offenders for the purposes of reporting demographic characteristics. At the same time, there may be hundreds of misdemeanant-only probation agencies that BJS does not survey on an annual basis, but from which BJS is interested in getting at least some population information to better describe the number of persons in the U.S. who are involved in the criminal justice system. BJS plans to conduct cognitive interviews to test modifying and splitting the form. These cognitive interviews would have two parts: 1) 75 current ASP respondents would be asked whether they could provide a breakdown of demographics and criminal justice characteristics for only those felons under their supervision, and if so, what would be the additional burden, and 2) an additional 50 misdemeanant-only agencies not currently reporting to BJS would be contacted and asked whether they could provide population counts for January 1 and December 31 of each year, as well as counts of total admissions to and releases from probation over a calendar year. Both tests should take no more than one hour per respondent, for a total of 125 hours.


  • Census of Jails – BJS would like to test the usability of its web data collection form for the Census of Jails using 10 jail administrators from a variety of facilities: single-county, regional, multijurisdictional, and private. BJS’s data collection agent will contact the respondents for a telephone debriefing after the initial data provision, so the burden for each respondent is estimated at two hours, or 20 hours total.


  • Mortality in Correctional Institutions (MCI) – depending on the outcomes of the Bureau of Justice Assistance’s collection of deaths in state prisons and local jails, and after discussions with OMB, BJS may investigate greatly shortening the current MCI forms. BJS is interested in asking several questions of state and local correctional facility administrators about their ability and willingness to provide BJS with a minimal number of personal identifiers for each decedent, including name; dates of birth, death, and facility admission; and Social Security number. BJS could then purchase death certificate information from the CDC’s National Death Index (NDI), reducing the need to burden respondents with requests for this information. To achieve a good match with NDI data, however, good-quality identifiers are needed. This short set of questions will take about 15 minutes to administer as part of annual verification calls made to all 3,051 MCI respondents, for an approximate total of 763 burden hours.


  • National Corrections Reporting Program (NCRP) – in 2016, 29 jurisdictions provided the date on which each of their prisoners is eligible for release to parole supervision as part of their NCRP yearend custody records. Informal conversations with NCRP respondents have made clear that states are not providing comparable data; one state may provide the very first date a person becomes eligible for parole, while another provides the actual parole board hearing date. BJS will conduct brief interviews with the 50 respondents to NCRP to obtain information about what data they currently provide, and what data they could provide. BJS will then use this information to establish a clearer definition of the variable so that data are consistent across states. BJS estimates each interview will take about an hour, for a total estimated respondent burden of 50 hours.


  • National Prisoner Statistics program (NPS-1B) – BJS plans to conduct cognitive testing in several areas, including new survey questions, splitting of the NPS-1B form into two parts, and providing pre-populated forms to respondents. Specific examples include –

    • BJS anticipates that it may need to test several new questions on the NPS-1B form during the next three years, including potentially adding a few questions on departments of corrections’ testing and treatment of Hepatitis C. To address this, BJS plans to conduct cognitive interviews on the new questions. This hour-long cognitive test would involve all 51 departments of corrections, for a total of 51 burden hours.

    • The response rate to the annual NPS-1B aggregate prison statistical collection remains close to 100%, but response times have increased over the years. While BJS requests that respondents submit data by February 28th of each year, fewer than half the states were able to meet this deadline in 2018, and critical states like California and Texas were not able to submit data until June. This causes delays in the publication of BJS’s annual Prisoners report and release of the data to the public. BJS proposes two cognitive tests to try different methods of encouraging states to submit data earlier in the year.

    • Some states have said that they require more time to process the counts of prison admissions and releases, as opposed to single-day population counts. BJS will ask 10 state respondents to initially fill out yearend population numbers by sex on a single form and submit this, followed by a second form two months later that requests admission and release data. Under this model, BJS could release population counts earlier in the year, and use its Corrections Statistical Analysis Tool – Prisoners (CSAT) to release updated admission and release counts later. BJS believes that this cognitive test will require 2 hours of data preparation and reporting, for a total of 20 hours.

    • Cognitive test of providing pre-populated forms to respondents – Several states publish population counts on their websites on a weekly or monthly basis (see, for example – https://www.cdcr.ca.gov/Reports_Research/Offender_Information_Services_Branch/WeeklyWed/TPOP1A/TPOP1Ad190116.pdf). Not all published counts are comparable to BJS definitions, and some are calculated for the fiscal year instead of the calendar year. However, if the ratio between the counts on the website and what was ultimately provided on the NPS-1B form is stable over time, BJS could calculate estimates more quickly than the state would normally provide a response. Ultimately, BJS could provide state respondents with pre-populated forms for verification or correction by the respondents, theoretically reducing burden if the estimation models are appropriate. After identifying states with available data, BJS would develop estimation models, and then send a pre-populated NPS-1B form to up to 20 states when data collection begins for the 2020 reference year. Follow-up interviews with the respondents would include questions on whether the pre-populated forms reduced burden. The introduction of this test and the interviews require no more than one hour of additional burden on respondents, for a total of 20 hours.
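
A minimal Python sketch of the ratio idea described above, under the hypothetical assumption that three years of a state's published counts and final NPS-1B yearend counts are available; all figures are invented for illustration.

    # Hypothetical ratio model: if the ratio of the final NPS-1B yearend count
    # to the state's published count has been stable, apply the mean ratio to
    # the latest published count to pre-populate the form.
    published = {2016: 129_400, 2017: 130_900, 2018: 128_100}    # website counts
    nps_yearend = {2016: 130_390, 2017: 131_880, 2018: 129_120}  # final NPS-1B counts

    ratios = [nps_yearend[year] / published[year] for year in published]
    mean_ratio = sum(ratios) / len(ratios)
    stability = max(ratios) - min(ratios)

    latest_published = 127_600  # hypothetical current website count
    prefill_estimate = round(latest_published * mean_ratio)

    print(f"mean ratio = {mean_ratio:.4f} (spread {stability:.4f})")
    print(f"pre-populated yearend estimate = {prefill_estimate:,}")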


  • National Survey of Children’s Exposure to Violence (NatSCEV) – BJS, in partnership with the Office of Juvenile Justice and Delinquency Prevention (OJJDP), is expecting to conduct a pilot test for the methodological redesign of NatSCEV. The goal of NatSCEV is to produce national estimates of the incidence and prevalence of exposure to violence, among children ages 0 to 17, across the settings of home, school, and the community. The pilot test will inform the data collection protocol for the next national data collection, pending eventual funding. The NatSCEV was previously administered via random-digit-dial (RDD) designs in 2008, 2011, and 2014. NatSCEV has been experiencing a decline in response rates; in 2014 the response rate from the landline frame was 14.7% and from the cell phone frame it was 9.7%.

  • The pilot test will examine the feasibility of using an address-based sample (ABS) in future NatSCEV collections.

  • The mode of data collection for the ABS design will be primarily web with the exception of a paper questionnaire option for non-responding parents. However, youth ages 12 to 17 will only be invited to participate in the web version in order to protect their confidentiality within their household.

  • Non-response follow-up by phone is expected in eligible households to boost youth response rates. Additional follow-up procedures will be tested to improve household and youth response rates.

BJS expects to draw a random sample of approximately 3,000 households. An estimated 210 households are expected to provide a household roster, which is expected to yield approximately 52 completed youth surveys. The pilot test will be limited to households with youth ages 12 to 17. The burden is expected to include 230 hours for sampled households (at 5 minutes per household), 35 hours for parents in eligible households who complete the roster (at 10 minutes for each responding household), and approximately 9 hours for youth (at 10 minutes for each youth). The total respondent burden of the pilot test is expected to be 274 hours.
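
As an arithmetic check only, the short Python sketch below recomputes the pilot burden from the components stated above; the household screening total is taken as stated in the text, and the parent and youth components follow from the stated per-case minutes.

    # Recompute the stated NatSCEV pilot burden components (hours).
    roster_households, youth_completes = 210, 52

    screening_hours = 230                              # stated for the ~3,000 sampled households
    parent_hours = round(roster_households * 10 / 60)  # 10 minutes per responding household -> 35
    youth_hours = round(youth_completes * 10 / 60)     # 10 minutes per completed youth survey -> 9

    print(screening_hours + parent_hours + youth_hours)  # 274 total burden hours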


  • Survey of Inmates in Local Jails (SILJ) – BJS anticipates conducting cognitive testing of new survey questions and the inmate consent process, in different sites. Specific examples include –

    • BJS plans to field the SILJ in 2021, after a redesign of the instrument used in the most recent survey in 2002. New questions will include asking inmates in greater depth about opioid use and mental health issues. In addition, BJS will test a consent form similar to that used in the 2016 Survey of Prison Inmates (SPI) that would ask inmates for permission to (a) link their survey data to criminal history data obtained by the FBI (RAP sheets); (b) link their survey data to other government administrative records on employment, income, housing, health care, etc. that are housed at the Center for Administrative Records Research and Applications (CARRA) at the U.S. Census Bureau; (c) link to both sets of records; or (d) refuse permission to do any linkage with external records. BJS believes that a cognitive test of 18 inmates lasting approximately 130 minutes per interview for a total of 38 hours will allow BJS to determine whether the SILJ instrument and consent form are understandable and functional as intended.

    • BJS will pretest the SILJ in several jails on a rolling basis to incorporate any changes that need to be made based on issues observed in the previous group of interviews (timing of survey, consent forms, performance of Spanish language instrument). BJS anticipates conducting the one-hour pretest on approximately 450 local jail inmates, for a total of 450 burden hours.


BJS Law Enforcement data

  • Analysis of Publicly Available Court Data (APACD) – BJS intends to test the collection of data from publicly available state court sources, including data extracts, freely available websites, and restricted access websites, among other potential sources. BJS plans to do this with 10 sources (states or sampled units within states) and clean and standardize the collected data into usable files that are comparable across jurisdictions. BJS estimates that, for each of the 10 sampled jurisdictions, the burden will average 15 hours, for a total burden estimate of 150 hours.


  • Census of Publicly Funded Forensic Crime Laboratories (CPFFCL) – BJS has conducted the CPFFCL approximately every 5 years since 2002. The most recent CPFFCL was administered in 2015, and BJS intends to field the CPFFCL in 2020. The CPFFCL is designed to capture information on laboratories’ staffing, budgets, caseload, procedures, and policies regarding evidence analysis and record retention. Historically, the CPFFCL included only publicly funded laboratories that employed one or more full-time scientists with degrees in the natural sciences. For the 2020 CPFFCL, BJS intends to expand the frame and questionnaire to include publicly funded laboratories that process only digital evidence. This will require BJS to conduct a larger frame verification effort to ensure that all eligible labs are included as well as to substantially revise the survey instrument to address the work of digital evidence labs. This revised instrument along with the original instrument will need to be cognitively tested to ensure its relevance, validity, and answerability. The burden estimate for the frame development work is 125 hours, and for the cognitive testing it is 60 hours, for a total estimated burden of 185 hours.


  • Census of State and Local Law Enforcement Agencies (CSLLEA) – the CSLLEA generates an enumeration of all publicly funded state, county, local, and tribal law enforcement agencies operating in the United States and provides complete personnel counts for the approximately 18,000 law enforcement agencies operating nationally. BJS has conducted the CSLLEA periodically since 1986, with the next iteration going into the field in 2022. The CSLLEA serves as one of the core law enforcement collections at BJS and has two primary purposes: to provide personnel counts and the functions of all law enforcement agencies in the U.S., and to serve as a frame for BJS’s law enforcement surveys, including the LEMAS core and supplements. To prepare for the 2022 CSLLEA, BJS will need to update the agency list to determine which law enforcement agencies are still in service and make sure the point of contact information is current. Additionally, cognitive testing of the 2022 instrument will be done with approximately 45 agencies. The total anticipated number of respondents is 18,045 and the total burden estimate to update the frame and to conduct the cognitive testing is 4,545 hours.


  • Law Enforcement Management and Administrative Statistics (LEMAS) – the LEMAS core survey is presently the most systematic and comprehensive source of national data on law enforcement personnel, expenditures and pay, operations, equipment, computers and information systems, and policies and procedures. The LEMAS core survey provides national estimates for all state and local general purpose law enforcement agencies based on a nationally representative sample of agencies. LEMAS surveys have been conducted periodically since 1987, and data collected through the surveys have provided information on current issues and trends in law enforcement practices in the United States. The next LEMAS will be administered in 2020 using a redesigned data collection instrument and will be tested in an anticipated 50 agencies. The burden estimate to conduct cognitive testing of the new instrumentation is 75 hours.


  • National Survey of Prosecutors (NSP) – BJS is planning to conduct the NSP, which has not been successfully fielded since 2007. There have been changes in the field of prosecution since that time, and the survey should be modified to address those changes. BJS will seek clearance for cognitive interviews with at least 30 prosecutors to test the usability of the instrument, the capacity of respondents to provide the information, and the level of burden associated with reporting the information prior to implementing it on a national scale. The findings will be used to make the necessary changes to the instrument to enhance its functionality, improve the quality of the data collected, and minimize burden prior to national implementation. BJS would also seek to do some frame development work; because the NSP was last fielded in 2007, there may have been changes to the frame in the interim. The project team may require contact with up to 1,500 prosecutor offices to verify address information and update the contact person for the survey. The total anticipated number of respondents is 1,530 and the total burden estimate to update the frame and to conduct the cognitive testing is 435 hours.


  • Census of Problem-Solving Courts (CPSC) – the opioid crisis is a pressing issue, and the role of problem-solving courts is vital to addressing the intersection of drugs and crime. While there is anecdotal evidence that most crime handled in state courts includes some tie to drugs, there is little ability to examine this in state court data (e.g., a person steals a television to pawn to support a drug habit; the person is charged with theft or burglary, not a drug crime). Drug courts are one of the most direct avenues for examining the intersection of drugs and crime. The Census of Problem-Solving Courts in 2012 was a complete enumeration of all types of problem-solving courts: drug, mental health, DWI, juvenile, veterans, and domestic violence courts, among others. The next survey will focus on the problem-solving courts most likely to address the opioid epidemic, such as drug and veterans courts, and take a deeper look at the types of cases admitted to problem-solving courts, the progress of defendants through those courts, and the success rates for one particular year of those courts. BJS will use this generic clearance to complete frame development and to confirm the number of eligible courts with state and county problem-solving court coordinators. BJS also plans a cognitive test of the survey instrument with at least 30 state or county problem-solving court coordinators. The total anticipated number of respondents is 66 and the total burden estimate to update the frame and to conduct the cognitive testing is 66 hours.


  • Survey of Campus Law Enforcement Agencies (SCLEA) – campus police departments are the most common special-purpose law enforcement agencies (LEAs) in the U.S. BJS conducted its first SCLEA in 1995 and administered two more waves in the 2004-05 and 2011-12 school years. The survey instrument will be updated and cognitively tested to refine the survey content and assess the feasibility of the items and wording of questions. The 2018 Census of State and Local Law Enforcement Agencies (CSLLEA) will provide information on LEAs with sworn personnel at public universities, but additional work will be needed on the frame to identify campuses that have security departments with non-sworn officers and private universities with either sworn or non-sworn personnel. The burden estimate is 45 hours and 30 respondents for cognitive testing and 300 hours and 1,200 respondents for frame development, for a total of 1,230 respondents and 345 hours.


  • Survey of Public Defenders (SPD) – BJS is planning to conduct the Survey of Public Defenders (SPD), which will survey practicing public defenders rather than obtain summary data from agencies. BJS has completed some of the design work under the Survey of Public Defenders: A Design Study. The instrument developed in that effort will need some revisions and cognitive testing, and can be included in a pretest of response rates. BJS is concerned about low response rates that a national survey could generate and will seek clearance to test different follow-up strategies with at least 300 attorneys. The work will involve cognitive interviews about the revised survey with a subset of respondents and will also involve some cognitive interviewing of non-respondents, which will be used to refine and improve outreach strategies. BJS will also need to contact public defender offices (about 200) to request lists of attorneys to generate the sampling frame. Depending on the outcome of the pilot test and cognitive interviews, BJS may require a second generic clearance request to test the revised outreach strategies recommended by the first outreach test. The total anticipated number of respondents is 500 and the total burden estimate to update the frame and conduct the pretesting activities is 500 hours.


BJS Law Enforcement Incident-Based Reporting data

  • National Law Enforcement Calls For Service (CFS) – BJS anticipates pursuing work to address the dearth of empirical information about CFS to law enforcement and what proportion of those CFS are recorded by police as crime incidents. The lack of quantifiable information about CFS results in an inability to accurately describe the workload of police officers, how that workload varies by place over time, and how much of that workload is related specifically to criminal offenses. Development work is needed to (1) identify the proportion of law enforcement agencies that receive CFS through some type of computer-aided dispatch (CAD) system, (2) determine if data from CAD systems can be used to develop an estimate of CFS for the nation, and (3) determine if data from CAD systems can be used to develop a national estimate of how many CFS are subsequently recorded by police as crime incidents. To accomplish these goals, preliminary work under the generic clearance is needed to survey a sample of about 50 law enforcement agencies, stratified by type and size, to understand the various ways agencies receive and record details about CFS, and to request CAD (or equivalent) technical specifications from a subset of agencies to evaluate methods for collecting CFS data from a nationally-representative sample of law enforcement agencies. The estimated maximum burden hours for 50 respondents to respond to a survey about their CFS and CAD systems and to provide technical specifications for CAD is approximately 200 hours.
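
As a sketch of the estimation such a stratified design could eventually support, the Python example below weights each stratum's sample mean CFS by the number of agencies in that stratum on the frame; the strata, frame counts, and sampled values are invented for illustration and do not reflect any actual BJS frame.

    # Hypothetical stratified estimate of national annual calls for service (CFS).
    frame_counts = {"large_police": 800, "small_police": 11_000, "sheriff": 3_000}

    sample_cfs = {  # annual CFS reported by the sampled agencies in each stratum
        "large_police": [310_000, 275_000, 402_000],
        "small_police": [4_200, 3_800, 5_100, 2_900],
        "sheriff": [38_000, 41_500, 29_000],
    }

    national_estimate = sum(
        frame_counts[stratum] * (sum(values) / len(values))
        for stratum, values in sample_cfs.items()
    )
    print(f"estimated national CFS: {national_estimate:,.0f}")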


BJS Victimization data

  • NCVS Instrument Redesign – BJS is expecting to test a potential redesign of the NCVS survey instrument and test alternative modes of administration. The NCVS was last redesigned in 1992. Prior generic clearances for this project (OMB Number 1121-0325) allowed for cognitively testing different sections of the NCVS instrument with adults and youth. BJS has requested an experimental comparison of the interviewer-administered instrument currently fielded by the Census Bureau, the redesigned interviewer-administered version, and the redesigned self-administered version developed as part of the prior generic clearances.



BJS will request full clearance for a field test with an experimental design comparing three versions of the NCVS instruments: two from the pilot test of field procedures (interviewer-administered versions of the current and redesigned NCVS) and a self-administered version of the redesigned questionnaire. The field test will involve administering the survey to a representative sample of persons age 12 or older, testing aspects of the design such as mode, victimization screener approaches, response rates, and administration times. BJS plans to conduct a small-scale pilot test of data collection field procedures and testing of the interviewer-administered version of the redesigned instrument alongside the current NCVS. The total anticipated number of respondents is 200, amounting to approximately 84 burden hours.
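
As one hedged illustration of how the three experimental arms could be compared on response rates (the counts and the choice of a chi-square test are assumptions for this example only), the Python sketch below computes a Pearson chi-square statistic for completion counts across the three instrument versions and compares it with the df = 2 critical value at the 0.05 level.

    # Hypothetical comparison of completion rates across three NCVS instrument
    # versions; (completed, sampled) counts are invented.
    arms = {
        "current_interviewer": (48, 70),
        "redesigned_interviewer": (52, 70),
        "redesigned_self_admin": (41, 60),
    }

    completed = sum(c for c, n in arms.values())
    sampled = sum(n for c, n in arms.values())
    overall_rate = completed / sampled

    # Pearson chi-square over the 3 x 2 table of completed vs. not completed.
    chi_sq = 0.0
    for c, n in arms.values():
        for observed, expected in ((c, n * overall_rate), (n - c, n * (1 - overall_rate))):
            chi_sq += (observed - expected) ** 2 / expected

    CRITICAL_VALUE = 5.991  # chi-square critical value, df = 2, alpha = 0.05
    verdict = "differs" if chi_sq > CRITICAL_VALUE else "does not differ"
    print(f"chi-square = {chi_sq:.2f}; completion rate {verdict} across arms at the 0.05 level")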


BJS Prison Rape Elimination Act of 2003 (PREA) data

  • National Inmate Survey, Prisons and Jails (NIS-4, Prisons; and NIS-4, Jails) – BJS plans to pursue development work for the next NIS surveys, as mandated by the PREA. Developmental work for the NIS-4 surveys is needed under the generic clearance in order to (1) conduct an initial pilot test of ACASI and PAPI instrumentation and survey collection protocols; (2) develop and test a survey addendum to which 5%-10% of sampled respondents will be assigned in an effort to provide an additional level of anonymity in the prison and jail setting; and (3) develop and test a facility characteristics form that collects data related to PREA standards and other factors associated with variations in sexual victimization. BJS intends to pilot test the NIS-4 instruments to (1) ensure the questions and response options do not cause any significant comprehension or recall problems for inmates; (2) ensure the instruments are performing appropriately and the skip logic is programmed accurately; (3) test the length of the surveys to ensure that they are running within specific time constraints; and (4) test sampling and interviewing protocols to ensure the procedures developed do not cause any significant burden to the facilities or staff. The estimated burden hours are 268 hours for pilot testing of the self-report surveys, 18 hours for the administration protocols, and 15 hours for facility questionnaire review and testing. The total burden estimate is 301 hours.


In most cases, data collection activities included in this clearance will be conducted under BJS’s authorizing legislation, 34 U.S.C. § 10131. BJS may also collect relevant data mandated by the PREA (P.L. 108-79). BJS will identify in each individual clearance package submitted to OMB and in materials that are provided to respondents the authorizing statutes associated with the specific project(s).


Consistent with 34 U.S.C. § 10134, BJS will only use data collected in conjunction with projects that are covered under this generic clearance to inform its statistical, methodological, and design work and will not use individual-level information for enforcement or compliance efforts or to make determinations about benefits. BJS will follow the applicable laws, regulations, policies, and other authorities that govern BJS data, which are summarized in the BJS Data Protection Guidelines (see https://www.bjs.gov/content/pub/pdf/BJS_Data_Protection_Guidelines.pdf).


Because the data collection instruments being tested under this clearance are still in the process of development, the data that result from these collections are not considered official BJS or other federal statistics. The data will not be made public and will be used only to inform statistical activities and data quality improvement efforts. The data may also be prepared for presentations related to survey methodology at professional meetings or publications in professional journals. BJS will not disclose individual-level information that could result in the identification of a specific respondent, or use the information for compliance, benefits determinations, or enforcement purposes.


3. Use of Information Technology


When the data collection tools being tested employ automated methods of data collection, the research conducted under this submission will also use automated data collection techniques. This clearance offers BJS the opportunity to test innovative technologies that may reduce respondent burden, achieve cost efficiencies, improve data quality and reliability, and increase the use of information technology.


4. Efforts to Identify Duplication


This research does not duplicate any other survey design or methodological work being done by BJS or other federal agencies. The purpose of this clearance is to enable and encourage additional research, which would not be done under other circumstances due to time and other resource constraints. This research will involve collaboration with staff from other agencies that are sponsoring surveys conducted by BJS, when applicable. The research may also involve joint efforts with staff from federal laboratory facilities. All efforts will be collaborative in nature, and no duplication in this area is anticipated.


To the maximum extent possible, BJS will make use of existing information and review results of prior evaluations of survey data before revising any data collection instruments. However, this information is of limited utility and will not be sufficient by itself to refine BJS’s data collection instruments; the additional pretesting and research activities covered under this generic clearance are still needed.


5. Minimizing Burden


This research will be designed as relatively small-scale data collection efforts to minimize the amount of burden required to improve data collection instruments and procedures, test new ideas, and refine or improve data collection methodologies. The results of the research conducted under this clearance are expected to improve the methods and instruments utilized in full scale studies and thereby improve information quality while minimizing burden to respondents and costs to the federal government.


6. Consequences of Less Frequent Collection


This clearance involves data collection research, design, and development activities for each survey or data collection named in the submission. BJS may add, change, or replace projects during the clearance period, but all subsequent activities would comply with the terms of this generic clearance. Absent this clearance, BJS would not be able to complete these statistical activities in a timely manner, and the quality of the methodological design and of the data collected in conjunction with the projects would potentially decline. In addition, BJS would not have a reliable way to assess and calculate the precise burden hours and costs associated with its survey and data collection efforts.


7. Special Circumstances


This collection meets all of the OMB guidelines. BJS does not anticipate any special circumstances.


8. Consultations Outside the Agency


The 60-day Federal Register notice was published on December 21, 2018 (83 FR, No. 245, pp. 65746-65747). The 30-day Federal Register notice was published on February 27, 2019 (84 FR, No. 39, pp. 6440-6441). No public comments have been received.


Consultation with staff from other federal agencies that sponsor surveys conducted by BJS will occur in conjunction with the testing program for the individual survey. BJS may also consult staff from federal laboratory facilities as part of joint research efforts. These consultations would include discussions concerning statistical topics such as potential response problems, clarity of questions and instructions, and other aspects of respondent burden. Additional efforts to consult with potential respondents to obtain their views on the availability of data, clarity of instructions, burden, etc., may be undertaken as part of the testing that is conducted under this clearance.


9. Paying Respondents


While none of the currently proposed projects involve the use of incentives, BJS may develop other projects where incentives could be used, in accordance with OMB guidelines. BJS may offer up to $40 for any cognitive labs and up to $75 for focus group participation. BJS may also propose incentive experiments in limited cases.


10. Assurance of Confidentiality


Consistent with the confidentiality provisions of 34 U.S.C. § 10231, BJS will only use the information gathered under this clearance for statistical or research purposes, and shall collect and report it in a manner that precludes its use for law enforcement or any purpose relating to a particular individual other than statistical or research purposes. All respondents who participate in research under this clearance will be informed that the information they provide may be used only for statistical purposes and may not be disclosed or used in identifiable form. The respondents will also be advised whether their participation is voluntary or mandatory, and BJS will inform respondents that an OMB number is required on the data collection instrument. BJS will communicate this information orally during in-person and telephone interviews and focus groups, in writing on recruitment and survey notification materials and data collection tool(s), and on web forms in a format that allows a respondent to print and retain a copy. All participants involved in cognitive research will be required to sign a statement affirming their understanding of the voluntary and confidential nature of their participation.


BJS will include either the standard BJS confidentiality pledge (for collections involving identifiable information) or data use assurance (for collections that do not involve identifiable information) in written correspondence that is sent to data providers and respondents:


Confidentiality pledge for BJS collections that involve personally identifiable information (PII) or information identifiable to a private person:

The Bureau of Justice Statistics (BJS) is authorized to conduct this data collection under 34 U.S.C. § 10132 <or other authority, as applicable>. BJS will protect and maintain the confidentiality of your personally identifiable information (PII) to the fullest extent under federal law. BJS, its employees, and its contractors will only use the information you provide for statistical or research purposes pursuant to 34 U.S.C. § 10134, and will not disclose your information in identifiable form to anyone outside of the BJS project team without your consent. All PII collected under BJS’s authority is protected under the confidentiality provisions of 34 U.S.C. § 10231. Any person who violates these provisions may be punished by a fine up to $10,000, in addition to any other penalties imposed by law. Further, per the Cybersecurity Enhancement Act of 2015 (6 U.S.C. § 151), federal information systems are protected from malicious activities through cybersecurity screening of transmitted data. For more information on how BJS and its contractors will use and protect your information, go to https://www.bjs.gov/content/pub/pdf/BJS_Data_Protection_Guidelines.pdf.


Data use assurance for BJS data collections that do not obtain PII or information identifiable to a private person:

The Bureau of Justice Statistics (BJS) is authorized to conduct this data collection under 34 U.S.C. § 10132 <or other authority, as applicable>. BJS, its employees, and its contractors will only use the information you provide for statistical or research purposes pursuant to 34 U.S.C. § 10134, and will protect it to the fullest extent under federal law. For more information on how BJS and its contractors will use and protect your information, go to https://www.bjs.gov/content/pub/pdf/BJS_Data_Protection_Guidelines.pdf.


11. Justification for Sensitive Questions


It is possible that some potentially sensitive questions may be included in data collection instruments that are tested under this clearance. The testing is designed to identify questions that respondents consider to be sensitive, determine the potential sources of sensitivity, and address concerns related to those questions, to the extent possible, before the survey design is finalized and the actual data collection instrument is administered. BJS will include in individual project clearance submissions the justification for any sensitive questions included in a project covered by this generic clearance.


12. Estimate of Hour Burden


BJS estimates that approximately 30,000 respondents will be involved in the exploratory, field test, pilot, cognitive, and focus group work covered by this generic clearance over the 3-year period, with a total estimated respondent burden of approximately 15,000 hours.


BJS will use a variety of data collection instruments and methods to complete the statistical activities covered under this clearance. The exact number of respondents and the different instruments and their length are not known at this time. BJS will include specific details and burden estimates in its clearance submissions for individual projects.
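
For individual submissions, burden estimates generally follow from the number of respondents and the expected minutes per response. The minimal Python sketch below shows that roll-up with invented project names and figures; it is illustrative only.

    # Illustrative burden roll-up: respondents x minutes per response, converted
    # to hours and summed across hypothetical projects.
    projects = {
        "cognitive interviews": (50, 60),        # (respondents, minutes per response)
        "usability test": (10, 120),
        "frame verification calls": (1_500, 15),
    }

    total_hours = 0.0
    for name, (respondents, minutes) in projects.items():
        hours = respondents * minutes / 60
        total_hours += hours
        print(f"{name}: {respondents} respondents x {minutes} min = {hours:,.0f} hours")

    print(f"total estimated burden: {total_hours:,.0f} hours")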


13. Estimate of Cost Burden


There is typically no cost to respondents for participating in the research being conducted under this clearance, except for a respondent’s time associated with completing the questionnaire or participating in an interview or focus group.


14. Cost to Federal Government


Due to the nature of the generic clearance terms and structure, BJS cannot precisely estimate the actual number of respondents, length of interview(s), and/or mode(s) of data collection for the work to be conducted under this clearance over the entire 3-year clearance period. Without that information, it is not possible to estimate in advance the cost to the federal government. Costs associated with each individual project will be covered by the statistical unit conducting the research and will be supported with BJS program funding for statistical and research work. BJS will include cost-related information and estimates in the individual project clearance submissions.


15. Reason for Change in Burden


In the 60-day notice, BJS estimated the statistical activities covered under this generic clearance would involve approximately 30,000 respondents for an estimated total respondent burden of 20,000 hours. BJS increased these estimates after identifying additional projects that will be undertaken during the 3-year clearance period and proposed higher numbers in the 30-day notice. However, after additional discussions, BJS is revising the estimates to approximately 30,000 respondents and a total of 15,000 burden hours.


16. Project Schedule


Due to the nature of this generic clearance, no single project timeline or schedule can be reported at this point. Major activities associated with the scope of work included in this clearance include data collection and methodological design efforts, data analysis and tabulation, and evaluative efforts. BJS will use the project findings to inform its statistical work. The information will not be the subject of estimates or other statistics in BJS reports, though it may be published (at the aggregate level) in research and development reports or be included as a methodological appendix or footnote in a report containing data from a larger data collection effort. The results of this research may also be prepared for presentation at professional meetings or publication in professional journals. BJS anticipates that project schedules will vary and that work will be conducted more or less continuously throughout the duration of the clearance.


17. Request to Not Display Expiration Date


No exemption is requested.


18. Exceptions to the Certification


There are no exceptions to the certification.
