Responses to OMB questions

Forensic Comments-responses 5-5-08.doc

Survey of Law Enforcement's Forensic Backlogs


OMB: 1121-0320


OMB COMMENTS: 200803-1121-002 OJP Survey of Law Enforcement's Forensic Evidence Processing

What is the purpose of this collection? What will you do with the results? Clear backlogs, or is there any other practical utility?


The survey will collect information on the backlog of criminal cases containing forensic evidence in law enforcement agencies and on the challenges that inhibit submission of that evidence to crime laboratories. Past DOJ surveys have identified sizable backlogs in our nation’s crime laboratories, but there is little current information on how many unsolved cases in law enforcement agencies contain forensic evidence that has never been sent to crime laboratories for analysis. A better understanding of this issue can have several practical benefits. The results can be used to describe to policy makers the size and scope of the problem, which in turn can lead to possible solutions. These could include providing additional resources to agencies to submit and analyze forensic evidence in unsolved cases. Analyzing the evidence in some of these unsolved cases could generate new investigative leads, which in turn could result in cases being solved.


Policy makers are interested in discussing solutions to the added burden on law enforcement and crime laboratories, but they need a more complete understanding of the major dimensions of the problem: How big is the case backlog? What is the capacity of the evidence collection, storage, analysis, and retrieval system to reduce that backlog and provide timely information to the criminal justice system? This data collection can help provide answers to these questions.

Is this a recurring collection, or a one-time collection?

This survey represents a one-time collection of information.

Did you receive PRA approval for the pilot tests? When will you start the pilot?

We did not receive PRA approval for the pretest, as we collected responses from only seven agencies.


PART A:

A9: What is this incentive that you want to provide, and why are you providing an incentive?

Originally, we planned to include a mouse pad with the survey mailing that would include the project’s web address as well as phone number to call with questions. With knowledge gained from the pre-test and for budgetary reasons, we have decided that we will not include an incentive with this survey.

A12: Where did you get the burden estimate from? Small pre-test?

During the small pretest, it became apparent that some questions may require that respondents coordinate with multiple units within their agency to obtain the requested information. In completing the survey during the pretest, respondents suggested that the total burden time should be changed from 30 minutes to 60 minutes, which is the revised burden time we are estimating.

What are the details of the pretesting and results?

The following summarizes the results of the pretesting.


  1. Time required to complete the survey. During the pretest, multiple agencies commented that 30 minutes was not realistic for gathering the requested information. One agency commented that the survey was short (which was good) but fairly deep. Each of these agencies commented that the estimated time required to complete the survey should be changed to 60 minutes. We have made this change in the burden estimate presented on the survey.


  2. Agencies unclear about who should be involved in providing the requested information. Several agencies were not clear about who within the agency should be involved in completing the survey. In several instances during the pilot test, the survey ended up with the crime lab, which is not the appropriate group for completing it. As a result, we are providing more detailed directions (in the cover letter, on the website, and in the survey instructions) to ensure the survey reaches the appropriate staff within the agency.


  3. Definitions for crime categories. Several agencies had to add up multiple categories to provide the information for the first question. One suggestion was to use UCR categories to define the relevant offense categories so that respondents do not have to sum multiple sub-categories to complete the survey. This change was made in the final survey.


  4. Challenges associated with answering backlog questions with a 20-year reference period. Many of the agencies completing the pilot survey had difficulty answering the backlog questions, which asked about open cases over the past 20 years. Several agencies commented that a shorter window would still capture the vast majority of relevant cases, since it is very rare for cases older than 5 years to be reopened for further investigation. As a result of these comments, the reference period for these questions was changed to the past 5 years.

Are you assuring confidentiality, or not? If yes, please cite statute.

In order to increase the response rate, NIJ does not intend to publish individual agency names or responses. NIJ intends to aggregate the responses for publication. This programmatic decision is not based on any specific statutes.


What are the planned publications, data analysis that you plan to produce? Include this with A16.

An NIJ final report will be produced that includes a full description of the study design, data, and methodology; a discussion of findings; and implications for policy and practice. The data will be analyzed using basic frequencies and cross-tabulations that describe key features about the forensic evidence process and associated case backlogs. Electronic versions of the survey data and supporting documentation, including a comprehensive codebook, will be produced. Specific agencies will not be identified in the NIJ report or the public-use data file.


How did you choose 49 state agencies? Why was 1 left out?

There are only 49 state law enforcement agencies. Hawaii is not included because it primarily performs court-related functions rather than investigative, law enforcement-based functions.

PART B:

What is your expected response rate?

An 85% response rate is expected.


Provide cover letters and other supplementary documents.

The cover letter to be sent along with the first survey mailing is attached.


Why will you mail questionnaire if you specifically want web based responses? The language here indicates you won’t be identifying them, but it is not clear if you are assuring confidentiality or not. Please clarify.

While we will be emphasizing the web-based survey option as the preferred method of survey completion, we recognize that not all agencies will have computers readily available for this purpose. We have also found in past surveys that many respondents, particularly respondents in smaller agencies, still prefer to respond by mail or by fax. In order to ensure a greater response rate, we are making multiple modes of collection available.

B5: Are you still in the process of developing the methodology, or is it complete?


The survey methodology is complete. Our survey data collection plan is designed to maximize response rates and data completeness/accuracy while keeping respondent burden as low as possible. The survey will employ a mixed-mode data collection approach that includes a web-based survey, a hardcopy survey, and telephone follow-up.


A hardcopy of the survey will be sent to all sampled agencies at the outset of data collection. A cover letter on NIJ letterhead and signed by NIJ will accompany this survey. These materials will describe the goals and importance of the project and provide general instructions for participation. The instructions will provide details on available methods of response, including the URL for the project website and an agency-specific user ID and password for the web survey. A project telephone number and email address will also be provided for agencies that have technical or content-specific questions. One month after the first mailing, a second mailing will be sent to all non-respondents.


Two weeks after the second survey mail-out, the contractor will begin conducting follow-up calls to encourage respondents to complete the survey online, return the hardcopy survey, or complete the survey over the phone. We will use these calls to explore and overcome possible obstacles to participation.


The project website will require respondents to log in with a username and password to complete the survey. The website will use Secure Sockets Layer (SSL) to encrypt data in transmission, and data submitted via the website will be stored in a secure SQL database.


SURVEY INSTRUMENT:

Instructions: Do you intend to say “agencies”, and not “agency’s”?

Yes.

Item 5: You may want to move this definition to item 3 and not repeat in item 5.

Agreed, please see revised survey instrument.

There are two items numbered 9; the second one, as well as item 11, seems to be a double-barreled question. Please consider breaking these questions into two parts to get meaningful and useful responses.


Item 11 and item 12 were removed from the survey. As for item 9, we could not find a way to logically break it into two questions. The current wording for item 9 was understood by respondents during the pretest and so we decided to keep it as is.


File Type: application/msword
File Title: Forensic:
Author: Achanta_C
Last Modified By: jonesj
File Modified: 2008-05-05
File Created: 2008-05-05
