2020 Census of Publicly Funded Forensic Crime Laboratories

OMB: 1121-0269

SUPPORTING STATEMENT – Part B



2020 Census of Publicly Funded Forensic Crime Laboratories (CPFFCL)



  1. Universe and Respondent Selection


The 2020 Census of Publicly Funded Forensic Crime Laboratories (CPFFCL) will use procedures successfully employed in the prior administration of 2014 CPFFCL to identify the universe of eligible laboratories. This data collection is directed to federal, state, county, and municipal crime laboratories that are funded solely by the government or whose parent organization is a government agency. The CPFFCL includes agencies that employ one or more full-time scientists (1) with a minimum of a bachelor’s degree in chemistry, physics, biology, criminalistics, or a closely related forensic science field, and (2) whose principal function is examining physical evidence in criminal matters and providing reports and testimony to courts of law regarding such evidence.


The initial frame for the 2020 CPFFCL was provided by BJS to RTI based on the 2014 CPFFCL. In addition, BJS supplied RTI with a list of recent recipients of grants under the Paul Coverdell Forensic Science Improvement Grants Program to identify any eligible laboratories on that list that were not on the 2014 CPFFCL frame. The Drug Enforcement Administration granted BJS permission to use its NFLIS-Drug (OMB Clearance #1117-0034) frame to ensure laboratories not on the 2014 CPFFCL frame but surveyed as part of NFLIS were not overlooked. Based on these resources, BJS plans to contact approximately 500 laboratories for the 2020 CPFFCL.


Based on an OMB generic clearance approval (1121-0339), RTI will soon conduct a frame verification effort. Of the 409 laboratories in the 2014 CPFFCL, 172 were surveyed as part of the 2019 NFLIS-Drug Survey of Crime Laboratory Drug Chemistry Sections. All but seven of those laboratories had at least some interaction with the NFLIS study team through prompting calls, data quality follow-up calls, or nonresponse follow-up calls. The verification effort will include those seven laboratories that were not contacted for or did not respond to the 2019 NFLIS survey and the 237 forensic laboratories that were not part of the 2019 NFLIS survey effort because they lacked drug chemistry sections. As part of obtaining this information, the project team will conduct internet searches and use the list of recent Paul Coverdell grantees. Because this is a community that is regularly surveyed by BJS and by DEA, it is not anticipated that internet searches and list comparisons will yield more than 50 new laboratories. Thus, we anticipate about 300 laboratories will require verification calling efforts. BJS expects the verification effort to commence in March 2021.


The information gleaned from this effort will be combined to produce a current universe list of laboratories. Any duplicates will be identified and removed by examining the file for identical or nearly identical addresses, phone numbers, or names of the laboratory director. Based on this verification effort, BJS estimates that up to 500 laboratories may be contacted for the 2020 CPFFCL.
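
For illustration only, the following minimal sketch (in Python) shows one way such a duplicate check could be implemented. The field names (address, phone, director), normalization rules, and sample values are hypothetical assumptions and do not reflect the actual frame file layout.

import re
import pandas as pd

def normalize_text(value):
    # Lowercase, drop punctuation, and collapse whitespace for matching.
    value = re.sub(r"[^\w\s]", " ", str(value).lower())
    return re.sub(r"\s+", " ", value).strip()

def normalize_phone(value):
    # Keep digits only so formatting differences do not hide matches.
    return re.sub(r"\D", "", str(value))

def flag_possible_duplicates(frame):
    # Flag records that share a normalized address, phone number, or director name
    # so they can be reviewed and, if confirmed, removed from the universe list.
    frame = frame.copy()
    frame["_addr"] = frame["address"].map(normalize_text)
    frame["_phone"] = frame["phone"].map(normalize_phone)
    frame["_dir"] = frame["director"].map(normalize_text)
    dup = pd.Series(False, index=frame.index)
    for col in ["_addr", "_phone", "_dir"]:
        dup |= frame.duplicated(subset=col, keep=False) & frame[col].ne("")
    frame["possible_duplicate"] = dup
    return frame.drop(columns=["_addr", "_phone", "_dir"])

# Toy example (values are illustrative only): the first two records share a phone number.
labs = pd.DataFrame({
    "lab_name": ["State Crime Lab", "State Crime Laboratory", "County Forensic Lab"],
    "address": ["100 Main St., Capital City", "100 Main Street, Suite 2", "55 Oak Ave."],
    "phone": ["555-555-0100", "(555) 555-0100", "555-555-0199"],
    "director": ["J. Smith", "Jane Smith", "R. Lee"],
})
print(flag_possible_duplicates(labs)[["lab_name", "possible_duplicate"]])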


The 2020 CPFFCL will be a census rather than a sample survey. The reasons for this decision include:


  • The eligible population is approximately 500 agencies. Moving to a sample survey with a universe of this size will not result in significant cost savings given the stratification dimensions needed to capture critical aspects of the universe, such as size and government oversight. The differences among laboratories and laboratory systems complicate efforts to create representative sampling strata. Table 6 shows the number of these laboratory types by jurisdiction type based on the 2014 CPFFCL.


Table 6. Distribution of Publicly Funded Forensic Laboratories by Jurisdiction Type, 2014

Type of Jurisdiction      All Laboratories in the 2014 CPFFCL
All Laboratories          409
Federal                    39
State                     193
County                     98
Municipal                  79

Source: Bureau of Justice Statistics, Census of Publicly Funded Forensic Crime Laboratories (CPFFCL), 2014.


  • A census provides BJS with the opportunity to show how laboratories vary. Being able to compare laboratories is particularly important considering the variability that exists among these agencies in terms of administration, caseload, policies, procedures, resources, staffing, and infrastructure (as shown in the 2014 CPFFCL reports: https://www.bjs.gov/content/pub/pdf/pffclrs14.pdf; https://www.bjs.gov/content/pub/pdf/pffclqap14.pdf).

  • A census provides an opportunity to build a foundation for conducting future surveys of laboratories by other federal agencies and by the research community. Completing the census, for example, will provide the information necessary to produce samples based on a more comprehensive and fuller understanding of how each laboratory operates given the variability that exists within and across states. Generating samples of laboratories without this crucial information would be more time intensive and costly.


  2. Procedures for Collecting Information


The CPFFCL team will use a multi-mode data collection approach, with the web as the primary mode (see Attachment 3 for example screenshots) and hard copy data collection as an alternative. CPFFCL will use mail, email, and telephone follow-up as needed. The data collection period is planned for approximately seven months and will involve initial invitations, several reminders, and an end-of-study letter, along with data quality follow-up and nonresponse follow-up. To keep data collection efficient and cost effective, the web-based submission method will be emphasized over submission of printed questionnaires; the initial mailing will not include a copy of the paper instrument but will instead provide instructions for completing the questionnaire online. Table 7 shows the 2020 CPFFCL contact schedule.


Table 7. 2020 CPFFCL Contact Schedule

Week       Contact Description                                                          Attachments
0          Mailed pre-notification letter                                               6
2          Mailed survey invitation letter and endorsement letters                      7, 22
3          Email invitation and endorsement letters                                     8, 22
6          Reminder 1 – postcard and email                                              9, 10
6          Start telephone/email data quality follow-up                                 16
8          Reminder 2 – letter/email with questionnaire and business reply envelope     11, 2
11         Start telephone prompting for incomplete respondents and non-respondents     17, 18
11         Reminder 3 – mail and email                                                  12, 13
14         Reminder 4 – letter                                                          14
17         Reminder 5 – postcard/email                                                  15
22         End-of-study email/letter reminder                                           19, 20
Variable   Thank you letter                                                             21


Pre-notification letter. The data collection period will open with a pre-notification letter (see Attachment 6), on BJS letterhead and signed by the BJS director, which will be sent to all laboratory directors. The pre-notification letter highlights the importance of the 2020 CPFFCL and encourages participation. It also provides contact information that can be used to obtain additional information.


Invitation package and email message. Two weeks after the pre-notification letter is sent, RTI will mail an invitation package including a cover letter (Attachment 7) to the laboratory directors of all eligible laboratories. This letter, signed by the BJS program manager, will include the survey web address, laboratory-specific log-in credentials, and instructions for completing the web survey. The letter will stress the purpose and importance of the CPFFCL and the need for participation. It will notify the recipient of the survey due date and provide RTI and BJS contacts for any questions or comments. The invitation package will include a letter of support from the American Society of Crime Laboratory Directors (Attachment 22).


Within a week after the mailed invitation letter is sent, an email invitation (see Attachment 8) will be sent to those directors for whom an email address is available. This invitation is closely aligned with the mailed invitation letter but contains a hyperlink to the web survey.


Mail and email reminders. Starting three weeks after the invitation package is sent, the project team will begin sending reminders to nonrespondents, alternating between mail and email to keep survey reminders fresh, according to the following schedule—


  • Week 6: Three weeks after the invitation package is sent, the project team will send the first reminder via postcard to non-respondents for whom it has no email address (Attachment 9). Non-respondents for whom the project team has an email address will receive the reminder via email (Attachment 10).

  • Week 8: Three weeks after the first reminder, the project team will mail a second reminder via letter (Attachment 11) including the questionnaire (Attachment 2) and a business reply envelope with which to return the completed form.

  • Week 11: Three weeks after the second reminder, the project team will send a third reminder via email (Attachment 12) or via letter (Attachment 13) for those laboratories that do not have email addresses.

  • Week 14: Three weeks after the third reminder, the project team will send the fourth reminder via letter (Attachment 14).

  • Week 17: Three weeks after the fourth reminder, the project team will send the fifth reminder via postcard (Attachment 15).


Telephone and email data quality follow-up. Approximately 6 weeks after data collection begins, RTI will begin reviewing the data received. As data discrepancies or missing data values are discovered, RTI staff will follow up with respondents via telephone or email to clarify responses or obtain missing information (Attachment 16).


Telephone prompting for incomplete responses. Approximately 11 weeks into data collection, RTI will begin telephone prompting for incomplete respondents. Because respondents can pause the survey and return to it later, this effort will focus on prompting respondents who have begun but not completed their surveys (Attachment 17). These communications will also allow the project staff to assess whether respondents have any issues accessing and completing the survey. A response will be considered incomplete if a form is partially completed and remains incomplete for three weeks.


Telephone nonresponse follow-up. Eleven weeks into the data collection, telephone follow-up with non-respondents will begin (Attachment 18). Respondents will be reminded of the purpose and importance of the survey and informed of the goal of receiving a completed survey from each laboratory. They will be asked to submit the survey online but will be sent another hard copy version of the survey if requested. RTI will make up to 10 calls until a survey is received (or a laboratory refuses to participate), and each call will reference the most recent communication (e.g., reminder letters, reminder emails). This effort will also be used to capture survey items deemed critical from non-respondents.


Mail and email the end-of-study notification. In week 22 of the data collection (five weeks after the fifth reminder), RTI will send an end-of-study notification both via mail and email to notify non-respondents that the study is coming to an end and that their response is needed within two weeks (Attachments 19, 20). Data collection will continue for approximately three more weeks to allow for receipt of any remaining questionnaires.


Thank-you letters. Beginning two weeks after the invitation package is sent out, RTI will mail thank you letters to those respondents who have completed the survey (Attachment 21). These letters will thank them for the time and effort necessary to complete the survey and once again emphasize the importance of CPFFCL data. Mailings of thank you letters will continue until the survey ends and all respondents have been mailed a letter.


Data Editing. As part of the data quality follow-up, RTI will attempt to reconcile missing or erroneous data through automated and manual edits of questionnaires within two weeks of completion. In collaboration with BJS, RTI will develop a set of edits that will use other data provided by the respondent on the survey instrument to confirm acceptable responses or identify possible errors due to missing or inconsistent data elements. For example, if a screening question was left blank, but the follow-up questions were completed, a manual edit would be made to indicate the intended positive response to the screening question. Through this process, RTI can quickly identify which hard copy questionnaires require follow-up and indicate the items that need clarification or retrieval from the respondent.
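
As an illustration of the kind of automated edit described above, the minimal sketch below applies the blank-screener rule and records an edit flag for the audit trail. The item names (q5_screener, q5a, q5b) are hypothetical, and the actual edit specifications will be developed jointly by BJS and RTI.

import pandas as pd

def edit_blank_screener(df, screener, followups):
    # If a screening item is blank but any follow-up item was answered,
    # set the screener to "Yes" and flag the record as machine-edited.
    df = df.copy()
    has_followup = df[followups].notna().any(axis=1)
    needs_edit = df[screener].isna() & has_followup
    df.loc[needs_edit, screener] = "Yes"
    df[screener + "_edit_flag"] = needs_edit  # audit trail for data quality follow-up
    return df

# Illustrative data: item names and values are hypothetical, not actual CPFFCL items.
responses = pd.DataFrame({
    "lab_id": [1, 2, 3],
    "q5_screener": ["Yes", None, None],
    "q5a": [10, 4, None],
    "q5b": [2, None, None],
})
print(edit_blank_screener(responses, "q5_screener", ["q5a", "q5b"]))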


Data Retrieval. When errors due to missing or inconsistent data elements are found during data review and editing, project staff will attempt to verify or obtain the correct information from the respondent. When it is determined that data retrieval is needed, a project staff member will contact the respondent for clarification. Throughout the data retrieval process, RTI will document the questions needing retrieval (e.g., missing or inconsistent data elements), then request clarification on the provided information, obtain values for missing data elements, and discuss any other issues related to the respondent's submission.


Data Entry. Respondents completing the survey via the web instrument will enter their responses directly into the online instrument. The instrument will have quality control checks programmed in to enforce skip patterns and check for out-of-range values. For those respondents returning the survey via hard copy (mail or fax), the survey responses will be hand-keyed by trained and certified data entry personnel at RTI's Raleigh Operations Center (ROC). Twenty percent of these manually keyed surveys will be entered a second time, and the results will be compared as a form of quality control. Additionally, supervisors will conduct random spot checks of all manually entered surveys. Any anomalies, inconsistencies, or unexpected values will be investigated and resolved. Throughout the remainder of the data collection period, RTI and BJS staff will conduct regular data frequency reviews to evaluate the quality and completeness of data captured in both the web and hard copy modes. Data files will be made available to BJS via an SFTP site when response rates reach 50%, 75%, and 90%.
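
The double-entry comparison can be illustrated with the following minimal sketch, which reports one row per discrepancy between two independent keyings of the same forms so a supervisor can adjudicate each one. The file layout, identifier (lab_id), and item names are hypothetical assumptions.

import pandas as pd

def compare_double_keyed(first, second, key="lab_id"):
    # Merge the two keyings on the laboratory identifier and list every cell
    # where the two data entry passes disagree.
    merged = first.merge(second, on=key, suffixes=("_entry1", "_entry2"))
    items = [c for c in first.columns if c != key]
    discrepancies = []
    for item in items:
        a, b = merged[item + "_entry1"], merged[item + "_entry2"]
        mismatch = (a != b) & ~(a.isna() & b.isna())
        for lab, v1, v2 in zip(merged.loc[mismatch, key], a[mismatch], b[mismatch]):
            discrepancies.append({"lab_id": lab, "item": item, "entry1": v1, "entry2": v2})
    return pd.DataFrame(discrepancies)

# Hypothetical example: the same two forms keyed twice, with one keystroke error.
keying1 = pd.DataFrame({"lab_id": [101, 102], "ft_examiners": [12, 7], "budget": [950000, 430000]})
keying2 = pd.DataFrame({"lab_id": [101, 102], "ft_examiners": [12, 7], "budget": [950000, 340000]})
print(compare_double_keyed(keying1, keying2))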


  3. Methods to Maximize Response Rates


The 2014 CPFFCL achieved an 88% response rate. BJS and RTI will undertake various activities to ensure that high response rates are again achieved for the 2020 CPFFCL. The CPFFCL team will use a web-based instrument supported by various online help functions to maximize response rates. A toll-free number will also be provided for both substantive and technical assistance. RTI staff will respond to these requests for assistance.


The survey instrument was reviewed to ensure the collection of the most pertinent information, removing any unnecessary questions to reduce burden. An item-level review of responses to the 2014 CPFFCL was conducted to look for patterns of non-response (Attachment 24). Because the item response rate on the 2014 CPFFCL was high—the vast majority of survey items achieved over 90%—few items were flagged as problematic. The questionnaire was also reviewed by expert panel reviewers and by BJS and RTI for ease of use, flow, and other survey methodology best practices to ensure ease of administration. BJS and RTI worked with a group of subject matter experts from laboratories that varied in characteristics (type of laboratory and geographical location) to clarify questions that could create confusion, eliminate questions that were not relevant to the field, and revise questions that were out of date. Because laboratories can use differing vocabulary to describe similar procedures, additional directions and definitions have been added to those questions to provide direct examples of the information BJS seeks.


BJS will encourage respondents to submit their information via the web survey. Close attention has been paid to the formatting of the web survey instrument. The survey is formatted in a user-friendly manner so that respondents can complete it on a computer or tablet, in various browsers (e.g., Internet Explorer, Firefox, and Google Chrome), and at various resolutions or screen sizes. The web survey saves respondents' answers automatically and gives them the option to save their progress, leave the survey, and resume at a later time. Data will be checked as they are collected for completeness and logical consistency of responses. The online survey is programmed with data consistency checks and prompts, such as enforcement of skip patterns and flags for out-of-range responses. These checks will help reduce the need for data quality follow-up calls after respondents submit the questionnaire. However, where information appears incomplete or inconsistent, RTI project staff will follow up with respondents via telephone (Attachment 16).
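
For illustration, the minimal sketch below shows the kinds of checks the web instrument enforces (an out-of-range check and a skip-pattern check). The item names, ranges, and rules are hypothetical placeholders, not actual CPFFCL items.

def validate_response(answers):
    # Run illustrative consistency checks of the kind the web instrument enforces.
    problems = []

    # Out-of-range check: a full-time employee count must be a non-negative whole number.
    fte = answers.get("fte_employees")
    if fte is not None and (not isinstance(fte, int) or fte < 0):
        problems.append("fte_employees must be a non-negative integer")

    # Skip-pattern check: backlog details apply only if the laboratory reports a backlog.
    if answers.get("has_backlog") == "No" and answers.get("backlog_cases") not in (None, 0):
        problems.append("backlog_cases should be blank when has_backlog is 'No'")

    return problems

# Example: a submission that violates both rules would be prompted to correct them.
print(validate_response({"fte_employees": -3, "has_backlog": "No", "backlog_cases": 125}))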


To encourage participation and obtain higher response rates, project staff will conduct outreach and follow-up procedures at various points during the data collection. This includes reminders to take the survey (Attachments 9–15), data quality follow-up (Attachment 16), survey completion prompting (Attachment 17), and nonresponse follow-up calls (Attachment 18). Throughout the data collection, resources will be available to help respondents complete the survey: for technical help, respondents will have telephone and email Help Desk support; for overall questions or concerns about the survey, they will be provided with BJS contacts. The non-response follow-up script also includes prompts for non-respondents to provide items deemed critical for the CPFFCL to capture. These items are highlighted on the survey form (Attachment 2).


The 2014 CPFFCL received widespread support from the American Society of Crime Laboratory Directors, which was enlisted to help develop the questionnaire and to encourage individual laboratories to respond to the survey. This continues to be the case for the 2020 CPFFCL, and the association's letter of support will be included in the survey invitation package (Attachment 22).


To promote 100% item completion by respondents, RTI will monitor item response rates as surveys are submitted. RTI has a survey management system linked to the web-based application that will flag missing items and invalid responses. RTI will also flag missing items on hard copy submissions on a flow basis. The data collection manager will oversee phone and email outreach to respondents to clarify missing or invalid responses and to take corrective action (Attachment 16). Changes to survey responses obtained through this follow-up effort will be tracked and entered in the data collection database.
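
As a simple illustration of this item-level monitoring, the sketch below computes the percentage of submitted surveys with a non-missing value for each item, sorted so the lowest-completion items surface first for follow-up. The item names and values are hypothetical.

import pandas as pd

def item_response_rates(submissions, id_col="lab_id"):
    # Percentage of submitted surveys with a non-missing value for each item.
    items = submissions.drop(columns=[id_col])
    return (items.notna().mean() * 100).round(1).sort_values()

# Hypothetical partial submissions; item names are illustrative only.
data = pd.DataFrame({
    "lab_id": [1, 2, 3, 4],
    "budget": [950000, None, 430000, None],
    "ft_examiners": [12, 7, 9, 15],
})
print(item_response_rates(data))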


As the 2020 CPFFCL is planned to be a complete census of laboratories, sampling weights are not necessary. However, in the event unit response rates are lower than anticipated, some weighting of the data may be required. The extent of this step will depend on response rates within sub-groups of the respondent pool. Response rates within jurisdiction size groupings and laboratory types will be reviewed to determine if a weighting adjustment is necessary. To assess whether nonresponding agencies differ from those that participate, a nonresponse bias analysis will be conducted if the unit response rate falls below 80 percent.
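
A minimal illustration of a weighting-class adjustment of this kind is sketched below: within each group, respondents receive a weight equal to the number of eligible laboratories divided by the number of respondents, so respondents stand in for the nonrespondents in their own group. The grouping variable (lab_type), response indicator, and sample records are hypothetical, and the actual adjustment method would be determined after reviewing response rates.

import pandas as pd

def weighting_class_adjustment(frame, group_col="lab_type"):
    # Compute eligible and respondent counts per group, then assign
    # weight = eligible / respondents to respondents and 0 to nonrespondents.
    frame = frame.copy()
    counts = frame.groupby(group_col)["responded"].agg(eligible="size", respondents="sum")
    adjustment = counts["eligible"] / counts["respondents"]
    frame["weight"] = frame[group_col].map(adjustment).where(frame["responded"], 0.0)
    return frame

# Illustrative frame: 3 state labs (2 responded) and 2 municipal labs (1 responded).
labs = pd.DataFrame({
    "lab_id": range(1, 6),
    "lab_type": ["State", "State", "State", "Municipal", "Municipal"],
    "responded": [True, True, False, True, False],
})
print(weighting_class_adjustment(labs)[["lab_id", "lab_type", "responded", "weight"]])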


Imputation procedures will be used to address issues of item nonresponse. The 2014 CPFFCL may provide the team with a cold-deck imputation solution for item nonresponse where there is item overlap between the two administrations. As needed, BJS will consider multiple imputation methods, flagging observations and values that were imputed in the archived dataset.
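
The following minimal sketch illustrates cold-deck imputation with imputation flags, assuming a hypothetical overlapping item (ft_examiners) and laboratory identifier (lab_id); the actual items and methods would depend on the degree of overlap between the two administrations.

import pandas as pd

def cold_deck_impute(current, prior, items, key="lab_id"):
    # Fill a missing 2020 item with the same laboratory's 2014 value (cold deck)
    # and flag every imputed cell so imputations are identifiable in the archived file.
    current = current.copy()
    prior_indexed = prior.set_index(key)
    for item in items:
        donor = current[key].map(prior_indexed[item])
        missing = current[item].isna() & donor.notna()
        current.loc[missing, item] = donor[missing]
        current[item + "_imputed"] = missing
    return current

# Hypothetical overlap item ("ft_examiners") present in both administrations.
cpffcl_2020 = pd.DataFrame({"lab_id": [1, 2, 3], "ft_examiners": [14, None, 8]})
cpffcl_2014 = pd.DataFrame({"lab_id": [1, 2, 3], "ft_examiners": [12, 9, 7]})
print(cold_deck_impute(cpffcl_2020, cpffcl_2014, ["ft_examiners"]))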


  4. Testing of Procedures


The proposed new questions in the 2020 CPFFCL data collection instrument and the revisions made to those retained from 2014 were reviewed by BJS and RTI staff, suggested and discussed by the Expert Panelists, and cognitively tested.


BJS and RTI cognitively tested the instrument with 23 laboratories. The laboratories varied by geographic region, size, and population served. To conduct the cognitive interviews, RTI talked with respondents via telephone for about an hour while the respondents read through the questionnaire item by item. For each item, the RTI interviewer asked at least one probing question to assess clarity and ease of answering. The cognitive testing provided insight into whether respondents fully understood questions and provided expected answers, informed question phrasing and response options, and provided an estimate of the burden (Attachment 25). As a result of these interviews, the instrument was modified to improve comprehension.


In addition, RTI and BJS will thoroughly test the web-based survey administration system through systematic user testing, including testing skip patterns, attempting to “break” the instrument, and back-end data checks on entered responses.


The 2020 CPFFCL will maintain respondent recruitment and support procedures similar to those of the 2014 CPFFCL administration, which were field tested and successfully employed. We expect that response rates for the 2020 CPFFCL will, at a minimum, match the 88% response rate achieved in the 2014 administration and potentially exceed 95%. RTI has previously used web-based survey instruments that are substantially similar in design to the format planned for the 2020 CPFFCL, including the 2018 Census of Medical Examiner and Coroner Offices (CMEC; OMB #1121-0296), the Law Enforcement Management and Administrative Statistics survey (LEMAS; OMB #1121-0240), and the Census of State and Local Law Enforcement Agencies (CSLLEA; OMB #1121-0346). The web-based survey administration procedures successfully employed in the CMEC, LEMAS, and CSLLEA survey designs have been substantially retained but modified as necessary to accommodate the 2020 CPFFCL instrument and respondents.



  5. Contacts for Statistical Aspects and Data Collection


  1. BJS contacts include:

Connor Brooks

CPFFCL Program Manager

Bureau of Justice Statistics

202-514-8633

[email protected]


Kevin M. Scott, Ph.D.

Law Enforcement Statistics Unit Chief

Bureau of Justice Statistics

202-616-3615

[email protected]


  2. Persons consulted on data collection and analysis:


Hope Smiley-McDonald, PhD

Program Director

RTI International

3040 East Cornwallis Road

Research Triangle Park, NC 27709

919-485-5743

[email protected]


Jeri Roper Miller, PhD

Center Director

RTI International

3040 East Cornwallis Road

Research Triangle Park, NC 27709

919-485-5685

[email protected]




Attachments:

  1. 34 U.S.C. § 10131-10132

  2. 2020 CPFFCL questionnaire: Formatted paper instrument

  3. 2020 CPFFCL questionnaire: Example screen shots of web instrument

  4. 60-day Federal Register notice

  5. 30-day Federal Register notice

  6. Pre-notification letter

  7. Survey invitation cover letter

  8. Survey invitation email

  9. 1st reminder – postcard

  10. 1st reminder – email

  11. 2nd reminder – letter

  12. 3rd reminder – email

  13. 3rd reminder – letter

  14. 4th reminder – letter

  15. 5th reminder – postcard

  16. Data quality follow-up telephone script

  17. Sample call script for telephone prompting calls

  18. Sample call script for nonresponse telephone calls

  19. End-of-Study letter

  20. End-of-Study email

  21. Thank you letter

  22. Letter of Support: American Society of Crime Laboratory Directors

  23. Data quality assessment of 2014 CPFFCL

  24. Cognitive testing report


