MEMORANDUM
TO: Robert Sivinski
Office of Statistical and Science Policy
Office of Management and Budget
THROUGH: Jeffrey H. Anderson
Director
Bureau of Justice Statistics
Allen Beck
Senior Statistical Advisor
Bureau of Justice Statistics
Devon Adams
Acting Deputy Director
Bureau of Justice Statistics
Kevin Scott
Acting Chief, Prosecution and Judicial Statistics Unit
Bureau of Justice Statistics
FROM: Suzanne M. Strong
Statistician, Prosecution and Judicial Statistics Unit
Bureau of Justice Statistics
SUBJECT: BJS request to cognitively test the National Survey of Prosecutors (NSP) under the OMB generic clearance agreement (OMB Number 1121-0339).
The Bureau of Justice Statistics (BJS) is planning to collect data under the National Survey of Prosecutors (NSP) (previous OMB approval 1121-0149, expired 09/30/2017). The NSP has been fielded as a census (2001, 2007, attempted in 2014) and as a survey (1990, 1992, 1994, 1996, 2005). The current iteration of NSP is designed as a survey. The NSP was last successfully collected in 2007. In 2014, BJS attempted to collect data from a census of prosecutor offices under the NSP, but the survey failed to reach an adequate response rate (just over 40% of offices responded). BJS determined that one of the main contributors to the lack of response was the survey’s burden. The 2007 survey had an average response burden of 30 minutes per respondent; the 2014 survey doubled the burden with an average response of 60 minutes per respondent. BJS has redesigned the survey and requests generic clearance to cognitively test the NSP with 25 respondents in order to ensure that the survey is not overly burdensome.
BJS is requesting clearance for one round of cognitive testing of the 2019 NSP instrument. The project team will conduct 25 telephone interviews. To prepare for this testing, BJS, the Research Triangle Institute (RTI), and the National District Attorneys Association (NDAA) examined item response rates from the 2007 and 2014 NSP. The project team dropped questions with very low response rates, flagged problematic questions for review, and prepared a preliminary instrument. BJS then convened a panel of prosecutors to review and revise the preliminary NSP instrument. The resulting instrument is the focus of this cognitive test. The instrument being tested includes five sections: staffing, budget, caseload, digital and forensic evidence, and diversion and problem-solving courts (attachment A). BJS believes the most important, and most difficult, section will be caseload (section C).
BJS intends to draw a sample of 240 offices stratified by population size, region, NDAA membership, and whether the chief prosecutor is elected or appointed (table 1). Population size is correlated with the size of the prosecutor office. NDAA membership will likely affect whether an office can be persuaded by NDAA endorsement to complete the survey, with outreach expected to be more successful in NDAA-member offices. The other two dimensions, region and whether the chief prosecutor is appointed or elected, affect the representativeness of the survey responses.
Table 1. Number of Offices on Recruitment List and Completed Interview Target, by Sample Strata

Office size stratum | NE | SE | MW | W | Recruitment List Total | Completed CI Target
Stratum 1 | 8 | 8 | 8 | 8 | 32 | 3
Stratum 2 | 12 | 12 | 12 | 12 | 48 | 4
Stratum 3 | 12 | 12 | 12 | 12 | 48 | 5
Stratum 4 | 28 | 28 | 28 | 28 | 112 | 13
TOTAL | 60 | 60 | 60 | 60 | 240 | 25*
Note: The NE, SE, MW, and W columns show the number of offices on the recruitment list in each geographic region (Northeast, Southeast, Midwest, and West). CI = cognitive interview.
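As a point of reference, the allocation in table 1 can be tallied with the short Python sketch below. It is illustrative only: the stratum labels and data structure are placeholders rather than BJS sampling code, and it simply confirms that the per-region counts sum to the 240-office recruitment list and the 25 targeted completed interviews.

```python
# Illustrative tally of the table 1 allocation; the stratum labels and
# structure are placeholders, not BJS's actual sampling code.

REGIONS = ["NE", "SE", "MW", "W"]

# Offices on the recruitment list per region and completed-interview (CI)
# target for each of the four office-size strata in table 1.
STRATA = {
    "Stratum 1": {"per_region": 8,  "ci_target": 3},
    "Stratum 2": {"per_region": 12, "ci_target": 4},
    "Stratum 3": {"per_region": 12, "ci_target": 5},
    "Stratum 4": {"per_region": 28, "ci_target": 13},
}

total_offices = 0
total_targets = 0
for name, stratum in STRATA.items():
    row_total = stratum["per_region"] * len(REGIONS)
    total_offices += row_total
    total_targets += stratum["ci_target"]
    print(f"{name}: {row_total} offices on list, {stratum['ci_target']} completed interviews targeted")

print(f"TOTAL: {total_offices} offices, {total_targets} completed interviews")
# Expected final line: TOTAL: 240 offices, 25 completed interviews
```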
BJS will target specific respondent characteristics as part of its goal of 25 completed interviews. In each of the office size strata (1-4), the first purposively targeted respondents will be the non-NDAA-member offices, with the goal of obtaining 2 to 5 completed interviews in at least 3 of the strata. Once 2 to 5 non-NDAA members from at least 3 strata are scheduled, BJS will focus on obtaining at least 2 interviews from 2 strata in the three states with appointed chief prosecutors (Alaska, which has only one office for the entire state; Connecticut, which has a headquarters office and 13 regional offices; and New Jersey). These completed interviews would follow one of the following three scenarios:
The central office in Connecticut and 1 New Jersey agency serving a population of less than 810,000
Alaska and 1 New Jersey agency serving a population of 810,000 or more
Alaska and 1 New Jersey agency serving a population of less than 250,000
In addition to these targeted groups of respondents, BJS will sample from the remaining offices until the target number of interviews per stratum is scheduled. Fifty of the 240 sampled offices will be contacted in the first wave, with replacements identified on a rolling basis for offices that refuse, withdraw consent, or cannot be reached.
Participation in the cognitive test is voluntary, and all participants will be 18 years of age or older. Prior to participation, each office will receive an invitation letter that describes the purpose of the cognitive test, how to answer the survey, why the office was selected to participate in the test, and the expected length of the interview that will follow completion of the survey (attachment B).
BJS will send the letters via email to offices for which BJS has a valid email address and by mail to offices for which it does not. The letters will be addressed to the chief prosecutor in each office. The participant is likely to be the chief prosecutor or a designated deputy prosecutor knowledgeable about the entire office. Once the letters have been sent, the project team will follow up with a phone call to request participation and schedule the 60-minute cognitive interview (attachment C). If a potential participant refuses, they will be thanked for their time and a matching replacement respondent will be contacted immediately. Similarly, if a participant initially consents and then withdraws consent, BJS will thank the participant and draw a similar replacement. BJS will continue this process until 25 participants consent and are interviewed.
After obtaining consent, respondents will be sent, via email or mail, a letter with instructions for completing the survey (attachment D) along with the survey itself. Respondents will be instructed to complete the survey questions that are easy to answer and to estimate the time and effort needed to answer questions that are not readily answerable. All participants will be asked to complete section C of the survey, which BJS believes will be the most challenging section. Participants will be asked to complete the instrument in this manner prior to the cognitive interview and to return the survey so that the cognitive interview runs more smoothly.
The cognitive interviews will include questions about how the respondent interpreted each question, how much time was necessary to answer it, and whether answering required the respondent to research the response or to involve other staff members. The interviewer will also ask the respondent whether any questions are unclear, whether any response options or questions are missing, and whether any questions should be revised or deleted.
Participants will not receive any compensation for their time or interview, but a thank you email will be sent to each respondent within 48 hours of the interview. The project team will review the feedback and determine any revisions to the survey instrument.
BJS plans to begin the cognitive test process in January 2020. BJS will email (or mail) potential participants an invitation letter (attachment B). Offices that agree to participate will receive the survey (attachment A) and instructions for completion (attachment D), which will direct the participant to complete only the items that can be answered without research or discussion with other staff. The respondent will be asked to estimate, during the cognitive interview, the time necessary to research and discuss the remaining questions. Approximately one week later, the project team will conduct the cognitive interview (attachment E). Table 2 summarizes the cognitive test process.
Table 2. Expected Cognitive Test Protocol

Task # | Task Description | Number of participants contacted | Method of contact | Timing of contact
1 | Contact first round of potential participants | 50 | Email, mail | Day 1
2 | Phone call to ascertain participation and schedule cognitive interview | 50 | Phone | Day 5
3 | Contact replacements for participants who decline | Up to 215 more, until 25 total interviews are scheduled | Email, mail | Up to day 14
4 | Phone call to ascertain participation and schedule cognitive interview with replacement participants | 25 total | Phone | Up to day 28
5 | Cognitive interviews | 25 total completed | Phone | Day 5 to day 40
The burden hour estimates for the participants are provided in Table 3. The project team expects the initial recruitment and scheduling phone call to take ten minutes per participant. The desired burden for the 2019 NSP is 30 minutes per respondent, although BJS is aware that this cognitive test will further refine the burden estimate for the full clearance request, expected in early 2020. The cognitive interview will require up to a 60-minute telephone interview with each participant. The total burden for all contacts under this request is approximately 42 hours (41.7 hours).
Table 3. Burden Hour Estimates for Respondents

Task # | Task Description | Number of participants | Estimated burden per participant (in minutes) | Total burden (in hours)
1 | Initial contact to ascertain participation and schedule the cognitive interview | 25 | 10 | 4.2
2 | Expected burden to complete the NSP | 25 | 30 | 12.5
3 | Cognitive interview | 25 | 60 | 25.0
  | Total burden | | | 41.7
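As a cross-check on Table 3, the short Python sketch below (illustrative only, not part of the collection instrument) converts the per-participant minute estimates to total burden hours across the 25 participants.

```python
# Illustrative cross-check of the table 3 burden-hour arithmetic:
# per-participant minutes are converted to total hours across 25 participants.

PARTICIPANTS = 25

# Estimated burden per participant, in minutes, for each task in table 3.
TASK_MINUTES = {
    "Initial contact and scheduling": 10,
    "Completing the NSP instrument": 30,
    "Cognitive interview": 60,
}

total_hours = 0.0
for task, minutes in TASK_MINUTES.items():
    hours = PARTICIPANTS * minutes / 60
    total_hours += hours
    print(f"{task}: {hours:.1f} hours")

print(f"Total burden: {total_hours:.1f} hours")  # 4.2 + 12.5 + 25.0 = 41.7 hours
```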
RTI International’s Institutional Review Board (IRB) determined the pilot testing protocol to be compliant with informed consent and data confidentiality standards (attachment F).
Questions regarding any aspect of this project can be directed to:
Suzanne M. Strong
Statistician
Bureau of Justice Statistics
U.S. Department of Justice
810 7th Street, NW
Washington, DC 20531
Office Phone: (202) 616-3666
E-mail: [email protected]
Attachment A: NSP instrument
Attachment B: Cognitive test invitation letter
Attachment C: Cognitive Recruitment Call Script
Attachment D: Survey cover letter
Attachment E: Cognitive interview script
Attachment F: IRB approval