
Survey of Prison Inmates, 2016

OMB Memo for Nonsubstantive Change

OMB: 1121-0152


MEMORANDUM


TO: Shelly Martinez

Desk Officer

Office of Statistical and Science Policy, Office of Management and Budget


THROUGH: Lynn Murray

Clearance Officer, Justice Management Division


William J. Sabol, Ph.D.

Director, Bureau of Justice Statistics


Anastasios Tsoutis

Chief, Recidivism and Reentry Unit, Bureau of Justice Statistics


FROM: Lauren E. Glaze

Statistician, Bureau of Justice Statistics


SUBJECT: Nonsubstantive Changes to OMB #1121-0152, Survey of Prison Inmates, 2016


DATE: November 24, 2015


The purpose of this memorandum is to inform the Office of Management and Budget (OMB) of the results of a test of the functionality of the Computer Assisted Personal Interview (CAPI) survey instrument that has been designed for the 2016 Survey of Prison Inmates (SPI). OMB clearance for the test and national 2016 SPI collection, ICR reference #201505-1121-001, was approved on August 18, 2015.

Based on findings from the CAPI feasibility test, conducted August 31 through September 2, 2015, changes have been made to the CAPI instrument. Many of the changes are programming modifications; others were identified through the experience of administering the questionnaire to the inmates who were sampled to participate in the test. In addition to changes to the instrument, the test offered an opportunity to examine inmates’ reactions to the informed consent process. The test did not suggest any need to alter the consent process; observations from the CAPI feasibility test are described in this memorandum. Accounting for the changes to the instrument and the experience of administering the consent and survey, no change is necessary to the estimated burden (i.e., 50,099 hours for 66,938 responses) approved by OMB in the original clearance.

RTI International (RTI) was competitively awarded a cooperative agreement (Award No. 2011-MU-MU-K070) in FY 2011 to collaborate with the Bureau of Justice Statistics (BJS) on this project and will serve as the data collection agent. All changes described in this memorandum and reflected in the attachments were approved by the Institutional Review Board at RTI on November 23 (Attachment A).

Background

The 2016 SPI will be a national, omnibus survey of prisoners age 18 or older within the United States who are incarcerated in confinement or community-based correctional facilities operated by or for state or federal governments. BJS has been conducting SPI periodically since the 1970s among state prisoners and since the early 1990s among federal prisoners.1 The primary goal of SPI is to produce reliable national estimates of the characteristics of the prison population across a variety of domains, such as the severity of offenses committed and the characteristics of the incident; medical, mental health, and substance abuse and dependency problems; and behaviors in prison, including both rule infractions and participation in programs. A secondary goal of the 2016 SPI is to generate subnational estimates of prisoners within the jurisdictions that have the largest prison populations (i.e., 100,000 or more) in the nation. Survey data will be collected through personal interviews, using CAPI, with a representative sample of approximately 33,200 prisoners.

The scope of information collected through SPI and the level of detail on some topics are not available from any other single data source, particularly for special populations such as drug and alcohol users and mentally ill prisoners. These data are critical to understanding the composition of the prison population and how it changes over time; the factors related to those changes, including the impacts of corrections policy and practice reforms; the risks inmates pose to correctional agencies and their risk of recidivism, along with the factors that mitigate those risks; and the challenges inmates face upon reintegrating into the community.

In addition to collecting self-report data through the 2016 SPI, BJS plans to rely on administrative records, when possible, to supplement the self-report data, thereby minimizing respondent burden, and to conduct future studies of inmates. Records from BJS’s National Corrections Reporting Program (NCRP) will be used to provide more detailed information on the criminal justice status of prisoners at the time of their arrest (e.g., time in the community prior to the current incarceration), their current offense (e.g., counts, new court commitments, probation/parole violators), and their sentence (e.g., indeterminate or determinate). This linkage of records and survey data will enable BJS to further examine and expand knowledge about recidivism and reentry, such as additional risk factors (collected through SPI) that are associated with time to failure after release to the community, as well as the impact of factors intended to mitigate negative outcomes.

BJS also plans to link the 2016 SPI data with records of arrest and prosecution (RAP) to provide more detailed information about the criminal histories of prisoners beyond the indicators collected in SPI and to conduct a future recidivism study of the prisoners in the 2016 SPI sample. The SPI serves as a rich source of information that is not available through the RAP sheets or the NCRP; together, these data sources provide an opportunity to inform the criminal justice field, policymakers, and various other stakeholders about recidivism at a national level beyond static factors like demographic, offense, and prior criminal history information. The SPI also addresses dynamic risk factors, such as pro-social connections, pre-prison employment, and mental health and substance abuse problems, and factors intended to minimize risk, such as educational or job skills programs and treatment for mental health or substance abuse disorders.

Another effort BJS plans to pursue is to link the self-report data with other federal administrative data. The goal of this effort is to supplement the survey data with detailed information on pre-prison employment, earnings, benefits received and eligibility, and other external factors that could contribute to our understanding of incarceration and community reentry. Through an existing interagency agreement (IAA) that was executed and funded in 2014, BJS plans to work with the Center for Administrative Records Research and Applications (CARRA) at the U.S. Census Bureau’s Center for Economic Studies (CES) to accomplish this records linkage.

As part of the consent process to participate in the SPI study, inmates will be informed of the intent to combine their self-report data with these various administrative records for statistical purposes and will be asked to provide verbal consent to do so.

Instrument Changes

The primary purpose of the CAPI feasibility test was to ensure that the CAPI survey instrument is correctly programmed for the national study and that inmates are properly routed through the survey. Although not the focus of the test, administration of the survey to nearly 45 inmates (60 were sampled) also provided an opportunity to identify changes to the instrument that can help minimize burden and enhance data quality (e.g., ways to simplify text to ease question delivery and help retain the respondent’s attention, or to expand response options and formats to allow more accurate recording of responses). Findings from the test suggested several revisions to the programming, as well as some changes to question wording and instructions, response options, and question order. This memorandum includes a version of the questionnaire in which each revision is shown (Attachment B) and a final version in which the changes have been made to the instrument (Attachment C).

Programming changes. Programming changes focused on ensuring correct question routing and on adding text fills to certain questions to provide greater specificity for the respondent. Examples of these changes include:

  • Question SESB6 (see footnote 2)


    • Original: [IF #UNDER18 = 1 or more] Now I would like for you to tell me about the contact you have had with any of your [#UNDER18] children who are under 18 years old. [12MON_FILL1] [DATE_ADMIT], what type of contact have you had with any of those children under 18 years old?


    • Revised: [IF #UNDER18 = 1 or more] Now I would like for you to tell me about the contact you have had with [CHILD_FILL2] [#UNDER18] [CHILD_FILL3]. [12MON_FILL1] [DATE_ADMIT], what type of contact have you had with [CHILD_FILL6]?


  • Question AU5


    • Original: [IF AU1=1] Had you been drinking any alcohol at the time of [the offense/any of the offenses] for which you are now incarcerated?


    • Revised: [IF AU1=1] Had you been drinking any alcohol at the time of [CONTROLLING_OFFENSE] for which you are now incarcerated?


  • Questions CJ9 and CJ10


    • Original: NO COMPARISON OF RESPONSES OF REPORTED ADMISSION DATE [CJ9] AND ARREST DATE [CJ10], THEREBY ALLOWING INCONSISTENT ANSWERS (E.G., ADMISSION DATE BEFORE ARREST DATE)


    • Revised: ADDED PROGRAMMED DATA CHECK TO COMPARE RESPONSES. IF THE REPORTED ADMISSION DATE IS BEFORE THE ARREST DATE, INTERVIEWERS ARE PRESENTED A SCRIPT FOR ERROR RESOLUTION – “The admission date you just gave me, [DATE_ADMIT], is before your arrest date, [DATE_ARREST], that I recorded earlier. Is this correct?” (An illustrative sketch of this kind of check follows these examples.)


  • Question CJ11 – Lookup Table


    • Original: THE LOOKUP TABLE USED TO SUPPORT INTERVIEWERS’ CODING OF OFFENSES INCLUDED MANY INSTANCES OF COMPLICATED AND SUPERFLUOUS TEXT THAT WAS NOT NECESSARY FOR QUESTION ROUTING OR ANALYTIC PURPOSES (E.G., “Burglary 2nd degree. R was under house arrest and cut off bracelet and is serving rest of house arrest time in jail”). THE FORMAT CAUSED CONFUSION FOR INTERVIEWERS AND SLOWED QUESTION ADMINISTRATION.


    • Revised: THE LOOKUP TABLE HAS BEEN REVISED TO FACILITATE EASE OF ADMINISTRATION AND PROMOTE CODING ACCURACY.
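
To make the CJ9/CJ10 data check concrete, the following is a minimal, illustrative sketch in Python of the kind of consistency check described above. The function name and field names are hypothetical; the actual check is programmed directly in the CAPI instrument.

    from datetime import date

    def check_admission_vs_arrest(admit_date, arrest_date):
        # If the reported admission date (CJ9) precedes the reported arrest
        # date (CJ10), return the scripted error-resolution prompt the
        # interviewer reads back to the respondent; otherwise return None.
        if admit_date < arrest_date:
            return ("The admission date you just gave me, {0}, is before your "
                    "arrest date, {1}, that I recorded earlier. Is this correct?"
                    ).format(admit_date.strftime("%B %d, %Y"),
                             arrest_date.strftime("%B %d, %Y"))
        return None

    # Example: admission reported as May 2014 but arrest recorded as June 2014.
    print(check_admission_vs_arrest(date(2014, 5, 10), date(2014, 6, 1)))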


Question wording and instructions. Question and instruction changes focused on ways to simplify items to reduce respondent burden, add clarification for the respondents to improve comprehension, and improve the accuracy of interviewer delivery of the question. The changes were limited in scope and did not impact the substance of the questions. Examples of these changes include:

  • Question AU1


    • Original: The next questions are about alcoholic beverages, such as beer, wine, brandy, and mixed drinks. These questions are about drinks of alcoholic beverages. Throughout these questions, by a “drink,” we mean a can or bottle of beer, a glass of wine or a wine cooler, a shot of liquor, or a mixed drink with liquor in it. We are not asking about times when you only had a sip or two from a drink.


    • Revised: The next questions are about drinks of alcoholic beverages, such as beer, wine, brandy, and mixed drinks. By a “drink,” we mean a can or bottle of beer, a glass of wine or a wine cooler, a shot of liquor, or a mixed drink with liquor in it.


  • Question P14


    • Original: Do you currently have a work assignment either inside the facility, on facility grounds, or outside the prison facility for which you leave the prison grounds?


    • Revised: Do you currently have a work assignment either inside the facility, on facility grounds, or away from the prison facility?


Response option changes. Response option changes focused on ways to provide additional definition/clarification, streamline the response options for respondents and interviewers, ensure that the response options were collectively exhaustive, and better align with analytic plans. Examples of these changes include:

  • Question SES4


    • Original: In what country were you born?

  1. UNITED STATES Go to SES6

  2. PUERTO RICO Go to SES6

  3. US VIRGIN ISLANDS Go to SES6

  4. GUAM Go to SES6

  5. AMERICAN SAMOA Go to SES6

  6. NORTHERN MARIANA ISLANDS Go to SES6

  7. OTHER COUNTRY Go to SES5

DK/REF

    • Revised: In what country were you born?

  1. UNITED STATES Go to SES6

  2. OTHER COUNTRY

(SES4_OTH) SPECIFY: _________ Go to SES5

DK/REF

  • Question P4


    • Original: [IF P1 = 2] What is the main reason you have not participated in any job training programs since you were admitted to prison [DATE_ADMIT]?


1 DOESN’T KNOW ANYTHING ABOUT PROGRAM

2 DOESN’T NEED PROGRAM

3 HASN’T BEEN OFFERED THE CHANCE TO ATTEND PROGRAM

4 HAS HEARD BAD THINGS ABOUT PROGRAM

5 STAFF DIDN’T WANT HIM/HER TO ATTEND PROGRAM

6 TOO BUSY TO ATTEND PROGRAM

7 NOT QUALIFIED/ALLOWED TO ATTEND PROGRAM

8 SOME OTHER REASON

P4_OTH__________________ (specify)

DK/REF


    • Revised: ADDITIONAL RESPONSE OPTIONS INCLUDE –

COULD NOT GET INTO PROGRAM/WAIT-LISTED

NO SPECIFIC REASON



In addition to these changes, the use of hardcopy “show cards” that display the response options for a few selected questions has been incorporated into the interview procedures. (See Attachment D for two examples.) These cards will reduce respondent burden and improve data quality by giving respondents ready access to the full array of response options without having to hear them repeated for each question.


Question order changes. The changes to the question order were primarily in the Drug Use section. With one exception, the changes will present the questions in “chronological” order for respondents. Presenting sets of questions in a sequence that follows a natural progression across time can help respondents restrict their frame of reference and focus more easily on the conditions required for each question. In addition, this approach can limit burden when a report of never engaging in a behavior makes follow-up questions unnecessary. The one exception is the last series of questions, asking about drug use in the 12 months prior to admission; this series leads into the series that measures drug dependence and abuse in the 12 months prior to admission. Placing these two series adjacently eliminates the need for a respondent to consider two different constructs covering the same 12-month period at two different points in the interview. Given these conditions, the original and revised ordering of the Drug Use section are as follows.

  • Original ordering

    • Used a substance 12 months prior to admission to prison

    • Ever used a substance

    • Used a substance at the time of offense

    • Used a substance in the 30 days prior to arrest



  • Revised ordering

    • Ever used a substance

    • Used a substance in the 30 days prior to arrest

    • Used a substance at the time of offense

    • Used a substance 12 months prior to admission to prison


In the revised version, program routing was updated such that if a respondent answered “no” to questions asking if he or she ever used a substance, he or she would not be asked subsequent questions related to other time periods, thereby reducing burden relative to the order in the previous version of the questionnaire.
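
As a minimal, illustrative sketch of this routing (the function and field names below are hypothetical and not taken from the CAPI instrument), the skip logic works roughly as follows:

    def route_drug_use_section(ever_used, ask):
        # ask(prompt) stands in for administering one CAPI item and
        # returning the respondent's answer.
        answers = {"ever_used": ever_used}
        if not ever_used:
            # Respondent never used the substance, so all later
            # reference-period items are skipped, reducing burden.
            return answers
        # Remaining items follow the revised chronological order.
        answers["30_days_prior_to_arrest"] = ask("Used in the 30 days prior to arrest?")
        answers["time_of_offense"] = ask("Used at the time of the offense?")
        answers["12_months_prior_to_admission"] = ask("Used in the 12 months prior to admission?")
        return answers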

As noted above, a version of the questionnaire in which each revision is shown and a final version in which the changes have been made to the instrument are presented in Attachment B and Attachment C, respectively. Exhibit 1, below, presents a description of the types of changes made to each of the ten sections of the survey.

Administration of Gender Identity and Sexual Orientation Questions

The SPI questionnaire includes two questions to measure gender identity and one to measure sexual orientation.

  • Question PH3 − What sex were you assigned at birth, on your original birth certificate?: Male; Female; DK/REF


  • Question PH4 − How do you describe yourself?: Male; Female; Transgender; Do not identify as male, female or transgender; DK/REF


  • Question PH5 − Which of the following best represents how you think of yourself?: Lesbian or gay; Straight, that is, not lesbian or gay; Bisexual; Something else; You don’t know the answer; REF





Per OMB’s request, as part of the CAPI feasibility test, interviewers were asked to pay particular attention to these items and to note any unusual reactions from inmates, as well as any questions or concerns raised by them once they heard the questions. The questions were asked of all participants, and none reacted in any way to the questions (other than providing their responses). None of the respondents self-reported as transgender (PH4), and all reported a sex assigned at birth (PH3) that matched the gender they identified with at the time of the interview (PH4).

Note that in the national study, even if transgender respondents do not identify as “transgender” and therefore do not report “transgender” in PH4, any inmates who report being born one gender (PH3) and identifying with the other at the time of the interview (PH4) could, in theory, be categorized by analysts as transgender, using an implicit measurement that relies on discordance between the responses to PH3 and PH4. The version of the questionnaire that was fielded for the CAPI feasibility test included an instruction to interviewers after PH4 to maximize data quality. The instruction is intended to identify cases where there is discordance between PH3 and PH4 and requires interviewers to confirm that it is not due to a keystroke error. We have retained that instruction in the final version of the questionnaire for the national study. Also, after the feasibility test, BJS decided to add an interviewer instruction to PH3 to assist interviewers in providing clarification to inmates who may have questions about what it means to have a sex “assigned at birth” or who may focus on the fact that they never saw their birth certificate.
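
For illustration only, the implicit (discordance-based) classification described above might be applied at the analysis stage roughly as sketched below; the response codes used here are hypothetical and do not reflect the actual coding scheme of the instrument.

    def classify_transgender(ph3_sex_at_birth, ph4_current_identity):
        # Returns True if the respondent is classified as transgender, either
        # explicitly (reports "TRANSGENDER" in PH4) or implicitly (PH3 and PH4
        # are discordant); returns None if either item is DK/REF or the
        # respondent does not identify as male, female, or transgender.
        if ph3_sex_at_birth in ("DK", "REF") or ph4_current_identity in ("DK", "REF"):
            return None
        if ph4_current_identity == "TRANSGENDER":
            return True
        if ph4_current_identity in ("MALE", "FEMALE"):
            return ph3_sex_at_birth != ph4_current_identity
        return None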

It is important to note, however, that the national SPI sample may prove too small to report reliable statistics or draw any meaningful conclusions related to transgender inmates.3

Consent Process and Text


The CAPI feasibility test did not reveal any problems with the consent text; no inmate raised any questions or expressed concern. However, subsequent to the test, BJS had discussions with staff from the Bureau of Prisons (BOP) and members of BOP’s IRB about the implementation of SPI in federal facilities. Those discussions led to recommendations that the text be revised to reduce the reading level of the consent form. There were concerns that prisoners may not have raised issues during the test because they may not have fully understood the request. In responding to these recommendations, it was determined that the text used for the August/September feasibility test was written at an 11th grade reading level on the Flesch-Kincaid scale. Revisions were identified that reduced the reading level to a 9th grade level. Attachment E provides the consent text and the interviewer script to be used if an inmate is reluctant to participate due to the planned linkage of the survey data to administrative records. The attachment includes one document (pages 1-3) showing the changes made to the form and one document (pages 4-6) showing the final version of the form.
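
For reference, the Flesch-Kincaid grade level mentioned above is a standard readability score computed as:

    Grade level = 0.39 x (total words / total sentences) + 11.8 x (total syllables / total words) - 15.59

Lowering the score from roughly an 11th to a 9th grade level therefore requires shorter sentences, words with fewer syllables, or both.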

Exhibit 1. Types of questionnaire changes, by section


Section 1 — Demographics (DEMO)

  • Programming revision:

    • Questions relating to the period of time spent in the U.S. Armed Forces have been updated to accept time measurements in more than one unit (years, months, weeks, and days).

  • Interviewer notes were added to provide instruction for when an interviewer should use the show cards.

  • Questions relating to military service had repeatedly referenced “United States Armed Forces.” This has been changed to “U.S. Armed Forces” to simplify question delivery.


Section 2 — Criminal Justice (CJ)

  • Programming revisions:

    • Routing instructions/specifications were revised to correct some routing errors.

    • Programming was added for a new inmate type that defines the controlling offense for the condition where an inmate has “NO offenses recorded.”

    • The programmed classification of the type of controlling offense was updated to include Type 5, which is defined as an “other” category.

    • Error boxes were programmed into the instrument that will direct interviewers to specific conflicts (e.g., date of admission is earlier than date of arrest) and provide scripted instructions on how to probe for clarification.

    • Programming fills were added to multiple questions to specify the reference period or the type of offense listed in a previous question.

    • The numeric range for some questions was increased to ensure that all answer scenarios will be captured.

  • Notes were added to provide interviewers with scripted instructions for reminding prisoners of the reference period of several questions, probing for a clear response, and providing respondents with clarification of terms like “parole violator” or “good time.”

  • Instructions to the interviewer on how to probe some questions have been updated; see for example, “date of arrest” and “type of offense” in the questionnaire.

  • Minor wording changes were made to simplify or clarify question and response option intent.


Section 3 — Socioeconomic Characteristics (SES)

  • Programming revision:

    • Programming fills were added to some questions and response options. These fills provide clarification for the respondent when being asked about his or her behavior related to a single child or multiple children.

  • Notes were added to provide interviewers with scripted instructions for reminding prisoners of the reference period of several questions and to provide respondents with clarification of terms like “agency” or “institution”.

  • Question text was revised to clarify when to include email as an option of communication with children and when to exclude such communication.

  • Minor wording changes were made to simplify or clarify question intent.


Section 4 — Mental Health (MH)

  • Notes were added to provide interviewers with instructions for when to use show cards and when to remind prisoners about the reference period of a question. Other notes were added to give interviewers instructions on how to interpret/code respondents’ answers.


Section 5 — Physical Health, Treatment, and Disabilities (PH)

  • Notes were added to provide interviewers with instructions for how to probe answers related to gender identity and when to remind prisoners about the reference period of a question.

  • Two questions were removed that asked whether the difficulties an inmate experienced while doing activities, concentrating, or making decisions were caused by physical problems or by mental or emotional problems. These were deleted due to limited utility and concerns about measurement error.


Section 6 ― Alcohol Use (AU)

  • Programming revision:

    • Programming fills were added to multiple questions to specify the reference period.

  • Notes were added to provide interviewers with instructions for when to use show cards and when to remind prisoners about the reference period of a question.

  • Wording changes were made to the introduction of this section to simplify and clarify the intent of the questions related to alcohol use.


Section 7 — Drug Use (DU)

  • Programming revision:

    • Programming routing was updated so that if a respondent answered “no” to questions asking if he or she “ever” used a drug, he or she would not be asked subsequent questions related to other time periods.

  • Notes were added to provide interviewers with instructions for when to use show cards and when to remind prisoners about the reference period of a question.

  • The order of the drug use questions was changed based on the reference periods of the questions. The new order of the questions is: ever, 30 days prior to arrest, time of offense, and then 12 months prior to admission.

  • Minor wording changes were made to simplify or clarify question intent.


Section 8 — Drug and Alcohol Treatment (DTX)

  • Programming revision:

    • Programming fills were added to specify the type of counseling received as being related to alcohol use, drug use, or use of both alcohol and drugs.

  • Questions were edited to clarify the definitions of treatment units and facilities.


Section 9 ― Rule Violations and Complaints (RV)

  • Programming revision:

    • The programmed question response range was updated to allow “0” to be a valid response.

  • Wording changes were made to clarify the intent of the questions related to breaking specific types of prison rules.


Section 10 – Programs, Services, and Work Assignments (P)

  • Two response options were added to a question asking for reasons why the respondent did not participate in an education program.

  • Wording changes were made to clarify the intent of a question related to off-grounds work assignments.



Attachments


  • Attachment A – IRB Approval Notice of 2016 SPI Changes_Amend 11-16-15

  • Attachment B – 2016 SPI Questionnaire –Track Changes_11-18-15 (V2)

  • Attachment C – 2016 SPI Questionnaire –Clean_11-18-15 (V2)

  • Attachment D – 2016 SPI Show Card Examples

  • Attachment E – 2016 SPI Consent_Track Changes_Clean

1 Prior iterations of BJS’s national survey of prisoners were known as the Survey of Inmates in State and Federal Correctional Facilities (SISFCF). The first survey of state prisoners was fielded in 1974 and periodically thereafter in 1979, 1986, 1991, 1997, and 2004. The first survey of federal prisoners was fielded in 1991 along with the survey of state prisoners, and the two have been fielded at the same time since then.

2 The wording of question SESB6 is heavily dependent on answers to previous questions in the interview. A hypothetical wording of the original and revised questions, assuming the same responses to the earlier questions, follows:

Original: Now I would like for you to tell me about the contact you have had with any of your three children who are under 18 years old. During the past 12 months, that is since September 15, 2015, what type of contact have you had with any of those children under 18 years old?

Revised: Now I would like for you to tell me about the contact you have had with any of your three children who are under 18. During the past 12 months, that is since September 15, 2015, what type of contact have you had with any of those children?


3 BJS administered the National Inmate Survey (NIS) in 2007, 2008-2009, and 2011-2012, and in all three iterations the questionnaire included an item that was used to measure transgender identity. Of the three iterations of NIS, the 2011-2012 sample of prisoners who completed interviews was the largest (about 44,000), and the number of inmates in the sample who reported identifying as transgender was also the largest (101), but transgender inmates represented only 0.2% of the sample. Based on assumptions about nonresponse, BJS expects that the number of completed interviews resulting from the 2016 SPI will be about half (about 23,200) that of the 2011-2012 NIS.


