Collection of Information Employing Statistical Methods
1. Universe and Respondent Selection
Universe. The target population for the 2016 Survey of Prison Inmates (SPI) is all inmates held in a prison in the United States during 2015 and 2016. The sampling frame for the survey consists of 2,001 prison facilities housing approximately 1,399,000 male inmates and 103,600 female inmates. A prison is a correctional facility administered by or for a state or federal government. The population universe is based on the 2012 Census of State and Federal Correctional Facilities (Census). The universe of facilities includes both confinement facilities (i.e., facilities where less than 50% of the inmates are regularly permitted to leave unaccompanied) and community-based facilities (i.e., facilities where 50% or more of the inmates are regularly permitted to leave unaccompanied). The 2012 Census has been updated to account for known changes in facilities that were planned between the completion of the Census and the selection of the sample. The types of changes made to the frame include:
Adjusting the population size of a facility to account for a planned change in population.
Removing facilities that are planned to close.
Adding new facilities that are known to be operating prior to selecting the sample.
The population of facilities is broken out by facilities operated by or for a state department of corrections (DOC) and facilities operated by or for the Federal Bureau of Prisons (BOP). Furthermore, a known set of facilities has medical and mental health services as its primary function; these facilities house inmates with more serious medical or mental health issues. Table B1 presents the distribution of facilities, and the number of inmates housed within them, by government type (state or federal), the sex of inmates housed in the facility (males only, females only, or both males and females), and whether the facility's primary function is medical or mental health services.
Table B1. Number of facilities and inmates held, by government type, sex housed, and primary function of medical or mental health services
Government type | Sex housed | Medical/mental health services | Number of facilities | Male inmates held | Female inmates held
State | Male only | No | 1,309 | 1,108,164 | 0
State | Female only | No | 184 | 0 | 75,519
State | Both | No | 243 | 40,760 | 9,402
State | Male only | Yes | 53 | 64,980 | 0
State | Female only | Yes | 7 | 0 | 5,278
State | Both | Yes | 14 | 6,391 | 595
Federal | Male only | No | 166 | 171,313 | 0
Federal | Female only | No | 18 | 0 | 11,334
Federal | Both | No | 1 | 1,304 | 53
Federal | Male only | Yes | 5 | 6,096 | 0
Federal | Female only | Yes | 1 | 0 | 1,397
Federal | Both | Yes | 0 | 0 | 0
Total | | | 2,001 | 1,399,008 | 103,578
Sample design. The 2016 SPI sample is a two-stage, self-weighting design. In the first stage, a sample of facilities will be selected with probability proportionate to a size (PPS) measure. In the second stage, a simple random sample of inmates will be selected from each sampled facility. The 2016 SPI is designed to obtain estimates for the following types of inmates:
state, male inmate
state, female inmate
inmates housed in states with over 100,000 inmates as of December 31, 2013, which includes Texas, California, and Florida1
inmates housed in federal facilities
For state male and state female estimates, the design is powered to obtain estimates with a precision level equal to the minimum of the relative standard error (RSE) obtained for the estimate in the 2004 SPI (formerly known as the 2004 Survey of Inmates in State and Federal Correctional Facilities) or an RSE of 0.10 for key outcomes. Estimates for inmates in state jurisdictions with over 100,000 inmates or in federal facilities are powered to a precision level equal to the minimum of the RSE obtained for the estimate in the 2004 SPI or 0.15 for key outcomes. Table B2 presents the number of inmates who must be interviewed to meet the precision goals.
Table B2. Number of inmates needed to obtain precision goals by analysis strata*
Analysis strata | Number of interviewed inmates
State, male inmates | 13,713
State, female inmates | 4,399
Texas inmates | 2,202
California inmates | 1,499
Florida inmates | 1,199
Federal inmates | 4,569
*Not all analysis strata are mutually exclusive. The interviewed inmates in Texas, California, and Florida are also included in the state male and female totals; the figures shown are the numbers of completed interviews necessary to meet the precision goals for each stratum.
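Expressed symbolically (a restatement of the precision rule described above Table B2, in our own notation), the target for an estimate is:

```latex
\mathrm{RSE}(\hat{\theta}) = \frac{\mathrm{SE}(\hat{\theta})}{\hat{\theta}}, \qquad
\mathrm{RSE}_{\mathrm{target}}(\hat{\theta}) =
\begin{cases}
  \min\!\left(\mathrm{RSE}_{2004}(\hat{\theta}),\; 0.10\right) & \text{state male and state female estimates} \\
  \min\!\left(\mathrm{RSE}_{2004}(\hat{\theta}),\; 0.15\right) & \text{large-jurisdiction and federal estimates}
\end{cases}
```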
First-stage design. To achieve the desired levels of precision, the 2016 SPI will obtain interviews from inmates in 350 participating facilities. Prior to selecting facilities, the universe will be divided into prisons that house male inmates and prisons that house female inmates. The 258 facilities that house both males and females (see Table B1) will be placed on both the male and female lists of facilities, with the inmate population equal to the number of inmates of the particular sex housed (e.g., a 1,500-inmate facility with 1,200 male inmates and 300 female inmates would be placed on the male facility list with an inmate size of 1,200 and on the female facility list with an inmate size of 300). Within each sex list, the universe of facilities will be stratified by five geo-administrative strata. The strata include –
federal facilities operated by BOP,
state facilities in Texas,
state facilities in California,
state facilities in Florida, and
state facilities in the remaining 47 states.
Prior to allocating the desired number of participating facilities across strata, a size measure will be constructed for each facility. For each sex list, the size measure will be based on the number of inmates housed in the facility, adjusted so that the likelihood of selecting a facility whose primary function is medical or mental health services is increased by a factor of 3.0. Medical and mental health facilities are being oversampled because many of the key outcomes in the survey relate to medical and mental health; the oversample will improve the reliability of these estimates by increasing the respondent sample size of inmates with either a medical or mental health condition. Of the 350 facilities, 340 will be proportionately allocated by government type (i.e., state or federal) based on the total size measure of state and federal facilities. The remaining 10 facilities will be allocated to the federal stratum; these 10 additional facilities are a “boost” to the federal sample size to ensure the precision goals for that analysis stratum are met. To achieve the desired precision goals for estimates among female inmates, the allocation will select 3.5 times more state facilities housing females, and 2.4 times more federal facilities housing females, than would be selected in proportion to their size measures. Among the state facilities, the allocation will be proportional across the four geographic strata. However, to reduce burden in the state jurisdictions with over 100,000 inmates, if the proportional allocation, based on the size measure, yields an expected sample size of confinement facilities greater than 30% of the confinement facilities in that jurisdiction, then the number of selected confinement facilities in that jurisdiction will be capped at 30%. Based on these allocation rules, Table B3 presents the anticipated allocation of the sample across the ten sampling strata.
Table B3. Expected sample allocation of participating facilities across geographic strata by sex of inmate housed
Geographic strata | Male facilities | Female facilities
Federal | 41 | 10
Texas | 25 | 4
California | 17 | 2
Florida | 16 | 2
Remaining 47 states | 167 | 66
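As a minimal illustration of the size-measure construction and proportional allocation described above, the following Python sketch applies the stated 3.0 oversampling factor and largest-remainder rounding; the function and variable names are ours, not the production system's. The female-list boost factors (3.5 state, 2.4 federal) and the 30% confinement-facility cap would be applied on top of this proportional baseline.

```python
MEDICAL_FACTOR = 3.0  # stated oversampling factor for medical/mental health facilities

def size_measure(inmates_of_sex: int, is_medical: bool) -> float:
    """Size measure for one facility on one sex-specific list."""
    return inmates_of_sex * (MEDICAL_FACTOR if is_medical else 1.0)

def allocate_proportionally(stratum_totals: dict, n_facilities: int) -> dict:
    """Allocate n_facilities across strata in proportion to total size measure,
    using largest-remainder rounding so the allocation sums to n_facilities."""
    grand_total = sum(stratum_totals.values())
    raw = {s: n_facilities * t / grand_total for s, t in stratum_totals.items()}
    alloc = {s: int(r) for s, r in raw.items()}
    shortfall = n_facilities - sum(alloc.values())
    # Give the leftover facilities to the strata with the largest remainders
    for s in sorted(raw, key=lambda s: raw[s] - alloc[s], reverse=True)[:shortfall]:
        alloc[s] += 1
    return alloc
```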
The design has two mechanisms built in to help ensure that 350 facilities participate in the study. The first is a participation factor adjustment. The 2016 SPI design assumes that 85% of facilities will participate; this rate is based on BJS’s recent experiences fielding the National Inmate Survey (NIS). Reasons for non-participation include refusal at the jurisdiction or facility level and ineligibility. Given the adjustments made to the universe of prisons, the number of ineligible facilities is expected to be small. The participation factor adjustment will be incorporated into the design by inflating the number of facilities in Table B3, dividing the number of facilities by 0.85. The only exception to applying this participation factor is when one of the self-representing strata (i.e., Texas, California, Florida, or federal) has met its cap; if that occurs, the number of facilities selected will not exceed the cap. The second mechanism is a reserve sample. A reserve sample of 50 facilities will be incorporated into the sample selection but will be released only if the participation rate falls below the desired level. The reserve sample will come solely from the stratum of the remaining 47 states, because partial participation of the federal, Texas, California, or Florida strata is not expected.2 If one of these jurisdictions refuses entirely, reserve sample within it would not help achieve the desired number of participating facilities. The reserve sample will be allocated proportionally across facilities by sex based on the distribution of the desired participating facilities. Incorporating these two methods to account for non-participation, the starting sample size will be 416 facilities.
The starting sample of facilities will be selected with probability proportionate to the facility’s size measure. Facilities with a large size measure (i.e., an expected probability of selection greater than one) will be selected with certainty. Within each explicit stratum, the non-self-representing facilities will be implicitly stratified by facility type (confinement vs. community-based), whether the facility is primarily a medical or mental health facility, Census region, and state. Within each implicit stratum, facilities will be randomly ordered. Furthermore, to maintain the properties of a PPS sample, the reserve sample will be sub-selected from the initial sample of facilities via a systematic sample: the initial sample of facilities will be implicitly stratified as described above and sorted by facility size to systematically select the reserve facilities. Part or all of the reserve sample will be released in the second half of data collection if the facility participation rate is projected to be less than 85%.
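A minimal sketch of the first-stage selection just described, assuming the facility list is already sorted in the implicit-stratification order; the iterative certainty split and the systematic PPS pass are standard techniques, and the function names are ours.

```python
import random

def split_certainties(facilities, n):
    """Iteratively pull out facilities whose expected selection probability
    (n * size / total size) is >= 1; these are selected with certainty.
    facilities: list of (facility_id, size_measure) tuples."""
    certain, rest = [], list(facilities)
    while n > 0 and rest:
        total = sum(size for _, size in rest)
        big = [f for f in rest if n * f[1] / total >= 1.0]
        if not big:
            break
        f = max(big, key=lambda x: x[1])  # remove the largest, then re-check
        certain.append(f)
        rest.remove(f)
        n -= 1
    return certain, rest, n

def systematic_pps(rest, n, rng=None):
    """Systematic PPS sample of n facilities from the non-certainty list,
    assumed sorted in the implicit-stratification order described above."""
    rng = rng or random.Random()
    total = sum(size for _, size in rest)
    step = total / n
    start = rng.uniform(0, step)          # one random start for the whole pass
    targets = [start + i * step for i in range(n)]
    hits, cum, k = [], 0.0, 0
    for fid, size in rest:
        cum += size
        while k < n and targets[k] < cum:  # no facility exceeds step after the split
            hits.append(fid)
            k += 1
    return hits
```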
Second-stage selection. In the second stage of selection, a random sample of inmates will be selected from each facility. To maintain a self-weighting design, a constant number of inmates will be selected from each facility. In state prisons, a simple random sample of inmates will be selected. In federal prisons, a stratified simple random sample of inmates will be selected, with inmates stratified by whether or not their controlling offense is a drug offense. Because more than half of the federal prison population is comprised of drug offenders, the 2016 SPI design will oversample non-drug offenders by a factor of 1.5 to ensure their representation in the sample.3
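A sketch of the federal second-stage allocation implied by the stated 1.5 oversampling factor; the allocation algebra is our derivation from that factor, and the names are hypothetical.

```python
import random

def federal_inmate_sample(roster, n_total, rng=None):
    """roster: list of (inmate_id, is_drug_offense) pairs.
    Solve n_nd/|ND| = 1.5 * n_d/|D| with n_nd + n_d = n_total so that a
    non-drug offender is 1.5 times as likely to be selected as a drug offender."""
    rng = rng or random.Random()
    drug = [i for i, is_drug in roster if is_drug]
    nondrug = [i for i, is_drug in roster if not is_drug]
    n_nd = round(n_total * 1.5 * len(nondrug) / (len(drug) + 1.5 * len(nondrug)))
    n_d = n_total - n_nd
    return (rng.sample(drug, min(n_d, len(drug)))
            + rng.sample(nondrug, min(n_nd, len(nondrug))))
```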
To meet the desired levels of precision, the numbers of inmates selected in state and federal facilities will differ. In state facilities, 64 completed interviews are desired, while in federal facilities 80 completed interviews are desired. To obtain the desired number of interviews, a 70% response rate will be assumed.4 Given this response rate assumption, a starting sample of 92 inmates in state facilities and 115 inmates in federal facilities will be selected. The design allows for two exceptions to this within-facility sample size:
If the starting sample size is greater than 75% of a facility’s population, or
If the actual population of a facility differs from the population on the frame by 20% or more.
In the first case, the 2016 SPI does not want to overburden small facilities. Therefore, when the starting sample size (92 inmates in state facilities or 115 inmates in federal facilities) is greater than 75% of the population (i.e., state facilities with fewer than 123 inmates or federal facilities with fewer than 153 inmates), the sample size will be capped at 75% of the facility population. Based on a simulation study, 3.6% of the 350 participating facilities (2.1% of male, state facilities; 10.4% of female, state facilities; 0.8% of male, federal facilities; and 0.7% of female, federal facilities) will have their sample sizes adjusted because they are small.5
In the second case, the 2016 SPI is designed to be as close to self-weighting as possible, which means selecting a constant number of inmates per facility. In cases where the actual population at the time of data collection deviates from the expected population on the frame by 20% or more, the within-facility sample size will be adjusted so that the resulting weights do not differ markedly from those of other facilities within the same stratum. To balance the need to maintain constant workloads across facilities, within-facility sample sizes will be capped at 80 completed interviews in state facilities and 100 completed interviews in federal facilities (i.e., a 25% increase in the population) when the facility is larger than expected, and at 50 completed interviews in state facilities and 64 completed interviews in federal facilities (i.e., a 20% decrease in the population) when the facility is smaller than expected. Because the frame for the 2016 SPI was adjusted to account for expected changes in facility populations, the number of cases in which this adjustment is necessary is expected to be small.
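Pulled together, the within-facility sample-size rules above can be sketched as follows; the numeric constants come directly from the text, while the order in which the two exceptions interact is our reading of it.

```python
from math import ceil

RESPONSE_RATE = 0.70
TARGET = {"state": 64, "federal": 80}     # desired completed interviews
CAP_HIGH = {"state": 80, "federal": 100}  # cap when facility is larger than expected
CAP_LOW = {"state": 50, "federal": 64}    # cap when facility is smaller than expected

def starting_sample_size(gov_type, frame_pop, actual_pop):
    target = TARGET[gov_type]
    # Exception 2: actual population deviates from the frame by 20% or more.
    if abs(actual_pop - frame_pop) / frame_pop >= 0.20:
        target = min(max(target * actual_pop / frame_pop, CAP_LOW[gov_type]),
                     CAP_HIGH[gov_type])
    n = ceil(target / RESPONSE_RATE)  # 92 (state) or 115 (federal) in the base case
    # Exception 1: never select more than 75% of a small facility's population.
    return min(n, int(0.75 * actual_pop))
```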
To test the programming of the CAPI instrument prior to fielding the national study, a purposive sample of two prisons not included in the national sample will be selected. Two state facilities located in Virginia will be selected based on their proximity to one another to minimize data collection costs and the time to complete the feasibility test as well as to increase efficiencies (e.g., using one interviewer team). Within each facility, thirty inmates will be randomly sampled. Because the goal of the pretest is to assess the functionality of the CAPI instrument given the changes since the 2013 SPI Pilot Study, we believe a sample of 60 inmates from two facilities is sufficient.
2. Procedures for Information Collection
The 2016 SPI data collection procedures were implemented in the 2013 SPI Pilot Study and are modeled after the approach that was used to conduct three waves of NIS (and adapted where necessary), for which over 250,000 inmates were interviewed in more than 1,200 correctional facilities. The experience of BJS and RTI conducting three rounds of NIS has provided a wealth of knowledge regarding how to effectively work with a variety of prisons to schedule and conduct data collection. While NIS and SPI serve different purposes, there are many similarities related to the logistical/operational components of the two, including the need to identify private interviewing space at each facility, provide information on the interviewers that allows the facility to conduct background checks on the team in advance of data collection, and determine how best to manage the flow of inmates to and from the interviewing location. Although no two prisons are exactly the same, we believe these plans and our experience interacting with and collecting data within a variety of prisons in NIS (and previous iterations of SPI) will result in a successful administration that minimizes burden on facilities and inmates while maximizing response and data quality.
Procedures for collecting the data include the following −
Obtaining Approval from RTI’s IRB. Approval of the final 2016 SPI questionnaire, consent form, and protocols for the implementation of the CAPI feasibility test and national study is pending with RTI’s IRB. Once a copy of the approval notice is received, BJS can provide a copy to OMB if necessary.
Obtaining Approval from Jurisdictions. The process to obtain approval from each jurisdiction with facilities in the SPI sample will begin once BJS receives approval from OMB to conduct the study. For the CAPI instrument test in two facilities prior to the national study, a letter will be sent to the commissioner of the DOC in Virginia to obtain approval, notify him of the two facilities, and request a contact at the DOC (see Attachment A6 – CAPI Testing Initial Contact Commissioner Letter). For the national study, letters will be sent to each commissioner of the 50 state DOCs and BOP to notify them of whether or not facilities in their jurisdiction are included in the SPI sample (see Attachment A7 – Announcement Letter Jurisdictions Sampled Facilities and Attachment A8 – Announcement Letter Jurisdictions No Sampled Facilities). (See Section 3 Methods to Maximize Response below for more information about the introduction letter that will be sent to all jurisdictions prior to beginning the national study.) The letter to commissioners with facilities in the sample will ask for their approval to conduct the study, provide them with a list of sampled facilities in their jurisdiction, and explain that the RTI Logistics Manager will be in touch to discuss arrangements, beginning with the establishment of a liaison from the DOC/BOP. Once a DOC/BOP contact is established, the Logistics Manager will contact that person to determine the jurisdiction’s preference of notifying particular facilities of their selection. (See Prison Recruitment section below.)
The Association of State Correctional Administrators (ASCA) is working with BJS and RTI to encourage participation among DOCs. If commissioners do not respond to approval requests from BJS, ASCA will follow up with the nonresponding DOCs via email and/or telephone calls to encourage approval and participation. (See Section 3 Methods to Maximize Response below for more information about the role of ASCA in the 2016 SPI.)
Prison Recruitment. Depending on the jurisdiction and their preferences, sampled prisons may be notified of their selection via a letter from BJS (see Attachment A9 – Sampled Facility Letter). In other jurisdictions, the DOC contact may prefer to notify its facilities of their selection. The RTI Logistics Manager will then work directly with each sampled prison to solicit participation and a contact person will be identified at each prison. Working with this individual, the Logistics Manager will finalize details for data collection, including submission of background check forms for the interviewers, identifying appropriate space for interviewing, need for bilingual interviewers, format of the roster which will be used to draw the sample of inmates, number of days and hours of each day when interviewing can be conducted, specific rules regarding items that may be brought into the prison, and instructions for arriving at the facility. Sampled facilities will also be provided with a one-page document that includes a list of questions and answers about SPI to provide facility staff with a brief overview of the study and assist them with the data collection process (see Attachment A10 – FAQ Facility).
Sampling of Inmates. No more than one week prior to data collection at a prison, the RTI Logistics Manager will work with the prison to obtain a roster of all inmates currently incarcerated there. Receiving the roster as close as possible to when data collection begins will ensure the sampling frame is as accurate as possible. RTI statisticians will use the rosters to draw a random sample of inmates within each facility. The day before data collection is scheduled to begin at a facility, the selected sample of inmates will be loaded into the data collection case management system and transmitted to the team of interviewers assigned to that facility.
In order to determine whether any bias is introduced due to inmate nonresponse, facilities will be asked to provide administrative record data for all inmates on the roster so that the demographic characteristics of responding inmates can be compared with those of inmates who do not participate, if necessary. (See Section 3 Methods to Maximize Response – Nonresponse Adjustments below for more information on the nonresponse bias analysis.) In addition to each inmate’s date of birth, sex, race/ethnicity, admission date, offense type, and sentence length, needed to conduct the nonresponse bias analysis if necessary, facilities will also be asked to provide the inmate’s name, the housing unit where the inmate resides within the facility, the inmate’s unique fingerprint-supported State Identification (SID) number, and the FBI’s unique fingerprint identification number for each inmate. The inmate’s name will be used by the interviewer to confirm that the correct inmate has been brought to the interview room. The housing unit information will be used to facilitate data collection within each facility, specifically by housing units, if possible, to minimize any potential contamination by inmates who may discuss the survey with other potential inmate respondents once they leave the interview room. The SID and FBI numbers will be used by BJS and RTI to link the inmate self-report data collected through SPI with administrative records to supplement the inmate survey data and conduct additional studies.
Data Collection. A team of trained interviewers will visit the prison. They will ask correctional officers to bring each sampled inmate to a private interviewing area. Prior to the onset of the interview, interviewers will read the informed consent document to the inmate and will offer a copy of the document for the inmate to keep as well (see Attachment A4 – Interview Consent CAPI Testing Form and Attachment A5 – Interview Consent National Study Form). If the inmate consents, the interviewer will administer the interview by asking questions and recording the inmate’s responses on a laptop. If the inmate initially refuses, the interviewer will be trained to address any potential concerns to enhance participation. If the inmate is still hesitant to participate and expresses concerns about aspects of the records linkage, they will be informed that they can opt out of BJS reviewing their future criminal history records to conduct a recidivism study and linkage to other federal administrative data, but can still participate in the survey and linkage to their current criminal records to save time in the interview. The interviewer will code the inmate’s decision in the case management system (CMS). Inmates who consent will continue on with the survey, while those who still refuse to participate will be thanked and will exit the interviewing room.
The pretest will be an opportunity to determine if this approach to obtain consent will likely be successful. If the pretest reveals potential problems with this approach that could adversely impact survey response, BJS and RTI will use the information to determine the necessary changes to make to the process prior to fielding the national study. If changes are necessary, BJS will submit a nonsubstantive change to OMB for approval prior to fielding the national study.
Interviewing will occur during the hours approved by the prison. Depending on the amount of interviewing space available and the number of hours approved each day by the facilities, we expect data collection will be completed within three to five days per facility. The number of interviewers who work in a given prison is dependent on the amount of interviewing space available. Typically prisons can accommodate four to six interviewers but we are prepared to send as many or as few interviewers as a prison can accommodate.
Thank You Letters. Within a week of completing data collection, a letter will be sent to each participating jurisdiction and facility to thank them for their participation in the study (see Attachment A11 – BJS Thank You Letter DOC and BOP, Attachment A12 – BJS Thank You Letter Facility, and Attachment A13 – RTI Thank You Letter Facility; see Attachment A14 – BJS Thank You Letter DOC Facility CAPI Testing).
3. Methods to Maximize Response
Marketing. BJS and RTI have been and will continue to market the 2016 SPI to garner support and generate interest among stakeholders to maximize response.
Expert review of questionnaire. Throughout the design of the 2016 SPI questionnaire, we solicited input from a variety of stakeholders to prioritize the content covered in the questionnaire and make difficult decisions about the constructs to exclude. We conducted two rounds of expert review. The first round consisted of reviewing a list of tentative domains and constructs to measure through SPI. In this round, stakeholders were asked to rank them in terms of priority, including identifying particular domains/constructs that could be removed from the list based on their relative importance to corrections fields. The second round of review included a draft questionnaire and a summary of each section in the questionnaire detailing the specific constructs, reference periods, and inmate populations that would receive the particular questions. Stakeholders were asked to review the documents and assess the scope of each section and its importance to them/corrections. Typically, reviewer feedback was provided to BJS through email. Stakeholders in both rounds of expert review included federal, state, and local corrections administrators and practitioners (see Part A, Section 8 Adherence to 5 CFR 1320.8(d) and Outside Consultations), criminal justice and correctional associations (e.g., American Correctional Association, Association of State Correctional Administrators, American Probation and Parole Association, American Society of Criminology, etc.), other federal agencies (e.g., National Institute of Corrections, National Institute of Mental Health, Substance Abuse and Mental Health Services Administration, U.S. Department of Veterans Affairs, U.S. Department of Health and Human Services, Centers for Medicare and Medicaid Services, Office of National Drug Control Policy, etc.), and researchers (see Part A, Section 8 Adherence to 5 CFR 1320.8(d) and Outside Consultations). This process also had the dual purpose of garnering support and interest among the corrections field for the 2016 SPI.
Presentations and open-discussion sessions. BJS and RTI have given presentations and conducted open-discussion sessions about the 2016 SPI with a variety of stakeholders at various venues over the past few years, including workshops at the American Correctional Association’s (ACA) annual summer and winter conferences, the American Society of Criminology (ASC) annual research conference, the annual BJS/National Institute of Corrections (NIC) NCRP Data Providers Conference, and biannual meetings of the ASCA Research Committee.
Endorsement by ASCA. ASCA has agreed to work with BJS and RTI to encourage its members to participate in the 2016 SPI, and this promotional work has already started. Brief announcements about SPI were made at recent biannual meetings that notified members of the upcoming study, educated them about the goals of SPI, and demonstrated the utility of the data. In addition, a letter of endorsement from the ASCA Research and Best Practices Committee will be sent to commissioners of the state DOCs with facilities in the SPI sample to encourage cooperation (see Attachment A15 – ASCA Letter). If necessary, ASCA will also follow up with nonresponding DOCs and encourage their participation by addressing concerns, explaining what SPI offers in terms of data, and stressing how vital their participation is for the success of the study.
SPI flyer. This two-page document includes key information about the 2016 SPI and was designed to inform a variety of stakeholders about the study (see Attachment A16 – SPI Flyer). The flyer includes background information, study goals, sample design and sizes, differences between SPI and NIS, content covered in the questionnaire, estimated time frame of data collection, topics of future statistical products, and archiving of the data. This flyer has been distributed by BJS at various workshops and open-discussion sessions at conferences, information booths located in exhibit halls of conferences, and to ASCA to share with its membership. In addition, this flyer will be included with the letter that is sent by BJS to all 50 state DOCs and BOP to officially announce the beginning of the next iteration of SPI. (See Announcement to corrections administrators below.)
Announcement to corrections administrators. Prior to the start of the national SPI study, BJS will send letters to the commissioners of the 50 state DOCs and BOP to introduce the SPI study (see Attachment A17 – SPI Introduction Letter). This letter will also contain the aforementioned SPI flyer. The goal of introducing the national study to all jurisdictions, even those who may not have facilities in the sample, is to ensure the corrections field is aware that the study will be occurring, explain the importance of the study and the utility of the data to the field, explain the study procedures so jurisdictions understand how they may or may not be included in the sample, provide an estimated time frame of when they will be notified as to whether or not facilities in their jurisdiction have been sampled and when BJS will be seeking their approval to conduct the study, provide them an opportunity to ask questions and review study materials (e.g., questionnaire) if interested, and generate interest and garner support from these key stakeholders.
Sample Design
Methods to minimize burden. In order to reduce burden in the state jurisdictions with over 100,000 inmates, if the proportional allocation of facilities, based on the size measure, yields an expected sample size of confinement facilities greater than 30% of the confinement facilities in that jurisdiction, then the number of selected confinement facilities in that jurisdiction will be capped at 30%. In addition, to avoid overburdening small facilities that are included in the sample, if the sample size of inmates per facility is more than 75% of a facility’s population then the inmate sample size will be capped at 75% of the facility population.
Reserve sample. As previously explained, the SPI sample design assumes a facility participation rate of 85% which is based on BJS’s recent experiences fielding three waves of the NIS. However, a reserve sample of 50 facilities will be incorporated into the sample selection and all or part of it will be released if the facility-level response rate falls below the desired level. This method will maximize response and data quality by helping to ensure that 350 facilities participate in the study, which is the number of participating facilities necessary to meet desired precision goals for key outcomes.
Administration
Protocols to minimize burden on facilities. The RTI Logistics Team has extensive experience effectively working with a variety of prisons through three rounds of NIS and the 2013 SPI Pilot Study to schedule and conduct data collection. Protocols have been established to minimize burden on facilities as much as possible, including a customized data collection schedule and minimizing the number of days in the facility to conduct data collection. The protocols allow for flexibility given that facilities are expected to vary in terms of interviewing space, number of days and hours of each day when interviewing can be conducted, specific rules regarding items that may be brought into the prison, and instructions for arriving at the facility.
Minimize burden on inmates. Every effort has been made to minimize the burden of the 2016 SPI administration on inmates. The SPI questionnaire has been designed and tested to maximize respondent comprehension. Also, the interview length has been reduced from an average of 83 minutes in the 2013 SPI Pilot Study to an estimated 60 minutes for the national implementation, including the informed consent process. Because the total interview time will be shorter, agreeing to participate will likely be more appealing to inmates and the survey will be less onerous to those who do participate. (See Section 4 Tests of Procedures or Methods below for more information on cognitive testing and pilot study results.) Whether interacting with facility staff or inmates, interviewers are trained to be courteous and professional in their behavior. They will adhere to the prescribed facility protocol and avoid engaging in extraneous conversations with inmates that could lengthen the interview unnecessarily. Interviewers will learn at training that an inmate’s movements and activities at the facility are typically quite structured with little room for deviations. The interviewers will work efficiently to ensure inmates are not delayed in getting to meals, counts, etc.
Protocols to maximize inmate response. Field interviewers will be trained on refusal avoidance to maximize response to the survey (see Attachment D – 2013 SPI Pilot Study Training Manual and Attachment A18 – Agenda Training National Study).6 During the consent process, they will tell inmates that their data is important to understanding the experiences of inmates. They will inform inmates that participation is voluntary and that any information that might identify them is confidential. During training, interviewers will gain experience responding to common questions likely to be raised by inmates. During mock interviews, they will have an opportunity to practice answering questions, and addressing general objections so they become comfortable with the information. Interviewers will encourage hesitant inmates to start the interview and see how it goes. Inmates who are hesitant to allow BJS to link their responses with administrative records will be offered the option to opt out of this step. Interviewers will be trained to minimize the number of inmates who refuse to participate in the full study and the number who opt out of records linkage by addressing inmate concerns about this process and its implications.
Interviewers will also work with facility staff to arrange for inmates who must stop the interview before reaching the end (perhaps because they must go to a meal, to a job, or return to their housing unit for a count) to come back at a later time to complete the interview. Interviewers will work closely with their facility contact to handle these restarted interviews efficiently.
Although we are precluded from offering incentives to inmates, the Logistics Manager will seek approval from each facility to offer a light snack (e.g., Chips-Ahoy 100 calorie cookies). If approved, inmates will be offered the snack and will be required to consume it prior to leaving the interviewing area (so it cannot be used as “currency” later). The interviewer will collect all trash and dispose of it according to facility procedures. While we know that some prisons will not allow the snacks, our experience conducting NIS demonstrated that offering snacks led to a 6% increase in inmate participation.
Lastly, a Spanish version of the CAPI questionnaire will be available for Spanish speaking respondents. Interviewer teams will consist of bilingual staff who have been RTI-certified as capable of conducting the interviews in Spanish. Only interviewers who have been certified are allowed to conduct Spanish language interviews. During discussions with their facility contact, the Logistics Manager will determine the percentage of Spanish speakers housed at the facility and bilingual interviewers will be staffed onto the team appropriately (e.g., more bilingual interviewers in a facility with a higher proportion of Spanish speakers).
Nonresponse Adjustments
Unit nonresponse. With any survey, it is typically the case that some selected subjects will not respond to the survey request (i.e., unit nonresponse) and some will not respond to particular questions (i.e., item nonresponse), despite best efforts to collect all the data. Weighting will be used to adjust for unit nonresponse in SPI. The weights created will allow for analysis of the cross-sectional sample of prisoners, including those in the self-representing jurisdictions. Four adjustments to the design-based weights will be made before they are finalized. These adjustments will be made within each of the six analysis strata (see Table B2) to ensure that weight totals properly support BJS’s two analytic goals of producing national estimates and large-jurisdiction estimates.
First, adjustments will be made for nonresponse at the first stage of selection (i.e., a refusal at the facility level). In this stage, the information available on the 2012 Census of State and Federal Correctional Facilities will be used to make adjustments. To determine which factors to use in the facility nonresponse weight adjustments, a procedure available in RTI’s SUDAAN software based on the Generalized Exponential Model (GEM) will be used to model the response propensity using information from the sampling frame and administrative records (e.g., facility characteristics such as facility size, whether mental health or medical services are provided, etc.).7 Ideally, only variables highly correlated with the outcomes of interest will be included in the model in order to reduce the potential for bias. However, because that is not known, all facility characteristics and significant lower-level interactions will be included in the model.
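The GEM procedure is specific to SUDAAN; as an accessible stand-in, the sketch below fits an ordinary logistic response-propensity model and forms inverse-propensity adjustment factors. The covariate names are hypothetical, and GEM itself applies additional bounding and calibration steps not reproduced here.

```python
import pandas as pd
import statsmodels.formula.api as smf

def facility_nr_adjustment(sample: pd.DataFrame) -> pd.Series:
    """sample: one row per selected facility, with a 0/1 'responded' column
    and frame characteristics. Returns a weight adjustment factor per facility."""
    model = smf.logit(
        "responded ~ facility_size + medical_services + mental_health_services",
        data=sample,
    ).fit()
    propensity = model.predict(sample)
    # Inflate respondents' weights by the inverse of their estimated propensity
    return 1.0 / propensity
```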
Second, adjustments for nonresponse at the second stage of sample selection (i.e., a refusal by an inmate) will be made. A nonresponse analysis will be conducted, and the results of this analysis will be used to adjust the weights to reduce bias due to nonresponse. The factors used in the nonresponse weight adjustment must be known for both respondents and nonrespondents, and should be correlated with response propensity and the key outcomes of interest. To determine which factors to use in the inmate nonresponse weight adjustments, the GEM will be used to model the response propensity using information from the sampling frame and administrative records (e.g., sex, age, race/ethnicity, offense, etc.). All inmate characteristics available on the frame will be included in the model, as well as significant lower-order interactions. For respondents who do not provide demographic information, imputation, as described below in the Item nonresponse section, will be utilized prior to fitting the nonresponse model. A multifaceted approach will be implemented to determine patterns of nonresponse and estimate the potential nonresponse bias. The approach will include use of administrative records, modeling, descriptive statistics, and Cohen’s effect size, which provides an estimate of the size of any potential bias.8 (See Nonresponse bias analysis below.)
Third, as we will obtain a full roster from each facility, we will create a poststratification adjustment based on the characteristics of the facility’s complete population. In other words, the second step will account for any potential bias due to nonresponse while the third step will account for potential bias among the population of inmates in selected facilities due to the sample selection mechanism.
Fourth, we will benchmark the weights to the latest National Prisoner Statistics (NPS) program totals by sex. Because the first-stage weights are based on the 2012 Census, the sum of the final weights (i.e., the product of the adjusted first-stage and adjusted second-stage weights) may not equal the number of incarcerated inmates during the data collection period. For this adjustment, a ratio adjustment will be applied to the product of the first-stage weights (after the first adjustment) and the second-stage weights (after the third adjustment).
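A compact sketch of the third and fourth adjustments, under assumed column names ('poststratum', 'sex', 'weight') that are ours, not the production system's.

```python
import pandas as pd

def poststratify(df: pd.DataFrame, roster_counts: dict) -> pd.Series:
    """Step three: within each poststratum, scale weights so they sum to the
    facility roster count for that stratum."""
    sums = df.groupby("poststratum")["weight"].transform("sum")
    totals = df["poststratum"].map(roster_counts)
    return df["weight"] * totals / sums

def benchmark_to_nps(df: pd.DataFrame, nps_totals: dict) -> pd.Series:
    """Step four: ratio-adjust final weights so they sum to the latest
    NPS inmate totals by sex."""
    weighted = df.groupby("sex")["weight"].sum()
    return df["weight"] * df["sex"].map(lambda s: nps_totals[s] / weighted[s])
```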
Nonresponse bias analysis. As previously stated, recent iterations of NIS had inmate-level response rates below 80%. BJS is assuming a 70% inmate-level response rate in the national study based on those recent experiences with NIS. To ensure that inmates who do not participate in the study are not fundamentally different from those who do, a nonresponse bias analysis will be conducted if the inmate-level response rate obtained in the 2016 SPI is below 80%. As part of the sampling process, each participating facility will provide an electronic roster containing administrative data for all inmates. The following administrative data on inmate characteristics will be used in the nonresponse bias analysis −
sex,
age,
race/ethnicity,
admission date,
sentence length, and
offense.
For each inmate characteristic, BJS will compare the distribution of the respondents to that of the nonrespondents. A Cohen’s effect size statistic will be calculated for each characteristic. If any characteristic has an effect size that falls into the “medium” or “large” category, as defined by Cohen, then there is a potential for bias in the estimates. Each of these characteristics will be included in a nonresponse model to adjust weights to minimize the potential for bias in the estimates.
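The document does not specify which of Cohen's effect-size statistics will be used; for comparing proportions (e.g., percent male among respondents vs. nonrespondents), Cohen's h is a natural choice, sketched here with Cohen's conventional cutoffs.

```python
from math import asin, sqrt

def cohens_h(p_resp: float, p_nonresp: float) -> float:
    """Cohen's h for two proportions (arcsine-transformed difference)."""
    return abs(2 * asin(sqrt(p_resp)) - 2 * asin(sqrt(p_nonresp)))

def magnitude(h: float) -> str:
    """Cohen's conventional cutoffs: 0.2 small, 0.5 medium, 0.8 large."""
    if h >= 0.8:
        return "large"
    if h >= 0.5:
        return "medium"
    if h >= 0.2:
        return "small"
    return "negligible"
```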
Item nonresponse. Imputation will be used to adjust for item nonresponse in key outcomes of interest in the 2016 SPI, specifically age, sex, race/ethnicity, admission date, offense, and sentence length.9 Based on our experience with NIS, we expect facility rosters will include sex, age, and race for all inmates, but the race categories will likely not include multiracial and Hispanic origin. Therefore, we will use deterministic imputation for sex, age, and race/ethnicity using the information from the facility rosters, and a model-based approach to verify the deterministic imputation of race/ethnicity. If the stochastic result does not match the deterministic result, then the final imputation will be the stochastic model result. In addition, BJS and RTI will work together to explore whether other innovative methods, such as multiple imputation, can be used to impute additional variables. Per OMB standards, if nonresponse exceeds 30% for a particular item, BJS and RTI will work together to explore whether a nonresponse bias analysis is feasible or whether BJS should refrain from publishing statistics derived from those items.
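A minimal sketch of the deterministic roster-based imputation with the per-item flags discussed below; the column names are hypothetical, and the stochastic verification model for race/ethnicity is not shown.

```python
import pandas as pd

def impute_from_roster(survey: pd.DataFrame, roster: pd.DataFrame) -> pd.DataFrame:
    """Fill missing sex/age/race in the survey data from the facility roster,
    recording an imputation flag for each item."""
    merged = survey.merge(roster, on="inmate_id", suffixes=("", "_roster"))
    for item in ("sex", "age", "race"):
        missing = merged[item].isna()
        merged[item + "_imputed"] = missing  # flag: True if value came from roster
        merged.loc[missing, item] = merged.loc[missing, item + "_roster"]
    return merged
```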
To evaluate the effects of imputation on the estimates, we will compare the estimates calculated prior to imputation to those calculated after all imputation is complete. The differences in the estimates will be analyzed, along with the standard errors of the differences. It is crucial for all data users to know which variables were imputed. Therefore, we will include imputation flags on the final dataset that indicate not only whether a value was imputed, but the method of imputation used.
4. Tests of Procedures or Methods
As previously explained, the 2016 SPI data collection methodology is modeled after the approach used in prior iterations of SPI and three waves of NIS, for which over 250,000 inmates were interviewed across over 1,200 correctional facilities. These data collection methods and logistical procedures were also employed during the 2013 SPI Pilot Study.
Cognitive Testing of Questionnaire Content. The 2016 SPI questionnaire was redesigned through the Survey of Prison Inmates: Design and Testing Project, which included a pilot study to evaluate the questionnaire. Prior to fielding the pilot study, two rounds of cognitive interviews were conducted through the design and testing project.10 Paramount to the successful development of the survey is ensuring that the questions themselves are understood by respondents and thus able to accurately capture data on the experiences of inmates during and before incarceration. To evaluate the degree to which questions are understood by respondents and to inform enhancements to the survey, RTI conducted cognitive interviews in Durham, North Carolina with individuals who had served time in prison but had been recently released (within three months of the cognitive interview). The first round of cognitive interviews included nine participants and a selection of questions from the majority of sections in the questionnaire. The second round of cognitive interviews included another nine participants and a selection of questions from the remaining sections of the questionnaire.
Candidate questionnaire items were chosen for inclusion in cognitive testing based on several considerations and included those that play a determining role in significant navigation downstream in the questionnaire and those that contain technical language or uncommon terminology. Many items from the Criminal Justice section of the instrument were chosen in order to test those that collect offense information and play a fundamental role in determining the type of offense(s) an inmate committed, which impacts the way in which other data are collected, including the characteristics of the incident that led to the offense and sentencing information. Additionally, items were selected for inclusion to test changing reference periods and the potential impact this may have on accuracy and ease of recall. Further, some items were tested in order to maintain the flow of questions as proposed for the pilot study interview to avoid any confounding order effects. Findings from the cognitive interviews guided revisions to questionnaire items to reduce ambiguity, facilitate recall, and/or reduce respondent burden in advance of fielding the SPI Pilot Study in 2013.
2013 SPI Pilot Study. The main goal of the 2013 SPI Pilot Study was to evaluate the SPI questionnaire and the functionality of the CAPI instrument. Survey questions administered in the 2016 national study will be largely the same as those administered during the pilot study, except that there will be fewer questions.
The 2013 SPI Pilot Study allowed us to assess the 2016 SPI instrument based on answers to the following questions.
How long did the interviews take (average time, minimum and maximum times, etc.)? Instrument length was reviewed by sub-section and in total to inform decisions about the addition or deletion of content and the length of the data collection period within each facility, and to accurately estimate respondent burden for the 2016 SPI.
Were there any survey items with unusually long administration times? Such items were reviewed to determine whether they needed to be clarified to reduce confusion or were creating excessive burden on respondents and therefore should be omitted from the instrument, especially in light of the length of the survey.
Were there survey items with high rates of nonresponse? Item nonresponse (i.e., item-level assessment of “don’t know” and “refused” responses to survey questions) is an indicator of potential instrument or data quality problems. These types of items were candidates for deletion to reduce the length of the survey.
Did the instrument function as intended? The pilot study was the best opportunity to determine if any programming glitches existed in the complex navigation of the CAPI instrument and whether there were any questions that required additional instructions in order for the interviewer to be able to efficiently record the inmates’ answers.
How did inmates respond to a request to provide their social security number (SSN) to allow BJS to link their survey data to beneficiary records directly from the Social Security Administration (SSA)? The approach and language for this request was modeled after other government surveys, such as the National Health Interview Survey, where similar requests have been made and other data collection efforts, such as the Serious and Violent Offender Reentry Initiative (SVORI).11
The findings from the pilot study informed decisions to refine, change, and edit the questionnaire and consent forms to enhance them for the national study. Findings from the SSN request led to the exploration of other administrative data linkage options that can be achieved without an inmate’s SSN. The changes made since the pilot study include reducing the overall questionnaire length to maximize survey response and minimize burden, refining the questionnaire items to reduce respondent burden and improve data quality, improving the consent forms to streamline the consent process and maximize participation, and eliminating the SSN request in favor of linking survey data to various federal datasets through the Center for Administrative Records Research and Applications (CARRA) at the U.S. Census Bureau’s Center for Economic Studies (CES) (see Part A, Section 1 Necessity of Information Collection for more information). The principal findings from the 2013 SPI Pilot Study, and the changes made to enhance the national study, include the following −
The interview length was the most serious challenge to gaining inmate cooperation. Through interviewer debriefing calls and observations of the inmate interviews by both BJS and RTI staff, it was revealed that many inmates refused to participate in the study after hearing how long, on average, the interview was expected to take (the consent form indicated the survey would take “about 80 minutes”) (see Attachment A19 – SPI Pilot Study Report).12 Altogether the pilot study achieved a cooperation rate of 54.3%.13 This rate, however, takes into account all types of refusals initiated by both the facility and individual inmates. Considering just those inmates who actually met face-to-face with an interviewer, 59% agreed to participate following the consent process. The overall refusal rate for the pilot study was 42%.14 The study results demonstrate the impact of interview length on inmate-initiated breakoffs and noncompliance with the request to participate, as well as facility-initiated breakoffs due to scheduling parameters within the facility. Many inmates were unable to finish the interview due to counts and other scheduling considerations that required them to be elsewhere in the facility. As we did not have a procedure for restarting such breakoffs in the pilot study, we were unable to complete these cases even when the inmate was willing to do so. As noted earlier, we will work with the facilities to accommodate such breakoffs in the national study to maximize response and data quality.
In addition to the new procedure for restarting breakoff interviews, the high rate of breakoffs (12.1% of the total sample) will likely be reduced by shortening the interview. By comparison, the 2011-2012 NIS, which took much less time than the SPI Pilot Study, had a breakoff rate of just 0.76% (0.7% facility-initiated and 0.06% inmate-initiated).15 The goal of keeping the interview short, however, must be balanced against the various analytic goals of the 2016 SPI and meeting the needs of various stakeholders.16 Reducing the interview required difficult decisions regarding how to scale back the content of the questionnaire while still meeting those core goals. In collaboration with experts at BJS, RTI, and several of the individuals named in Part A, Section 8 Adherence to 5 CFR 1320.8(d) and Outside Consultations of this supporting statement, we have made those decisions and estimate that we reduced each of the SPI sections, as shown in Table B4. Overall, we estimate that we decreased the average questionnaire time by about 27 minutes.
Table B4. SPI interview length, by questionnaire section (Completed interviews from the 2013 SPI Pilot Study and timing estimates for the 2016 SPI)
Through debriefing calls, interviewers revealed that the consent process was lengthy and appeared to have an impact on response to the pilot study. In addition, interviewers reported that some of the language used in the consent form appeared to be difficult for some inmates to understand. It was also determined by BJS that some of the text in the consent form for this study was irrelevant and more appropriate for surveys of a sensitive nature, such as NIS, or ones that include juveniles. Based on these findings and observations, BJS and RTI worked together to identify changes and RTI staff worked informally with some members of their IRB to create a shorter and less complex consent form that still meets all requirements for research with human subjects. This revised consent form is currently under review by RTI’s IRB.
For a survey of this length, it was very encouraging to find that overall item nonresponse rates, due to “don’t know” responses and refusals, were low, which is consistent with prior iterations of SPI. One exception was that some inmates had difficulty recalling the specific month or day of an event that occurred long ago (e.g., the day they were admitted to prison or the month of the last job they worked prior to being incarcerated). Due to relatively small sample sizes, we did not cross-tabulate or regress by length of incarceration to further analyze items with high nonresponse due to “don’t know” responses. The rates of nonresponse for the six date questions that appeared to pose recall challenges to inmates ranged from 6-27%. During the redesign of the 2016 SPI questionnaire, BJS and RTI had already reduced the number of date recall questions, where possible, compared to the 2004 SPI questionnaire. The date questions that were asked in the pilot study were retained in the national study, as OMB standards suggest a threshold of 30% for item nonresponse, and if missingness proves high for a particular item in the national study, BJS and RTI can explore various approaches to address the problem, such as assessing the feasibility of a nonresponse bias analysis, using imputation methods that rely on auxiliary data (e.g., administrative records), or restricting particular analyses to a subset of the sample consisting of recently admitted inmates if nonresponse is correlated with the length of incarceration.
In addition, it is possible that some of the challenge in reporting the month or day of a specific event may have been related to the order in which the questions were asked and terminology such as “that led to this incarceration,” which proved to be confusing to inmates during the pilot study because they were not sure if that meant incarceration in this facility or a previous facility (but for the same offense). Since then, we have reorganized some series of questions, such as those asked in Section 2, so the date questions are asked in chronological order, where possible, to make recall easier. We have also removed confusing terminology such as “that led to this incarceration,” and where possible, this terminology was replaced with dates that are used to anchor particular questions/reference periods. This approach was also used in the 2004 SPI survey and proved to be effective.
Results from the debriefing calls with the interviewers also revealed that introductory text to some questions or series of questions was too long or was redundant in places. It was determined that for some series of questions, the text could be scaled back to streamline the interview, improve the flow, and reduce the length of the interview. For example, in a series of questions in Section 11 regarding various types of rule violations inmates may have been written up for during the past 12 months, the reference period is no longer repeated for every single question as it was in the pilot study, but rather every few questions simply to remind inmates of the reference period.
Section 2 of the questionnaire is the most complex section and requires that a controlling offense for each inmate be identified, which is critical to routing inmates to appropriate sections and/or questions downstream in the survey. Part of this effort involves interviewers using a lookup table when inmates report the specific offenses for which they are incarcerated. As interviewers type in the offense, the lookup table is designed to provide a list of the possible offenses to assist the interviewer in selecting the correct offense. During the debriefing calls, interviewers reported that more training on the use of the lookup table would be helpful and could cut down on the time it takes to administer Section 2 of the questionnaire. For the national study, we will devote an entire section of the training to using the lookup table, and interviewers will practice with it more during the mock interviews. It is expected that as field interviewers become more familiar with using the table during the actual interviews, the time it takes to administer Section 2 will be reduced, thus reducing the overall administration time.
Interviewers who worked on the pilot documented several situations where the CAPI instrument did not appear to route properly or displayed incorrect fill text onscreen. RTI staff researched each of these reports and corrected the problems as quickly as possible. Once a change was implemented, a revised program was transmitted to the interviewers, who were then able to use the updated instrument in subsequent interviews. The CAPI feasibility test will provide an opportunity to check the revised CAPI instrument for similar issues prior to fielding it on a national scale.
Interviewers reported that the onscreen instructions accompanying some questions, intended to help them code responses or clarify issues for inmates when asked, proved very helpful. As BJS and RTI scaled back the questionnaire based on the pilot test results, we identified places where additional onscreen instructions may be useful to interviewers and added them.
The results from the SSN request showed that of the 176 inmates who reached the end of the questionnaire, 61.9% (109) agreed to provide their SSN, 10.8% (19) declined the request outright, and the remaining 27.3% (48) reported not having an SSN. Of the 109 inmates who agreed to provide an SSN, 95.4% (104) actually provided one. Among all 234 inmates who agreed to participate in the pilot, including those who did not reach the end of the interview and were therefore not asked to provide their SSN, the overall SSN provision rate was 44.4% (104 of 234).
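These percentages follow directly from the reported counts; the short check below restates the arithmetic using only the figures given above.

    # Arithmetic check of the SSN-request results reported above.
    reached_end, participants = 176, 234
    agreed, declined, no_ssn, provided = 109, 19, 48, 104

    assert agreed + declined + no_ssn == reached_end
    print(f"agreed to provide SSN:      {agreed / reached_end:.1%}")    # 61.9%
    print(f"declined:                   {declined / reached_end:.1%}")  # 10.8%
    print(f"reported no SSN:            {no_ssn / reached_end:.1%}")    # 27.3%
    print(f"provided, given agreement:  {provided / agreed:.1%}")       # 95.4%
    print(f"provided, all participants: {provided / participants:.1%}") # 44.4%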
In light of these findings, BJS has decided to forgo this strategy and instead link the survey data to SSA records through the U.S. Census Bureau’s CARRA program rather than work directly with SSA. There are several benefits of working with CARRA relative to establishing a relationship with SSA or another government entity. First, BJS has an existing interagency agreement (IAA) with CARRA and is working with the group on similar efforts through BJS’s NCRP. Working through this existing IAA will enable a more efficient and timely working relationship between BJS’s 2016 SPI team and CARRA than what might be possible through SSA’s process or through a new IAA with some other government entity. Second, there is evidence that working with CARRA will achieve better results because CARRA does not require an inmate’s SSN to establish a match. Through the ongoing work between NCRP’s team and CARRA, a test was conducted to link NCRP records from five states that already supply SSN as part of their NCRP submissions. Using state, first and last name, race, sex, and date of birth as match hooks, the CARRA group was able to match 82.2% of prison inmates even when SSN was not one of the matching criteria. Third, CARRA allows access to a number of federal datasets that include a variety of information, such as data on receipt of Supplemental Security Income, Temporary Assistance for Needy Families (TANF), public housing and rental assistance history, Department of Housing and Urban Development-insured mortgage loans, SSA’s Death Master File, enrollment in Medicare, and any listing in the Census Bureau’s decennial census or American Community Survey (ACS). With additional approval, SPI data could be PIK’d (assigned a Protected Identification Key) and linked to unemployment insurance (UI) wage data held by the Census Bureau’s Center for Economic Studies (CES) through the Longitudinal Employer-Household Dynamics (LEHD) program, as well as to tax returns from the Internal Revenue Service (IRS).
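For illustration only, the sketch below shows deterministic linkage on the match hooks named above: records are matched when their normalized keys (state, name, race, sex, and date of birth) agree exactly. The field names are assumptions, and CARRA’s actual matching methodology is more sophisticated than this exact-key comparison.

    # Deterministic-linkage sketch on the match hooks named above.
    # Field names are assumed; this is not CARRA's actual methodology.
    def match_key(rec):
        """Normalize a record's match hooks (a dict) into a comparable key."""
        return (rec["state"].upper(), rec["first"].strip().lower(),
                rec["last"].strip().lower(), rec["race"].upper(),
                rec["sex"].upper(), rec["dob"])  # dob as "YYYY-MM-DD"

    def match_rate(survey_recs, admin_recs):
        """Share of survey records with an exact key match in admin records."""
        admin_keys = {match_key(r) for r in admin_recs}
        hits = sum(match_key(r) in admin_keys for r in survey_recs)
        return hits / len(survey_recs)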
In summary, the evaluation of the questionnaire, consent procedures, and SSN request implemented in the 2013 SPI Pilot Study informed a variety of decisions made in advance of fielding the national study. These include reducing the overall questionnaire length to maximize survey response, improving questionnaire items to reduce respondent burden and improve data quality, eliminating the SSN request in favor of working with CARRA to achieve better results through records linkage, and enhancing the consent protocols to streamline the process and accommodate the additional records linkage.
CAPI Instrument
RTI will conduct extensive testing of the automated survey instrument and its interaction with the various control systems that will be used during the 2016 SPI data collection. Testing will include full testing of all parts of the instrument, including systematic checking of instrument output, and one or more tests of all control systems that will be used during the data collection. In addition, BJS will have access to a laptop with the CAPI instrument, and various staff will be asked to test the programming and specifications using test cases, specifically by responding to the questions that appear onscreen to verify that the instrument routing is accurate. They will document any problems they encounter, and that information will be transferred to RTI, which will take the appropriate corrective actions.
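A minimal sketch of such a scripted routing check appears below; the question IDs and skip rules are invented for illustration and stand in for the instrument’s actual routing logic.

    # Routing test sketch: scripted answers should land on the expected
    # next question. Question IDs and skip rules are hypothetical.
    ROUTES = {
        ("S2_Q1", "yes"): "S2_Q2",
        ("S2_Q1", "no"):  "S3_Q1",  # "no" skips the rest of Section 2
    }

    def next_question(current, answer):
        return ROUTES[(current, answer)]

    # Exercise both branches of the skip rule.
    assert next_question("S2_Q1", "yes") == "S2_Q2"
    assert next_question("S2_Q1", "no") == "S3_Q1"
    print("routing test cases passed")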
The feasibility testing of the CAPI instrument will be conducted among 60 inmates in two facilities to ensure the instrument is functioning properly after the significant reductions to questionnaire length since the pilot test. As noted above, all systems will receive thorough testing prior to the feasibility test; the feasibility test will serve as a final check that all systems are functioning as specified and that no further programming revisions are needed. Any problems identified as a result of this additional CAPI testing will be fixed and tested internally by RTI prior to fielding the national study.17
5. Consultation Information
The Corrections Statistics Program at BJS is responsible for the overall design and management of the activities described in this submission, including the fielding of the survey, data cleaning, and data analysis. BJS contacts include:
Lauren Glaze (primary contact)
Statistician and SPI Project Manager
Corrections Statistics Program
(202) 305-9628
Jennifer Bronson, PhD
Statistician and SPI Co-Project Manager
Corrections Statistics Program
(202) 616-8937
Anastasios (Tom) Tsoutis
Chief
Corrections Statistics Program
(202) 305-9079
During the development and design of the 2016 SPI, RTI staff provided input and services to BJS, specifically in the areas of questionnaire design, statistical methodology, data collection, and analysis. RTI will continue to provide support and services throughout the course of SPI and will also manage and coordinate the collection of all data. Contacts at RTI include:
Tim Smith
Manager, Security and Resilience Program
SPI Principal Investigator
RTI International
(919) 316-3988
Marcus Berzofsky
Statistician and SPI Co-Principal Investigator
RTI International
(919) 316-3752
Chris Stringer
Survey Methodologist
RTI International
(919) 541-7218
1 Based on population counts collected through the 2013 National Prisoner Statistics Program, which are the most recent population counts currently available. See Carson, E.A. (2014). Prisoners in 2013. Washington, DC: Bureau of Justice Statistics.
2 Federal and state prison systems are centralized and cover the entire jurisdiction. If BOP or a state department of corrections (DOC) refuses to participate, then the refusal will apply to the entire jurisdiction and cover all of the facilities sampled in that particular jurisdiction.
3 Carson, E.A. (2014). Prisoners in 2013. Washington, DC: Bureau of Justice Statistics.
4 Prior iterations of SPI have yielded second-stage response rates of 90% or higher. However, since the last SPI, NIS was fielded in 2007, 2008-2009, and 2011-2012, and second-stage response rates for NIS averaged about 70%. NIS covered a sensitive topic, which may be why its response rates were lower than those of prior iterations of SPI; nevertheless, to account for any potential adverse impacts from NIS and for changes in the correctional environment related to survey requests from the Department of Justice, we have assumed a more conservative response rate than was achieved in prior iterations of SPI.
5 The simulation study was conducted by RTI under a Cooperative Agreement (award 2009-BJ-CX-K054) with BJS. The project was the Survey of Prison Inmates: Design and Testing Project, which has been completed.
6 The training manual for the national study is not yet available, but it will be similar to the manual for the pilot study.
7 Folsom, R.E., & Singh, A.C. (2000). The Generalized Model for Sampling Weight Calibration for Extreme Values, Nonresponse, and Poststratification. In Proceedings of the American Statistical Association’s Survey Research Methods Section, 598-603.
8 Cohen, J. (1988). Statistical Power Analysis for the Behavioral Sciences (2nd ed.). Hillsdale, NJ: Erlbaum.
9 In the 2004 SPI, the key outcomes of interest that were imputed were sex, age, and race/ethnicity.
10 The results from the cognitive interviews were also briefly discussed, and the final report was included, in the generic clearance package BJS submitted to OMB for approval of the 2013 SPI Pilot Study (OMB No. 1121-0339; expiration date 1/31/2016).
11 This was a project funded primarily by the National Institute of Justice (NIJ), in conjunction with other federal partners (primarily the U.S. Departments of Education, Health and Human Services, Housing and Urban Development, and Labor) and conducted by RTI International. The goal of the initiative, which started in 2003, was to improve reentry outcomes of prisoners along five dimensions: criminal justice, employment, education, health, and housing. The purposes of the evaluation, initiated in 2004, were to determine the extent to which participation in SVORI programs improved access to reentry services and programs and resulted in improved outcomes in the areas of housing, education, employment, and criminal behavior. This was a longitudinal study that began in prison, moved to a structured reentry phase before and during the early months after prisoners’ release, and continued for several years as released prisoners took on increasingly productive roles in the community.
12 The field staff participated in a telephone debriefing after data collection, and this was one of the main findings resulting from those debriefing calls.
13 Using COOP4 from The American Association for Public Opinion Research. (2011). Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys (7th ed.). AAPOR. This includes 176 cases where the inmate made it through the last question in the survey and 58 cases where the facility or inmate broke off the interview during various portions of the questionnaire.
14 Using REF3 from The American Association for Public Opinion Research. (2011). Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys (7th ed.). AAPOR.
15 For additional comparison, the 2012 National Survey on Drug Use and Health (NSDUH) is about 62 minutes long and has a breakoff rate of 0.05%. The NSDUH is a household survey of the U.S. population age 12 and older. It includes some types of institutions (colleges, homeless shelters, etc.) but does not include prisons. The interview is conducted using a combination of CAPI and ACASI.
16 The NIS questionnaire focused primarily on one particular topic (i.e., sexual victimization) while SPI covers a variety of topics, as it is the only source of national data for a number of them.
17 If necessary, BJS will submit a report to OMB summarizing the results of the CAPI feasibility test prior to fielding the national study.