Final NSYC-3 Supporting Statements Part B OMB 080717_v2


National Survey of Youth in Custody (NSYC)

OMB: 1121-0319







Supporting Statement (Part B)



National Survey of Youth in Custody-3 (NSYC-3):

Data Collection





Supporting Statement

Part B. Collection of Information Employing Statistical Methods

  1. Universe and Respondent Selection

The objective of this survey is to produce estimates of rates of sexual assault in juvenile correctional facilities that house adjudicated youth. To meet the goals of PREA, the sample is designed to produce national estimates for a sample of facilities owned or operated by states, or where youth are being held under state contract. Whenever possible, the data will also be used to produce state-level and facility-level estimates. The smallest facilities (i.e., facilities with fewer than 10 youth) will be excluded from the sampling frame because of the expense associated with enrolling and visiting facilities that would yield a small number of interviews. Under PREA, participation by sampled facilities is mandated, and BJS is required to list publicly any facilities declining to participate.

All youth participation is voluntary. Non-adjudicated youth will not be included in the survey because it is impractical to gain parent or guardian consent (PGC) given their customarily short lengths of stay. If facilities require PGC, a lead-time of 6 to 9 weeks is required. Non-adjudicated youth are typically not in facilities for this amount of time.

The primary goal of the sample design is to produce estimates at the facility, state, and national levels.

  a. Sample Frame


The sample frame will be created using the 2015 Census of Juveniles in Residential Placement (CJRP), which is maintained by the U.S. Census Bureau for the Office of Juvenile Justice and Delinquency Prevention. The sampling frame, and the numbers presented below, are based on the 2015 CJRP and supplemented by the list of facilities updated for the administration of the 2016 Juvenile Residential Facility Census (JRFC).

Table 2 depicts the universe of juvenile facilities as reported on the CJRP, supplemented with information from the JRFC, and the primary criteria for the NSYC-3 sample frame. Of the 2,511 facilities on the CJRP, 11 were not located in the 50 states or the District of Columbia and another 653 facilities had no youth records on the CJRP. This leaves 1,847 facilities containing 48,465 youth. Among these 1,847 facilities, an additional 70 facilities, housing 1,794 youth, were identified as closed.

One criterion used to define an eligible facility on NSYC-1 and NSYC-2, which will also be used on NSYC-3, is that more than 25% of the youth it housed were adjudicated and that it held at least 10 adjudicated youth. Facilities housing a low proportion of adjudicated youth or fewer than 10 adjudicated youth will therefore be excluded. As noted above, these restrictions are imposed for efficiency because it is quite expensive to interview in facilities that have very few adjudicated youth.

Table 2 also shows the non-contract facilities that are out of scope. They are local/privately operated facilities that house youth for short periods (less than 90 days) and are not accessible for interviews due to the lack of state jurisdiction.

Finally, the frame is further restricted to facilities owned or operated by the state and those under state contract that house one or more state-placed adjudicated youth.

Table 2. Composition of the 2015 CJRP

                                                              Facilities   Number of youth
CJRP Universe as of October 2015                                   2,511            49,143
  Facilities not in the 50 states or DC                               11               351
  Facilities with no youth records                                   653               653
  Facilities identified as having closed since October 2015           70             1,794
Facilities housing youth1                                          1,777            46,671
  Non-adjudicated youth                                                              9,992
  Adjudicated youth                                                                 36,679

                                                              Facilities   No. of adj. youth
Percent adjudicated
  25% or less                                                        194               372
  More than 25%                                                    1,583            36,307
Number adjudicated (more than 25%)
  1-9                                                                713             3,466
  10+                                                                870            32,841
Type of facility (10+ and more than 25%)
  Non-contract                                                       417            13,973
  Contract                                                           139             4,072
  State                                                              314            14,796
State and contract facilities
  Adjudicated youth (10+ and more than 25%)                          453            18,868
    Placed by state or in a state facility                                          17,296
    Not placed by state and not in a state facility                                  1,572

1 Adds 326 youth held in these facilities age 21 and over.
The final sampling frame will include 453 facilities and 18,868 youth. It will include all state owned or operated facilities in the U.S. with greater than 25% adjudicated youth and at least 10 adjudicated youth. It will also include all adjudicated youth in contract facilities with at least one youth placed by the state that also meet these criteria.

Facilities included are

  • state facilities with more than 25% adjudicated youth and at least 10 adjudicated youth (314 facilities, 14,796 adjudicated youth)

  • contract facilities with more than 25% adjudicated youth and at least 10 adjudicated youth (139 facilities, 4,072 adjudicated youth).


Facilities excluded are

  • state facilities with 25% or fewer adjudicated youth or fewer than 10 adjudicated youth (134 facilities, 667 adjudicated youth)

  • contract facilities with 25% or fewer adjudicated youth or fewer than 10 adjudicated youth (112 facilities, 529 adjudicated youth)

  • all non-contract facilities (1,078 facilities, 16,615 adjudicated youth), which house youth for short periods (less than 90 days) and/or whose youth are not held under state jurisdiction.


The NSYC-3 frame of 18,868 adjudicated youth in state facilities and in contract facilities accounts for 94% of all adjudicated youth in these facilities or placed by states in contract facilities [18,868 in the NSYC-3 frame / (18,868 in the NSYC-3 frame + 667 in excluded state facilities + 529 in excluded contract facilities)].

The NSYC-3 frame covers 40% of all youth held in residential facilities in the 50 U.S. states and the District of Columbia. The frame covers 51% of all adjudicated youth (see Table 2 above).

Table 3 shows the projected composition of the sampling frame by facility size:

Table 3. Composition of Sampling Frame by Facility Size

                              Facilities                  Adjudicated Youth
                      Total     State   Contract      Total      State   Contract
Total                   453       314        139     18,868     14,796      4,072
Facility Size
  10-19                 162       104         58      2,279      1,457        822
  20+                   291       210         81     16,589     13,339      3,250


  b. Facility Sampling

All facilities in the frame will be included with certainty. The frame has been shrinking since NSYC-1 was administered in 2008-09. It has been reduced to the point that, to achieve adequate statistical precision for NSYC-3, it is necessary to include all eligible facilities.

In addition to the facilities in the sampling frame described above, there will be new facilities. By examining facilities listed in the 2016 JRFC that were not on the 2015 CJRP, we have identified a total of 177 facilities that appear to be new. Most of these 177 facilities (all but 11) are private facilities, and we estimate the eligibility rate for these to be approximately 11%. Overall, we estimate that about 19 of the 177 facilities will be eligible and will participate in NSYC-3 based on our experience with NSYC-2. We estimate that these facilities will house approximately 592 eligible youth.

In addition to these facilities, there may be new facilities that appear on neither the 2015 CJRP nor the 2016 JRFC. Prior to fielding NSYC-3, a packet will be sent to the state administrator for each state that includes a list of the sampled facilities in their state. The Westat state recruiter will follow up with an email to the state administrator to request designation of a state liaison. The state liaison will be asked to provide information needed to ascertain the status of facilities in the state, including information on state jurisdiction over the sampled facilities, the existence of newly opened and recently closed facilities, current contracts with non-state facilities to house adjudicated youth, and the average length of stay of youth in each sampled facility.

The newly opened facilities identified by the state liaison will be reviewed to determine their eligibility, and one-half of such facilities will be included in the sample by applying a random sampling algorithm to any eligible new facilities identified by each state. During NSYC-2 state recruitment, 68 new facilities were identified using this process; of these, approximately 26 were deemed eligible based on the available information. Subsequently, 10 facilities were added to the frame reflecting the NSYC-2 sample design. For NSYC-3, we estimate that 26 eligible contract facilities will be identified during state recruitment and that 13 contract facilities will be added as a supplemental sample along with the 19 facilities described above.
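The half-sampling of eligible newly identified facilities within each state can be sketched as follows; the use of Python's `random` module and the function name are illustrative assumptions, not the study's actual algorithm:

```python
import random

def sample_new_facilities(eligible_new, rate=0.5, seed=None):
    """Randomly select a fixed fraction (here one-half, per the NSYC-3
    design) of the eligible newly identified facilities in one state."""
    rng = random.Random(seed)
    n_select = round(len(eligible_new) * rate)
    return rng.sample(list(eligible_new), n_select)

# Illustrative: a state liaison reports four eligible new facilities.
selected = sample_new_facilities(["F-101", "F-102", "F-103", "F-104"], seed=1)
```

Applied to the 26 eligible contract facilities expected during state recruitment, this rule yields the 13 supplemental facilities cited above.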

Based on data from NSYC-2, we anticipate that approximately 27% of facilities in the NSYC-3 sample will have no youth sampled. Most of this will be the result of facility ineligibility (i.e., the sampled facility has closed and no longer houses adjudicated youth). After accounting for nonresponse and ineligibility, the expected composition of the cooperating facilities for NSYC-3 is shown in Table 4:



Table 4. Composition of Sampled and Cooperating Facilities by Facility Size

                                  Facilities                  Adjudicated Youth
                          Total     State   Contract      Total      State   Contract
Sampled Facilities          453       314        139     18,868     14,796      4,072
Supplemental Sample
  2016 JRFC                  19         1         18        592         57        535
  State Recruitment          13         0         13        381          0        381
Total                       485       315        170     19,841     14,853      4,988
Facility Size
  10-19                     173       104         71      2,396      1,463      1,007
  20+                       312       211         99     17,444     13,390      3,981
Cooperating Facilities      354       230        124     14,484     10,843      3,641
Facility Size
  10-19                     127        76         52      1,749      1,068        735
  20+                       227       154         72     12,734      9,775      2,906

  c. Youth Sampling

Youth response rates will vary across facilities according to whether the state or facility grants in loco parentis (ILP) consent for all, some, or none of the youth who are younger than the age of self-consent. Response rates will be much higher in facilities with ILP consent, or where the youth population is mostly above the age of self-consent, because it will not be necessary to obtain parent or guardian consent (PGC).

For most facilities, all adjudicated youth in each selected facility will be selected with certainty. Based on the workload associated with a 5-day interview period, in facilities where more than 160 youth are consent-eligible we will randomly subsample the youth to reach a final sample of 160.1 We will also subsample youth when required by facility or team resource constraints.

As in the prior NSYC administrations, a 10% subsample of youth at each facility will receive an alternative questionnaire (NSYC-A) that focuses on topics other than sexual assault. This is done to protect the confidentiality of the respondents. Thus, no one, other than the respondent answering the questions, will know whether or not a given youth responded to the sexual assault questionnaire.
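The youth-sampling rules described above (take all youth with certainty, randomly subsample to 160 when the consent-eligible roster is larger, and randomly assign 10% of sampled youth to the alternative NSYC-A instrument) can be sketched as follows; the function and label names are illustrative:

```python
import random

def sample_youth(roster, cap=160, alt_rate=0.10, seed=None):
    """Select youth for interview and assign a questionnaire type.

    All youth are taken with certainty unless the consent-eligible roster
    exceeds the cap, in which case a random subsample of size `cap` is
    drawn.  A 10% random subset receives the alternative (NSYC-A)
    instrument; the rest receive the core sexual assault instrument.
    """
    rng = random.Random(seed)
    names = list(roster)
    selected = names if len(names) <= cap else rng.sample(names, cap)
    alt = set(rng.sample(selected, round(len(selected) * alt_rate)))
    return {y: ("NSYC-A" if y in alt else "NSYC-core") for y in selected}
```

Because the NSYC-A assignment is random and private, no one in the facility can tell from the sample roster which youth answered the sexual assault questionnaire.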

Even though individual survey responses are protected from disclosure, we will not publish facility-level estimates where fewer than 10 youth completed a sexual assault survey to further ensure confidentiality. However, the data from these facilities will be included in the state-level and national estimates.

Facility-level estimates will be provided if the precision of the estimates meets a minimum standard for reliability. For NSYC-2, this standard required the estimates to meet all of the following criteria: (1) they were based on at least 15 youth who completed the sexual victimization survey; (2) they represented facilities with a 30% response rate or greater; and (3) they had a coefficient of variation of no more than 30% and were sufficiently precise to detect a high victimization rate. Similar criteria will be used for publishing estimates for NSYC-3.
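The publication screen can be expressed as a simple predicate; this sketch reads the coefficient-of-variation criterion as a maximum of 30%, omits the separate precision-to-detect criterion for brevity, and uses function and argument names of our choosing:

```python
def publishable(n_completed, response_rate, cv):
    """Facility-level publication screen modeled on the NSYC-2 criteria:
    at least 15 completed sexual victimization surveys, a facility
    response rate of 30% or more, and a coefficient of variation of no
    more than 30% (rates expressed as proportions)."""
    return n_completed >= 15 and response_rate >= 0.30 and cv <= 0.30
```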

Table 5 displays the estimated number of completed interviews based on the number of eligible facilities and youth reported in the 2015 CJRP and those forecasted for the supplemental (non-sex survey) sample.

Table 5. Estimated Sample Sizes by Number of Completed Sexual Assault Surveys

                          Facilities   Sampled Youth   Interviews   Total Sexual Assault Surveys
Youth younger than 15            197           3,409        2,045                          1,842
Youth aged 15 and over           157          11,075        6,645                          5,979
Total                            354          14,484        8,690                          7,821


  2. Procedures for Information Collection

  a. Agency Recruitment

The administrator of each state agency will receive a letter from BJS and a study information packet. The letter will request that the administrator identify a state staff member to serve as liaison for the study. Westat will contact the administrator soon after the packets are sent, answer questions the administrator might have, and obtain the contact information of the liaison. The same procedures will be used to contact executives of the non-state agencies with jurisdiction over the sampled non-state facilities.

Study information packets will be sent to all liaisons along with a letter explaining that Westat staff will contact them to discuss procedures and requirements (Attachments 5-I, 5-II, 5-III, and 5-IV). Through email and brief telephone contacts with the liaisons, state enrollment specialists will gather information on state mandatory reporting requirements, informed consent procedures, research clearance process (e.g., state institutional review boards), and other logistical requirements. Liaisons will be asked to provide written documentation of the state or non-state agency support for the study.

  b. Facility Recruitment

All 453 facilities in the frame (state and non-state) will be selected with certainty, along with a supplemental sample of approximately 32 new facilities, for a total of 485 juvenile facilities. Each sampled facility will be contacted to notify them of their selection and to request participation. A coordinator will be designated at each facility and study materials will be mailed to this person (Attachments 5-V, 5-VI, and 5-VII). Facility enrollment specialists will conduct a series of brief telephone calls to collect initial information about the facility and obtain logistical information for planning the survey visit (Attachments 5-VIII and 5-IX).

  c. Sampling of Youth


Six to eight weeks prior to data collection at a facility, the facility will provide a roster of all adjudicated youth who are currently residing there (Attachments 10-I and 10-II). All youth to be interviewed will be drawn from the roster. The facility coordinator will provide periodic updates of admitted or discharged youth prior to the visit to the facility. As new youth are admitted, they will be sampled for the survey.

  d. Parent/Guardian Consent

For those facilities requiring PGC, procedures to contact the households will be negotiated with the facilities (based on requirements specified by the state or non-state agency). Either the facility or the study contractor will send written materials containing an explanation of the study and the nature of youth involvement to the parent (Attachments 3-I, 3-II, 3-III, and 3-IV). Depending on the procedures that are negotiated, mail or telephone prompting of parents/guardians who do not respond to the initial mailing will be done (Attachment 3-V).

  e. Data Collection

A team of interviewers will visit the facility. They will ask facility staff to individually bring each consent-eligible youth who was drawn from the sample roster to a private interviewing area. The interviewer will read an assent script (Attachments 4-I and 4-II) to the youth to ascertain his or her willingness to participate. If the youth agrees to participate, the interviewer will initiate a brief ACASI tutorial to familiarize the youth with the headphones and touch-screen display. Once the youth finishes the tutorial, the ACASI system will shift to the questionnaire assigned to the youth (i.e., either NSYC-core or NSYC-A), and the youth will be able to complete the survey privately. At the end of the questionnaire, the youth will return the computer to the interviewer. The facility staff person will escort the youth from the interview area, and the interviewer will then finish the process by answering a set of debriefing questions about the interview.

The facilities will also be asked to complete the online Facility Questionnaire (Attachment 5-X). This will allow researchers to examine correlations between victimization incidents and the specific facility characteristics.

  f. Weighting and Nonresponse Adjustment

The survey data will be weighted to provide facility, state, and national estimates. To generate facility estimates, an initial weight will be assigned to each youth. In most facilities, the initial weight will correspond to the inverse of the probability of being selected for the sexual assault questionnaire. In facilities where subsampling of consent-eligible youth occurred, the initial weight will correspond to the inverse of the probability of being selected for the sexual assault questionnaire multiplied by the facility’s subsampling rate.

A series of adjustments will be applied to the initial weight to compensate for nonresponse. These adjustments will be completed in the following steps:

  1. Adjustment cells will be constructed based on facility and youth characteristics (e.g., facility population size, offense, race, Hispanic origin, age, gender, and the number of days in facility).

  2. In creating adjustment cells, we will require a minimum nonresponse cell size of 10 youth. In many facilities, this will result in a single adjustment cell for the weight adjustment.

  3. After the initial nonresponse adjustment, the distribution of weights will be examined. Based on the experience of NSYC-2, if the highest weight is more than 4 times the lowest weight, the weights will be trimmed and the difference between the weighted counts before and after trimming will be redistributed so that the final ratio of high to low weights equals 4. The trimmed weight is the final facility-level youth weight and will be used for facility-level estimates only.

  4. To generate state and national estimates, the facility weights will be adjusted to reflect each facility’s probability of selection into the sample and then adjusted for facility nonresponse. The next steps in creating state and national nonresponse adjustments will be the same as those described above for facility-level weights.

The state-level weight will adjust the facility weight for the probability of selection of the facility and for facility nonresponse. The probability adjustment will be the inverse of the probability of selection for the facilities. Historically, virtually all facilities have participated, and an elaborate nonresponse adjustment for the state facilities was not needed. However, facility-level nonresponse may increase in NSYC-3 because there will be a significant increase in the number of contract facilities in the sample. We will consider applying a state-level nonresponse adjustment that utilizes information from the frame, including the size of the facility, the type of facility, the gender mix, the ownership of the facility, and any other variables that become available. These state-level weights will be used to generate the state-level estimates across the facilities within each state.
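The nonresponse adjustment and trimming described in this section can be sketched in miniature as follows. This is an illustrative simplification assuming a single adjustment cell, and the redistribution of trimmed weight is implemented here as a proportional rescaling; the function names are ours, not the study's:

```python
def nonresponse_adjust(base_weights, responded):
    """Within one adjustment cell, inflate respondent base weights so the
    respondents also represent the cell's nonrespondents (ratio
    adjustment); returns weights for respondents only."""
    total = sum(base_weights)
    resp_total = sum(w for w, r in zip(base_weights, responded) if r)
    factor = total / resp_total
    return [w * factor for w, r in zip(base_weights, responded) if r]

def trim_weights(weights, max_ratio=4.0):
    """Cap each weight at max_ratio times the smallest weight, then
    rescale so the overall weighted count is preserved -- one simple way
    to enforce the 4:1 high-to-low ratio described in step 3."""
    floor = min(weights)
    capped = [min(w, max_ratio * floor) for w in weights]
    factor = sum(weights) / sum(capped)
    return [w * factor for w in capped]
```

Because the rescaling multiplies every weight by the same factor, the 4:1 cap on the high-to-low ratio survives the rescaling while the weighted count matches the pre-trimming total.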

  g. Standard Errors and Confidence Intervals for Facility Estimates

Survey estimates are subject to sampling error arising from the fact that the estimates are based on a sample rather than a complete enumeration. For facility estimates, the sampling error varies by the value of the estimate, the number of completed interviews, and the size of the facility.

To express the possible variation due to sampling associated with facility-level estimates of sexual assault, we will provide lower and upper bounds of the related 95 percent confidence intervals. Because many facility samples will be small and the estimates may be close to zero, confidence intervals will be constructed using a method developed by Wilson.2

The Wilson method assumes that the distribution of

\[ \frac{\hat{p} - P}{\sqrt{P(1 - P)/n}} \]

is approximately t, where P is the underlying population proportion (e.g., assault rate), \( \hat{p} \) is the survey estimate, and n is the number of completed interviews. The inequality is then rearranged to obtain a confidence interval for P, where df = the degrees of freedom associated with t. The confidence limits based on the Wilson score method are then calculated as

\[ P_{L,U} = \frac{2n\hat{p} + t^2 \mp t\sqrt{t^2 + 4n\hat{p}(1-\hat{p})}}{2(n + t^2)} \]

where \( t = t_{df,\,0.975} \) is the critical value and \( n \) is the number of completed interviews.


Computationally, this method produces an asymmetrical confidence interval around the facility estimates, in which the lower bound is constrained to be greater than or equal to zero and the upper bound is less than or equal to 100 percent. It will also provide confidence intervals for facilities in which the survey estimates are zero (i.e., no assaults were reported). An estimate of 0 does not necessarily mean that the true assault rate is zero, but rather that the sample was too small to yield an occurrence of a very rare event through random sampling.
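A minimal sketch of the Wilson computation, using the normal (z) critical value for a 95% interval; the study's version substitutes the appropriate t percentile for small facility samples, so small-facility bounds will differ slightly:

```python
import math

def wilson_interval(p_hat, n, crit=1.959964):
    """95% Wilson score interval for a proportion.

    The interval is asymmetric around p_hat: the lower bound never falls
    below 0, the upper bound never exceeds 1, and the interval is
    nondegenerate even when p_hat = 0 (no assaults reported).
    """
    c2 = crit * crit
    center = (n * p_hat + c2 / 2.0) / (n + c2)
    half = crit * math.sqrt(n * p_hat * (1.0 - p_hat) + c2 / 4.0) / (n + c2)
    return center - half, center + half
```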

To provide an indication of the facility level precision to be achieved under the proposed sample design, Table 6 shows the approximate 95 percent confidence intervals of estimates when using the Wilson method for various sample sizes (i.e., numbers of youth completing interviews) and assault rates. The calculations assume a design effect of 1 and that half of the sampled youth will have a completed sexual assault survey.






Table 6. Approximate 95% Confidence Intervals for Facility Estimates Using the Wilson Method

Completed Interviews   Assault Rate   Lower Bound   Upper Bound
        10                      .00          0.00          0.16
                                .10          0.03          0.30
                                .20          0.08          0.42
        30                      .00          0.00          0.06
                                .10          0.05          0.20
                                .20          0.12          0.32
        50                      .00          0.00          0.04
                                .10          0.06          0.17
                                .20          0.13          0.29
       100                      .00          0.00          0.02
                                .10          0.07          0.15
                                .20          0.15          0.26


  h. Standard Errors and Confidence Intervals for State and National Estimates

For state and national estimates, confidence intervals will be computed as \( \hat{p} \pm t \cdot se(\hat{p}) \), where \( \hat{p} \) is the estimated assault rate, \( se(\hat{p}) \) is the standard error for proportions, \( se(\hat{p}) = \sqrt{\mathit{deff}\,\hat{p}(1-\hat{p})/n} \), and \( t \) is the appropriate percentile of a t distribution.

Table 7 shows the approximate 95-percent confidence intervals for estimates based on a range of sample sizes applicable to state/regional and national estimates. The calculations assume that half of the sampled youth will have a completed sexual assault survey and an average design effect of 1.5. This value reflects differences in weights due to facility sampling, unequal weighting, and clustering.
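This calculation can be sketched as follows; it is an illustrative implementation of the stated formula (a symmetric interval with a design-effect-inflated standard error), using the normal critical value in place of the t percentile, and is not guaranteed to reproduce the tabled bounds exactly:

```python
import math

def national_ci(p_hat, n_completed, deff=1.5, crit=1.959964):
    """Approximate 95% confidence interval for a state or national assault
    rate: p_hat +/- crit * se, with the standard error inflated by the
    assumed average design effect (1.5 per the text)."""
    se = math.sqrt(deff * p_hat * (1.0 - p_hat) / n_completed)
    return p_hat - crit * se, p_hat + crit * se
```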


Table 7. Approximate 95% Confidence Intervals for State and National Estimates

Completed Interviews   Assault Rate   Lower Bound   Upper Bound
       200                      .05         0.044         0.056
                                .10         0.089         0.111
                                .15         0.133         0.167
       500                      .05         0.046         0.054
                                .10         0.093         0.107
                                .15         0.140         0.160
     1,000                      .05         0.047         0.053
                                .10         0.095         0.105
                                .15         0.143         0.157
    11,812                      .05         0.049         0.051
                                .10         0.099         0.101
                                .15         0.148         0.152



  i. Reducing and Assessing Measurement Error

The survey design and procedures consider three major sources of measurement error:

  1. Comprehension. The sampled youth in this survey will be between 12 and 25 years old. Some youth may not have high reading skill levels and may have trouble concentrating for extended periods of time.

  2. Sensitive questions. It is difficult to disclose details about sexual assault incidents, especially during an interview. It may create respondent distress. In addition, youth may not be confident of the promise of confidentiality of the survey data and fear that someone in the facility will find out about the interview. This could lead to a fear of reprisal, either by other youth or by facility staff.

  3. Overreporting. Some youth may report a sexual assault that did not occur. This concern was voiced by some facility administrators. Since there are no actions taken with the youth as a result of reporting something on the survey, administrators feared youth would manufacture incidents knowing the facility staff may be held accountable.

With respect to comprehension, the core questions on sexual assault remain unchanged from previous NSYC-1 and NSYC-2 surveys. They have been extensively reviewed by experts and have undergone cognitive testing on previous rounds of the survey. NSYC-3 incorporates additional design features to minimize comprehension errors:

(1) The interview is self-administered through ACASI. Eliminating the personal interviewer and allowing the youth to interact privately with the computer increases youth's willingness to disclose sensitive information, while the audio portion of the interview compensates for low literacy.

(2) The questionnaire is programmed to display “hot words,” highlighted in a different color; if youth are uncertain of a word's meaning, they can drag the cursor over it to display its definition.

(3) The ACASI back-end program, invisible to the respondent, includes a number of range and logic checks to guard against unrealistic values and, in some cases, asks youth to verify their responses. (4) To further assist youth having difficulty with the interview, the computer will flag those who spend a long period in particular sections of the interview and prompt the youth to obtain assistance from an interviewer. All of these procedures were used on NSYC-1 and NSYC-2.

Once the interviews are completed, individual response patterns will be assessed to identify interviews having extreme or internally inconsistent responses. NSYC-1 and NSYC-2 used three response patterns considered indicative of invalid data. These patterns included

  • a youth completed the survey in less than 10 minutes. Based on internal testing, it was determined to be extremely difficult for a respondent to seriously complete the interview in less than 10 minutes.

  • the reported number of forced sexual contacts with other youth exceeded 1.5 incidents per day for every day since admission to the facility.

  • the reported number of forced sexual contacts with facility staff exceeded 1.5 incidents per day for every day since admission.

If the youth had any of these values, their responses were eliminated from the estimates. For NSYC-1, out of 9,362 completed interviews, 89 had at least one of these response patterns. For NSYC-2, out of 8,845 completed interviews, 67 had at least one of these response patterns.
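The three screens above can be expressed as a simple filter; the dictionary keys are illustrative names we have chosen, not NSYC variable names:

```python
def flag_invalid(interview):
    """Apply the three NSYC data-quality screens: a completion time under
    10 minutes, or more than 1.5 reported forced sexual contacts per day
    since admission with either other youth or facility staff.  Returns
    True if the interview would be dropped from the estimates."""
    days = max(interview["days_in_facility"], 1)  # guard against zero days
    return (
        interview["minutes"] < 10
        or interview["youth_contacts"] / days > 1.5
        or interview["staff_contacts"] / days > 1.5
    )
```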

BJS and Westat will develop an additional list of indicators to assess whether a youth showed signs that he or she did not fully understand the survey items, whether the youth did not consistently report the details of events, or if the youth provided inconsistent responses. On NSYC-2, this list contained 30 veracity indicators. For example, one indicator was if the youth provided unrealistic personal information. Another indicator was if the youth reported in a debriefing item that questions on sexual activity were hard to understand. Other indicators compared responses in one section of the survey with responses in other sections as a measure of internal reliability.

We will combine the indicators into a count for each youth. Interviews will be removed from the estimates only if a youth has multiple indications of problems, in recognition that youth could provide extreme or inconsistent data for a few items without invalidating the entire interview. On NSYC-2, an interview was eliminated from the analysis if it exhibited at least 3 of the 30 response patterns. About 90% of youth did not give any inconsistent responses; 7.0% gave one inconsistent response; 1.8% gave two; and 0.8% gave three or more. For estimating NSYC-2 sexual victimization rates, an additional 71 interviews were excluded based on three or more indicators of inconsistent responses.

The NSYC-2 publication also provided the national victimization rates when different criteria were used (e.g., when 2 or more outliers eliminated an interview; or when 1 or more eliminated an interview). Readers could then judge the extent to which these decision rules affected the results. For NSYC-3, a similar set of procedures will be used.

Facilities expressed concern during NSYC-1 and NSYC-2 that youth could too easily overreport or falsify sexual victimization. There were also concerns that youth could telescope prior incidents of victimization forward (i.e., report incidents that occurred outside the reference period rather than while they were residing in their current facility). These issues have been addressed in several ways. First, between NSYC-1 and NSYC-2, Westat recruited 10 consultants who were either experts or investigators in the field of child welfare. The consultants were asked to review the NSYC-1 materials, comment on the methods used to evaluate the veracity of youth reporting, and suggest other ways to evaluate the veracity of youth responses. As a result of the consultants' feedback, changes were made to the NSYC-2 instrumentation (e.g., asking more details about a sexual contact that the youth reports). The NSYC-3 instrument retains these changes.

In addition, facilities in NSYC-2 reported concern about youth collusion during the survey administration period. To measure the extent of this issue, five questions about the youth's prior knowledge of and communication about the survey have been added to the NSYC-3 survey. Finally, the debriefing section at the end of the NSYC-3 instrument contains three items asking youth whether anyone in the facility may have pressured them to respond to survey questions in a certain way. If the youth endorses any of these items, follow-up questions ask who may have pressured them and about which items.

  j. Imputation for Item Nonresponse

Variables important for weighting will be imputed. In past NSYC surveys, nonresponse to these items has generally been very low. However, even in such cases, imputation can be useful in facilitating analyses and in helping to ensure consistency among analyses. Because we do not anticipate significant amounts of missing data, imputation will be done using a hot deck procedure. This uses variables associated with the variable to be imputed (e.g., the youth’s Hispanic origin) to form imputation “cells” containing both cases with reported data for a given item (“donors”) and cases with missing data for that item (“recipients”). For any variables that are imputed, Westat will provide imputation flags so the data users can distinguish imputed values from reported values. To evaluate the impact of imputation on the estimates, we will examine and compare the pre- and post-imputation distributions of each imputed item.
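A minimal hot-deck sketch along the lines described, with illustrative field names: donors and recipients share cells defined by the chosen auxiliary variables, and an imputation flag is set for each filled value so users can distinguish imputed from reported data:

```python
import random
from collections import defaultdict

def hot_deck_impute(records, cell_keys, item, seed=None):
    """Within cells defined by `cell_keys`, fill each recipient's missing
    `item` with the value from a randomly chosen donor in the same cell,
    and set an imputation flag on every record."""
    rng = random.Random(seed)
    donors = defaultdict(list)
    for rec in records:
        if rec.get(item) is not None:
            donors[tuple(rec[k] for k in cell_keys)].append(rec[item])
    for rec in records:
        rec[item + "_imputed"] = False
        if rec.get(item) is None:
            pool = donors[tuple(rec[k] for k in cell_keys)]
            if pool:  # leave the gap if the cell has no donors
                rec[item] = rng.choice(pool)
                rec[item + "_imputed"] = True
    return records
```

Comparing the pre- and post-imputation distributions of the item, as the text proposes, then reduces to tabulating `item` with and without the flagged records.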

  k. Nonresponse Bias Analysis

As noted in the sample design section, the weighting of the data will include an adjustment for nonresponse using variables available on the frame. Even after this adjustment, there may still be bias if the nonrespondents are different from the respondents within the weighting adjustment classes. Because it is difficult to make any definitive assessments related to nonresponse bias without external data, we will conduct several different analyses that check different assumptions associated with nonresponse bias:

One example of an analysis will compare facility-level response rates (e.g., by size, consent type, gender held, and type of facility) and youth-level response rates for subgroups (e.g., by characteristics on the roster). These comparisons will provide an indication of which groups and types of facilities have the lowest response rates.

A second analysis compares the final estimates (which include adjustments for nonresponse) to base-weighted estimates. This provides an indication of whether the weighting is affecting the estimates, and of the possible direction of bias, for an array of different outcomes (e.g., victimization) and respondent characteristics (e.g., gender, sexual orientation, attitudes toward staff, and mental health).

A third analysis looks into whether the consent type (ILP vs. PGC) is associated with differences in victimization rates. This can be investigated by estimating a two-level regression model (level 1 – youth; level 2 – facility), which controls for respondent and facility characteristics, including consent type. If consent type is significant, there would be some indication that the lower response rates associated with PGC may introduce systematic bias relative to ILP facilities.
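The second of these checks, comparing base-weighted estimates to final nonresponse-adjusted estimates, can be sketched in a few lines (the weights and outcome values below are hypothetical illustrations, not NSYC data):

```python
def weighted_mean(values, weights):
    """Weighted estimate of a mean or proportion."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Hypothetical data: a 0/1 victimization indicator per respondent,
# equal base weights, and final weights after nonresponse adjustment.
victim  = [1, 0, 0, 1, 0, 0]
base_w  = [10, 10, 10, 10, 10, 10]
final_w = [14, 9, 9, 15, 9, 10]

base_est  = weighted_mean(victim, base_w)    # estimate before adjustment
final_est = weighted_mean(victim, final_w)   # estimate after adjustment
shift     = final_est - base_est             # sign hints at direction of bias
```

A nonzero shift for a given outcome indicates that the nonresponse adjustment is moving the estimate, which is the signal this comparison is designed to surface.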

  1. Methods to Maximize Response

The survey materials are designed to be clear, concise, and easy to use. Project staff will discuss with state liaisons whether the state can provide ILP consent. This discussion will include possible adaptations of the study protocol; for example, a state might be able to provide consent so long as the parent/guardian was notified of the survey and did not refuse to allow their child to participate (i.e., passive consent). In addition, the study team is prepared to assist facilities in obtaining PGC for sampled youth. This assistance includes conducting the mailings, using special mailing procedures (e.g., express delivery), making telephone calls to check on consent packages, and obtaining verbal consent by telephone (when approved by the state or non-state agency).

The voluntary nature of participation and the assurance of confidentiality of the data collected are explained and reiterated throughout the PGC process and in the youth assent process. The NSYC questionnaires have been designed to maximize privacy and foster an awareness of the steps taken to protect confidentiality. Examples include the audio-only delivery of questions through headphones and the lack of direct involvement of the interviewer once the youth begins answering the questions.

The NSYC questionnaires have also been designed to maximize respondent comprehension and participation and minimize burden. Examples include the touch-screen interface with the questions simultaneously delivered through audio feed via headphones. A Spanish version of the questionnaire is available for non-English, Spanish-speaking respondents. Westat field staff will be available to answer any questions that youth may have, including bilingual staff who can answer questions in Spanish. Arrangements with mental health staff at each facility or with outside providers will be made for delivery of counseling services for respondents interested in obtaining counseling services or assistance following the survey.

  1. Test of Procedures or Methods

Revision of Youth Surveys

The NSYC-3 data collection procedures are largely unchanged from those used for NSYC-2. However, some survey questions have been modified or added to the NSYC-core and NSYC-A surveys. In the development of the new items, BJS and Westat reviewed the NSYC-2 findings and existing literature on sexual assault. Of particular interest was including items that might be correlated with sexual assault, such as sexual orientation and gender identity, minority status, disability status, and grooming by staff or other youth. Through discussions with the Office of Juvenile Justice and Delinquency Prevention (OJJDP), BJS identified additional areas of interest for the NSYC-A survey to replace the alcohol and drug questions administered in NSYC-1 and NSYC-2. Westat conducted a literature review for each new topic area and revisited the literature for existing topics to determine if NSYC-2 items could be improved. Careful consideration was given to using measures with established national norms for youth populations whenever possible.

The products of these reviews were summaries of recommended changes or additions to the survey. These reviews provided a rationale for the changes, based on extant literature and agency needs, as well as clarification on item wording. BJS and Westat held a series of meetings to review and modify the recommendations.

Once instruments were developed, BJS sponsored a 2-day national workshop. Attendees included state juvenile justice administrators, advocates, federal agency representatives (e.g., OJJDP and National Institute of Corrections), and other stakeholders. During the workshop, the proposed questionnaires were reviewed and discussed. Following this meeting, more revisions were made to the survey instruments.

The project team also conducted an Instrument Error Assessment Analysis on NSYC-2 items. The analysis examined the average timing of the sexual assault sections, looked at item percent missing, examined the kurtosis and skewness of continuous items, included an outlier analysis, evaluated consistency across items, evaluated inconsistent responses, and conducted an analysis of partially completed surveys. In addition, this review examined methodological research specific to asking juveniles about sensitive behaviors, with specific attention to possible overreporting of different types of behaviors. Decisions to add, drop or modify some items from the NSYC-2 instruments were based on this analysis.

The NSYC-core instrument contains changes to the following sections:

  • Section A on youth background and demographics was largely unchanged from NSYC-2 except for four new questions on sexual orientation and gender identity as adapted from the Federal Interagency Working Group on Improving Measurement of Sexual Orientation and Gender Identity (SOGI).

  • Section B on facility perceptions and prior victimization included new or revised questions on staff treatment, grooming, gangs, and education on PREA. All of the questions on youth misconduct and victimization except one were unchanged.

  • Section C on sexual victimization in the facility remained the same to ensure comparability with NSYC-1 and NSYC-2 findings.

  • Section D, the detailed incident form for reporting sexual assault by another youth, is administered if youth report youth-on-youth sexual assault in Section C. This section was revised to provide additional information on all reported incidents and detail on the most serious incident. The added items will help obtain detailed characteristics of specific incidents and will help assess the veracity of some of the data collected about all incidents.

  • Section E is the detailed incident form for reporting sexual assault by staff. Similar to Section D, it was revised from NSYC-2 to provide more information on all reported staff-on-youth sexual victimization and more detail on the most serious incident.

  • Section F is a new section that covers disabilities, facility living conditions, mental health, youth history of victimization, youth experience with segregation/isolation, and access to legal counsel.

Most of the new items in the NSYC-core survey were placed after Section C to eliminate the possibility of affecting how youth respond to the sexual victimization questions.

NSYC-A is substantially different from the NSYC-A instrument used in NSYC-1 and -2, which focused largely on alcohol and drug use and treatment. The NSYC-A instrument in the current study has four sections. Sections A, B, and F are the same on both NSYC-A and the NSYC-core instrument. Section G contains revised questions on drug and alcohol use, treatment programs/services for substance abuse, and grievance procedures, and new questions on facility living conditions, mental health, treatment programs/services for mental health problems, legal counsel and institutional experiences, youth education and aspirations, communication with family, and plans after release.

If youth assigned to the NSYC-core instrument complete Sections A-F in less than 35 minutes, they will be routed to Section G of the alternative survey so that their total survey time reaches 35 minutes. In Table 8, 'X' marks the sections included in each instrument, and 'P' indicates that the section is asked of youth assigned to NSYC-core only if they need additional questions to reach the 35-minute target.
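The early-finish routing described above can be sketched as follows (a minimal illustration; the function name and the treatment of the threshold are hypothetical, only the 35-minute target and the Section G routing come from the text):

```python
TARGET_MINUTES = 35

def route_after_core(instrument, elapsed_minutes):
    """Decide where a youth goes after finishing Sections A-F.

    NSYC-core youth who finish in under the 35-minute target are
    routed into Section G (borrowed from NSYC-A); everyone else
    proceeds to the debriefing.
    """
    if instrument == "NSYC-core" and elapsed_minutes < TARGET_MINUTES:
        return "Section G"
    return "Debriefing"

route_after_core("NSYC-core", 28.0)  # routed to "Section G"
route_after_core("NSYC-core", 36.5)  # routed to "Debriefing"
```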





Table 8. Sections in the NSYC-core and NSYC-A Instruments

| Section | NSYC-core | NSYC-A |
| A. Youth background and demographics | X | X |
| B. Facility perceptions and prior victimization¹ | X | X |
| C. Sexual victimization in the facility | X | |
| D. Detailed incident report form on sexual assault by another youth | X | |
| E. Detailed incident report form on sexual assault by staff | X | |
| F. Other topics (disabilities, facility living conditions, mental health, history of victimization, segregation/isolation, legal counsel, prior institutional experiences) | X | X |
| G. Other topics continued (grievance procedures, substance use, treatment programs and services, living arrangements, youth education and aspirations, communication with family, plans after release, additional items on facility living conditions and mental health) | P | X |
| Debriefing | X | X |

¹ Random assignment to NSYC-core or NSYC-A occurs here.



New items for the NSYC-3 youth instruments were cognitively tested in August 2016 after receiving OMB clearance. The goal of the cognitive testing was to ensure that youth understood the questions, terminology, and response categories for new items. Twenty cognitive interviews were conducted with adjudicated male and female youth residing in three juvenile facilities. A total of 126 questions were explored with youth using two separate protocols. For sensitive questions, such as those involving staff grooming, youth were asked to only provide their interpretation of these items rather than answer them directly.

Ten additional cognitive interviews were conducted with youth residing in the community who identified as gay, lesbian, bisexual or transgender (LGBT) or who had a close family member or friend who was LGBT in order to thoroughly test the new sexual orientation and gender identity (SOGI) items. Based on findings from the initial cognitive interviews, a second round of cognitive interviews was conducted in November and December 2016 with five additional community youth who identified as gay, lesbian, or transgender. Seven questions (four SOGI and three disability items) were tested with the community youth and young adults.

Westat and BJS examined the results of the cognitive testing and modified some questions accordingly: 74 questions worked well and required no change, 29 were recommended for minor edits, and 23 were recommended for more substantial edits.


Revision of Facility Survey


The facility survey will be administered at the same time as the data collection team visits for the youth surveys. A facility designee will complete the survey. The NSYC-3 facility survey will be administered on the web with an option to print a PDF if a hard copy version is desired.

The facility questionnaire comprises mostly new items, though several items are carried forward from the NSYC-2 facility questionnaire. Combining information about the facility with the self-reported information from the residents is expected to provide a more comprehensive picture of individual and facility characteristics when victimization is assessed. Selection of the items that remain on the facility questionnaire was informed by the NSYC-2 analysis of facility attributes and the subsequent identification of attributes associated with sexual victimization.

An initial version of the survey was reviewed and discussed at the national workshop. A subsequent expert review of the facility questionnaire was conducted with seven administrators who attended the workshop. They were asked to review the questionnaire but not to complete the questions. During 30-minute telephone interviews, they provided their overall impressions of the facility questionnaire and identified items that may be confusing or could be interpreted in multiple ways. Feedback from the expert review and the Spring 2017 Pilot Test (described below) were used to make final revisions to the facility survey.

Pilot Test of the Youth and Facility Surveys


The Pilot Test of the youth surveys was conducted in six facilities. The purpose of the Pilot was to 1) assess the length of time to complete the survey; 2) test the skip patterns and routing instructions for youth assigned to each version of the survey; and 3) determine whether any specific items were problematic. The report from the Pilot Test is presented in Appendix X.

A total of 151 youth responded to the youth surveys and completed enough of the items to be counted as completed interviews; 9 youth refused, and 1 youth's survey was unusable due to a technical error.

The Pilot Test confirmed that the procedures to maintain an average time-to-complete the survey of 35 minutes were successful. As shown in Table 9, the survey from the tutorial through the debriefing questions took, on average, 34.9 minutes to complete for NSYC-core and 33.5 minutes for NSYC-A. It took an additional 7 minutes to complete the informed assent.




Table 9. NSYC Timings in Pilot Test

| Section(s) | NSYC-core mean minutes (n=126) | NSYC-A mean minutes (n=25) |
| Assent only | 7.0 | 7.2 |
| Tutorial only | 6.4 | 6.1 |
| Total time: Sections A, B, F, and G | | 26.2 |
| Total time: Sections A through F | 17.8 | |
| Total time: Tutorial through Section F | 24.2 | |
| Total time: Sections A through G | 27.2 | |
| Total time: Sections A through Debrief | 28.5 | 27.3 |
| Total time: Tutorial through Debrief | 34.9 | 33.5 |
| Debrief | 1.3 | 1.2 |


Skip patterns were checked to assess the programming logic in the computerized survey. Two errors were found and corrected.

Frequencies for all items were reviewed to assess whether there were any problems with skipping or refusing particular questions, or a lack of empirical variation. Overall, the rate of missing data was low (well below 5%). A small number of items had more than 5% missing data; among these, one item was recommended for deletion. The others had rates above 5% because of the nature of the questions (e.g., open-ended questions explaining prior answers).
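The item-level missing-data screen described above amounts to a simple rate computation (a minimal sketch; the 5% threshold comes from the text, while the function name and the made-up responses are hypothetical):

```python
def missing_rate(responses):
    """Fraction of respondents with no usable answer to an item.

    `None` stands in for a skipped or refused response.
    """
    return sum(1 for r in responses if r is None) / len(responses)

# Hypothetical item: 3 of 40 respondents left it unanswered.
item = [1] * 30 + [0] * 7 + [None] * 3
rate = missing_rate(item)      # 0.075
needs_review = rate > 0.05     # flag items with more than 5% missing
```

Flagged items are then reviewed case by case, since a high rate may simply reflect the nature of the question (e.g., an open-ended follow-up) rather than a problem with the item.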

Frequencies were also reviewed to assess whether there was adequate variation in the new items. This review found that almost all of the new items had adequate variation. The items with low variation could be explained because they cover infrequent behaviors (e.g., gender identity). No items were recommended for deletion as a result of the review of frequencies.

The Facility Survey was administered concurrently within the same six facilities during the Pilot Test. Four facilities completed the questionnaire in 2.5 hours or less; two facilities took longer than 2.5 hours. A copy of the Pilot Test Analysis Report is attached (see Attachment 11). Recommendations were made to reduce the overall number of items on the facility questionnaire, with a particular focus on questions that asked for statistics on facility staff. Other items that did not exhibit any variation were recommended for elimination. In addition, clarification was added for items that asked about "admitted" and "released" youth.

  1. Consultation Information

BJS contacts include:

Jessica Stroop

Statistician

Bureau of Justice Statistics

810 Seventh St., N.W.

Washington, DC 20531

(202) 598-7610


The Principal Investigator for NSYC-3 is –

David Cantor

Vice-President

Westat

1600 Research Blvd

Rockville, MD 20850

(301) 294-2080



Allen J. Beck, Ph.D.

Senior Statistical Advisor

Bureau of Justice Statistics

810 Seventh St., N.W.

Washington, DC 20531

(202) 616-3277







1 Consent-eligible youth include those for whom the state or non-state agency provides in loco parentis consent, those for whom a parent/guardian provides consent, and those who are able to self-consent.

2 Brown, L.D., Cai, T. and DasGupta, A. (2001) Interval Estimation for a Binomial Proportion, Statistical Science, 16 (2), 101-138, and Wilson, E.B. (1927) Probable Inference, the Law of Succession, and Statistical Inference, Journal of the American Statistical Association, 22, 209-212.

