Pilot Study Memorandum

Research to support the National Crime Victimization Survey (NCVS)

OMB: 1121-0325

MEMORANDUM

MEMORANDUM TO: Lynn Murray

Clearance Officer

Justice Management Division


THROUGH: James P. Lynch

Director

FROM: Michael Planty

Statistician, Project Manager

DATE: February 21, 2012


SUBJECT: BJS Request for OMB Clearance for Field Testing under the National Crime Victimization Survey (NCVS) Redesign Generic Clearance, OMB Number 1121-0325.


The Bureau of Justice Statistics requests clearance for field test tasks under the OMB generic clearance agreement (OMB Number 1121-0325) for activities related to the National Crime Victimization Survey Redesign Research program. BJS, in consultation with Westat under cooperative agreement (Award 2010-BJ-CX-K077 National Crime Victimization Survey Research on Sub-national Estimates), is requesting clearance for a pilot test of two approaches to sample and contact households for a planned multi-site test of a Companion Survey to the NCVS. The goal of the NCVS Companion Survey is to help BJS determine the most viable and cost-effective option for generating sub-national estimates of crime victimization. The results from this pilot test will be used to inform the design of a full-scale test planned for late 2012.


The current request for approval is under the NCVS Redesign Generic Clearance, Number 1121-0325. The generic clearance is currently set to expire May 31, 2012, and an extension has been submitted for approval. However, the field collection for the current request will overlap the current and requested clearance periods. For this reason, BJS is requesting an exception to the requirement to display the OMB expiration date on the instruments; the OMB number will be printed on the forms.


Purpose of the Research

BJS, in consultation with Westat under a cooperative agreement, has planned an NCVS Companion Survey (CS) as a way of producing sub-national estimates using a more cost-effective approach than the core NCVS. The first data collection phase of the NCVS-CS is a pilot test to be conducted in a single MSA (the Chicago-Naperville-Joliet, IL-IN-WI MSA). The pilot will test two data collection approaches, both using an address-based sample (ABS) design. One approach (which will be referred to as “Approach 2B” or a “telephone number harvest”) will screen by mail only those addresses for which we are unable to obtain a valid telephone number from directory services; the purpose of this mail screener is primarily to obtain a telephone number. The other approach (referred to as “Approach 2C” or the “two-phase ABS hybrid”) will screen all selected addresses by mail with a goal of oversampling households that are likely to include a victim of a crime. The 2C approach will also include questions that might be used to support model-based small-area estimates (SAE). For both approaches, we will conduct a telephone version of the core NCVS interview with sampled households, including a household informant and one or two randomly selected adults. (Refer to Attachments A and B for a background discussion on how the NCVS-CS approach was selected.) If the pilot data collection indicates that one of the approaches is viable for producing sub-national estimates in a cost-effective manner, then BJS plans to conduct a full-scale test in an additional five MSAs. Clearance for this five-MSA collection would be requested in a future OMB package (anticipated submission in June 2012).


The goals of the pilot are to:

  • Assess the viability of ABS in obtaining sub-national estimates in a cost-effective manner.

  • Identify which of Approaches 2B and 2C provides more information for producing blended estimates.

  • Identify which of Approaches 2B and 2C provides more information for small area estimation.

  • Analyze the effectiveness of the 2C screener at identifying households with a victim.

  • Determine optimal subsampling fractions for the full-scale test.


More details on the analytic objectives of the pilot data collection are included in Attachment L.


Usability and Cognitive Testing

Three rounds of cognitive interviews have been completed to test the 2B and 2C mail screeners, which are new to the NCVS methodology. The first round of cognitive testing took place primarily in Baltimore, MD, and the second and third rounds were conducted in Chicago, IL. A description of the cognitive interview methodology has already been delivered to OMB under a prior submission. Both the 2B and 2C screeners performed well in the testing: there were no comprehension issues associated with any of the questions in either screener. Even respondents with relatively low levels of education (less than a high school diploma) understood the questions and could choose responses that mapped to their actual experiences. Respondents were able to repeat the questions in their own words and were able to report for the 12-month time frame. When asked to validate the time frame, respondents could provide anchors in time that sounded accurate and inspired confidence. During testing, three areas warranted instrument revision:


  1. The response options used for the neighborhood questions were problematic. The neighborhood questions ask respondents to agree or disagree with statements about neighborhood cohesion, using the following response options: “Strongly agree,” “Agree,” “Neither agree nor disagree,” “Disagree,” and “Strongly disagree.” These neighborhood-cohesion survey items come from an existing instrument (Project on Human Development in Chicago Neighborhoods: Community Survey, 1994-1995). In the completed rounds of the NCVS-CS screener testing, many respondents chose “Neither agree nor disagree” when they actually meant “sometimes” or “some people but not all people.” Other respondents used “Neither agree nor disagree” to indicate a “don’t know” response.


Recommended Solution: In response to this consistent problem, we have added a “don’t know” response option to each of these questions to reduce the ambiguity of the middle category response. Since the source survey was interviewer administered, a “don’t know” option was implicitly included, so this modification is a relatively minimal change.


  2. Frame of reference for crime victimization. Mail screener 2C includes questions about victimization. Two of the crime victims from Baltimore thought that the victimization questions starting at Question 9 were limited to victimization that occurred in the neighborhood. One respondent was reluctant to report a theft of personal property that occurred in Washington, DC, because it did not occur in his neighborhood, and another respondent reported a theft that a neighbor experienced because it occurred in the neighborhood.


Recommended Solution: As this seems to be a context effect, one solution might be to revise the question order. However, we are currently asking the neighborhood questions first in order to promote interest in the survey, and we view this as important; starting the survey with questions about crime victimization could be detrimental to response. Because these data are to be used primarily to support oversampling, and are not being used to generate estimates of victimization, we do not suggest any changes to the question order. Instead, we are revising the instruction to indicate that respondents should include crimes regardless of where they occurred.1


  3. Redundant Content. The screener instrument that was tested included two opinion questions about police involvement. The first asked whether “police are doing a good job in dealing with problems that really concern people in this neighborhood.” The second asked whether “police do a good job in responding to people in the neighborhood after they have been victims of crime.” Respondents’ answers to these two items were almost always identical; the two items tap into the same sentiment and did not produce unique information.


Recommended Solution: In order to reduce the length of the instrument, the second question about police response has been deleted.



Use of Incentives in the Pilot Test

N/A – no incentives are planned



Design of the Pilot Test

The pilot sample will explore two different approaches to sampling and contacting households and adults. The designs for both approaches start with a stratified simple random sample of addresses selected from the ABS frame in the Chicago-Naperville-Joliet, IL-IN-WI MSA. The ABS frame is a file of residential addresses that is maintained by a vendor, based on the United States Postal Service (USPS) Computerized Delivery Sequence File (CDSF). Two strata will be created, one for the central city of Chicago and one for the rest of the MSA. The sampling rate in the central city will be 1.5 times the rate in the remainder of the MSA to increase the expected yield of victims of crime and improve the ability to make comparisons with the Core NCVS. The sample will be randomized within stratum to the two experimental conditions. Refer to Figures 1 and 2 in Attachment C for flow charts illustrating the two approaches. We describe these two approaches below.
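
For illustration, the sketch below (in Python, with a purely hypothetical frame; the field names, counts, and the equal split between conditions are placeholders rather than the actual design parameters) shows a stratified simple random sample drawn at 1.5 times the base rate in the central city and randomized to the two experimental conditions within stratum.

    import random

    random.seed(2012)

    # Hypothetical ABS frame: one record per residential address, flagged by stratum.
    # Field names and counts are illustrative, not the real frame layout.
    frame = ([{"address_id": i, "stratum": "central_city"} for i in range(40_000)] +
             [{"address_id": 40_000 + i, "stratum": "balance_of_msa"} for i in range(160_000)])

    BASE_RATE = 0.05  # illustrative sampling rate for the balance of the MSA
    RATES = {"balance_of_msa": BASE_RATE,
             "central_city": 1.5 * BASE_RATE}  # central city sampled at 1.5x the base rate

    # Stratified simple random sample of addresses.
    sample = []
    for stratum, rate in RATES.items():
        addresses = [a for a in frame if a["stratum"] == stratum]
        sample.extend(random.sample(addresses, round(rate * len(addresses))))

    # Randomize the sampled addresses to the experimental conditions within stratum.
    # (An equal split is used here purely for brevity; the pilot actually allocates
    # 12,500 addresses to Approach 2B and 14,000 to Approach 2C.)
    for stratum in RATES:
        in_stratum = [a for a in sample if a["stratum"] == stratum]
        random.shuffle(in_stratum)
        for i, a in enumerate(in_stratum):
            a["condition"] = "2B" if i < len(in_stratum) // 2 else "2C"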


Design of Approach 2B. In Approach 2B, addresses will be sampled from the ABS frame and immediately matched by a vendor to identify telephone numbers associated with the addresses. Those with matching telephone numbers will be sent an advance letter (Attachment J) and then called by a trained telephone interviewer. The NCVS household screener will be conducted with an adult (18 or older) living in the household. (This is the household interview.) In that interview, the respondent will be asked to verify that their address matches the sampled address, and two adults (if there are two or more adults in the household) will be randomly sampled for the personal victimization screener. In most households the household informant will be a sampled adult: about 85 percent of households have two or fewer adults, so the chances of sampling the informant are very high.


In Approach 2B, if no telephone number can be linked to the address or if the linked telephone number is not correct for the address, then a mail screener will be sent to the address. The primary purpose of the mail screener in this approach is to obtain a telephone number. In order to increase interest in the survey, we will include a limited number of questions on perceptions of the neighborhood and of emergency services. Those households that provide a telephone number will be called by a telephone interviewer, using the same procedure as described above for matched households. Those with no telephone number will be classified as nonrespondents.


Design of Approach 2C. In Approach 2C, all addresses sampled from the ABS frame will be sent a mail screener that contains brief questions on victimization experiences, perception of crime, and items associated with victimization. The items on the mail screener will be used to classify households as either High Risk (likely to have experienced victimization in the past year) or Low Risk (unlikely to have experienced victimization). Households classified as High Risk will be sampled with certainty and called by a telephone interviewer as in Approach 2B. The Low Risk households will be subsampled at a rate of ½ and those selected will be sent for the telephone interview. We plan to release the 2C sample in two replicates so that we can review the sampling rate and make appropriate revisions in the second replicate.


In Approach 2C, sampled addresses that do not return the mail screener (as well as those that do return the screener without a telephone number and are sampled for the second phase) will be matched to find a telephone number. We will subsample nonresponding households with matching telephone numbers at a rate of one-half initially; those selected will be called for an interview. Sampled addresses with no telephone number, whether they returned a mail screener or not, will be classified as nonrespondents. The subsampling rate for nonresponding households may be increased later in the field period if needed to obtain the target number of completed household screeners.


Test of Survey Name. In addition to testing the two different sampling approaches, we plan to implement a test of the 2C mail instrument to assess whether “localizing” the instruments has any effect on the response rates and response patterns. Half of the sample allocated to the 2C instrument will randomly be assigned to the localized version (which will use the phrase “Chicagoland” in the survey name and FAQs2) and the remainder will get a generic version of this instrument3.



Sample Design

Our planned sample sizes and expected number of completed screener and extended interviews for the two approaches are summarized in Tables 1 and 2.



Table 1. Proposed sample size for pilot, Approach 2B

Sample Components                             Assumptions
Addresses sampled                             12,500
Vacancy Rate                                  12%
Occupied households                           11,000
Vendor phone number match rate                60%
Households matched for phone #                6,600
% of Vendor phone numbers that are valid      80%
Household screeners mailed                    5,720
Households where interview attempted          6,996
Household interview response rate             35%
Expected household interview completes        2,449
Avg. # of adults sampled per household        1.706
Extended person interview response rate       75%
Expected extended person completes            3,134




Table 2. Proposed sample size for pilot, Approach 2C

Sample Components                                            Assumptions
Addresses sampled                                            14,000
Vacancy Rate                                                 12%
Occupied households                                          12,320
% of household screeners returned with phone number          40%
Household screeners returned with phone number               4,928
% of household screeners returned with no phone number       5%
Household screeners returned with no phone number            616
Vendor phone number match rate                               50%
Household screeners completed using vendor phone number      308
Subsampling rate for screener nonrespondents                 1 in 2
Household screener nonrespondents subsampled                 1,694
Household screeners completed in High Risk Stratum (25%)     1,309
Household screeners completed in Low Risk Stratum (75%)      3,927
Subsampling rate for High Risk Stratum                       1
Subsampling rate for Low Risk Stratum                        1 in 2
Households subsampled in Low Risk Stratum                    1,964
Households sent for telephone interview                      4,967
Household interview response rate                            50%
Expected household interview completes                       2,483
Average # adults sampled per household                       1.706
Extended person interview response rate                      75%
Expected extended person interview completes                 3,179


Key Assumptions

The sample sizes in Tables 1 and 2 are based on the assumptions given below; some are drawn from experience with other ABS studies at the national or state level. A worked example following the list illustrates how the assumptions combine to produce the Approach 2B yields in Table 1.

The assumptions are:

  • A vacancy rate of 12 percent of the sampled addresses.

  • For the 2B sample, the vendor will find matching phone numbers for 60 percent of sampled addresses, and 80 percent of these will ring at the sampled address; for the 2C sample, the match rate will be lower (50 percent) because only mail nonrespondents will be matched, and the mismatch rate is incorporated into the telephone interview response rate.

  • A household interview response rate of 35 percent for Approach 2B and 50 percent for Approach 2C, which has a much higher proportion of mail cooperators.

  • A sample of up to two adults per household in both approaches.

  • In Approach 2C, we assume that households will be classified as High Risk (25 percent) and Low Risk (75 percent) based on responses to the mail screener. All High Risk households will be sampled; 50 percent of the Low Risk households will be sampled.

  • In Approach 2C, 50 percent of the mail screener nonrespondents with a vendor-supplied phone number will be subsampled.

  • Property crime victimization rate of 14 percent and violent crime victimization rate of 2 percent.

  • For High Risk households a property crime victimization rate of 30 percent and a violent crime victimization rate of 4 percent. For Low Risk households a property crime victimization rate of 10 percent and a violent crime victimization rate of 1.3 percent.

  • An adult conditional response rate (conditional upon household response) of 75 percent for both approaches.
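
As a worked example, the sketch below traces these assumptions through to the Approach 2B figures in Table 1. One input, the 30 percent rate at which mailed 2B screeners are assumed to come back with a usable telephone number, is not listed above; it is inferred from the 1,716 mail screener returns implied by Tables 1 and 4, so it should be read as illustrative rather than an additional stated assumption.

    # Worked derivation of the Approach 2B yields in Table 1 from the assumptions
    # above. The 30% rate at which mailed screeners return a usable phone number is
    # inferred from Tables 1 and 4 (1,716 returns / 5,720 screeners mailed), not
    # stated explicitly in the assumptions, so it is an illustrative value.

    addresses_sampled = 12_500
    vacancy_rate = 0.12
    phone_match_rate = 0.60
    valid_phone_rate = 0.80
    mail_phone_return_rate = 0.30          # inferred; see note above
    hh_response_rate = 0.35
    adults_per_household = 1.706
    person_response_rate = 0.75

    occupied = addresses_sampled * (1 - vacancy_rate)             # 11,000
    matched = occupied * phone_match_rate                         # 6,600
    valid_matched = matched * valid_phone_rate                    # 5,280 with a working number
    screeners_mailed = occupied - valid_matched                   # 5,720 (no match or bad number)
    phones_from_mail = screeners_mailed * mail_phone_return_rate  # ~1,716
    attempted = valid_matched + phones_from_mail                  # ~6,996
    hh_completes = attempted * hh_response_rate                   # ~2,449
    person_completes = hh_completes * adults_per_household * person_response_rate  # ~3,133 (Table 1 shows 3,134)

    print(round(occupied), round(matched), round(screeners_mailed),
          round(attempted), round(hh_completes), round(person_completes))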



The proposed design for Approach 2C is relatively simple and is not likely to be the optimal design for future data collections. It is designed primarily for improving the ability to predict victimization for future surveys and thus restricts the differential sampling rates applied. For example, an optimal design for estimating violent crime might use the data from the mail screener responses to form several strata and differentially sample the households to achieve a higher yield and greater precision for estimates of characteristics of violent crime. Such a design requires prior estimates of the specificity and sensitivity of the data from the screener to classify households with adults who are likely to have been victims of crime. No such estimates are available until the first pilot has been undertaken. Thus, one of the goals of the pilot is to generate the data that will enable us to optimize the design for future rounds.


Critical assumptions in comparing the two approaches are the percentage of households classified as High Risk (and Low Risk) and the victimization rates for those groups. We do not have any evidence as to what these should be, and have made conservative assumptions. If the mail screener has good properties (is able to better classify victims) then this would benefit Approach 2C. The current assumptions can only be assessed by the pilot results.


Our current propensity models are based on analysis from the core NCVS and are thus restricted to predictors from that survey. The ability to predict victimization based on NCVS data is limited, so the stratification into risk categories will primarily be based on responses to the mail survey victimization items. Any positive response to a victimization item on the mail screener will place the household in the High Risk stratum. Households with no positive responses to the victimization items will be scored by the propensity model based on the core NCVS, and those with the highest propensity will be classified as High Risk. We expect to classify a total of 25 percent of the cases into the High Risk stratum and 75 percent into the Low Risk stratum.
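
The classification rule just described can be sketched as follows; the data structure, field names, and propensity scores are hypothetical placeholders, and the actual propensity model and cutoffs would be those fit to core NCVS data.

    HIGH_RISK_TARGET = 0.25  # aim to place roughly 25% of cases in the High Risk stratum

    def classify_returns(households):
        """Classify returned 2C mail screeners into High Risk and Low Risk strata.

        Each element of `households` is a dict with illustrative fields:
          'victim_items' - list of 0/1 responses to the mail victimization items
          'propensity'   - score from the core-NCVS-based propensity model
        """
        # Rule 1: any positive victimization response places the household in High Risk.
        high = [h for h in households if any(h["victim_items"])]
        rest = [h for h in households if not any(h["victim_items"])]

        # Rule 2: fill out the High Risk stratum with the highest-propensity households
        # until roughly 25% of all cases are classified as High Risk.
        n_needed = max(0, round(HIGH_RISK_TARGET * len(households)) - len(high))
        rest.sort(key=lambda h: h["propensity"], reverse=True)
        high, low = high + rest[:n_needed], rest[n_needed:]

        for h in high:
            h["stratum"] = "High Risk"
        for h in low:
            h["stratum"] = "Low Risk"
        return high, low

Households placed in the Low Risk stratum would then be subsampled at the one-in-two rate described above.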


Within-Household Sampling. During the household interview, the adults age 18 and older in the household will be rostered. In households with only one or two adults, all adults will be selected for the personal victimization screener. In households with more than two adults, two adults will be sampled with equal probability by the CATI program. In Approach 2B, the design effect from subsampling two adults in households with three or more adults is negligible, because so few households have more than two adults (about 16 percent of households). In Approach 2C, we have a design effect of about 1.09 due to subsampling half of the Low Risk and nonresponding households. To be conservative, we have assumed a design effect of 1.1 for Approach 2B and 1.2 for Approach 2C so that other factors, such as nonresponse weighting and within-household clustering, are accounted for in our sample size calculations.
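
A minimal sketch of this within-household selection rule follows, assuming a simple list-based roster; in practice the selection is carried out by the CATI program during the household interview.

    import random

    def select_adults(adult_roster, k=2):
        """Select adults for the personal victimization screener.

        Households with one or two adults take all adults with certainty; in larger
        households, two adults are selected with equal probability.
        """
        if len(adult_roster) <= k:
            return list(adult_roster)
        return random.sample(adult_roster, k)

    # Example: in a three-adult household, two of the three adults are selected with
    # equal probability; the household informant may or may not be among them.
    print(select_adults(["adult_1", "adult_2", "adult_3"]))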


Expected Yields. The controlling factor in the sample design is the precision for estimating characteristics and the power for testing key hypotheses under the two approaches. The sample sizes were computed to give about equal precision for both approaches, taking into account the different design effects. The ability to produce reasonable confidence intervals for characteristics of victims of property crime and victims of violent crime under each of the two approaches is important. We use confidence intervals rather than power because we have no reason to believe, a priori, that the estimates of characteristics for the two approaches should differ. For this we have estimated the standard errors for a characteristic held by 30 percent of property and violent crime victims (e.g., 30 percent of violent crime victims live in rental units). Because crimes are rare, especially violent crimes, these estimates will not be precise given the sample sizes being used in the pilot. The precision is based on anticipated sample size and is discussed below.


  1. Under the assumptions, in Approach 2B we expect about 343 completed household interviews reporting a property crime (a victimization rate of 14% for 2,449 households interviewed) and 63 completed person interviews with violent crime reports, some of which may be in the same households reporting property crime (a victimization rate of 2% for 3,134 persons interviewed). The standard error for a 30 percent estimate is expected to be about 0.025 for the property crime characteristic and about 0.061 for a violent crime statistic.


  2. In Approach 2C we expect about 413 completed household interviews with property crime reports and 72 completed person interviews with violent crime reports. These numbers are derived as follows:


  • Property crime. We used different rate assumptions for the number of households reporting a property crime across the three 2C strata; for all strata we assume a 50 percent response rate. We assume a 10 percent victimization rate for the sample of 1,964 households in the Low Risk stratum, to yield approximately 98 households reporting property crime (1,964 x 0.5 = 982 responding households; 982 x 0.1 ≈ 98). For the High Risk stratum, we assume a victimization rate of 30 percent for the sample of 1,309, to yield approximately 196 households reporting a property crime (1,309 x 0.5 ≈ 654; 654 x 0.3 ≈ 196). The third stratum includes 2C mail screener nonrespondents, where risk is unknown; for these we assume a 14 percent victimization rate, so for a sample of 1,694 unknown-risk households the yield is approximately 119 households reporting property crime (1,694 x 0.5 = 847; 847 x 0.14 ≈ 119). The aggregate number of 2C households reporting a property crime across the three strata is 413.

  • Violent crime. We also used different assumptions across strata for reporting a violent crime for the 2C sample. For all strata we assume there will be an average of 1.706 adults in residence and that the average person-level response rate will be 75 percent. We assume a victimization rate of 1.3 percent in the low risk stratum for 982 responding to yield 16 individuals reporting a violent crime (982 x 1.706 x 0.75 x 0.013). For adults in the high risk stratum we assume a victimization rate of 4 percent for 654 responding individuals to yield approximately 33 victims of a violent crime (654 x 1.706 x 0.75 x 0.04). Finally, for individuals in the unknown risk stratum, we assume a victimization rate of 2 percent for 847 responding individuals to yield 22 interviews with victims of a violent crime (847 x 1.706 x 0.75 x 0.02). The aggregate estimate of 2C individuals reporting a violent crime across the three strata is 72.


The standard error for a 30 percent estimate is expected to be about 0.024 for the property crime characteristic and about 0.059 for a violent crime statistic. As a result, the standard error of the estimated difference between the two approaches would be about 3.4 percentage points for property crimes and about 8.5 percentage points for violent crimes. (A worked calculation of these standard errors is sketched after this list.) Only very large increases in the sample sizes would provide more precise estimates, and this may not be consistent with the objective of this first pilot collection.


  3. If we assume a relatively large NCVS core sample size for the sample MSA (say 5,000 responding households reporting a crime out of 31,250 completed household screeners), then differences in property crime rates of less than 2 percentage points (assuming a 14 percent property crime reporting rate) will be detectable with 80 percent power. For violent crime rates (assuming a 2 percent rate), a difference of about 0.7 percentage points will be detectable with power of 80 percent.
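
The figures in items 1 through 3 follow from standard formulas: the standard error of a proportion, sqrt(deff x p(1 - p)/n); the standard error of a difference between independent estimates, sqrt(se1^2 + se2^2); and the minimum detectable difference at 80 percent power under a normal approximation. The sketch below approximately reproduces the quoted values; the design effects of 1.1 and 1.2 are taken from the Within-Household Sampling discussion above, and the inputs for the core-NCVS comparison are illustrative assumptions, so small discrepancies from the numbers in the text reflect rounding and these assumptions.

    from math import sqrt

    def se_prop(p, n, deff=1.0):
        """Standard error of an estimated proportion, inflated by a design effect."""
        return sqrt(deff * p * (1 - p) / n)

    p = 0.30  # illustrative victim characteristic (e.g., 30% live in rental units)

    # Approach 2B: 343 property-crime households, 63 violent-crime persons, deff ~1.1
    se_prop_2b = se_prop(p, 343, 1.1)  # ~0.026 (text: about 0.025)
    se_viol_2b = se_prop(p, 63, 1.1)   # ~0.061

    # Approach 2C: 413 property-crime households, 72 violent-crime persons, deff ~1.2
    se_prop_2c = se_prop(p, 413, 1.2)  # ~0.025 (text: about 0.024)
    se_viol_2c = se_prop(p, 72, 1.2)   # ~0.059

    # Standard error of the difference between the two (independent) approaches.
    se_diff_prop = sqrt(0.025**2 + 0.024**2)  # ~0.034, i.e., about 3.4 percentage points
    se_diff_viol = sqrt(0.061**2 + 0.059**2)  # ~0.085, i.e., about 8.5 percentage points

    # Minimum detectable difference versus the core NCVS in the MSA, 80% power,
    # two-sided alpha = 0.05, normal approximation. The core and effective pilot
    # sample sizes used here are illustrative assumptions.
    z_alpha, z_beta = 1.96, 0.84
    core_hh = 31_250
    pilot_hh_eff = 2_483 / 1.2  # pilot household completes deflated by the design effect
    p_prop = 0.14
    mdd_prop = (z_alpha + z_beta) * sqrt(p_prop * (1 - p_prop) * (1 / core_hh + 1 / pilot_hh_eff))
    print(round(mdd_prop, 3))   # ~0.022, i.e., on the order of 2 percentage points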


Data Collection Procedures

This section describes the data collection procedures for the NCVS-CS Pilot. Data collection for the Pilot is proposed to begin in February 2012 and close in April or May 2012. The goal is to complete the Pilot in sufficient time to inform the OMB submission package for the main NCVS-CS data collection. This full-scale test will include 5 MSAs and will be based on the most viable methodology determined by the Pilot (the OMB submission for the full-scale test is scheduled for June 2012). The Pilot data collection for the NCVS-CS will be conducted using a combination of mail and telephone administration. Mail administration will be used to notify households of their selection, to obtain neighborhood characteristics and victimization propensity for selected households, and to obtain telephone numbers. Telephone administration will be used to implement the NCVS instruments.


The data collection strategy will vary for the 2B and 2C samples. In Approach 2B, households will be sampled from the ABS frame and immediately matched to identify telephone numbers associated with those households. Those with matching telephone numbers will be sent an advance letter, then called for an interview. If no telephone number can be linked to the address or if the linked telephone number is not correct for the address, then a cover letter and mail screener (Attachments D and E) will be sent to the address, followed by a postcard reminder (Attachment K). The purpose of the 2B mail screener is to obtain a telephone number. In order to increase interest in the survey, there are a limited number of questions on perceptions of the neighborhood and emergency services. Those households that provide a telephone number will be called, using the same procedure as described above for matched households. Those with no telephone number are classified as nonrespondents.


The strategy for the 2C approach is different in that all households in the 2C sample will first be sent a cover letter and 2C mail screener (Attachments D and F1 or F2), followed by a postcard reminder (Attachment K). Those 2C households which send back a mail screener will be sub-sampled to determine which ones will be included in the NCVS telephone data collection. As indicated earlier, the 2C sample with telephone numbers will be stratified by victimization propensity and whether a mail screener was completed, with disproportionate sub-sampling across strata. Sub-sampled households will be contacted by an interviewer and will be asked to complete the telephone interviews.


The first task of the telephone interviewer will be to confirm that s/he has reached the sampled address. If the interviewer verifies that the dialed telephone number reaches the sampled address, then the interviewer will identify an adult (18 or older) living in the household who can provide information at the household level. In that interview (the household informant interview), there are three types of questions: (1) survey items about household characteristics, (2) items asking about personal victimization, and (3) items asking about household-level victimization. Based on data from the core NCVS, we anticipate that the average household informant interview will take about 25 minutes.


In the core NCVS, all individuals age 12 and older are interviewed about their personal victimization experiences. In the NCVS-CS, a random sample of up to two adults will be interviewed about their personal victimization experiences. In households with one or two adults, the household informant will (by definition) be one of the sampled adults. In households with three or more adults, the household informant may not be one of the sampled adults; in that case, the personal victimization data collected from the informant may not be used to generate estimates about personal victimization. The rationale for asking these questions of all household informants (regardless of whether they are sampled or not) is to reproduce the NCVS core as closely as possible. The household informant survey includes questions about both household property crime and personal victimization; the questions are not segmented but are mixed within the instrument. If the personal victimization questions were removed from the NCVS-CS, there could be an impact on the household-level estimates due to the loss of crime cues from the personal victimization questions.


In households with only one adult (about 30 percent of households), the NCVS-CS Pilot survey is complete once the household informant interview is completed. In households with two adults (about 54 percent of households), the interviewer will ask the household informant to pass the telephone to the other adult or to provide a telephone number where the other sampled adult may be reached. This second adult will then be asked to complete a personal victimization screener. In households with three or more adults (about 16 percent of households), the CATI system will sample two adults and will inform the interviewer which of the remaining adults have been selected to complete a personal victimization screener. (If the household informant is one of the sampled adults, then the household will be asked to complete one additional personal victimization screener; otherwise the household will be asked to complete two.) Based on data from the core NCVS, we anticipate that the average personal victimization interview will take about 7.5 minutes.


In order to maximize response rates, we will attempt multiple contacts with nonresponding households. Nonrespondents to the mail survey will be mailed a postcard reminder. Those who remain nonrespondents after the postcard will be subsampled for phone followup (when a telephone number is available from the sample database). The data collection design includes 10 call attempts to determine whether the telephone number reaches a household; if there are 10 no-contacts (across a variety of times and days), then the sample record will be closed as a final nonresponse. Telephone numbers determined to belong to a household will be called until we obtain a completed interview (or until the household refuses to participate). If a household or individual refuses to participate, a specially trained interviewer will recontact the case and attempt to convert the refusal to an interview. Respondents who refuse twice will be coded as final nonresponse; the exception is respondents deemed to be hostile to the survey request, who will be coded as final nonrespondents with no conversion attempted.


Any reports of a victimization event will result in a detailed incident interview which will be administered for each incident, or occurrence, of each crime type. This is an event-level interview, so a respondent reporting more than one event would complete multiple incident interviews (one for each crime reported). Based on data from the core NCVS, we anticipate that the average incident interview will be about 20 minutes long.


The NCVS interview, included in separate attachments to this document, includes the following instruments:

  • Household screener (Control Card), completed by a household respondent (see Attachment G);

  • Victimization screener (NCVS-1), which includes both household and personal victimization questions; the household items are asked only of the household respondent (see Attachment G); and

  • Incident reports (NCVS-2), completed when crimes are reported in either of the victimization screeners (see Attachment H).


The household screener is asked of a “knowledgeable” adult in the household, and includes questions about the sampled address and about the individuals who live there. The NCVS household screener rosters all household members and collects information about each person (such as gender, race/ethnicity, age, etc.). One difference in the NCVS-CS data collection is that we will roster only adults.


As previously described, in the core NCVS all household members age 12 and older are asked to complete a victimization screener. This instrument asks detailed questions about personal victimization. The household informant is also asked questions about household-level crime. For the CS, BJS has made some changes to this basic protocol in the interest of reducing respondent burden and cost while maintaining data quality. As indicated earlier, only adults 18 years of age or older will be interviewed in the CS, and when the household includes three or more adults, two will be randomly selected for the personal victimization survey. The household respondent may be any knowledgeable adult in the household. In most cases, the household respondent will also be a sampled adult (selected with certainty in households with one or two adults). If the household respondent happens not to have been sampled, there will be a total of three possible respondents from the household; in all other households, there will be one or two.


The final difference between the core and the CS is that the core currently includes questions on identity theft and hate crimes, asked only of the household respondent. These will not be included in the CS.

To summarize, the sequence of the interview(s) in a household would look like this:

  • Verify address and identify appropriate household respondent;

  • With the household respondent, complete the household screener and a victimization screener including the household items (except those about hate crimes and identity theft), complete any required incident reports and select sample adult(s);

  • Ask for (next) sampled adult; complete additional victimization screener if needed and any required incident reports;

  • Repeat previous step if necessary for households with 3+ adults.


A description of the adaptation of the core NCVS instruments may be found in Attachment I.



Language. The NCVS-CS Pilot will be conducted in both English and Spanish. A Spanish CATI instrument will be available to bilingual interviewers, and all mailed materials will be available in both English and Spanish. Sampled addresses will be pre-identified as “potentially Spanish-speaking” based on surname (from the USPS database) and on neighborhoods with a high concentration of Spanish-speakers (based on Census data). Those identified as potentially Spanish-speaking will be mailed materials in both English and Spanish. These households will also be called by a bilingual interviewer for the initial contact (and for subsequent contacts if the household is confirmed as Spanish-speaking).


Approximately 13 percent of the MSA sample has a Spanish surname or is located in a Spanish linguistically-isolated Census tract. All of these households will receive mailed materials in both English and Spanish. An experiment will be embedded in the 2C sample to determine what impact bilingual materials have on the general population (that is, those households that do not have a Spanish surname and are outside of linguistically-isolated tracts). A sub-sample of n=1,500 of the “general population” 2C households will be flagged for bilingual mailings. The goal will be to assess the impact of bilingual mailings on (1) overall response and (2) response of Hispanic households not identified by either Surname or by Census tract.


Telephone Interview Pretest. We propose conducting a brief pretest to serve as a “dress-rehearsal” of the CATI instruments. The goals of the pretest will be to ensure that the CATI instrument works as expected, to provide a final test of question items, to identify unmet training needs, to refine the estimates of average interview length, and to assess cooperation. The primary focus will be on the household screener and victimization screeners; a sample size of about 50 completed households should be sufficient to achieve these goals. We will explore the possibility of including a purposive sample of households with victimizations4 to assess the incident reports in CATI.


BJS and Westat have developed two different draft mail screeners, included as Attachments E and F. These correspond with Approaches 2B and 2C described earlier. The purposes of the 2C screener are to:

  1. Obtain telephone numbers to conduct the NCVS telephone interview;

  2. Provide information to predict whether the household has experienced victimization, for oversampling;

  3. Provide information that may be used in models for small area estimates; and

  4. Provide information not available from the core NCVS that may be of interest to local jurisdictions.


Burden Hours for Pilot Test

We request 33.9 burden hours for a small pretest of the CATI instruments as well as 4,194 hours for the Pilot data collection; this covers 23,320 households receiving study material letters as well as 6,313 interviewed adults (Tables 3 and 4). A worked check of these totals follows Table 4. The NCVS generic clearance allocated a predetermined combination of sample cases and burden hours that could be used for NCVS redesign efforts. The current sample size and burden hours for the field testing fall within the remaining allocation.


Table 3. Estimated Burden of the Pretest Interview Task

Interview       Number of responses   Avg time per response   Total time across all respondents
Household       50                    25 minutes              20.8 hours
Personal        35                    7.5 minutes             4.4 hours
Crime Report    26                    20 minutes              8.7 hours
TOTAL           85                                            33.9 hours


Table 4. Estimated Burden of the NCVS-CS Pilot Task

Type of Instrument                              Number of respondents   Mean time (in minutes) per response   Total hours across all respondents
Mailed supplemental materials (e.g., letters)   23,320                  1                                     388.7
2B mail screener                                1,716                   6                                     171.6
2C mail screener                                5,544                   12                                    1,108.8
Household                                       4,932                   25                                    2,055.0
Personal                                        1,381                   7.5                                   172.6
Crime Report                                    891                     20                                    297.0
TOTAL RESPONSES                                 14,464*                                                       4,193.7

*NOTE: The total in the “Number of respondents” column does not include the 23,320 households that will receive the mailed letters, because we will not know how many actually read the materials. However, this group’s burden (388.7 hours) is included in the total burden calculation.
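
The totals in Tables 3 and 4 are simply the number of responses multiplied by the average minutes per response and converted to hours; the short sketch below reproduces them as a check.

    # Reproduces the burden-hour totals in Tables 3 and 4
    # (hours = responses x minutes per response / 60).

    pretest = [  # Table 3
        ("Household", 50, 25),
        ("Personal", 35, 7.5),
        ("Crime Report", 26, 20),
    ]

    pilot = [    # Table 4
        ("Mailed supplemental materials", 23_320, 1),
        ("2B mail screener", 1_716, 6),
        ("2C mail screener", 5_544, 12),
        ("Household", 4_932, 25),
        ("Personal", 1_381, 7.5),
        ("Crime Report", 891, 20),
    ]

    def total_hours(rows):
        return sum(n * minutes / 60 for _, n, minutes in rows)

    print(total_hours(pretest))  # 33.875, reported as 33.9 hours in Table 3
    print(total_hours(pilot))    # ~4,193.7 hours, matching the Table 4 total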


Analysis Plan

The goals of the pilot are to assess whether data collection Approach 2B or 2C provides better information, relative to cost, for the various SAE approaches described in Attachment A. For details on the analysis plan for the Pilot, please refer to Attachment L. The broad analytic objectives include:

  1. Identify which of approach 2B or 2C provides more information for producing blended estimates.

  2. Identify which of approach 2B or 2C provides more information for small area estimation.

  3. Analyze the effectiveness of the 2C screener.

  4. Develop recommendations for modification of the sample design for the next MSA test.


As part of the analytic task we will also assess nonresponse bias. Attachment L also includes the plan for analyzing nonresponse bias.


Informed Consent, Data Confidentiality and Data Security

The contact letters and the script read to respondents on the telephone provide the elements of informed consent. The initial letters describe the purpose of the survey, the voluntary nature of the study, and how the address was selected, and provide a number to call with questions about the study. The script read to respondents on the telephone repeats much of this information and additionally states the length of the interview.


The data collected for this project are protected under the Bureau of Justice Statistics statutory protection. This protects the data from potential subpoena (42 USC 3789g). Access to Westat’s secure computer systems is password protected and data are protected by access privileges, which are assigned by the appropriate system administrator. All systems are backed up on a regular basis and are kept in a secure storage facility. To protect the identity of NCVS-CS respondents, no identifying information will be kept on the final survey file. Identifying information includes the address of the sampled unit and the telephone number. The survey will not be collecting the name of any of the respondents. The identifying information will be deleted once the analysis file has been created and the link is no longer needed. We estimate this to be 3 months after the study has ended. The final data sets, without the above identifiers, will be delivered to BJS at the end of the project in November 2014. Once these data are delivered, all copies at Westat will be destroyed.


With respect to personnel, all Westat employees are required to sign a pledge of confidentiality. This pledge requires employees to maintain confidentiality of project data and to follow the above procedures when handling confidential information.

1 The new introduction now reads: “The next questions ask about whether you or anyone in your household has experienced a crime in the past 12 months. Please include all crimes, no matter where it happened and even if it was not reported to the police.”

2 The term “Chicagoland” was received favorably by cognitive interview subjects in both the city of Chicago and outlying suburbs.

3 Since the 2B instrument is mailed only to those without a matching telephone number, only one version (local) of this instrument will be used.

4 We plan to recruit approximately 10 crime victims using Craig’s List on-line advertising. Those recruited through Craig’s List will receive $40 compensation for participation in the pretest.


