NCVS CS Project Summary

Research to support the National Crime Victimization Survey (NCVS)

OMB: 1121-0325



National Crime Victimization Survey (NCVS)

Companion Survey (CS)

BJS Grant No. 2010-NV-CX-K077



June 28, 2011



Project Summary



Table of Contents

1. Background
2. Selection of a Sampling and Data Collection Plan for the Companion Survey
3. Objectives and Design of the NCVS CS Pilot Study
4. Pilot Study Sample Design
5. Instrumentation
6. Data Collection Procedures

List of Appendices

Appendix A.1 – Content for the Approach 2B Mail Screener

Appendix A.2 – Content for the Approach 2C Mail Screener

Appendix B.1 – 2B Mail Screener Cognitive Interview Protocol

Appendix B.2 – 2C Mail Screener Cognitive Interview Protocol

Appendix C – Crime Scenarios for 2C Screener Testing

Appendix D – Core NCVS Instruments for the NCVS CS Pilot

Appendix E – Auxiliary Materials

Appendix F – Informed Consent Form for Cognitive Interviews

Appendix G – Recruiting Script and Screener

Appendix H – Recruiting Advertisement Material





1. Background



The National Crime Victimization Survey (NCVS) is the Nation’s primary source of information on criminal victimization. Each year data are obtained from a nationally representative sample of some 124,000 households, comprising nearly 190,000 persons, on the frequency, characteristics and consequences of criminal victimization in the United States. The survey enables the Bureau of Justice Statistics (BJS) to estimate the incidence of victimization in the form of rape, sexual assault, robbery, assault, theft, household burglary, and motor vehicle theft for the population as a whole, as well as for various subgroups of the population such as women, the elderly, members of various racial groups, city dwellers, or other groups. The NCVS provides the largest national forum for victims to describe the impact of crime and characteristics of offenders.

Since 2008, BJS has initiated a number of research projects to assess and improve upon core NCVS methodology, including redesigning the sample plan, comparing alternative modes of interview, reducing non-response bias, examining various reference period lengths, testing effectiveness of victimization screening questions, and exploring the feasibility of producing sub-national estimates of victimization. During 2009, BJS met with various stakeholders, including the Federal Committee on Statistical Methodology, State Statistical Analysis Centers, state and local law enforcement agencies, law enforcement organizations, the Office of Management and Budget, and select Congressional staff to discuss the role of the NCVS, the need for sub-national estimates, other stakeholder needs, and the challenges and potential methodologies for achieving these objectives.

In response to stakeholder interest in the production of sub-national estimates, and with the advent of funding targeted for this work, the purpose of the current research is to develop and evaluate a cost-effective sub-national companion survey of victimization. This document describes the development and pilot testing of approaches to producing sub-national estimates in one MSA. Following an evaluation of the research described here, BJS may request clearance to conduct a larger test of the approach that the pilot indicates is more cost-effective.

BJS is currently researching a number of methods to supply data and/or estimates at the sub-national level. This portfolio of small area estimation techniques includes both direct and indirect methods. One option being explored is to expand the core NCVS sample and/or restructure the sampling plan to produce state-level estimates. Other options under examination would produce indirect estimates through small area estimation techniques using existing data. The current project is intended to lay a foundation for determining the most viable and cost-effective option for the development and implementation of a large-scale effort to generate sub-national crime victimization estimates.

To help prospective applicants understand why BJS has proposed the methodological approach contained herein, a summary of the expected benefits of this NCVS redesign project follows.



Expected Benefits of this Component of the NCVS Redesign:

1. Preliminary sub-national victimization data to—

a. evaluate the utility and costs associated with collection of such data;

b. understand how the key survey results vary by geographical area to determine which data elements would benefit most from an expanded sub-national data collection program; and

c. explore synthetic estimation for sub-national areas.

2. A joint core NCVS and sub-national component strategy that—

a. leverages the strengths of two different data collection methodologies;

b. permits the companion program to be scaled according to data needs and available funding;

c. provides the ability to directly compare the costs, operational outcomes, bias, and precision between two different but complementary survey methodologies;

d. maintains the continuity of the national estimates through the core NCVS;

e. enables BJS to combine the high quality of the core NCVS data with the companion data collection to help offset the weaknesses of the lower cost methodology;

f. offers the ability to apply different data collection methods, questionnaires and sampling methods to each collection;

g. assists BJS in creating multi-year estimates for smaller areas or specific demographic subgroups with the strengths of the panel design of the core NCVS; and

h. improves BJS’ ability to measure small changes in the yearly estimates of the incidence of victimization at the national level.



2. Selection of a Sampling and Data Collection Plan for the Companion Survey



This section describes some of the approaches that were considered for producing blended small area estimates (SAEs) of victimization rates and characteristics. The first sub-section briefly reviews general methods that could be used to produce these SAEs and discusses the major advantages and disadvantages of these SAE methods. The second sub-section focuses on the blended approach and data collection and cost issues associated with alternative strategies that fall within this realm.



2.1 Methods for obtaining small area estimates (SAEs) of victimization rates



  A. Model-based estimation with currently available information

Model-based methods predict the victimization rate from administrative data or other sources, using a regression model. If there is also a direct estimate of victimization for that area from the NCVS, the SAE is a weighted average of the NCVS estimate and the prediction from the regression model; if the NCVS has no sample in the area, the SAE is the regression prediction. If the assumed regression model is correct, the resulting SAE is unbiased under the model and has smaller mean squared error than using just the direct estimate from the NCVS alone.

Advantages: SAE methods have been studied for more than 30 years and have been used in applications ranging from poverty estimation to disease mapping. The regression modeling can be done at a small area, household, or person level, depending on the information available. The models “borrow strength” from other, similar areas to achieve improved predictions. A major advantage is that these methods do not incur additional data collection costs.

Disadvantages: The quality of model-based estimators depends heavily on the model assumptions. Model-based methods require high-quality, consistently reported auxiliary information that is highly correlated with the outcomes (victimization). Auxiliary information is more likely to be available at the MSA level than at the person level; this information includes Census data on education, labor force characteristics, percentage owning homes, and SES, as well as administrative statistics collected by BJS. It is also possible to use the Uniform Crime Reports (UCR) as a source of auxiliary information. The UCR data, however, are incomplete, have biases that may differ across the small areas, and are not always highly correlated with NCVS estimates, at least at the MSA level. The UCR data appear to be an excellent source of auxiliary information for property crime, but the correlations between UCR and NCVS violent crime rates are low and sometimes negative. Current methods for combining information from different sources for SAEs assume that the UCR quantities are unbiased or have a bias that is constant for all areas; due to the voluntary nature of reporting and varying data quality by jurisdiction, this assumption is not met for UCR data.



  B. Model-based estimation with additional auxiliary information collected through a survey

The main drawback of approach (A) is the limited quality and availability of existing auxiliary information. One possible solution is to collect better auxiliary information, for example through a large mail survey in each state or in targeted areas. Such a survey could collect brief information about victimization, attitudes about crime, and similar variables.

Advantages: The mail survey could produce information of interest in its own right such as attitudes about crime or the police, as well as auxiliary information to be used in producing SAEs of victimization rates and characteristics, at a relatively low cost. A mail survey gives flexibility for moving the sample over time, to achieve greater precision in targeted geographic areas.

Disadvantages: Concepts of victimization may differ in the two surveys, and the differences may vary across demographic groups. This is a potential source of differential bias, but these biases may be addressable by modeling, whereas modeling is less likely to compensate for the UCR's differential biases across areas. As with all model-based methods, the quality of SAEs depends on how well the assumed model fits the data. In particular, the model must be trustworthy for areas that have no NCVS sample, since in those areas estimates depend entirely on the model. The method is likely to improve the accuracy of SAEs of victimization rates in broad categories; it is less likely to improve SAEs of more detailed characteristics of victimizations.



  C. Blended estimates from two surveys

An independent companion survey (CS) on victimization is conducted, and estimates from the CS are blended with those from the NCVS. The two surveys share a common concept of victimization and may even share a common instrument and data collection strategy, although these are not essential. The data collection approaches and issues for this approach are covered in the next section.

Advantages: If the CS is undertaken using lower cost data collection methods and modes, the cost of achieving more precise SAEs can be substantially lower by using a CS than by increasing the NCVS sample size. This approach gives more information than approach (B) on details of victimization that can be used for type-of-crime classification and variables of interest such as weapon use. Different methods of blending the estimates are possible. One possibility is using dual frame survey estimation methods to combine CS and NCVS estimates for SAE. Alternatively, the CS could be used as auxiliary information in a model-based approach for SAE. If an address-based sample is used for the CS, detailed auxiliary information from the Census, the UCR, and police jurisdictions can be used in the design of the survey, thus improving efficiency relative to the PSU-based NCVS. As with approach (B), the sampling design is flexible and sample can be easily moved over time to give increased precision in different areas.

Disadvantages: The quality of data from the CS may not be as high as that from the NCVS. If the CS is done by a different mode or has different response rates or interviewer effects, the sources and directions of bias in the CS and NCVS may differ. Models to estimate these biases must be developed. The statistical literature for blending biased estimates is currently very limited and new statistical methods must be developed to tackle this challenging problem.



  D. Better direct measurements of victimization

SAEs obtained by direct measurement, either through increased NCVS sample sizes or better NCVS allocation in stratified sampling, rely only on the victimization reports of the respondents. Several options exist for obtaining better direct measurements:

  1. Increased sample sizes with the current design.

  2. Improved stratification with higher concentrations of victims in strata with high sampling fractions.

  3. A two-phase sample employing an inexpensive but fallible screener followed by the NCVS. For this to be cost-effective, in most situations the sum of specificity and sensitivity of the screener should exceed 1.6.

  4. Use of a dual frame method, in which Frame A is the general population sampling frame used in the NCVS and Frame B, an incomplete frame, has a high concentration of victims. Information for constructing Frame B might be available in individual law enforcement jurisdictions, if they have contact information for crime victims and consent can be obtained. A challenge in this approach for obtaining SAEs is that the Frame B membership of NCVS respondents may be unknown, due to inaccuracies in responses about reporting crime to the police as well as differential agency responses to recording crimes. Record linkage might resolve some of these issues, although NCVS respondents who live in Phoenix but were victimized in San Francisco may be difficult to classify. Because Frame B is small relative to Frame A and has a much higher proportion of victims, small inaccuracies in determining frame membership can result in large effects on estimated victimization rates.



Advantages: Direct estimates do not require modeling and therefore do not require the model assumptions of approaches (A)-(C). They may be thought of as the gold standard for quality of estimates.

Disadvantages: For many designs, obtaining a sufficient sample size for SAEs is expensive. This is the most costly of the four methods.



2.2 Data Collection Approaches for an NCVS Companion Survey

Here we consider three different approaches to conducting a Companion Survey (CS) in small areas such as MSAs. All three assume centralized telephone interviewing to collect data to support blended estimates (Approach 2.1(C)); one would also provide data to support model-based SAE (Approach 2.1(B)). They differ in the sample frame underlying the design and in how initial contacts with households are made. In-person follow-up for a subsample of nonrespondents is feasible with any of these approaches, although it is more limited with the RDD survey.



  A. Random-digit-dial survey

Traditional RDD designs using only landline frames are becoming increasingly rare as their coverage of the household population continues to decline. The design we will consider includes samples drawn from numbers assigned to both landline and cellular service, with cell numbers screened to identify cell-only households.

Cost: We will use the cost of a completed interview (household victimization screener at a minimum) for landline RDD as a metric, assuming the same number of completed interviews across different approaches. If that cost is 1, then the cost per cell-only household RDD complete is about 4, and the cost for a two-frame design where 11% of the completed interviews are with cell-only households is about 1.4.

Response Rate: We would anticipate a screening response rate of 30-40% in large MSAs, and 70-80% for the substantive interview, for a net of 20-30%.

Advantages: RDD methodology is well-tested. Instrument design is relatively straightforward, and in most cases the entire data collection can be done on one or two contacts with the household.

Disadvantages: The potential for bias due to undercoverage and nonresponse is high. There is limited ability to stratify geographically within MSAs. The cell sample would be less geographically efficient than the landline sample. Any in-person follow-up to study nonresponse bias would be limited to telephone numbers for which an address could be obtained. We would expect only about 50-60% of sampled landline telephone numbers to have a matched address after purging for nonworking and business numbers, and some percentage (up to 20%) of these would be incorrect. There is as yet no reliable way to match cell numbers to addresses, so in-person follow-up would not be possible for the cell sample.



  B. Address-based sample (ABS) with mail survey to obtain telephone numbers

This approach begins with selection of a sample of addresses from a vendor-enhanced version of the USPS Delivery Sequence File. We would then obtain telephone numbers for these addresses from vendor services. For those addresses without a telephone number, we would attempt to obtain one by mail, using 2 or 3 mailings. The content of the mail piece would be limited and essentially non-substantive. We would then proceed with telephone interviewing in much the same way as for the RDD. During the telephone interview, the respondent would be asked to verify that the residence is at the sampled address, since a proportion of the vendor numbers are not correct. Any sampled address whose vendor-supplied telephone number proves incorrect (about 20% will not even be working residential numbers) will be placed into the mail process to obtain a telephone number.

Cost: We estimate the per-complete cost as about 1.1 times that for a landline RDD case. Thus, this approach is about 20% less expensive than Approach 2.2(A).

Response Rate: We estimate we would obtain vendor telephone numbers for about 50% of sampled addresses. While we do not have direct experience with the non-substantive screening approach, we assume about 40% of those mailed will provide a telephone number. About 20% of vendor-acquired telephone numbers would not be working or residential, and we estimate about 10-15% will be working but not actually be for the sampled address. Thus, about a third of the addresses with vendor-provided numbers would be cycled through the mail process, and we again assume about 40% response. In the end, we assume we would have good telephone numbers for about 60% of the addresses. Assuming a 40-50% screening rate (higher than RDD for a couple of reasons) and 80% for the substantive interview, the net response rate would be about 20%.

Advantages: ABS allows geographic stratification within MSAs, and has very good coverage. The telephone instrumentation would be very similar to that of the RDD approach. It is less expensive than RDD. In-person follow-up would be straightforward (with any ABS approach there is an issue with post office boxes that lack a street address, but these are a small percentage of addresses), and the sample for follow-up could be clustered within MSAs to reduce cost.

Disadvantages: The response rate is likely to be comparable to or even lower than RDD. It is also likely that there will be a differential nonresponse for those with and without valid matching telephone numbers.



  C. ABS with mail screener and telephone interviewing

This approach may be called the “two-phase ABS hybrid.” The sample selection would be the same as that for Approach 2B, but would involve mailing every sampled household a brief screener questionnaire. The content of the screener could (a) support model-based estimation as described in Approach 2.1(B), (b) provide data that are expected to be highly correlated with victimization incidents to support stratification for the second phase (telephone) survey, and (c) yield telephone numbers for a large portion of those returning the survey. Nonresponders for whom telephone numbers are obtained from a vendor would also be available for the telephone interview. The telephone follow-up would proceed essentially the same way as in Approach 2B.

A key aspect of this approach is subsampling after the screener based on likelihood of victimization. The plan is to stratify returns into high and low likelihood based on answers to screener questions, and oversample (likely take all of) those in the high likelihood stratum. The goal is to increase the number of victimizations reported without increasing the number of second-phase telephone interviews conducted. The success of this approach depends on the sensitivity and specificity of the predictor questions; the pilot will provide a chance to assess these.

Cost: The mailing would be more expensive than that for Approach 2B because the entire sample would be mailed, and the substantive screener would likely be somewhat longer. The second phase telephone interview would be somewhat less expensive because almost all of those followed up would already have cooperated with the screener. On balance, assuming no subsampling after the screening, the per-complete cost would be about 1.2 times that of a landline RDD complete, or about 10% more expensive than Approach 2B.

The relative cost with subsampling for the pilot depends on the sampling rates in the two strata. If we assume the high likelihood stratum is sampled with probability one, then we might subsample the low likelihood stratum by taking only half of them for the telephone interview. For discussion, assume the high likelihood stratum is 20% of the respondents. If this approach were followed and we wished to maintain the total number of completed telephone interviews, we would nearly double the initial sample (and the mailing costs) and reduce the total sample for follow-up by 1/3. This would increase the total cost by about 20%, bringing it up to about the level of the RDD design (Approach 2.2(A)). An alternative is to attempt to retain the same number of completed telephone interviews with at least some victimization; this could be much less expensive if the screener instrument is effective. Whether either implementation of this CS design is cost-effective for producing blended estimates would depend on the sensitivity and specificity of the predictor questions in the screener. Let

S1 = specificity = P(mail survey classifies HH as nonvictim HH | NCVS classifies HH as nonvictim HH), and

S2 = sensitivity = P(mail survey classifies HH as victim HH | NCVS classifies HH as victim HH).

Let c1 and c2 denote the costs per interview in phases 1 and 2, respectively. The ratio of the standard error for estimating prevalence under the optimal 2-phase design to the standard error for estimating prevalence using only the CS under the same budget is (McNamee, 2003):

[(1 − S2)S1]^(1/2) + [(1 − S1)S2]^(1/2) + ρ[c1/c2]^(1/2),

where ρ is the Pearson correlation between the NCVS classification and the screener classification. If both sensitivity and specificity are high, the two-phase design can result in more accurate estimates of victimization prevalence.

Using both surveys produces two levels of information that can be used to improve SAEs: the CS at phase 2 can be blended with the NCVS, and the mail survey at phase 1 can provide high-quality auxiliary information for model-based SAEs of victimization. Such a design also allows exploration of multivariate relationships between victimization and attitudes about crime.

Response Rate: We would expect about a 50-55% response to the screener, and to get (either from the respondent or a vendor) telephone numbers for about 85%. Assuming 70-80% response to the telephone follow-up, the net would be in the 30-35% range. These rate estimates would vary depending on the particular geographic area(s) being surveyed.

Advantages: Besides the ABS advantages listed for 2B, this approach would likely increase the yield of victimization reports to support blended SAEs and provide correlates for model-based SAEs. Because of the higher yield, estimates of characteristics associated with victimizations would be more accurate. Based on research done for the National Household Education Survey, we believe that the response rates would be higher than for either 2A or 2B. The design also allows exploration of relationships between victimization and questions such as attitudes about crime that may be asked in the screener.

Disadvantages: Likely somewhat more expensive than 2B for a given total achieved sample size, although it could be considerably less expensive per reported incident.



3. Objectives and Design of the NCVS CS Pilot Study



BJS, in consultation with Westat under a cooperative agreement, has planned an NCVS Companion Survey (CS) pilot to test two survey approaches using an address-based sampling (ABS) design as described in the previous section. One approach (referred to as “Approach 2B” or a “telephone number harvest”) will screen by mail only those addresses for which we are unable to obtain a valid telephone number from directory services; the purpose of this mail screener is primarily to obtain a telephone number. The other approach (referred to as “Approach 2C” or the “two-phase ABS hybrid”) will screen all selected addresses by mail with a goal of identifying and oversampling households including one or more adults likely to have been the victim of a crime, and of obtaining information that might be used to support model-based small-area estimates (SAE). For both approaches, we will conduct a telephone version of the core NCVS interview with sampled households, including a household informant and one or two randomly selected adults. The goals of the pilot are to assess each of the approaches in terms of cost, data quality, and effectiveness at supporting blended SAE. We will also assess the substantive screener’s utility in supporting model-based SAE.





4. Pilot Study Sample Design



4.1 Sampling Addresses

The pilot sample will explore two different approaches to sampling and contacting households and adults. The designs for both approaches start with a simple random sample of addresses selected from the ABS frame in the Chicago-Naperville-Joliet, IL-IN-WI MSA. The ABS frame is a file of residential addresses that is maintained by a vendor, based on the United States Postal Service (USPS) Computerized Delivery Sequence File (CDSF). Figures 1 and 2 are flow charts depicting the sample design implementation for the two approaches, which are labeled 2B and 2C.

In Approach 2B, addresses will be sampled from the ABS frame and immediately matched by a vendor to identify telephone numbers associated with the addresses. Those with matching telephone numbers will be sent to the telephone research center (TRC) and called. The household screener will be conducted with an adult (18 or older) living in the household (this is the household interview). In that interview, the respondent will be asked to verify that their address matches the sampled address, and two adults (if there are two or more adults in the household) will be randomly sampled for the personal victimization screener. In most households the household informant will be a sampled adult (about 85 percent of households have two or fewer adults so the chances of sampling the informant are very high).



Figure 1. Approach 2B: Telephone Harvest (flow chart)



Figure 2. Approach 2C: Two-phase ABS Hybrid (flow chart)

In Approach 2B, if no telephone number can be linked to the address or if the linked telephone number is not correct for the address, then a mail screener will be sent to the address. The only purpose of the mail screener in this approach is to obtain a telephone number for calling. In order to increase interest in the survey, we will include a limited number of questions on a topic such as perceptions of crime in the neighborhood. Those households that provide a telephone number will be sent to the TRC for calling, using the same procedure as described above for matched households. Those with no telephone number are classified as nonrespondents.

In Approach 2C, all addresses sampled from the ABS frame will be sent a mail screener that contains brief questions on victimization experiences, perception of crime, and items associated with victimization. The items on the mail screener will be used to classify households as either High Risk (likely to have experienced victimization in the past year) or Low Risk (unlikely to have experienced victimization). Households classified as High Risk will be sampled with certainty and sent to the TRC for a household interview as in Approach 2B. The Low Risk households will be subsampled at a rate of ½ and sent to the TRC for the interview. We plan to release the 2C sample in two replicates so that we can review the sampling rate and make appropriate revisions in the second replicate. The telephone number used in the contact will be the one returned in the mail screener (or a matching vendor number if none is returned).

In Approach 2C, sampled addresses that do not return the mail screener (as well as those that do return the screener without a telephone number and are sampled for the second phase) will be matched to find a telephone number. We will subsample nonresponding households with matching telephone numbers at a rate of one-half initially; those selected will be sent to the TRC. Sampled addresses with no telephone number, whether they returned a mail screener or not, will be classified as nonrespondents. The subsampling rate for nonresponding households may be increased later in the field period if needed to obtain the target number of completed household screeners.

To summarize the 2C approach: within the high risk stratum, all households will be selected with certainty. Within the low risk stratum, households will be sorted by propensity score (described later) and other measures from the screener responses, and a systematic sample will be selected at an initial rate of one-half. This rate may be revised after examining the crime rates reported for households in each stratum in the first replicate. A third stratum will consist of households that fail to return the mail screener but for which we have a telephone number from a vendor. These will also be subsampled at an initial rate of 50%. If the number of completed household screeners appears likely to fall short of the target, this rate may be increased. These selection rules are sketched below.



4.2 Sample Sizes

Our planned sample sizes and expected number of completed screener and extended interviews for the two approaches are summarized in Tables 1 and 2.

Table 1. Proposed sample size for pilot, Approach 2B


Approach 2B

Addresses sampled                                      12,500
Vacancy rate                                              12%
Occupied households                                    11,000
Vendor phone number match rate                            60%
Households matched for phone number                     6,600
Percent of vendor phone numbers that are valid            80%
Household screeners mailed                              5,720
Households sent to TRC for interview                    6,696
Household interview response rate                         35%
Expected household interview completes                  2,449
Average number of adults sampled per household            1.7
Extended person interview response rate                   75%
Expected extended person completes                      3,134



Table 2. Proposed sample size for pilot, Approach 2C


Approach 2C

Addresses sampled                                          14,000
Vacancy rate                                                  12%
Occupied households                                        12,320
Percent of screeners returned with a phone number             40%
Household screeners returned with a phone number            4,928
Percent of screeners returned with no phone number             5%
Household screeners returned with no phone number             616
Vendor phone number match rate                                50%
Household screeners completed using vendor phone number       313
Subsampling rate for screener nonrespondents               1 in 2
Household screener nonrespondents subsampled                1,694
Household screeners completed in High Risk stratum (25%)    1,309
Household screeners completed in Low Risk stratum (75%)     3,927
Subsampling rate for High Risk stratum                          1
Subsampling rate for Low Risk stratum                      1 in 2
Households subsampled in Low Risk stratum                   1,964
Households sent to TRC for interview                        4,967
Household interview response rate                             50%
Expected household interview completes                      2,483
Average number of adults sampled per household                1.7
Extended person interview response rate                       75%
Expected extended person interview completes                3,179

Key Assumptions

The sample sizes in Tables 1 and 2 are based on the assumptions given below. Some of these assumptions are based on experiences with other ABS studies at the national or state level.

The assumptions are:

  • A vacancy rate of 12 percent of the sampled addresses.

  • For the 2B sample, the vendor will find matching phone numbers for 60% of sampled addresses, and 80% of these phone numbers will ring at the sampled address; for the 2C sample, the match rate will be lower (50%) because only mail nonrespondents will be matched, and the mismatch rate is incorporated into the telephone interview response rate.

  • A household interview response rate of 35 percent for Approach 2B and 50 percent for Approach 2C, which has a much higher proportion of mail cooperators.

  • A sample of up to two adults per household in both approaches.

  • In Approach 2C, we assume that households will be classified as High Risk (25 percent) or Low Risk (75 percent) based on responses to the mail screener. All High Risk households will be sampled, while only 50 percent of the Low Risk households will be sampled.

  • In Approach 2C, 50% of the mail screener nonrespondents with a vendor-supplied phone number will be subsampled.

  • A property crime victimization rate of 14 percent and a violent crime victimization rate of 2 percent.

  • For High Risk households, a property crime victimization rate of 30 percent and a violent crime victimization rate of 4 percent; for Low Risk households, a property crime victimization rate of 10 percent and a violent crime victimization rate of 1.3 percent.

  • An adult conditional response rate (this is conditional upon the household response) of 75 percent for both approaches.



Critical assumptions in comparing the two approaches are the percentage of households classified as High Risk (and Low Risk) and the victimization rates for those groups. We do not have direct evidence as to what these values should be and have made what we believe are conservative assumptions. If the mail screener has good properties (that is, classifies victims well), this will benefit Approach 2C. The current assumptions can only be assessed from the pilot results.

The proposed design for Approach 2C is relatively simple and is not likely to be the optimal design for future data collections. It is designed primarily for improving the ability to predict victimization from future surveys and thus restricts the differential sampling rates applied. For example, an optimal design for estimating violent crime might use the data from the mail screener responses to form several strata and differentially sample the households to achieve a higher yield and greater precision for estimates of characteristics of violent crime. Such a design requires prior estimates of the specificity and sensitivity of the data from the screener to classify households with adults who are likely to have been victims of crime. No such estimates are available until the first pilot has been undertaken. In fact, one of the goals of the pilot is to generate the data that will enable us to optimize the design for future rounds.

Our current propensity models are based on analysis of the core NCVS and are thus restricted to predictors from that survey. Because the ability to predict victimization from NCVS data is limited, the stratification into risk categories will be based primarily on responses to the mail screener victimization items. Any positive response to a victimization item on the mail screener will place the household in the High Risk stratum. Households with no positive responses to the victimization items will be scored by the propensity model based on the core NCVS, and those with the highest propensity will also be classified as High Risk. We expect to classify a total of 25 percent of the cases into the High Risk stratum and 75 percent into the Low Risk stratum.



4.3 Within-Household Sampling

During the household interview, the adults age 18 and older in the household will be rostered. In households with one or two adults, all adults will be selected for the personal victimization screener. In households with more than two adults, two adults will be sampled with equal probabilities by the CATI program. In Approach 2B, the design effect from subsampling two adults in households with three or more is negligible, because so few households have more than two adults (about 16% of households). In Approach 2C, we have a design effect of about 1.09 due to subsampling half of the Low Risk and nonresponding households. To be conservative, we have assumed a design effect of 1.1 for Approach 2B and 1.2 for Approach 2C, so that other factors such as nonresponse weighting and within-household clustering are accounted for in our sample size calculations.



4.4 Expected Yields

The controlling factor in the sample design is the precision for estimating characteristics and the power for testing key hypotheses under the two approaches. The sample sizes were computed to give about equal precision for both approaches taking into account the different design effects for the two approaches.

The ability to produce reasonable confidence intervals for characteristics of victims of property crime and victims of violent crime under each of the two approaches is important. We use confidence intervals rather than power because we have no a priori reason to believe that the estimates of characteristics under the two approaches should differ. For this we have estimated the standard errors for a characteristic held by 30 percent of property and violent crime victims (e.g., 30 percent of violent crime victims live in rental units). Because crimes are rare, especially violent crimes, these estimates are not very precise unless the sample size is much larger than we anticipate using in the pilot. The precision at this level is the controlling factor in the sample size and is discussed below.





  1. Under the assumptions, in approach 2B we expect about 343 completed household interviews reporting a property crime and 63 completed person interviews with violent crime reports (these may be overlapping in that some addresses will report both types of crime). The standard error for a 30 percent estimate is expected to be about 0.025 for the property crime characteristic and about 0.061 for a violent crime statistic.

  2. In approach 2C we expect about 413 completed household interviews with property crime reports and 72 completed person interviews with violent crime reports. The standard error for a 30 percent estimate is expected to be about 0.024 for the property crime characteristic and about 0.059 for a violent crime statistic. As a result, the standard error of the estimated difference between the two approaches would be about 3.4 percentage points for property crimes and about 8.5 percentage points for violent crimes (these figures are reproduced in the sketch following this list). Only very large increases in the sample sizes would provide more precise estimates, and this may not be consistent with the objective of this first pilot collection.

  3. A third consideration is whether there is a difference between the victimization rates from the CS (combining the samples from the two approaches) and the NCVS rates for the same area. If we assume a relatively large NCVS core sample size for the sample MSA (say 5,000 responding households reporting a crime out of 31,250 completed household screeners), then differences in property crime rates of less than 2 percentage points (assuming a 14 percent property crime reporting rate) will be detectable with 80 percent power. For violent crime rates (assuming a 2 percent rate), a difference of about 0.7 percentage points will be detectable with power of 80 percent.



5. Instrumentation



BJS and Westat have developed two different draft mail screeners, included as Appendices A.1 and A.2. These correspond to the two methodological approaches that will be explored in a pilot data collection of the NCVS-CS. What has been called the "2C" approach includes an address-based sample, a first-phase mail screener, and a second-phase telephone interview to administer the standard NCVS instruments. The purposes of the 2C screener are to:

  1. Obtain telephone numbers to conduct the core NCVS interview;

  2. Provide information to predict whether the household has experienced victimization, for oversampling;

  3. Provide information that may be used in models for small area estimates; and

  4. Provide information not available from the core NCVS that may be of interest to local jurisdictions.



What has been termed the “2B” approach also includes an address-based sample, but a mail screener is only sent to those households where a telephone number cannot be obtained. The purpose of the 2B screener is primarily to obtain telephone numbers; it includes a subset of 2C questions judged to be engaging for a wide range of respondents.

The development of the screener content for both the 2B and 2C mail screeners included a review of the crime victimization literature to identify variables associated with violent and property crime victimization. These variables fall into the following categories: Demographic Variables, Neighborhood Characteristics, Routine Activities/Lifestyle Variables, and Fear of Crime/Perceived Risk Variables. The selected items are mostly taken from previous surveys; items judged to be threatening or complex were eliminated from consideration. Victimization items are adapted from the core NCVS for mail administration, and include cues associated with the greatest number of incident reports.




5.1 The Development Process


Both mail screeners will undergo similar development processes. Because the graphic design and layout are extremely important for mail surveys, Westat graphic and typographical artists will produce design options. Once a design for each screener has been determined, these screeners will be tested.



Testing the Screeners


Both screeners will undergo cognitive testing by Westat survey methodologists. The testing protocol is included as Appendix B and an informed consent form is provided in Appendix F. Because these instruments are self-administered, we will use a retrospective debriefing approach. A retrospective debriefing consists of allowing the respondent to complete the questionnaire the way he/she would at home. The survey methodologist observes the respondent as he/she works through the questionnaire and notes any issues or potential issues. Once the respondent has completed the questionnaire, the methodologist reviews the answers and asks the respondent to explain what his/her answers mean. Any disconnect between the intended meaning of a question and the respondent's interpretation of it often emerges at this point. The methodologist will review with the respondent any notes made while observing the respondent completing the questionnaire. Scripted probes are also delivered at this point.


We prefer a retrospective debriefing for self-administered instruments because there is some limited evidence that asking respondents to think aloud as they complete a self-administered instrument can lead to increased navigation errors for some groups of respondents (Dillman and Redline 2004). Correct navigation is crucial for the success of the screeners. It is also generally understood that reading aloud can heighten attention and the respondent could notice and attend to things that otherwise would have gone unnoticed. The screeners are short enough that respondents can remember what they were thinking when they answered the questions. If it seems that respondents to the 2C screener (i.e., the long screener) are not able to recall their thoughts when probed, then 2C can be divided into sections and debriefing and probing can be done on a section by section basis.



Goals of the Cognitive Testing


One of the main goals of the cognitive testing is to determine whether the screener encourages or discourages the respondent from providing a telephone number. The language used, the layout and design, and the other questions asked could interact to encourage or discourage providing the telephone number. The cognitive sessions will also be designed to uncover any "red flags" that the screeners could trigger for the respondent, that is, whether the respondent finds the screener frightening or off-putting, whether it generates suspicion, etc. Respondents usually signal with their body language and tone of voice when they find something off-putting or offensive. Typically, respondents will make a facial gesture or gesticulate with some other part of their bodies: wide-open eyes, a slight turn of the head, and so forth. Often respondents verbalize their thoughts and feelings with both articulated and unarticulated sounds. For example, an unarticulated sound would be an audible "ummmm," "aaaahh," "whaaaa?" or some other partial word or sound. Other respondents will verbalize their surprise or discomfort with statements like "what are you asking that for?" or "I don't see how that fits," and so forth. Any articulated or unarticulated expression of discomfort or confusion is followed up with appropriate probes. These probes are by necessity unscripted and spontaneous. The cognitive interviewer will respond with a neutral, nonbiasing probe that elicits more information from the respondent. In this situation, the cognitive interviewer will collect two types of information: (1) information on the nature and cause of the issue and (2) information that can be used to redesign the question so that the issue at hand is solved.


If the respondent does not alert the interviewer to potential “red flags” through articulations or body language, the interviewer will probe about the respondent’s level of comfort with the questions. These probes are scripted and are found in the cognitive interview protocol.


Another goal of the cognitive work is to determine the extent to which the screener can be improved in any respect. Any negative reaction to the overall design, the cover page, the informational flip-side of the cover page, the FAQs, or the questions themselves will be analyzed. If the analysis shows a problem with any of the survey components, the component will be redesigned to more efficiently embody its measurement or communication goals. The nature of the redesign would depend entirely on the types of problems indicated.




All interviews will be audio and video recorded for note-taking purposes only. We anticipate that interviews will take no more than 90 minutes and respondents will be provided with $40 reimbursement for their time.

Recruitment


Westat will recruit up to 50 respondents, allocated across the various phases of testing (refer to Appendix G for a recruiting screener). To the greatest extent possible, Westat will recruit respondents who have experienced a crime victimization during the past 12 months; we anticipate that at least half of the participants will be crime victims.

Westat will advertise in a variety of outlets, for instance, the Maryland Gazettes and the local Craigslist, for adults who have experienced a crime victimization during the past 12 months (Appendix H). We anticipate that the primary source of recruiting will be the advertisements. If the advertisements are not successful, we might use flyers placed throughout the target community. Locations might include grocery store bulletin boards, community center bulletin boards, and similar public spaces designed for such materials. Westat is exploring venues other than the Washington, DC Metro area to recruit respondents. For instance, Buffalo, New York has a different demographic profile than the Washington, DC area and adequate crime levels to test the screeners. Chicago will also be included as a testing site. Additional sites might be included, if appropriate.

All recruiting materials will explain that we are looking for adults, aged 18 and over, who have experienced a crime victimization during the past 12 months. The past 12 months will be defined as the 12 months preceding the moment the Westat recruiter talks with the potential recruit. All potential respondents will be encouraged to contact the Westat recruiter, leaving their name and contact information on a voice messaging system. The Westat recruiter will then contact the potential respondent and conduct a screening interview. The focus of the recruitment screening interview (Appendix G) will be to identify individuals who are (1) aged 18 or older and (2) crime victims within the past 12 months, and (3) to place potential respondents into basic demographic groups. Once eligible respondents have been identified, they will be offered time slots and scheduled for the interview at the venue where the cognitive interview is being held.

Respondents who experienced a crime victimization more than 12 months ago, or who have never experienced one, could also be included in the cognitive interviews. Respondents victimized more than 12 months ago could report on that victimization, and respondents who have never been victimized could complete the cognitive interview based on a scenario.

To assist the non-victims in completing the testing, we may use one of four different scenarios of crime victimizations (Appendix C). The scenarios will allow us to observe how these respondents answer the screener questions. Using scenarios as part of cognitive or usability testing is a standard method for learning how people react to the screener questions and how their answers map to the response options provided.



Iterative Testing


We will conduct these interviews as iterative rounds of cognitive testing. In iterative testing, testing continues until a significant flaw is found; testing is then halted until the flaw is corrected. This method is commonly used in usability testing. It conserves resources and usually leads to a higher total number of flaws being discovered. In the cognitive laboratory, respondents tend to focus on the most obvious issues and problems and not notice more subtle issues until the more glaring problems have been removed. The iterative approach allows the more obvious issues to be repaired early in the testing process so that later interviews can be devoted to discovering more subtle and nuanced findings.



5.2 Administering the Core NCVS Instruments

The sample for administering the core NCVS by telephone will include:

  • A subsample of those responding to the 2C screener and providing telephone numbers;

  • Addresses selected for the 2C sample for which no screener is returned, but for which the sample vendor provides a telephone number;

  • Addresses selected for the 2B sample for which the sample vendor provides a telephone number; and

  • Those responding to the 2B screener and providing telephone numbers.

The instrumentation will be the same for all sampled addresses.

The core NCVS, included in separate attachments to this document, includes the following instruments:

  • Household screener (Control Card), completed by a household respondent;

  • Victimization screener, which includes both household and personal victimization questions; the household items are asked only of the household respondent; and

  • Incident reports, completed when crimes are reported in either part of the victimization screener.

The household screener is asked of a “knowledgeable” adult in the household, and includes questions about the sampled address and about the individuals who live there. Address-based questions include:

  • Verification of address;

  • Mailing address*;

  • Year built (under certain circumstances)*;

  • Whether any other living quarters share the address, and if so some questions about the other living quarters*;

  • Whether the housing unit (HU) is owned or rented (tenure);

  • Whether the HU is student housing, owned by a public housing authority, or on Indian land*;

  • Whether the HU produces farm income above a certain amount annually*;

  • The type of HU (house, apartment, etc.);

  • How many HU are in its larger structure; and

  • Whether the HU is in a gated or walled community*.



The items marked with an asterisk (*) have purposes related to in-person data collection or to the area sample design, so will not be needed for the CS Pilot. Other items are of substantive interest.

The NCVS household screener rosters all household members (first and last names), and collects the following information about each person:

  • Gender;

  • Relationship to the reference person;

  • Who owns or rents the HU;

  • Whether the person has a usual residence elsewhere;

  • Date of birth;

  • Marital status;

  • Active duty status;

  • Education; and

  • Ethnicity and race.



Besides this household screener information, the victimization screener is also asked of a household respondent in the core NCVS, and the personal victimization questions on the screener are asked of every other person aged 12 or older in the household. Incident reports are asked of the person who reported the crime. The NCVS FR Manual instructs the FR, “For a first enumeration period household, ask to speak with one of the persons who owns or rents the home.” The first enumeration period interview is always done in person, and the victimization screener must be completed with the household respondent before any other victimization screener is attempted.

For the CS, BJS has made some changes to this basic protocol in the interests of reducing respondent burden and cost while maintaining data quality. As described in the previous section, only adults 18 years of age or older will be interviewed, and when the household includes three or more adults, two will be randomly selected from the roster completed with the household respondent. The household respondent may be any knowledgeable adult in the household. In most cases, the household respondent will also be a sampled adult (selected with certainty in households with one or two adults). If the household respondent happens not to have been sampled, there will be up to three respondents in the household; in all other households, there will be one or two.

Finally, the victimization screener currently includes questions on identity theft and hate crimes, asked only of the household respondent. These will not be included in the CS.

To summarize, the sequence of the interview(s) in a household would look like this:

  • Verify address and identify appropriate household respondent;

  • With the household respondent, complete the household screener and a victimization screener including the household items (except those about hate crimes and identity theft), complete any required incident reports and select sample adult(s);

  • Ask for the (next) sampled adult; complete an additional victimization screener if needed and any required incident reports;

  • Repeat previous step if necessary for households with 3+ adults.



A more detailed description of the adaptation of the core NCVS instruments may be found in Appendix D.



6. Data Collection Procedures



This section describes the data collection procedures for the NCVS-CS Pilot. Data collection for the Pilot is proposed to begin in early January 2012 and close in late March 2012. The goal is to complete the Pilot in sufficient time to inform the OMB submission package for the main NCVS-CS data collection (this submission is scheduled for mid-June 2012). The Pilot data collection for the NCVS-CS will be conducted using a combination of mail and telephone administration. Mail administration will be used to notify the household of its selection, to obtain neighborhood characteristics and victimization propensity for select households, and to obtain telephone numbers. Telephone administration will be used to implement the NCVS instruments. Details on the various instruments (including content) are provided in Section 5.



6.1 Mail Data Collection

We anticipate that all of the sampled addresses will receive an advance mailing explaining the survey and notifying the household that it has been selected (please refer to Appendix E for the advance letter content). The data collection strategy will vary for the 2B and 2C samples. In Approach 2B, households will be sampled from the ABS frame and immediately matched to identify telephone numbers for those households. Those with matching telephone numbers will be sent to a telephone research center (TRC) and called. If no telephone number can be linked to the address, or if the linked telephone number is not correct for the address, then a mail screener (Appendix A.1) will be sent to the address, followed by a postcard reminder. The purpose of the 2B mail screener is to obtain a telephone number. In order to increase interest in the survey, there are a limited number of questions on a topic such as perceptions of the local police and emergency services. Those households that provide a telephone number will be sent to the TRC for calling, using the same procedure as described above for matched households. Those with no telephone number are classified as nonrespondents.

The strategy for the 2C approach is different in that all households in the 2C sample will first be sent a 2C mail screener (Appendix A.2), followed by a postcard reminder. The 2C households that return a mail screener will be sub-sampled to determine which ones will be included in the NCVS telephone data collection. As indicated earlier in the sample design section, the 2C sample with telephone numbers will be stratified by victimization propensity and by whether a mail screener was completed, with disproportionate sub-sampling across strata. Sub-sampled households will be contacted by an interviewer and asked to complete the telephone interviews. Mail screener nonrespondents will be sent an advance letter (Appendix E) notifying them to anticipate a telephone call and reinforcing the importance of the survey.
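
The disproportionate second-phase subsampling in 2C can be sketched as follows (the rates and stratum labels here are hypothetical placeholders; the actual rates come from the sample design):

    import random

    # Hypothetical subsampling rates by (screener returned?, propensity stratum).
    SUBSAMPLE_RATE = {
        (True, "H"): 1.00,    # returned screener, high victimization propensity
        (True, "L"): 0.40,    # returned screener, low victimization propensity
        (False, None): 0.50,  # screener nonrespondent with a directory-matched number
    }

    def select_for_telephone_followup(returned, stratum, rng=random.Random(2012)):
        """Decide whether a 2C household enters the telephone data collection."""
        return rng.random() < SUBSAMPLE_RATE[(returned, stratum)]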



6.2 Telephone Interviews

The first task of the telephone interviewer will be to confirm that s/he has reached the sampled address. If the interviewer verifies that the dialed telephone number reaches the sampled address, the interviewer will identify an adult (18 or older) living in the household who can provide information at the household level. That interview (the household informant interview) contains three types of questions: (1) survey items about household characteristics, (2) items asking about personal victimization, and (3) items asking about household-level victimization. Based on data from the core NCVS, we anticipate that the average household informant interview will be about 25 minutes long.

In the core NCVS, all individuals age 12 and older are interviewed about their personal victimization experiences. In the NCVS-CS, a random sample of two adults per household will be interviewed about their personal victimization experiences. In households with one or two adults, the household informant will (by definition) be one of the sampled adults. In households with three or more adults, the household informant may not be one of the sampled adults; in that case, the data collected about the informant's personal victimization experiences will not be used to generate estimates about personal victimization. The rationale for asking these questions of all household informants (regardless of whether they are sampled) is to reproduce the core NCVS as closely as possible. The household informant survey includes questions about both household property crime and personal victimization, and the questions are not segmented but are mixed within the instrument. If the personal victimization questions were removed from the NCVS-CS, there could be an impact on the household-level estimates due to the loss of crime cues from the personal victimization questions.

In households with only one adult (about 30 percent of households), the NCVS-CS Pilot survey is complete once the household informant interview is completed. In households with two adults (about 54 percent of households), the interviewer will ask the household informant to pass the telephone to the other adult, who will then be asked to complete a personal victimization screener. In households with three or more adults (about 16 percent of households), the CATI system will sample two adults and inform the interviewer which of the remaining adults have been selected to complete a personal victimization screener (if the household informant is one of the sampled adults, the household completes one additional personal victimization screener; otherwise, it completes two). Based on data from the core NCVS, we anticipate that the average personal victimization interview will be about 7.5 minutes long.
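
The within-household selection rule described above amounts to the following sketch (illustrative Python; the production CATI logic will differ in detail):

    import random

    def personal_screeners_needed(roster, informant):
        """Return the adults (other than the informant) who must complete
        a separate personal victimization screener."""
        if len(roster) <= 2:
            sampled = list(roster)              # one or two adults: selected with certainty
        else:
            sampled = random.sample(roster, 2)  # three or more adults: two at random
        # The informant's own victimization items are collected during the
        # household informant interview, so only other sampled adults remain.
        return [adult for adult in sampled if adult != informant]

For a household of four adults in which the informant is not among the two sampled, the function returns two names, and the household yields three interviews in total.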

Any reports of a victimization event will result in a detailed incident interview which will be administered for each incident, or occurrence, of each crime type. This is an event-level interview, so a respondent reporting more than one event would complete multiple incident interviews (one for each crime reported). Based on data from the core NCVS, we anticipate that the average incident interview will be about 20 minutes long.

Telephone Interview Pretest. We propose conducting a brief pretest to serve as a “dress-rehearsal” of the CATI instruments. The goals of the pretest will be to ensure that the CATI instrument works as expected, to provide a final test of question items, to identify unmet training needs, to refine the estimates of average interview length, and to assess cooperation. The primary focus will be on the household screener and victimization screeners; a sample size of about 50 completed households should be sufficient to achieve these goals. We will explore the possibility of including a purposive sample of households with victimizations to assess the incident reports in CATI.


6.3 Methods to Maximize Response Rates


Use of Pre-notification Letters. Pre-notification letters will be mailed to engage respondent interest and cooperation by focusing on the legitimacy and importance of the study. The letters will provide advance notice of the survey contact and inform households about the purpose of the survey.

Use of Reminder Mailings and Nonresponse Followup Letters. Households receiving mail screeners will all receive a reminder postcard mailing. In addition, nonresponding households in Approach 2C for which telephone numbers are found will receive a nonresponse followup letter that will also serve as an advance letter for the telephone interview. The content of the letter will focus on the legitimacy and importance of the study. The letter will also address issues related to privacy and confidentiality of data.

Flexibility in Scheduling Interviews. In situations where a telephone respondent is unavailable, a call appointment will be entered into the CATI management system with notations on the best time to reach the respondent.

Follow-up Telephone Contacts. Multiple call attempts will be made on different days and at different times to maximize the chances of reaching a person at home. Households that refuse during an initial telephone contact will be held for about two weeks before an interviewer attempts contact again. Interviewers will be trained to address common concerns and motivate participation.



7. Analysis Plan



In the following, 2B denotes the “telephone harvest” method and 2C denotes the “two-phase ABS hybrid” method. In the latter, we use the initial mail survey to classify the household into strata: H, with high likelihood of victimization, and L, with low likelihood of victimization. We can think of both approaches as two-phase surveys: in 2B, phase 1 consists of obtaining the telephone number, either from a vendor or from an initial mail survey; in 2C, phase 1 is the mail screener; in both approaches, phase 2 is the telephone NCVS interview.
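
For reference, the standard two-phase expansion estimator (a textbook starting point, not a commitment of the analysis plan) weights each phase 2 respondent by the inverse of the product of the phase inclusion probabilities:

    \hat{Y} = \sum_{k \in s_2} \frac{y_k}{\pi_{1k}\,\pi_{2k \mid 1}},

where \pi_{1k} is the phase 1 inclusion probability of element k and \pi_{2k|1} is its conditional probability of selection into phase 2 (for example, the stratum subsampling rate in 2C).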

A previous section discussed different approaches to producing NCVS SAEs, including 2.1(B), model-based estimation with additional auxiliary information collected through a survey, and 2.1(C), blended estimates from the core NCVS and the CS. Pilot data may be used to begin assessing each of these approaches. Thus, the goals of the pilot are to assess whether data collection Approach 2B or 2C provides better information per unit cost for SAE approaches 2.1(B) and 2.1(C); to separately evaluate the effectiveness of the 2C screener; and to inform the design of the main MSA data collection that may follow. The following sections present details of the analysis plan for each of these objectives.



7.1 Objective 1: Identify which of 2B, 2C provides more information for producing blended estimates.

Each of the following measures is to be computed for each of:

  • 2B, full sample;
  • 2B, telephone number from directory service;
  • 2B, telephone number from mail screener;
  • 2C, full sample;
  • 2C, stratum L;
  • 2C, stratum H; and
  • 2C, mail nonresponse and telephone number from directory service.

  1. Response rates, by phase, by demographics, and by victim classification or other screener variables.

  2. Response rates by characteristics of sampled census blocks.

  3. Cost per complete interview.

  4. Cost per victim in sample.

  5. Cost per victim of violent crime in sample.

  6. Cost per victim of property crime in sample.

  7. Estimated victimization rates for major type of crime (TOC) classes. Compare victimization rates from the CS with those from the NCVS, recognizing possible differences due to mode and recall period. Test whether the relative magnitudes and rankings of victimization rates by TOC are the same for the NCVS and the CS, after adjusting for census neighborhood characteristics in the respective samples.

  8. Estimated number of crimes reported to police. Compare with UCR for jurisdiction.

  9. Cost relative to standard error for estimating victimization rates and characteristics of victims from the CS.

  10. Effects of recall period. Analyze victimization rates by month of occurrence relative to interview date. Compare the recency curves for the CS and NCVS. Compare victimization rates estimated using only incidents in most recent 6 months with NCVS bounded and unbounded victimization rates (also note confounding in NCVS since generally bounded interviews are telephone and unbounded interviews are in-person).

  11. Information from interviewer debriefing sessions to elicit potential improvements to the survey protocol. Also analyze missing data patterns, distributions of responses to specific items, and out-of-range and misreported information.

  12. Other potential sources of nonsampling error. Look at differences by land/cell phone, interviewer effects, number of callbacks, etc. If possible, obtain similar data on core NCVS.

  13. Poststratification and weighting methods to produce blended estimates with NCVS. If estimated bias is large, explore other models for bias. We will derive methods for modeling bias in Spring 2011. Estimate reduction in MSE for (a) victimization rates, (b) characteristics of crime victims, (c) multivariate relationships using blended survey data.
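
As one concrete form for item 13, a minimal sketch of a blended (composite) estimator, assuming independent errors and known or well-estimated MSEs:

    \hat{\theta}_{blend} = w\,\hat{\theta}_{CS} + (1 - w)\,\hat{\theta}_{NCVS},
    \qquad
    w^{*} = \frac{\mathrm{MSE}(\hat{\theta}_{NCVS})}{\mathrm{MSE}(\hat{\theta}_{CS}) + \mathrm{MSE}(\hat{\theta}_{NCVS})},

where each MSE is the variance plus the squared bias, so a large estimated CS bias shrinks the weight on the CS estimate.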



Since we will not be using an optimal design for the pilot, we will also estimate the costs and response rates that would have resulted under a more efficient design, such as optimal allocation for the two-phase sample in 2C or an improved subsampling-within-household design.
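
One natural benchmark for that exercise is the textbook Neyman allocation with differential costs (a sketch, not the pilot design):

    n_h \propto \frac{N_h S_h}{\sqrt{c_h}},

where N_h is the number of phase 1 households in stratum h (e.g., H or L), S_h is the within-stratum standard deviation of the victimization outcome, and c_h is the average cost of a completed telephone interview in that stratum.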



7.2 Objective 2: Identify which of 2B, 2C provides more information for small area estimation.

This objective is related to the previous one, except here we consider model-based approaches for SAE.

  1. Examine census block-level variables as predictors of (violent) victimization. This can be done now with the current NCVS data at the Census Bureau. Census predictors could be used as auxiliaries for SAE with both 2B and 2C. Compare with SAEs obtained using a higher level of geography for prediction. Examine outliers in model-based predictions.

  2. Analyze associations between 2C screener questions and victimization. The focus here will be on the non-victimization-related questions such as attitudes toward police, fear of crime, routine activities, employment, etc.

  3. Fit SAE models using phase 1 data from 2C as auxiliary information, in addition to variables identified from the census. Investigate both unit-level and area-level models (an area-level sketch follows this list). Estimate the reduction in MSE under the model that results from using the phase 1 data. Examine sensitivity to model assumptions.

  4. Develop theory for using both phases of a two-phase survey as auxiliary information in SAE. Compare reductions in MSE with blended estimate from objective 1.
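
For item 3, one standard area-level formulation (the Fay-Herriot model, stated here as a sketch under normality assumptions) is

    \hat{\theta}_a = \theta_a + e_a, \qquad \theta_a = x_a'\beta + v_a,

with area effects v_a ~ N(0, \sigma_v^2) and sampling errors e_a ~ N(0, \psi_a). The resulting EBLUP shrinks the direct estimate toward the regression prediction:

    \tilde{\theta}_a = \gamma_a\,\hat{\theta}_a + (1 - \gamma_a)\,x_a'\hat{\beta},
    \qquad
    \gamma_a = \frac{\sigma_v^2}{\sigma_v^2 + \psi_a}.

Here x_a would contain the census predictors and, in 2C, the phase 1 screener summaries for area a.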





7.3 Objective 3: Analyze effectiveness of 2C screener.

We will estimate the specificity and sensitivity of the 2C screener for different crime types and examine associations between the screener questions and the TOC classification from the NCVS interview.
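
Treating the NCVS interview classification as the reference standard, with a “positive” meaning the 2C screener flags the case as likely victimized,

    \mathrm{sensitivity} = \frac{TP}{TP + FN}, \qquad \mathrm{specificity} = \frac{TN}{TN + FP}.

For example (hypothetical numbers), if the NCVS interviews classify 100 cases as victims and the screener flagged 80 of them, sensitivity is 0.80; if 900 cases are nonvictims and the screener flagged 90 of those, specificity is 810/900 = 0.90.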



7.4 Objective 4: Modify design for next round of sampling.

We will use the costs and estimates of specificity and sensitivity from pilot analyses to determine optimal subsampling fractions from stratum L for estimating victimization rates. We will also consider modifying the stratification to oversample geographic areas with variables associated with high victimization rates. Finally, we will consider alternative approaches to subsampling adults in 2C; for example, is it more efficient to use different subsampling probabilities?



7.5 Nonresponse Bias Analyses

The NCVS-CS Pilot study is a methodological test to evaluate new methods of data collection to support estimates for small areas that would be too expensive to obtain using the traditional face-to-face interviewing methods. As such, the goal of the Pilot is to explore the quality of the estimates from the new methodologies and compare them to the estimates produced using the traditional methods. An important part of that investigation is the effect of nonresponse bias. The new methodologies are likely to result in lower response rates, but this does not imply that the estimates will necessarily have more nonresponse bias. The new methods will not only have different response rates, but they will likely have different measurement error properties, so the primary goal is to compare the overall effect.

Some of the planned analyses will explore the nonresponse component of the differences between the Pilot and the traditional methods. The approaches are very similar to many nonresponse bias studies. We mention these below, but we wish to emphasize that the main focus of the analyses will be on the sources of differences between the Pilot and the Core NCVS.

Another aspect of nonresponse bias that is of particular interest in the Pilot is the evaluation of the two approaches to data collection within the Pilot itself (the 2B and 2C approaches). The sample sizes for each of the two methods are more limited, so only some types of nonresponse analyses are proposed in addition to the simple response rates for the two approaches. The other analyses will explore the following questions:

  • What is the demographic profile of nonresponding households?


  • What is the demographic profile of nonresponding individuals?


  • What level of crime victimization would be reported by nonresponding households?


  • What level of crime victimization would be reported by nonresponding individuals?


To support the nonresponse bias analysis of the two approaches, we plan to conduct a level-of-effort study that will explore whether specific subgroups are more likely to be nonrespondents at certain stages in the data collection. Part of this effort will look explicitly at those who participate only after multiple attempts (reluctant respondents). Reluctant respondents will include late responders and those who initially refused to participate but later complied. These respondents would have been nonrespondents if the methodology had not included nonresponse conversion efforts or had used a brief field period. For this reason we can use these respondents to help develop the profile of nonrespondents. We will also have their victimization data, and so we can assess their potential impact on the estimates. Even though these “reluctant” respondents may be different from the final nonrespondents, a comparison between the two groups will be useful.

Another type of nonresponse analysis that will be explored is the use of geographic data to characterize the respondents and nonrespondents. Although the pilot will be conducted in only one area, there are likely to be differences within that area that can be assessed using data from the American Community Survey or the 2010 Census. These data will include characteristics such as percent minority, median income, home ownership, race, ethnicity, etc.



8. IRB Review



The Westat IRB is currently reviewing this application. OMB will be provided the final letter of approval when it is received.





9. Data Confidentiality and Data Security



The data collected for this project are protected under the Bureau of Justice Statistics statutory protection (42 USC 3789g).

Access to Westat’s secure computer systems is password protected. All server and network data storage areas are protected by access privileges, which are assigned by the appropriate system administrator. All systems are backed up on a regular basis and are kept in a secure storage facility.

To protect the identity of NCVS-CS cognitive testing respondents, no identifying information will be kept on the final survey file. Identifying information includes the name, address, and telephone number of the cognitive interview respondent. The identifying information will be deleted once the analysis file has been created and the link is no longer needed. We estimate this to be 3 months after the NCVS-CS data collection has ended.

With respect to personnel, all Westat employees are required to sign a pledge of confidentiality. This pledge requires employees to maintain confidentiality of project data and to follow the above procedures when handling confidential information.





10. Estimate of Burden Hours



This package includes burden for three efforts:

1. Cognitive Interviews to support the design of the mail screener instruments.

2. A pretest of the CATI instruments.

3. A Pilot data collection of the NCVS-CS.

Below we have indicated the estimated burden (time and cost) for each of these three data collection efforts.

Estimated Burden of Screening for the Cognitive Interview Task

Maximum number of respondents: 100
Number of responses per respondent: 1
Time per response: 2.5 minutes
Total time across all respondents: 4.17 hours



Estimated Burden of the Cognitive Interview Task

Maximum number of respondents: 50
Number of responses per respondent: 1
Time per response: 1.5 hours
Total time across all respondents: 75 hours



Estimated Burden of the Pretest Interview Task

Type of Interview    Number of respondents    Responses per respondent    Avg. time per response*    Total time across all respondents
Household            50                       1                           25 minutes                 20.833 hours
Personal             35                       1                           7.5 minutes                4.375 hours
Crime Report         26                       1.1                         20 minutes                 8.667 hours
TOTAL                85                       1.31 (avg.)                 23.9 minutes (avg. per R)  33.875 hours (total burden)



* Time per response was calculated as follows: We assume 50 households will complete a survey. Of these we are assuming that one-third will include a crime victim (our plan is to use a purposive sample so that we can reach crime victims in order to best test the interview protocol). Of the 50 households we assume that 30% will have one adult and complete one household interview (average burden of 25 minutes). We assume the remaining 70% will complete one household interview (25 minutes) and one personal interview (7.5 minutes). Of the 85 individual respondents, we assume that 26 will report a crime (20 minutes per crime report).
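
The pretest totals follow directly from these assumptions:

    50 household interviews × 25 minutes = 1,250 minutes = 20.833 hours
    35 personal interviews × 7.5 minutes = 262.5 minutes = 4.375 hours
    26 crime reports × 20 minutes = 520 minutes = 8.667 hours
    Total: 20.833 + 4.375 + 8.667 = 33.875 hours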



Estimated Burden of the NCVS-CS Pilot Task

Type of Instrument    Number of respondents    Responses per respondent    Mean time per response    Total time across all respondents
2B mail screener      1,716                    1                           6 minutes                 172 hours
2C mail screener      5,544                    1                           12 minutes                1,109 hours
Household             4,932                    1                           25 minutes                2,055 hours
Personal              1,381                    1                           7.5 minutes               173 hours
Crime Report          893                      1.1                         20 minutes                327 hours
TOTAL RESPONSES       14,466                                                                        Total burden: 3,835 hours
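
As a check on the table, each total is the product of respondents, responses per respondent, and minutes per response, converted to hours:

    2B mail screener: 1,716 × 6 = 10,296 minutes ≈ 172 hours
    2C mail screener: 5,544 × 12 = 66,528 minutes ≈ 1,109 hours
    Household: 4,932 × 25 = 123,300 minutes = 2,055 hours
    Personal: 1,381 × 7.5 = 10,357.5 minutes ≈ 173 hours
    Crime Report: 893 × 1.1 × 20 = 19,646 minutes ≈ 327 hours

These sum to approximately 3,835 hours.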









Appendix A.1

Content for the Approach 2B Mail Screener


Please refer to accompanying pdf:

“Appendix A1 - Draft Screener for 2B Approach.pdf”








Appendix A.2

Content for the Approach 2C Mailed Screener


Please refer to accompanying pdf:

“Appendix A2 - Draft Screener for 2C Approach.pdf”






APPENDIX B.1

NCVS CS PILOT: 2B MAIL SCREENER

COGNITIVE INTERVIEW PROTOCOL



Date:_______________ Time_____ ID #:_____Interviewer Initials:_______________


  1. Introduction


Thank you for taking the time to help us out today. The session will take approximately 1 to 1 ½ hours. I’ll give you a little background about what we’ll be doing today.

Westat is working on this project for the Bureau of Justice Statistics. We are testing a paper questionnaire that may be used for the National Crime Victimization Survey. The NCVS is administered every year nationwide and collects information on crime victimization. Westat is developing a paper questionnaire that will help the NCVS be administered in more efficient ways.

Today, I’ll ask you to first complete the questionnaire, working at your own pace as if you were doing this at home. I’ll watch what you are doing and we will talk about your answers and what they mean when you are finished. We would also like to read some of the survey letters that we will be sending out.

When you finish, we will go through the survey together and I’ll ask you some questions about your answers and how you arrived at your answers. We need to make sure that people understand the question and that these are questions that people are willing to answer. This will help us improve the questionnaire.

This is a research project and your participation is voluntary. You can skip any question and you can stop at any point. We would very much appreciate your permission to audio record this conversation. The recording will be used for note-taking purposes only and will be destroyed when the project is over. When we are finished, we have 50 dollars for you in gratitude for your assistance. There are no right or wrong answers – we are interested in everything you have to say and we encourage you to speak openly about the questions and your answers. Please sign the research consent form (that says everything I just said).


  2. Consent Process



Hand the consent form to participant, answer any questions, and obtain consent before continuing.

Do you have any questions before we get started?


[Start recorder and get oral permission to record.] It is [DATE AND TIME], do I have your permission to audio record this conversation? ~~~~ Thank you.

ASK RESPONDENT TO START COMPLETING THE SCREENER.

  • Here is the questionnaire and a pen.

  • Please complete the questionnaire as if it had come in the mail and you were home alone filling it out.



  3. The Debriefing


INTERVIEWER INSTRUCTIONS

  1. DURING THE DEBRIEFING – STATE THE QUESTION NUMBER SO ANY OBSERVERS AND THE RECORDING KNOW AT ALL TIMES WHICH QUESTION IS BEING DISCUSSED.


  2. WRITE COMMENTS ON THE RESPONDENT’S QUESTIONNAIRE. WRITING ON THE RESPONDENT’S QUESTIONNAIRE CREATES A COMPLETE DOCUMENT OF THE INTERVIEW.


  3. IF RESPONDENT ASKS A QUESTION – STATE THE QUESTION NUMBER THAT THE RESPONDENT IS ASKING ABOUT AND SIMPLY SAY:


  • What makes you ask that?

  • Can you say more about that?


WHEN THE RESPONDENT HAS COMPLETED THE QUESTIONNAIRE, REVIEW ALL ANSWERS. POINT TO THE ANSWER AND SAY:

  • What does this answer mean?

  • Can you say more about that?



IF THE RESPONDENT MAKES A MISTAKE (NAVIGATION, SKIP, CHANGED ANSWER, ETC): POINT TO THE MISTAKE AND ASK:

  • What happened here?

  • Can you say more about that?



FOR PLACES WHERE RESPONDENT DISPLAYED DIFFICULTY, CONFUSION, OR ANY “EXPRESSION”:

  • You seemed to hesitate here at Q___. What were you thinking about?

  • You had a look on your face when you were reading this question.

  • Can you tell me what you were thinking?

  • You just said that you ~~~. Can you say more about that?

  • You mentioned the ~~~~ ; how did that work for you?



3.1 Global Issues: Content, Response Propensity, and Sensitivity

Now that you have finished the questionnaire, but before we talk about your individual answers, let’s talk about some things in general.

  • Can you say in your own words what the questionnaire is about?

  • Was there anything that felt a little odd or not so good to answer?

  • Was there anything that you really didn’t want to answer or that felt a bit inappropriate or were all the questions completely fine?

  • Was there anything that would be a show stopper for you?

  • Who is this survey about?

  • Who do you think should respond to this survey?



3.2 Question by Question Probes

The Introduction

POINT TO THE INTRODUCTORY TEXT AT THE TOP OF THE PAGE

  • Did you see this?

  • Did you read it?

  • Can you tell me in your own words what it said?

  • Can you tell me in your own words what you think “neighborhood” means?


[GET R’S DEFINITION OF “NEIGHBORHOOD”]

  • The way they describe neighborhood here, is this the way you think about neighborhood or do you think about neighborhood in some other way?



FOR QX 1-3, REVIEW EACH ANSWER AND ASK WHAT THE ANSWER MEANS.

Qx. 1 asks about “a good place to live.” What does that mean to you – “a good place to live”?

  • You marked “~~~.” What does that mean?

  • Can you say more about ~~~~?



Qx. 2. Is “litter, broken glass or trash” something you would notice, or would you not notice those things?

  • You marked “~~~.” What does that mean?

  • Can you say more about ~~~~?



Qx. 3. When you saw “crime” what jumped into your head?

  • You marked “~~~.” What does that mean?

  • Can you say more about ~~~~?



The Instructions above Qx 4

POINT TO THE INSTRUCTION ABOVE QUESTION 4

  • Did you read this?

  • What is it asking you to do?



Qx. 4. What made you “~~~~~” that people around here are willing to help their neighbors?

  • Can you say more about that?



Qx 5. What made you “~~~~” that this is a close-knit neighborhood?

  • Can you say more about that?



Qx. 6. Can you say in your own words what this question is asking?

  • The question asks about “people in the neighborhood.” Who do you think of when you see “people in the neighborhood”?

  • What does it mean to “trust” people in the neighborhood?

  • You answered that you “~~~~~.” Can you say what your answer means?



Qx. 7. Can you say in your own words what this question is asking?

  • The question asks about “people in the neighborhood.” Who do you think of when you see “people in the neighborhood”?

  • You answered that you “~~~~~.” Can you say what your answer means?



Qx. 8. Can you say in your own words what this question is asking?

  • The question asks about “people in the neighborhood.” Who do you think of when you see “people in the neighborhood”?

  • You answered that you “~~~~~.” Can you say what your answer means?



Qx. 9. Can you say in your own words what this question is asking?

  • What would be some of the concerns of the people in your neighborhood?

  • You answered that you “~~~~~.” Can you say what your answer means?



Qx. 10. Can you say in your own words what this question is asking?

  • The question asks about “people in the neighborhood.” Who do you think of when you see “people in the neighborhood”?

  • What does “responding to people in the neighborhood after they have been victims of crime” mean?

  • You answered that you “~~~~~.” Can you say what your answer means?



Qx. 11. Can you say in your own words what this question is asking?

  • The question asks about “when people in your neighborhood call 911, does help arrive quickly?”

  • Is this information that you or other people in your neighborhood would know?

  • You answered that you “~~~~~.” Can you say what your answer means?



Qx. 12. You answered that you “~~~~~.” Can you say more about that?

  • How did you figure that you have lived “~~~~” years at this address?



Qx. 13. You wrote/did not write in your phone number. What made you write in/not write in your phone number?

  • Is this information that you usually give out?

  • Do you feel comfortable or not comfortable giving out your phone number?

  • Do you expect to get a phone call or do you expect not to be called?

  • For what reasons do you think you would be called?

  • Can you say more about that?



  4. Envelope, Survey Cover and Letters



  • Can you say something about the envelope that the survey was enclosed in? Is there anything about the envelope that you particularly like or dislike? Anything that would encourage you (or discourage you) from opening the envelope?

  • What about the cover letter that accompanied the survey? (Is there anything in the letter that you particularly like or dislike?)

  • The logo on the envelope and the letter is for the Bureau of Justice Statistics. I would like to show you some other logos that could be used on the survey materials. Please let me know what you think of these. [SHARE THE JUSTICE PROGRAM LOGO AND THE DOJ LOGO] What about the logos? Can you say more about that? Is there one you prefer to the others? If yes, can you tell me why?

  • What do you think about the front cover of the questionnaire? Is there anything that jumps out at you?



  5. Closing



  • Is there anything else you noticed about the survey that we have not discussed?

  • Was there anything that you particularly liked or disliked about the survey?

  • Do you have any other thoughts or comments about the survey?

  • Thank you for your time.



APPENDIX B.2

NCVS CS PILOT: 2C MAIL SCREENER

COGNITIVE INTERVIEW PROTOCOL



Date:_______________ Time_____ ID #:_____Interviewer Initials:_______________


  1. Introduction


Thank you for taking the time to help us out today. The session will take approximately 1 to 1 ½ hours. I’ll give you a little background about what we’ll be doing today.

Westat is working on this project for the Bureau of Justice Statistics. We are testing a paper questionnaire that may be used for the National Crime Victimization Survey. The NCVS is administered every year nationwide and collects information on crime victimization. Westat is developing a paper questionnaire that will help the NCVS be administered in more efficient ways.

Today, I’ll ask you to first complete the questionnaire, working at your own pace as if you were doing this at home. I’ll watch what you are doing and we will talk about your answers and what they mean when you are finished. We would also like to read some of the survey letters that we will be sending out.

When you finish, we will go through the survey together and I’ll ask you some questions about your answers and how you arrived at your answers. We need to make sure that people understand the question and that these are questions that people are willing to answer. This will help us improve the questionnaire.

This is a research project and your participation is voluntary. You can skip any question and you can stop at any point. We would very much appreciate your permission to audio record this conversation. The recording will be used for note-taking purposes only and will be destroyed when the project is over. When we are finished, we have 50 dollars for you in gratitude for your assistance. There are no right or wrong answers – we are interested in everything you have to say and we encourage you to speak openly about the questions and your answers. Please sign the research consent form (that says everything I just said).



  2. Consent Process



Hand the consent form to participant, answer any questions, and obtain consent before continuing.

Do you have any questions before we get started?


[Start recorder and get oral permission to record.] It is [DATE AND TIME], do I have your permission to audio record this conversation? ~~~~ Thank you.

ASK RESPONDENT TO START COMPLETING THE SCREENER.

  • Here is the questionnaire and a pen.

  • Please complete the questionnaire as if it had come in the mail and you were home alone filling it out.



  3. The Debriefing


INTERVIEWER INSTRUCTIONS

  1. DURING THE DEBRIEFING – STATE THE QUESTION NUMBER SO ANY OBSERVERS AND THE RECORDING KNOW AT ALL TIMES WHICH QUESTION IS BEING DISCUSSED.


  2. WRITE COMMENTS ON THE RESPONDENT’S QUESTIONNAIRE. WRITING ON THE RESPONDENT’S QUESTIONNAIRE CREATES A COMPLETE DOCUMENT OF THE INTERVIEW.


  3. IF RESPONDENT ASKS A QUESTION – STATE THE QUESTION NUMBER THAT THE RESPONDENT IS ASKING ABOUT AND SIMPLY SAY:


  • What makes you ask that?

      • Can you say more about that?



WHEN THE RESPONDENT HAS COMPLETED THE QUESTIONNAIRE, REVIEW ALL ANSWERS. POINT TO THE ANSWER AND SAY:

  • What does this answer mean?

  • Can you say more about that?



IF THE RESPONDENT MAKES A MISTAKE (NAVIGATION, SKIP, CHANGED ANSWER, ETC): POINT TO THE MISTAKE AND ASK:

  • What happened here?

      • Can you say more about that?



FOR PLACES WHERE RESPONDENT DISPLAYED DIFFICULTY, CONFUSION, OR ANY “EXPRESSION”:

  • You seemed to hesitate here at Q___. What were you thinking about?

  • You had a look on your face when you were reading this question.

  • Can you tell me what you were thinking?

  • You just said that you ~~~. Can you say more about that?

  • You mentioned the ~~~~ ; how did that work for you?



3.1 Global Issues: Content, Response Propensity, and Sensitivity

Now that you have finished the questionnaire, but before we talk about your individual answers, let’s talk about some things in general.

  • Can you say in your own words what the questionnaire is about?

  • Was there anything that felt a little odd or not so good to answer?

  • Was there anything that you really didn’t want to answer or that felt a bit inappropriate or were all the questions completely fine?

  • Was there anything that would be a show stopper for you?

  • Who is this survey about?

  • Who do you think should respond to this survey?

  • Some questions ask about victimization – what comes to mind when you hear the word victimization?

  • Is victimization a word that you use or do you use some other word or way of talking about crimes that could happen to a person?

  • Can you answer these kinds of questions for yourself?

  • Can you answer these kinds of questions for others in your household?



3.2 Question-by-question Probes

The Introduction

POINT TO THE INTRODUCTORY TEXT AT THE TOP OF THE PAGE

  • Did you see this?

  • Did you read it?

  • Can you tell me in your own words what it said?

  • Can you tell me in your own words what you think “neighborhood” means?


[GET R’S DEFINITION OF “NEIGHBORHOOD”]

  • The way they describe neighborhood here, is this the way you think about neighborhood or do you think about neighborhood in some other way?



FOR QX 1-8, REVIEW EACH ANSWER AND ASK WHAT THE ANSWER MEANS.

Qx. 1 asks about “a good place to live.” What does that mean to you – “a good place to live”?

  • You marked “~~~.” What does that mean?

  • Can you say more about ~~~~?



Qx. 2. Is “litter, broken glass or trash” something you would notice, or would you not notice those things?

  • You marked “~~~.” What does that mean?

  • Can you say more about ~~~~?



Qx. 3. When you saw “crime” what jumped into your head? Can you say more about that?

  • You marked “~~~.” What does that mean?

  • Can you say more about ~~~~?



The Instructions above Qx 4

POINT TO THE INSTRUCTION ABOVE QUESTION 4

  • Did you read this?

  • What is it asking you to do?



Qx. 4. What made you “~~~~~” that people around here are willing to help their neighbors?

  • Can you say more about that?



Qx 5. What made you “~~~~” that this is a close-knit neighborhood?

  • Can you say more about that?



Qx 6. Can you say in your own words what this question is asking?

  • The question asks about “people in the neighborhood.” Who do you think of when you see “people in the neighborhood”?

  • What does it mean to “trust” people in the neighborhood?

  • You answered that you “~~~~~.” Can you say what your answer means?



Qx 7. What made you “~~~~” that the people in this neighborhood generally get along?

  • What does it mean to “generally get along”?

  • Can you say more about that?



Qx 8. What made you “~~~~” that people in this neighborhood share the same values?

  • What does it mean to “share the same values”?

  • Can you say more about that?



Experiences of People in Your Household

The Instructions

POINT TO THE TEXT RIGHT BELOW “Experiences of People in Your Household”

  • Did you see this?

  • Did you read it?

  • Can you tell me in your own words what it said?

  • What came to your mind when you saw “neighborhood”?


[SEE WHETHER DEFINITION OF “NEIGHBORHOOD” HAS CHANGED FROM PREVIOUS SECTION]

  • The instructions ask that you “please include all experiences, even if not reported to the police.”

  • What “experiences” are they referring to?

  • What does it mean “even if not reported to the police”?

  • The questions ask about “the last 12 months.”

  • How did you figure the last 12 months to answer these questions?

  • The questions also ask about “in this household.”

  • What do you think of when you see “in this household”?

  • Were you able to report for your entire household, or only for yourself, or for yourself and someone else but not really the entire “household”?

  • For the questions asked in this section, do you know this for everyone in your household, or only for yourself and maybe one other person?

  • In other words, how many people in your household would you know these things for?



FOR QX 9-16, REVIEW EACH ANSWER AND ASK WHAT THE ANSWER MEANS.

Qx. 9. “has something belonging to anyone in this household been stolen, such as, ……” What do you think this question is asking about?

  • Can you give some examples?

  • What other things should be reported here – other than the ones already named?

  • You answered “~~~~~.”

  • Can you say what your answer means?



Qx. 10. Can you say what this question is asking in your own words?

  • What types of things would you report here?

  • You answered “~~~~~.”

  • Can you say what your answer means?



Qx. 11. “were any cars, vans, trucks or other motor vehicles owned by anyone in this household stolen or used without permission.”

  • What is this question asking?

  • What types of things would you report here?

  • Anything else?

  • You answered “~~~~~.”

  • Can you say what your answer means?



Qx. 12. “did anyone steal or attempt to steal any parts from a vehicle, like a tire, car stereo, hubcap or battery, or anything that was left in a vehicle.”

  • What is this question asking?

  • What types of things would you report here? Anything else?

  • You answered “~~~~~.”

  • Can you say what your answer means?



POINT TO THE INSTRUCTION ABOVE QUESTION 13

  • Did you read this?

  • What is it asking you to do?

  • What types of things is the instruction asking you to include?



Qx. 13. What does “by force or threat” mean?

  • Can you give some examples?

  • What does “between people that don’t know each other, but often involve people who know each other” mean?

  • You answered “~~~~”. Can you say what your answer means?



Qx. 14. What does “attacked with some type of weapon” mean?

  • Can you give some examples?

  • Can you think of weapons other than a gun, knife, baseball bat, or rock?

  • You answered “~~~~”. Can you say what your answer means?



Qx. 15. What does “attacked in another way” mean?

  • Can you give some examples?

  • Can you think of other ways of being attacked than grabbing, forcing unwanted sexual activity, punching, or choking?

  • You answered “~~~~”. Can you say what your answer means?



Qx. 16. What does “threatened with any kind of attack” mean?

  • Can you give some examples?

  • Can you think of other ways of being threatened with any kind of attack?

  • You answered “~~~~”.

  • Can you say what your answer means?



Police and 911 Services

The Instructions

POINT TO THE TEXT RIGHT BELOW “Police and 911 Services”

  • Did you see this?

  • Did you read it?

  • Can you tell me in your own words what it said?

  • What came to your mind when you saw “Police”?

  • What came to your mind when you saw “911”?



FOR QX 17-19, REVIEW EACH ANSWER AND ASK WHAT THE ANSWER MEANS.

Qx. 17. Can you say in your own words what this question is asking?

  • What would be some of the concerns of the people in your neighborhood?

  • You answered that you “~~~~~.”

  • Can you say what your answer means?



Qx. 18. Can you say in your own words what this question is asking?

  • The question asks about “people in the neighborhood.”

  • Who do you think of when you see “people in the neighborhood”?

  • What do you think of when you see “responding to people in the neighborhood after they have been victims of crime”?

  • You answered that you “~~~~~.”

  • Can you say what your answer means?



Qx. 19. Can you say in your own words what this question is asking?

  • The question asks about “when people in your neighborhood call 911, does help arrive quickly?”

  • Is this information that you or other people in your neighborhood would know?

  • You answered that you “~~~~~.”

  • Can you say what your answer means?



Your Household

Qx. 20. The Matrix

Before we look at what you filled in for Qx. 20, can you just run through everyone who is a part of your household?

  • Anyone else?

  • So do you have anyone who lives in your household just part time?

  • College students away at school most of the year?

  • Could you say in your own words what the instruction [POINT TO QX. 20] asks you to do?

  • Anything else?

  • Let’s look at how you filled this in. [COMPARE ANSWERS IN MATRIX TO WHAT THEY JUST TOLD YOU.]

  • Was this easy or difficult to do?



Qx. 21. Can you say in your own words what this question is asking?

  • What does “not working who wants to find a job” mean?

  • You answered that you “~~~~~.”

  • Can you say what your answer means?



Qx. 22. Before we look at your answer, could you tell me all the jobs of the members of your household?

  • Any other jobs?

[COMPARE ANSWERS TO WHAT THEY JUST TOLD YOU.]

  • Let’s look at your answers.

  • You marked “~~~~” and that would be for “~~~~~” and then you marked “~~~~~~~” and that would be for “~~~~~”.



Qx. 23. You answered that you “~~~~~.”

  • Can you say what your answer means?

  • Is this an easy or a difficult question to answer?



Qx. 24. You answered that you “~~~~~.”

  • Can you say more about that?

  • How did you figure that you have lived “~~~~” years at this address?



Qx. 25. You answered that you “~~~~~.”

  • Can you say more about that?

  • How did you figure that you have moved “~~~~~” times in the past 5 years?



Qx. 26. You wrote/did not write in your phone number. What made you write in/not write in your phone number?

  • Is this information that you usually give out?

  • Do you feel comfortable or not comfortable giving out your phone number?

  • Do you expect to get a phone call or do you expect not to be called?

  • For what reasons do you think you would be called?

  • Can you say more about that?



IF THE RESPONDENT WAS NOT A CRIME VICTIM, PROVIDE ONE OF THE CRIME SCENARIOS AND GO BACK TO EXPLORE THE SECTION ON CRIMES

IF THERE IS ENOUGH TIME REMAINING IN THE SESSION, ASK SECTION 4; ELSE SKIP TO SECTION 5/CLOSING

  4. Envelope, Survey Cover and Letters



  • Can you say something about the envelope that the survey was enclosed in? Is there anything about the envelope that you particularly like or dislike? Anything that would encourage you (or discourage you) from opening the envelope?

  • What about the cover letter that accompanied the survey? (Is there anything in the letter that you particularly like or dislike?)

  • The logo on the envelope and the letter is for the Bureau of Justice Statistics. I would like to show you some other logos that could be used on the survey materials. Please let me know what you think of these. [SHARE THE JUSTICE PROGRAM LOGO AND THE DOJ LOGO] What about the logos? Can you say more about that? Is there one you prefer to the others? If yes, can you tell me why?

  • What do you think about the front cover of the questionnaire? Is there anything that jumps out at you?



  5. Closing



  • Is there anything else you noticed about the survey that we have not discussed?

  • Was there anything that you particularly liked or disliked about the survey?

  • Do you have any other thoughts or comments about the survey?

  • Thank you for your time.



Appendix C

Crime Scenarios for 2c Screener Testing





Cognitive Testing of the NCVS Screeners – Crime Scenarios for no-criminal victimization respondents

Summary of the scenarios

These four crime scenarios represent a spread across different types of crimes. They will be used by respondents who are willing to test the NCVS screener but who have not experienced a crime victimization in the past 12 months. These respondents will participate in the cognitive testing and report the events described in the scenarios as something that happened to them.

Scenario 1

In January, your next-door neighbor had a party that was very loud and disruptive. You went to your neighbor to complain. Your neighbor called you a whiner and punched you in the face. You were taken to the hospital and treated for a broken nose. You went home after being treated and did not stay in the hospital overnight.

Scenario 2

In January, someone broke into your car and stole your car radio/CD player and your GPS. Your car was parked in your driveway. You reported it to your insurance company and to the police. You are still waiting to collect the insurance money.

Scenario 3

In January, you were at a service counter trying to rent a car. You put your cell phone and sun glasses on the counter right by where you were standing. When you turned around to go, you saw that your sun glasses and your cell phone were no longer there.

Scenario 4

In January, you were on a week-long business trip. When you arrived home, you saw that your front door had been broken open. When you walked in, a young kid ran up to you, knocked you down, and ran out of the house. You discovered that about $5,000 worth of electronic equipment and $1,000 in cash were missing. You immediately reported it to the police. You did not suffer any injuries from being knocked down.





Appendix D

Core NCVS Instruments for the NCVS CS Pilot



Core NCVS Instruments for the NCVS CS Pilot



This document describes the content of the core NCVS, as we intend to modify it for the purpose of administering the NCVS CS. To begin, we must administer the control card content as a one-time data collection. In the main NCVS this is collected at the initial household visit and then updated with each additional visit to or telephone interview with the household. Some content is not needed to support the goals of the NCVS CS, as described in section 5. The content we intend to retain intact or modify for CS purposes is outlined below.

The instruments we will deploy for the NCVS CS include the following:

  • Household screener (includes control card information, household and personal victimization screeners)

  • Personal victimization screener

  • Incident report



Content for the control card information is shown on pages 1 through 6; notes about intended adaptations of the NCVS-1 and NCVS-2 questionnaires are included on page 7.



HOUSEHOLD SCREENER

CSINTRO

Hello, I'm (INTERVIEWER) from Westat. I'm calling for the Department of Justice concerning the National Crime Victimization Survey. The Department is conducting a survey here and throughout the Nation to determine how often people are victims of crime.

Are you at least 18 years old and able to answer some questions about this household?

1. YES (GO TO VERADD_CP)

2. NO (GO TO ASK18)

-8. DON’T KNOW (GO TO ASK18)

______________________________________________________________________________

(new screen created for CS introduction)





ASK18

May I please speak with someone who usually lives there, is at least 18 years old, and is able to answer some questions about the household?

1. YES (GO TO CSINTRO)

2. NONE AVAILABLE/MAKE APPT (GO TO RESULT)

-7. NO/REFUSED (CODE REFUSAL/GO TO RESULT)

______________________________________________________________________________

(new screen created for CS introduction/contact procedures)



VERADD_CP

I have your address listed as ...

123 Main Street

Anytown, MD 12345

Is that your exact address?

1. SAME ADDRESS (GO TO TENURE)

2. DIFFERENT ADDRESS (GO TO ADDVERF)

3. NOT R’S ADDRESS (GO TO ASK18)

____________________________________________________________________________

(response categories modified for CS)



ADDVERF

What is your address?

_______ _______________________________ _____________

STNUM STNAME STTYPE

__________________________________ __ _____

CITY STATE ZIP

____________________________________________________________________________

(new screen for CS)

(BASED ON COMPUTER ALGORITHM TO COMPARE THIS WITH THE SAMPLED ADDRESS, PROCEED TO TENURE IF A MATCH AND MOVED_CP IF NOT A MATCH)





MOVED_CP

Since your address rather than you personally was chosen for inclusion in the survey, no interview is required of you at this time. Thank you for your time.

1. ENTER 1 TO CONTINUE

____________________________________________________________________________

(statement to respondent modified for CS)

(GO TO RESULT, CODE – PHONE NUMBER DOES NOT REACH SAMPLED ADDRESS)



TENURE

Are your living quarters ...

1. Owned or being bought by you or someone in your household?

2. Rented for cash?

3. Occupied without payment of cash rent?

____________________________________________________________________________



TYPEOFHOUSINGUNIT

Please select one box that describes the type of housing unit.

1. House, apartment, flat

2. HU in nontransient hotel, motel, etc.

3. HU permanent in transient hotel, motel, etc.

4. HU in rooming house

5. Mobile home or trailer with no permanent room added

6. Mobile home or trailer with one or more permanent rooms attached

7. HU not specified above - Describe

8. Quarters not HU in rooming or boarding house

9. Unit not permanent in transient hotel, motel, etc.

10. Unoccupied site for mobile home, trailer, or tent

11. Student quarters in college dormitory

12. Other unit not specified above - Describe

___________________________________________________________________________________

(this item will need modification for telephone administration for the CS, appropriate changes will be explored during cognitive testing)



NUMBEROFUNITS

How many housing units are in this structure?

1. 1

2. 2

3. 3

4. 4

5. 5-9

6. 10+

7. Mobile home/trailer

8. Only OTHER units

_____________________________________________________________________________________



(COMPLETE HHROSTER_FNAME THROUGH HHMEMBER/HSEMEMURE FOR EACH PERSON BEFORE GOING TO NEXT PERSON)

HHROSTER_FNAME

What are the names of all people living or staying here who are at least 18 years old? Start with the name of the person or one of the people who (owns/rents) this home.

ENTER FIRST NAME ON THIS SCREEN

ENTER 999 TO LEAVE THE TABLE

_____________________

__________________________________________________________________________________

(question modified to ask only for names of persons 18 or older, also not asking for last names)

(UPON ENTERING 999 FOR THE FIRST TIME, GO TO HHLDCOVERAGE TO VERIFY THE ROSTER IS COMPLETE, RETURNING TO THE ROSTER TO ADD PERSONS IF NEEDED; UPON ENTERING 999 FOR THE SECOND TIME, PROCEED TO BRTHDATEMO)



SEX

ASK IF NECESSARY

Is (HHROSTER_FNAME) male or female?

1. Male

2. Female

_________________________________________________________________________



RELATIONSHIP

What is (HHROSTER_FNAME)’s relationship to you?

11. Husband 16. Mother

12. Wife 17. Brother

13. Son 18. Sister

14. Daughter 19. Other relative

15. Father 20. Nonrelative

_______________________________________________________________________________________



HHMEMBER

Does (HHROSTER_FNAME) usually live here?

IF "NO", PROBE FOR USUAL RESIDENCE ELSEWHERE.

1. Yes (GO TO HHROSTER_FNAME FOR NEXT PERSON)

2. No

______________________________________________________________________



HSEMEMURE

Does (HHROSTER_FNAME) have a usual place of residence elsewhere?

1. Yes (DELETE FROM ROSTER, GO TO HHROSTER_FNAME FOR NEXT PERSON)

2. No (RETAIN ON ROSTER, GO TO HHROSTER_FNAME FOR NEXT PERSON)

_____________________________________________________________________



HHLDCOVERAGE

Have I missed any other adults age 18 or older living or staying here such as any lodgers or anyone who is away at present traveling or in the hospital?

1. Yes (GO TO HHROSTER_FNAME TO ADD PERSONS)

2. No

________________________________________________________________________

(Question modified to only refer to adults possibly missing from roster)







(COMPLETE BRTHDATEMO THROUGH RACE FOR EACH PERSON BEFORE GOING TO NEXT PERSON)

(AFTER SEQUENCE COMPLETED FOR LAST ADULT ON ROSTER GO TO ROSTERREVIEW)

BRTHDATEMO

What is (HHROSTER_FNAME)’s date of birth?

ENTER MONTH ON THIS SCREEN

______________________________________________________________________



BRTHDATEDY

What is (HHROSTER_FNAME)’s date of birth?

ENTER DAY ON THIS SCREEN

_______________________________________________________________________



BRTHDATEYR

What is (HHROSTER_FNAME)’s date of birth?

ENTER YEAR ON THIS SCREEN

IF THE YEAR IS LESS THAN 1890, ENTER 1890

_______________________________________________________________________

(IF BRTHDATEYR RESPONSE IS DON’T KNOW OR REFUSED, GO TO ESTAGE)



VFYAGE

That would make (HHROSTER_FNAME) (COMPUTED AGE) years old.

Is that correct?

1. Yes (GO TO MARITAL)

2. No (RETURN TO BRTHDATEMO/BRTHDATEDY/BRTHDATEYR TO CORRECT)

______________________________________________________________________







ESTAGE

Even though you don’t know (HHROSTER_FNAME)’s exact birth date, what is your best guess as to how old (he/she) was on (his/her) last birthday?

________________________________________________________________________

(IF VALID RESPONSE, GO TO MARITAL; OTHERWISE ASK AGERNG)



AGERNG

Is (he/she) ...

1. 18 - 24 years old?

2. 25 - 34 years old?

3. 35 - 49 years old?

4. 50 - 65 years old?

5. 66 years old or older?

_____________________________________________________________________

(Question and response categories modified to only probe using adult age ranges)



MARITAL

Is (HHROSTER_FNAME) now married, widowed, divorced, separated, or has (he/she) never been married?

1. Married

2. Widowed

3. Divorced

4. Separated

5. Never married

_______________________________________________________________________



ARMEDFORCES

Is (HHROSTER_FNAME) now in the Armed Forces?

1. Yes

2. No

______________________________________________________________________





EDUCATIONATTAIN

What is the highest level of school (HHROSTER_FNAME) completed or the highest degree (he/she) received?

1. 1ST GRADE 11. 11TH GRADE

2. 2ND GRADE 12. 12TH GRADE (NO DIPLOMA)

3. 3RD GRADE 13. HIGH SCHOOL GRADUATE (DIPLOMA, OR THE EQUIVALENT)

4. 4TH GRADE 14. SOME COLLEGE (NO DEGREE)

5. 5TH GRADE 15. ASSOCIATE’S DEGREE

6. 6TH GRADE 16. BACHELOR’S DEGREE (E.G. BA, AB, BS)

7. 7TH GRADE 17. MASTER’S DEGREE (E.G. MA, MS, MENG, MSW, MBA)

8. 8TH GRADE 18. PROFESSIONAL SCHOOL DEGREE (E.G. MD, DDS, DVM, LLB, JD)

9. 9TH GRADE 19. DOCTORAL DEGREE (E.G. PHD, EDD)

10. 10TH GRADE 20. NEVER ATTENDED, PRESCHOOL, KINDERGARTEN

____________________________________________________________________



SP_ORIGIN

(Are you/Is (HHROSTER_FNAME)) Spanish, Hispanic, or Latino?

1. Yes

2. No

____________________________________________________________________



RACE

Please choose one or more races that (you consider yourself to be/(HHROSTER_FNAME) considers (himself/herself) to be).

1. White 4. Asian

2. Black or African American 5. Native Hawaiian or other Pacific Islander

3. American Indian, or Alaska Native 6. Other - Specify

____________________________________________________________________

(GO TO BRTHDATEMO FOR NEXT PERSON ON ROSTER; IF LAST, GO TO ROSTERREVIEW)





ROSTERREVIEW

REVIEW ALL CATEGORIES

IS THIS INFORMATION CORRECT?

LN NAME REL AGE SEX MARITAL STATUS

______________________________________________________________________________________

1 PERSON1

2 PERSON2

3 PERSON3

X PERSONX



1. Yes (GO TO NEXT SECTION/TIMEATADDRESS)

2. No

_____________________________________________________________________________



WHOTOCHANGE

ENTER THE LINE NUMBER OF THE PERSON REQUIRING A CHANGE.

LN NAME REL AGE SEX MARITAL STATUS

______________________________________________________________________________________

1 PERSON1

2 PERSON2

3 PERSON3

X PERSONX

___________________________________________________________________________



WHATFIX

WHAT CHANGE IS NEEDED?

LN NAME REL AGE SEX MARITAL STATUS

______________________________________________________________________________________

X PERSONX



1. NAME

2. RELATIONSHIP

3. DATE OF BIRTH

4. SEX

5. MARITAL STATUS

__________________________________________________________________________

(ALLOW CHANGES TO BE MADE AS NECESSARY TO THE PERSONS AND DATA ITEMS CHOSEN IN WHOTOCHANGE AND WHATFIX. RETURN TO ROSTERREVIEW WHEN DONE WITH CHANGES; UPON SELECTION OF ‘YES’ ON ROSTERREVIEW, PERFORM RANDOM ADULT SELECTION)

RANDOM ADULT SELECTION OCCURS HERE ONCE THE ROSTER IS COMPLETE (A BRIEF SKETCH FOLLOWS THE LIST):

  • IF ONE ADULT, SELECT HOUSEHOLD SCREENER RESPONDENT

  • IF TWO ADULTS, SELECT HOUSEHOLD SCREENER RESPONDENT AND THE OTHER ADULT

  • IF THREE OR MORE ADULTS, RANDOMLY SELECT TWO ADULTS (MAY OR MAY NOT INCLUDE HOUSEHOLD SCREENER RESPONDENT)
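
A minimal sketch of this selection rule, assuming the confirmed roster is a list of records with an age field and that age 18 or older defines an adult; these representation choices are illustrative assumptions, not part of the instrument specification.

import random

def select_adults(roster, household_respondent):
    """Return the adults selected for the personal victimization screener."""
    adults = [p for p in roster if p["age"] >= 18]  # assumes 18+ defines 'adult'
    if len(adults) == 1:
        return [household_respondent]  # the sole adult is the screener respondent
    if len(adults) == 2:
        return adults  # the screener respondent plus the other adult
    # Three or more adults: randomly select two; the household screener
    # respondent may or may not be among them.
    return random.sample(adults, 2)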



HOUSEHOLD SCREENER RESPONDENT CONTINUES WITH THE NCVS-1 INSTRUMENT AND COMPLETES THE VICTIMIZATION SCREENER QUESTIONS, INCLUDING THE HOUSEHOLD VICTIMIZATION QUESTIONS.

AFTER THE HOUSEHOLD RESPONDENT COMPLETES ANY REQUIRED INCIDENT REPORTS, ONE OR TWO RANDOMLY SELECTED ADULT RESPONDENTS (AS APPROPRIATE) COMPLETE THE PERSONAL VICTIMIZATION SCREENER AND ANY REQUIRED INCIDENT REPORTS.



Household Screener Remaining Content from Core NCVS-1 and NCVS-2 Questionnaires

The full NCVS-1 questionnaire will be replicated for the household respondent, starting at item 33a (TIMEATADDRESS) on page 2 and continuing through item 45d on page 7. The identity theft questions (items 46 to 59) on pages 7 through 9 will not be asked as part of the CS.

The hate crime questions (items 161 through 166) on pages 33 through 35 of the NCVS-2 incident report questionnaire will not be asked as part of the CS.

Following collection of any required incident reports from the household respondent using the content from the NCVS-2 incident report questionnaire, the household screener interview will conclude by asking the household respondent the employment questions (items 74 to 79) on pages 10 and 11 of the NCVS-1. The very last question asked of the household respondent is item 12a (household income) on page 1 of the NCVS-1. If there is only one adult in the household, the telephone interview is complete at this point.



Sampled Adult Personal Victimization Screeners and Incident Reports

Next, the NCVS-1 questionnaire content will be repeated for the sampled adult(s), from item 33a on page 2 through item 45d on page 7, excluding any items labeled with the text “Asked of Household Respondent Only.” After completion of required incident reports, this interview will also conclude with the employment questions (items 74 to 79) on pages 10 and 11 of the NCVS-1.

In households with two or more adults, the telephone interview will be considered complete when the household screener, the personal victimization screeners with the randomly selected adults, and any required incident reports are completed. Up to three respondents may be interviewed in households with three or more adult members, in the event that neither of the two randomly selected adults is the household screener respondent.
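
Taken together, the ordering of the household and sampled-adult interviews described in the last two subsections can be summarized in the following sketch. The step functions are hypothetical placeholders (only the sequence of steps is taken from the text above), and the household representation is an assumption made for illustration.

def conduct_interview(household, steps):
    """Run the CS telephone interview for one household (hypothetical helpers).

    `household` is assumed to carry the household screener respondent and the
    randomly selected adults; `steps` maps hypothetical step names to callables.
    """
    hh = household["screener_respondent"]

    # Household respondent: NCVS-1 items 33a through 45d, including the
    # household victimization questions; then any incident reports (NCVS-2,
    # minus hate crime items 161-166); then employment items 74-79; and
    # finally household income, item 12a.
    steps["victimization_screener"](hh, include_household_items=True)
    steps["incident_reports"](hh)
    steps["employment_items"](hh)
    steps["household_income"](hh)

    # Sampled adults: personal victimization screener (items marked "Asked of
    # Household Respondent Only" are skipped), incident reports, employment.
    for person in household["sampled_adults"]:
        if person is hh:
            continue  # the household respondent was already interviewed above
        steps["victimization_screener"](person, include_household_items=False)
        steps["incident_reports"](person)
        steps["employment_items"](person)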



Appendix E

Auxiliary Materials

  • Advance letters, 2B and 2C Mail Screener

  • Cover letters, 2B and 2C Mail Screener

  • Frequently Asked Questions, Mail Screeners

  • Advance letters, Telephone Survey

  • Frequently Asked Questions, Telephone Survey

2C Screener Advance Letter

The U.S. Department of Justice is conducting a survey of [CITY] area residents to obtain information on neighborhood safety in the [CITY] area. The Department of Justice has asked Westat to administer this survey on our behalf. Westat will be mailing a brief survey to you in the coming weeks. Some households will be asked to complete a somewhat longer survey later. Results from this survey (and similar surveys conducted by the U.S. Department of Justice) are used by local citizens, by legislators, and by policymakers to develop programs to aid crime victims and to help prevent crime.

Your address is part of a scientifically selected sample of addresses chosen throughout the [CITY] area for participation in this survey. Because this is a sample survey, your answers represent not only you and your household, but also hundreds of households like yours. For this reason, your voluntary cooperation is very important. I hope you will answer all the survey questions as completely and accurately as possible. Your answers will only be used to prepare statistical summaries, and no information about your household or you as an individual can be identified from these summaries. Your data will be protected to the maximum extent provided under the law.

Answers to the most frequently asked survey questions are on the reverse side of this letter. If you would like further information, you can contact Westat at 1-800-xxx-xxxx, or you can visit the DOJ website at www.xxxx.gov.

Thank you for your cooperation. The U.S. Department of Justice appreciates your help.



Sincerely,



Jim Lynch

Bureau of Justice Statistics

U.S. Department of Justice



Content of a Spanish version of the letter

The content will be the same as the English-language letter, but with two additional sentences in paragraph 3:

Answers to the most frequently asked survey questions are on the reverse side of this letter. If you would like a copy of the survey in Spanish, or to complete the survey over the telephone, please call 1-800-xxx-xxxx. This is a free call.





2B Screener Advance Letter

The U.S. Department of Justice is conducting a survey of [CITY] area residents to obtain information about neighborhood safety. The Department of Justice has asked Westat to administer this survey on our behalf. Westat will be mailing a brief survey to you in the coming weeks. The information you provide will help inform us about your local emergency services and about your neighborhood. Results from this survey (and similar surveys conducted by the U.S. Department of Justice) are used by local citizens, by legislators, and by policymakers to develop programs to better understand neighborhood needs.

Your address is part of a scientifically selected sample of addresses chosen throughout the [CITY] area for participation in this survey. Because this is a sample survey, your answers represent not only you and your household, but also hundreds of households like yours. For this reason, your voluntary cooperation is very important. I hope you will answer all the survey questions as completely and accurately as possible. Your answers will only be used to prepare statistical summaries, and no information about your household or you as an individual can be identified from these summaries. Your data will be protected to the maximum extent provided under the law.

Answers to the most frequently asked survey questions are on the reverse side of this letter. If you would like further information, you can contact Westat at 1-800-xxx-xxxx, or you can visit the DOJ website at www.xxxx.gov.

Thank you for your cooperation. The U.S. Department of Justice appreciates your help.



Sincerely,



Jim Lynch

Bureau of Justice Statistics

U.S. Department of Justice



Content of a Spanish version of the letter

The content will be the same as the English-language letter, but with two additional sentences in paragraph 3:

Answers to the most frequently asked survey questions are on the reverse side of this letter. If you would like a copy of the survey in Spanish, or to complete the survey over the telephone, please call 1-800-xxx-xxxx. This is a free call.







2C Mail Screener - cover letter

About a week ago, your household was mailed a letter notifying you that the U.S. Department of Justice is conducting a survey of [CITY] area residents to obtain information on neighborhood safety in the [CITY] area.

Results from this survey (and similar surveys conducted by the U.S. Department of Justice) are used by local citizens, by legislators, and by policymakers to develop programs to aid crime victims and to help prevent crime.

Your address is part of a scientifically selected sample of addresses chosen throughout the [CITY] area for participation in this survey. Because this is a sample survey, your answers represent not only you and your household, but also hundreds of households like yours. For this reason, your voluntary cooperation is very important. I hope you will answer all the survey questions as completely and accurately as possible. Your answers will only be used to prepare statistical summaries, and no information about your household or you as an individual can be identified from these summaries. Your data will be protected to the maximum extent provided under the law. Please know that some households will be asked to complete a somewhat longer survey later.

Answers to the most frequently asked survey questions are on the reverse side of this letter. If you would like further information, you can contact Westat at 1-800-xxx-xxxx.

Thank you for your cooperation. The U.S. Department of Justice appreciates your help.



Sincerely,



Jim Lynch

Bureau of Justice Statistics

U.S. Department of Justice



Content of a Spanish version of the letter

The content will be the same as the English-language letter; the only difference is that the telephone number will be different.





2B Mail Screener - cover letter

About a week ago, your household was mailed a letter notifying you that the U.S. Department of Justice is conducting a survey of [CITY] area residents to obtain information on neighborhood safety in the [CITY] area.

Results from this survey (and similar surveys conducted by the U.S. Department of Justice) are used by local citizens, by legislators, and by policymakers to develop programs to better understand neighborhood needs.

Your address is part of a scientifically selected sample of addresses chosen throughout the [CITY] area for participation in this survey. Because this is a sample survey, your answers represent not only you and your household, but also hundreds of households like yours. For this reason, your voluntary cooperation is very important. I hope you will answer all the survey questions as completely and accurately as possible. Your answers will only be used to prepare statistical summaries, and no information about your household or you as an individual can be identified from these summaries. Your data will be protected to the maximum extent provided under the law.

Answers to the most frequently asked survey questions are on the reverse side of this letter. If you would like further information, you can contact Westat at 1-800-xxx-xxxx, or you can visit the DOJ website at www.xxxx.gov.

Thank you for your cooperation. The U.S. Department of Justice appreciates your help.



Sincerely,



Jim Lynch

Bureau of Justice Statistics

U.S. Department of Justice


Content of a Spanish version of the letter

The content will be the same as the English-language letter; the only difference is that the telephone number will be different.



FAQ List for the back of all Mail Screener Letters

U.S. Department of Justice Survey of Neighborhoods

What is the U.S. Department of Justice Survey of Neighborhoods?

The U.S. Department of Justice (DOJ) Survey of Neighborhoods (SON) is a survey of households in the [CITY] area to obtain information about safety in [CITY] area communities.

Who is the sponsor of this study?

The survey is sponsored by the Bureau of Justice Statistics (BJS), U.S. Department of Justice (DOJ). The survey is conducted under the authority of Title 42, United States Code, Section 3732. To learn more about BJS, you can visit them on the web at www.ojp.usdoj.gov/bjs.

How long will it take to complete this survey?

We anticipate that most households will be able to complete the mailed survey in about 5-6 minutes. Some households may be contacted later for a more detailed survey.

Am I required to complete this survey?

Participation is voluntary, and there are no penalties for refusing to answer. However, your household was randomly selected for this scientific sample survey, and you cannot be replaced with another household. Your cooperation is extremely important to help ensure the completeness and accuracy of this much-needed information.

Who will use this information?

Results from this survey (and similar surveys conducted by the U.S. Department of Justice) are used by local citizens, by legislators, and by policymakers to develop programs to aid crime victims and to help prevent crime.

Who can I call with questions?

If you would like further information, you can contact Westat at 1-800-xxx-xxxx.

How was my household chosen for this study?

Households were selected at random from all [CITY]-area residential addresses. By selecting households randomly, we will be able to create scientific estimates about households in the [CITY] area. It's important to participate, so that we have an accurate picture of the [CITY] area community.

How do I know you'll keep my information confidential?

We are required to keep your information confidential to the full extent protected by law. After the study is completed, any identifying information (your address and phone number) is removed from the data file and destroyed.

Telephone Survey Advance Letter – for 2B sample cases that did not receive a mail screener

The U.S. Department of Justice is conducting a survey of Chicago area residents to obtain information on the type and amount of crime committed against households and individuals in the Chicago area. The Department of Justice has asked Westat to administer this survey on our behalf. Westat will be calling your home in the next few weeks to complete an interview with your household. The information you provide will help inform us about how much crime there is and about what types of crimes have happened recently to [Chicago area] residents. Results from this survey (and similar surveys conducted by the U.S. Department of Justice) are used by local citizens, by legislators, and by policymakers to develop programs to aid crime victims and to help prevent crime.

Your address is part of a scientifically selected sample of addresses chosen throughout the [Chicago area] for participation in this survey. Because this is a sample survey, your answers represent not only you and your household, but also hundreds of households like yours. For this reason, your voluntary cooperation is very important. I hope you will answer all the survey questions as completely and accurately as possible. Your answers will only be used to prepare statistical summaries, and no information about your household or you as an individual can be identified from these summaries. Your data will be protected to the maximum extent provided under the law.

Answers to the most frequently asked survey questions are on the reverse side of this letter. If you would like further information, you can contact Westat at 1-800-xxx-xxxx, or you can visit the DOJ website at www.xxxx.gov.

Thank you for your cooperation. The U.S. Department of Justice appreciates your help.



Sincerely,



Jim Lynch

Bureau of Justice Statistics

U.S. Department of Justice



Content of a Spanish version of the letter

The content will be the same as the English-language letter.



Telephone Survey Advance Letter – for sample cases that did not respond to the mail screener

A few weeks ago, your household was mailed a survey of [CITY] area residents seeking information on neighborhood safety in the [CITY] area.

Results from this survey, conducted by the U.S. Department of Justice, are used by local citizens, by legislators, and by policymakers to develop programs to aid crime victims and to help prevent crime.

This is an important research study. Your cooperation is vital to help ensure the completeness and accuracy of this much-needed information.

Please be aware that a Westat interviewer will be calling your home soon to complete this survey.

If you would like to schedule an interview time, you can contact Westat at 1-800-xxx-xxxx.

Thank you for your cooperation. The U.S. Department of Justice appreciates your help.



Sincerely,



Jim Lynch

Bureau of Justice Statistics

U.S. Department of Justice


Content of a Spanish version of the letter

The content will be the same as the English-language letter.



FAQ List for the back of the 2B Telephone Advance Letters



U.S. Department of Justice Chicagoland Crime Victimization Survey (CCVS)

What is the U.S. Department of Justice Chicagoland Crime Victimization Survey (CCVS)?

The U.S. Department of Justice (DOJ) Chicagoland Crime Victimization Survey (CCVS) is a survey based on a sample of households in the Chicago area. It is designed to obtain information about persons victimized by certain types of crimes, such as theft, burglary, motor vehicle theft, robbery, and assault in Chicagoland communities.

Who is the sponsor of this study?

The survey is sponsored by the Bureau of Justice Statistics (BJS), U.S. Department of Justice (DOJ). The survey is conducted under the authority of Title 42, United States Code, Section 3732. To learn more about BJS, you can visit them on the web at www.ojp.usdoj.gov/bjs.

How long will it take to complete this telephone survey interview?

We anticipate that the interview will take about 25 minutes. However, this is only an estimate, as it will depend on your household’s experiences in the last year.

Am I required to complete this survey?

Participation is voluntary, and there are no penalties for refusing to answer. However, your household was randomly selected for this scientific sample survey, and you cannot be replaced with another household. Your cooperation is extremely important to help ensure the completeness and accuracy of this much-needed information.

Who will use this information?

Results from this survey (and similar surveys conducted by the U.S. Department of Justice) are used by local citizens, by legislators, and by policymakers to develop programs to aid crime victims and to help prevent crime.

Who can I call with questions?

If you would like further information, you can contact Westat at 1-800-xxx-xxxx.

How was my household chosen for this study?

Households were selected at random from all Chicago-area residential addresses. By selecting households randomly, we will be able to create scientific estimates about households in the Chicago area. It's important to participate so that we have an accurate picture of the Chicagoland community.

How do I know you'll keep my information confidential?

We are required to keep your information confidential to the full extent protected by law. After the study is completed, any identifying information (your address and phone number) is removed from the data file and destroyed.

Appendix F

Informed Consent Form for Cognitive Interviews





Consent Form



Thank you for your interest in helping us test a survey for a study we are conducting for the Bureau of Justice Statistics (BJS). BJS is interested in completing a survey of neighborhood concerns and safety in the [city name] area.

Today’s session will involve completing a survey and then answering questions about the experience of completing it. The feedback will help us to develop recommendations for improving the survey.

This is a research project and your participation is voluntary. The only cost to you is the time and effort to answer questions. We expect that today’s interview will last about 60 minutes.

You may skip any question that you do not want to answer, both in the questionnaire and in the discussion afterwards. Everything covered today will be treated as confidential. You will never be identified by name. The things you say may be put in a written summary of this discussion, but there will be no way to identify who said what, and your name will not be used anywhere.

While there are no direct benefits to you for participating in this study, you will be helping with an important research project. Also, as a token of our appreciation, we have $40 cash for you once the interview is complete.

If you have any questions about this study, you can call Sherm Edwards, at 301-294-2010.

We would like to record the interview. Sometimes listening to a recording of the interview helps us make final improvements to the survey. If the recording is reviewed later, it will only be by a few Westat staff and possibly by some of the staff at BJS.









I have read and understand the statements above. I consent to participate in this session.




Signature: _________________________________ Date: _______________

Appendix G

Recruiting Script and Screener





Screener for the NCVS Cognitive Tests



NAME___________________________________ PHONE_____________________________



RESPONDENT Number |___|___|___|___|



We need people with diverse backgrounds to test the questionnaire. We are going to ask you some questions about yourself so we can make sure that people with varying backgrounds are represented in the testing.



1. What is your age? |___|___|

AGE

IF AGE 17 OR YOUNGER (END STATEMENT)

AGES 18+ (GO TO 2)



2. Have you been the victim of a crime in the last 12 months?

YES 1

NO 2 (GO TO 5)



3. Can you briefly explain what happened?









4. What month did the crime happen?

[IF MORE THAN ONE, RECORD THE MOST RECENT]

|______________|

MONTH

IF CRIME OCCURRED MORE THAN 12 MONTHS AGO, GO TO END



5. Do you consider yourself to be Hispanic?

YES 1

NO 2



6. What is your race? Please select one or more.

White 1

Black or African American 2

Asian 3

Native Hawaiian or other Pacific Islander 4

American Indian or Alaska Native 5



7. What is the highest level of education you have completed?

8th grade, some high school, but did not get a diploma 1

12th grade, high school diploma, or GED 2

Some college, Associate’s degree, bachelor’s degree or higher 3



8. Can we add you to our database and call you for other types of studies?

Yes ……………………………………………………………..1

No ………………………………………………………………2



9. Do you have any family, friends, or acquaintances who would be interested in participating in this or future studies? Can you give them our number and ask them to call us?



Yes ……………………………………………………………...1

No ………………………………………………………………2



ELIGIBLE RESPONDENTS – AGE 18+ AND CRIME VICTIMS IN THE LAST 12 MONTHS:

It appears you are eligible for our study. We’d like to schedule an interview here at Westat. Let me read you the days and times I have available, and you can tell me what would be best for you. This will take about 1 hour of your time and we will pay you $40 cash. May I please have your full name and address? (We need your address so that we can send you directions to Westat. It will include a map that shows you exactly where the room is).



Full Name: |______________________________________________________________|



Address: |________________________________________________________________|



E-mail address: |___________________________________________________________|



I will send the directions out to you shortly. If you have to cancel your interview, please call back so that we can schedule someone in your place, OK? You can reach me at: [Recruiter’s Phone Number].



POTENTIALLY ELIGIBLE RESPONDENTS:

  1. AGE 18+

  2. CRIME VICTIMS MORE THAN 12 MONTHS AGO, OR NEVER CRIME VICTIMS



It appears you might be eligible for our study. We’ll call you back in a few days to schedule for this study or talk to you about some other studies. May I please have your full name and address? (We need your address so that we can send you directions to Westat. It will include a map that shows you exactly where the room is).



Full Name: |______________________________________________________________|



Address: |________________________________________________________________|



E-mail address: |___________________________________________________________|



I will send the directions out to you shortly. If you have to cancel your interview, please call back so that we can schedule someone in your place, OK? You can reach me at: [Recruiter’s Phone Number].







END STATEMENT – NOT 18 YEARS OF AGE:

It appears that you are probably not eligible for our study. Thank you very much for your interest and willingness to help out.



Appendix H

Recruiting Advertisement Material







Survey Research – Receive $40

Reply to: (see message body)
Date: 2011-0X-0X

Westat, a social science research company, needs individuals who have been a victim of a crime during the last 12 months to participate in an interview for a crime study. Adults, ages 18 and over, men and women, are encouraged to participate. We are particularly interested in men and women who have up to a high school education or did not complete high school. The interview will take place in Rockville, MD, and will last about 60-90 minutes. If you are interested, send an email to [email protected] or call 1-800-937-8281, Ext. xxxx, and include your name, telephone number, email address, and the best time to reach you. This is a federally sponsored study.
WESTAT
EOE




Help Test a Survey for a Study on Crime Victimization



Westat seeks individuals who have been a victim of a crime during the last 12 months to participate in a crime study. Interviews will be held on site at Westat, in Rockville, MD.

Participants will be paid $40. Interviews will last approximately 1 hour. No special knowledge is needed.

To register, email [email protected] (Xxxxx) or call toll-free at 1-800-937-8281. All information from this interview will be treated as confidential.

TAKE ONE


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]



Survey Research – Receive $40

Reply to: (see message body)
Date: 2011-0X-0X

Westat, a social science research company, needs individuals to participate in an interview for a crime study. Adults, ages 18 and over, men and women, are encouraged to participate. We are particularly interested in men and women who have up to a high school education or did not complete high school. The interview will take place in Rockville, MD, and will last about 60-90 minutes. If you are interested, send an email to [email protected] or call 1-800-937-8281, Ext. xxxx, and include your name, telephone number, email address, and the best time to reach you. This is a federally sponsored study.
WESTAT
EOE



Help Test a Survey for a Study on Crime Victimization



Westat seeks individuals to participate in a crime study. Interviews will be held on site at Westat, in Rockville, MD.

Participants will be paid $40. Interviews will last approximately 1 hour. No special knowledge is needed.

To register, email [email protected] (Xxxxx) or call toll-free at 1-800-937-8281. All information from this interview will be treated as confidential.

TAKE ONE


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]


Westat Study Volunteers

1-800-937-8281, Ext. xxxx

[email protected]




