Supplementary materials


Research to support the National Crime Victimization Survey (NCVS)


OMB: 1121-0325


Table of Contents


A. Development of an Approach for Obtaining Small-Area Estimates from the NCVS

B. Sampling and Data Collection Approaches for an NCVS Companion Survey

C. Flow Charts Illustrating Approach 2B and 2C Methodologies

D. Mail Screener Cover Letter for 2B Sample and 2C (Localized and Generic)

<Instruments E-H Submitted Under Separate Cover>

E. Approach 2B Mail Screener

F1. Approach 2C Mail Screener (local version)

F2. Approach 2C Mail Screener (generic version)

G. NCVS-1 Core (Basic Screener Questionnaire)

H. NCVS-2 Core (Crime Incident Report)

I. Adaptation of the Core NCVS Instruments for the NCVS CS Pilot

J. Draft Advance Letter for 2B Phone Sample

K. Content for the Postcard Reminder

L. Analysis Plan





Attachment A


Development of an Approach for Obtaining Small-Area Estimates from the NCVS


This attachment describes some of the approaches that were considered for producing small-area estimates (SAEs) of victimization rates and characteristics, including blended estimates. The first sub-section briefly reviews general methods that could be used to produce these SAEs and discusses their major advantages and disadvantages. The second sub-section focuses on the blended approach and on the data collection and cost issues associated with alternative strategies within that realm.


Methods for obtaining small area estimates (SAEs) of victimization rates


  A. Model-based estimation with currently available information

Model-based methods predict the victimization rate from administrative data or other sources, using a regression model. If there is also a direct estimate of victimization for that area from the NCVS, the SAE is a weighted average of the NCVS estimate and the prediction from the regression model; if the NCVS has no sample in the area, the SAE is the regression prediction. If the assumed regression model is correct, the resulting SAE is unbiased under the model and has smaller mean squared error than using just the direct estimate from the NCVS alone.
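
To make the weighting concrete, a standard composite (shrinkage) form is sketched below as an illustration; the notation is generic and is not taken from a specific NCVS production method. For area $a$,

\[
\hat{\theta}_a^{\mathrm{SAE}} \;=\; \gamma_a\,\hat{\theta}_a^{\mathrm{NCVS}} \;+\; (1-\gamma_a)\,\mathbf{x}_a'\hat{\boldsymbol{\beta}},
\qquad
\gamma_a \;=\; \frac{\sigma_u^2}{\sigma_u^2+\psi_a},
\]

where $\hat{\theta}_a^{\mathrm{NCVS}}$ is the direct estimate with sampling variance $\psi_a$, $\mathbf{x}_a$ contains the auxiliary variables, and $\sigma_u^2$ is the between-area model variance. Areas with no NCVS sample effectively receive $\gamma_a = 0$, so their SAE is the regression (synthetic) prediction alone, as noted above.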


Advantages: SAE methods have been studied for more than 30 years and have been used in applications ranging from poverty estimation to disease mapping. The regression modeling can be done at a small area, household, or person level, depending on the information available. The models “borrow strength” from other, similar areas to achieve improved predictions. A major advantage is that these methods do not incur additional data collection costs.


Disadvantages: The quality of model-based estimators depends strongly on the model assumptions. The model-based methods require high-quality, consistently reported auxiliary information that is highly correlated with the outcomes (victimization). Auxiliary information is more likely to be available at the MSA level than at the person level; this information includes Census data on education, labor force characteristics, percentage owning homes, and SES, as well as administrative statistics collected by BJS. It is also possible to use the Uniform Crime Reports (UCR) as a source of auxiliary information. The UCR data, however, are incomplete, have biases that may be differential across small areas, and are not always highly correlated with NCVS estimates, at least at the MSA level. The UCR data appear to be an excellent source of auxiliary information for property crime, but the correlations between UCR and NCVS violent crime rates are low and sometimes negative. Current methods for combining information from different sources for SAEs assume that the UCR quantities are unbiased or have a bias that is constant for all areas; due to the voluntary nature of reporting and varying data quality by jurisdiction, this assumption is not met for UCR data.


  B. Model-based estimation with additional auxiliary information collected through a survey

The main drawback of approach (A) is the limited quality and coverage of the available auxiliary information. One possible solution is to collect better auxiliary information, for example through a large mail survey in each state or in targeted areas. Such a survey could collect brief information about victimization, attitudes about crime, and similar variables.


Advantages: The mail survey could produce information of interest in its own right such as attitudes about crime or the police, as well as auxiliary information to be used in producing SAEs of victimization rates and characteristics, at a relatively low cost. A mail survey gives flexibility for moving the sample over time, to achieve greater precision in targeted geographic areas.


Disadvantages: Concepts of victimization may differ in the two surveys, and the differences may vary across demographic groups. This is a potential source of differential bias, although such biases may be more amenable to modeling than the area-to-area differential biases in the UCR. As with all model-based methods, the quality of SAEs depends on how well the assumed model fits the data. In particular, the model must be trustworthy for areas that have no NCVS sample, since in those areas estimates depend entirely on the model. The method is likely to improve accuracy of SAEs of victimization rates in broad categories; it is less likely to improve SAEs of more detailed characteristics of victimizations.


  C. Blended estimates from two surveys

An independent companion survey (CS) on victimization is conducted, and estimates from the CS are blended with those from the NCVS. The two surveys share a common concept of victimization and may even share a common instrument and data collection strategy, although these are not essential. The data collection approaches and issues for this approach are covered in the next section.


Advantages: If the CS is undertaken using lower cost data collection methods and modes, the cost of achieving more precise SAEs can be substantially lower by using a CS than by increasing the NCVS sample size. This approach gives more information than approach (B) on details of victimization that can be used for type-of-crime classification and variables of interest such as weapon use. Different methods of blending the estimates are possible. One possibility is using dual frame survey estimation methods to combine CS and NCVS estimates for SAE. Alternatively, the CS could be used as auxiliary information in a model-based approach for SAE. If an address-based sample is used for the CS, detailed auxiliary information from the Census, the UCR, and police jurisdictions can be used in the design of the survey, thus improving efficiency relative to the PSU-based NCVS. As with approach (B), the sampling design is flexible and sample can be easily moved over time to give increased precision in different areas.
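
One simple form of blending, shown here as an illustrative sketch rather than the method that would necessarily be adopted, is a precision-weighted combination of the two direct estimates for an area:

\[
\hat{\theta}_a^{\mathrm{blend}} \;=\; \lambda_a\,\hat{\theta}_a^{\mathrm{NCVS}} + (1-\lambda_a)\,\hat{\theta}_a^{\mathrm{CS}},
\qquad
\lambda_a \;=\; \frac{v_a^{\mathrm{CS}}}{v_a^{\mathrm{NCVS}}+v_a^{\mathrm{CS}}},
\]

where $v_a^{\mathrm{NCVS}}$ and $v_a^{\mathrm{CS}}$ are the estimated variances of two approximately unbiased, independent estimates. If the CS estimate carries a bias $b_a$, an MSE-minimizing weight would replace $v_a^{\mathrm{CS}}$ with $v_a^{\mathrm{CS}}+b_a^2$, which is where the bias models discussed under the disadvantages below would enter.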


Disadvantages: The quality of data from the CS may not be as high as that from the NCVS. If the CS is done by a different mode or has different response rates or interviewer effects, the sources and directions of bias in the CS and NCVS may differ. Models to estimate these biases must be developed. The statistical literature for blending biased estimates is currently very limited and new statistical methods must be developed to tackle this challenging problem.


  D. Better direct measurements of victimization

SAEs obtained by direct measurement, either through increased NCVS sample sizes or better NCVS allocation in stratified sampling, rely only on the victimization reports of the respondents. Several options exist for obtaining better direct measurements:

  1. Increased sample sizes with the current design.

  2. Improved stratification with higher concentrations of victims in strata with high sampling fractions.

  3. A two-phase sample employing an inexpensive but fallible screener followed by the NCVS. For this to be cost-effective, in most situations the sum of specificity and sensitivity of the screener should exceed 1.6.

  4. Use of a dual frame method, in which Frame A is the general population sampling frame used in the NCVS and Frame B, an incomplete frame, has a high concentration of victims. Information for constructing Frame B might be available in individual law enforcement jurisdictions, if they have contact information for crime victims and consent can be obtained. A challenge in this approach for obtaining SAEs is that the Frame B membership of NCVS respondents may be unknown, due to inaccuracies in responses about reporting crime to the police as well as differential agency responses to recording crimes. Record linkage might resolve some of these issues, although NCVS respondents who live in Phoenix but were victimized in San Francisco may be difficult to classify. Because Frame B is small relative to Frame A and has a much higher proportion of victims, small inaccuracies in determining frame membership can result in large effects on estimated victimization rates.
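
For the dual frame option in item 4, one standard composite estimator of the Hartley type is sketched below as an illustration; because Frame B is contained in Frame A, the population splits into an A-only domain ($a$) and an overlap domain ($ab$):

\[
\hat{Y} \;=\; \hat{Y}_a^{A} \;+\; \theta\,\hat{Y}_{ab}^{A} \;+\; (1-\theta)\,\hat{Y}_{ab}^{B},
\qquad 0 \le \theta \le 1,
\]

where $\hat{Y}_{ab}^{A}$ and $\hat{Y}_{ab}^{B}$ estimate the overlap-domain total from the Frame A and Frame B samples, respectively, and $\theta$ is chosen to minimize variance. Misclassifying frame membership shifts cases between the $a$ and $ab$ domains, which is why the small, victim-rich Frame B is so sensitive to such errors.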


Advantages: Direct estimates do not require modeling and therefore do not require the model assumptions of approaches (A)-(C). They may be thought of as the gold standard for quality of estimates.


Disadvantages: For many designs, obtaining a sufficient sample size for SAEs is expensive. This is the most costly of the four methods.


Attachment B

Sampling and Data Collection Approaches for an NCVS Companion Survey


Here we consider three different approaches to conducting a Companion Survey (CS) in small areas such as MSAs. All three assume centralized telephone interviewing to collect data to support blended estimates (Approach 2.1(C)); one would also provide data to support model-based SAE (Approach 2.1(B)). They differ in what sample frame underlies the design and in how initial contacts with households are made. In-person follow-up for selected nonresponse is feasible with any of these approaches, although it is more limited with the RDD survey.



  A. Random-digit-dial survey

Traditional RDD designs using only landline frames are becoming increasingly rare as their coverage of the household population continues to decline. The design we will consider includes samples drawn from numbers assigned to both landline and cellular service, with cell numbers screened to identify cell-only households.


Cost: We will use the cost of a completed interview (household victimization screener at a minimum) for landline RDD as a metric, assuming the same number of completed interviews across different approaches. If that cost is 1, then the cost per cell-only household RDD complete is about 4, and the cost for a two-frame design where 11% of the completed interviews are with cell-only households is about 1.4.
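
As a rough check on the blended figure (an illustrative back-of-envelope calculation, not the costing actually used), weighting the two frames by their shares of completed interviews gives

\[
c_{\mathrm{dual}} \;\approx\; 0.89\,(1) + 0.11\,(4) \;\approx\; 1.33,
\]

which is broadly consistent with the figure of about 1.4 quoted above.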


Response Rate: We would anticipate a screening response rate of 30-40% in large MSAs, and 70-80% for the substantive interview, for a net of 20-30%.


Advantages: RDD methodology is well-tested. Instrument design is relatively straightforward, and in most cases the entire data collection can be completed in one or two contacts with the household.


Disadvantages: The potential for bias due to undercoverage and nonresponse is high. There is limited ability to stratify geographically within MSAs. The cell sample would be less geographically efficient than the landline sample. Any in-person follow-up to study nonresponse bias would be limited to telephone numbers for which an address could be obtained. We would expect only about 50-60% of sampled landline telephone numbers to have a matched address after purging nonworking and business numbers, and some percentage (up to 20%) of these addresses would be incorrect. There is as yet no reliable way to match cell numbers to addresses, so in-person follow-up would not be possible for the cell sample.



  B. Address-based sample (ABS) with mail survey to obtain telephone numbers

This approach begins with selection of a sample of addresses from a vendor-enhanced version of the USPS Delivery Sequence File. We would then obtain telephone numbers for these addresses from vendor services. For addresses without a vendor telephone number, we would attempt to obtain one by mail using two or three mailings; the content of the mail piece would be limited and essentially non-substantive. We would then proceed with telephone interviewing in much the same way as for the RDD design. During the telephone interview, the respondent would be asked to verify that the residence is at the sampled address, since a proportion of the vendor numbers are not correct. Any sampled address whose vendor telephone number proves incorrect (about 20% will not even be working residential numbers) will be placed into the mail process to obtain a telephone number.

Cost: We estimate the per-complete cost as about 1.1 times that for a landline RDD case. Thus, this approach is about 20% less expensive than Approach 2.2(A).


Response Rate: We estimate we would obtain vendor telephone numbers for about 50% of sampled addresses. While we do not have direct experience with the non-substantive screening approach, we assume about 40% of those mailed will provide a telephone number. About 20% of vendor-acquired telephone numbers would not be working or residential, and we estimate about 10-15% will be working but not actually be for the sampled address. Thus, about a third of the addresses with vendor-provided numbers would be cycled through the mail process, and we again assume about 40% response. In the end, we assume we would have good telephone numbers for about 60% of the addresses. Assuming a 40-50% screening rate (higher than RDD for a couple of reasons) and 80% for the substantive interview, the net response rate would be about 20%.
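
The arithmetic behind these figures can be traced with the small sketch below; the rates are midpoints of the ranges quoted above and are assumptions for illustration, not estimates from the pilot.

def approach_2b_net_response(
        vendor_match=0.50,     # addresses with a vendor telephone number
        vendor_bad=0.20,       # vendor numbers not working or not residential
        vendor_wrong=0.125,    # working vendor numbers at the wrong address (10-15%)
        mail_yield=0.40,       # share of mailed addresses returning a number
        screener_rate=0.45,    # telephone screening response (40-50%)
        interview_rate=0.80):  # substantive interview response
    good_vendor = vendor_match * (1 - vendor_bad - vendor_wrong)
    # Bad vendor numbers and unmatched addresses are cycled through the mail request.
    recycled = vendor_match * (vendor_bad + vendor_wrong) * mail_yield
    unmatched = (1 - vendor_match) * mail_yield
    good_numbers = good_vendor + recycled + unmatched
    return good_numbers, good_numbers * screener_rate * interview_rate

coverage, net = approach_2b_net_response()
print(round(coverage, 2), round(net, 2))   # roughly 0.60 and 0.22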


Advantages: ABS allows geographic stratification within MSAs and has very good coverage. The telephone instrumentation would be very similar to that of the RDD approach. It is less expensive than RDD. In-person follow-up would be straightforward, and the sample for follow-up could be clustered within MSAs to reduce cost. (With any ABS approach there is an issue with post office boxes that are not tied to a street address, but these are a small percentage of the frame.)


Disadvantages: The response rate is likely to be comparable to or even lower than RDD. It is also likely that there will be a differential nonresponse for those with and without valid matching telephone numbers.



  C. ABS with mail screener and telephone interviewing

This approach may be called the “two-phase ABS hybrid.” The sample selection would be the same as that for Approach 2B, but every sampled household would be mailed a brief screener questionnaire. The content of the screener could (a) support model-based estimation as described in Approach 2.1(B), (b) provide data that are expected to be highly correlated with victimization incidents to support stratification for the second-phase (telephone) survey, and (c) yield telephone numbers for a large portion of those returning the survey. Nonrespondents for whom telephone numbers are obtained from a vendor would also be available for the telephone interview. The telephone follow-up would proceed essentially the same way as in Approach 2B.


A key aspect of this approach is subsampling after the screener based on likelihood of victimization. The plan is to stratify returns into high and low likelihood based on answers to screener questions, and oversample (likely take all of) those in the high likelihood stratum. The goal is to increase the number of victimizations reported without increasing the number of second-phase telephone interviews conducted. The success of this approach depends on the sensitivity and specificity of the predictor questions; the pilot will provide a chance to assess these.


Cost: The mailing would be more expensive than that for Approach 2B because the entire sample would be mailed, and each substantive screener would likely be somewhat longer. The second-phase telephone interview would be somewhat less expensive because almost all of those followed up would already have cooperated with the screener. On balance, assuming no subsampling after the screening, the per-complete cost would be about 1.2 times that of a landline RDD complete, or about 10% more than Approach 2B.


The relative cost with subsampling for the pilot depends on the sampling rates in the two strata. If we assume the high likelihood stratum is sampled with probability one, then we might subsample the low likelihood stratum by taking only half of them for the telephone interview. For discussion, assume the high likelihood stratum is 20% of the respondents. If this approach were to be followed and we wished to maintain the total number of completed telephone interviews, we would nearly double the initial sample (and the mailing costs) and reduce the total sample for follow-up by 1/3. This would increase the total cost by about 20%, bringing it up to about the level of the RDD design (Approach 2A). An alternative approach is to attempt to retain the same number of completed telephone interviews with at least some victimization. This approach could be much less expensive if the screener instrument is effective. Whether either of these implementations of this approach to a CS design is cost-effective for producing blended estimates would depend on the sensitivity and specificity of the predictor questions in the screener. Let


S1 = specificity = P (mail survey classifies HH as nonvictim HH | NCVS classifies HH as nonvictim HH) and


S2 = sensitivity = P (mail survey classifies HH as victim HH | NCVS classifies HH as victim HH).


Let c_j denote the cost per interview in phase j, for j = 1, 2. The ratio of the standard error for estimating prevalence under the optimal two-phase design to the standard error for estimating prevalence using only the CS under the same budget is (McNamee, 2003):

\[
\sqrt{(1 - S_2)\,S_1} \;+\; \sqrt{(1 - S_1)\,S_2} \;+\; \rho\,\sqrt{c_1/c_2},
\]

where \(\rho\) is the Pearson correlation between the NCVS classification and the screener classification. If both sensitivity and specificity are high, the two-phase design can result in more accurate estimates of victimization prevalence.


Using both surveys produces two levels of information that can be used to improve SAEs: the CS at phase 2 can be blended with the NCVS, and the mail survey at phase 1 can provide high-quality auxiliary information for model-based SAEs of victimization. Such a design also allows exploration of multivariate relationships between victimization and attitudes about crime.


Response Rate: We would expect about a 50-55% response to the screener, and to get (either from the respondent or a vendor) telephone numbers for about 85%. Assuming 70-80% response to the telephone follow-up, the net would be in the 30-35% range. These rate estimates would vary depending on the particular geographic area(s) being surveyed.


Advantages: Besides the ABS advantages listed for 2B, this approach would likely increase the yield of victimization reports to support blended SAEs and provide correlates for model-based SAEs. Because of the higher yield, estimates of characteristics associated with victimizations would be more accurate. Based on research done for the National Household Education Survey, we believe that the response rates would be higher than for either 2A or 2B. The design also allows exploration of relationships between victimization and questions such as attitudes about crime that may be asked in the screener.


Disadvantages: Likely somewhat more expensive than 2B for a given total achieved sample size, although it could be considerably less expensive per reported incident.




Attachment C

Flow Charts Illustrating Approach 2B and 2C Methodologies


Figure 1. Approach 2B: Telephone Harvest




Figure 2. Approach 2C: Two-phase ABS Hybrid




Attachment D


Mail Screener Cover Letter

For 2B sample and 2C-localized language







{DATE}


Dear Chicago Resident:


Please complete the enclosed survey from the U.S. Department of Justice, Bureau of Justice Statistics. The survey is about your neighborhood and what it is like for you to live in your neighborhood. Results from this survey will be used to better understand the needs of neighborhood residents.


Your address is part of a random sample of addresses chosen throughout the Chicago area. This is part of a scientific study and your answers represent not only you and your household, but also hundreds of households like yours. For this reason, your voluntary cooperation is very important.


The enclosed questionnaire is the first part of a two-part survey. Some households will be asked to complete a longer telephone survey later. The information you provide will be used for statistical purposes only and may not be disclosed, or used, in identifiable form for any other purpose as required by law (Title 42, U.S. Code, Section 3789g). Your responses will be combined with those of others to produce statistical summaries about crime and safety.


Answers to the most frequently asked questions about this survey are included in the questionnaire. Section 3732 of the Justice Systems Improvement Act of 1979 authorizes the Bureau of Justice Statistics to conduct this survey. If you would like further information, you can contact our survey support at 1-888-544-1070 or you can visit the BJS website at www.bjs.gov/ncvspilot.cfm.


Neighborhood issues affect all people in the greater Chicago area. Thank you for your generous cooperation. The U.S. Bureau of Justice Statistics appreciates your help in this very important survey.


Sincerely,


James P. Lynch, Ph.D.

Director, Bureau of Justice Statistics

Office of Justice Programs

U.S. Department of Justice


Mail Screener Cover Letter

For 2C-generic language







{DATE}


Dear Chicago Resident:


Please complete the enclosed survey from the U.S. Department of Justice, Bureau of Justice Statistics. This survey is about your neighborhood and what it is like for you to live in your neighborhood. Results from this survey will be used to better understand the needs of neighborhood residents.


Your address is part of a random sample of addresses in the United States. This is part of a scientific study and your answers represent not only you and your household, but also hundreds of households like yours. For this reason, your voluntary cooperation is very important.


The enclosed questionnaire is the first part of a two-part survey. Some households will be asked to complete a longer telephone survey later. The information you provide will be used for statistical purposes only and may not be disclosed, or used, in identifiable form for any other purpose as required by law (Title 42, U.S. Code, Section 3789g). Your responses will be combined with those of others to produce statistical summaries about crime and safety.


Answers to the most frequently asked survey questions are included in the questionnaire. Section 3732 of the Justice Systems Improvement Act of 1979 authorizes the Bureau of Justice Statistics to conduct this survey. If you would like further information, you can contact our survey support at 1-888-544-1070 or you can visit the BJS website at www.bjs.gov/ncvspilot.cfm.


Neighborhood issues affect all Americans. Thank you for your generous cooperation. The U.S. Bureau of Justice Statistics appreciates your help in this very important survey.


Sincerely,


James P. Lynch, Ph.D.

Director, Bureau of Justice Statistics

Office of Justice Programs

U.S. Department of Justice


Instruments Submitted Under a Separate Cover


Appendix E: Approach 2B Mail Screener

Appendix F1: Approach 2C Mail Screener (local version)

Appendix F2: Approach 2C Mail Screener (generic version)

Appendix G: NCVS-1 Core (Basic Screener Questionnaire)

Appendix H: NCVS-2 Core (Crime Incident Report)



Attachment I


Adaptation of the Core NCVS Instruments for the NCVS CS Pilot

Core NCVS Instruments for the NCVS CS Pilot


This document describes the content of the core NCVS, as we intend to modify it for the purpose of administering the NCVS CS. To begin, we must administer the control card content as a one-time data collection. In the main NCVS this is collected at the initial household visit and then updated with each additional visit to or telephone interview with the household. Some content is not needed to support the goals of the NCVS CS, as described in section 5. The content we intend to retain intact or modify for CS purposes is outlined below.


The instruments we will deploy for the NCVS CS include the following:

  • Household screener (includes control card information, household and personal victimization screeners)

  • Personal victimization screener

  • Incident report


Content for the control card information is shown on pages 1 through 6; notes about intended adaptations of the NCVS-1 and NCVS-2 questionnaires are included on page 7.



HOUSEHOLD SCREENER

CSINTRO

Hello, this is {DISPLAY D1} and I am calling for a research study being conducted {in the Chicago area} by the Bureau of Justice Statistics. Have I reached {PHONE NUMBER}?

IF YES:

IF LANDLINE: Are you a member of this household at least 18 years old?

IF CELL PHONE: If you are currently driving a car or doing any activity that requires your full attention, I need to call you back at a later time.

IF 18 AND NOT DRIVING/UNSAFE CONDITION:

I’d like to confirm your address. [INTERVIEWER CONFIRMS THE ABS ADDRESS]


IF ADDRESS CONFIRMED THEN THE INTERVIEW IS STARTED WITH BACKGROUND QUESTIONS ABOUT THE HH AND ITS MEMBERS

______________________________________________________________________________

(new screen created for CS introduction)



ASK18

May I please speak with someone who usually lives there, is at least 18 years old, and is able to answer some questions about the household?

1. YES (GO TO CSINTRO)

2. NONE AVAILABLE/MAKE APPT (GO TO RESULT)

-7. NO/REFUSED (CODE REFUSAL/GO TO RESULT)

______________________________________________________________________________

(new screen created for CS introduction/contact procedures)


VERADD_CP

I have your address listed as ...

123 Main Street

Anytown, MD 12345

Is that your exact address?

1. SAME ADDRESS (GO TO TENURE)

2. DIFFERENT ADDRESS (GO TO ADDVERF)

3. NOT R’S ADDRESS (GO TO ASK18)

____________________________________________________________________________

(response categories modified for CS)


ADDVERF

What is your address?

_______ _______________________________ _____________

STNUM STNAME STTYPE

__________________________________ __ _____

CITY STATE ZIP

____________________________________________________________________________

(new screen for CS)

(BASED ON COMPUTER ALGORITHM TO COMPARE THIS WITH THE SAMPLED ADDRESS, PROCEED TO TENURE IF A MATCH AND MOVED_CP IF NOT A MATCH)



MOVED_CP

Since your address rather than you personally was chosen for inclusion in the survey, no interview is required of you at this time. Thank you for your time.

1. ENTER 1 TO CONTINUE

____________________________________________________________________________

(statement to respondent modified for CS)

(GO TO RESULT, CODE – PHONE NUMBER DOES NOT REACH SAMPLED ADDRESS)


TENURE

Are your living quarters ...

1. Owned or being bought by you or someone in your household?

2. Rented for cash?

3. Occupied without payment of cash rent?

____________________________________________________________________________


TYPEOFHOUSINGUNIT

Please select one box that describes the type of housing unit.

1. House, apartment, flat

2. HU in nontransient hotel, motel, etc.

3. HU permanent in transient hotel, motel, etc.

4. HU in rooming house

5. Mobile home or trailer with no permanent room added

6. Mobile home or trailer with one or more permanent rooms attached

7. HU not specified above - Describe

8. Quarters not HU in rooming or boarding house

9. Unit not permanent in transient hotel, motel, etc.

10. Unoccupied site for mobile home, trailer, or tent

11. Student quarters in college dormitory

12. Other unit not specified above - Describe

___________________________________________________________________________________

(this item will need modification for telephone administration for the CS, appropriate changes will be explored during cognitive testing)


NUMBEROFUNITS

How many housing units are in this structure?

1. 1

2. 2

3. 3

4. 4

5. 5-9

6. 10+

7. Mobile home/trailer

8. Only OTHER units

_____________________________________________________________________________________


(COMPLETE HHROSTER_FNAME THROUGH HHMEMBER/HSEMEMURE FOR EACH PERSON BEFORE GOING TO NEXT PERSON)

HHROSTER_FNAME

What are the names of all people living or staying here who are at least 18 years old? Start with the name of the person or one of the people who (owns/rents) this home.

ENTER FIRST NAME ON THIS SCREEN

ENTER 999 TO LEAVE THE TABLE

_____________________

__________________________________________________________________________________

(question modified to ask only for names of persons 18 or older, also not asking for last names)

(UPON ENTERING 999 FOR THE FIRST TIME, GO TO HHLDCOVERAGE TO VERIFY THE ROSTER IS COMPLETE AND, IF NEEDED, RETURN TO THE ROSTER TO ADD PERSONS; UPON ENTERING 999 FOR THE SECOND TIME, PROCEED TO BRTHDATEMO)


SEX

ASK IF NECESSARY

Is (HHROSTER_FNAME) male or female?

1. Male

2. Female

_________________________________________________________________________




RELATIONSHIP

What is (HHROSTER_FNAME)’s relationship to you?

11. Husband 16. Mother

12. Wife 17. Brother

13. Son 18. Sister

14. Daughter 19. Other relative

15. Father 20. Nonrelative

_______________________________________________________________________________________


HHMEMBER

Does (HHROSTER_FNAME) usually live here?

IF "NO", PROBE FOR USUAL RESIDENCE ELSEWHERE.

1. Yes (GO TO HHROSTER_FNAME FOR NEXT PERSON)

2. No

______________________________________________________________________


HSEMEMURE

Does (HHROSTER_FNAME) have a usual place of residence elsewhere?

1. Yes (DELETE FROM ROSTER, GO TO HHROSTER_FNAME FOR NEXT PERSON)

2. No (RETAIN ON ROSTER, GO TO HHROSTER_FNAME FOR NEXT PERSON)

_____________________________________________________________________


HHLDCOVERAGE

Have I missed any other adults age 18 or older living or staying here such as any lodgers or anyone who is away at present traveling or in the hospital?

1. Yes (GO TO HHROSTER_FNAME TO ADD PERSONS)

2. No

________________________________________________________________________

(Question modified to only refer to adults possibly missing from roster)






(COMPLETE BRTHDATEMO THROUGH RACE FOR EACH PERSON BEFORE GOING TO NEXT PERSON)

(AFTER SEQUENCE COMPLETED FOR LAST ADULT ON ROSTER GO TO ROSTERREVIEW)

BRTHDATEMO

What is (HHROSTER_FNAME)’s date of birth?

ENTER MONTH ON THIS SCREEN

______________________________________________________________________


BRTHDATEDY

What is (HHROSTER_FNAME)’s date of birth?

ENTER DAY ON THIS SCREEN

_______________________________________________________________________


BRTHDATEYR

What is (HHROSTER_FNAME)’s date of birth?

ENTER YEAR ON THIS SCREEN

IF THE YEAR IS LESS THAN 1890, ENTER 1890

_______________________________________________________________________

(IF BRTHDATEYR RESPONSE IS DON’T KNOW OR REFUSED, GO TO ESTAGE)


VFYAGE

That would make (HHROSTER_FNAME) (COMPUTED AGE) years old.

Is that correct?

1. Yes (GO TO MARITAL)

2. No (RETURN TO BRTHDATEMO/BRTHDATEDY/BRTHDATEYR TO CORRECT)

______________________________________________________________________




ESTAGE

Even though you don’t know (HHROSTER_FNAME)’s exact birth date, what is your best guess as to how old (he/she) was on (his/her) last birthday?

________________________________________________________________________

(IF VALID RESPONSE, GO TO MARITAL; OTHERWISE ASK AGERNG)


AGERNG

Is (he/she) ...

1. 18 - 24 years old?

2. 25 - 34 years old?

3. 35 - 49 years old?

4. 50 - 65 years old?

5. 66 years old or older?

_____________________________________________________________________

(Question and response categories modified to only probe using adult age ranges)


MARITAL

Is (HHROSTER_FNAME) now married, widowed, divorced, separated, or has (he/she) never been married?

1. Married

2. Widowed

3. Divorced

4. Separated

5. Never married

_______________________________________________________________________


ARMEDFORCES

Is (HHROSTER_FNAME) now in the Armed Forces?

1. Yes

2. No

______________________________________________________________________





EDUCATIONATTAIN

What is the highest level of school (HHROSTER_FNAME) completed or the highest degree (he/she) received?

1. 1ST GRADE 11. 11TH GRADE

2. 2ND GRADE 12. 12TH GRADE (NO DIPLOMA)

3. 3RD GRADE 13. HIGH SCHOOL GRADUATE (DIPLOMA, OR THE EQUIVALENT)

4. 4TH GRADE 14. SOME COLLEGE (NO DEGREE)

5. 5TH GRADE 15. ASSOCIATE’S DEGREE

6. 6TH GRADE 16. BACHELOR’S DEGREE (E.G. BA, AB, BS)

7. 7TH GRADE 17. MASTER’S DEGREE (E.G. MA, MS, MENG, MSW, MBA)

8. 8TH GRADE 18. PROFESSIONAL SCHOOL DEGREE (E.G. MD, DDS, DVM, LLB, JD)

9. 9TH GRADE 19. DOCTORAL DEGREE (E.G. PHD, EDD)

10. 10TH GRADE 20. NEVER ATTENDED, PRESCHOOL, KINDERGARTEN

____________________________________________________________________


SP_ORIGIN

(Are you/Is (HHROSTER_FNAME)) Spanish, Hispanic, or Latino?

1. Yes

2. No

____________________________________________________________________


RACE

Please choose one or more races that (you consider yourself/(HHROSTER_FNAME) considers (himself/herself)) to be.

1. White 4. Asian

2. Black or African American 5. Native Hawaiian or other Pacific Islander

3. American Indian, or Alaska Native 6. Other - Specify

____________________________________________________________________

(GO TO BRTHDATEMO FOR NEXT PERSON ON ROSTER; IF LAST, GO TO ROSTERREVIEW)

NOTE: The Interviewer will not read the “Other/Specify” option and will only code this if the respondent responds something other than the 5 listed races.



ROSTERREVIEW

REVIEW ALL CATEGORIES

IS THIS INFORMATION CORRECT?

LN NAME REL AGE SEX MARITAL STATUS

______________________________________________________________________________________

1 PERSON1

2 PERSON2

3 PERSON3

X PERSONX



1. Yes (GO TO NEXT SECTION/TIMEATADDRESS)

2. No

_____________________________________________________________________________


WHOTOCHANGE

ENTER THE LINE NUMBER OF THE PERSON REQUIRING A CHANGE.

LN NAME REL AGE SEX MARITAL STATUS

______________________________________________________________________________________

1 PERSON1

2 PERSON2

3 PERSON3

X PERSONX

___________________________________________________________________________


WHATFIX

WHAT CHANGE IS NEEDED?

LN NAME REL AGE SEX MARITAL STATUS

______________________________________________________________________________________

X PERSONX


1. NAME

2. RELATIONSHIP

3. DATE OF BIRTH

4. SEX

5. MARITAL STATUS

__________________________________________________________________________

(ALLOW CHANGES TO BE MADE AS NECESSARY, TO PERSONS AND DATA ITEMS CHOSEN IN WHOTOCHANGE AND WHATFIX.

RETURN TO ROSTERREVIEW WHEN DONE WITH CHANGES, UPON SELECTION OF ‘YES’ ON ROSTERREVIEW, PERFORM RANDOM ADULT SELECTION)

RANDOM ADULT SELECTION OCCURS HERE ONCE ROSTER IS COMPLETE:

  • IF ONE ADULT, SELECT HOUSEHOLD SCREENER RESPONDENT

  • IF TWO ADULTS, SELECT HOUSEHOLD SCREENER RESPONDENT AND OTHER ADULT

  • IF THREE OR MORE ADULTS, RANDOMLY SELECT TWO ADULTS (MAY OR MAY NOT INCLUDE HOUSEHOLD SCREENER RESPONDENT)
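
A minimal sketch of this selection rule is shown below; the function and the roster representation are hypothetical and are not part of the CATI specification.

import random

def select_adults(roster, screener_respondent):
    # roster: adult (18+) household members; screener_respondent: the adult who
    # completed the household screener.  Returns the adults selected for the
    # personal victimization screener, following the rule above.
    if len(roster) == 1:
        return [screener_respondent]
    if len(roster) == 2:
        return list(roster)            # screener respondent plus the other adult
    return random.sample(roster, 2)    # three or more adults: two at random

print(select_adults(["PERSON1", "PERSON2", "PERSON3"], screener_respondent="PERSON1"))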


HOUSEHOLD SCREENER RESPONDENT CONTINUES WITH NCVS-1 INSTRUMENT AND COMPLETES THE VICTIMIZATION SCREENER QUESTIONS INCLUDING THE HOUSEHOLD VICTIMIZATION QUESTIONS.


AFTER THE HOUSEHOLD RESPONDENT COMPLETES ANY REQUIRED INCIDENT REPORTS, ONE OR TWO RANDOMLY SELECTED ADULT RESPONDENTS (AS APPROPRIATE) COMPLETE THE PERSONAL VICTIMIZATION SCREENER AND ANY REQUIRED INCIDENT REPORTS.


Household Screener Remaining Content from Core NCVS-1 and NCVS-2 Questionnaires

The full NCVS-1 questionnaire will be replicated for the household respondent, starting at item 33a (TIMEATADDRESS) on page 2 and continuing through item 45d on page 7. The identity theft questions (items 46 to 59) on pages 7 through 9 will not be asked as part of the CS.


The hate crime questions (items 161 through 166) on pages 33 through 35 of the NCVS-2 incident report questionnaire will not be asked as part of the CS.


Following collection of any required incident reports from the household respondent using content from the NCVS-2 incident report questionnaire, the household screener interview will conclude by asking the household respondent the employment questions (items 74 to 79) on pages 10 and 11 of the NCVS-1. The very last question to be asked of the household respondent is item 12a (household income) on page 1 of the NCVS-1. If there is only one adult in the household, the telephone interview is complete at this point.


Sampled Adult Personal Victimization Screeners and Incident Reports

Next, the NCVS-1 questionnaire content will be repeated for the sampled adult(s), from question 33a on page 2 through 45d on page 7, excluding any items labeled with the text “Asked of Household Respondent Only.” After completion of required incident reports this interview will also conclude with the employment questions (items 74 to 79) on pages 10 and 11 of the NCVS-1.


In households with two or more adults, the telephone interview will be considered complete when the household screener and the personal victimization screener with any randomly selected adults are completed, as well as any required incident reports. Up to three respondents may be interviewed in households with three or more adult household members, in the event that neither of the two randomly selected adults is the household screener respondent.



Attachment J

Draft Advance Letter for 2B Phone Sample









{DATE}


Dear Chicago Resident:


The U.S. Department of Justice, Bureau of Justice Statistics, is conducting a survey about neighborhood crime.


A representative from our survey support center, Westat, will be calling your household soon to complete the Chicago Crime Victimization Survey.


The information your household provides will help us understand how much crime there is, where it occurs, what crime costs victims, and which groups of people are at the greatest risk. Many crimes are never reported to the police, and this survey helps give a more complete picture of crime in our country.


Your address is part of a random sample of addresses chosen throughout the Chicago area. Because this is part of a scientific study, your answers represent not only you and your household, but also hundreds of households like yours. For this reason, your voluntary cooperation is very important.


The information you provide will be used for statistical purposes only and may not be disclosed, or used, in identifiable form for any other purpose as required by law (Title 42, U.S. Code, Section 3789g). Your responses will be combined with those of others to produce statistical summaries about crime and safety.


Answers to the most frequently asked survey questions are on the reverse side of this letter. Section 3732 of the Justice Systems Improvement Act of 1979 authorizes the Bureau of Justice Statistics to conduct this survey. If you would like further information, you can contact our survey research support at 1-888-544-1070 or you can visit the BJS website at www.bjs.gov/ncvspilot.cfm.


Thank you for your cooperation. The U.S. Bureau of Justice Statistics appreciates your help.


Sincerely,





James P. Lynch, Ph.D.

Director, Bureau of Justice Statistics

Office of Justice Programs

U.S. Department of Justice


Commonly Asked Questions


How long will it take to complete this survey?

The time varies depending on your household’s experiences; on average, the survey should take about 25 minutes to complete.


Am I required to complete this survey?

Your participation is voluntary and there are no penalties for not answering.


How was my household chosen for this study?

Your household was selected at random from all Chicago-area residential addresses.


Why is random selection so important?

Random selection means that a diverse group of Chicago-area residents can represent the experiences and opinions of the entire Chicago area. For this survey to be truly scientific, all households included should complete this short questionnaire.


Who is the sponsor of this study?

The survey is sponsored by the Bureau of Justice Statistics (BJS), U.S. Department of Justice (DOJ). The survey is conducted under the authority of Title 42, United States Code, Section 3732. To learn more about BJS, you can visit them on the web at www.ojp.usdoj.gov/bjs.


What is the Chicago Crime Victimization Survey?

The Chicago Crime Victimization Survey is a local version of BJS’ National Crime Victimization Survey (often referred to as the NCVS), which is a nationwide survey based on a sample of households. It is designed to collect information about how many people are victims of crime, as well as get details on the impact of crime on our communities.


Who will use this information?

The results from this survey (and similar surveys conducted by the U.S. Department of Justice) will be used to better understand the needs of neighborhood residents.


How do I know you'll keep my information confidential?

The information you provide will be used for statistical purposes only and may not be disclosed, or used, in identifiable form for any other purpose as required by law (Title 42, U.S. Code, Section 3789g). Your responses will be combined with those of others to produce statistical summaries about crime and safety. After the study is completed, identifying information - your address and phone number - is destroyed.


Who can I call with questions?

Further information can be obtained from our survey support center at 1-888-544-1070.




Attachment K

Content for the Postcard Reminder



Content for 2B and 2C-localized language


About a week ago we mailed a survey to your household for the Survey of Chicagoland Neighborhoods, sponsored by the Bureau of Justice Statistics within the U.S. Department of Justice. If someone in your household has already returned the survey, we thank you very much for your help.


If you have not completed and returned your survey, please do so right away. We need to hear from everyone and your help is very important to us. Your household was randomly selected to take part in this survey and cannot be replaced.


If you have any questions, feel free to contact our survey research contractor at 1-xxx-xxx-xxx or by email at xxx-xxx-xxxx. For further information please visit our Web site at http://bjs.ojp.usdoj.gov/ xxxx.



Content for 2C-generic language


About a week ago we mailed a survey to your household for the Survey of Neighborhoods, sponsored by the Bureau of Justice Statistics within the U.S. Department of Justice. If someone in your household has already returned the survey, we thank you very much for your help.


If you have not completed and returned your survey, please do so right away. We need to hear from everyone and your help is very important to us. Your household was randomly selected to take part in this survey and cannot be replaced.


If you have any questions, feel free to contact our survey research contractor at 1-xxx-xxx-xxx or by email at xxx-xxx-xxxx. For further information please visit our Web site at http://bjs.ojp.usdoj.gov/ xxxx.



Attachment L

Analysis Plan



Here we present the analysis plan for the Pilot data collection, including the analytic objectives and the plan for nonresponse bias analysis.


In the following, 2B denotes the “telephone harvest” method and 2C denotes the “two-phase ABS hybrid” method. In the latter, we use the initial mail survey to classify each household into one of two strata: H, with high likelihood of victimization, and L, with low likelihood of victimization. We can think of both approaches as two-phase surveys: in 2B, phase 1 consists of obtaining the telephone number either from a vendor or from an initial mail survey.


A previous section discussed different approaches to producing NCVS SAEs, including 2.1(B), model-based estimation with additional auxiliary information collected through a survey, and 2.1(C), blended estimates from the core NCVS and the CS. Pilot data may be used to begin assessing each of these approaches. Thus, the goals of the pilot are to assess whether data collection Approach 2B or 2C provides better information per unit cost for SAE approaches 2.1(B) and 2.1(C), to separately evaluate the effectiveness of the 2C screener, and to inform the design of the main MSA data collection that may follow. The following sections present details of the analysis plan for each of these objectives.



Objective 1: Identify which of 2B, 2C provides more information for producing blended estimates.


Each of the following measures will be computed for each of:


2B, full sample;
2B, telephone number from directory service;
2B, telephone number from mail screener;
2C, full sample;
2C, stratum L;
2C, stratum H; and
2C, mail nonresponse and telephone number from directory service.


  1. Response rates, by phase, demographics, victim classification in screener or other screener variables.

  2. Response rates by characteristics of sampled census blocks.

  3. Cost per complete interview.

  4. Cost per victim in sample.

  5. Cost per victim of violent crime in sample.

  6. Cost per victim of property crime in sample.

  7. Estimated victimization rate, for major type of crime (TOC) classes. Compare victimization rates from the CS with those from the NCVS, recognizing possible differences due to mode and recall period. Test whether relative magnitudes, rankings of victimization rates by TOC are the same for the NCVS and the CS, after adjusting for census neighborhood characteristics in the respective samples.

  8. Estimated number of crimes reported to police. Compare with UCR for jurisdiction.

  9. Cost relative to standard error for estimating victimization rates, characteristics of victims from CS.

  10. Effects of recall period. Analyze victimization rates by month of occurrence relative to interview date. Compare the recency curves for the CS and NCVS. Compare victimization rates estimated using only incidents in most recent 6 months with NCVS bounded and unbounded victimization rates (also note confounding in NCVS since generally bounded interviews are telephone and unbounded interviews are in-person).

  11. Information from interviewer debriefing sessions to elicit potential improvements to the survey protocol. Also analyze missing data patterns, distributions of responses to specific items, and out-of-range and misreported information.

  12. Other potential sources of nonsampling error. Look at differences by land/cell phone, interviewer effects, number of callbacks, etc. If possible, obtain similar data on core NCVS.

  13. Poststratification and weighting methods to produce blended estimates with NCVS. If estimated bias is large, explore other models for bias. We will derive methods for modeling bias in Spring 2011. Estimate reduction in MSE for (a) victimization rates, (b) characteristics of crime victims, (c) multivariate relationships using blended survey data.


Since we won’t be using an optimal design for the pilot, we will also estimate the costs and response rates that would have resulted under a more efficient design, such as optimal allocation for the two-phase sample in 2C or an improved subsampling-within-household design.



Objective 2: Identify which of 2B, 2C provides more information for small area estimation.


This objective is related to the previous one, except here we consider model-based approaches for SAE.

  1. Examine census block-level variables as predictors of (violent) victimization. This can be done now with current NCVS at Census Bureau. Census predictors could be used as auxiliaries for SAE with both 2B and 2C. Compare with SAEs obtained using higher level of geography for prediction. Examine outliers in model-based predictions.

  2. Analyze associations between 2C screener questions and victimization. The focus here will be on the non-victimization-related questions such as attitudes toward police, fear of crime, routine activities, employment, etc.

  3. Fit SAE models using phase 1 data from 2C as auxiliary information, in addition to variables identified from census. Investigate both unit-level and area-level models. Estimate reduction in MSE under model that results from using the phase 1 data. Examine sensitivity to model assumptions.

  4. Develop theory for using both phases of a two-phase survey as auxiliary information in SAE. Compare reductions in MSE with blended estimate from objective 1.
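
As an illustration of the unit-level option in item 3, a standard nested-error regression is sketched below; the notation is generic and not a final specification.

\[
y_{ia} \;=\; \mathbf{x}_{ia}'\boldsymbol{\beta} + u_a + e_{ia},
\qquad u_a \sim (0,\sigma_u^2),\quad e_{ia} \sim (0,\sigma_e^2),
\]

where $y_{ia}$ is the victimization outcome for household $i$ in area $a$ and $\mathbf{x}_{ia}$ combines census block-level variables with phase 1 screener responses. The area-level alternative replaces the household-level outcome with the direct area estimate, as in the composite estimator sketched in Attachment A.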



Objective 3: Analyze effectiveness of 2C screener.


We will estimate specificity and sensitivity for different crime types and examine associations between screener questions and TOC classification for NCVS.
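
A minimal sketch of that computation is shown below, with hypothetical variable names; the matched NCVS classification is treated as the reference standard.

def screener_accuracy(pairs):
    # pairs: iterable of (ncvs_victim, screener_victim) booleans for matched households.
    tp = sum(1 for ncvs, scr in pairs if ncvs and scr)
    fn = sum(1 for ncvs, scr in pairs if ncvs and not scr)
    tn = sum(1 for ncvs, scr in pairs if not ncvs and not scr)
    fp = sum(1 for ncvs, scr in pairs if not ncvs and scr)
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

# Toy input only: two NCVS victim households, two nonvictim households.
print(screener_accuracy([(True, True), (True, False), (False, False), (False, True)]))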





Objective 4: Modify design for next round of sampling.


We will use the costs and estimates of specificity and sensitivity from pilot analyses to determine optimal subsampling fractions from stratum L for estimating victimization rates. We will also consider modifying the stratification to oversample geographic areas with variables associated with high victimization rates. Finally, we will consider alternative approaches to subsampling adults in 2C; for example, is it more efficient to use different subsampling probabilities?
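
One standard way to frame the subsampling question is classical double-sampling allocation with per-stratum costs, sketched below; the prevalences and costs are illustrative assumptions, and the final fractions will come from the pilot estimates of sensitivity and specificity.

import math

def phase2_fractions(strata):
    # strata: {name: (assumed_prevalence, relative_phase2_cost)}.  Returns phase 2
    # sampling fractions proportional to S_h / sqrt(c_h), where S_h is the
    # within-stratum standard deviation of the victimization indicator, scaled so
    # the largest fraction is 1 (take-all).
    raw = {h: math.sqrt(p * (1 - p)) / math.sqrt(c) for h, (p, c) in strata.items()}
    top = max(raw.values())
    return {h: round(v / top, 2) for h, v in raw.items()}

# Illustrative inputs: 30% prevalence in stratum H, 5% in stratum L, equal costs.
print(phase2_fractions({"H": (0.30, 1.0), "L": (0.05, 1.0)}))   # {'H': 1.0, 'L': 0.48}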


Nonresponse Bias Analyses


The NCVS-CS Pilot study is a methodological test to evaluate new methods of data collection to support estimates for small areas that would be too expensive to obtain using the traditional face-to-face interviewing methods. As such, the goal of the Pilot is to explore the quality of the estimates from the new methodologies and compare them to the estimates produced using the traditional methods. An important part of that investigation is the effect of nonresponse bias. The new methodologies are likely to result in lower response rates, but this does not imply that the estimates will necessarily have more nonresponse bias. The new methods will not only have different response rates, but they will likely have different measurement error properties, so the primary goal is to compare the overall effect.


Some of the planned analyses will explore the nonresponse component of the differences between the Pilot and the traditional methods. The approaches are very similar to many nonresponse bias studies. We mention these below, but we wish to emphasize that the main focus of the analyses will be on the sources of differences between the Pilot and the Core NCVS.


Another aspect of nonresponse bias that is of particular interest in the Pilot is the evaluation of the two approaches to data collection within the Pilot itself (the 2B and 2C approaches). The sample sizes for each of the two methods are more limited, so only some types of nonresponse analyses are proposed in addition to the simple response rates for the two approaches. The other analyses will explore the following questions:


  • What is the demographic profile of nonresponding households?


  • What is the demographic profile of nonresponding individuals?


  • What level of crime victimization would be reported by nonresponding households?


  • What level of crime victimization would be reported by nonresponding individuals?


To support the nonresponse bias analysis of the two approaches, we plan to conduct a level-of-effort study that will explore whether specific subgroups are more likely to be nonrespondents at certain stages in the data collection. Part of this effort will look explicitly at those who participate only after multiple attempts (reluctant respondents). Reluctant respondents will include late responders and those who initially refused to participate but later complied. These respondents would have been nonrespondents if the methodology had not included nonresponse conversion efforts or had used a brief field period. For this reason we can use these respondents to help develop the profile of nonrespondents. We will also have their victimization data, and so we can assess their potential impact on the estimates. Even though these “reluctant” respondents may be different from the final nonrespondents, a comparison between the two groups will be useful.


Another type of nonresponse analysis that will be explored is the use of geographic data to characterize the respondents and nonrespondents. Although the pilot is only conducted in one area, there are likely to be differences within that area that can be assessed using data from the American Community Survey or the 2010 Census. These data will include characteristics such as percent minority, median income, home ownership, race, and ethnicity.






Reference: McNamee, R. (2003). Efficiency of two-phase designs for prevalence estimation. International Journal of Epidemiology, 32, 1072-1078.



