SVS 2016 OMB Supporting Statement Part B


Supplemental Victimization Survey (SVS)

OMB: 1121-0302





Supporting Statement


B. Collection of Information Employing Statistical Methods


1. Universe and Respondent Selection


The sample universe for the SVS is all persons age 16 or older in NCVS interviewed households. The NCVS sample of households is drawn from the more than 120 million households nationwide and excludes military barracks and institutionalized populations. In 2016, the annual national sample is planned to be approximately 105,000 designated addresses located in 542 stratified Primary Sampling Units (PSUs) throughout the United States. The sample consists of seven parts, each of which is designated for interview in a given month and again at 6-month intervals.
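The rotating design described above can be sketched in a few lines. This is a hypothetical illustration of the revisit schedule, assuming the conventional seven-interview series at 6-month intervals; it is not the Census Bureau's scheduling system.

```python
# Hypothetical sketch of the NCVS rotating panel: each sampled address is
# interviewed in its designated month and again at 6-month intervals.
# The seven-interview series is an assumption for illustration.
def interview_months(first_month: int, n_interviews: int = 7) -> list[int]:
    """Return the months (numbered from 1) in which an address is interviewed."""
    return [first_month + 6 * i for i in range(n_interviews)]

# An address first interviewed in month 3 is revisited every 6 months:
print(interview_months(3))  # [3, 9, 15, 21, 27, 33, 39]
```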


Every ten years, the Census Bureau redesigns the samples for all of its continuing demographic surveys, including the NCVS. In 2015, the 2000 sample design began to phase out and the 2010 sample design began to phase in. Although the PSUs did not change in 2015, some of the cases assigned to 2015 interviews were selected using the 2010 design procedures from the Master Address File (MAF). The MAF contains all addresses from the most recent decennial census plus updates from the United States Postal Service, state and local address lists, and listings. The MAF is the frame used to reach the target NCVS population. Beginning in 2016, some PSUs will be removed from sample, some new PSUs will be added to the sample, and some continuing PSUs that were selected for both the 2000 and 2010 designs will remain in sample. The phase-in and phase-out of the designs will occur from January 2016 through December 2017. As part of the 2010 design, new addresses are selected each year from a master list of addresses based upon the 2010 Decennial Census of Population and Housing and addresses from the United States Postal Service. The new sample sizes are larger than in previous years to support state-level estimates in 22 states. In 2016, approximately 75% of the sample will be drawn from the 2010 design, with the remaining 25% from the 2000 design.


The NCVS uses a rotating sample. The sample usually consists of seven groups for each month of enumeration. When the SVS is in the field, there will be three rotation groups that were selected as part of the 2000 sample design. These three rotation groups will only be in continuing PSUs and will contain about 19% of all SVS sample units. The remaining sample will be divided into seven rotation groups that were selected as part of the 2010 sample design. In 2016, there will be an influx of new units added to the NCVS sample as part of the 2010 sample design. Approximately 17% of sample addresses from July to December of 2016 will be invited to participate in the survey for the first time in person.



Each interview period the interviewer completes or updates the household composition component of the NCVS interview and asks the crime screen questions (NCVS-1) for each household member age 12 or older. The interviewer then completes a crime incident report (NCVS-2) for each reported crime incident identified in the crime screener. Following either the screener or the administration of the crime incident report, depending on whether a crime was reported, each household member age 16 or older will be administered the SVS. Each household member provides the information by self-response. Proxy respondents will not be allowed for the SVS. For the NCVS, proxy respondents are allowable under very limited circumstances and represent less than 3% of all interviews. All forms and materials used for the NCVS screener and crime incident report have been previously approved by OMB (OMB NO: 1121-0111). The SVS instrument is included in Attachment 2.


SAMPLING


Sample selection for the NCVS, and by extension the SVS, has three stages: the selection of primary sampling units or areas known as PSUs, the selection of address units in sample PSUs, and the determination of persons and households to be included in the sample.


Survey estimates are derived from a stratified, multi-stage cluster sample. The PSUs composing the first stage of the sample are formed from counties or groups of adjacent counties based upon data from the decennial census and the American Community Survey (ACS). The larger PSUs are included in the sample with certainty and are considered to be self-representing (SR). The remaining PSUs, called non self-representing (NSR) because only a subset of them are selected, are combined into strata by grouping PSUs with similar geographic and demographic characteristics. For the NCVS, decennial census counts, ACS estimates, and administrative crime data drawn from the FBI’s Uniform Crime Reporting Program are also used to stratify the PSUs.

Stage 1. Defining and Selecting PSUs


Defining PSUs – Formation of PSUs begins with listing counties and independent cities in the target area. For the NCVS, the target area is the entire country. The counties are either grouped with one or more contiguous counties to form PSUs or are PSUs all by themselves. The groupings are based on certain characteristics such as total land area, current and projected population counts, large metropolitan areas, and potential natural barriers such as rivers and mountains. The resulting county groupings are called PSUs.


After the PSUs are formed, the large PSUs and those in large urban areas are designated SR. The smaller PSUs are designated NSR. Determining which PSUs are considered small and which are large depends on the survey’s SR population cutoff, whether estimates are desired for the state, and the size of the MSA that contains the PSU.

Stratifying PSUs – The NSR PSUs are grouped with similar NSR PSUs within states to form strata. Each SR PSU forms its own stratum. The data used for grouping the PSUs consist of decennial census demographic data, ACS data, and administrative crime data. NSR PSUs are grouped to be as similar, or homogeneous, as possible. Just as the SR PSUs must be large enough to support a full workload, so must each NSR stratum. The most efficient stratification scheme is determined by minimizing the between-PSU and within-PSU variance.
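The payoff from grouping homogeneous PSUs can be seen in a toy calculation. The PSU crime rates below are made up for illustration; the point is only that grouping similar PSUs into the same stratum yields a much smaller within-stratum variance than an arbitrary grouping.

```python
from statistics import pvariance

# Toy illustration (hypothetical data): grouping similar NSR PSUs into the
# same stratum reduces within-stratum variance relative to mixed grouping.
psu_crime_rates = [2.1, 2.3, 2.2, 5.8, 6.1, 5.9]  # per-PSU rates (made up)

def within_stratum_variance(strata: list[list[float]]) -> float:
    """Size-weighted average of the population variance within each stratum."""
    n = sum(len(s) for s in strata)
    return sum(len(s) * pvariance(s) for s in strata) / n

homogeneous = [[2.1, 2.2, 2.3], [5.8, 5.9, 6.1]]  # similar PSUs grouped
arbitrary = [[2.1, 5.8, 2.3], [2.2, 6.1, 5.9]]    # mixed grouping

print(within_stratum_variance(homogeneous))  # small
print(within_stratum_variance(arbitrary))    # far larger
```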


Selecting PSUs – The SR PSUs are automatically selected for sample or “selected with certainty.” One NSR PSU is selected from each stratum with probability proportional to the population size using a linear programming algorithm. Historically, PSUs have been defined, stratified, and selected once every ten years.

Stage 2. Preparing Frames and Sampling within PSUs


Frame Determination – To ensure adequate coverage for the target population, the Census Bureau defines and selects sample from address lists called frames. The 2000 and 2010 sample designs use different frame systems. The 2000 sample design was selected from four frames: a unit frame, an area frame, a group quarters (GQ) frame, and a new construction or permit frame. The 2010 sample design was selected from a unit frame and a GQ frame.


In the 2000 design, each address in the country was assigned to one and only one of the four frames. Frame assignment depended on four factors:

  1. what type of living quarters are at the address

  2. when the living quarters were built,

  3. where the living quarters were built, and

  4. how completely the street address was listed.


The main distinction between the 2000 and 2010 frames is the procedure used to obtain the sample addresses.


In the 2010 design, each address in the country was assigned to the unit or GQ frame based on the type of living quarters. Two types of living quarters are defined in the decennial census. The first type is a housing unit (HU). An HU is a group of rooms, or a single room, occupied as separate living quarters or intended for occupancy as separate living quarters. An HU may be occupied by a family or one person, as well as by two or more unrelated persons who share the living quarters.


The second type of living quarters is GQ. GQs are living quarters where residents share common facilities or receive formally authorized care. About 3% of the population counted in the 2010 Census resided in GQs. Of those, less than half resided in non-institutionalized GQs. About 97% of the population counted in the 2010 Census lived in HUs.


Within-PSU Sampling – All of the Census Bureau's continuing demographic surveys, such as the NCVS, are sampled together. This takes advantage of updates from the January MAF delivery and ACS data. In the 2010 sample design, about 28.6% of the HU sample is selected every year, although 57% of the cases selected for 2016 interviews were selected in 2015 to start the 2010 sample design. The GQ sample is selected every three years.


Selection of samples is done one survey at a time (sequentially). Each survey determines how the unit addresses within the frame should be sorted prior to sampling. For the NCVS, each frame is sorted by geographic variables. A systematic sampling procedure is used to select addresses from each frame. A skeleton sample is also selected in every PSU. Every six months new addresses on the MAF are matched to the skeleton frame. The skeleton frame allows the sample to be refreshed with new addresses and thereby reduces the risk of under-coverage errors due to an outdated frame.
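The systematic selection step can be illustrated with a random start and a fixed interval over a sorted frame. The frame contents below are hypothetical; the sketch shows the generic fractional-interval method, which may differ in detail from the Census Bureau's production procedure.

```python
import random

# Illustrative sketch of systematic sampling from a geographically sorted
# frame: choose a random start, then take every k-th address, where k is
# the sampling interval (frame size / sample size). Frame is made up.
def systematic_sample(frame: list[str], n: int, rng: random.Random) -> list[str]:
    k = len(frame) / n                 # sampling interval
    start = rng.uniform(0, k)          # random start within the first interval
    return [frame[int(start + i * k)] for i in range(n)]

frame = [f"address-{i:04d}" for i in range(1000)]  # sorted frame (hypothetical)
sample = systematic_sample(frame, 50, random.Random(7))
print(len(sample))  # 50 distinct addresses, evenly spread across the frame
```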


Addresses selected for a survey are removed from the frames, leaving an unbiased or clean universe behind for the next survey that is subsequently sampled. By leaving a clean universe for the next survey, duplication of addresses between surveys is avoided. This helps preserve response rates by ensuring that no unit falls into more than one survey sample.


Stage 3. Sample within Sample Addresses


The last stage of sampling is done during initial contact of the sample address during the data collection phase. For the SVS, if the address is a residence and the occupants agree to participate, an attempt is made to interview every person age 16 or older who lives at the sample address and completes the NCVS-1. The NCVS has procedures to determine who lives in the sample unit, and a household roster is completed with names and other demographic information. If someone moves out of or into the household during the interviewing cycle, he or she is removed from or added to the roster accordingly.


State Samples


From July of 2013 through December of 2015, BJS and Census boosted the existing national sample in the 10 largest states, based on population size, in order to test the feasibility, cost, and precision of state-level violent and property crime victimization estimates. Prior research conducted through the NCVS redesign had revealed that, by building on existing sample in the largest states, direct state-level three-year rolling estimates with a 10% relative standard error (RSE) could be generated for a reasonable cost.


Beginning in January of 2016, BJS and Census boosted the existing national sample in the 22 largest states. The states receiving a sample boost include Arizona, California, Colorado, Florida, Georgia, Illinois, Indiana, Maryland, Massachusetts, Michigan, Minnesota, Missouri, New Jersey, New York, North Carolina, Ohio, Pennsylvania, Tennessee, Texas, Virginia, Washington, and Wisconsin. In each of the 22 states, enough sample will be selected to achieve a 10% RSE for a three-year average violent victimization rate of 0.02. Sample sizes in the remaining 28 states and the District of Columbia will be determined based on previous sample sizes. Unlike the 2000 sample design, no strata cross state boundaries, and all 50 states and the District of Columbia will have at least one sampled PSU.
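The precision target implies a rough sample size that is easy to verify. A 10% RSE on a rate of 0.02 means a standard error of 0.002; under simple random sampling the required number of persons is about p(1 − p)/se². The design-effect multiplier in the sketch is an illustrative assumption, since a clustered sample needs more cases than SRS.

```python
# Worked check of the precision target: 10% RSE on a victimization rate of
# 0.02 implies se = 0.002. Under simple random sampling, n ~= p(1-p)/se^2.
# The design effect (deff) shown is an assumption for illustration only.
def required_n(p: float, rse: float, deff: float = 1.0) -> float:
    se = rse * p                       # target standard error
    return deff * p * (1 - p) / se**2

print(round(required_n(0.02, 0.10)))       # 4900 persons under SRS
print(round(required_n(0.02, 0.10, 2.0)))  # 9800 with a design effect of 2
```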


For the core NCVS, interviewers are able to obtain interviews with about 87% of household members in 84% of the occupied household units in sample in any given month. In 2016, we assume that there will be about 1.195 responding persons who are age 16 or older to the NCVS for each designated sample address. Thus, we expect 124,400 persons age 16 or older to provide NCVS interviews from July through December of 2016. Of those, we expect 90%, or about 111,960 persons, will complete the 2016 SVS.
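The expected SVS yield follows directly from the figures above: 124,400 NCVS respondents age 16 or older, of whom 90% are expected to complete the SVS.

```python
# Arithmetic behind the expected SVS yield, using the figures in the text.
ncvs_respondents = 124_400      # persons age 16+ expected to complete NCVS
svs_completion_rate = 0.90      # expected SVS completion among those persons

expected_svs = ncvs_respondents * svs_completion_rate
print(int(expected_svs))  # 111960
```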


2. Procedures for Collecting Information


The SVS is designed to calculate national estimates of stalking victimization for the target population – the noninstitutionalized resident population age 16 years or older. The SVS is administered to all age-eligible NCVS respondents during the 6-month period from July through December of 2016.

DATA COLLECTION


Data collection includes a screener and an incident survey. Based on the VAWA definition of stalking, the screener asks respondents to report whether, in the 12 months prior to the interview, they have experienced repeated unwanted contacts or behaviors that caused them substantial emotional distress, caused them to fear for their safety or the safety of someone else, or would have caused a reasonable person to fear for their safety or the safety of someone they know. Each eligible person age 16 or older will be asked the screener questions. The screener questions collect the following information: (1) unwanted contacts or behaviors; (2) repeated course of conduct (i.e., experiencing the same behavior or contact more than once, or experiencing two or more different behaviors one time); (3) actual fear; (4) substantial emotional distress; and (5) reasonable fear. When a respondent reports an eligible stalking victimization, the SVS incident instrument is administered to collect detailed information about the nature and consequences of the victimization.
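The screen-in decision implied by the five screener elements can be sketched as a simple rule: the contacts must form a repeated course of conduct, and at least one of the fear or distress criteria must be met. Field names here are hypothetical, and the instrument's actual routing may differ.

```python
# Hedged sketch of the screen-in logic implied by the five screener
# elements. Argument names are hypothetical; the fielded instrument's
# routing may differ from this simplification.
def screens_in(unwanted_contact: bool, repeated: bool,
               actual_fear: bool, substantial_distress: bool,
               reasonable_fear: bool) -> bool:
    # Repetition of unwanted contact is required in all cases.
    if not (unwanted_contact and repeated):
        return False
    # At least one fear/distress element must also be present.
    return actual_fear or substantial_distress or reasonable_fear

# Repeated unwanted contact causing substantial distress screens in:
print(screens_in(True, True, False, True, False))   # True
# A single contact, however frightening, fails the repetition element:
print(screens_in(True, False, True, False, False))  # False
```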


The SVS incident instrument covers eight areas: (1) offender information, (2) duration of stalking, (3) frequency of stalking, (4) motive for stalking, (5) other threats or attacks the victim may have experienced, (6) help-seeking, (7) self-protective actions, and (8) cost to victim. The offender section asks about offender characteristics, including the number of offenders, sex, age, race, Hispanic origin, and the victim-offender relationship. Section two, duration of stalking, asks how long the behaviors had been happening and how the victim found out about them. Section three, frequency of stalking, asks how many times the behaviors occurred in the past 12 months. The fourth section, motive for stalking, asks why the victim thought the offender was stalking them. Section five is a follow-up to the reasonable fear questions in the screener, and asks how the victim was attacked or threatened, whether a weapon was involved, and whether someone close to the victim was threatened or attacked. Section six, help-seeking, asks about reporting to police and the seeking or receipt of victim services. Section seven, self-protective actions, asks respondents about actions they took to protect themselves or stop the behaviors from continuing. Finally, section eight, cost to victim, asks respondents how the stalking may have affected them, about any related distress or socio-emotional problems, and about the impact of victimization on school or work.


3. Methods to Maximize Response Rates


Census Bureau staff mails an introductory letter (NCVS-572(L) or NCVS-573(L)) (Attachment 4 and Attachment 5) explaining the NCVS to the household before the interviewer's visit or call. When they go to a house, the interviewers carry cards identifying them as Census Bureau employees. The Census Bureau trains interviewers to obtain respondent cooperation and instructs them to make repeated attempts to contact respondents and complete all interviews. SVS response rate reports will be generated on a monthly basis and compared to the previous month’s average to ensure their reasonableness.


As part of their job, interviewers are instructed to keep noninterviews, or nonresponse from a household or persons within a household, to a minimum. Household nonresponse occurs when an interviewer finds an eligible household but obtains no interview. Person nonresponse occurs when an interview is obtained from at least one household member, but an interview is not obtained from one or more other eligible persons in that household. Maintaining a high response rate involves the interviewer’s ability to enlist cooperation from all kinds of people and to contact households when people are most likely to be home. As part of their initial training, interviewers are exposed to ways in which they can persuade respondents to participate as well as strategies to use to avoid refusals. Furthermore, the office staff makes every effort to help interviewers maintain high participation by suggesting ways to obtain an interview, and by making sure that sample units reported as noninterviews are in fact noninterviews. Also, survey procedures permit sending a letter to a reluctant respondent as soon as a new refusal is reported by the interviewer to encourage their participation and to reiterate the importance of the survey and their response.

In addition to the above procedures used to ensure high participation rates, the Census Bureau implemented additional performance measures for interviewers based on data quality standards. Interviewers are trained and assessed on administering the NCVS-1 and the NCVS-2 exactly as worded to ensure the uniformity of data collection, completing interviews in an appropriate amount of time (not rushing through them), and keeping item nonresponse and "don't know" responses to a minimum.


The Census Bureau also uses quality control methods to ensure that accurate data are collected. Interviewers are continually monitored by each Regional Office to assess whether performance and response rate standards are being met, and corrective action is taken to assist and discipline interviewers who are not meeting the standards. For the core NCVS, interviewers are able to obtain interviews with about 88% of household members in 83% of the occupied units in sample in any given month. Only household members age 16 or older who have completed the NCVS-1 will be eligible for the SVS. Among eligible respondents who completed the NCVS-1, the 2006 SVS had an additional 83% response rate.

We expect that approximately 124,400 persons age 16 or older in NCVS sample households will have completed the NCVS-1 and be eligible for the SVS from July through December of 2016. Of these, we anticipate that 90%, or about 111,960 persons in 2016, will complete the SVS. Using the 2006 data, we estimate that 1.5% of all persons age 16 or older will have experienced stalking victimization.


Upon completion of the 2016 SVS, the Census Bureau will conduct complete analyses of nonresponse, including nonresponse and response rates, respondent and nonrespondent distribution estimates, and nonresponse bias estimates for various subgroups. Should the analyses reveal evidence of nonresponse bias, BJS will work with the Census Bureau to assess the impact to estimates and ways to adjust the weights accordingly.
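A basic nonresponse check of the kind described above compares the demographic distribution of respondents against the full eligible sample; large gaps flag potential bias and motivate weight adjustments. All figures below are hypothetical, and this sketch is not the Census Bureau's actual procedure.

```python
# Illustrative nonresponse-bias check (all counts hypothetical): compare
# respondent demographics against the eligible sample. Large gaps suggest
# nonresponse bias and motivate weighting adjustments. This is a sketch,
# not the Census Bureau's production analysis.
def distribution(counts: dict[str, int]) -> dict[str, float]:
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

eligible = {"16-24": 2000, "25-49": 5000, "50+": 3000}     # made-up counts
respondents = {"16-24": 1500, "25-49": 4600, "50+": 2900}  # made-up counts

elig_dist = distribution(eligible)
resp_dist = distribution(respondents)
for group in eligible:
    gap = resp_dist[group] - elig_dist[group]
    # Positive gap = group over-represented among respondents.
    print(f"{group}: {gap:+.3f}")
```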


4. Final Testing of Procedures


The revised 2016 SVS instrument underwent cognitive testing conducted by the Center for Survey Measurement at the U.S. Census Bureau, under their generic clearance for questionnaire pretesting research (OMB number 0607-0725), from September through November of 2015. The cognitive testing was focused on the current redesigned instrument. The purpose of the cognitive testing was to (1) fully test the redesigned instrument, (2) establish validity of new and revised questions, and (3) examine if the questions were well understood by the expanded target population of persons age 16 or older. The testing was conducted in five rounds, and an iterative methodology was used to identify and address problematic questions at the end of each round. The iterative method allowed for assessment of whether or not revised question wording addressed the problems interviewers were observing during the previous rounds. Evidence from the study indicated that the final version of the questions performed well. The questions were well understood, easy for interviewers to administer, easy for respondents to understand and answer, and captured the intended information. The majority of questions required no revisions, and of those questions that required revisions, most were minor modifications.


With the expansion of the sample universe to include teens (persons ages 16 and 17) in the 2016 SVS data collection, there was some concern that teens would incorrectly screen into the survey by reporting normative parental tracking and monitoring as stalking behaviors. Findings from this study indicate that while such false positives are possible, they are highly unlikely, and that teens are able to distinguish between normative parental behaviors and typical stalking behaviors.


Overall, the screening questions (SQ1a-l) tested well. Two key recommendations for these items included (1) adding reminders that the stem of the question refers to unwanted behaviors and contacts, and (2) adding another screener question on monitoring activities through social media applications (e.g., Instagram, Twitter, or Facebook). The other key items used to determine whether a respondent was a stalking victim and would continue on to the incident instrument – repetition of contacts or behaviors, actual fear, emotional distress, and reasonable fear – all tested well with respondents, and minimal changes were made.


One question regarding whether the unwanted contacts or behaviors were related (i.e., committed by the same person or by others on behalf of that person) was added at the recommendation of the Technical Review Panel (TRP). However, this question proved extremely problematic in cognitive testing. It was revised in each round of testing but still caused comprehension issues for respondents. Because this question served as a key item in deciding whether the respondent would screen out of the survey, any source of respondent confusion is problematic, and there was concern that false negatives related to this question would occur too often. The decision was made to remove this question to avoid incorrectly screening victims of stalking out of the survey. BJS consulted with stalking research experts who served on the TRP, and there was no cause for concern with removing this question after testing revealed the potential issues.


The majority of the remaining recommendations from cognitive testing consisted of changes to the wording or structure of a question to remove any confusion. For instance, several mark-all questions in the incident instrument were changed to forced choice yes/no format. These changes were recommended to improve the quality of the data collected. The final report outlining these and other recommendations of the cognitive testing and the testing protocols are included with this package as Attachments 6 and 7.


By June 2016, the Census Bureau will translate the survey instrument into an automated computer-assisted personal interviewing (CAPI) instrument. Census Bureau staff, including instrument developers and the project management staff, will conduct internal testing of the CAPI instrument.


Interviewers will be provided with an SVS self-study which is mandatory to complete prior to initiating any interviews. Interviewer training is usually conducted a month prior to the first month of interview. This allows the interviewers time to familiarize themselves with the survey content and any special instrument functionality that is specific to conducting interviews for the SVS.


5. Contacts for Statistical Aspects and Data Collection


The Victimization Statistics Unit at BJS takes responsibility for the overall design and management of the activities described in this submission, including developing study protocols, sampling procedures, and questionnaires and overseeing the conduct of the studies and analysis of the data by contractors.


The Census Bureau is responsible for the testing of interview materials and the collection of all data. Ms. Meagan Meuchel is the NCVS Survey Director and manages and coordinates the NCVS and SVS. BJS and Census Bureau staff responsible for the SVS include:


BJS Staff:

all staff located at-

810 7th Street NW

Washington, DC 20531


Jeri Mulrow

Acting Director

Bureau of Justice Statistics

202-514-9283


Michael Planty, Ph.D.

Deputy Director

Bureau of Justice Statistics

202-514-9746


Lynn Langton, Ph.D.

Chief

Victimization Statistics Unit

202-353-3328


Rachel Morgan, Ph.D.

Statistician

Victimization Statistics Unit

202-616-1707


Jennifer Truman, Ph.D.

Statistician

Victimization Statistics Unit

202-514-5083


Census Staff:

all staff located at-

4600 Silver Hill Road

Suitland, MD 20746


Meagan Meuchel

NCVS Survey Director

Associate Directorate for Demographic Programs – Survey Operations

301-763-6593


Jill Harbison

NCVS Assistant Survey Director

Associate Directorate for Demographic Programs – Survey Operations

301-763-4285


Timothy Gilbert

Survey Statistician

Associate Directorate for Demographic Programs – Survey Operations

301-763-5436





