BJS Response to OMB Passback

National Crime Victimization Survey

OMB Control No. 1121-0111

MEMORANDUM FOR: Shelly Martinez, Office of Statistical and Science Policy

Office of Management and Budget


FROM: Lynn Langton, Statistician, Victimization Statistics Unit

Bureau of Justice Statistics


THROUGH: Michael Planty, Chief, Victimization Statistics Unit

Bureau of Justice Statistics


SUBJECT: Response to OMB Passback on Supporting Statement of the Bureau of Justice Statistics’ National Crime Victimization Survey Information Collection Review


DATE: August 5, 2012



This memo responds to the questions and comments provided by the Office of Management and Budget (OMB) after reviewing the Information Collection Review (ICR) Supporting Statement for the Bureau of Justice Statistics’ (BJS) National Crime Victimization Survey (NCVS). The changes detailed in this memo, as well as all noted editorial corrections, have been incorporated into the revised Part A and Part B sections of the Supporting Statement, which are included as attachments.


PART A


Comment 1, page 5. OMB raised concerns that, in the discussion of the redesign project on Enhanced Contextual Priming (ECP), BJS was mixing measurement error with sampling error. BJS did not intend to imply that the project was designed to address sampling error. However, OMB is correct that there are two related benefits from the ECP work: First, if priming is able to cue respondents’ memories and get them to report victimizations they would not have remembered otherwise, measurement error is reduced. Second, if the priming results in the survey picking up a greater number of unweighted victimizations, the reliability of victimization rates is increased. The supporting statement now clarifies that the project is primarily concerned with measurement error and the increased reliability of estimates, as well as with providing contextual data that can be used to examine correlates of crime.


BJS is currently testing the utility of using enhanced contextual priming (ECP) questions to trigger respondents’ memories and reduce measurement error associated with unreported victimization. Research conducted early in the history of the National Crime Survey (NCS), the predecessor to the NCVS, indicated that persons asked a set of attitudinal questions before the crime screening questions reported experiencing more crime. To the extent that this type of priming is able to elicit a greater number of victimizations and reduce measurement error, the reliability of victimization estimates is also increased. Moreover, the addition of attitudinal questions is expected to increase the analytical value of the survey by providing contextual data, which may then be used in analyses examining the correlates of crime.

When the results of the ECP project are delivered around the end of 2012 or beginning of 2013, BJS will begin to incorporate the findings from the project into other ongoing redesign work to improve the NCVS screener and crime incident report. For instance, if initial findings suggest that adding attitudinal questions on the crime screener may cue respondents’ memories and generate more reports of victimization, future instrument redesign work and testing will reflect this finding.


Comment 2, page 7. OMB requested a sentence detailing the reasons for the shift from the Ernst to the Ohlsson sampling method. We have added the following statement for clarification:


The Ohlsson method requires that the first-stage sample be selected independently, which allows for annual sampling and affords greater flexibility and efficiency in terms of producing small-area estimates and responding to sample cuts or boosts.
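To illustrate the flexibility this affords, the sketch below shows a permanent-random-number (PRN) selection in the spirit of Ohlsson's method. The unit names, frame size, and sample sizes are hypothetical placeholders, and this is not the Census Bureau's production algorithm; it simply shows why, once each unit carries a fixed random number, a sample boost or cut reduces to moving a cutoff.

```python
import random

def assign_prns(units, seed=2012):
    """Assign each first-stage unit a permanent random number (PRN) once."""
    rng = random.Random(seed)
    return {u: rng.random() for u in units}

def select_sample(prns, n):
    """Take the n units with the smallest PRNs. Because the PRNs are
    fixed, samples drawn in different years are coordinated, and a
    boost or cut amounts to moving the cutoff."""
    return sorted(prns, key=prns.get)[:n]

# Hypothetical frame of 200 first-stage units.
psus = [f"PSU-{i:03d}" for i in range(1, 201)]
prns = assign_prns(psus)

base_sample = select_sample(prns, 50)     # base annual sample
boosted_sample = select_sample(prns, 60)  # a boost of 10 units
assert set(base_sample) <= set(boosted_sample)  # the boost retains the base sample
```

Because the PRNs never change, the boosted sample nests the base sample, so added or cut cases never disturb the units already in the field.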



Comment 3, pages 7-8. OMB requested clarification on whether the 2013 NCVS Companion Study would require full OMB review or could be submitted under the generic clearance. Because of the complexity of the project and the large sample sizes, the ICR will be submitted for full OMB review, separate from the generic clearance. This has been clarified in the text of Part A.

Comment 4, page 8. OMB questioned whether the previously released NCVS data files for the largest MSAs were ever used by BJS to routinely produce victimization estimates for those areas. OMB also noted the need to highlight whether BJS was committed to producing routine estimates with the new files. While BJS published a handful of reports based on these data files, we did not routinely release victimization estimates for the MSAs. BJS is committed to assessing the reliability of the MSA data and the feasibility of producing routine estimates for these areas using two- or three-year rolling averages. We will need to assess a number of factors, such as sample sizes, sampling error, and whether post-stratification adjustments are required, before we can commit to producing regular MSA-level estimates.

Comment 5, page 9. OMB questioned whether the findings from BJS Visiting Fellow Dr. Lynn Addington’s assessment of the Crime Incident Report would be detailed in a formal report. OMB also requested that BJS explain how the impact of a new crime screener and incident report on crime rates would be tested prior to full release in the field. Dr. Addington’s research, as well as her recommendations based on the research, will be incorporated into a technical report that will ultimately be available on the BJS website. We expect that Dr. Addington’s report will be completed by September 2014. A key deliverable that Dr. Addington will produce is a process or protocol for a routine technical review of the NCVS instruments and items on a set cycle. A technical review panel will meet periodically to assess the performance and utility of existing items, give consideration to additions and deletions, and propose strategies for item construction, placement, and testing.

Prior to the initial fielding of the new instrument, the instrument will be subjected to cognitive testing, as well as an assessment of its effects on crime rates. Depending upon funding, BJS will either conduct a small-scale study of the two instruments or conduct tests with outgoing rotations of the NCVS sample. This latter decision is contingent upon Dr. Addington’s proposed protocols and BJS decisions related to the optimal number of times to reinterview households selected into the NCVS sample. Given the timing of Dr. Addington’s report, the time required to conduct the tests, and the scope of BJS’ current efforts related to the NCVS redesign and subnational estimation program, we do not expect to complete this work before 2015. Hence, the reference to providing a revised instrument to OMB in 2015 has been removed from the Supporting Statement.

Comment 6, page 10. OMB requested additional materials detailing the proposed BJS data collections on victim service agencies and persons with disabilities living in group quarters. Links to the solicitations for these two projects have been added as footnotes in the Supporting Statement and the documents are also included as attachments with this response. Both solicitations closed recently and BJS is in the process of reviewing applications and selecting awardees. Once the projects have started (probably in early 2013), BJS will be able to provide more detail on the scope and timing of the projects.

Comment 7, page 18. OMB noted that Section 4. Efforts to Identify Duplication should include a discussion on the overlap between the NCVS and the FBI’s National Incident-Based Reporting System (NIBRS). The following discussion on NIBRS has been added to that section:

The FBI’s National Incident-Based Reporting System (NIBRS) covers crimes similar to those in the NCVS (as well as a number of additional offense types) and collects basic demographic data on the age, sex, and race of persons arrested (offenders). Like the UCR, NIBRS includes only crimes known to police. It is also limited by a lack of information on the characteristics of victims and the victim response to criminal incidents. To date, 43% of law enforcement agencies report NIBRS data to the FBI.[1] The reporting agencies cover about 29% of the population of the United States, meaning that the data are not nationally representative.

Comment 8, page 22. OMB noted a discrepancy between the burden hours and respondent counts recorded in ROCIS and the numbers reported in the text of the supporting statement. These discrepancies have been corrected and a new 83-I is attached.

PART B

Comment 1, page 1. OMB requested clarification regarding the use of the term ‘household’ throughout the Part B statement and why the term ‘housing unit’ was not used instead. The term ‘housing unit’ denotes the physical structure occupied by a household. Although the NCVS sample is based on housing units, the units of analysis are the household (i.e., all persons for whom the primary place of residence is that particular housing unit) or the individual persons age 12 or older within the household. In other words, because the unit of analysis is the occupants of the housing unit, we often use the term household. The term ‘household’ does not imply that the occupants are related to one another; it simply denotes all occupants of a housing unit. Nor does the term refer to the particular persons occupying the housing unit at time-in-sample one; if persons leave the housing unit and are replaced, the new occupants pick up as the in-sample household.

To avoid confusion, the reference to ‘household’ on page 1 has been changed to ‘housing unit’ and the definitions for ‘housing unit’ and ‘household’ are clarified in a footnote at the bottom of the page. Also, the following discussion regarding replacement households was added to the text at the end of the National Sample section:

If members of a household move out and are replaced during the time that the housing unit is in sample, the new occupants are considered a replacement household and begin the interview process where the old household left off. For instance, if household members move out and are replaced after their third interview, the replacement household is interviewed for the fourth through seventh interviews. Each month, replacement households account for approximately 7% of the NCVS sample.

Comment 2, page 2. OMB questioned the utility of state versus more local area estimates of victimization. The following paragraphs were added to Part B to provide additional information regarding the anticipated outcomes and utility of the boost.

This boost in sample in the largest states will not only result in reliable 3-year state-level victimization estimates, it will also allow BJS to identify potential issues that could arise from the process of boosting state samples. Assuming the initial seven-state boost is feasible and produces state-level victimization estimates with the expected level of precision, the seven states with direct estimates could ultimately be expanded to include additional states, as well as metropolitan and other local areas within the states, when the new sample and sampling design are introduced in 2016.

More specifically, the pilot boost in the seven states is designed to achieve several purposes. First, it will provide BJS with more precise cost data related to survey administration that can then be used in determining the costs of its plans for a larger subnational program based upon a combination of direct and model-based estimates. The boosted sample in the seven states will come from areas that are outside of the current PSUs, and collecting data from the boosted sample may incur additional field costs, such as those associated with hiring and training new field representatives (FRs), travel, and other field expenses. BJS currently does not have good estimates of these costs, but once these estimates are generated at the state level, they can then be used to generate estimates of the costs associated with sampling for large cities after 2016. Second, and relatedly, the pilot boost will help determine whether there are any efficiencies to be gained over time. BJS intends to conduct the pilot study for three years for three reasons: (1) the three-year period was used in designing the amount of sample required to generate violent and property crime estimates with coefficients of variation of less than 10%; (2) a one-year pilot would not provide sufficient time to assess whether the allocation of cases among FRs can be done in ways that achieve cost efficiencies; and (3) the current cost estimates for additional sample in the seven states are sufficiently modest to allow BJS to run the test for three years.

The state-level victimization estimates to be produced from the pilot boost will also have utility for BJS in describing changes in victimization rates. First, because the seven states account for about one-third of the U.S. resident population, the crime rates in these states can be used to help decompose changes in national-level crime rates. Second, used in conjunction with the subnational work on MSAs and generic area estimates (described in Part A), BJS will be able to describe changes in crime rates arising from the largest states, 40 of the largest MSAs, and various types of generic areas. Third, the state-level victimization estimates are useful to state governments in allocating funds for victim services. Apart from state police, state funding for crime policy is limited to passing through a portion of Justice Assistance Grants to local jurisdictions and funding for programs such as victim services or domestic violence programs. State-level victimization estimates provide a basis for assessing need and informing program allocation decisions. Finally, the state pilot boost effort does not preclude BJS from making local or city-level estimates in the future. Rather, through annual sampling and the stratification plans for the 2016 sample design, BJS can select cities or MSA areas for boosts, contingent upon funding, and produce estimates for large cities as well. In deciding to boost in seven states now, rather than in a limited number of cities, BJS chose an option that will yield important cost data about survey operations that it can use in making future decisions about its subnational estimation program, as well as generate estimates having utility in their own right.



The use of three-year rolling average estimates meeting the 10% coefficient of variation standard is consistent with other federal subnational estimation programs. For example, the American Community Survey (ACS) generates estimates based on one-year, three-year, or five-year rolling averages, depending on the size of the area. The design of the pilot boost, with its goal of three-year rolling average estimates, was based on assumptions about national-level crime rates applied to the states. The state pilot boost will allow BJS to test these assumptions and identify possible gains in efficiency that could be applied to a larger set of states or could allow for fewer years of data to generate reliable estimates for a given cost.
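As a rough illustration of how the 10% coefficient-of-variation target interacts with three-year pooling, the sketch below computes the sample size required for an estimated proportion under simple random sampling inflated by a design effect. The victimization rate and design effect shown are hypothetical placeholders, not NCVS design parameters.

```python
import math

def required_n(rate, cv_target, deff=1.0):
    """Interviews needed so that CV(p_hat) <= cv_target for an estimated
    proportion `rate`; SRS variance inflated by a design effect."""
    # Under SRS, CV(p_hat) = sqrt((1 - p) / (n * p)), so
    # n = (1 - p) / (p * cv^2), scaled here by the design effect.
    return math.ceil(deff * (1 - rate) / (rate * cv_target ** 2))

violent_rate = 0.02  # hypothetical annual victimization prevalence
n_total = required_n(violent_rate, cv_target=0.10, deff=2.0)
n_per_year = math.ceil(n_total / 3)  # a 3-year rolling average pools 3 years of sample
print(f"interviews needed: {n_total} total, about {n_per_year} per year")
```

Under these assumed inputs, pooling three years cuts the required annual sample to one-third of the total, which is the logic behind designing the boost around three-year rolling averages.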


Comment 3, page 3. OMB questioned how the third-stage sample selection is completed. This is detailed on page 5 under the section ‘Stage 3: Sample Within Sample Address.’ Additional detail regarding what constitutes a household is provided within that section and under Comment 4.


Comment 4, page 5. OMB asked for clarification on whether a ‘household’ constituted a familial unit or all persons living in a housing unit, regardless of their relationship. The section has been revised to provide clarification on this point:



The last stage of sampling is done during initial contact of the sample address during the data collection phase. For the NCVS, if the address is a residence and the occupants agree to participate, there are procedures to identify all persons who live at the resident address.[2] A household roster is completed to record the name and other demographic information for all persons living at the resident address, regardless of their age or relationship to one another. Based on the household roster, every person age 12 or older who lives at the resident address is then interviewed. If a household member under age 12 turns 12 while the household is in sample, that person becomes eligible to be interviewed as well. If someone moves out (in) during the interviewing cycle, they are removed from (added to) the roster.
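To make the rostering rules concrete, the following minimal sketch applies the age-12 eligibility rule to a hypothetical roster; the names and fields are illustrative and do not reflect the Census Bureau's instrument logic.

```python
from dataclasses import dataclass

@dataclass
class Member:
    name: str
    age: int

def eligible_respondents(roster):
    """All persons are rostered regardless of age, but only those
    age 12 or older are interviewed."""
    return [m for m in roster if m.age >= 12]

roster = [Member("A", 45), Member("B", 11), Member("C", 14)]
print([m.name for m in eligible_respondents(roster)])  # ['A', 'C']

roster[1].age = 12  # member B turns 12 while the household is in sample
print([m.name for m in eligible_respondents(roster)])  # ['A', 'B', 'C']
```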

Comment 5, page 6. OMB requested clarification on whether the state boost would pull sample from the Master Address File (MAF). Yes, the intention is to select sample addresses from the MAF. The change request submitted to OMB prior to the boost will include a sampling plan from Census that more specifically details the methodology used to generate two- or three-year rolling average state estimates at an acceptable level of precision (coefficients of variation of 10% or less).

Comment 6, page 7. OMB requested additional information on what the response rates represent and what the attrition rates are over the 3-year period. The following paragraphs were added to Part B to provide explanation.

In 2011, the household response rate for the NCVS was 90 percent and the individual response rate was 84 percent. Nonresponse at the household level is known as a Type A non-interview; the Type A rate is computed as the total number of households in which no household members completed an interview, divided by the total number of households in sample minus Type B and C non-interviews (designated when housing units are vacant, demolished, under construction, or no longer existing as a housing unit for some other reason). The individual response rate is based on the annual number of interviewed persons divided by the total number of eligible respondents in responding households (households not classified as a Type A, Type B, or Type C non-interview).
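The two rates can be expressed compactly as follows. The counts in the sketch are hypothetical, chosen so the rates land near the reported 90 and 84 percent; they are not the actual 2011 NCVS totals.

```python
def household_response_rate(completed, total_in_sample, type_b, type_c):
    """Households with at least one completed interview, over households
    in sample after excluding Type B and Type C non-interviews."""
    eligible = total_in_sample - type_b - type_c
    return completed / eligible

def person_response_rate(interviewed, eligible_in_responding_households):
    """Interviewed persons over eligible persons (age 12 or older) in
    responding households."""
    return interviewed / eligible_in_responding_households

# Hypothetical counts for illustration only.
hh_rate = household_response_rate(completed=45_000, total_in_sample=53_000,
                                  type_b=2_500, type_c=500)
person_rate = person_response_rate(interviewed=75_600,
                                   eligible_in_responding_households=90_000)
print(f"household: {hh_rate:.0%}, person: {person_rate:.0%}")  # household: 90%, person: 84%
```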

BJS has requested that Census produce technical documentation for the NCVS data on an annual basis beginning with the 2011 data release. Among the items included in BJS’ outline for the technical documentation are sections on household and person response rates by demographics, mode, and time in sample, as well as attrition and retention rates over time. The Census Bureau has not yet provided BJS with this documentation but when it is available it will be shared with OMB.

Comment 7, page 8. OMB requested that a nonresponse bias analysis plan be attached as a supporting document. We do not currently have a nonresponse bias analysis plan in place. BJS has requested that Census draft a nonresponse bias analysis plan for the 2013 data collection and that plan will be shared with OMB as soon as it is available.




Attachments.

[1] Details on NIBRS reporting are available through the Justice Research and Statistics Association (JRSA) Resource Center at http://www.jrsa.org/ibrrc/background-status/nibrs_states.shtml (last accessed July 31, 2012).

[2] The criteria for separate living quarters are that the occupants must live separately from any other individuals in the building and have direct access from outside the building or through a common hall or entry.
