Census of Federal, State, and Local Law Enforcement Agencies, 2014

Supporting Statement Part B

OMB: 1121-0346


B. Statistical Methods


1. Universe and Respondent Selection


The universe for the 2014 Census will consist of all State and local law enforcement agencies operating in the U.S. that are publicly funded and employ at least one full-time sworn officer with general arrest powers, as well as all Federal law enforcement agencies employing personnel authorized to carry a firearm and make arrests. These agencies include state police/highway patrol agencies, county police departments, and sheriffs' offices, as well as regional, municipal, tribal, and special purpose police departments. In addition, law enforcement agencies operated by the executive, judicial, and legislative branches of government, as well as those under the Department of Homeland Security and the Department of Justice, are within the scope of the Census universe.


The 2014 Census universe was developed through a series of steps. The initial universe list for the state and local portion of the 2014 Census was developed in conjunction with another BJS project, the Law Enforcement Agency Identifiers Crosswalk (Crosswalk). The Crosswalk is a master list of all state and local law enforcement agencies that contains data elements, such as agency identifiers, geographic location information, and ORI numbers, that enable linkages across databases developed by BJS and by other entities. In 2011, BJS conducted a competitive solicitation to select a vendor to update the 2005 Crosswalk. The National Archive of Criminal Justice Data (NACJD) was awarded the Crosswalk project based on its knowledge of and experience with prior versions of the Crosswalk.


The NACJD used four data sources to compile the 2011 Crosswalk: (1) BJS' 2008 Census of State and Local Law Enforcement Agencies, (2) the law enforcement component of the 2011 NCIC Originating Agency Identifier (ORI) file, (3) the 2010 Uniform Crime Reports (UCR) Offenses Known and Cleared by Arrest file, and (4) the National Directory of Law Enforcement Administrators (Directory). The law enforcement component of the NCIC file is a non-public database maintained by the FBI consisting of a list of all ORI numbers, agency names, and UCR state and county codes for every ORI number issued to a law enforcement agency. Due to the nature of reporting to the NCIC, law enforcement agencies may have multiple ORI numbers associated with a single organization. Therefore, the NCIC database is not a list of reporting entities, but an exhaustive list of ORI numbers, with multiple ORI numbers in some cases representing a single law enforcement agency. The UCR Offenses Known and Cleared by Arrest file is also produced by the FBI and contains the name and ORI number of each law enforcement agency that has ever submitted data to the UCR program. The Directory is a private vendor list of law enforcement agencies that includes the agency name and point of contact for federal, state, local, and special purpose law enforcement agencies.


The NACJD appended information from each of the four sources listed above to create a merged database. The resulting database included 100,314 records: 17,985 from the 2008 Census, 44,783 from the NCIC file, 21,771 from the UCR file, and 15,775 from the Directory. NACJD then matched duplicate records in the merged database to create a file in which each agency appeared only once. Unique agency records were identified by flagging all records in the 100,314-record list with duplicate names and/or ORI numbers. The matching process resulted in a database consisting of 49,088 records. Each record was reviewed multiple times using automated and manual checks to ensure that the source files were combined correctly. In instances where an agency had multiple ORI numbers (e.g., a unique ORI for each sub-station), a single ORI was flagged as "primary." Secondary ORI numbers were retained in the file and noted as duplicate numbers associated with a primary agency. Only primary agencies were included in the Census universe list.
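For illustration only, the sketch below shows the general shape of such a matching and primary-flagging step. It matches on a normalized agency name; the actual NACJD process also matched on ORI numbers and relied on multiple automated and manual review passes, and the field names used here (agency_name, ori) are hypothetical.

```python
# Illustrative sketch only -- NACJD's production matching used names and/or
# ORI numbers plus manual review. Field names are hypothetical.
from collections import defaultdict

def flag_primary_agencies(records):
    """Group records sharing a normalized agency name, keep one record per
    agency, flag its ORI as primary, and retain the rest as secondary."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec["agency_name"].strip().upper()].append(rec)

    universe = []
    for recs in groups.values():
        primary = dict(recs[0])                      # one record per agency
        primary["primary_ori"] = primary.pop("ori")  # flagged as "primary"
        # Secondary ORIs (e.g., sub-station ORIs) are retained, not counted.
        primary["secondary_oris"] = [r["ori"] for r in recs[1:]]
        universe.append(primary)
    return universe

merged = [
    {"agency_name": "Smallville Police Dept", "ori": "XX0010000"},
    {"agency_name": "Smallville Police Dept", "ori": "XX0010100"},  # sub-station
    {"agency_name": "Smallville County SO",  "ori": "XX0020000"},
]
print(len(flag_primary_agencies(merged)))  # -> 2 primary agency records
```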


Once the four source files were combined and the initial universe list was developed, a state-level spreadsheet was generated that listed all identified agencies in each state. These state-level lists were then sent to each state's Statistical Analysis Center or Uniform Crime Reporting Program and to the state law enforcement training posts for verification of the completeness and accuracy of the agency list. Information from state-level respondents regarding additions, deletions, or modifications was incorporated into the next version of the universe list. The work done by the NACJD resulted in a universe file containing 26,085 state and local law enforcement agencies.


Through a competitive award, NORC is the data collection organization for the 2014 Census. To create the Census roster, NORC began with the 26,085-agency state and local universe file created by the NACJD. NORC requested a list of current law enforcement agencies from each state's Peace Officer Standards and Training (POST) council and then compared the agencies on the POST list with those on the NACJD universe file (a simplified sketch of this comparison follows below). Agencies currently in operation were retained in the Census universe. NORC attempted to contact agencies on the NACJD list that were not on the POST list in order to verify their eligibility for participation in the Census. NORC also enlisted one of the project's subcontractors, the International Association of Chiefs of Police (IACP), to work with POST offices to verify within-scope special jurisdiction agencies. The IACP was also tasked with working with the Bureau of Indian Affairs to obtain a current list of all tribal police agencies. In working through the lists provided by other agencies, NORC was able to reduce the Census universe to roughly 19,000 agencies. NORC is continuing to validate the eligibility of agencies in the Census universe, with increased attention focused on special jurisdiction agencies.
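As a minimal illustration of this list comparison (the ORI-style identifiers are invented for the example, and the real verification involved direct contact with agencies, not just a set operation):

```python
# Hypothetical sketch of the roster comparison. Agencies appearing on both
# lists are retained in the universe; agencies only on the NACJD file are
# queued for contact to verify eligibility. Identifiers are invented.
nacjd_universe = {"XX0010000", "XX0020000", "XX0030000"}
post_list = {"XX0010000", "XX0020000"}

retained = nacjd_universe & post_list   # currently operating, kept as-is
to_verify = nacjd_universe - post_list  # contacted to confirm eligibility

print(sorted(retained))   # ['XX0010000', 'XX0020000']
print(sorted(to_verify))  # ['XX0030000']
```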



2. Procedures for Information Collection

Data collection for the 2014 Census will involve a series of mailings and non-response follow-up activities, emphasizing questionnaire completion via a secure web-based reporting system. First, all agency heads will be mailed a pre-notification letter announcing the start of the 2014 Census data collection (see Attachments 12 and 13). This letter will provide the link to the web survey, the agency's unique login information, and a notification that the web survey is available for use and early reporting. As was done in the 2004 and 2008 administrations of the Census, this mailing will contain a letter from the Director of BJS and, depending on the type of agency, a letter of support from either the International Association of Chiefs of Police (IACP) (see Attachment 14) or the National Sheriffs' Association (NSA) (see Attachment 15).


In collecting the 2014 Census data, BJS will use a multi-mode approach in which respondents will be directed to a web-based format as the primary mode of data collection (see Attachment: Screen Shots – State & Local, Federal). A web-based collection is the preferred means to increase response rates, expedite the data collection process, simplify data verification, and facilitate report preparation. In 2004, 30% of law enforcement agencies responded to the Census questionnaire through the web-based option; in the 2008 Census administration, this percentage increased to 42%. Due to the increased web-based capabilities of law enforcement agencies and the project's strong encouragement to respond using the web-based data collection tool, BJS expects that a majority of the agencies responding to the 2014 Census will use the web-based option.


In order to maximize efficiency and decrease costs over the long term, BJS proposes to emphasize a web-based collection by providing the web URL, without the paper instrument, in the initial contact. This initial contact will state that a paper instrument will be provided to any agency that prefers to complete the survey by mail and requests one through the toll-free number or project e-mail address. To assure a high response rate, respondents who neither complete the web-based questionnaire nor request a mail version will still receive subsequent contacts, including the paper version and telephone follow-ups as needed.


There is some ambiguity in the experimental literature about whether researchers should offer a choice of web-based and paper modes concurrently or sequentially. Studies have found that offering a concurrent choice of mode does little to improve response rates (Gentry and Good, 2008). Millar and Dillman (2011) found among college students that the concurrent approach neither improved nor significantly diminished response rates, but that a sequential approach of web then mail did improve response. Medway and Fulton (2012) found in their meta-analysis, which included 19 experimental comparisons, that offering concurrent web-based and mail surveys reduced response rates. It should be noted that these results are from individual-level surveys, many conducted among college student samples, and not from establishment surveys.


Approximately one week after the pre-notification letter mailing, all respondents who did not complete the web survey in response to the pre-notification letter will be mailed a survey invitation letter (see Attachments 16 and 17). This letter will convey the importance of the agency's participation in the 2014 Census and of timely submission. The letter will also include instructions for accessing the web questionnaire and the agency's unique PIN and password. The 2014 Census invitation letter will provide the URL for completing the web survey but will not include a hardcopy instrument, in order to increase the proportion of respondents using the online reporting system. Respondents who prefer a hardcopy questionnaire will, however, be able to request one by contacting the project's toll-free number or e-mail address.


Approximately one month after the survey invitation letter, a reminder postcard will be sent to all non-responding agencies (see Attachments 18 and 19). This postcard will continue to encourage non-responders to complete the questionnaire via the web. It will contain the 2014 Census toll-free number and e-mail address so that respondents can contact NORC with questions or if they need assistance.


Approximately two months after the survey invitation mailing, non-respondents will be mailed a questionnaire packet via first-class U.S. mail (see Attachments 20 and 21). The questionnaire packet will contain an introductory cover letter from NORC, a personalized hardcopy questionnaire, and a pre-paid business-reply envelope. The cover letter will again convey the importance of the 2014 Census and the agency’s participation. It will include instructions for submitting the completed questionnaire via the web, as well as mailing the completed questionnaire back to NORC in the enclosed BRE, faxing each page of the questionnaire to NORC, or e-mailing a scanned copy of the questionnaire to NORC.


A series of follow-up mailings and/or telephone prompts will continue through the data collection period. For example, non-respondents will be sent replacement questionnaires and mass fax/e-mail blasts. All printed communication will convey the importance of the survey and contain the project e-mail address and toll-free number for the respondent. NORC will also conduct a series of telephone prompts to non-responding agencies (see Attachments 22 and 23). If needed, personal telephone interviews will be conducted with non-respondents. During the final weeks of data collection, a postcard will be mailed to any non-responding agencies alerting them to the scheduled data collection end date (see Attachment 24). This last-chance contact has been used in previous studies and serves to motivate non-responders who have not yet completed and returned the survey.


In addition to follow-up contacts to non-respondents made by NORC, the IACP and NSA may reach out to hard-to-reach respondents to encourage completion of the survey in a timely manner.


Upon receipt, each questionnaire will be reviewed and edited, and if needed, the data provider will be contacted to clarify responses or provide missing information (see Attachment 25). Prior to contacting the respondent, NORC staff will aim to address data inconsistencies via BJS-approved editing specifications. NORC also will ensure that responses fall within the proper coding schemes specified by BJS. The following is a summary of the data quality assurance steps that NORC will observe during the data collection and processing period:


Data Editing. NORC will attempt to reconcile missing or erroneous data through a manual edit of each questionnaire. In collaboration with BJS, NORC will develop a list of manual edits that can be completed by referring to other data provided by the respondent on the survey instrument. For example, if a screening question was left blank, but the follow-up questions were completed, a manual edit could be made to indicate the intended positive response to the screening question. Through this process, NORC can quickly identify which hardcopy cases require follow up and indicate the items that need clarification or retrieval from the respondent.
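As a hedged illustration of such a logical edit rule (the item names q2_screener and q2a–q2c are invented for the example and are not the actual instrument codes), a minimal sketch:

```python
# Illustrative logical edit: if a screening question is blank but its
# follow-up items were answered, infer the intended positive response on
# the screener and flag the edit. Item names are hypothetical.
YES, BLANK = 1, None

def edit_screener(case):
    """Apply one manual edit rule to a single questionnaire record."""
    followups = [case.get("q2a"), case.get("q2b"), case.get("q2c")]
    if case.get("q2_screener") is BLANK and any(v is not BLANK for v in followups):
        case["q2_screener"] = YES  # intended positive response to the screener
        case.setdefault("edit_flags", []).append("q2_screener:logical_edit")
    return case

record = {"q2_screener": BLANK, "q2a": 5, "q2b": BLANK, "q2c": 2}
print(edit_screener(record)["q2_screener"])  # -> 1 (YES)
```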


Data Retrieval. When it is determined that additional data retrieval is needed, a data collection specialist will contact the data provider for clarification. Throughout the data retrieval process, NORC will document the questions needing retrieval (e.g., missing or inconsistent data elements), request clarification on the provided information, obtain values for missing data elements, and examine any other issues related to the respondent's submission.


Data Entry. Respondents completing the survey via the web instrument will enter their responses directly into the online instrument. For those respondents returning the survey via hardcopy (mail or fax), data will be entered as responses are received and determined complete. Once the data have been entered into the database, they will be made available to BJS via an SFTP site. To confirm that editing rules are being followed, NORC will review frequencies for the entered data after the first 10 percent of cases are keyed. NORC will also review frequencies from web survey responses. Any issues will be investigated and resolved. Throughout the remainder of the data collection period, NORC staff will conduct regular data frequency reviews to evaluate the quality and completeness of data captured in both the data entry and web modes.
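A minimal sketch of the kind of frequency review described above, assuming a hypothetical coding scheme (the item names and valid codes are invented for the example):

```python
# Tabulate keyed values for an item and flag any codes that fall outside
# the specified coding scheme. Scheme and item names are hypothetical.
from collections import Counter

VALID_CODES = {"q1": {1, 2}, "q3": {1, 2, 3, 8, 9}}  # assumed coding scheme

def frequency_review(cases, item):
    freqs = Counter(case.get(item) for case in cases)
    out_of_range = {v: n for v, n in freqs.items()
                    if v is not None and v not in VALID_CODES[item]}
    return freqs, out_of_range

keyed = [{"q1": 1}, {"q1": 2}, {"q1": 7}]   # 7 is not a valid q1 code
freqs, bad = frequency_review(keyed, "q1")
print(freqs)  # Counter({1: 1, 2: 1, 7: 1})
print(bad)    # {7: 1} -> investigate and resolve
```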



3. Methods to Maximize Response


The design of the 2014 Census questionnaire is consistent with current leading research on survey design, as presented in Dillman, Smyth, & Christian (2010). This research includes several design elements intended to increase the ease of reading and understanding the questionnaire. First, related questions are grouped together in topical sections. In addition, the survey instrument begins with the most salient items. Questions and instructions are presented in a consistent manner on each page to allow respondents to comprehend question items more readily. Proper alignment and vertical spacing are also used to help respondents mentally categorize the information on the page and to aid in a neat, well-organized presentation.


Although all of the core items from the 2008 Census are included in the proposed version of the instrument, the format of several survey items has been improved to address concerns about measurement issues and burden. For example, items with instructions to "mark all that apply" were redesigned to include "yes" and "no" responses in order to better distinguish between missing data and a measured absence of an affirmative response. In addition, longer questions on the 2008 version (e.g., Q1 – functions performed) were restructured so that subsections of an item become separate questions in the 2014 version. Specifically, the first item on the 2008 questionnaire had eight subsections, with a total of 54 responses; while the information requested remains unchanged, the structure of the question was modified from a single item to five separate questions, with none of the items exceeding 12 responses. Other examples of these changes include the addition of new response options based on focus group input or frequently used responses under the "other-specify" option in 2008, and the splitting of response options to obtain more detailed information.


As is standard for BJS surveys, the expected response rate is 90 percent or more. The 2004 Census yielded a response rate of 99.98 percent, and the 2008 Census reached 99.6 percent.


As described in the data collection section, NORC will utilize a variety of techniques to encourage high response rates. In particular, NORC will provide multiple response options, including a web-based questionnaire and return via mail or fax. The follow-up plan includes letters and telephone prompts. Mailing attempts will include a letter of support from the IACP or NSA, which will convey the importance of the data collection and the need for response. Additionally, the IACP and NSA will reach out to a selected group of offices to prompt their completion and return of the survey. NORC will also complete telephone interviews as needed. Finally, NORC will monitor a project-specific e-mail address and toll-free number, both of which allow respondents to contact NORC with any questions.


While the majority of respondents will participate after one of the aforementioned contact attempts, a small percentage of respondents will refuse to complete the survey for various reasons. For these respondents, NORC will develop an instrument designed to capture data for the most critical items. Although not ideal, reducing the number of items and the respondent burden will allow for collection of the most important data items from those respondents who otherwise could not complete the entire survey because of time and/or reporting constraints.


Despite the best efforts made during data collection, some data will not be collected. There are two major types of nonresponse: "unit," when no data are collected for a law enforcement agency, and "item," when some questions are answered but others are left unanswered. Unit nonresponse is best handled by adjusting the weights to compensate for the reduced number of responding agencies and to reduce some of the nonresponse biases that this type of nonresponse introduces. The final weight will be the base weight (which is 1 for all law enforcement agencies, since the Census is a complete enumeration) adjusted for nonresponse through a weighting class adjustment approach. Item nonresponse is often handled by imputation, that is, filling in the missing items based on what can be inferred about their unrecorded values from other recorded items together with the stochastic structure that relates the different data elements. NORC plans to use primarily the hot-deck method for imputation to ensure a complete, quality data file.
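As a minimal sketch of a weighting class adjustment under these conditions, respondents in class c receive a weight equal to the number of eligible agencies in c divided by the number of responding agencies in c. The class variable used below (agency type) is an assumed example, not BJS's actual class definition:

```python
# Weighting class nonresponse adjustment, assuming base weights of 1.
# Within each class c: w = eligible(c) / responded(c) for respondents.
# The class variable here is an assumed example, not the actual design.
from collections import Counter

def adjust_weights(frame):
    eligible = Counter(a["wt_class"] for a in frame)
    responded = Counter(a["wt_class"] for a in frame if a["responded"])
    for a in frame:
        c = a["wt_class"]
        a["weight"] = eligible[c] / responded[c] if a["responded"] else 0.0
    return frame

frame = [
    {"wt_class": "sheriff", "responded": True},
    {"wt_class": "sheriff", "responded": True},
    {"wt_class": "sheriff", "responded": False},
]
adjust_weights(frame)
print([a["weight"] for a in frame])  # [1.5, 1.5, 0.0]
```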


All quantitative variables will be checked for missing values and subjected to imputation if not logically skipped. Flags will be set to indicate where imputations have occurred, with the values of the flags indicating the nature of the imputation. NORC will develop a set of checks for monitoring the imputations. The imputation procedures will be fully documented, with imputation workload and donor use statistics included. NORC will work collaboratively with BJS to develop and implement an imputation plan for the 2014 Census that is both cost-efficient and suitable for the project.
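A hedged sketch of hot-deck imputation with flags, along the lines described above (the imputation cell and item names are hypothetical; the production plan will be developed jointly with BJS):

```python
# Illustrative hot-deck imputation: fill a missing quantitative item with a
# value from a randomly chosen donor in the same imputation cell, and set a
# flag recording the nature of the imputation. Names are hypothetical.
import random

def hot_deck(cases, item, cell_var, rng=random.Random(2014)):
    donors = {}
    for c in cases:                       # pool donors with reported values
        if c.get(item) is not None:
            donors.setdefault(c[cell_var], []).append(c[item])
    for c in cases:
        if c.get(item) is None and donors.get(c[cell_var]):
            c[item] = rng.choice(donors[c[cell_var]])
            c[f"{item}_flag"] = "hot_deck"  # flag value notes imputation type
    return cases

cases = [
    {"cell": "small_pd", "ft_sworn": 12},
    {"cell": "small_pd", "ft_sworn": 9},
    {"cell": "small_pd", "ft_sworn": None},  # recipient
]
hot_deck(cases, "ft_sworn", "cell")
print(cases[2])  # imputed ft_sworn value plus ft_sworn_flag == 'hot_deck'
```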



4. Test of Procedures or Methods

Pilot Test Methodology


The IACP and NSA provided NORC with recommendations for potential pilot test respondents. The IACP recommended five offices and the NSA recommended four. These offices were located in Duluth, Minnesota; Arlington, Virginia; Greenbelt, Maryland; Parker, Arizona; Wheaton, Illinois; Minneapolis, Minnesota; Forsyth, Georgia; Pierre, South Dakota; and Atlanta, Georgia. The goal of the pilot test was to evaluate new or revised questions and response options for the 2014 Census to ensure that the questions are clear, the requested information is available, and the response options are comprehensive.


Description of Process


The IACP and NSA began recruiting agencies to participate in the pilot test the week of May 6, 2013. NORC developed an invitation letter that the IACP and NSA could use to aid in the recruitment efforts. These letters provided a brief overview of the pilot test and requested that the agencies confirm their interest in participating. Once the agencies had confirmed their participation, NORC mailed questionnaire packets to them. The packets included a cover letter from NORC, a draft copy of the questionnaire, and a postage-paid return envelope for the respondent to return the completed survey. NORC requested completion and return of the pilot instrument by May 29, 2013. NORC sent reminder e-mails or made reminder phone calls to those law enforcement agencies that had not returned their hardcopy surveys by that date.


Upon receipt of each completed survey, NORC contacted the respondent to schedule a debriefing interview. The purpose of the debriefing interview was to discuss the clarity of the questions, the completeness of the response choices, the overall ease of completing the questionnaire, and the amount of time it took to complete the survey. NORC also probed respondents regarding missing or inconsistent items, out-of-range responses, and items not completed according to directions. A copy of the debriefing script is attached (see Attachment: Pilot Test Report).


Participants tested a 34-item instrument that included previous Census items and new topical items on emerging issues in law enforcement. The estimated time to complete the draft instrument ranged from 15–20 minutes to 4 hours (over the course of a few days). While the majority of respondents indicated that the requested information was readily available, it was at times necessary to contact other individuals in order to complete certain items. Completion time correlated with agency size and with whether respondents had to consult co-workers to gather information.


With respect to some of the new content proposed for the 2014 Census, respondents indicated that large agencies may have to contact administrative personnel to provide specific information. BJS removed 13 items from the draft version of the instrument in order to reduce the burden associated with the new content. It was decided that many of the removed items were more appropriate for the LEMAS data collection. The resulting version of the instrument was reduced to 21 items, with the majority of items containing "yes/no" response options.


Overall, the respondents did not report significant problems with the clarity of the questions or the response options. Some respondents did provide feedback on specific items, which can be found in the attached Pilot Test Report. Respondent suggestions often focused on response options and how they might be interpreted. In response, BJS either eliminated the item from the current draft or revised the language in order to clarify responses. For example, based on the pilot test respondents' feedback, the following modifications were made:


Item 15: Two respondents were unable to clearly distinguish between the terms "Single duty area" and "Multiple duty areas" and were therefore unsure how best to sort responses among categories a, d, and e. They noted that law enforcement duties often include subcategories or specializations; for example, officers may specialize as bomb technicians or investigators. In response, BJS added "PRIMARILY" to the question text (Enter the number of FULL-TIME SWORN personnel that worked PRIMARILY in each of the following . . .). Also, the response categories were modified in an attempt to improve clarity. For example, the response "SINGLE duty area – law enforcement duties" was revised to "Law enforcement duties only."



5. Consultation Information


The Law Enforcement Statistics Unit of BJS takes responsibility for the overall design and management of the activities described in this submission, including data collection procedures, development of the questionnaires, and analysis of the data. BJS contacts include the following:


Andrea M. Burch

Statistician

Bureau of Justice Statistics

810 Seventh St., NW

Washington, DC 20531

(202) 307-1138


Michael G. Planty, Ph.D.

Acting Chief, Law Enforcement Statistics Unit

Bureau of Justice Statistics

810 Seventh St., NW

Washington, DC 20531

(202) 514-9746


William J. Sabol, Ph.D.

Acting Director

Bureau of Justice Statistics

810 Seventh St., NW

Washington, DC 20531

(202) 514-1062


The project director at NORC is:

David Herda

Senior Survey Director II

NORC at the University of Chicago

55 East Monroe St., 30th Floor

Chicago, IL 60603

(312) 759-5086


