B. Statistical Methods
1. Universe and Respondent Selection
Based on the preferred sampling plan, the Bureau of Justice Statistics (BJS) has allocated funds to collect data from approximately 1,600 agencies across the country. This will include all agencies serving 4-year campuses with 2,500 or more students and 2-year colleges with 10,000 or more students. In addition to conducting a census of these groups, BJS will survey a representative sample of agencies serving 4-year campuses with 1,000 to 2,499 students and 2-year campuses with 2,500 to 9,999 students. The 4-year portion of the data collection is expected to consist of 1,133 agencies and the 2-year portion of 466 agencies.
According to Department of Education enrollment data for 2008, 4-year schools with 1,000 or more students account for more than 97% of all students attending 4-year schools nationwide. The enrollment cutoff of 2,500 for 2-year schools provides similar coverage (92% of all 2-year students). These data are summarized in the table below. Respondents will be selected from a universe consisting of all non-profit institutions, public and private, that report crime data to the U.S. Department of Education in compliance with the mandates of the Clery Act; the most recent data available will be used to determine eligibility status.
Postsecondary institutions and students by enrollment category

4-year schools (N=2,197)

| Student enrollment | Number of institutions | Percent of institutions | Percent of students enrolled |
|---|---|---|---|
| 15,000 or more | 201 | 9.1% | 47.4% |
| 10,000-14,999 | 127 | 5.8% | 14.1% |
| 5,000-9,999 | 261 | 11.9% | 16.5% |
| 2,500-4,999 | 362 | 16.5% | 11.4% |
| 1,000-2,499 | 524 | 23.9% | 7.9% |
| Under 1,000 | 722 | 32.8% | 2.6% |

2-year schools (N=1,313)

| Student enrollment | Number of institutions | Percent of institutions | Percent of students enrolled |
|---|---|---|---|
| 15,000 or more | 106 | 8.1% | 35.4% |
| 10,000-14,999 | 93 | 7.1% | 16.7% |
| 5,000-9,999 | 241 | 18.4% | 25.4% |
| 2,500-4,999 | 251 | 19.1% | 14.0% |
| 1,000-2,499 | 245 | 18.7% | 6.3% |
| Under 1,000 | 377 | 28.7% | 2.1% |
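As a quick consistency check, the coverage figures cited above (roughly 97% of 4-year students above the 1,000-student cutoff and 92% of 2-year students above the 2,500-student cutoff) can be reproduced by summing the enrollment shares in the table. The short Python sketch below does only that; the category labels come from the table, and the script itself is purely illustrative:

```python
# Percent of students enrolled, by enrollment category (from the table above).
FOUR_YEAR = {"15,000+": 47.4, "10,000-14,999": 14.1, "5,000-9,999": 16.5,
             "2,500-4,999": 11.4, "1,000-2,499": 7.9, "Under 1,000": 2.6}
TWO_YEAR = {"15,000+": 35.4, "10,000-14,999": 16.7, "5,000-9,999": 25.4,
            "2,500-4,999": 14.0, "1,000-2,499": 6.3, "Under 1,000": 2.1}

def coverage(shares, excluded):
    """Percent of students covered after dropping the excluded categories."""
    return sum(pct for cat, pct in shares.items() if cat not in excluded)

# The 4-year cutoff of 1,000 students excludes only the smallest category.
print(coverage(FOUR_YEAR, {"Under 1,000"}))               # 97.3 -> ~97%
# The 2-year cutoff of 2,500 students excludes the two smallest categories.
print(coverage(TWO_YEAR, {"Under 1,000", "1,000-2,499"}))  # 91.5 -> ~92%
```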
a. Eligibility
A minimum enrollment requirement of 1,000 students for 4-year campuses and 2,500 students for 2-year campuses will be used. Campuses with enrollments below these thresholds will be considered out of scope for the survey. The primary reason for this approach is to maximize cost efficiency. The proposed minimum enrollment threshold of 1,000 for 4-year schools achieves 97% coverage of all 4-year students enrolled nationwide and avoids the additional costs of sampling from the more than 700 campuses with enrollments under 1,000. Because these smallest schools account for only 3% of all 4-year students, including them would add very little to the body of knowledge about campus law enforcement. Similar logic applies to 2-year schools; using a minimum enrollment threshold of 2,500 to determine eligibility excludes roughly 600 campuses that together account for just 8% of 2-year school enrollment.
To be included in the survey, a campus must meet the minimum enrollment threshold and operate its own police department employing sworn police officers or its own security department employing nonsworn security officers. Agencies that serve multiple campuses will be treated as a single respondent. Campuses determined through initial screening to rely on contractual arrangements with external law enforcement agencies or private security firms for their law enforcement services will not be asked to complete the survey.
BJS will identify the agencies that are potentially out-of-scope by categorizing the position titles included on the list of officials reporting Clery Act data. Any official who appears to be affiliated with an office or agency that is not a campus law enforcement agency will be contacted to confirm this out-of-scope status. BJS estimates that about 15% of campuses within the enrollment parameters of the survey will be out-of-scope because they do not operate their own law enforcement agency.
BJS will collect the results of the eligibility screening and report the percentage of campuses contacted that operate their own police or security agencies. For those that do not, the types of other arrangements in place (local law enforcement agency, private security, etc.) will be reported by size of campus, public vs. private status, and 2-year vs. 4-year programs.
Beyond the very small percentage of all students that smaller campuses represent, we also expect the eligibility rate of campuses to decline substantially at these smaller size levels. The proportion of private schools increases dramatically in the smaller enrollment categories: more than 90% of 4-year campuses with fewer than 1,000 students are private, compared with 14% of those enrolling 15,000 or more students. Our prior surveys show that smaller, private campuses are also more likely to outsource their security functions.
BJS will also exclude for-profit institutions from the survey. These schools typically do not operate in a campus environment and are likely covered by private security agreements encompassing the private buildings in which their classes are held. Some of these institutions also operate primarily online, which makes it difficult to determine the number of students who use actual classroom facilities and may eliminate the need for campus law enforcement services altogether. BJS does not believe that excluding the smaller campuses or for-profit institutions will adversely affect its ability to produce conclusive national-level data about campus law enforcement.
Respondents will be identified through the Clery Act contact file provided to BJS by the Department of Education. Each institution is identified in the file by name and IPEDS ID. The file also includes the name, title, address, and phone number of the campus official responsible for submitting Clery Act crime data. BJS has obtained email addresses for all campuses with IACLEA membership and will obtain email addresses for the others through web searches and, finally, through phone calls if necessary. With reporting mandated as a condition of receiving federal funding, the Clery Act file is a very complete listing. This has been confirmed by David Bergeron of the Department of Education (DOE). DOE will provide a list of the few non-compliant institutions, if any, that fall within the scope of the BJS survey.
b. Sample agencies
BJS has adopted a strategy that balances cost, coverage, and user utility by collecting data from a census of agencies serving larger campuses and a sample of those serving smaller campuses.
The current plan is to conduct a census of agencies serving 4-year campuses with 2,500 or more students and 2-year campuses with 10,000 or more students. We will draw a sample of agencies serving campuses that fall below these thresholds but meet the minimum enrollment requirements.
While sampling is being used for smaller campuses, BJS believes it is important to conduct a census of the 4-year campuses with 2,500 or more students and the 2-year campuses with 10,000 or more students. BJS will create an online analytical tool from the survey data and will give users the opportunity to create custom tables for any schools they choose in the larger enrollment categories. BJS has historically conducted this survey at an 80% or higher response rate using primarily in-house resources; we expect this rate to be higher in 2011 due to the employment of a data collection agent.
To further maximize the resources available for the data collection and to reduce respondent burden, the law enforcement agencies included in the survey will receive either the “long” 12-page form or the “short” 8-page form based on their size as it relates to their expected involvement in various types of programs. The projected number of eligible agencies receiving each form is as follows (a simple encoding of this assignment rule appears after the list):
1. Long form agencies
Serves a 4-year campus with 5,000 or more students (N=547)
Serves a 2-year campus with 10,000 or more students (N=188)
2. Short form agencies
Serves a 4-year campus with 1,000 - 4,999 students (N=586)
Serves a 2-year campus with 2,500 - 9,999 students (N=278)
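The assignment rule reduces to a pair of enrollment thresholds per campus type. A minimal sketch follows; the function and its argument names are ours, not part of the survey specification:

```python
from typing import Optional

def questionnaire_form(level: str, enrollment: int) -> Optional[str]:
    """Assign the 12-page long form or 8-page short form using the
    thresholds listed above; returns None below the enrollment floor."""
    if level == "4-year":
        if enrollment >= 5000:
            return "long"
        if enrollment >= 1000:
            return "short"
    elif level == "2-year":
        if enrollment >= 10000:
            return "long"
        if enrollment >= 2500:
            return "short"
    return None  # out of scope by enrollment

assert questionnaire_form("4-year", 6200) == "long"
assert questionnaire_form("2-year", 3000) == "short"
assert questionnaire_form("4-year", 800) is None
```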
The sampling estimations used by BJS in planning the survey are in the table below.
The table illustrates the different assumptions used to reach the sample size for this survey. Estimates of sample size are based on a margin of error of 5% around the estimate with 95% confidence. The leftmost column breaks down schools by form (long or short), by size, by public/private status, and by 2-year/4-year status. Strata marked with a “(C)” are those for which we will conduct a census; strata marked with “(S)” will be sampled. The next two columns give estimates of the proportion of each stratum that will be eligible/in-scope (based on prior surveys and a review of the Clery Act contact file) and the proportion that will respond (based on prior surveys and expected improvement in response rates due to the greater time resources of the hired data collection agent). The fourth column shows the raw population size of each stratum, which includes all schools without respect to eligibility; the fifth column shows the estimated eligible population, calculated by multiplying the raw population size by the estimated eligibility rate.
Sample estimations for margin of error < 0.05

Scenario A assumes estimated proportions of 25%, 20%, and 15% (or their complements) for the three sampled strata; scenario B conservatively assumes 50%. Both target a 5% margin of error at 95% confidence. (C) = census stratum; (S) = sampled stratum.

| Campus type | Est. eligibility rate | Exp. response rate | Raw pop. size | Est. eligible pop. | A: Raw sample | A: Eligible sample | A: Completes | B: Raw sample | B: Eligible sample | B: Completes |
|---|---|---|---|---|---|---|---|---|---|---|
| **Agencies receiving long-form questionnaires** | | | | | | | | | | |
| 4-year public, 10,000+ (C) | 95% | 90% | 267 | 254 | 267 | 254 | 228 | 267 | 254 | 228 |
| 4-year public, 5,000-9,999 (C) | 95% | 90% | 152 | 144 | 152 | 144 | 130 | 152 | 144 | 130 |
| 4-year private, 10,000+ (C) | 90% | 90% | 67 | 60 | 67 | 60 | 54 | 67 | 60 | 54 |
| 4-year private, 5,000-9,999 (C) | 90% | 90% | 99 | 89 | 99 | 89 | 80 | 99 | 89 | 80 |
| 2-year, 10,000+ (C) | 90% | 90% | 209 | 188 | 209 | 188 | 169 | 209 | 188 | 169 |
| Long-form total | | | 794 | 735 | 794 | 735 | 661 | 794 | 736 | 661 |
| **Agencies receiving short-form questionnaires** | | | | | | | | | | |
| 4-year public, 2,500-4,999 (C) | 85% | 90% | 119 | 101 | 119 | 101 | 91 | 119 | 101 | 91 |
| 4-year public, 1,000-2,499 (C) | 80% | 80% | 70 | 56 | 70 | 56 | 45 | 70 | 56 | 45 |
| 4-year private, 2,500-4,999 (C) | 85% | 90% | 253 | 215 | 253 | 215 | 194 | 253 | 215 | 194 |
| 4-year private, 1,000-2,499 (S) | 80% | 80% | 447 | 358 | 267 | 214 | 171 | 306 | 245 | 196 |
| 2-year, 5,000-9,999 (S) | 85% | 80% | 249 | 212 | 182 | 155 | 124 | 212 | 180 | 144 |
| 2-year, 2,500-4,999 (S) | 70% | 80% | 232 | 162 | 175 | 123 | 98 | 216 | 151 | 121 |
| Short-form total | | | 1,370 | 1,104 | 1,066 | 864 | 723 | 1,176 | 948 | 791 |
| Grand total | | | 2,164 | 1,839 | 1,860 | 1,599 | 1,384 | 1,970 | 1,684 | 1,452 |
The two sets of sample-size columns show the sample required under two different assumptions about the proportions to be estimated for the sampled strata.
Under scenario A, which assumes estimated proportions of (25%, 20%, 15%) or (75%, 80%, 85%), we would need a raw sample of 1,860, of which we estimate that 1,599 will be eligible and 1,384 will complete.
Under the more conservative scenario B, which assumes estimated proportions of (50%, 50%, 50%), a raw sample of 1,970 may be needed, of which we estimate that 1,684 will be eligible and 1,452 will complete.
We prefer to use the total sample of 1,860. In the unlikely event that we need to estimate the more conservative proportions of 50% for the sampled strata, this sample size will still provide a margin of error within 6% at 95% confidence, or within 5% at 90% confidence. These estimated upper bounds on the margins of error are already conservative because they include an added continuity-correction term (see below).
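For reference, the sketch below shows the standard calculation of the sample size needed to estimate a proportion with a finite population correction, and the achieved margin of error with an added 1/(2n) continuity-correction term. It illustrates the general technique only; the exact figures in the tables here come from BJS's own computations:

```python
import math

Z95 = 1.96  # two-sided z value for 95% confidence

def required_sample(N: int, p: float, e: float, z: float = Z95) -> int:
    """Completes needed to estimate proportion p within +/- e,
    for an eligible population of size N (finite population correction)."""
    n0 = z**2 * p * (1 - p) / e**2  # infinite-population sample size
    return math.ceil(n0 / (1 + (n0 - 1) / N))

def margin_of_error(n: int, N: int, p: float, z: float = Z95) -> float:
    """Achieved margin of error for n completes out of N eligible cases,
    with a conservative 1/(2n) continuity-correction term added."""
    se = math.sqrt((N - n) / (N - 1) * p * (1 - p) / n)
    return z * se + 1 / (2 * n)

# Example using the conservative p = 0.5 row for the 4-year private
# (1,000-2,499) stratum in the table that follows: n = 196, N = 245.
print(round(margin_of_error(196, 245, 0.5), 3))  # ~0.034
```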
Standard Errors and Relvariance of the CLE Sample

| Stratum and assumed proportion | n (completed cases) | N (eligible cases) | Standard error | Relvariance |
|---|---|---|---|---|
| 4-year private (1,000-2,499), p = 0.25 | 171 | 214 | 0.015 | 0.003 |
| 4-year private (1,000-2,499), p = 0.5 | 196 | 245 | 0.016 | 0.001 |
| 2-year (5,000-9,999), p = 0.20 | 124 | 155 | 0.016 | 0.007 |
| 2-year (5,000-9,999), p = 0.5 | 144 | 180 | 0.019 | 0.001 |
| 2-year (2,500-4,999), p = 0.15 | 98 | 123 | 0.016 | 0.012 |
| 2-year (2,500-4,999), p = 0.5 | 121 | 151 | 0.020 | 0.002 |
For the CLE sample as a whole, which combines the different types of schools, the standard error will be much smaller (less than 0.005) and the relvariance will be no more than 0.0001.
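The entries in the table follow from the usual formulas for a proportion estimated from n completes out of N eligible cases. The sketch below reproduces two rows; the function name is ours:

```python
import math

def se_and_relvariance(n: int, N: int, p: float):
    """Standard error of an estimated proportion (with finite population
    correction) and its relvariance, i.e., variance divided by p squared."""
    var = (N - n) / (N - 1) * p * (1 - p) / n
    return math.sqrt(var), var / p**2

# Reproduces the 2-year (2,500-4,999) rows of the table above.
print(se_and_relvariance(98, 123, 0.15))  # ~(0.016, 0.012)
print(se_and_relvariance(121, 151, 0.5))  # ~(0.020, 0.002)
```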
c. Out-of-scopes
A campus will be considered out of scope (1) if its enrollment is below 1,000 students for 4-year campuses or 2,500 students for 2-year campuses, or (2) if it does not operate an identifiable police or security agency staffed by employees of the institution.
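Expressed as a predicate (argument names are illustrative):

```python
def in_scope(level: str, enrollment: int, operates_own_agency: bool) -> bool:
    """Scope rule: the campus meets the enrollment floor for its type and
    operates its own police or security agency with institutional employees."""
    floor = 1000 if level == "4-year" else 2500
    return enrollment >= floor and operates_own_agency
```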
d. Agency non-response adjustments
As noted, overall response rates for the survey have been good, including 82% in 2004-05. As shown in the table below, response rates have been lower among smaller 4-year campuses.
| Student enrollment | Percent of institutions responding in prior survey | Projected percent responding in 2011 survey |
|---|---|---|
| Overall | 82% | 87% |
| Overall, 4-year | 81% | 88% |
| 4-year, 10,000 or more | 84% | 90% |
| 4-year, 5,000-9,999 | 79% | 90% |
| 4-year, 2,500-4,999 | 73% | 90% |
| 4-year, 1,000-2,499 | N/A | 80% |
| Overall, 2-year | 88% | 84% |
| 2-year, 10,000 or more | 88% | 90% |
| 2-year, 5,000-9,999 | N/A | 80% |
| 2-year, 2,500-4,999 | N/A | 80% |
We expect higher response rates across all categories in 2011 due to the employment of a data collection agent who will have greater resources available for follow-up efforts. Despite expected high response rates, the survey could be subject to some degree of non-response bias. BJS will first assess the extent of potential non-response bias by comparing agencies that do not respond with those that do on variables available from the sampling frame, such as enrollment size and type of institution (public or private; 2-year or 4-year).
Based on the table above, we expect that schools with smaller enrollments and private schools will be less likely to respond than larger schools and public schools. We can further assess the likelihood of non-response bias by comparing survey variables for agencies that respond late in the survey with those that respond early, under the assumption that late responders will be most similar to non-respondents. If these assessments suggest significant differences between responding and non-responding agencies, we will apply appropriate non-response adjustments to the population estimates.
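The frame-variable comparison could be operationalized as a simple chi-square test of response status against each frame characteristic. A sketch, assuming a pandas DataFrame with one row per sampled agency and hypothetical column names:

```python
import pandas as pd
from scipy.stats import chi2_contingency

def nonresponse_bias_check(frame: pd.DataFrame,
                           variables=("size_class", "control", "level")) -> None:
    """Test whether response status is independent of each frame variable.
    'responded' is a 0/1 completion flag; column names are placeholders."""
    for var in variables:
        table = pd.crosstab(frame[var], frame["responded"])
        chi2, pvalue, dof, _ = chi2_contingency(table)
        print(f"{var}: chi2 = {chi2:.2f}, df = {dof}, p = {pvalue:.3f}")
```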
To help mitigate the effects of non-response bias, BJS will conduct a non-response adjustment procedure by applying survey weights to the completed agency interviews. Weights are inversely proportional to the response rate within a defined group, known as an adjustment cell; this means that completed surveys from groups with lower response rates receive higher weights than completed surveys from groups with higher response rates. BJS will create adjustment cells that divide the respondents and non-respondents into categories that are available from the sampling frame and are shown to be related to the characteristics of the agency and their response patterns. BJS expects to create these cells based upon the following factors:
Type of institution (public vs. private)
Type of officers employed (sworn vs. nonsworn)
Size of enrollment served
BJS will then compute the response weight as the inverse of the response rate within each adjustment cell. Weights will be used to adjust estimates for non-response.
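A sketch of that weighting step, again assuming a DataFrame with hypothetical column names ('cell' identifies the adjustment cell; 'responded' flags completes):

```python
import pandas as pd

def nonresponse_weights(frame: pd.DataFrame) -> pd.DataFrame:
    """Attach the inverse-response-rate weight for each adjustment cell
    and return the weighted completes."""
    rates = frame.groupby("cell")["responded"].mean()  # response rate per cell
    weighted = frame.assign(weight=frame["cell"].map(1.0 / rates))
    return weighted[weighted["responded"] == 1]

# A cell with an 80% response rate gives each of its completes a weight of
# 1.25, so the completes also stand in for the cell's non-respondents.
```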
2. Procedures for Collecting Information
Using a Dillman tailored-design approach1, BJS and its data collection agent, ICF International, will develop multi-mode data collection tools that allow respondents to complete the instrument by web, mail, fax, or telephone at their convenience. BJS will encourage respondents to complete the survey primarily through the web-based option. ICF International is developing the web-based instrument; once it is available, the web address will be provided to OMB for review of the design and content of the electronic instrument.
Only the web-based survey option will be offered in the first survey invitation. Hard copies of the instrument will be mailed to non-completers approximately 9 and 13 weeks after the survey start. After these follow-up mailings, our agency liaisons will call each agency that has not yet responded by web or on paper and complete the web-based survey instrument with the agency respondent over the phone. A more detailed description of the data collection procedures to be implemented by the data collection agent on behalf of BJS follows:
Conduct survey pre-notification. We will conduct at least four pre-notification contacts (by mail, postcard, telephone, and email) to increase respondents’ familiarity with the purpose, process, and timing of the survey. Pre-notification will begin with an email or phone call to the agency to verify contact information, followed by an initial introductory letter on BJS letterhead (see attachment) and a postcard announcement just before the start of the survey. The pre-notification communications will also be used to further refine our contact information and ensure that we deliver the surveys to the appropriate contact person.
Conduct slow start fielding of the web-based instrument. After we have completed our pre-notification communication phase, we will send email invitations with a link to the web-based survey instrument to a randomly selected subsample of approximately 10% of the respondent agencies. We will use the email invitation with a link for the web-based instrument as our first survey invitation. The slow start fielding enables us to ensure that all data collection procedures are error-free and that we can develop complete responses to all unanticipated technical and content questions from respondents. We will update our online frequently asked questions (FAQ) guide and help desk manual to reflect these responses, allowing us to more efficiently answer respondent questions when the bulk of the sample is invited to participate. We will monitor data collection closely, export data files, run data checks for accuracy and consistency, and monitor respondent comments and questions to ensure that the instrument and procedures allow for efficient and accurate data collection.
At approximately Day 10 of the survey slow start field period, we will export a “live” survey response data file. The test file will be inspected for completeness of responses and for verification of the data cleaning syntax procedures used in our data analysis. Our data verification process includes calculating ranges and measures of central tendency (e.g., average, median) as well as performing edit checks to ensure consistency across responses. We will generate frequencies for all variables for review, including documenting the percentage of missing data by variable and agency. We will also generate cross-tabulations and other descriptive statistics to reveal inconsistencies in dependent data elements (those logically related) and other data anomalies.
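These checks are straightforward to script; the sketch below outlines the kinds of summaries described, with placeholder field names:

```python
import pandas as pd

def verify_export(df: pd.DataFrame) -> None:
    """Basic accuracy and consistency checks on an exported response file."""
    # Ranges and measures of central tendency for every numeric item.
    print(df.describe().loc[["min", "mean", "50%", "max"]])
    # Percent missing by variable, worst first.
    print(df.isna().mean().mul(100).round(1).sort_values(ascending=False))
    # Percent missing by agency ('agency_id' is a placeholder key).
    print(df.set_index("agency_id").isna().mean(axis=1).mul(100).round(1))
    # Example edit check on logically dependent items (placeholder names):
    bad = df[df["sworn_ft"] > df["total_ft"]]
    print(f"{len(bad)} records fail the 'sworn <= total officers' edit check")
```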
Data anomalies will be flagged, and the documentation will be included in notes provided with the data dictionary notebook. We will contact respondent agencies to resolve any apparent data anomalies noted in the test file. We will update any procedure that we identify as improving the efficiency and accuracy of the data collection. We expect that only minor changes to the web system and procedures will be needed based on the slow start.
Conduct full fielding of the web-based instrument. At approximately Day 15 of data collection, and following any updates to the web system or procedures, we will send email invitations with a link to the web-based survey instrument to all remaining respondent agencies (along with reminder emails to the slow-start agencies that have not yet completed the instrument). We will again closely monitor the rate at which data are entered and the nature of those data, checking for anomalies or areas with higher levels of incompletion. We will also closely monitor respondent feedback, questions, and comments to determine whether there are systemic areas of respondent difficulty that need to be resolved. As follow-up to the initial email, non-completers will receive two cycles of reminders, each including a reminder email, a postcard, and a reminder telephone call from the agency liaison. Each cycle will take approximately three weeks. Additional recruitment cycles for the web-based instrument will then take place every 6 weeks thereafter for respondent agencies that have not yet completed the survey instrument, until the close of the project.
Mail out paper-pencil instrument. For those respondents who have not completed the web-based survey instrument after 9 weeks from the full-field start (following two full cycles of reminders), we will then send out a survey package including both an overview letter (with instructions on how to complete the survey and return or fax back to us) and the paper-pencil version of the survey by way of a FedEx package. The paper-pencil version of the survey will also contain the survey web link that respondents can use to complete the survey if they choose to.
Following receipt of the survey package, agency liaisons will call the agency respondents to ensure that they received the package, understand the instructions, and provide them a point of contact if they have any questions as they complete the instrument. Approximately 2 weeks after their receipt of the survey package, we will send a postcard and email reminders (with links to the web-based instrument as well) for those respondent agencies that did not complete a survey instrument (in either web-based or paper-pencil form). This reminder cycle will be repeated for all non-completers. Approximately 4 weeks after their receipt of their paper-pencil instrument, we will again send another package including both another letter requesting participation and the paper-pencil survey instrument. We will continue to follow our reminder protocol cycles for those who have not completed the instrument.
Conduct follow-up telephone interviews for non-completers. After approximately 4 months from first fielding of the slow start, for those respondent agencies who did not complete either the web-based or paper-pencil measures, we will call each respondent to complete the web-based survey instrument for the respondent in a telephone interview. We expect that this phase will take approximately 3 weeks to complete.
Materials proposed for use during the data collection such as letters, notifications, and scripts are attached in the form in which they were submitted to and approved by the ICF International Institutional Review Board.
3. Methods to Maximize Response
ICF International, the data collection agent hired by BJS for this project, will maximize response by encouraging use of the web-based survey but also collecting data through mail, fax, email and telephone modes. Although response rates for BJS law enforcement surveys in the past have always exceeded 80% (and often 90%), this level of success can never be taken for granted. Therefore, a number of steps will be taken to minimize unit and item non-response for this survey.
Respondents will receive a series of follow-up contacts through different methods (email, postcard, telephone, and mail), following the Dillman tailored design method. Each different way of contacting the respondent increases the likelihood that the respondent will attend to the information. When we contact respondents by telephone, we will remind them of IACLEA’s endorsement of the survey, inform them of the paper-pencil instrument option, and offer to send one at that time. If they decline participation, we will once again remind them that they can complete the survey online and then ask whether there is another person at the agency able to respond to the survey. If we are provided with another contact, we will follow up with the new contact by phone and by email.
We will offer extensive technical assistance during the survey field period. We will establish a Help Desk, staffed with dedicated project liaisons, and all communication materials will include its email address and telephone number. We have found that an established training and assistance help-line early in the process provides immediate technical assistance to agencies and eases respondent burden.
The second form of assistance to respondents will be a web-based Help Menu available during data collection. A ‘Get Help Now’ feature on each web form will link to frequently asked questions and to further information about issues specific to that screen’s data entry form, encouraging respondents to enter accurate data.
4. Testing of Procedures
The survey instrument was pre-tested at eight selected campuses by individuals who are representative of the different types and sizes of campuses served by the agencies that will be receiving the survey instrument. The pretest involved 4-year and 2-year campuses with a range of enrollments, and included those under both public and private control. The pretest volunteers are in addition to those who serve in a consultant role on the project (see Part I, #8). BJS has incorporated the comments of the pretest participants into the current version of the questionnaire (see attachments). Please see the attachment titled “pretest” for the comments provided by the pretest participants. They are listed by the commenter’s initials.
Jasper Cooke (JC), Director of Public Safety, Augusta State University
Thomas Johnson (TJ), Director of Police, Truman State University
Gary Lyle (GL), Director of Public Safety, Anne Arundel Community College
Michael Lynch (ML), Director of Police, George Mason University
James C. Lyon, Jr. (JL), Chief of Police, Northeastern Illinois University
Paul Ominsky (PO), Director of Public Safety, Princeton University
David Perry (DP), Chief of Police, Florida State University
Laura Wilson (LW), Chief of Police, Stanford University
5. Contacts for Statistical Aspects and Data Collection
Primary contact for information on statistical methodology, conducting the survey, and analyzing the data:
Brian A. Reaves, Ph.D.
Law Enforcement Unit
Bureau of Justice Statistics
810 Seventh St., NW
Washington, DC 20531
(202) 616-3287
Additional contact:
Joel H. Garner, Ph.D.
Chief, BJS Law Enforcement Unit
Bureau of Justice Statistics
810 Seventh St., NW
Washington, DC 20531
C. Attachments
1. Copy of endorsement letter from the International Association of Campus Law Enforcement Administrators (IACLEA)
2. Copies of letters and other supporting materials (from ICF IRB package)
3. Copy of the survey form (with short-form items highlighted)
4. Copy of the pretest results
5. Copy of the regulatory authority (42 U.S.C. 3732)
1 Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method. Hoboken, NJ: Wiley.