July 7, 2011
Shelly Wilkie Martinez
Office of Statistical and Science Policy
Office of Management and Budget
Dear Ms. Martinez:
As the BJS program manager for the 2011 Survey of Campus Law Enforcement Agencies, I respectfully submit this updated approval request package for consideration by the Office of Management and Budget. Since our previous submission, we have conducted pretests with police chiefs and security directors at eight institutions representing a cross-section of the campuses we would include in the proposed survey. The comments from the pretest participants resulted in numerous improvements to the survey instrument, and where feasible these suggestions have been incorporated into both the long and short versions. The newly designed and formatted forms are included in Attachments B and C. The web version will be available for your review soon.
Since the initial submission several months ago, we have also addressed the issues you raised in your passback questions. Below you will find points of reference directing you to the location in the supporting statement (Attachment A) where each question is addressed:
Please clarify if the last administration of this survey was in 2005. How many other times has it been conducted? What is the rationale for its current periodicity? See page 2, paragraphs 1 and 2.
How many of the questions on the survey are repeats from the last administration? See page 6, paragraph 1.
How many of the questions are repeats (with or without needed tailoring to context) from the main LEMAS collection? The questions that are identical to, or based on, questions in the LEMAS survey are identified on page 6, paragraph 2.
Why is this collection being separated from LEMAS? See page 1, paragraphs 3 and 4.
What is the rationale for the sample design (i.e., a census of large schools and a sample of smaller ones)? Why is a census of larger schools needed? See page 11, paragraph 6.
The needs and uses discussion is fairly weak. What are some specific policy uses of the data from the last administration? See page 2, paragraphs 4 and 5, and page 3, paragraphs 1-3.
Please provide a substantive update to the sampling plan referenced in A3 and in Part B. This should include not only estimates but a precision analysis (i.e., why this particular sample size is required) and also the final sample design. See section B, Part 1 (b), pages 11-14.
Is this survey designed for the data to be used in tandem with that reported to the Department of Education? If so, how will that joint use be facilitated (e.g., common data access tool, same reference periods)? See page 3, paragraph 5.
Has BJS determined that the universe of schools reporting to ED’s Clery reporting program includes 100% of those eligible for this collection? See page 6, paragraph 3.
Why is the survey initially mailed if the preferred response mode is web? Literature suggests that this approach can depress overall response rates for household surveys. Is there a different finding for institutional surveys? The initial mailing will include a cover letter that will provide multi-mode (web, mail, fax) options for respondents, but will encourage them to use the web response option. See Section B (2), Procedures for Collecting Information, starting on page 15, for more detail on the proposed data collection methods.
Please provide more detail on the timing, number of callbacks, etc., in the survey administration discussion. Section B (2), Procedures for Collecting Information, pages 15-17, has been expanded.
What was the response rate for the last administration of this survey? See the table on page 14 for response rates by enrollment category.
What were the results (if completed) of the pretesting? The results of the pretest have been compiled and incorporated into the data collection instrument. Please see Attachment D for the specific comments of the pretest participants.
Please provide cover letters and other materials provided to respondents. Please see Attachment E for a draft of the cover letter (page 6) and other materials proposed for use in conjunction with the survey.
How are the contact people in each agency identified? What contact information (e.g., email) is available on the frame? See page 11, paragraph 4.
Please correct typos that occur across the questionnaire (e.g., Questions 11, 12, 18, 42, 44). These have been corrected.
Question 5 is double-barreled, i.e., it first asks for what appears to be a subset of the personnel in Question 1, then asks for a further subset of this group. Suggest rewording or splitting. This has been modified.
How does the group in Question 11 match up to the group(s) in Question 5? Are they supposed to be a further subset? Will edit checks be built in to ensure that the relationships among them are correct? This has been modified. Edit checks will be used to verify the relationships between questions for consistency where appropriate.
Question 16 – is “Auxiliary Services” the name of a program? This question has been revised and no longer includes this term.
Question 51 may be missing a word. As written it is unclear. This has been corrected.
If you have further questions or need clarification on any matter, please contact me. Thank you for your consideration.
Sincerely,
Brian A. Reaves, Ph.D.
Statistician
Bureau of Justice Statistics
202-616-3287