Community Policing Self-Assessment Tool:
Documenting Today and Planning for Tomorrow
A User’s Guide
ICF International
Police Executive Research Forum
U.S. Department of Justice Office of Community Oriented Policing Services
June 2010
Community Policing Self-Assessment Tool:
Documenting Today and Planning for Tomorrow
A User’s Guide
Bruce Taylor
Deirdre Weiss
Drew Diamond
Police Executive Research Forum
Beth Heinen
Rebecca Mulvaney
Alex Conlon
Candace Cronin
ICF International
Rob Chapman
Matthew Scheider
Office of Community Oriented Policing Services
This project was supported by Cooperative Agreement Number 2004-CK-WX-K028 awarded by the Office of Community Oriented Policing Services, U.S. Department of Justice. Points of view or opinions contained in this document are those of the authors and do not necessarily represent the official position or policies of the U.S. Department of Justice, ICF International, or the Police Executive Research Forum and its members. Rather, the references are illustrations to supplement discussion of the issues.
CONTENTS
PART I: Introduction to the Community Policing Self-Assessment Process 1
PART II: Conducting a Community Policing Self-Assessment 10
Checklist I: Planning for the Self-Assessment Process 13
Checklist II: Implementing the Self-Assessment Tools 20
Checklist III: Obtaining CP-SAT Results 28
Checklist IV: Strategic and Action Planning for the Future 29
We would like to take this opportunity to express our sincere gratitude to a number of organizations and individuals who made this effort possible. First, we thank the police agencies who participated in our pilot testing. These agencies provided critical feedback and numerous suggestions that improved the quality of the self-assessment tool. We are truly grateful for the time, resources, and commitment these departments put into the piloting effort. The Appleton (Wisconsin) Police Department piloted the first version of the self-assessment tool and contributed greatly to this document. The department took on an enormous task—with a much longer instrument—and did so with enthusiasm and good humor. We are particularly grateful to former Chief Richard Myers, Captain Julie Bahr, and Fiscal Resources Manager Sue Ann Teer. We are also grateful to the staff of the Lowell (Massachusetts) Police Department, particularly former Superintendent Edward Davis and Research Director Sharon Hanson, who assisted with the second pilot test and provided suggestions about how to develop support for the self-assessment. The Charlotte-Mecklenburg Police Department served as the third pilot site and provided valuable advice about how to administer the self-assessment forms. We extend special recognition to Chief Darrel Stephens; Paul Paskoff, director of research, planning, and analysis; and John Couchell, research team supervisor. The Ocala (Florida) Police Department served as the fourth site. Chief Samuel Williams and Major Rick Lenemier provided thoughtful suggestions about how to encourage officers to participate in the self-assessment. We also thank the Leesburg (Virginia) Police Department, which was the final site for the paper-based version and was able to complete the piloting in a short amount of time before the end of 2006. Thanks to Chief Joseph R. Price and Captain Jeff VanGilder for their dedication to completing the piloting during an otherwise busy time of the year and for their suggestions about how to partner with local agencies in this self-assessment process. Finally, we are very grateful to the Gaithersburg (Maryland) Police Department for their willingness to serve as the pilot test site for the online version of the instrument. In particular, Chief John King and Captain Chris Bonvillain were instrumental in allowing us to gather invaluable suggestions and feedback on the usability of the Web-based assessment.
We are grateful to the numerous police chiefs, sheriffs, and researchers at Regional Community Policing Institutes whose input helped the project team develop and refine the community policing framework that is the basis of the self-assessment tool. We also received invaluable feedback from the policing professionals and others who attended conference presentations about the project, including presentations at the Police Executive Research Forum’s Annual Meeting, the National Crime Prevention Council Conference, the International Association of Chiefs of Police Annual Conference, a National Sheriffs’ Association meeting, and the Department of Justice Office of Community Oriented Policing Services (the COPS Office) Conference. This project was also guided by Technical Advisory Group (TAG) members who assisted with the development of the community policing framework and assisted with the review of the tool. TAG members were Richard Bennett, American University; Dennis Kenney, John Jay College; Chief Richard Myers, Appleton Police Department; and Norm Peterson, University of Minnesota. Thanks also to the police practitioners and researchers who assisted with the review of the self-assessment tool or provided other guidance and suggestions throughout the course of the development of the self-assessment tool. Special thanks to Bonnie Bucqueroux, Michigan State University; Ed Connors, Institute for Law and Justice; Gary Cordner, Eastern Kentucky University; Deputy Chief Ron Glensor, Reno Police Department; Chief Robert Lunney, ret.; Melissa Schaefer Morabito, Rutgers University; Dennis Rosenbaum, University of Illinois at Chicago; Captain Mike Wells, ret., Concord Police Department (CA); and Jerry Williams, University of Colorado at Denver.
Finally, we thank our project monitor, Rob Chapman of the COPS Office, for his steady hand in guiding this project. We also received tremendous support throughout the project from COPS Office Director Bernard K. Melekian, former COPS Office Director Carl Peed and Assistant Director Matthew Scheider. Our thanks also to Amy Schapiro and Cynthia Pappas, both of the COPS Office, for their suggestions and input into this project.
********************
Authors’ Note: This project is a first attempt at developing a community policing assessment tool. The authors of this assessment and guide welcome feedback about the implementation of the tool and the tool itself. Direct questions and comments to Rob Chapman at 202.514.8278.
Law enforcement agencies rarely solve public safety problems alone. The agencies depend on other government agencies, the social services network, and the communities served—the stakeholders—for solutions to community-wide problems. In community policing, agencies work in partnership with a wide range of agencies to identify problems, prioritize solutions, allocate resources, and improve public trust.
Community policing emphasizes proactive problem solving in a systematic and routine fashion. Rather than responding to crime only after it occurs, community policing encourages agencies to proactively develop solutions to the immediate underlying conditions contributing to public safety problems.
In community policing, law enforcement agencies align their organizational management, structure, personnel, and information systems to support community partnerships and proactive problem-solving efforts. The community policing model is delivered through aspects of traditional law enforcement enhanced with prevention strategies, proactive problem solving, community engagement, and community stakeholder partnerships.
Under the community policing model, police management infuses community policing ideals throughout the agency via climate and culture, leadership practices, formal labor relations, decentralized decision-making and accountability, strategic planning, policies and procedures, organizational evaluations, and increased transparency. 1
Exhibit 1: Community Policing Framework
Agencies that hold this same philosophical approach to community policing and embrace the broad tenets of the community policing definition will find that the CP-SAT helps them measure their progress in adopting community policing principles and procedures. To make the process credible and the findings reliable, an agency must take an honest approach to the self-assessment. The CP-SAT provides information about the agency's strengths and gaps, which the agency must place in context when interpreting the results, because the assessment is only one part of a larger picture. Full implementation of community policing is an ideal. The CP-SAT enables agencies to establish a baseline and track the progress of their community policing implementation.
It is also important to stress that the CP-SAT is a process assessment tool, not an impact assessment tool. In other words, the tool focuses on the processes used by agencies implementing community policing (e.g., how well is an agency implementing community policing?) rather than the results of those processes (e.g., what are the effects of an agency’s implementation of community policing?).
Collaborative partnerships between the law enforcement agency and the individuals and organizations it serves, developed to solve problems and increase trust in police.
Community policing, recognizing that police rarely can solve public safety problems alone, encourages interactive partnerships with relevant stakeholders. The CP-SAT measures three aspects of an agency’s community partnerships, described below.
General engagement with the community. Engagement refers to the extent to which the agency proactively reaches out to the community to involve it in the community policing process.
Extent to which an agency has a wide range of partnerships. CP-SAT participants respond to questions regarding the types of sectors that are active community partners with the agency. These sectors include government agencies that serve the community, community-based organizations, local businesses, and individuals.
Level of interaction with the agency’s partners. Level of interaction refers to the extent to which there is true collaboration between the agency and community partners, versus, for example, one-way communication.
General skill in problem solving. The CP-SAT also assesses perceptions of officers' general skill in integrating problem solving into their daily work.
There are six versions, or forms, of the tool, each distributed to various rank levels within the organization:
Officers.
Supervisors.
Command Staff.
Civilian Staff.
Community Partners.
Cross-Agency Team.
Agency structure varies greatly; therefore, each agency should determine which ranks best fit the first three versions of the tool: officer level, supervisor level, and command staff level. A lieutenant in a small agency, for example, may be considered command staff, while in a large agency the same rank could be considered a supervisor. The tool also includes a civilian form and a form to be completed by key community partners. The sixth version of the assessment tool is the cross-agency version, which is completed by a cross-section of agency personnel as well as community representatives.
An agency is committing personnel, staff time, and equipment resources when it agrees to use the CP-SAT. In addition, the CP-SAT requires cooperation and willingness throughout the agency and at all levels, which can involve getting union support for the assessment. Depending on an agency’s size, the assessment takes anywhere from 2 to 3 months and requires one person to serve as the project manager. The self-assessment involves all agency personnel by asking them to fill out an assessment survey, which requires approximately 45 minutes to complete. For larger agencies, a sample of personnel may be used in lieu of all staff.
Representatives from a few community partners are asked to give feedback about the agency. An agency seeking honest, accurate feedback needs to choose the community representatives who best know the agency, whether or not they hold the most positive perception of it. The agency needs to make these expectations clear when asking community representatives to participate in the CP-SAT process.
In addition, the assessment benefits from an (optional) cross-agency team comprising at least 12 members: three officers, three supervisors, three command staff, and three community members. This team makeup allows for one person from each demographic category per CP area (problem solving, community partnerships, and organizational transformation). Agencies may also consider having more than three community members serve on the cross-agency team or adding civilian employees or personnel from other departments (e.g., Neighborhood Preservation, Parks and Recreation).
When an agency self-administers the CP-SAT, that agency plans, organizes, communicates with responders, compiles and analyzes the data, writes the report, and distributes the results.
The CP-SAT is typically administered as an electronic survey. The online format offers several advantages. First, it removes the need for data entry, and thus reduces the possibilities for errors. Second, the Department of Justice Community Oriented Policing Services (COPS) offers an automated summary reporting feature that imports the data from the survey software and creates a summary of results, significantly reducing data analysis and reporting time and effort. Third, the online assessment is more convenient for responders because it allows them to complete the survey from virtually any computer with an Internet connection. Because some agencies lack sufficient computers to accommodate access for all staff, CP-SAT is also available as a paper-and-pencil tool.
The second part of this User’s Guide, “Conducting a Community Policing Self-Assessment,” provides step-by-step directions for conducting the assessment, from planning through report distribution.
This part of the user’s guide describes the process agencies can use to implement the Community Policing Self-Assessment Tool (CP-SAT). It explains how agencies can manage the CP-SAT; describes the roles of the project director and the cross-agency team; introduces the CP-SAT forms and explains how to use them; and details the data collection, compilation, analysis, and report process.
How is the CP-SAT administered?
ICF International works with the agency on behalf of the COPS Office to plan and organize the assessment, stands by to offer assistance and advice throughout the assessment, and analyzes and reports the data after the assessment is complete.
The CP-SAT is typically administered via electronic survey. In certain cases, such as a very small agency or an agency without computer access, alternative arrangements can be made. Please contact Rob Chapman at (202) 514-8278 for more information about special arrangements. The electronic format offers significant advantages. First, it removes the need for data entry, and thus reduces the possibilities for errors. Second, the Department of Justice Community Oriented Policing Services (COPS), working with ICF International, offers an automated summary reporting feature that imports the data from the survey software and creates a summary report, significantly reducing data analysis and reporting time and effort. Third, the online assessment is more convenient for responders because it allows them to complete the survey from virtually any computer with an Internet connection.
How is the CP-SAT managed?
Because every agency varies in size and structure, each agency must consider how the tool best matches the agency’s personnel for skill and availability and the depth of commitment required. Typically, the chief or sheriff appoints a project director to oversee the day-to-day activities of the self-assessment from start to finish.
The project director appoints a cross-agency team and a review team. Cross-agency team members collaborate and come to consensus to respond to questions in the three modules of the CP-SAT survey forms to provide a balanced baseline. The cross-agency team comprises at least 12 members:
Three officers,
Three supervisors,
Three command staff, and
Three community members.
To get the broadest view, agencies may consider having more than three community members serve on the cross-agency team or adding civilian employees or personnel from other departments (e.g., Neighborhood Preservation, Parks and Recreation).
We also encourage agencies to appoint a review team. At the conclusion of the process, the review team examines the findings of the self-assessment, reaches out to stakeholders for any additional input desired, and makes recommendations to the agency. In many cases, individuals from the cross-agency team fill this role. A sample organizational structure is provided in Exhibit 2 below.
An overview of the steps in the self-assessment process is shown in Exhibit 3. The project director plans all tasks and ensures completion of all day-to-day activities.
Appendix 1 provides a specific checklist for the project director.
How are the survey forms completed?
The self-assessment process is divided into two kinds of activities: an individual-level tool to be completed by officers/deputies, supervisors, command staff, civilian staff, and representatives from community partners, and a cross-agency tool to be completed by a team that is composed of officers/deputies, supervisors, command staff, and community members.
Exhibit 3: Community Policing Self-Assessment Tool Overview
CHECKLIST I: Planning for the Self-Assessment Process
Checklist II: Implementing the Self-Assessment Tools
Cross-Agency Team
Individual Officer, Supervisor, Command Staff, Civilian Staff, and Community Partner Surveys Administer Surveys Online
CHECKLIST III: Obtaining CP-SAT Results
Reporting Online Data
CHECKLIST IV: Strategic and Action Planning for the Future
Step 6: CEO (Chief) decides actions to pursue, including disseminating the results.
The rest of this section of the user’s guide provides details about how to complete each activity listed in the Community Policing Self-Assessment Tool Checklist. A specific activity is named at the beginning of each section along with details about ways to complete it. When appropriate, the section includes tips and lists questions to consider. Some sections also reference additional information in the appendices.
Step 1: Initiate planning and logistics
Step 1A: Ensure technical requirements are met
Meeting a few technical requirements can ensure the administration of CP-SAT goes smoothly:
Each staff member must have access to a computer with Internet access.
Each staff member must have an email address that they check regularly.
If either of the above technical requirements is not met, please send an email to the CP-SAT Administrator at ICF International (CPSAT@icfsurveys.com) to discuss the best way of moving forward. If both the above technical requirements are met, then you are able to start administering CP-SAT as soon as your agency is ready. Provide ICF with the necessary information, described below.
Step 1B: CEO appoints a project director
The CEO of the law enforcement agency appoints a project director to oversee the self-assessment process from start to finish. The following list summarizes the project director’s main tasks:
Work with the CEO to promote and build support for the self-assessment.
Work with the CEO to brainstorm plans for disseminating the results of the self-assessment process.
Select members of the cross-agency team.
Provide direction and set out tasks, timeline, and resources to the cross-agency team and cross-agency chairperson.
Give an orientation to all participants who will complete the tool (e.g., during roll call or through a memo).
Assemble a review team to interpret summary report results.
Obtain input from stakeholders on the results as part of the review team.
Compile recommendations from the review team to present to the CEO.
The project director is the day-to-day manager of the self-assessment process, a spokesperson for this effort, and a resource for the cross-agency team. Appendix 1 lists the tasks that the project director should complete during the self-assessment process.
Step 1C: Build support
The CEO and the project director should promote and build support for the self-assessment through briefings, memos, and meetings with stakeholders. These communications should explain the self-assessment process, answer questions, and alleviate concerns. Building support from the beginning of the process makes employees more likely to support the effort. Early in the process, the CEO and project director also should engage the collective bargaining units that represent officers.
TIP: The following groups are stakeholders in the self-assessment process:
TIP: Agencies with strong collective bargaining units need to be especially sensitive to engaging union leadership in the assessment process. Even passive lack of support can undermine data collection. Agencies should seek active public support from the union throughout the process, beginning with a letter to the membership expressing the positive aspects of the assessment.
Step 1D: Plan for dissemination of results
The CEO and project director plan for later dissemination of the assessment results. During the initial planning, the CEO and project director consider which organizations and people will receive the results, what information will be disseminated, and how it will be distributed. The CEO and project director also select about 10 members to form a review team. The review team examines the findings of the self-assessment, reaches out to stakeholders for any additional input desired, and makes recommendations to the agency. In many cases, the individuals from the cross-agency team (described below) fill this role. At minimum, the project director and cross-agency chairperson also serve on the review team.
Step 2: Select cross-agency team and community partners
Agencies have the option of using a cross-agency team to promote discussion among representatives of various community stakeholder groups. Members of the cross-agency team discuss and come to consensus answers on each item of the CP-SAT. The agency benefits from the cross-agency team through the thorough discussion about problem areas and differences in perception among stakeholders. To get the most thorough feedback from the cross-agency team, agencies need to select community partners with which the agency works.
The project director selects at least 12 cross-agency team members: three officers, three supervisors, three command staff, and three community members. With this cross-demographic mix, the project director can divide the team into three subgroups. Each discusses one of the three modules in the CP-SAT (problem solving, community partnerships, and organizational transformation). Agencies may want to consider having more than three community members on the cross-agency team or adding civilian employees or personnel from other departments (e.g., Neighborhood Preservation, Parks and Recreation).
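The 12-member composition described above lends itself to a simple round-robin allocation: one member from each demographic category per module subgroup. The guide itself prescribes no software for this step; the short Python sketch below, using placeholder names, merely illustrates the allocation logic.

```python
# The three CP-SAT modules each subgroup is assigned to.
MODULES = ["problem solving", "community partnerships", "organizational transformation"]

def assign_subgroups(members_by_role):
    """Assign cross-agency team members to the three module subgroups.

    members_by_role: dict mapping a role (e.g., "officer") to a list of
    three names. The i-th name in each role goes to the i-th module, so
    every subgroup receives one member from every role.
    """
    subgroups = {module: [] for module in MODULES}
    for role, names in members_by_role.items():
        if len(names) != len(MODULES):
            raise ValueError(f"need exactly {len(MODULES)} members per role")
        for module, name in zip(MODULES, names):
            subgroups[module].append((role, name))
    return subgroups

# Hypothetical 12-member team (names are placeholders, not from the guide).
team = {
    "officer": ["Ofc. A", "Ofc. B", "Ofc. C"],
    "supervisor": ["Sgt. D", "Sgt. E", "Sgt. F"],
    "command staff": ["Capt. G", "Capt. H", "Capt. I"],
    "community member": ["J", "K", "L"],
}
groups = assign_subgroups(team)
print(len(groups["problem solving"]))  # 4 (one member from each role)
```

Agencies adding extra community members or civilian staff would simply extend the lists, keeping at least one member per role per subgroup.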
When choosing community representatives for the cross-agency team and participants to complete the community partner form, the project director should select organizations and individuals that have worked with the agency in depth and have a reasonable perspective on the agency's community partnerships and internal processes. These partners should be willing and comfortable giving feedback about the agency. An agency aiming to receive honest, accurate feedback should choose the partners who best know the agency, even if not all of them hold the most positive perception of it. Because it is important that community partners are honest about their experiences, the project director should discuss these expectations with partners when inviting them to participate in the CP-SAT process.
Exhibit 4: Recommendations for choosing cross-agency team members and community partners
The officers, supervisors, and command staff on the cross-agency team should:
The community members on the cross-agency team and the community partners who will complete the community partnership form should:
The project director sets a date, time, and location for the initial cross-agency team meeting, at which he or she introduces tasks, plans a timeline, and allocates resources. The time required to complete cross-agency team tasks can vary considerably, depending on agency size and other factors. A sample timeline is provided in Exhibit 5.
Exhibit 5: Example timeline for cross-agency team key tasks
Planning for the self-assessment process: 1–2 weeks
Implementing the self-assessment tools: 2–3 weeks
Automated summary reporting: 1 week
Strategic and action planning for the future: 2 weeks
The cross-agency team chairperson and the project director, as part of the review team, make recommendations for future action.
Appendix 2 has a checklist of tasks for the cross-agency team and its chairperson to complete during the self-assessment process.
Step 3: Cross-agency team meets, and the project director develops a detailed plan
Step 3A: Cross-agency team members review the tool and tasks and select a chairperson. At this meeting, participants are divided into groups, each completing one of the three modules: problem solving, community partnerships, or organizational transformation.
The project director sets a time, date, and location for the cross-agency team meeting, provides direction, and establishes a timeline for the completion of the survey. The following items should be on the initial cross-agency meeting agenda:
Review the self-assessment process (the structure, how to complete the tool).
Define expectations for the cross-agency team.
Select a cross-agency chairperson.
Divide the cross-agency team into three groups, with each group focusing on one of the modules: community partnerships, problem solving, or organizational transformation.
Set a deadline for the cross-agency chairperson to enter the cross-agency team consensus answers in the online assessment.
Appendix 3 contains an example of how to review the self-assessment process with the cross-agency team. It provides details about the cross-agency team process and expectations for members.
As mentioned previously, a cross-agency team should have at least 12 members, with a minimum of three officers, three supervisors, three command staff, and three community members. Agencies can modify the cross-agency team to meet their own needs; for example, some agencies may select a larger or smaller cross-agency team, depending on the size of the organization. For the team, agencies may select additional community members or appoint civilian personnel or labor and union leaders.
To complete the cross-agency form, each of three subgroups of roughly equal size focuses on its assigned module: community partnerships, problem solving, or organizational transformation. At least one member from each demographic category should serve on each subgroup; in other words, one officer, one supervisor, one command staff member, and one community member would serve on the subgroup that completes the community partnerships module. Before each cross-agency subgroup meeting, the project director should distribute the paper-based cross-agency form, which lists the assessment items (see Appendix 4).
The project director selects one member of the cross-agency team, preferably a member of the agency, to serve as the team chairperson. The chairperson is involved in several aspects of the project and is responsible for the following items during the self-assessment process:
Chair the meeting when the cross-agency team completes its form.
Enter the cross-agency team consensus answers into the online assessment.
Participate in the review team that examines the summary report, obtain input from stakeholders, and make recommendations for future action.
The cross-agency chairperson should have good leadership, supervision, and oversight capabilities. While the following capabilities should characterize all members of the cross-agency team, they are particularly important for the chairperson:
Be familiar with the workings of the agency.
Have a strong background in community policing.
Be viewed as a credible leader among colleagues.
Be capable of serving as a mediator if a dispute emerges.
It is likely that individuals with these skills will be senior sworn agency officials.
Step 3B: Project director develops a detailed plan for administration and reporting.
At the beginning of the planning process, the project director should first consider whether the agency will distribute the assessment to everyone in the agency or if it will use a sampling methodology. This will affect all other planning decisions made in this section.
For larger agencies, especially those with a few hundred or more officers, it may be advantageous to sample police personnel to complete the self-assessment tool, rather than asking everyone in the agency to complete the forms. Details on sampling methodologies appear in Appendix 5.
If the agency has fewer than 200 line-level officers, all officers should complete the survey. When possible, it is preferable to have as many cases as possible to analyze, for maximum reliability and validity. Having more than 100 cases to analyze also allows the agency to examine differences in results between subgroups (e.g., officers with less than 1 year of experience compared with more experienced officers) if the agency completes additional data analyses beyond the results contained in the summary report.
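Appendix 5 describes the recommended sampling methodologies; the sketch below is an illustration only, not a substitute for that appendix. Assuming a hypothetical roster of unique personnel identifiers, a simple random sample could be drawn with a few lines of Python:

```python
import random

def draw_sample(roster, sample_size, seed=None):
    """Draw a simple random sample of personnel from a roster.

    roster: list of unique personnel identifiers (e.g., badge numbers)
    sample_size: number of personnel to survey
    """
    if sample_size >= len(roster):
        return list(roster)  # small agencies survey everyone
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    return rng.sample(roster, sample_size)

# Hypothetical example: a 500-officer agency sampling 150 respondents.
roster = [f"OFF-{i:04d}" for i in range(1, 501)]
selected = draw_sample(roster, 150, seed=42)
print(len(selected))  # 150
```

Because the sample is drawn without replacement, no officer is invited twice; an agency stratifying by shift or district (as a sampling appendix might advise) would draw one such sample per stratum.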
Is completion of the survey voluntary?
What, if any, consequences will there be for an individual who does not complete the form?
What safeguards, if any, are needed to ensure that procedures are followed?
How will the data be used?
Administration. The project director makes plans to administer the assessment, working with agency and labor/union representatives to answer the questions listed in the text box. Engaging the union in this process early on can be an effective way to gain support for the self-assessment. The project director should get the opinions and perspective about the project from people at all levels in the organization.
Exhibit 6: Important planning questions for administration of the assessment tool
Reporting. After the individual-level data collection is complete and a cross-agency representative enters the cross-agency team consensus answers in the online form, ICF runs the automated summary report and emails it to the agency, typically within 3 business days. A sample automated summary report can be found in Appendix 6.
Step 4: Conduct agency orientation
The project director or a designee conducts an orientation for all participants (Appendix 7 contains a sample script). The orientation can take place at roll call, by email, through the agency newsletter, or in some other medium. Following is a list of points to make in the orientation:
Why the agency is conducting the self-assessment.
Support for the self-assessment from the chief and the project director and any other agency leaders.
Benefits of completing the survey (e.g., to identify gaps in training).
The process for online data collection (e.g., an email with a link).
Deadline for completing the forms.
Anonymity and confidentiality assurance.
Agency plans for sharing the results of the assessment.
Appropriate times to complete the form (e.g., free patrol time).
Demonstration of labor/union groups' support for the initiative.
Need for respondents to complete the survey independently.
Ways to contact the project director for help or answers on completing the form.
Cross-agency team
Step 1: Cross-agency team creates consensus scores for assigned modules
After the cross-agency team’s initial orientation meeting to explain the self-assessment process, the three subgroups appointed by the project director meet separately to discuss their assigned modules: community partnerships, problem solving, and organizational transformation.
Before the cross-agency subgroup meetings, the project director distributes the hardcopy cross-agency form, which lists the assessment items (see Appendix 4). Members of each subgroup work together to complete the assigned module and then record the consensus answers on a master paper form. Designated cross-agency team members collect the forms and check facts. Although most questions can be answered during group discussions, some may require additional information from other organizations. Completing the assigned module should take about 1 to 2 hours.
One subgroup member takes notes to provide a snapshot of the discussions to help the cross-agency chairperson interpret the automated summary report. Although the notes are not built into the online function, attaching the information to the automated report can provide deeper detail for the agency CEO.
Step 2: Full cross-agency team discusses results
After the three cross-agency team subgroups reach consensus answers on their assigned modules, the agency has the option to bring the entire cross-agency team (i.e., all three subgroups) together to brief each other and provide additional feedback. The cross-agency chairperson facilitates this meeting and records any subsequent changes on the master paper form.
Step 3: Cross-agency representative enters scores into online assessment
After the cross-agency team reaches consensus on all questions in the CP-SAT, the chairperson or designated representative enters the answers in the online CP-SAT.
Distribute surveys to individual officers, supervisors, command staff, civilian staff, and community partners
The online CP-SAT software is Vovici Survey Workbench, a comprehensive, Web-based feedback system. Survey Workbench offers an intuitive interface that helps users create, manage, and report data. Vovici can be adapted for large, complex surveys, allowing for many question types, advanced branching, SSL encryption, persistence, unlimited survey length, and an unlimited number of participants per survey. Vovici Survey Workbench also allows users to email survey links and reminders to participants and track each participant's survey status (e.g., survey link accessed, survey completed). Vovici servers host the software and data, which can be accessed through a Web browser, eliminating problems with downloading software and hosting on individual computers. Vovici Survey Workbench users have real-time access to survey data and reporting through the user interface, or data can be exported into a CSV or SPSS file for reporting in another software program.
Administer assessments online
The online CP-SAT is administered to six different participant groups: officers, supervisors, command staff, civilian staff, community partners, and the cross-agency team. An agency can distribute the CP-SAT assessment link to each of the participant groups in two ways: (1) using an agency listserv or (2) using the Vovici Survey Workbench software.
The first option may be easiest if the agency has a separate email list for officers, supervisors, command staff, and civilian staff. If an agency lacks such email lists, or if the agency plans to track who has and has not completed the survey, it is best to distribute the survey link using the Vovici Survey Workbench software (option 2).
Option 1: Use agency email lists to distribute the CP-SAT link to agency personnel and community stakeholders.
In an email, ICF International provides the project director with a link to the CP-SAT assessment.
The project director emails agency personnel and community stakeholders the assessment link and the passwords they will use to identify their participant group, as presented in Exhibit 7. The project director sends a separate email to each participant group containing only that group's password. For example, all command staff receive one email that provides only the "command" password; the email does not list any other participant group passwords. Passwords are case sensitive and are all lowercase, so they should appear in the email in all lowercase.
Exhibit 7: CP-SAT passwords

Participant type        CP-SAT password
Officers                officer
Supervisors             supervisor
Command Staff           command
Civilian Staff          civilian
Community Partners      partner
Cross-Agency Team       agency
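The rule above, that each group's email contains only that group's password, can be sketched in a short script. This is purely an illustration, not part of the CP-SAT software; the roster structure and email addresses are hypothetical examples.

```python
# Sketch: build one draft email body per participant group so that each
# group's message contains only that group's CP-SAT password.
PASSWORDS = {
    "Officers": "officer",
    "Supervisors": "supervisor",
    "Command Staff": "command",
    "Civilian Staff": "civilian",
    "Community Partners": "partner",
    "Cross-Agency Team": "agency",
}

def draft_emails(roster, url):
    """roster maps group name -> list of email addresses.
    Returns group name -> (recipients, body); each body shows only that
    group's password, in all lowercase."""
    drafts = {}
    for group, recipients in roster.items():
        password = PASSWORDS[group].lower()  # passwords are case sensitive
        body = (
            f"Assessment URL: {url}\n"
            f"Password: {password}\n"
        )
        drafts[group] = (recipients, body)
    return drafts

# Hypothetical roster; a real agency would use its own email lists.
roster = {
    "Officers": ["joe.smith@email.com", "maria.martinez@email.com"],
    "Command Staff": ["jane.anderson@email.com"],
}
drafts = draft_emails(roster, "www.example.com/survey")
```

The project director would then paste each group's body into a separate message addressed only to that group's recipients.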
Exhibit 8 provides sample language for the distribution email.
Exhibit 8: Example self-assessment distribution email |
Dear [XX agency] CP-SAT participants,
This email provides you with a link to your Community Policing Self Assessment Tool (CP-SAT) and the password to access it. I strongly encourage you to complete this assessment. It is very important to our agency because it tells us how we are doing in our implementation of community policing. It will also guide us in identifying areas we need to improve.
Your responses to this survey are confidential and anonymous. The agency receives no individual identifiers on the data, and the agency cannot link individual data to a specific email address. This self-assessment is not a test, and it has no right or wrong answers. Please answer each question honestly. The assessment will take you approximately ½ to 1 hour of your time.
Assessment URL: www.example.com/abcdepfghijklmnopqrstuvwxyz Password: officer
Please complete the assessment by [Day of the week, MM/DD/YYYY]. If you have any questions, please contact [First Name Last Name] at [555-555-5555].
Thanks, [First Name Last Name] [Chief of Police] [XX Police Agency] |
The project director sends reminder emails to all participants midway through the data collection period.
Exhibit 9 provides sample language for a reminder email.
Exhibit 9: Example reminder email |
Dear [XX agency] CP-SAT participants,
This is a reminder to participate in the Community Policing Self Assessment Tool (CP-SAT). If you have already completed the assessment, thank you for your time. If you have not yet completed the assessment, I strongly encourage you to complete it. This assessment is very important because it tells us how we are doing in our implementation of community policing. It will also guide us in identifying areas we need to improve.
Your responses to this survey are confidential and anonymous. The agency receives no individual identifiers on the data, and the agency cannot link individual data to a specific email address. This self-assessment is not a test, and it has no right or wrong answers. Please answer each question honestly. The assessment will take you approximately ½ to 1 hour of your time.
Assessment URL: www.example.com/abcdepfghijklmnopqrstuvwxyz Password: officer
Please complete the assessment by [Day of the week, MM/DD/YYYY]. If you have any questions please contact [First Name Last Name] at [555-555-5555].
Thanks, [First Name Last Name] [Chief of Police] [XX Police Agency] |
Option 2: Use the Vovici Survey Workbench software to distribute the CP-SAT.
ICF will distribute the CP-SAT assessment link through the Vovici Survey Workbench. The agency project director sends an email to ICF with the following five items attached:
Participant list.
Name and email address of the person who will send the email to participants.
Text for the distribution email.
Text for the reminder email.
Assessment timeline.
Assemble participant list
ICF needs a Microsoft Excel list of participants to upload to the Web-based survey software. The Excel document should be formatted with two columns of information for each participant: email address and password. Please note that headings (e.g., “Email”) should be included in the Microsoft Excel document. See Exhibit 10 below for an example.
Exhibit 10: Example Microsoft Excel participant list structure

Email                       Password
joe.smith@email.com         officer
jane.anderson@email.com     command
suzy.que@email.com          civilian
maria.martinez@email.com    officer
frank.thomas@email.com      partner
matthew.davis@email.com     supervisor
Passwords identify the category of participant, as presented in Exhibit 11. All participants of a given type (e.g., supervisors), both agency personnel and community partners, use the same password (e.g., "supervisor"). Please note that passwords are case sensitive; they should be entered in all lowercase letters.
Exhibit 11: CP-SAT passwords

Participant type        CP-SAT password
Officers                officer
Supervisors             supervisor
Command Staff           command
Civilian Staff          civilian
Community Partners      partner
Cross-Agency Team*      agency
*Only the chairperson of the cross agency team should be included in the participant list. Other members of the cross-agency team should not receive the assessment URL and “agency” password. If cross-agency team members are also taking individual-level assessments, team members should be included in the participant list with their appropriate individual-level password (e.g., officer), but not their cross-agency team password (i.e., agency).
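Before sending the list to ICF, the password rules above (valid lowercase values, and only the chairperson assigned "agency") can be checked automatically. This is a hypothetical sketch, not part of the CP-SAT tooling; the list structure mirrors Exhibit 10.

```python
# Sketch: validate a participant list before submitting it to ICF.
VALID_PASSWORDS = {"officer", "supervisor", "command", "civilian", "partner", "agency"}

def check_participant_list(rows):
    """rows: list of (email, password) tuples. Returns a list of problems found."""
    problems = []
    for email, password in rows:
        if password != password.lower():
            problems.append(f"{email}: password must be lowercase")
        elif password not in VALID_PASSWORDS:
            problems.append(f"{email}: unknown password '{password}'")
    # Only the cross-agency chairperson should receive the "agency" password.
    agency_count = sum(1 for _, p in rows if p == "agency")
    if agency_count > 1:
        problems.append("only the cross-agency chairperson should use 'agency'")
    return problems
```

An empty return value means the list is ready to save as a .csv file.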
The Microsoft Excel participant list file should be saved as a .csv document for uploading to the Vovici Survey Workbench software. To save this file as a .csv file, you must first delete the unused sheets from the workbook (e.g., Sheet2 and Sheet3) by right-clicking on the unused tabs and selecting "Delete."
Create a folder named “CPSAT” on your hard drive as the destination for saving the .csv file.
In the Microsoft Excel document "File" menu, select "Save As." In the "Save As" box, choose the "CPSAT" folder. Type "CPSATParticipantsCSV" in the "File Name" text box to designate CPSATParticipantsCSV as the name of the .csv participant file. In the "Save as type" drop-down menu, select "CSV (Comma delimited) (*.csv)" and click "Save."
Select “yes” if you receive a Microsoft Excel prompt that notifies you that “CPSATParticipantsCSV.csv may contain features that are not compatible with CSV (comma delimited).”
Customize the email address and name of the person sending the distribution email
The Web-based survey software program can customize the email address and display name of the person who will send the distribution emails. This email address and name should belong to an individual who has the credibility to encourage recipients to open the email and participate in the assessment (e.g., the agency CEO).
Before the assessment distribution email is sent from the person specified, it is important for the project director to check with the agency's information technology (IT) department to identify any potential issues with the assessment distribution email (e.g., spam filters). The project director should specifically ask whether an email that comes from outside the agency but appears to be from an internal agency individual will be blocked by the agency's email filtering (e.g., anti-spoofing rules). If the IT department cannot definitively determine whether this practice is allowed, contact ICF about sending a test email from the Vovici Survey Workbench software to the project director.
Write the assessment email with link to CP-SAT
ICF, using the Vovici Survey Workbench application, will distribute a customizable email invitation to participants. The project director provides ICF International with the text for the assessment distribution email. Exhibit 12 provides example language that you can adapt, as appropriate. Please note that the password is individualized for each participant based on the Microsoft Excel participant list described above.
Exhibit 12: Example assessment distribution email text |
Dear [XX agency] CP-SAT participants,
Below you will find your Community Policing Self Assessment Tool (CP-SAT) URL and password. I strongly encourage you to complete this assessment. This assessment is very important to assess how well our agency is implementing community policing and to identify areas for improvement.
Your responses to this survey will be kept confidential. There are no individual identifiers in the data that the agency will receive, and the agency will not be able to link an individual’s data to their email address. This is not a test, and there are no right or wrong answers. Please answer each question honestly. The assessment will take you approximately ½ hour to 1 hour of your time.
Assessment URL: [www.example.com/abcdepfghijklmnopqrstuvwxyz] Password: [officer]
Please complete the assessment by [Friday, MM/DD/YYYY]. If you have any questions please contact [First Name Last Name] at [555-555-5555].
Thanks, [First Name Last Name] [Chief of Police] [XX Police Agency] |
Reminder email text
Vovici Survey Workbench can send reminder emails to participants who have not completed the assessment. The project director provides the text for the email to ICF (See Exhibit 13).
Exhibit 13: Example reminder email text |
Dear [XX agency] CP-SAT participants,
This is a reminder to participate in the Community Policing Self Assessment Tool (CP-SAT). I strongly encourage you to complete it. This assessment is very important to assess how well our agency is implementing community policing and to identify areas for improvement.
Your responses to this survey will be kept confidential. There are no individual identifiers in the data that the agency will receive, and the agency will not be able to link an individual’s data to their email address. This is not a test, and there are no right or wrong answers. Please answer each question honestly. The assessment will take you approximately ½ hour to 1 hour of your time.
Assessment URL: [www.example.com/abcdepfghijklmnopqrstuvwxyz] Password: [officer]
Please complete the assessment by [Day of the Week], [MM/DD/YYYY]. If you have any questions please contact [First Name Last Name] at [555-555-5555].
Thank you, [First Name Last Name] [Chief of Police] [XX Police Agency] |
Plan assessment timeline
The agency project director supplies ICF with the following dates:
Begin date. The date the initial email with the CP-SAT link and password is sent to participants.
End date. The deadline for participants to complete and submit the assessment. This date is announced in the distribution email and the reminder email. ICF downloads the data the day after the end date to prepare the report. The data collection period, the time between the begin and end dates, is approximately 2 to 3 weeks.
Reminder date. The date a reminder email is sent to participants who have not completed and submitted the assessment. Typically the reminder is sent at the midpoint of data collection.
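The midpoint reminder date can be computed directly from the begin and end dates. A quick sketch using Python's datetime module; the dates shown are examples, not prescribed values.

```python
from datetime import date

def reminder_date(begin, end):
    """Return the midpoint of the data collection period."""
    return begin + (end - begin) // 2

# Example: a 2-week collection window.
begin = date(2010, 6, 1)
end = date(2010, 6, 15)
midpoint = reminder_date(begin, end)
```

The same arithmetic works for any window length, including the 3-week maximum suggested above.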
The project director sends an email to ICF with these attachments: the .csv participant list; name and email address of the person sending the distribution email; text for the assessment distribution email; text for the reminder email; and the assessment timeline. Address the email to the CP-SAT Administrator at ICF (CPSAT@icfsurveys.com).
Administration frequently asked technological questions
Q. What does the project director do if one or more participants deleted the survey email?
A. If you used option 1 and distributed the CP-SAT assessment link via an agency email list, you can email the participant the survey link and password.
A. If you used option 2 and distributed the CP-SAT assessment link using the Vovici Survey Workbench software, ICF can change the participant status in the Vovici Web survey software system and re-send the invitation. The project director should collect the addresses of all participants who have deleted their emails before contacting ICF.
Q. How does the project director check the participant status for who has and has not completed the assessment?
A. This is applicable only if you used option 2 and distributed the CP-SAT using the Vovici Web survey software system. Vovici's participant selector feature allows you to check the completion status of all participants or of specific individuals. At two requested points during the data collection period, ICF can send you a Microsoft Excel file displaying participant status. Data in the "completed" column indicates that the participant has completed the assessment. Data in the "started" column but not the "completed" column indicates that the participant has clicked on the assessment link and completed between 0 and 99 percent of the assessment. No data in either the "started" or "completed" column indicates that the participant has not clicked on the survey link and has completed none of the CP-SAT.
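The interpretation rules in this answer can be expressed as a small function. This is an illustration of the rules only; the actual status file comes from ICF, and the field handling here is an assumption.

```python
def participant_status(started, completed):
    """Classify a participant from the status export.

    `started` and `completed` are the cell values from the Excel file;
    an empty string or None means the column is blank.
    """
    if completed:
        return "completed"      # data in the "completed" column
    if started:
        return "in progress"    # link clicked, 0-99% of items answered
    return "not started"        # link never clicked
```

Applying the function to each row of the status file gives the project director a quick count of who still needs a reminder.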
Q. What does the project director do about incorrect email addresses?
A. This is applicable only if you used option 2 and distributed the CP-SAT using the Vovici Web survey software system. Contact ICF with a list of email addresses that were initially provided incorrectly, and provide the correct email addresses. ICF will re-distribute the CP-SAT invitation to the corrected email addresses.
Q. If a participant is marked as “completed” in the status, but claims to have not completed the assessment, what can the project director do?
A. In this case, the only remedy is to clear the participant's responses and re-enter the name in the participant list. Contact ICF and provide the participant's information. ICF will clear the participant's status, re-enter the name in the participant list, and re-distribute the assessment link, either through the Vovici application if option 2 was used for CP-SAT distribution or through an email with a link and password if option 1 was used.
COPS offers an automated summary report in Microsoft Excel to help agencies combine and display CP-SAT data in an easy-to-read format. This automated report imports the raw data, conducts descriptive analyses, and presents the data in a summary report. The automated summary report provides the mean scores for each section of the CP-SAT. Appendix 6 gives an example of the summary report. Appendix 8 provides details on the specific items averaged to create each exhibit in the summary report. Because the summary report is automated, it is simply an overview of the agency's results, and it does not provide action items for a specific agency. The following steps explain how to obtain and use the automated summary report.
Step 1: Run automated summary report
A summary report is created in Microsoft Excel to help agencies combine and display their data in an easy-to-read format. The summary report summarizes the data for up to 1,000 participants. ICF emails the Microsoft Excel-based report to the agency within 3 to 5 business days of the end of data collection. The report is named "CPSAT_REPORT_date_time," where date and time are the specific date and time the report was run.
Step 2: Access raw data
Unhide raw data worksheets in the report
Although the raw data cannot be viewed immediately in the summary report, it is included in hidden worksheets. If an agency wants to analyze data beyond what is displayed in the automated report, it can access the raw data by un-hiding the worksheets. Follow these steps to unhide the raw data worksheets:
From the “Format” menu, select “Sheet””Unhide.”
Choose “Sheet1.”
Click “ok.”
Repeat steps 1 to 3 for Sheet2 through Sheet10
Access data through SPSS
To view and analyze data in an SPSS (.sav) file, agencies can request that ICF export their data directly from the Vovici Survey Workbench Web page.
As with other segments of the process listed in this guide, strategic and action planning is an optional (but recommended) stage to maximize the usefulness of the self-assessment.
Step 1: Project director assembles a review team to examine findings and analyze their meaning for the agency
While ICF is running the summary report, the project director assembles a review team to examine the findings from the self-assessment and analyze what they mean for the agency. The project director and the cross-agency team chairperson should be members of this review team because they have been involved throughout the process. The review team is tasked with the following activities:
Examine the automated summary report of the self-assessment findings that ICF submits to the CEO.
Obtain input on the results from stakeholders.
Make recommendations for future action; the project director then presents the recommendations to the CEO.
The review team consists of about 10 people, including the project director and the cross-agency team chairperson. The project director may include other team participants, such as the following:
Other members of the cross-agency team.
Members of the agency’s research and planning unit.
Representatives from all levels of the organization: officers, supervisors, command staff, and civilian staff that were not part of the cross-agency team.
Representatives from labor/union groups.
Representatives from the community or partner organizations who were not part of the cross-agency team.
Members of other local law enforcement agencies.
Criminal justice researchers from local universities.
Members of the review team should have the same characteristics as cross-agency team members. They should be critical thinkers, but, more important, they should be innovative thinkers who can help develop proposed action items for the future.
Step 2: Review team examines the summary report
The automated Microsoft Excel summary report was created to help agencies combine and display their data in an easy-to-read format. The automated summary report provides the mean scores for each section of the CP-SAT. Agency scores above 3.0 indicate that agency stakeholders generally feel that the component of community policing is being implemented to some extent. Scores above 3.0 on a specific exhibit mean that participants agreed more than they disagreed with the statements that were averaged for that particular section. Appendix 8 provides details on the specific items averaged to create each exhibit in the summary report. The CP-SAT results and summary report do not grade the agency on its community policing efforts; rather, they measure the agency's progress compared with its goals. Because the summary report is automated, it is simply an overview of the agency's results, and it does not provide action items specifically for the agency.
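The scoring rule above, averaging the items in a section and comparing the mean with 3.0, can be illustrated with a short calculation. This is a sketch of the interpretation rule, not the report's actual formula; the item values are hypothetical responses on the CP-SAT's agreement scale.

```python
from statistics import mean

def section_score(responses):
    """Mean of item responses for one section (assumed 1-5 agreement scale)."""
    return round(mean(responses), 2)

def more_agreement_than_disagreement(score, threshold=3.0):
    """Per the summary report's rule, scores above 3.0 indicate that
    participants agreed more than they disagreed."""
    return score > threshold

# Hypothetical responses to five items in one section.
score = section_score([4, 3, 5, 2, 4])
```

Here the mean of 3.6 would be read as the component being implemented to some extent, though the review team would still examine the individual items behind it.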
The review team reviews the automated summary report with the project director and discusses ways to obtain comments from stakeholders and make recommendations for future action. The review team also identifies the agency’s strengths and gaps in community policing implementation, with particular attention to ways to add value to the existing strengths and address the gaps. As review team members examine the summary report, they may consult the resources listed by topic in Appendix 9. The references can help the team formulate recommendations.
Step 3: Review team obtains comments from stakeholders
The review team should talk with internal and external stakeholders to hear their comments on the results of the self-assessment and solicit recommendations for future action.
Step 4: Review team makes recommendations for future action
After gathering comments from stakeholders, the review team will develop recommendations for future action in the agency. The recommendations could include additional in-service training on problem-solving techniques, ways to build stronger partnerships, and changes to performance evaluations.
Step 5: Project director compiles recommendations from the review team to present to the CEO
The project director compiles the review team’s recommendations in a final report to present to the CEO.
Step 6: Agency CEO decides actions to pursue, including disseminating the results
The CEO reviews the recommendations and chooses the ones to pursue. The CEO may refer staff who will implement the recommendations to the resources listed in Appendix 9. The CEO also makes the final decisions about which results to disseminate and to whom. The TIP boxes list information that could be disseminated, who should receive the information, and ways to disseminate the information.
TIP: Items to consider disseminating
TIP: Possible recipients of the findings
TIP: Ways to disseminate the findings
The Internet references cited in this publication were valid as of April 2010. Because URLs and Web sites are in constant flux, neither the authors nor the COPS Office can vouch for their current validity.
Baker, T.E. Effective Police Leadership: Moving Beyond Management. Flushing, New York: Looseleaf Law Publications, Inc., 2000.
Eck, J. E. and W. Spelman. Problem-Solving: Problem-Oriented Policing in Newport News. Washington, D.C.: Police Executive Research Forum, 1987.
Ford, J. K. Organizational Survey: An Overview. Michigan State University, School of Criminal Justice, 2004. http://www.cj.msu.edu/~outreach/cp/survey.html
Goldstein, H. Problem-Oriented Policing. New York: McGraw-Hill, 1990.
Goldstein, H. "Improving Policing: A Problem-Oriented Approach," Crime and Delinquency 25 (1979): 236–258.
Greene, J.R. "Community Policing in America: Changing the Nature, Structure, and Function of the Police." Policies, Processes, and Decisions of the Criminal Justice System 3 (2000): 299–370.
Grinc, R.M. "'Angels in Marble': Problems in Stimulating Community Involvement in Community Policing," Crime & Delinquency 40, no. 3 (1994): 437–468.
Haberfeld, M.R. Theories of Police Leadership. Upper Saddle River, New Jersey: Pearson Prentice Hall, 2006.
Maguire, E.R. and S.D. Mastrofski, “Patterns of Community Policing in the United States,” Police Quarterly 3 (2000): 4–45.
Michigan Regional Community Policing Institute. Community Policing: A Road Map for Change. Michigan State University, School of Criminal Justice, no date. http://www.cj.msu.edu/~outreach/rcpi/roadmap.pdf
RAND Corporation. A Measurement Model for Estimating Community Policing Implementation. Santa Monica, California, 2000.
Trojanowicz, R.C. Community Policing Guidelines for Police Chiefs. 1994. http://www.policenet.com/compguide.html
Trojanowicz, R. and B. Bucqueroux. Community Policing: How to Get Started. Cincinnati: Anderson Publishing Co., 1994.
Trojanowicz, R. and B. Bucqueroux. "Toward Development of Meaningful and Effective Performance Evaluations." East Lansing, Michigan: Michigan State University, 1992. http://www.cj.msu.edu/~people/cp/toward.html
Trojanowicz, R. C., V.E. Kappeler, L.K. Gaines, and B. Bucqueroux. Community Policing: A Contemporary Perspective, 2nd ed. Cincinnati: Anderson Publishing Co., 1998.
Western Regional Institute for Community Oriented Public Safety. Onsite Assessment Process. Washington, DC: U.S. Department of Justice Office of Community Oriented Policing Services, no date.
Ziembo-Vogl, J. and D. Woods, "Defining Community Policing: Practice versus Paradigm," Police Studies: International Review of Police Development 19, no. 3 (1996): 33–50.
Zhao, J. Why Police Organizations Change: A Study of Community-oriented Policing. Washington, D.C.: Police Executive Research Forum, 1996.
Zhao, J., O.C. Thurman, and N.P. Lovrich, “Community-Oriented Policing across the U.S.: Facilitators and Impediments to Implementation,” American Journal of Police 1 (1995): 11–28.
1 Office of Community Oriented Policing Services, “What is Community Policing.” http://www.cops.usdoj.gov/Default.asp?Item=36; Office of Community Oriented Policing Services, “Community Policing Defined.” http://www.cops.usdoj.gov/RIC/ResourceDetail.aspx?RID=513