

Community Policing Self-Assessment Tool:

Documenting Today and Planning for Tomorrow

A User’s Guide

DRAFT










ICF International

Police Executive Research Forum

U.S. Department of Justice Office of Community Oriented Policing Services









April 2009







Community Policing Self-Assessment Tool:

Documenting Today and Planning for Tomorrow

A User’s Guide



Bruce Taylor

Deirdre Weiss

Drew Diamond

Police Executive Research Forum



Rebecca Mulvaney

Beth Heinen

Candace Cronin

ICF International



Rob Chapman

Matthew Scheider

Office of Community Oriented Policing Services
















This project was supported by Cooperative Agreement Number 2004-CK-WX-K028 awarded by the Office of Community Oriented Policing Services, U.S. Department of Justice. Points of view or opinions contained in this document are those of the authors and do not necessarily represent the official position or policies of the U.S. Department of Justice, Caliber, or the Police Executive Research Forum and its members. Rather, the references are illustrations to supplement discussion of the issues.




Acknowledgments


We would like to take this opportunity to express our sincere gratitude to a number of organizations and individuals who made this effort possible. First, we thank the police agencies who participated in our pilot testing. These agencies provided critical feedback and numerous suggestions that improved the quality of the self-assessment tool. We are truly grateful for the time, resources, and commitment these departments put into the piloting effort. The Appleton (Wisconsin) Police Department piloted the first version of the self-assessment tool and contributed greatly to this document. The department took on an enormous task—with a much longer instrument—and did so with enthusiasm and good humor. We are particularly grateful to former Chief Richard Myers, Captain Julie Bahr, and Fiscal Resources Manager Sue Ann Teer. We are also grateful to the staff of the Lowell (Massachusetts) Police Department, particularly former Superintendent Edward Davis and Research Director Sharon Hanson, who assisted with the second pilot test and provided suggestions about how to develop support for the self-assessment. The Charlotte-Mecklenburg Police Department served as the third pilot site and provided valuable advice about how to administer the self-assessment forms. We extend special recognition to Chief Darrel Stephens; Paul Paskoff, director of research, planning, and analysis; and John Couchell, research team supervisor. The Ocala (Florida) Police Department served as the fourth site. Chief Samuel Williams and Major Rick Lenemier provided thoughtful suggestions about how to encourage officers to participate in the self-assessment. We also thank the Leesburg (Virginia) Police Department, which was the final site for the paper-based version and was able to complete the piloting in a short amount of time before the end of 2006. Thanks to Chief Joseph R. Price and Captain Jeff VanGilder for their dedication to completing the piloting during an otherwise busy time of the year and for their suggestions about how to partner with local agencies in this self-assessment process. Finally, we are very grateful to the Gaithersburg (Maryland) Police Department for their willingness to serve as the pilot test site for the online version of the instrument. In particular, Chief John King and Captain Chris Bonvillain were instrumental in allowing us to gather invaluable suggestions and feedback on the usability of the web-based assessment.


We are grateful to the numerous police chiefs, sheriffs, and researchers at Regional Community Policing Institutes whose input helped the project team develop and refine the community policing framework that is the basis of the self-assessment tool. We also received invaluable feedback from the policing professionals and others who attended conference presentations about the project, including presentations at the Police Executive Research Forum’s Annual Meeting, the National Crime Prevention Council Conference, the International Association of Chiefs of Police Annual Conference, a National Sheriffs’ Association meeting, and the Department of Justice Office of Community Oriented Policing Services (the COPS Office) Conference. This project was also guided by Technical Advisory Group (TAG) members who assisted with the development of the community policing framework and with the review of the tool. TAG members were Richard Bennett, American University; Dennis Kenney, John Jay College; Chief Richard Myers, Appleton Police Department; and Norm Peterson, University of Minnesota. Thanks also to the police practitioners and researchers who assisted with the review of the self-assessment tool or provided other guidance and suggestions throughout the course of the development of the self-assessment tool. Special thanks to Bonnie Bucqueroux, Michigan State University; Ed Connors, Institute for Law and Justice; Gary Cordner, Eastern Kentucky University; Deputy Chief Ron Glensor, Reno Police Department; Chief Robert Lunney, ret. (insert name of his former police department); Melissa Schaefer Morabito, Rutgers University; Dennis Rosenbaum, University of Illinois at Chicago; Captain Mike Wells, ret., Concord Police Department; and Jerry Williams, University of Colorado at Denver.


Finally, we thank our project monitor, Rob Chapman of the COPS Office, for his steady hand in guiding this project. We benefited greatly from his input and feedback at all stages in the project. We also received tremendous support throughout the project from COPS Office Director Carl Peed and Assistant Director Matthew Scheider. Our thanks also to Amy Schapiro and Cynthia Pappas for their suggestions and input into this project.



********************


Authors’ Note: This project is a first attempt at developing a community policing assessment tool. The authors of this assessment and guide welcome feedback about the implementation of the tool and the tool itself. Direct questions and comments to Rob Chapman at 202.514.8278.

Introduction



This guide is intended to be a resource for agencies planning to conduct the Community Policing Self-Assessment Tool (CP-SAT). It provides background information on CP-SAT and detailed suggestions for implementation, and it is written primarily for the chief or sheriff and for the individual(s) he or she appoints to direct the CP-SAT effort within the agency.


During the last 20 years, increasing numbers of policing agencies across the United States have identified themselves as community policing agencies. In fact, one is hard-pressed to find a policing agency that does not claim to have adopted community policing in some form. During this time, the policing profession has come to view community policing as an effective way to address crime and disorder problems and improve community satisfaction with police services. Yet the methods agencies use in implementing community policing vary, and few tools are available for assessing agency efforts.1 The CP-SAT seeks to serve as the first comprehensive tool for agencies to self-assess their implementation of community policing and is an important step in advancing community policing. It is the authors’ sincere hope that this self-assessment tool will help agencies add value to and enhance their existing efforts by identifying their strengths and gaps in implementing community policing.


During the development phase, this tool was pilot tested in five police agencies of various sizes; however, as with any first attempt to develop something of this magnitude, it likely will be necessary to modify and build on this instrument over time. In addition to its use by local agencies, the tool can be used and improved by researchers, helping to build a body of knowledge about what it means to implement and advance community policing.


Structure of the User’s Guide


This user’s guide provides instruction and advice for administering and analyzing data collected through the CP-SAT. It is divided into two main sections, “Introduction to the Self-Assessment of Community Policing Process” and “Conducting the Self-Assessment of Community Policing.” The first section provides an overview of the self-assessment process and tool, reasons for implementing the self-assessment, and the structure of the tool. The second section provides nuts-and-bolts guidance about how to complete the assessment. It focuses on the planning process, implementing the tool, data analysis and interpretation, reporting, and strategic and action planning for the future. The guide also includes a resource section to assist agencies with strategic and action planning resulting from the self-assessment process.

Some of the processes and analyses discussed in the user’s guide will not be relevant to all departments. The authors encourage agencies to customize these methods to what will work best in their organization or community. The assessment tool includes many different strategies and approaches. Although it is unlikely that any one agency is engaged in every activity covered by the CP-SAT, conducting the full assessment may spur creative ideas for strategies and approaches that an agency may want to consider for the future.






















PART I: Introduction to the Community Policing Self-Assessment Process

PURPOSE OF THE SELF-ASSESSMENT PROCESS AND TOOL


The Community Policing Self-Assessment Tool (CP-SAT) operationalizes the philosophy of community policing and allows agencies to measure and evaluate their implementation efforts across three elements (community partnerships, problem solving, and organizational transformation) and associated subelements. The tool, which can be administered online or via paper-and-pencil surveys, does not assess specific community policing programs, but rather the three commonly accepted elements, based on a definition of community policing that is applicable to police departments and sheriff’s offices of all types and sizes throughout the United States.



Definition of Community Policing


Community policing is a philosophy that promotes organizational strategies to establish and routinely use community partnerships and systematically apply problem-solving techniques to proactively address the underlying conditions that give rise to crime, social disorder, and fear of crime.


Community policing is a philosophy that promotes and supports organizational strategies to address the causes and reduce the fear of crime and social disorder; agencies are expected to implement or enhance community policing strategies that illustrate community partnerships, problem solving, and organizational commitment.2


Description of Community Policing


Community policing focuses on crime, social disorder, and fear of crime through the delivery of police services that include aspects of traditional law enforcement, as well as prevention, problem solving, community engagement, and partnerships. The community policing model balances reactive responses to calls for service with proactive problem solving centered on the causes of crime, disorder, and fear of crime. Community policing requires police and citizens to join as partners in the course of both identifying and effectively addressing these issues.3



The CP-SAT recognizes that the specific activities of community policing agencies may look different in communities of various sizes across the United States because of a number of factors and considerations, but it maintains that these police departments and sheriff’s offices will share commonalities with others through the commonly accepted key philosophical principles of community policing. These key principles are outlined in Figure 1. Agencies that hold this same philosophical approach to community policing and embrace the broad tenets of the community policing definition will find that the CP-SAT will help them learn more about the state of their community policing implementation. The tool will not work effectively for agencies that hold fundamentally different views of community policing (e.g., viewing community policing solely as a collection of programs). As a first step, agencies should review the framework for community policing principles to familiarize themselves with the concepts assessed in the tool.


Figure 1: Framework for Community Policing Principles.




It is also important to stress that the CP-SAT is a process assessment tool, not an impact assessment tool. In other words, the tool focuses on the processes used by agencies implementing community policing (e.g., how well is an agency implementing community policing?) rather than the results of those processes (e.g., what are the effects of an agency’s implementation of community policing?). This is not to minimize the importance of assessment or evaluation activities that capture effects or outcomes.


To fully assess the implementation of community policing, it is important to have a strong understanding of what community policing comprises in an agency and how it is being implemented, which is what the CP-SAT is meant to provide. Nevertheless, agencies that use the CP-SAT should also consider the various data they may have within their agency that could supplement this assessment by providing information about outcomes of community policing efforts, such as community surveys or crime statistics. The process and outcome data together would provide a rich view of the agency’s community policing. Some departments conduct community surveys that allow them to more fully relate the self-assessment data to various community outcomes.

Importance of Assessing Community Policing Implementation


Agencies engaged in community policing are committed to enhancing trust between themselves and the communities they serve, as well as improving public safety. These efforts require systemic change throughout law enforcement agencies. The self-assessment tool allows agencies to better understand their commitment by obtaining a snapshot of their level of community policing implementation, as well as setting a baseline for their agency efforts and measuring their progress against that baseline. Agencies engaged in the self-assessment process demonstrate their commitment to community policing and to adding value to their existing efforts by identifying the strengths and gaps in their community policing implementation in the context of their agency’s policing style, priorities, strategies, other management prerogatives, resources, and other internal and external factors—all through a nonpunitive, user-friendly, affordable tool.



Agencies will get as much out of the process as they put into it. To make this process credible and the findings reliable, an agency needs to take an honest approach to the self-assessment. The value of the process is in being introspective and transparent. There are no right or wrong answers or findings, nor is there a score that signifies that the agency has effectively implemented community policing, because full implementation of community policing is an ideal. Instead, the CP-SAT provides information about the agency’s strengths and gaps, which the local agency must place into context as part of its interpretation of the results because it is only one part of a fuller picture. This tool, therefore, will assess the journey an agency is taking rather than simply the destination. The CP-SAT provides agencies with the ability to establish a baseline regarding their community policing implementation. It is important for agencies to document their progress toward and maintenance of their goals, as well as identify gaps and opportunities, as a way of demonstrating progress toward shared public safety goals and achieving results.


Agencies can use the results of the CP-SAT for a number of activities, depending on their priorities and the direction in which they choose to proceed with the findings. Such activities could include facilitating strategic planning and benchmarking, assessing agency goals, informing training and management initiatives, and external reporting. This self-assessment tool can also be used to determine the current level of community policing implementation, assisting both executives and officers in developing goals to reach the next steps on the implementation continuum.


Administering the assessment tool will provide departments—many perhaps for the first time—with comprehensive data to guide their change efforts. Some of the primary uses of the data and findings from the self-assessment process include the following:


  • Agency performance measurement. Increasingly, local law enforcement agencies must comply with state and local requirements to submit agency performance data. This self-assessment tool will help departments meet these requests by documenting organizational changes and agency objectives and goals that have been reached.

  • Agency budgeting. Assessment data can support the development and justification of budget proposals. The data can allow executives to make informed choices about where to allocate limited resources and to focus funding on the most productive and efficient practices. It will also allow agencies to better target their needs for grant funding by using the data to document agency needs.

  • Agency development. Assessment data can keep an agency focused on the big picture. The data can facilitate building an infrastructure for law enforcement agencies to critically evaluate their strategies, issues, and problems. It can also empower line officers to implement community policing strategies, enabling them to be more effective in their jobs.

  • Leadership transitions. Assessment data can help incoming chiefs and sheriffs gauge community policing implementation within the agency.

  • Training and development. Assessment results can assist in identifying training and development needs and priorities.


Structure of the CP-SAT


The CP-SAT is a user-friendly tool that agencies can self-administer. The “Conducting the Community Policing Self-Assessment” section of this User’s Guide provides step-by-step assistance for implementing the tool. Law enforcement agencies face tight budgets and need a tool that they can implement with little to no outside assistance. Even so, not all departments will have the time, ability, or desire to administer every aspect of the tool. Some agencies may opt to seek outside assistance (from a local university, community college, or other organization that can perform statistical analyses) with data entry and statistical analyses.


There are two ways to administer the CP-SAT: via online surveys or via paper-and-pencil surveys. While paper-and-pencil administration may be necessary in some cases, the online format offers several advantages. First, it removes the need for data entry (and greatly reduces the likelihood of error). Second, COPS offers an automated reporting feature that imports the data from the survey software and creates a final report, significantly reducing data analysis and reporting time and effort. Finally, it is more convenient for officers and other participants, allowing them to complete the survey from virtually any computer with an Internet connection. In this user’s guide, we have tried to note any steps that are written primarily for agencies administering the paper-and-pencil form.


The CP-SAT is divided into three modules that correspond to the three elements identified in the framework of community policing principles: community partnerships, problem solving, and organizational transformation (Figure 1). The assessment tool is organized into three sections that will allow an agency to document its progress toward building and sustaining community partnerships, its aptitude in applying problem-solving techniques, and the organizational changes instituted in support of community policing.


There are six versions, or forms, of the tool, each distributed to a different group within or outside the organization. Agency structure varies greatly; therefore, each agency should examine which ranks fit best for the first three versions of the tool: officer level, supervisor level, and command staff level. A lieutenant in a small agency, for example, may be considered command staff, while in a large agency the same rank could be considered a supervisor. The tool also includes a civilian form and a form to be completed by key community partners. The sixth version of the assessment tool is the cross-agency version, which will be completed by a cross-section of agency personnel as well as community representatives.


Further details about the community partnerships, problem-solving, and organizational transformation modules are provided below:



Community Partnerships—The extent to which agency staff support and develop collaborative relationships with individuals and organizations in the community. The tool measures three aspects of community partnerships:


  1. The extent to which the agency/officer has multidisciplinary partnerships.

  2. The resources and commitment of the agency’s and officer’s community partners.

  3. The level of interaction between the agency/officer and their community partners.




Problem Solving4—Problem solving is an analytic approach for systematically identifying neighborhood problems through coordinated community and police assessments, collecting and analyzing information about the problems, developing and implementing responses with the potential for eliminating or reducing the problems, and evaluating the responses to determine their effectiveness. Problem solving involves an agencywide commitment to go beyond traditional police responses to crime to actively address a multitude of problems that have an adverse effect on quality of life. Three aspects of problem solving are measured in this survey:


  1. General approach to problem solving.

  2. Problem-solving processes (the SARA model):

    • Identifying and prioritizing problems (Scanning)

    • Analyzing problems (Analysis)

    • Responding to problems (Response)

    • Assessing problem-solving initiatives (Assessment)

  3. General skill in problem solving.




Organizational Transformation—The extent to which the agency environment, personnel, practices, and policies support the community policing philosophy and activities. Four aspects of organizational transformation are measured by this survey:


  1. Agency management.

  2. Organizational structure.

  3. Personnel practices.

  4. Technology and information systems.




Frequency of Applying the Self-Assessment Process

A self-assessment process should not be a one-time-only event. Police organizations should conduct self-assessments initially to gather baseline data and then to examine progress in achieving the agency’s goals or addressing the agency’s priorities against those data. Agencies may choose between two main methods to determine how frequently to apply the self-assessment tool. First, agencies could choose to engage in the self-assessment process on a regular basis (for instance, every 1, 2, or 3 years, or some other predetermined period). This allows the agency to set a baseline and to measure progress against it at regular intervals. Second, agencies may choose to engage in the self-assessment process when their leadership feels that substantial changes have been made. In this case, agencies will peg the administration of the self-assessment tool to perceived changes in the organization and then examine whether any changes have occurred. Examples of times when the assessment could be undertaken include before major strategic planning efforts or in conjunction with annual or biannual department reports to elected officials and governing bodies. In this case, the tool would be implemented on an as-desired basis rather than on a regular schedule (6 months could pass between the first and second implementation of the tool, while 18 months could pass between the second and third). Agencies should consider the option that best fits their needs, time, and budget constraints, as well as the purpose of their effort and what they are seeking to achieve.

Challenges to the Self-Assessment Process


As with any self-assessment, agencies face a number of challenges when engaging in the process and must consider ways to respond to them. One challenge is that agencies may believe their community policing is more advanced than the self-assessment indicates. For example, an agency’s results may indicate that it is not as far along the continuum as it publicly asserts. This can strike some police chiefs and sheriffs as a less-than-desirable outcome, but with planning, the risks of such findings can be minimized.


While there are many benefits to conducting a self-assessment, such as taking stock of successes, learning what an agency may be able to improve, and raising awareness of community policing, several issues—including risks—need to be considered carefully before proceeding. The CP-SAT process is an organizational assessment, not merely a survey. To be fully successful, this effort will require a commitment from the entire organization, the devotion of some resources (mostly personnel), a willingness to be self-reflective, and perhaps most important, the courage to manage the consequences of a candid and transparent self-assessment process. From the outset of the self-assessment process, the organization’s chief executive officer should tell members of the organization and the public that the agency is engaging in a self-assessment effort, describe the benefits that self-evaluation provides, and stress that the agency will meet the challenge of identifying its strengths and gaps and will take steps to improve efforts where there are gaps. It is important to be up front, clear, and transparent from the start about the purpose, goals, and objectives of the assessment, not only within the agency but also with political leadership and the community. Furthermore, it is important for agencies not only to recognize the challenges that are identified in the assessment process, but also to develop and implement steps that address the gaps and strengthen existing efforts.
























PART II: Conducting a Community Policing Self-Assessment


This section of the user’s guide describes the process for conducting a self-assessment of community policing, including implementing the various forms that make up the tool (officer, supervisor, civilian staff, community partners, command staff, and cross-agency). While there are a number of potential ways to organize the structure of the self-assessment process, the agency chief or sheriff should appoint a project director to oversee the self-assessment from start to finish. He or she also would appoint a cross-agency team whose members would select a chairperson to manage activities associated with the team, and a review team to examine the findings of the self-assessment, reach out to stakeholders for input, and make recommendations (see Figure 2). Agencies may choose to modify this structure to meet their specific needs and the availability of personnel. Further details about the activities tasked to each group are described later in this section of the user’s guide.



Figure 2: Organizational Structure of the Self-Assessment Process.



A checklist of the steps in the self-assessment process is shown in Figure 3. The project director should use the list as a guide in making sure that all tasks are completed. (Additional checklists for the cross-agency team and cross-agency chairperson are provided in Appendix 1.) The self-assessment process is divided into two kinds of activities: an individual-level tool to be completed by officers/deputies, supervisors, civilian staff, representatives from community partners, and command staff, and a cross-agency tool to be completed by a team that, at a minimum, is composed of officers/deputies, supervisors, command staff, and community members.

Figure 3: Community Policing Self-Assessment Tool Checklist.



CHECKLIST I: Planning for the Self-Assessment Process


  • Step 1: Initial planning and logistics.

  • Step 2: Select cross-agency team.

  • Step 3: Cross-agency team meets and project director develops detailed plan.

  • Step 4: Conduct orientation.


Checklist II: Implementing the Self-Assessment Tools


Cross-Agency Team

  • Step 1: Cross-agency team completes assigned modules.

  • Step 2: Cross-agency team meets to create consensus scores.


Administer Surveys Online

  • Option 1: Distributing the CP-SAT survey link via agency email lists

  • Option 2: Distributing the CP-SAT survey link via Vovici EFM Community

  • Sending Reminders


Administer the Paper-Based Survey

  • Step 1: Distribution of the Tools.

  • Step 2: Personnel complete and return tools.


CHECKLIST III: Data Analysis and Interpretation


Analyzing and Reporting Online Data

  • Step 1: Preparing the report.

  • Step 2: Exporting raw survey data.

  • Step 3: Running the report.

  • Step 4: Cleaning the report.


Analyzing and Reporting Paper-and-Pencil Based Data

  • Step 1: Track and get to know the data.

  • Step 2: Enter data into the database.

  • Step 3: Clean the data.

  • Step 4: Transform the data.

  • Step 5: Conduct quantitative analysis.

  • Step 6: Develop a codebook.

  • Step 7: Write the report.



CHECKLIST IV: Strategic and Action Planning for the Future


  • Step 1: Project director assembles a review team to examine the findings and determine what they mean for the agency.

  • Step 2: Review team examines the report.

  • Step 3: Review team obtains comments from stakeholders.

  • Step 4: Review team makes recommendations for future action.

  • Step 5: Project director compiles recommendations from the review team to present to the CEO.

  • Step 6: CEO (chief) decides actions to pursue, including disseminating the results.



The rest of this section of the user’s guide provides details about how to complete each activity listed in the Community Policing Self-Assessment Tool Checklist. A specific activity is named at the beginning of each section along with details about ways to complete it. When appropriate, the section includes tips and lists questions to consider. Some sections also reference additional information in the appendixes.


Checklist I: Planning for the Self-Assessment Process


Step 1: Initial Planning and Logistics


Step 1A: CEO Appoints a Project Director


The CEO of the law enforcement agency will appoint a project director who is empowered to oversee the self-assessment process from start to finish and make the assessment happen. The project director’s main tasks include the following:


  • Work with the CEO to promote and build support for the self-assessment

  • Work with the CEO to brainstorm plans for disseminating the results of the self-assessment process

  • Select members of the cross-agency team

  • Provide direction and set out tasks, timeline, and resources to the cross-agency team and cross-agency chairperson

  • Complete orientation with all staff who are going to complete the tool (e.g., during roll call or through a memo)

  • Assemble a review team to examine analytic results

  • Participate on review team, making recommendations for future action

  • Obtain input from stakeholders on the results as part of the review team

  • Compile recommendations from the review team to present to the CEO.


If the paper-and-pencil form is used, the project director’s tasks may also include:

  • Develop plans for distribution and collection and, with the cross-agency chair, for data entry, analysis, and reporting

  • Distribute the various forms of the tool

  • Oversee data entry

  • Oversee data analysis for the officer, supervisor, civilian, community partner, and command staff forms; development of a codebook; summing scales/indices and developing scores; and running statistics and examining results

  • Draft reports on results for the officer, supervisor, and command staff findings

  • Review draft report from the cross-agency chairperson

  • Synthesize reports from the cross-agency chairperson and results from other forms of the assessment into a single report that is submitted to the CEO


The project director serves as the day-to-day manager of the self-assessment process, a spokesperson for this effort, and a resource for the cross-agency team. If the paper-and-pencil form is being administered, the project director may want to designate a staff member with strong data analysis and report writing capabilities to assist with these tasks.



TIP: The project director should be:


  • Empowered to bring others into the self-assessment process

  • Respected by others within the organization

  • Committed to the goals of the self-assessment

  • An effective spokesperson for the self-assessment process

  • A leader in the organization

  • Strong in logistics and planning.


Personnel in some agencies may have concerns about the agency managing both the data collection and the analysis in-house. These agencies should consider an external approach to the assessment. An external consultant, such as a researcher from a local university or research organization, could administer and collect the various officer, supervisor, and command staff forms and analyze the findings.


Step 1B: Build Support


The CEO and the project director should promote and build support for the upcoming self-assessment through briefings, memos, or meetings with stakeholders to explain the self-assessment process, answer questions, and alleviate concerns about the process. By building support for the self-assessment from the beginning of the process, the agency will be more likely to obtain employee support for it. Early in the process, the CEO and project director should also engage the collective bargaining units that represent officers.



TIP: Consider engaging the following groups when promoting and building support for the self-assessment process:


  • All ranks in the police department

  • Labor/union representatives

  • Local public officials, such as the mayor or city manager

  • Key formal and informal leaders in the agency

  • Community partners.


TIP: Agencies with strong collective bargaining units need to be especially sensitive to engaging union leadership in the assessment process. Even passive lack of support can undermine data collection. Agencies should seek to receive active and public support from the union throughout the process, including a letter to the membership expressing the positive aspects of the assessment.




Step 1C: Plan for Dissemination of Results


The CEO and project director should brainstorm plans for disseminating the results of the assessment, as well as whom to include on the review team. Initial brainstorming about dissemination may include an examination of the organizations and persons to which the agency will disseminate results, the information that will be disseminated, and the methods of dissemination. Initial brainstorming about the review team should focus on who will serve on the team. About 10 persons should serve on the review team, along with the project director and cross-agency chairperson. This is also a good time to brainstorm ideas for administering the individual assessment forms (particularly if the paper-and-pencil form is used).



Step 2: Select Cross-Agency Team


The purpose of the cross-agency team is to facilitate discussion among representatives of the various stakeholder groups by encouraging them to come to consensus on scores for each section of the CP-SAT.


The project director selects cross-agency team members (see Figure 4). If possible, the team should be composed of at least 12 persons: three officers, three supervisors, three command staff, and three community members. Agencies may also want to consider having more than three community members serve on the cross-agency team or adding civilian employees or personnel from other departments (e.g., Neighborhood Preservation, Parks and Recreation).



Figure 4: Recommendations for Choosing Cross-Agency Team Members.


The officers on the cross-agency team should:


  • Have baseline experience from which they can speak

  • Be in touch with other officers

  • Have the respect of peers

  • Have extensive familiarity with community policing

  • Be able to speak on behalf of officers in the department

  • Be honest and open—and, if need be, critical—but not obstructionist

  • Be able to describe typical, average experiences and not necessarily their own

  • Be willing to speak up.


The supervisors on the cross-agency team should:


  • Have baseline experience from which they can speak

  • Be in touch with other supervisors

  • Have the respect of peers and subordinates

  • Have extensive familiarity with community policing

  • Be able to speak on behalf of supervisors in the department

  • Be honest and open—and if need be, critical—but not obstructionist

  • Be able to describe typical, average experiences and not necessarily their own.


The command staff on the cross-agency team should:


  • Be knowledgeable about what is going on in the entire department

  • Have the respect of peers and subordinates

  • Have extensive familiarity with community policing

  • Be able to speak on behalf of command staff and the department as a whole

  • Be honest and open—and, if need be, critical—but not obstructionist

  • Be able to describe typical, average experiences and not necessarily their own

  • Have a broad perspective and be able to understand what is going on across the agency.


The community members on the cross-agency team should:


  • Understand the community and know it well

  • Have knowledge of community policing

  • Not be an agency cheerleader

  • Be able to partner and work with the police department

  • Be honest and open—and, if need be, critical—but not obstructionist

  • Be able to describe typical, average experiences and not necessarily their own

  • Have a broad perspective

  • Not have preconceived notions that will bias the results of the evaluation

  • Have the respect of community members

  • Provide a check on the police department’s view of itself by looking at it from the outside

  • Be willing to speak up on behalf of the community.



The project director will also provide direction for the cross-agency team. He or she will set a date, time, and location for a cross-agency team meeting during which he or she will set out tasks, a timeline, and resources. Figure 5 contains a sample timeline for the cross-agency team. The example is for illustrative purposes; the time required to complete these tasks could vary considerably, depending on agency size and other factors.



Figure 5: Sample Timeline for the Cross-Agency Team to Complete Key Tasks.



Planning for the Self-Assessment Process: 1 to 2 weeks


  • The project director nominates members of the cross-agency team.

  • Cross-agency team members meet to review the tool and tasks and select a chairperson; participants divide into three groups to complete one of the three modules: problem solving, community partnerships, or organizational transformation.

  • The project director develops plans for agency staff orientation and the distribution and collection of the assessment forms (if paper-and-pencil-based).

  • The cross-agency chairperson assists the project director in developing plans for data entry, analysis, and reporting (if paper-and-pencil-based).


Implementing the Self-Assessment Tools: 2 to 3 weeks


  • The three cross-agency team groups meet individually to collect data and answer questions in their module.

  • The cross-agency team reviews the findings of the individual modules and completes a single assessment tool.

  • The project director conducts orientation with all staff who will complete the individual-level forms of the tool.

  • The project director manages the administration of the other five forms of the tool: officer, supervisor, command staff, civilian staff, and community partners.


Data Analysis and Reporting: 1 to 4 weeks


  • If the online assessment is used, data analysis and reporting are automated.

  • If the paper-and-pencil version is used, the cross-agency chairperson and the project director oversee the activities of the data analyst, including the following tasks:

    • Tracking and getting to know the data

    • Data entry

    • Data cleaning

    • Data transformations

    • Quantitative analysis

    • Developing the codebook


  • The cross-agency team chairperson drafts a report about the team’s findings.

  • The project director drafts reports about officer, supervisor, command staff, civilian staff, and community partner findings.

  • The project director synthesizes reports into a single report.


Strategic and Action Planning for the Future: 2 weeks


  • The cross-agency team chairperson and the project director examine the final report as part of the review team.

  • The cross-agency team chairperson and the project director obtain input on the results from stakeholders as part of the review team.

  • The cross-agency team chairperson and the project director participate in the review team, making recommendations for future action.




Appendix 1 is a checklist of tasks for the cross-agency team (especially the chairperson) to complete during the self-assessment process; Appendix 2 lists the tasks that the project director should complete.


Step 3: Cross-Agency Team Meets and the Project Director Develops a Detailed Plan


Step 3A:

Cross-Agency Team Members Review the Tool and Tasks, Select a Chairperson, and Divide into Three Groups to Complete the Three Modules: Problem Solving, Community Partnerships, and Organizational Transformation


The project director will set a time, date, and location for the meeting of the cross-agency team, provide direction, and establish a timeline for the completion of the survey. The following items should be on the agenda:


  • Review of the self-assessment process (the structure, how to complete the tool, and so forth)

  • Expectations for the cross-agency team

  • Selection of a cross-agency chairperson

  • Division of the cross-agency team into three groups, with each group focusing on one of the modules: community partnerships, problem solving, or organizational transformation.

  • Scheduling a follow-up meeting to discuss each group’s findings and complete the single tool for the cross-agency team.


Appendix 3 contains an example of how to review the self-assessment process with the cross-agency team. It provides details about the cross-agency team process and expectations for members.


As mentioned previously, a cross-agency team should have at least 12 members, with a minimum of three persons representing officers, supervisors, command staff, and community members. Agencies may choose to modify the cross-agency team to meet their own needs; for instance, some agencies may choose to have a larger or smaller cross-agency team, depending on the size of the organization. They may also choose to place more community members on the team or place civilian personnel or labor/union leaders on the team.


To complete the cross-agency form, the team should divide into three subgroups of equal size, each focusing on the module it chose: community partnerships, problem solving, or organizational transformation. At least one member from each stakeholder group should serve on each subgroup; in other words, one officer, one supervisor, one command staff member, and one community member would serve on the subgroup that completes the community partnerships module.


Cross-agency team members will serve as fact collectors and fact checkers when completing the questionnaire. Each group should work jointly to complete its module. Many of the agency personnel participating in the cross-agency team will be able to answer questions based on their knowledge and experiences in the department, but some items may require gathering information from others who know more about a topic area (e.g., the agency’s strategic planning process). Community members are important in the process because of their experiences with the agency and their ability to provide an outside perspective, even when they do not have substantive knowledge about an issue. For instance, a community member will not have detailed knowledge about labor relations in the agency but can provide his or her perspective based on the discussions heard within his or her group. It should take each subgroup approximately 2 to 3 hours to complete its assigned module.


One member of the cross-agency team, preferably a member of the agency, will be elected to serve as the team chairperson. He or she will be involved in a number of aspects of the project, including responsibility for the following items during the self-assessment process:


  • Chairing the meeting when the cross-agency team completes its form

  • Participating in the review team that examines the final report, obtains input from stakeholders, and makes recommendations for future action.

  • If the paper-and-pencil-based assessment is conducted:

    • Assisting the project director in developing plans for data entry, data analysis, and reporting

    • Working with the project director on completing or overseeing implementation of the analysis plan, development of the codebook, summing scales/indices and developing scores, running statistics, and examination of the results

    • Drafting a report on the results from the cross-agency team for the project director and making needed revisions


The cross-agency chairperson should have good leadership, supervision, and oversight capabilities. While the following abilities and qualities should characterize all members of the cross-agency team, they are particularly important for the chairperson: He or she should be familiar with the workings of the department, have a strong background in community policing, and be viewed as a credible leader among his or her colleagues. He or she should also be capable of serving as a mediator should a dispute emerge. It is likely that individuals with these skills will be senior sworn departmental officials.



Step 3B:

Project Director Develops a Detailed Plan for Administration, Data Entry and Analysis, and Reporting


When beginning the planning process, the project director should first consider whether the agency will distribute the assessment to all persons in the department or whether it will use a sampling methodology. This will affect all other planning decisions made in this section; therefore, this should be determined at the initial stages of the planning process.

If the agency has fewer than 200 line-level officers, all officers should complete the survey. When possible, it is preferable to have at least 100 cases to analyze, both because not every officer who receives a form will complete it and because a larger sample allows the data analysis team to examine differences in results between subgroups (e.g., comparing officers with less than 1 year of experience to those with more experience, if the demographics questionnaire detailed in footnote 8 is used as part of the tool).


For larger agencies, especially those with a few hundred or more officers, it may be advantageous to draw a sample of police personnel to complete the self-assessment tool rather than having every person in the agency complete the forms. Sampling is the process of selecting units (e.g., officers) from a population of interest so that, by studying the sample, one can fairly generalize the results to the entire agency from which the sample was chosen. By following the science of sampling, it is not necessary to include every person in the data-collection process. For details on sampling methodologies and how to use them, see Appendix 4.
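
For illustration only, the short Python sketch below shows one way an analyst might draw a simple random sample from a personnel roster. It is not part of the official CP-SAT, and Appendix 4 remains the authoritative guidance on sampling; the file names, the “officer_id” column, and the target sample size of 150 are hypothetical assumptions.

    # Illustrative sketch only (not part of CP-SAT): draw a simple random
    # sample of officers from a roster file. The file names, the
    # "officer_id" column, and the target of 150 officers are hypothetical.
    import csv
    import random

    with open("roster.csv", newline="") as f:
        officer_ids = [row["officer_id"] for row in csv.DictReader(f)]

    random.seed(42)  # fixed seed so the selection can be reproduced if questioned
    target = min(150, len(officer_ids))  # oversample so roughly 100 usable cases remain
    sample = random.sample(officer_ids, target)

    with open("sample.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["officer_id"])
        writer.writerows([oid] for oid in sample)

    print(f"Selected {target} of {len(officer_ids)} officers.")

Drawing the sample with a documented, reproducible procedure such as this makes it easier to explain the selection to personnel and labor representatives.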

Another important matter for agencies to consider at the beginning stage of the process is the protection of human subjects and confidentiality. Protecting the confidentiality of the data can encourage greater and more honest participation among agency personnel. The project director will have to work through questions such as the following:


  • Is completion of the survey voluntary?

  • Will there be any consequences for an individual who does not complete the form?

  • Who will have access to the data collected during the self-assessment process?

  • Will the data collected be confidential?

  • What safeguards, if any, are needed to ensure that procedures are followed?

  • For what purposes will the data be used?

  • Does the agency require institutional review board approval for this effort?



Administration


As the project director begins to make plans for the administration of the assessment, he or she should work with agency and labor/union representatives to determine answers to the questions listed in Figure 6. Engaging the union early in this process can be an effective way of obtaining buy-in for the self-assessment. It is also helpful to get the opinions and perspectives of persons at all levels in the organization to understand and anticipate questions or concerns they might have about the self-assessment.



Figure 6: Important Questions to Consider when Planning the Administration of the Assessment Tool


  • Are any major activities coming up in the organization that will make completion of the surveys difficult for some or all of the persons?

  • What is the timeline for distribution and collection of the surveys (e.g., 2 weeks)?

  • When will officers be asked to complete the survey? For example, they may be allotted 2 hours of patrol time to complete the survey, or they may complete the survey at the beginning of an in-service training. Specific instructions should be given to officers. It is also important to work with labor representatives to identify a time for completing the tool that all parties will support.

  • Are labor/union groups supportive of the self-assessment?

  • Will officers be allowed overtime for completing the survey?

  • Will the project director serve as point of contact for questions or concerns about the self-assessment?

  • Will the agency ask members of the cross-agency team to complete individual-level forms?

  • How does the agency define officers, supervisors, and command staff—both for determining who receives which form of the tool and for answering questions that refer to command staff or supervisors?

  • Who will have access to the data?

  • (If paper-and-pencil version is used) How and when will the forms be distributed? Examples include distributing paper copies at roll call, during in-service training, or with pay stubs. Alternatively, agencies could send participants electronic copies to download and fill out.

  • (If paper-and-pencil version is used) How will the agency distribute forms to persons not present when forms are initially distributed?

  • (If paper-and-pencil version is used) How will completed surveys be collected? Will they be anonymous or will respondents be tracked?



Data Entry and Analysis (Only if Paper-and-Pencil Version is Used)


If the online version of the CP-SAT is administered, there will be no need for data entry, and the survey software will export data directly to either SPSS or Excel. In addition, COPS provides an Excel macro that will generate an automated final report. If the paper-and-pencil version is used, or if the agency needs a more customized report, data entry and analysis must be carefully considered.
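
For agencies that want such a customized report, the short Python sketch below illustrates the general kind of summary an analyst might compute from exported data. It is only an illustration: the file name and the module column-name prefixes are hypothetical assumptions and would need to be matched to the agency’s actual export.

    # Illustrative sketch only: compute a mean score per CP-SAT module from
    # an exported CSV file. The file name and column-name prefixes are
    # hypothetical; match them to the actual export before use.
    import csv
    from statistics import mean

    MODULES = ("partnerships", "problemsolving", "orgtransform")

    with open("cpsat_export.csv", newline="") as f:
        respondents = list(csv.DictReader(f))

    for module in MODULES:
        module_scores = []
        for row in respondents:
            # Average this respondent's answered items for the module,
            # skipping blanks (unanswered questions).
            items = [float(v) for k, v in row.items()
                     if k.startswith(module) and v and v.strip()]
            if items:
                module_scores.append(mean(items))
        if module_scores:
            print(f"{module}: mean = {mean(module_scores):.2f} (n = {len(module_scores)})")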


Early in the planning process, the project director, in conjunction with the cross-agency team chairperson, should assess the data entry and analysis capabilities of in-house and external hardware, software, and personnel. Agencies that have limited in-house capabilities for data entry and analysis should seek outside assistance by forming a partnership with a community college, university, or research organization. Many of these organizations have advanced capabilities in data entry and analysis, including research assistants who can be tasked with this work and would benefit from the experience. In larger organizations, interns could prove helpful with these tasks.


Along with the initial planning for data entry, the project director and the cross-agency team chairperson should plan for the data analysis by examining the user’s guide analysis plan (see Appendix 6) and customizing it as needed. Once a decision has been made about how to complete data entry and analysis activities, either in-house or with outside assistance, the project director, in conjunction with the cross-agency chairperson, should make initial plans for the data entry process.



Reporting


The CEO, project director, and cross-agency chairperson should develop the timeline for reporting and discuss their expectations for the report.


Step 4: Conduct Orientation


The project director will orient staff to the project’s goals, procedures, and other concerns, using a short list of talking points. These talking points may include:


  • Why the agency is conducting the self-assessment

  • Support for the self-assessment from the chief and the project director and any other leaders within the organization

  • Benefits of completing the survey (e.g., to identify gaps in training)

  • How the forms will be collected

  • The timeline for completing the forms

  • How the agency defines officers, supervisors, and command staff

  • Anonymity and confidentiality concerns associated with the process, including whether forms are anonymous and who will have access to the data

  • Agency plans about how to share the results of the assessment

  • When agency personnel are expected to complete the form (e.g., free patrol time)

  • Level of support for the initiative from labor/union groups.


Appendix 5 is a sample orientation script. If the paper-and-pencil version is used, the project director may wish to consider meeting with those who will assist with the distribution to review expectations and activities in the self-assessment process, and plan responses to questions.


Persons distributing the tool should:


  • Understand and communicate the goals, objectives, and benefits of conducting the assessment (i.e., “sell” the value of the process to the respondents)

  • Be able to answer questions about the structure of the self-assessment form

  • Be able to answer questions about how long the form will take to complete (45 to 60 minutes)

  • Be able to answer questions or concerns about the collection of basic demographic information (if collected)

  • Know how officers, supervisors, and command staff are defined by the agency because some questions on the form ask about these groups.


All persons who are completing the tool should also receive an orientation from either the project director or his or her designee. It may be preferable to conduct the orientation in group sessions before distributing the forms or at the same time as the forms are distributed (if the paper-and-pencil version is used), although other orientation methods are possible, such as by e-mail or through the agency newsletter. Respondents should be instructed to complete the survey independently and be told how to contact the project director if they have questions about completing the form.



Checklist II: Implementing the Self-Assessment Tools


Cross-Agency Team

Step 1: Cross-Agency Team Completes the Assigned Modules


Following the meeting about the self-assessment process, the three subgroups focusing on community partnerships, problem solving, and organizational transformation should complete their chosen section of the assessment tool. Members of the cross-agency team will collect and check facts when completing the questionnaire. Although the vast majority of the questions can be answered as the group discusses the module, some questions may involve gathering information from others in the organization. It should take about 1 to 2 hours to complete the assigned module. After completing the tool, each subgroup should give copies of its answers to the rest of the team members to facilitate discussions at the next meeting.


Step 2: Cross-Agency Team Creates Consensus Scores


After each group has completed its assigned module, the entire cross-agency team will meet again so that each subgroup can report its answers to the rest of the team. The meeting should focus on areas of disagreement among the cross-agency team members as well as areas in which there is strong agreement. After discussion and debate, the team completes a single form (either online or on paper) that represents the consensus opinion. As items are discussed, the cross-agency chairperson will assist in bringing the team to a consensus about the answer to each question. One member of the team should take notes, recording where there is a general convergence of opinion, and where and why there are divergences of opinion. The notes will provide a snapshot of the discussions and a dissenting log that can be used later by the cross-agency chairperson when he or she develops the written report. While the notes and dissenting log are not built into the online function, attaching this information to the automated report will result in a much richer set of information for the CEO. This meeting should take 1 day.


Individual Officer, Supervisor, Command Staff, Civilian Staff, and Community Partner Surveys


Administration of the individual forms of the assessment varies greatly depending on whether the agency chooses to administer the assessment online or in the paper-and-pencil-based version. The steps to administer online are presented first below, and the paper-and-pencil-based steps follow.


The software used to administer the CP-SAT online is Vovici EFM Community, a comprehensive, web-based feedback system. EFM Community has an intuitive, web-based interface that allows users to create, manage, and report on survey data. Vovici is adaptable for large, complex surveys, allowing for many question types, advanced branching, SSL encryption, persistence, unlimited survey length, and an unlimited number of participants for each survey. Vovici EFM Community also allows users to e-mail survey links and reminders to participants and to track each participant’s survey status (e.g., clicked on link, completed). All data and software are hosted on Vovici’s servers and accessed through a web browser, eliminating potential problems that could arise from downloading and hosting software on individual computers. The CP-SAT survey can be saved as a survey template and accessed by interested agencies who are assigned a username and password within the CPAssessment workgroup. Vovici EFM Community users have real-time access to survey data and reporting through the user interface, or the data can be exported into a CSV or SPSS file for reporting in another software package. Each EFM Community account provides police departments with all the functionality described above for one survey for $895 per year. Customer service via phone, e-mail, and online (i.e., FAQ and an online manual) and automatic upgrades are included for all departments with a Vovici EFM Community account.


Administer Surveys Online


The online CP-SAT should be administered to six different participant groups: officers, supervisors, command staff, civilian staff, community partners, and a cross-agency team. There are two ways in which an agency can distribute the CP-SAT survey link to each of the participant groups: 1) distribution via agency listserv; or 2) distribution via the Vovici EFM Community software. If the agency has a separate email list for officers, supervisors, command staff, and civilian staff, it may be easiest to distribute the survey link via the agency email lists (option 1). If the agency does not have such email lists or if the agency wants to track who has and has not completed the survey, it is best to distribute the survey link via the Vovici EFM Community software (option 2). Below are the specific steps to administering the survey for each of these two options.


Option 1: Distributing the CP-SAT survey link via agency email lists


Obtaining access to CP-SAT Survey

  1. Contact Rob Chapman at the U.S. Department of Justice, COPS Office ([email protected]; 202-514-8278) for the procedures on how to obtain a Vovici EFM Community CPAssessment username and password.

  2. Once you obtain a Vovici EFM Community CPAssessment username and password, click on the following link to access the CP-SAT website: http://efm.cpassessment.com

  3. Log on with your Vovici EFM Community CPAssessment username and password.


Loading the CP-SAT Survey into Your Agency’s Account

  4. From the “Surveys” main page (i.e., “Surveys” tab at top), click “Create New Survey” in the top left corner.

  5. Select “Use an existing template from the library” and click “Next”.

  6. Select “CP-SAT” and click “Next”.

  7. Select “Open Participation” and click “Next”.

  8. Input a name for your survey (e.g., Springfield Police CP-SAT) and click “Finish”.


Loading the CP-SAT Formatting into your CP-SAT Survey

  9. From the “Surveys” main page, click “Design Questionnaire.” The questionnaire designer will open in a new page.

  10. On the right side panel, click on the middle “Formatting” tab.

  11. Under “Theme”, choose the “ICF COPS (2V2) (2)” theme and click the “Refresh” button at the top of that right panel.

  12. Close the Questionnaire Designer window.


Loading your CP-SAT Survey onto a Webpage

  13. On the “Surveys” main page, click the “Publish” button (found at the top right, to the left of the “Share” button).

  14. Your CP-SAT survey link is found at the top of the “Information” box. You can click on this link to test the survey, though the survey is not ready for distribution to agency staff and partners until it is activated.

  15. Click the “Activate” button and click “Yes” when asked if you want to continue. Please note that if you tested the survey in step 14, all data collected during that phase will be deleted when you activate the survey.

  16. Your survey is now ready to collect data from agency staff and partners.


Distributing Survey Link to Agency Staff and Partners

  17. The survey link and passwords will need to be provided to agency staff and partners via email. The survey link can be found at the top of the “Information” box on the Vovici EFM Community CPAssessment “Surveys” main page. Passwords identify the type of participant as follows:


Participant Type          CP-SAT Password

Officers                  officer
Supervisors               supervisor
Command Staff             command
Civilian Staff            civilian
Community Partners        partner
Cross-Agency Team         agency


  18. Separate emails should be sent to each participant type, providing only the relevant password (i.e., do not send one email to all agency staff listing the passwords for every participant type). For example, all command staff should receive one email that provides only their password (i.e., “command”) and does not list any other participant type passwords. This ensures that each agency staff member or partner is clear about which password applies to him or her. The password provides entry into the appropriate set of CP-SAT questions for each participant type, and the data cannot be reallocated later if the wrong participant password is entered.

  19. Exhibit XX provides sample language for the survey distribution email.


Dear Officers,

[Insert a brief message explaining why the agency is conducting the CP-SAT, how long the survey takes, the deadline for completion, and whom to contact with questions.]



Survey Link:

Password:


Thanks,

Joe Smith

Chief of Police

Springfield Police Agency



Option 2: Distributing the CP-SAT survey link via Vovici EFM Community software


Obtaining access to CP-SAT Survey

  1. Contact Rob Chapman at the U.S. Department of Justice, COPS Office ([email protected]; 202-514-8278) for the procedures on how to obtain a Vovici EFM Community CPAssessment username and password.

  2. Once you obtain a Vovici EFM Community CPAssessment username and password, click on the following link to access the CP-SAT website: http://efm.cpassessment.com

  3. Log on with your Vovici EFM Community CPAssessment username and password.


Loading the CP-SAT Survey into Your Agency’s Account

  4. From the “Surveys” main page (i.e., “Surveys” tab at top), click “Create New Survey” in the top left corner.

  5. Select “Use an existing template from the library” and click “Next”.

  6. Select “CP-SAT” and click “Next”.

  7. Select “Open Participation” and click “Next”.

  8. Input a name for your survey (e.g., Springfield Police CP-SAT) and click “Finish”.


Loading the CP-SAT Formatting into your CP-SAT Survey

  9. From the “Surveys” main page, click “Design Questionnaire.” The questionnaire designer will open in a new page.

  10. On the right side panel, click on the middle “Formatting” tab.

  11. Under “Theme”, choose the “ICF COPS (2V2) (2)” theme and click the “Refresh” button at the top of that right panel.

  12. Close the Questionnaire Designer window.


Loading your CP-SAT Survey onto a Webpage

  13. On the “Surveys” main page, click the “Publish” button (found at the top right, to the left of the “Share” button).

  14. Your CP-SAT survey link is found at the top of the “Information” box. You can click on this link to test the survey, though the survey is not ready for distribution to agency staff and partners until it is activated.

  15. Click the “Activate” button and click “Yes” when asked if you want to continue. Please note that if you tested the survey in step 14, all data collected during that phase will be deleted when you activate the survey.

  16. Your survey is now ready to collect data from agency staff and partners.


Uploading Participant List into Vovici EFM Community Software

  17. A list of survey participants needs to be uploaded into the Vovici software from a .csv document. To do this, you must first create a Microsoft Excel document that contains four columns of information for each participant: First Name, Last Name, Email, and Password. An example of how the Microsoft Excel file should be set up is provided in the following exhibit; a scripted alternative for producing the same file is sketched after the exhibit. Please note that headings (e.g., “First Name”) should be included in the Microsoft Excel document, but cells do not need to be shaded or bordered.


First Name     Last Name     Email                  Password

Joe            Smith         [email protected]        officer
Jane           Anderson      [email protected]        command
Frank          Thomas        [email protected]        partner
Suzy           Que           [email protected]        officer
Matthew        Davis         [email protected]        civilian
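
If the participant list already exists electronically, the .csv file can also be generated with a short script rather than by hand in Excel. The following is a minimal Python sketch; Python itself, the output file location, and the example names and e-mail addresses are illustrative assumptions, not part of the CP-SAT software:

    import csv

    # Illustrative participant records; the e-mail addresses are placeholders.
    # The Password column assigns each person to a participant type.
    participants = [
        ("Joe", "Smith", "jsmith@example.org", "officer"),
        ("Jane", "Anderson", "janderson@example.org", "command"),
        ("Frank", "Thomas", "fthomas@example.org", "partner"),
    ]

    # Write the four required columns, headings first, as comma-delimited text.
    with open("CPSATParticipants.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["First Name", "Last Name", "Email", "Password"])
        writer.writerows(participants)

A file produced this way is already comma delimited, so the “Save As” conversion steps below would be unnecessary.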


  18. Passwords identify the type of participant as presented in the exhibit below. All participants of the same agency staff or partner type (e.g., supervisors) will have the same password (e.g., “supervisor”).


Participant Type          CP-SAT Password

Officers                  officer
Supervisors               supervisor
Command Staff             command
Civilian Staff            civilian
Community Partners        partner
Cross-Agency Team         agency



  19. You will need to save the Microsoft Excel file as a .csv document in order to successfully upload the participant list into the Vovici EFM Community software. To save this file as a .csv file, you must first delete the unused sheets from the workbook (e.g., Sheet 2 and Sheet 3). To delete sheets from a Microsoft Excel workbook, right-click on the unused tabs (e.g., Sheet 2) and select “Delete”.

  20. In the Microsoft Excel document “File” menu, select “Save As”.

  21. In the “Save As” box, choose the “CPSAT” folder previously created as the location to which the .csv file should be saved. Type “CPSATParticipants” in the “File Name” text box to designate CPSATParticipants as the name of the .csv participant file. Lastly, in the “Save as type” drop-down menu, select “CSV (comma delimited) (*.csv)” and click “Save”.

  22. Select “Yes” if you receive a Microsoft Excel prompt that notifies you that “CPSATParticipants.csv may contain features that are not compatible with CSV (comma delimited).”

  23. Close the CPSATParticipants.csv file.

  24. From the Vovici EFM Community CPAssessment “Surveys” main page, click “Select Participants”.

  25. Click “Import Participants…” from the task bar. This option is designated by a green cross to the left of the text in the task bar.

  26. Click “Browse…” to select the previously saved .csv participant file.

  27. Navigate through your folders to the CPSAT folder, select the CPSATParticipants.csv file, and click “Open.”

  28. Click “Next” in the “Import Respondents” window.

  29. The “Import Respondents” window will show two columns: the Respondent Fields column with multiple options (e.g., E-mail, Culture, Key 1) and the Import Field column with multiple drop-down boxes. Each drop-down box should list “Email”, “First_Name”, “Last_Name”, and “Password” as options because these are the headings in your participant list file.

  30. Next to the “E-mail” respondent field, select “Email” from the drop-down box (i.e., the first drop-down box under the “Import Field” column).

  31. Next to the “Key 1” respondent field, select “Password” from the drop-down box (i.e., the third drop-down box under the “Import Field” column).

  32. Click “Next” and then “Import.”

  33. Once the task has completed, the number of records (i.e., the number of participants uploaded) will be displayed. Click “Close” at the bottom of the “Import Respondents” window. Please note that if the number of records imported does not match the number of participants you were trying to upload, repeat steps 17 through 32, following each direction very carefully.

  34. Close the “Participant Selector” window.

Distributing CP-SAT Survey Link via Vovici EFM Community Software

  35. From the “Surveys” main page, click “Write Invitations” to develop the email to participants that will distribute the survey link and passwords.

  36. Exhibit XX provides sample language for the survey distribution email.


Dear Officers,

[Insert a brief message explaining why the agency is conducting the CP-SAT, how long the survey takes, the deadline for completion, and whom to contact with questions.]



Survey Link:

Password:


Thanks,

Joe Smith

Chief of Police

Springfield Police Agency


Sending Reminders



TIP: Ways to Increase Response Rates


  • Have the chief stress the importance of the self-assessment process and of participation in completing the forms.

  • Gain support and “buy in” from each group—officers, supervisors, and command staff—as well as police union leadership or other organized labor bodies. Support from these individuals and groups should be sought at the outset of the project.

  • Training and orientation for agency personnel who will be asked to complete the assessment tool form will enhance the overall response rate and reduce the problem of incomplete forms being submitted.

  • Respondents should be notified in advance that the assessment tool form will be distributed and that they will be given adequate time to complete it.

  • Respondents should be assured that their responses will be handled in a confidential manner.

  • Respondents should be reminded to complete the assessment tool forms.

  • Offer to provide the respondents with a summary report of the results of the self-assessment process.

  • Convey how the results will be used to make positive changes, as well as to promote the work being done by the agency and its staff.





Administer the Paper-Based Survey



Step 1: Distribution of the Tools


As discussed previously, all (or a sample of) individual-level assessment tool forms should be distributed to officers, supervisors, command staff, civilian staff, and community partners. (The agency may exempt members of the cross-agency team from completing the individual-level form.) If the agency plans to track responses, an identification number should be assigned to each survey before the forms are distributed. If the forms are not distributed in conjunction with orientation, but at a later time, the following items should be stressed at distribution:


  • Why the agency has chosen to conduct a self-assessment

  • The importance of the self-assessment and the chief’s or sheriff’s support of the process

  • The timeline for completion

  • When and how the forms are to be completed (e.g., 2 hours of patrol time allotted for completion of the forms)

  • How long it takes to complete the forms (45 to 60 minutes)

  • How and where completed forms should be returned (e.g., placed in a sealed envelope and then in a box for collection)

  • Whom to contact with questions or concerns

  • Definition of key terms (e.g., command staff)

  • Anonymity and confidentiality of the forms and process (if applicable)


During the distribution process and while forms are being completed, the project director should be available to answer questions about the assessment tool process or the forms.


Step 2: Personnel Complete and Return the Tools


Completed forms should be returned to the designated location, following the instructions given to respondents. The project director will be responsible for ensuring the logistics and setup.


After 2 weeks of collecting data, the project director should send out reminders. Contacting nonrespondents is critical: prior research has shown that the information nonrespondents would have provided can be very different from that provided by respondents, so the data will not be representative of the intended population unless follow-up efforts succeed in drawing nonrespondents into the data-collection process. Reminders could be in the form of an e-mail, letter, flyer, roll call announcement, or some other method. Agencies using a tracking system can target their efforts on identified nonrespondents, while agencies that do not use a tracking system will send reminders to all persons. Reminders should be sent throughout the data-collection process; for example, if the data collection lasts 4 weeks, the project director may want to send reminders at the end of weeks 1, 2, and 3. The project director should also keep extra copies of the form on hand for anyone who needs another copy.



Checklist III: Data Analysis and Interpretation

Analyzing and Reporting Online Data


COPS offers an automated report in Microsoft Excel, created to help agencies combine and display their data in an easily readable format. This automated report imports the raw data, conducts descriptive analyses, and presents the data in a summary report. The steps to generate the automated report are below.


Step 1: Preparing Report

  1. Create a folder named “CP-SAT” on your computer that contains the “CPSAT_Report_Template” and the “ImportData.xls” files.


Step 2: Exporting Raw Survey Data


  1. Click on the following link to access the CP-SAT website: http://efm.cpassessment.com

  2. Log on with your username and password.

  3. In the “Surveys” main page, click “Manage Responses”.

  4. From the “Actions” menu, select “Export”, then “CSV (Comma delimited)”.

  5. In the “Export Responses” page, select “All” columns, “No Filter”, and “Raw Data”. Make sure to also check the box next to “Export Report Values” and click “Next”.

  6. Click on the “.csv” link provided and save the raw data file as “CPSAT_Raw_Data.csv” in the “CP-SAT” folder on your computer.


Step 3: Running the Report

  1. Open the “ImportData.xls” file.

  2. If your security setting is set to “Very High” or “High”, you will need to change your macro security to “Medium” so the macro can run. Follow these steps to check and/or change your macro security:

    1. From the “Tools” menu, select “Macro”, then “Security”.

    2. In the “Security Level” tab, select “Medium” and click “OK”.

  3. Click on the “Upload Data” button.

  4. Type in “CPSAT_Raw_Data.csv” when prompted to type the name of your text file and click “OK”.

  5. The report should automatically save in the “CP-SAT” folder on your computer, marked with the date and time the report was run.


Step 4: Cleaning the Report



Other


Although the raw data are not immediately viewable in the report, they are included in hidden worksheets within the report. If an agency wants to analyze the data beyond what is displayed in the automated report, it can access the raw data by unhiding these worksheets. Follow the steps below to unhide the raw data worksheets:


  1. From the “Format” menu, select “Sheet”, then “Unhide”.

  2. Choose “Sheet1”.

  3. Click “OK”.

  4. Repeat steps 1 to 3 for Sheet2 through Sheet10.


Analyzing and Reporting Paper-and-Pencil-Based Data


The cross-agency chairperson and the project director will manage the data entry and analysis process. In many agencies, a data analyst will complete the actual data entry and analysis activities. Analysis of the cross-agency form is fairly straightforward because there is only one completed form. The scores for each scale or index across all three modules (partnerships, problem solving, and organizational transformation) should be calculated using the scoring sheet provided in Appendix 6. The analysis of the individual assessment forms completed by each officer, supervisor, commander, civilian staff member, and community partner would involve traditional quantitative analysis processes. Most of what is discussed in this section pertains to the analysis of the individual assessment forms.


In some ways the data analysis is the least time-consuming task of the self-assessment process—compared to the investment in staff time to collect the data—but it is a critical step. It is during the analysis stage that the results of the data collection begin to be realized. This stage requires organization and focus to make sure all data are represented correctly. To stay on schedule and complete all the tasks involved, an analysis plan should be followed. The analysis plan should link the assessment objectives, or questions to be answered, with the data collected; it should also spell out the analyses that will be conducted once the data are available. The following are typical steps in an analysis plan.


Step 1: Track and Get to Know the Data


The raw data in the completed assessment forms will be entered into a database to allow for statistical analysis. Before data entry takes place, the data analyst who will complete the data entry and analysis should review some of the forms to familiarize himself or herself with the categories on the form.


A database for logging incoming data is a critical component of good research record-keeping. Some agencies will also set up a procedure for logging completed forms and keeping track of them until they are ready to do a comprehensive data analysis. Researchers differ in how they prefer to keep track of incoming data. In most cases, researchers develop a database that enables them to assess at any time what data are in the database and what data are still outstanding. Agencies can do this with any standard computerized database program, such as Microsoft Access, although this requires familiarity with such programs. It can also be accomplished using standard statistical programs such as SPSS or SAS, and running simple descriptive analyses to develop reports on data status. It is also critical that the data analyst retain the original completed forms until at least a project report is written. The data analyst should always be able to trace a result from a data analysis back to the original forms on which the data were collected.
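
Agencies without a database program can accomplish the same tracking with a short script. The following is a minimal Python sketch of a status report, assuming forms were assigned identification numbers and that two hypothetical files, distributed.csv and received.csv, each list form IDs in a form_id column:

    import csv

    def read_ids(path):
        # Read the form_id column of a tracking file into a set of IDs.
        with open(path, newline="") as f:
            return {row["form_id"] for row in csv.DictReader(f)}

    distributed = read_ids("distributed.csv")   # IDs of all forms handed out
    received = read_ids("received.csv")         # IDs of forms returned so far

    print(len(received), "of", len(distributed), "forms returned")
    print("Outstanding form IDs:", sorted(distributed - received))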


Step 2: Enter Data into the Database


Once the raw data have been reviewed, the data will be entered into a database, traditionally using an individual personal computer. Many software options are available for data entry, from a simple spreadsheet to a sophisticated program using powerful off-the-shelf software. For example, SPSS (see http://www.spss.com/) and SAS (see http://sas.com/) have data entry modules. The provided CD-ROM has a data entry shell for use with SPSS software and includes variable and value labels. For other options, http://statpages.org/ links to more than 380 sites that can perform online analysis.


The data analyst should also double-check the data entry. One more sophisticated way to assure a high level of data accuracy is to use a procedure called double entry. In this procedure, the analyst enters the data once and then uses a special program to enter the data a second time; the program checks each second entry against the first. If there is a discrepancy, the program notifies the user and allows the user to determine the correct entry. This double-entry procedure significantly reduces entry errors; however, double-entry programs are not widely available and require some training. An alternative is to enter the data once and set up a procedure for checking the data for accuracy. For instance, the data analyst might spot-check records randomly for data entry errors, correct the errors, and conduct a second round of spot-checking (e.g., checking every 10th case). This should continue until a round of spot-checking uncovers no errors.
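
For agencies that want to approximate the double-entry check without special software, the comparison can be scripted. The following is a minimal Python sketch, assuming both entry passes were saved as identically laid-out CSV files (entry1.csv and entry2.csv are hypothetical names):

    import csv

    # Load both entry passes; each row is one completed form.
    with open("entry1.csv", newline="") as f1, open("entry2.csv", newline="") as f2:
        first_pass = list(csv.reader(f1))
        second_pass = list(csv.reader(f2))

    # Flag every cell where the two passes disagree so the analyst can pull
    # the original paper form and keep the correct value.
    for r, (row1, row2) in enumerate(zip(first_pass, second_pass), start=1):
        for c, (a, b) in enumerate(zip(row1, row2), start=1):
            if a != b:
                print(f"Row {r}, column {c}: first entry {a!r} vs. second entry {b!r}")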


Step 3: Data Cleaning


Once the data have been entered, the data analyst will use various programs to summarize the data and check that all values are within acceptable limits and boundaries. Next, the analyst will verify that the data values are correct and conform to a set of rules. Such summaries, for instance, will enable the analyst to easily spot whether there are persons whose age is 601 or who have a 7 entered where a 1-to-5 response is expected. Data cleaning deals with detecting and removing errors and inconsistencies to improve the quality of the data. Errors can be detected using descriptive statistics. For example, the data analyst can look at minimum and maximum values to catch invalid values (e.g., a value of 2 for variables with 0 and 1 response categories) and variables exceeding an expected range. Data cleaning can be accomplished with virtually any statistical software, such as Excel, SPSS, or SAS. For all errors found in the electronic data set (such as data outside the correct range for a variable), the researchers should go back to the original surveys to find the correct data.
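
As an illustration of such range checks, the following is a minimal Python sketch using the pandas library (an assumption; SPSS, SAS, or Excel can do the same). The file name, variable names, and valid ranges are hypothetical, not the actual CP-SAT codebook:

    import pandas as pd

    data = pd.read_csv("CPSAT_entered_data.csv")  # hypothetical entered-data file

    # Hypothetical valid ranges: two 1-to-5 items and years of service.
    valid_ranges = {"q1": (1, 5), "q2": (1, 5), "years_service": (0, 50)}

    for var, (low, high) in valid_ranges.items():
        # Find cases outside the acceptable limits for this variable.
        bad = data[(data[var] < low) | (data[var] > high)]
        if not bad.empty:
            print(var, ":", len(bad), "value(s) outside", low, "to", high)
            print(bad[[var]])  # offending cases, traceable back to the forms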


Step 4: Data Transformations


Once the data have been entered and cleaned, the data analyst will transform the raw data into variables that are usable in the analyses. A number of transformations will need to be performed. The two main transformations—missing values and scale/index summation—are reviewed below. More detail on exact transformations is located in Appendix 6.


Missing values


Not everyone who completes a CP-SAT form will complete all questions, so some values will be missing. Many statistical analysis programs automatically treat blank values as missing. These blanks will need to be assigned designated values to represent missing data; for instance, in SPSS, a value of -99 could indicate that the item is missing. Check the specific program used to determine how to handle missing values (e.g., exclude a data point if a participant had any missing data in that scale/index; multiple imputation). Improper handling of missing values will distort the analysis because, until proven otherwise, the researcher must assume that missing cases differ in analytically important ways from cases where values are present; that is, the problem with missing values is not so much reduced sample size as it is the possibility that the remaining data set is biased. A number of statistical software programs have modules that address missing values and include several imputation algorithms for more complex analyses.
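
For illustration, the following is a minimal Python sketch of both approaches using pandas (an assumption); the file and item names are hypothetical, and the -99 code follows the SPSS convention mentioned above:

    import pandas as pd

    data = pd.read_csv("CPSAT_entered_data.csv")  # hypothetical entered-data file

    # Report how much data are missing per item before choosing a strategy.
    print(data.isna().sum())

    # Option 1: write -99 into blanks, then declare -99 as the missing-value
    # code in the statistics package.
    coded = data.fillna(-99)

    # Option 2: leave blanks as missing and drop cases listwise for one scale.
    scale_items = ["q1", "q2", "q3"]  # hypothetical item names
    complete = data.dropna(subset=scale_items)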


Scale/Index summation


Once the data analyst has handled missing data issues, he or she will want to add up the individual items within each scale/index to get a total score for that scale/index. The CP-SAT was designed with a variety of scales/indices. Scaling is the assignment of objects to numbers according to a rule. With the CP-SAT, the objects are text statements covering a variety of activities, attitudes, and beliefs. These items were put into scales/indices because the underlying concepts the analyst hopes to measure are complex and cannot be captured by single-item questions. The scales/indices are designed to arrive at a single summed number for each scale/index. Appendix 6 is a guide for summing each CP-SAT scale/index.
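
The summation itself is mechanical once the items in each scale/index are known. The following is a minimal Python sketch using pandas (an assumption); the file name and item names are hypothetical, and Appendix 6 lists the actual CP-SAT items:

    import pandas as pd

    data = pd.read_csv("CPSAT_clean_data.csv")  # hypothetical cleaned-data file

    # Hypothetical items making up one scale; see Appendix 6 for the real lists.
    partnership_items = ["p1", "p2", "p3", "p4"]

    # Sum the items for each respondent. skipna=False leaves the total missing
    # for anyone who skipped an item, rather than understating the score.
    data["partnership_scale"] = data[partnership_items].sum(axis=1, skipna=False)

    print(data["partnership_scale"].describe())  # quick check of the new scale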


Step 5: Quantitative Analysis


Data analysis is the act of transforming data to extract useful information and facilitate conclusions. The analysis of the individual-level assessment form data (i.e., all the data from the officer, supervisor, and command versions) from the CP-SAT will require prior experience in using descriptive and exploratory statistics. Law enforcement agencies that do not have the expertise or resources for data analysis should consider hiring a consultant, or partnering with a local researcher or university for guidance. Once the data have been cleaned, analyze the data following procedures that reduce and categorize the information to make it easier to manipulate, understand, and report. Tips for analyzing CP-SAT data include the following:


Calculate descriptive statistics to summarize data in a clear and understandable way. Descriptive statistics are used to describe the basic features of the data in a study. They provide simple summaries about the sample and the measures. There are two basic methods of descriptive statistics: numerical and graphical (each method is used in the example report in Appendix 7).

Using the numerical approach, one might tabulate the information by counting the frequency of its appearance; for example, to know how many officers have at least one partner from the business community, simply count the number of “yes” responses to the survey question on number of business partners. Each descriptive statistic reduces a lot of data into a simpler summary. If ratings and rankings are of interest, compute the mean (average), median, or mode. To find the mean, or average, divide the sum of all responses to a particular question by the number of responses to that question. To find the median, arrange the responses in a list; the middle number is the median (if there are two middle numbers, average them). To find the mode, look for the number or value that occurs most often. Last, compute a standard deviation, which conveys information about the degree to which officers differ on a particular response.

Using the graphical approach, one might create a bar chart or a box plot containing detailed information about the distribution of scores.
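
For illustration, the following is a minimal Python sketch of the numerical summaries just described, using only the standard library; the response values are invented:

    import statistics

    # Invented ratings on a 1-to-5 item.
    responses = [4, 5, 3, 4, 2, 4, 5, 3, 4, 4]

    print("mean:", statistics.mean(responses))
    print("median:", statistics.median(responses))
    print("mode:", statistics.mode(responses))
    print("standard deviation:", statistics.stdev(responses))

    # Frequency count for a yes/no item coded 1 = yes, 0 = no, e.g., officers
    # reporting at least one partner from the business community.
    yes_no = [1, 0, 1, 1, 0, 1]
    print("count of yes responses:", sum(yes_no))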


Descriptive statistics typically are distinguished from inferential statistics. Descriptive statistics simply describe what is or what the data show. Inferential statistics are used to try to reach conclusions that extend beyond the immediate data alone (e.g., whether differences between groups are significant).5 Larger departments (more than 100 officers), in particular, might want to consider these types of analyses, too.6


Step 6: Develop a Codebook


When working with data, it is useful to generate a printed codebook that describes the data and indicates where and how they can be accessed. Minimally, the codebook should include the following items for each variable:


  • Variable name

  • Variable description

  • Variable type (alpha or numeric)

  • Value names and labels

  • Variable format (number, date, text)

  • Date collected

  • Data structure—rectangular or hierarchical, for example

  • Variable location (in the raw database)

  • Information about other key variables such as unique case identifiers, weights, and the like, that are necessary for using the data correctly.


The codebook is an indispensable tool for the analysis team. Together with the database, it should provide comprehensive documentation that enables other researchers, whom the agency might subsequently ask to analyze the data, to do so without any additional information. The codebook also serves as a manual describing data collection and is an excellent way to document what was done by those conducting the analysis. That way, if someone wanted to know how data collection was carried out a year or so later, the agency would have a comprehensive record of the activities. The codebook should also include a copy of the CP-SAT survey forms and a description of how the data were collected, including the sampling design.


Step 7: Reporting


Using the results of the data analysis and the notes from the cross-agency team meeting, the cross-agency chairperson will draft a report about the findings across all three modules—community partnerships, problem solving, and organizational transformation—highlighting areas where there is a convergence of opinions as well as a divergence of opinion. He or she submits the draft report to the project director.


Using the results of the data analysis, the project director will draft a report about the findings—community partnerships, problem solving, and organizational transformation—from each individual-level form: officer, supervisor, command staff, civilian staff, and community partners. The project director also reviews the cross-agency team report and integrates it with the findings from the individual-level forms. Finally, the project director submits the final report to the CEO. See Appendix 7 for a sample report.



Checklist IV: Strategic and Action Planning for the Future



Step 1: Project Director Assembles a Review Team to Examine Findings and What They Mean for the Agency


While the data are being analyzed the project director will assemble a review team to examine the findings from the self-assessment and what they mean for the agency. The project director and the cross-agency chairperson should be members of this review team because they have been involved throughout the process. The review team is tasked with the following activities:


  • Examine the final report about the self-assessment findings that is submitted to the CEO

  • Obtain input on the results from stakeholders

  • Make recommendations for future action; the project director will present the recommendations to the CEO.


The review team should consist of about 10 persons, including the project director and cross-agency chairperson. The project director may want to include others on the team, such as the following:


  • Other members of the cross-agency team

  • Members of the agency’s research and planning unit

  • Representatives from all levels of the organization: officers, supervisors, command staff, and civilian staff who were not part of the cross-agency team

  • Representatives from labor/union groups

  • Representatives from the community or partner organizations who were not part of the cross-agency team

  • Members of other local law enforcement agencies

  • Criminal justice researchers from local universities.


The members of the review team should possess the same characteristics as cross-agency team members. They should be critical thinkers, but more important, they should be innovative thinkers who can help develop proposed action items for the future.


TIP: Members of the review team should be:

  • Critical, creative, innovative thinkers

  • Committed to advancing community policing

  • Knowledgeable about the agency

  • Well-respected by colleagues inside and outside the organization

  • Willing to speak up and have their ideas heard.





Step 2: Review Team Examines the Report


The review team will review the report with the project director and discuss ways to obtain comments from stakeholders and make recommendations for future action. The review team also will identify the agency’s strengths and gaps in community policing implementation, paying particular attention to ways to add value to the existing strengths and address the gaps. As review team members examine the report, they may wish to consult the resources listed by topic in Appendix 8. The references can prove useful in helping the team formulate recommendations.


Step 3: Review Team Obtains Comments from Stakeholders

The review team should talk with stakeholders both within and outside of the agency to obtain their comments on the results of the self-assessment and recommendations for future action. This may take the form of informal conversations, meetings, conference calls, or any other method deemed appropriate by the agency.


Step 4: Review Team Makes Recommendations for Future Action


After gathering comments from stakeholders, the review team will develop recommendations for future action within the agency. The recommendations could include additional in-service training on problem-solving techniques, ways to build stronger partnerships, and changes to performance evaluations.


Step 5: Project Director Compiles Recommendations from the Review Team to Present to the CEO


The project director will compile the review team’s recommendations into another report to present to the CEO.


Step 6: CEO (Chief) Decides Actions to Pursue, Including Disseminating the Results


The CEO reviews the recommendations and chooses which ones to pursue. He or she may want to suggest that the staff who will implement the recommendations review the resources listed in Appendix 8. The CEO will also make the final decisions about which results to disseminate and to whom. The TIP boxes below list the information that could be disseminated, who should receive the information, and ways to disseminate the information.



TIP: Items to Consider Disseminating


  • Full report about the self-assessment findings

  • Executive summary of the report

  • Recommendations from the review team

  • Chief’s decisions about future steps.



TIP: To whom should the agency consider disseminating its findings?


  • All agency personnel

  • Members of the community who participated in the cross-agency team

  • The review team

  • The media

  • Labor/union representatives

  • Local political officials and leaders

  • Community partners

  • The public.




TIP: Ways to disseminate the findings


  • Press releases

  • Press conferences

  • Post information on the agency’s web site

  • Agency newsletters

  • Distribute a report of the findings

  • Internal agency memoranda

  • E-mail

  • Op-eds

  • City council meetings

  • Presentations to local elected leaders

  • Presentations at town hall meetings or community meetings.




Resources


The Internet references cited in this publication were valid as of March 2008. Given that URLs and web sites are in constant flux, neither the authors nor the COPS Office can vouch for their current validity.


Baker, T.E. Effective Police Leadership: Moving Beyond Management. Flushing, New York: Looseleaf Law Publications, Inc., 2000.

Eck, J.E. and W. Spelman. Problem-Solving: Problem-Oriented Policing in Newport News. Washington, D.C.: Police Executive Research Forum, 1987.

Ford, J.K. Organizational Survey: An Overview. Michigan State University, School of Criminal Justice, 2004. http://www.cj.msu.edu/~outreach/cp/survey.html

Goldstein, H. Problem-Oriented Policing. New York: McGraw-Hill, 1990.

Goldstein, H. “Improving Policing: A Problem-Oriented Approach,” Crime and Delinquency 25 (1979): 236–258.

Greene, J.R. “Community Policing in America: Changing the Nature, Structure, and Function of the Police.” Policies, Processes, and Decisions of the Criminal Justice System 3 (2000): 299–370.

Grinc, R.M. “ ‘Angels in Marble’: Problems in Stimulating Community Involvement in Community Policing,” Crime & Delinquency 40 (1994)(3): 437–468.

Haberfeld, M.R. Theories of Police Leadership. Upper Saddle River, New Jersey: Pearson Prentice Hall, 2006.

Maguire, E.R. and S.D. Mastrofski. “Patterns of Community Policing in the United States,” Police Quarterly 3 (2000): 4–45.

Michigan Regional Community Policing Institute. Community Policing: A Road Map for Change. Michigan State University, School of Criminal Justice, no date. http://www.cj.msu.edu/~outreach/rcpi/roadmap.pdf

RAND Corporation. A Measurement Model for Estimating Community Policing Implementation. Santa Monica, California, 2000.

Trojanowicz, R.C. Community Policing Guidelines for Police Chiefs. 1994. http://www.policenet.com/compguide.html

Trojanowicz, R. and B. Bucqueroux. Community Policing: How to Get Started. Cincinnati: Anderson Publishing Co., 1994.

Trojanowicz, R. and B. Bucqueroux. “Toward Development of Meaningful and Effective Performance Evaluations.” East Lansing, Michigan: Michigan State University, 1992. http://www.cj.msu.edu/~people/cp/toward.html

Trojanowicz, R.C., V.E. Kappeler, L.K. Gaines, and B. Bucqueroux. Community Policing: A Contemporary Perspective, 2nd ed. Cincinnati: Anderson Publishing Co., 1998.

Western Regional Institute for Community Oriented Public Safety. Onsite Assessment Process. Washington, D.C.: U.S. Department of Justice Office of Community Oriented Policing Services, no date.

Ziembo-Vogl, J. and D. Woods. “Defining Community Policing: Practice versus Paradigm,” Police Studies: International Review of Police Development 19 (1996)(3): 33–50.

Zhao, J. Why Police Organizations Change: A Study of Community-Oriented Policing. Washington, D.C.: Police Executive Research Forum, 1996.

Zhao, J., Q.C. Thurman, and N.P. Lovrich. “Community-Oriented Policing across the U.S.: Facilitators and Impediments to Implementation,” American Journal of Police 1 (1995): 11–28.





Appendix 1:

Checklist for the Cross-Agency Team and Cross-Agency Chairperson


CHECKLIST 1: Planning for the Self-Assessment Process


  • Cross-agency team members review the tool and tasks and select a chairperson. Divide participants into groups, with each group to complete one of the three modules: community partnerships, problem solving, or organizational transformation.

  • Cross-agency chairperson assists the assessment tool coordinator in developing plans for the following:

    • Data entry

    • Data analysis

      • Assess in-house and external capabilities to conduct analysis

      • Examine user’s guide analysis plan and customize, as needed

    • Reporting.



CHECKLIST II: Implementing the Self-Assessment Tools


  • Each cross-agency team subgroup completes its assigned module.

  • The cross-agency team reviews the completed modules and completes a single form after reaching a consensus on the items.



CHECKLIST III: Data Analysis and Interpretation


  • The cross-agency chairperson oversees the activities of the data analyst. These activities include the following:

    • Tracking and getting to know the data

    • Data entry

    • Data cleaning

    • Data transformations

    • Quantitative analysis

    • Developing the codebook.



CHECKLIST IV: Reporting


  • Cross-agency chairperson drafts report about cross-agency findings.

  • Project director reviews reports and provides comments.

  • Project director synthesizes revised reports from the cross-agency chairperson and the assessment tool coordinator into a single report.



CHECKLIST V: Strategic and Action Planning for the Future


  • Cross-agency chairperson examines final report as part of the review team.

  • Cross-agency chairperson obtains comments from stakeholders as part of the review team.

  • Cross-agency chairperson participates in the review team that makes recommendations for future action.



Appendix 2: Checklist for the Assessment Tool Coordinator


CHECKLIST 1: Planning for the Self-Assessment Process


  • Assessment tool coordinator develops plans for the following:

    • Orientation

    • Distribution

    • Collection

    • Data entry (with cross-agency chairperson)

    • Data analysis (with cross-agency chairperson).

      • Assess in-house and external capabilities to conduct analysis

      • Examine user’s guide analysis plan and customize, as needed

    • Reporting (with cross-agency chairperson and project director).


Checklist II: Implementing the Self-Assessment Tools


  • Orientation with all staff who will complete the tool

  • Distribution of the tools

  • Personnel complete and return tools.



CHECKLIST III: Data Analysis and Interpretation


  • The assessment tool coordinator oversees the activities of the data analyst for the three individual-level forms. These activities include the following:

    • Tracking and getting to know the data

    • Data entry

    • Data cleaning

    • Data transformations

    • Quantitative analysis

    • Developing the codebook.



CHECKLIST IV: Reporting


  • Assessment tool coordinator drafts reports on results for the officer, supervisor, and command staff findings.

  • Project director reviews reports and provides comments.

  • Project director synthesizes revised reports from cross-agency chairperson and assessment tool coordinator into a single report.



CHECKLIST V: Strategic and Action Planning for the Future


  • Assessment tool coordinator examines final report as part of the review team.

  • Assessment tool coordinator obtains comments from stakeholders on the results as part of the review team.

  • Assessment tool coordinator participates in the review team that makes recommendations for future action.


Appendix 3: Example of an Introduction/Orientation Statement for Cross-Agency Team Members


The { } Police Department/Sheriff’s Office plans to conduct a self-assessment to determine the current level of community policing implementation within our agency. The self-assessment will assist both management and officers in developing goals and action items to further advance community policing in our agency.


The self-assessment consists of two main components. First is an individual component in which all levels of the agency will complete a form assessing their community partnerships, problem-solving projects, and organizational transformation in support of community policing. Second is a cross-agency component that involves completing an organizational assessment of our agency in a group format. For this component, you will participate in one of three teams, with each team focusing on one of the three modules in the assessment tool. Each team will be composed of officers, supervisors, command staff, and community members. After the three groups have completed their assigned modules, the entire cross-agency team will meet to fill out a single form about the agency’s community policing implementation. When you meet in that setting, we encourage you to keep a log of the discussions, in particular noting any dissenting opinions. This information, along with the quantitative information from the form’s analysis, will assist our agency in analyzing the results, writing the report, and examining our next steps.


To help guide you through this process, we ask you to select a chairperson for the cross-agency team, who will spearhead this effort by separating you into three teams and setting the date for the tool review meeting that he or she will lead. The chairperson will also be responsible for working with the { } (the assessment tool coordinator) on overseeing data entry and analysis. He or she will also write the initial report of the findings from the assessment and sit on a review team that will review the final report, seek input from stakeholders, and make recommendations for future actions.


Generally, the questions in the self-assessment tool ask for your perceptions about the attitudes, behaviors, skills, and abilities within the agency as a whole. You will be able to answer many of the questions from memory, but some may require talking with additional persons or gathering some basic data. The primary role of the cross-agency team is to serve as fact collectors and fact checkers, which may entail some follow-up work with persons who have knowledge about how to determine the answer to specific questions.


Community members play a unique role in this process. While much of the information gathered will be about the activities and behaviors of officers, we are particularly interested in gauging and understanding your perceptions of the agency. We encourage all cross-agency team members to think critically when examining the agency’s community policing activities.

A word about the data we collect: Your responses to this self-assessment will be confidential and will be seen only by the analysis team. The self-assessment process is voluntary and you may skip items you do not wish to answer, but we encourage you to respond to as many items as possible.

Please let me know if you have questions about the procedures for completing this assessment. Thank you for your valuable support of this important project.


Appendix 4: Sampling



For most agencies, the Community Policing Self-Assessment Tool (CP-SAT) will likely be implemented with all command staff and possibly all supervisors because agencies generally do not have very many of those personnel. If your agency has more than 100 line officers, sampling is something that you might want to consider to conserve resources. There are two main types of sampling: non-probability sampling and probability-based sampling. We recommend the use of probability sampling.


Non-probability sampling


The difference between non-probability and probability sampling is that non-probability sampling does not involve random selection and probability sampling does. Does that mean that non-probability samples aren't representative of the population? Not necessarily. But it does mean that non-probability samples cannot depend on the rationale of probability theory. At least with a probabilistic sample, we know the odds or probability that we have represented the population well. We are able to estimate confidence intervals and other statistics. With non-probability samples, we may or may not represent the population well, and it often will be hard for us to know how well we've done so. In general, researchers prefer probabilistic or random sampling methods over non-probabilistic ones, and consider them to be more accurate and rigorous. In applied social research, however, there may be circumstances where it is not feasible, practical, or theoretically sensible to do random sampling.


Non-probability sampling involves two broad types: convenience and purposive sampling. Most sampling methods are purposive in nature because we usually approach the sampling problem with a specific plan in mind. For instance, interviewers in a mall who are carrying clipboards and stopping various people to ask for interviews are most likely conducting a purposive sample. They might be looking for Caucasian females between 30 and 40 years old. They size up the people passing by and ask anyone who appears to be in that category if they will participate. Purposive sampling can be very useful for situations where you need to reach a targeted sample quickly and where sampling for proportionality is not the primary concern. With a purposive sample, you are likely to get the opinions of your target population, but you are also likely to overweight subgroups in your population that are more readily accessible.


Convenience sampling is exemplified by the traditional person-on-the-street interviews conducted frequently by television news programs to get a quick (although nonrepresentative) reading of public opinion. The problem with this sampling approach is that we have no evidence that the sample is representative of the population we are interested in generalizing to, and in many cases we would clearly suspect that it is not.


Probability Sampling


Probability sampling is any method of sampling that uses some form of random selection. To have a random selection method, you must set up some process or procedure that assures that the different units in your population have equal probabilities of being chosen. Humans have long practiced various forms of random selection, such as picking a name out of a hat, or choosing the short straw. These days, we tend to use computers as the mechanism for generating random numbers as the basis for random selection.


The simplest form of random sampling is called simple random sampling, where a sample is chosen randomly so that each possible sample has the same probability of being chosen. One consequence is that each member of the population has the same probability of being chosen as any other. Simple random sampling is not the most statistically efficient method of sampling and, because of the luck of the draw, you may not have good representation of subgroups in a population. To deal with these issues, other sampling methods can be used.


Stratified random sampling, also sometimes called proportional sampling, involves dividing a population into homogeneous subgroups and taking a simple random sample in each subgroup (e.g., stratifying on police service areas/precincts). There are several major reasons why you might prefer stratified sampling over simple random sampling. First, it assures that you will be able to represent not only the overall population, but also key subgroups of the population, especially small minority groups. If you want to be able to talk about subgroups, this may be the only way to effectively assure you will be able to. If the subgroup is extremely small, you can use different sampling fractions within the different strata to randomly over-sample the small group.7 When we use the same sampling fraction within strata, we are conducting proportionate stratified random sampling. When we use different sampling fractions in the strata, we call this disproportionate stratified random sampling. Second, stratified random sampling will generally have more statistical precision than simple random sampling. This will be true only if the strata or groups are homogeneous. Some of the possible groups to stratify the sample would be geographic regions of the agency or police service areas/precincts, specialized units, gender, and ethnicity. By using stratification you will be assured that you have enough cases to analyze for each important subgroup.
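
For illustration, the following is a minimal Python sketch of proportionate stratified random sampling, using only the standard library; the roster, strata, and sampling fraction are invented:

    import random

    # Invented roster: (officer ID, precinct) pairs, 100 officers per precinct.
    roster = [(i, "Precinct " + str(i % 4 + 1)) for i in range(1, 401)]
    fraction = 0.25  # the same fraction in every stratum (proportionate)

    # Group officers into strata by precinct.
    strata = {}
    for officer_id, precinct in roster:
        strata.setdefault(precinct, []).append(officer_id)

    # Draw a simple random sample within each stratum.
    sample = []
    for precinct, officers in strata.items():
        size = round(len(officers) * fraction)
        sample.extend(random.sample(officers, size))

    print(len(sample), "officers selected")  # 100 of 400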


Systematic Sampling


Systematic sampling is functionally similar to simple random sampling. This kind of sampling involves the direct selection of officers from a list of all officers in the agency. In this process, the assessment tool coordinator or the person with whom the coordinator is working on the research design starts at a random point and selects every nth subject in the sampling frame. In systematic sampling there is the danger of order bias: the sampling frame list may arrange subjects in a pattern, and if the periodicity of systematic sampling matches the periodicity of that pattern, the result may be the systematic over- or under-representation of some stratum of the population. If, however, it can be assumed that the sampling frame list is randomly ordered, systematic sampling is mathematically equivalent to, and just as precise as, simple random sampling. Let's assume that your agency has 1,000 officers and that you want to take a sample of 200 officers. To use systematic sampling, the population must be listed in a random order. The sampling fraction would be 20 percent (f = 200/1,000), which gives a sampling interval of k = 5. Now, select a random integer from 1 to 5; imagine that you chose 4. To select the sample, start with the 4th unit in the list and take every k-th unit (every 5th, because k = 5). You would sample units 4, 9, 14, 19, and so on up to 999, winding up with 200 units in your sample. For this to work, it is essential that the units in the population are randomly ordered, at least with respect to the characteristics you are measuring. Why would you want to use systematic random sampling? For one thing, it is fairly easy to do: you only have to select a single random number to start things off. It may also be more precise than simple random sampling. Finally, in some situations there is simply no easier way to do random sampling.
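
For illustration, the following is a minimal Python sketch of the example above (1,000 officers, a sample of 200, so k = 5), using only the standard library and assuming the roster is in random order:

    import random

    roster = list(range(1, 1001))  # stand-in for a randomly ordered officer list
    k = len(roster) // 200         # sampling interval; 1,000 / 200 gives k = 5

    start = random.randint(1, k)   # random start between 1 and 5 (4 in the text)
    sample = roster[start - 1::k]  # every 5th officer from the random start

    print(len(sample), "officers selected")  # always 200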


Sample Size


Sample size needs to be estimated in the design phase of the project, when the agency determines how many personnel will be asked to complete the CP-SAT forms. Many agencies will simply have all the supervisors and commanders complete a CP-SAT form (given that there are not very many of these personnel in the agency). Typically, the issue will emerge when deciding how many line officers will complete the CP-SAT form. For agencies with fewer than 200 line-level officers, it generally will be easier to have all the line officers complete the CP-SAT form. It is the agencies with more than 200 line officers that will likely want to consider sampling. For agencies with fewer than 1,000 officers, the easiest strategy might be to use a systematic sampling approach of every other officer (based perhaps on the last digit of the officers' badge numbers). Needed sample size does not actually depend on the size of the population to be sampled, and even in the most complex analyses, samples of more than 500 are very rarely needed.


Certain general factors often are considered in assessing sample size requirements. Generally, the sample will need to be larger when: the relationships to be detected are weaker, the significance level to be applied is more stringent, more control variables will be used, the smallest class of any variable contains fewer cases, or the variance of one's variables is greater. Online sample size calculators may be found at http://www.surveysystem.com/sscalc.htm.
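
As a rough illustration of what such calculators compute, the sketch below applies the standard formula for estimating a proportion, n0 = z^2 * p(1 - p) / e^2, with a finite population correction; the 95 percent confidence level and the plus-or-minus 5 percent margin of error are assumptions chosen for the example.

    import math

    def sample_size(population, margin=0.05, z=1.96, p=0.5):
        """Sample size for estimating a proportion at a given margin of
        error, with a finite population correction."""
        n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
        n = n0 / (1 + (n0 - 1) / population)        # finite population correction
        return math.ceil(n)

    print(sample_size(1000))  # about 278 officers
    print(sample_size(200))   # about 132 -- close to a census for small agencies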


The Response Rate


For the CP-SAT to portray an accurate picture of the agency’s adoption of community policing, a representative sample (or the population) of agency personnel has to complete the CP-SAT forms. Above we discussed how to achieve a representative sample. In addition to using an appropriate sampling plan, each agency will have to adopt a follow-up strategy that leads to a good response rate. Even with a sound sampling plan, it is important that most of the officers/respondents who were given a CP-SAT form complete it. For example, if the only personnel who complete the CP-SAT forms are effective community policing officers, then the data will be biased and will not represent the less effective officers within the agency.


There is no specific response rate that will assure that the sample is representative of the whole agency. For example, it is possible to have a representative sample even if only half the people complete the CP-SAT form, provided that the half who did not complete it are not systematically different. Generally, in survey research you want a response rate of at least 70 percent. The key issue is making sure that the personnel who complete the survey do not differ from those who did not on important variables that may be related to community policing implementation. To make this assessment, the agency will need to conduct basic data analyses. First, the agency will need basic demographic data on all personnel (e.g., years/tenure in law enforcement and within the agency).8 Second, the agency will need a way of identifying those who completed the CP-SAT forms and those who did not.


On the first point, the agency will want to link the CP-SAT data to other personnel data the agency maintains that describes each officer. Comparisons between those who complete the CP-SAT form and those who do not can be made on years of service, rank, and other background information that appears relevant to capabilities in the area of community policing. If differences emerge between these two groups, the agency will know the limitations of the results and to whom they do not apply.


One way to handle differences between respondents and nonrespondents is to weight responses accordingly. For instance, in an agency with 100 officers, if nonrookies are under-represented in the respondent pool, one might weight their responses more heavily than the rookies’ responses. If the true proportion by years of experience is 90 nonrookies and 10 rookies, and 80 percent of the responses (n=80) are from nonrookies, then one could weight each nonrookie response by 2.25, which in effect gives 180 nonrookies and 20 rookies. To avoid artificially increasing the sample size from 100 to 200, one then applies further weighting to scale back to 100.
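
A minimal sketch of this weighting arithmetic; the group labels and counts come from the example above, and the final weights are rescaled so the weighted total stays at the actual 100 respondents.

    def nonresponse_weights(true_props, respondents):
        """Weight each group so the weighted sample matches the known
        population proportions while keeping the total sample size fixed."""
        n = sum(respondents.values())
        return {g: true_props[g] * n / respondents[g] for g in respondents}

    weights = nonresponse_weights(
        true_props={"nonrookie": 0.90, "rookie": 0.10},  # true mix: 90 and 10
        respondents={"nonrookie": 80, "rookie": 20},     # who actually responded
    )
    print(weights)  # {'nonrookie': 1.125, 'rookie': 0.5}
    # Weighted counts: 80 * 1.125 = 90 and 20 * 0.5 = 10 -- the true mix,
    # and the weighted total remains 100.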

With observed differences in mind, the agency can also do an additional wave of data collection and make special efforts to recruit specific noncompleters. This is sometimes referred to as intensive post-sampling.


On the second point, to link the CP-SAT data to other personnel data, the agency will need to set up a simple tracking system: a tracking form that links each respondent’s assigned research identification number (assigned to all persons who have been given a CP-SAT form) to his or her badge number or another identifier tied to a database containing the necessary background comparative information.
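
The sketch below shows one simple way to implement the link and the completion comparison; the identifiers and the years-of-service field are hypothetical.

    # Hypothetical tracking form: research ID -> badge number for every
    # person who was given a CP-SAT form.
    tracking = {"R001": 4417, "R002": 5120, "R003": 3388, "R004": 2901}

    # Research IDs that came back with a completed form.
    completed = {"R001", "R003"}

    # Hypothetical personnel database keyed by badge number.
    personnel = {4417: 12, 5120: 2, 3388: 7, 2901: 21}  # years of service

    # Compare average tenure of completers and noncompleters.
    groups = {"completed": [], "not completed": []}
    for rid, badge in tracking.items():
        label = "completed" if rid in completed else "not completed"
        groups[label].append(personnel[badge])
    for label, years in groups.items():
        print(label, sum(years) / len(years))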

Appendix 5: Sample Orientation Script


The { } Police Department/Sheriff’s Office is completing a self-assessment of how we implement community policing. Chief/Sheriff { }, { }, and I believe that this effort is important to our agency because { }.


Sworn members of the agency are being asked to complete a form assessing each staff person’s community partnerships, problem-solving projects, and the degree to which the structure of the organization supports community policing.


We will use the information to help with internal planning, and to inform training and management initiatives toward the full advancement of community policing. This self-assessment will also be used to determine the current level of community policing implementation within our agency, which will assist both management and officers in developing goals to reach the next steps in advancing community policing.

Each respondent should complete the Community Policing Self-Assessment (CP-SAT) form independently.


There are a few key terms used in the forms, and we’d like to make sure that we all have clear definitions. When we say officers, we are referring to { }. Persons at the rank of { } are supervisors, and we define command staff as people at the rank of { } or higher.


Your responses on this self-assessment will be confidential and will be seen only by the team analyzing the data. The self-assessment process is voluntary and you may skip items that you do not wish to answer, but we encourage you to respond to as many items as possible.


Please complete the form during {when to complete the form} and return it to {where and in what} by {date}. We estimate it will take 45 to 60 minutes to complete the form.


If you have questions about the procedures for completing this assessment or how to complete the form, contact { }.


Thank you for your valuable support of this important project.

APPENDIX 6: Scoring Guide

When reporting results, an agency will want to report the median and/or the mean and standard deviation of each score across respondents at each level (e.g., the median command staff scores for General Approach to Problem Solving, Problem-Solving Processes, and General Skill in Problem Solving, then the same for officers, supervisors, and the cross-agency team). If the agency wishes to do a more in-depth analysis, it may also want to look at subscores.
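
A minimal sketch of that aggregation in Python: average each respondent's items to get a scale score, then report the median (and, if desired, the mean and standard deviation) of those scores. The item responses shown are invented for illustration.

    import statistics

    def scale_scores(responses, items):
        """One score per respondent: the mean of the listed item numbers."""
        return [statistics.mean(r[i] for i in items) for r in responses]

    # Three hypothetical officers answering a 10-item scale (0-4 ratings).
    officers = [{i: 3 for i in range(1, 11)},
                {i: 2 for i in range(1, 11)},
                {i: 4 for i in range(1, 11)}]

    scores = scale_scores(officers, items=range(1, 11))
    print(statistics.median(scores))                    # the value to report
    print(statistics.mean(scores), statistics.stdev(scores))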



Cross-Agency


Community Partnerships


Wide Range of Partnerships—score (mean of items 1–10)


Resources/Commitment of Partners—score (mean of items 11–13)


Level of Interaction with Most Active Partners

Government partner

  • Reason does not have (item 15: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Nature of partnership and collaboration—score (mean of items 16–49, 51, and 52)

  • Best describes the relationship (item 50: in report indicate percent that marked 1, 2, 3, 4—no subtotal needed)

Community-based organization

  • Reason does not have (item 54: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Nature of partnership and collaboration—score (mean of items 55–88, 90, and 91)

  • Best describes the relationship (item 89: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

Community business

  • Reason does not have (item 93: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Nature of partnership and collaboration—score (mean of items 94–127, 129, and 130)

  • Best describes the relationship (item 128: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)


Level of Interaction with Other Partnerships

  • Best describes the relationship—score (mean of items 131–146)



Problem Solving


General Approach—score (mean of items 1–10)


Processes—score (mean of subscores)

  • Identifying and prioritizing problems—subscore (mean of items 11–27)

  • Analyzing problems—subscore (mean of items 28–49)

  • Responding to problems—subscore (mean of items 50–66)

  • Assessing problem solving initiatives—subscore (mean of items 67–85)


General Skill—score (mean of items 86–95)



Organizational Transformation


Agency Management—score (mean of subscores)

  • Agency climate and culture—subscore (mean of items 1–11)

  • Leadership—subscore (mean of items 12–39)

  • Labor relations—subscore (mean of items 40–43)

  • Decision-making—subscore (mean of items 44–54)

  • Planning and policies—subscore (mean of items 55–79)

  • Organizational evaluations—subscore (mean of items 80–89)

  • Transparency—subscore (mean of items 90–97)


Organizational Structure—score (mean of subscores)

  • Geographic assignment of officers—subscore (mean of items 98–106)

  • Despecialization—subscore (mean of items 107–113)

  • Resources and finance—subscore (mean of items 114–121)


Personnel Practices—score (mean of subscores)

  • Recruitment, hiring, and selection—subscore (mean of items 122–140)

  • Personnel evaluation and supervision—subscore (mean of items 141–163)

  • Training—subscore (mean of items 164–181)


Technology and Information Systems—score (mean of subscores)

  • Communication/access to data—subscore (mean of items 182–186)

  • Quality and accuracy of data—subscore (mean of items 187–197)



Officer Version


Community Partnerships


Wide Range of Partnerships—score (mean of items 1–10)


Resources/Commitment of Partners—score (mean of items 11–13)


Level of Interaction with Most Active Partner—score (mean of items 14–47)

Government partner

  • Reason does not have (item 49: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Best describes the relationship (item 50: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

  • Collaboration (mean of items 51 and 52)

Community-based organization

  • Reason does not have (item 54: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Best describes the relationship (item 55: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

  • Collaboration (mean of items 56 and 57)

Community business

  • Reason does not have (item 59: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Best describes the relationship (item 60: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

  • Collaboration (mean of items 61 and 62)

Individual community member

  • Reason does not have (item 64: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Best describes the relationship (item 65: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

  • Collaboration (mean of items 66 and 67)


Level of Interaction with Other Partnerships

  • Best describes the relationship—score (mean of items 68–87)



Problem Solving


General Approach—score (mean of items 1–10)


Processes—score (mean of subscores)

  • Identifying and prioritizing problems—subscore (mean of items 11–27)

  • Analyzing problems—subscore (mean of items 28–48)

  • Responding to problems—subscore (mean of items 49–65)

  • Assessing problem solving initiatives—subscore (mean of items 66–82)


General Skill—score (mean of items 83–92)



Organizational Transformation


Agency Management—score (mean of subscores)

  • Agency climate and culture—subscore (mean of items 1–11)

  • Leadership—subscore (mean of items 12–39)

  • Labor relations—subscore (no items)

  • Decision-making—subscore (mean of items 40–50)

  • Planning and policies—subscore (no items)

  • Organizational evaluations—subscore (no items)

  • Transparency—subscore (mean of items 51–58)


Organizational Structure—score (mean of subscores)

  • Geographic assignment of officers—subscore (mean of items 59–67)

  • Despecialization—subscore (mean of items 68–74)

  • Resources and finance—subscore (mean of items 75–82)


Personnel Practices—score (mean of subscores)

  • Recruitment, hiring, and selection—subscore (no items)

  • Personnel evaluation and supervision—subscore (mean of items 83–97)

  • Training—subscore (mean of items 98–104)


Technology and Information Systems—score (mean of subscores)

  • Communication/access to data—subscore (no items)

  • Quality and accuracy of data—subscore (mean of items 105–112)



Supervisor Version


Community Partnerships


Wide Range of Partnerships—score (mean of items 1–10)


Resources/Commitment of Partners—score (mean of items 11–13)


Level of Interaction with Most Active Partner—score (mean of items 14–47)

Government partner

  • Reason does not have (item 49: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Best describes the relationship (item 50: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

  • Collaboration (mean of items 51 and 52)

Community-based organization

  • Reason does not have (item 54: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Best describes the relationship (item 55: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

  • Collaboration (mean of items 56 and 57)

Community business

  • Reason does not have (item 59: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Best describes the relationship (item 60: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

  • Collaboration (mean of items 61 and 62)

Individual community member

  • Reason does not have (item 64: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Best describes the relationship (item 65: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

  • Collaboration (mean of items 66 and 67)


Level of Interaction with Other Partnerships

    • Best describes the relationship—score (mean of items 68–87)



Problem Solving


General Approach—score (mean of items 1–10)


Processes—score (mean of subscores)

  • Identifying and prioritizing problems—subscore (mean of items 11–27)

  • Analyzing problems—subscore (mean of items 28–48)

  • Responding to problems—subscore (mean of items 49–65)

  • Assessing problem solving initiatives—subscore (mean of items 66–82)


General Skill—score (mean of items 83–92)



Organizational Transformation


Agency Management—score (mean of subscores)

  • Agency climate and culture—subscore (mean of items 1–11)

  • Leadership—subscore (mean of items 12–39)

  • Labor relations—subscore (no items)

  • Decision-making—subscore (mean of items 40–50)

  • Planning and policies—subscore (no items)

  • Organizational evaluations—subscore (no items)

  • Transparency—subscore (mean of items 51–58)


Organizational Structure—score (mean of subscores)

  • Geographic assignment of officers—subscore (mean of items 59–67)

  • Despecialization—subscore (mean of items 68–74)

  • Resources and finance—subscore (mean of items 75–82)


Personnel Practices—score (mean of subscores)

  • Recruitment, hiring, and selection—subscore (no items)

  • Personnel evaluation and supervision—subscore (mean of items 83–105)

  • Training—subscore (mean of items 106–112)


Technology and Information Systems—score (mean of subscores)

  • Communication/access to data—subscore (no items)

  • Quality and accuracy of data—subscore (mean of items 113–120)



Command Staff Version


Community Partnerships


Wide Range of Partnerships—score (mean of items 1–10)


Resources/Commitment of Partners—score (mean of items 11–13)


Level of Interaction with Most Active Partner—score (mean of items 14–47)

Government partner

  • Reason does not have (item 49: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Best describes the relationship (item 50: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

  • Collaboration (mean of items 51 and 52)

Community-based organization

  • Reason does not have (item 54: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Best describes the relationship (item 55: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

  • Collaboration (mean of items 56 and 57)

Community business

  • Reason does not have (item 59: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Best describes the relationship (item 60: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

  • Collaboration (mean of items 61 and 62)

Individual community member

  • Reason does not have (item 64: in report, indicate percent that marked 1, 2, 3, 4, 5, 6—no subtotal needed)

  • Best describes the relationship (item 65: in report, indicate percent that marked 1, 2, 3, 4—no subtotal needed)

  • Collaboration (mean of items 66 and 67)


Level of Interaction with Other Partnerships

    • Best describes the relationship—score (mean of items 68–87)



Problem Solving


General Approach—score (mean of items 1–10)


Processes—score (mean of subscores)

  • Identifying and prioritizing problems—subscore (mean of items 11–27)

  • Analyzing problems—subscore (mean of items 28–49)

  • Responding to problems—subscore (mean of items 50–66)

  • Assessing problem solving initiatives—subscore (mean of items 67–85)


General Skill—score (mean of items 86–95)



Organizational Transformation


Agency Management—score (mean of subscores)

  • Agency climate and culture—subscore (mean of items 1–11)

  • Leadership—subscore (mean of items 12–39)

  • Labor relations—subscore (mean of items 40–43)

  • Decision-making—subscore (mean of items 44–54)

  • Planning and policies—subscore (mean of items 55–79)

  • Organizational evaluations—subscore (mean of items 80–89)

  • Transparency—subscore (mean of items 90–97)


Organizational Structure—score (mean of subscores)

  • Geographic assignment of officers—subscore (mean of items 98–106)

  • Despecialization—subscore (mean of items 107–113)

  • Resources and finance—subscore (mean of items 114–121)


Personnel Practices—score (mean of subscores)

  • Recruitment, hiring, and selection—subscore (mean of items 122–140)

  • Personnel evaluation and supervision—subscore (mean of items 141–163)

  • Training—subscore (mean of items 164–181)


Technology and Information Systems—score (mean of subscores)

  • Communication/access to data—subscore (mean of items 182–186)

  • Quality and accuracy of data—subscore (mean of items 187–197)




Reverse Scored Items


Cross-Agency


OT 110 (Despecialization)

PS 60


Officer


OT 71

PS 59


Supervisor


OT 71

PS 59

PS 72


Command Staff


OT 110 (Despecialization)

PS 60
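

Reverse-worded items such as these are typically recoded before computing scale scores. A minimal sketch, assuming the 0–4 response scale used throughout the instrument:

    def reverse_score(value, scale_max=4):
        """Recode a reverse-worded item on a 0..scale_max scale."""
        return scale_max - value

    print(reverse_score(0))  # 0 ("strongly disagree") becomes 4
    print(reverse_score(3))  # 3 becomes 1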


APPENDIX 7: Sample Report













The Anytown Police Department

Self-Assessment of Community Policing









December 2006




INTRODUCTION


The Anytown Police Department completed a self-assessment of its community policing implementation activities in October 2006. Ninety-seven of the agency’s 160 sworn employees completed at least some portion of the self-assessment (6 command staff, 22 supervisors, and 69 officers). This report summarizes the findings from the cross-agency team and the individual-level surveys for command staff, supervisors, and officers. The report is divided into three main sections of results: the community partnership component of the instrument, followed by the problem-solving and organizational transformation sections. Results are reported as median values unless otherwise noted. Items are first aggregated within each respondent by taking the mean (or average) of that individual’s items; the median of these individual scores was then chosen as the best indicator of perceptions of community policing activities across the department. The median is the middle value of the respondents’ scores.



Community Partnerships Module


Measuring Community Partnerships


The Community Partnerships Module measures three concepts:


  1. The extent to which the agency has a wide range of partnerships.

  2. The resources/commitment of the agency’s community partners.

  3. The level of interaction with the agency’s partners.


Community Partnership Findings


The results presented here represent a snapshot of the department’s partnership activities. The results are reported by the three major sections listed above. The section about the level of interaction with the agency’s partners is divided into two smaller sections: the first reports the findings from the cross-agency team; the second focuses on the individual-level surveys completed by the command staff, supervisors, and officers. The results are presented by rank level to highlight differences in perceptions across levels. All results include one cross-agency team survey (which was completed as a group) and command staff, supervisor, and officer survey results.9 It is important to note that the referent for items varies slightly across the versions, particularly the questions that focus on the level of interaction with agency partners. For those questions, the cross-agency team responded for the agency as a whole; supervisors responded for the officers they supervise; and command staff and officers responded from their own personal perspectives.


Types of Community Partners


The questions in this module focus on partnerships. Four sectors of partnership are identified.


  1. Government agencies: Relevant state, local, or federal government agencies, organizations, departments, or units. This may include the parks and recreation department, public works, parole/probation, and human service organizations.

  2. Community-based organizations: Organizations that generally provide services to the community or community members, or advocate on their behalf.

  3. Businesses: For-profit groups either operating in the community or otherwise having an interest in the community or its residents.

  4. Individual community members: Those who live, work, or otherwise have an interest in the community.


Wide Range of Partnerships


On a scale from 0 to 4 (0=Strongly disagree; 1=Disagree; 2=Neither agree nor disagree; 3=Agree; 4=Strongly agree), participants responded to items regarding their efforts (or the efforts of the agency or the officers they supervise) to establish and sustain a range of different types of partners. The items included statements such as, “The agency integrates its efforts with other agencies that deliver public services” and “Officers consult community members for solutions to community problems.” Figure 1 illustrates the median score on this section for each respondent level. As shown, the cross-agency team and command staff tended to rate the department higher on multidisciplinary partnerships than did supervisors and line officers.



Figure 1: Wide Range of Partnerships.



Resources/Commitment of Partners


Participants responded to items regarding the extent to which community partners are expected to provide support and resources to the law enforcement agency. Participants rated responses on a scale from 0 to 4 (0=Strongly disagree; 1=Disagree; 2=Neither agree nor disagree; 3=Agree; 4=Strongly agree). These items included statements such as, “Community members are held accountable for developing solutions to community problems.” Figure 2 illustrates the median score on this section for each respondent level. As shown, the cross-agency team and command staff tended to rate the department higher on resources and commitment of partners than did supervisors and line officers.



Figure 2: Resources/Commitment of Partners.


Level of Interaction with Partners


Cross-Agency Team Results


Most Active Partner by Sector


The cross-agency team identified its most active partners in three sectors: government agency, community-based organization (CBO), and community business partners. The most active government agency partner was Anytown’s Code Enforcement Department. Its most active CBO partner was Metropolitan Ministries, and its most active business partner was the West End Business District.


For each sector, the cross-agency team members answered questions about the nature of their partnership and collaboration with each partner. Items included the resources that both the law enforcement agency and the community partner bring to the partnership, such as accurate and current community information and human resources. Each sector also included questions characterizing the relationship, such as whether the partnership is characterized by trust and shared ownership of problems, whether the partnership is critical to reaching community policing goals, and whether the partner is engaged in short- or long-term problem-solving projects with the law enforcement agency. The mean value for the nature of the agency’s partnership and collaboration was 3.31 with Code Enforcement, 3.04 with Metropolitan Ministries, and 3.26 with the West End Business District.


The cross-agency team also was asked which of the following best characterizes their relationship with the partners:


  • One-way communication from the agency to the partner

  • One-way communication from the partner to the agency

  • Two-way information sharing

  • Collaboration, shared power, and shared decision-making.


For Code Enforcement, the relationship was characterized as “collaboration, shared power, and shared decision-making.” It was characterized as “two-way information sharing” for both Metropolitan Ministries and the West End Business District.


Interaction with Other Partners by Sector


The cross-agency team also had the opportunity to list other agency partnerships and put them into categories according to their interactions with the partners. All response options began with: “Interaction with the partners involves…” and the response options were as follows:


  • One-way communication from you (or the officers you supervise) to the partner

  • One-way communication from the partner to you (or the officers you supervise)

  • Two-way information sharing

  • Collaboration, shared power, and shared decision-making.


The mean response for this component was 3.02. The organizations are listed in Table 1.


Table 1: Additional Partners (Cross-Agency).

Government Agencies

Anytown Fire Department

Anytown Parks and Recreation Department

Parole

Johnson County Sheriff’s Office



Community-Based Organizations

CASA

Anytown Community College

Red Cross

Local Businesses

Lakewood Shopping Center

Anytown Chamber of Commerce

Oakwood Apartments

Additional Community Partners

Rabbi Joel Sherman

Reverend Al Green

Daniel Stone


Command Staff, Supervisor, and Officer Results

Single, Most Active Community Partner


This section focuses on the degree of the relationship between individual officers and their single most active community partners. General questions were asked regarding the relationship with the most active partner for officers, supervisors, and commanders. The referent for officers and command staff was their own personal activities, while the referent for supervisors was “the officers I supervise.”


Participants were asked to indicate their most active community partner from any sector (or that of the officers they supervise). Table 2 lists the partners mentioned.


Table 2: Most Active Partner (by Participant Version).

Command Staff

Code Enforcement

West Anytown Revitalization

Parole

Supervisors

March of Dimes

City of Anytown Community Programs Coordinator

State Attorney’s Office

Neighborhood Watch, Greenbrier neighborhood

Anytown recycling

Jane Brown

Anytown Chamber of Commerce

City of Anytown Community Programs Director

Metropolitan Ministries

Officers

Mothers Against Drunk Driving

Lakewood Heights Group

HACC

New Haven Retirement Center

Safe Havens

YMCA Achievers

Chamber of Commerce

Hollywood 16 Mgmt. Personnel

GNRC

Crime Stoppers

Wilson Elementary School

Anytown Fire Department

Miller Leasing

Oakwood Apartments

Confidential Informants

Assistant Principal at Garfield Middle School

City of Anytown Parks and Recreation Department

Children's Advocacy Center and Child Protection Team

Reverend Green

Citizen Watch Groups

CASA

West Anytown High School

Anytown School Board



Nature of the Partnership with the Single, Most Active Community Partner


This section focuses on questions about the relationship between the individual officer (or the officers who are supervised) and his or her most active community partner. Items include the resources that both the law enforcement agency and the community partner bring to the partnership, such as accurate and current community information and human resources. The section also includes questions characterizing their relationship, such as whether the partnership is characterized by trust and shared ownership of the problems as well as whether the partnership is critical to reaching community policing goals. Items were rated on a 5-point scale ranging from 0 to 4 (0=Strongly disagree; 1=Disagree; 2=Neither agree nor disagree; 3=Agree; 4=Strongly agree). The median scores by agency staff level for questions in this subsection are as follows: 2.82 for command staff, 2.12 for supervisors, and 2.49 for officers.


Most Active Partners by Sector


These questions focus on the most active partnerships between officers and partners from four sectors: government agencies, CBOs, community businesses, and individual community members.


Government Agency


When asked to name their most active government agency partner, 0 command staff, 4 supervisors, and 25 officers stated that they (or the officers they supervise) did not have such partnerships. Table 3 lists the government partners of the respondents who said that they had partners.


Table 3: Government Agency Partners (by Participant Version).

Command Staff

Code Enforcement

U.S. Attorney’s Office

Parole

Supervisors

Anytown Department of Human Services

Springfield Police Department

State Attorney’s Office

City of Anytown Community Programs Director

City of Anytown Community Programs Coordinator


Officers

Anytown Fire Department

Parole

Anytown Public Works

City of Anytown Parks and Recreation Department

Johnson County Health Department



The participants who said that they did not have an active government agency partner were asked to identify the barrier(s) preventing such relationships. Table 4 lists the percentages for each potential barrier to such a partnership. Respondents were asked to select all barriers that applied.


Table 4: Reasons for Lack of a Government Agency Partnership.

Command Staff
  Weak leadership: N/A (reported agency had partner)
  Inability to leverage financial resources: N/A (reported agency had partner)
  Inability to leverage nonfinancial resources: N/A (reported agency had partner)
  Little or no shared ownership of problems: N/A (reported agency had partner)
  Too few shared goals: N/A (reported agency had partner)
  Inability of law enforcement agency to take on nontraditional goals or activities: N/A (reported agency had partner)

Supervisors
  Weak leadership: 0%
  Inability to leverage financial resources: 0%
  Inability to leverage nonfinancial resources: 100%
  Little or no shared ownership of problems: 100%
  Too few shared goals: 100%
  Inability of law enforcement agency to take on nontraditional goals or activities: 100%

Officers
  Weak leadership: 17%
  Inability to leverage financial resources: 8%
  Inability to leverage nonfinancial resources: 10%
  Little or no shared ownership of problems: 19%
  Too few shared goals: 19%
  Inability of law enforcement agency to take on nontraditional goals or activities: 23%


Relationship with Partner. Participants who said that they had a government agency partner were asked to select a response that best described the level of collaboration and communication in that partnership. All response options began with: “Interaction with the government agency involves…” and the response options to describe the relationship were as follows:

  • One-way communication from you (or the officers you supervise) to the partner

  • One-way communication from the partner to you (or the officers you supervise)

  • Two-way information sharing

  • Collaboration, shared power, and shared decision-making.


The responses chosen are presented in Table 5.


Table 5: Relationship with a Government Agency Partner.

Command Staff
  One-way communication from you to the partner: 17%
  One-way communication from the partner to you: 33%
  Two-way information sharing: 33%
  Collaboration, shared power, and shared decision-making: 17%

Supervisors
  One-way communication from the officers you supervise to the partner: 33%
  One-way communication from the partner to the officers you supervise: 44%
  Two-way information sharing: 11%
  Collaboration, shared power, and shared decision-making: 11%

Officers
  One-way communication from you to the partner: 45%
  One-way communication from the partner to you: 41%
  Two-way information sharing: 9%
  Collaboration, shared power, and shared decision-making: 5%


Collaboration with Partner. These questions asked participants to state the extent to which they (or the officers they supervise) collaborate with the government partner in both short-term and long-term problem-solving projects. The median response by agency staff level was 3.5 for command staff, 2.4 for supervisors, and 2.0 for officers.


Community-Based Organizations (CBO)


When asked to indicate their most active CBO partner (or that of the officers they supervise), 0 command staff, 8 supervisors, and 26 officers stated that they did not have such partnerships. Table 6 lists the CBO partners identified by the respondents who had partners.


Table 6: Community-Based Organization Partners (by Participant Version).

Command Staff

Metropolitan Ministries

West Anytown Revitalization

Elks Lodge

Supervisors

March of Dimes

Red Cross

Lutheran Social Services

Boy Scouts

Anytown recycling

CASA

Metropolitan Ministries



Officers

Mothers Against Drunk Driving

Lakewood Heights Group

HACC

Crime Stoppers

Safe Havens

YMCA Achievers

Citizen Watch Groups

GNRC

Children's Advocacy Center and Child Protection Team

Anytown Victim Advocacy Group




Participants who indicated that they did not have an active CBO partner were asked to identify the barrier(s) preventing such relationships. Table 7 lists the percentages for each potential barrier to such a relationship. Respondents were asked to select all barriers that applied.


Table 7: Reasons for Lack of a Community-Based Organization Partnership.

Command Staff
  Weak leadership: N/A (reported agency had partner)
  Inability to leverage financial resources: N/A (reported agency had partner)
  Inability to leverage nonfinancial resources: N/A (reported agency had partner)
  Little or no shared ownership of problems: N/A (reported agency had partner)
  Too few shared goals: N/A (reported agency had partner)
  Inability of law enforcement agency to take on nontraditional goals or activities: N/A (reported agency had partner)

Supervisors
  Weak leadership: 100%
  Inability to leverage financial resources: 100%
  Inability to leverage nonfinancial resources: 0%
  Little or no shared ownership of problems: 100%
  Too few shared goals: 100%
  Inability of law enforcement agency to take on nontraditional goals or activities: 100%

Officers
  Weak leadership: 14%
  Inability to leverage financial resources: 8%
  Inability to leverage nonfinancial resources: 8%
  Little or no shared ownership of problems: 18%
  Too few shared goals: 20%
  Inability of law enforcement agency to take on nontraditional goals or activities: 20%


Relationship with Partner. Participants who indicated that they had a CBO partner were asked to select a response that best described the level of collaboration and communication in that partnership. Responses are presented in Table 8.


Table 8: Relationship with a Community-Based Organization Partner.

Command Staff
  One-way communication from you to the partner: 33%
  One-way communication from the partner to you: 17%
  Two-way information sharing: 50%
  Collaboration, shared power, and shared decision-making: 0%

Supervisors
  One-way communication from the officers you supervise to the partner: 43%
  One-way communication from the partner to the officers you supervise: 29%
  Two-way information sharing: 21%
  Collaboration, shared power, and shared decision-making: 7%

Officers
  One-way communication from you to the partner: 42%
  One-way communication from the partner to you: 33%
  Two-way information sharing: 14%
  Collaboration, shared power, and shared decision-making: 12%


Collaboration with Partner. Participants were asked to indicate the extent to which they collaborate with the CBO partner on short- and long-term problem-solving projects. The median response for this component was 3.0 for command staff, 2.20 for supervisors, and 2.25 for officers.


Community Business


When asked to identify their most active community business partner, 1 command staff, 4 supervisors, and 10 officers stated that they (or the officers they supervise) did not have such partnerships. The community business partners identified by respondents who said that they had partners are listed in Table 9.


Table 9: Most Active Community Business Partner (by Participant Version).

Command Staff

Anytown Chamber of Commerce

West End Business District Merchants


Supervisors

G & T Manufacturing

Eastland Mall

Super 8 Movie Theater

Anytown Chamber of Commerce



Officers

Quick Stop Convenience Store

Al’s Liquor

Park Plaza Strip Mall

Miller Leasing

Oakwood Apartments

K&S Construction

Anytown Chamber of Commerce

Hollywood 16 Mgmt. Personnel



Participants who said that they did not have an active community business partner were asked to identify the barrier(s) preventing such relationships. Table 10 lists the potential barriers and the percentages for each. Respondents were asked to select all barriers that applied.


Table 10: Reasons for Lack of a Community Business Partnership.

Command Staff
  Weak leadership: No answer reported
  Inability to leverage financial resources: No answer reported
  Inability to leverage nonfinancial resources: No answer reported
  Little or no shared ownership of problems: No answer reported
  Too few shared goals: No answer reported
  Inability of law enforcement agency to take on nontraditional goals or activities: No answer reported

Supervisors
  Weak leadership: 100%
  Inability to leverage financial resources: 100%
  Inability to leverage nonfinancial resources: 0%
  Little or no shared ownership of problems: 100%
  Too few shared goals: 100%
  Inability of law enforcement agency to take on nontraditional goals or activities: 100%

Officers
  Weak leadership: 11%
  Inability to leverage financial resources: 9%
  Inability to leverage nonfinancial resources: 13%
  Little or no shared ownership of problems: 21%
  Too few shared goals: 23%
  Inability of law enforcement agency to take on nontraditional goals or activities: 17%


Relationship with Partner. Participants who indicated that they had a community business partner were asked to select a response that best described the level of collaboration and communication in that partnership. Table 11 shows the findings.


Table 11: Relationship with the Community Business Partner.

Command Staff
  One-way communication from you to the partner: 20%
  One-way communication from the partner to you: 20%
  Two-way information sharing: 40%
  Collaboration, shared power, and shared decision-making: 20%

Supervisors
  One-way communication from the officers you supervise to the partner: 22%
  One-way communication from the partner to the officers you supervise: 39%
  Two-way information sharing: 28%
  Collaboration, shared power, and shared decision-making: 11%

Officers
  One-way communication from you to the partner: 36%
  One-way communication from the partner to you: 39%
  Two-way information sharing: 20%
  Collaboration, shared power, and shared decision-making: 5%


Collaboration with Partner. Participants were asked to indicate the extent to which they collaborate with the community business partner on short- and long-term problem-solving projects. The median response for this component was 2.1 for command staff, 2.14 for supervisors, and 2.29 for officers.


Individual Community Member


When asked to name their most active individual community member partner (or that of the officers they supervise), 0 command staff, 5 supervisors, and 27 officers stated that they did not have such partnerships. Table 12 lists the community members named by the respondents who had a partner.


Table 12: Most Active Individual Community Member Partner (by Participant Version).

Command Staff

Reverend Al Green

Mary Lloyd


Supervisors

Rabbi Joel Sherman

Chris Johnson

Kim Richards

Christine Taylor



Officers

Andrew Griffin

Amy Simpson

Father Charles O’Neil

Jose Rodriguez

John South

Jamie Wilson

Daniel Stone




Participants who said that they did not have an active individual community member partner were asked to identify the barrier(s) preventing such relationships. Table 13 provides the percentages for each potential barrier. Respondents were asked to select all barriers that applied.


Table 13: Reasons for Lack of an Individual Community Member Partnership.

Command Staff
  Weak leadership: N/A (reported agency had partner)
  Inability to leverage financial resources: N/A (reported agency had partner)
  Inability to leverage nonfinancial resources: N/A (reported agency had partner)
  Little or no shared ownership of problems: N/A (reported agency had partner)
  Too few shared goals: N/A (reported agency had partner)
  Inability of law enforcement agency to take on nontraditional goals or activities: N/A (reported agency had partner)

Supervisors
  Weak leadership: 100%
  Inability to leverage financial resources: 0%
  Inability to leverage nonfinancial resources: 0%
  Little or no shared ownership of problems: 100%
  Too few shared goals: 100%
  Inability of law enforcement agency to take on nontraditional goals or activities: 100%

Officers
  Weak leadership: 18%
  Inability to leverage financial resources: 11%
  Inability to leverage nonfinancial resources: 13%
  Little or no shared ownership of problems: 13%
  Too few shared goals: 18%
  Inability of law enforcement agency to take on nontraditional goals or activities: 18%


Relationship with Partner. Participants who indicated that they had an individual community member partner were asked to select a response that best described the level of collaboration and communication in that partnership. Findings are presented in Table 14.


Table 14: Relationship with the Individual Community Member Partner.

Command Staff
  One-way communication from you to the partner: 17%
  One-way communication from the partner to you: 33%
  Two-way information sharing: 50%
  Collaboration, shared power, and shared decision-making: 0%

Supervisors
  One-way communication from the officers you supervise to the partner: 31%
  One-way communication from the partner to the officers you supervise: 38%
  Two-way information sharing: 31%
  Collaboration, shared power, and shared decision-making: 0%

Officers
  One-way communication from you to the partner: 38%
  One-way communication from the partner to you: 38%
  Two-way information sharing: 19%
  Collaboration, shared power, and shared decision-making: 5%


Collaboration with Partner. Participants were asked to indicate the extent to which they collaborate with the individual community member through short- and long-term problem-solving projects. The median response for this component was 2.2 for command staff, 2.43 for supervisors, and 2.29 for officers.


Interaction with Other Partners by Sector


The command staff, supervisors, and officers also had the opportunity to list other agency partnerships and to put them into categories according to their interactions with the partners. All response options began with: “Interaction with the partners involves…” and the replies were as follows:


  • 1=One-way communication from you (or the officers you supervise) to the partner

  • 2=One-way communication from the partner to you (or the officers you supervise)

  • 3=Two-way information sharing

  • 4=Collaboration, shared power, and shared decision-making


The mean response for this component was 2.30 for command staff, 2.41 for supervisors, and 2.22 for officers. Table 15 lists the organizations and individuals named by the respondents.


Table 15: Additional Partners (by Participant Version).

Command Staff

Anytown Chamber of Commerce

Anytown Public Works Department


Supervisors

Anytown School Board

Anytown Community College

Glenview Hospital

Lakewood Shopping Center

East Anytown High School


Officers

Mark Lewis

Amy Simpson

Oakwood Apartments

Anytown Food Pantry

Glenview Elementary School

Metropolitan Ministries

Rabbi Joel Sherman





Problem-Solving Module


Measuring Problem Solving


The Problem-Solving Module measures three concepts:


  1. General approach to problem solving.

  2. Problem-solving processes including:

      1. Identifying and prioritizing problems (Scanning)

      2. Analyzing problems (Analysis)

      3. Responding to problems (Response)

      4. Assessing problem-solving initiatives (Assessment)

  3. General skill in problem solving.



Problem-Solving Findings

The results presented here are a snapshot of the department’s problem-solving approach and activities, reported by the three major sections outlined above; further details are provided in the problem-solving processes section. The results are presented by rank level to highlight differences in perceptions across levels and include one cross-agency team survey (which was completed as a group) and command staff, supervisor, and officer survey results.10 It is important to note that the referent for items varies slightly across the versions: the cross-agency team and command staff responded for the agency as a whole; supervisors responded for the officers they supervise; and officers responded from their own personal perspectives.


General Approach to Problem Solving


On a scale from 0 to 4 (0=Strongly disagree; 1=Disagree; 2=Neither agree nor disagree; 3=Agree; 4=Strongly agree), participants responded to items about the agency’s general approach to problem solving. The items included statements such as, “The agency collects information at each stage of problem solving” and “Patrol officers typically respond to calls for service using a problem-solving approach.” Figure 3 illustrates the median score on this section for each respondent level. The cross-agency team and command staff tended to rate the department higher on general problem solving than did supervisors and line officers.


Figure 3: General Approach to Problem Solving.



Problem-Solving Processes


Questions here referred to the various phases of problem solving: identifying and prioritizing problems (scanning), analyzing problems (analysis), responding to problems (response), and assessing problem-solving initiatives (assessment). Responses were provided on a 5-point scale from 0 to 4 (0=Strongly disagree; 1=Disagree; 2=Neither agree nor disagree; 3=Agree; 4=Strongly agree). A sample question on Identifying and Prioritizing Problems (Scanning Phase) was, “When identifying problems in your community, the agency reviews formal documentation (for example, police reports and citizen complaints).” To capture the agency’s approach to Analyzing Problems (Analysis Phase), questions were asked such as, “When analyzing problems, the agency ensures that relevant information has been collected before proceeding with a detailed analysis.” For Responding to Problems (Response Phase), a sample item presented was, “When responding to problems in your community, the agency brainstorms new solutions with stakeholders.” To capture participants’ perspectives on Assessing Problem Solving Initiatives (Assessment Phase), items were presented such as, “When assessing its problem-solving efforts, the agency examines whether the response was implemented as planned.” The median values for Problem-Solving Processes by agency staff level are provided in Figure 4.


Figure 4: Problem-Solving Processes.



Subcomponent 1: Identifying and Prioritizing Problems (Scanning)


Questions in this section focus on the extent to which officers (or the agency) take the time to identify and prioritize problems for the agency’s problem-solving efforts. The questions focus on the sources used during the scanning process, including community outreach and mapping specific crimes, as well as on agency support for scanning activities. The median scores by agency staff level for questions in this subsection are as follows: 2.85 for the cross-agency team, 2.93 for the command staff, 2.14 for supervisors, and 2.22 for officers.


Subcomponent 2: Analyzing Problems (Analysis)


This section focuses on the extent to which officers (or the agency) take the time to analyze problems that have been identified in the community. Some questions concentrate on various resources used, such as routinely collected police data and information, crime analysts, and community partners, while other questions concern agency support for analysis activities. The median scores by agency staff level for questions in this subsection are as follows: 2.97 for the cross-agency team, 2.64 for command staff, 2.08 for supervisors, and 2.26 for officers.


Subcomponent 3: Responding to Problems (Response)


Items in this subcomponent focus on how officers (or the agency) respond to identified community problems. The questions focus on the use of traditional and nontraditional police tactics, involvement of community partners, and linking the response with the results of the analysis. Some questions concern agency support for response activities. The median scores by agency staff level for questions in this subsection are as follows: 3.00 for the cross-agency team, 2.84 for command staff, 2.14 for supervisors, and 2.14 for officers.


Subcomponent 4: Assessing Problem-Solving Initiatives (Assessment)


This section focuses on how officers (or the agency) assess their problem-solving efforts. Questions include whether the response is monitored to sustain effectiveness, the types of assessment conducted, and the involvement of community partners. Some questions concern the level of support within the agency for assessment activities. The median scores by agency staff level for questions in this subsection are as follows: 2.86 for the cross-agency team, 2.58 for command staff, 1.96 for supervisors, and 2.10 for officers.


General Skill in Problem Solving


Participants rated the problem-solving skills of the agency, the officers they supervise, and themselves. Skills included using technology to facilitate problem solving, data analysis, applying “best practices” in problem solving, using problem-oriented policing literature, understanding the complexities of various public safety and crime problems, collaborating with the community in problem solving, and public speaking. For each skill, participants provided ratings on a scale from 0 to 4 (0=very low; 1= low; 2=satisfactory; 3=high; and 4=very high). The cross-agency team and command staff tended to rate personnel slightly higher on problem-solving skills than did supervisors and officers (see Figure 5).


Figure 5: General Skill in Problem Solving.



Organizational Transformation Module


Measuring Principles of Organizational Transformation


The Organizational Transformation Module measures four concepts:


  1. Agency management: This section covered agency climate and culture, leadership, labor relations, decision-making, planning and policies, organizational evaluations, and transparency.

  2. Organizational structure: This section included questions on geographic assignment of officers, despecialization, and resources and finances.

  3. Personnel practices: In this section, questions addressed recruitment, selection, and hiring; personnel evaluations and supervision; and training.

  4. Technology/information systems: This section consisted of questions regarding communication/access to data and the quality and accuracy of data.

Organizational Transformation Findings


All results included analyses of one cross-agency team survey (which was completed as a group) and command staff, supervisor, and officer surveys.11 The figures below represent the median scores for the following sections: agency management, organizational structure, personnel practices, and technology and information systems. All questions were rated on Likert-type scales. Unless otherwise noted, the values of the scale were: 0=Strongly disagree; 1=Disagree; 2=Neither agree nor disagree; 3=Agree; 4=Strongly agree.


Agency Management


Participants rated the extent to which they agreed with questions relating to the seven subcomponents of Agency Management, which are each discussed in more detail below. Figure 6 shows the median value of the aggregated subcomponents for each agency staff level. The ratings provided by the cross-agency team and the command staff were higher than those provided by supervisors and officers.


Figure 6: Agency Management.


Subcomponent 1: Agency Climate and Culture


Participants answered questions related to the community policing philosophy of their department. The items included whether there is agreement on what constitutes community policing in the agency and whether addressing quality-of-life concerns is a legitimate police activity. The median scores by agency staff level for questions in this subsection are as follows: 3.73 for the cross-agency team, 3.07 for command staff, 2.40 for supervisors, and 2.42 for officers.


Subcomponent 2: Leadership


Questions in this section asked about the work, actions, and behaviors of leadership, such as the chief/sheriff and top command staff, in supporting community policing. The median scores by agency staff level for questions in this subsection are as follows: 2.72 for the cross-agency team, 3.29 for command staff, 1.89 for supervisors, and 2.00 for officers.


Subcomponent 3: Labor Relations


Items in this section focus on the extent to which labor groups are engaged in community policing and how collective bargaining agreements facilitate community policing. Only the cross-agency team and command staff responded to these questions. Their median scores are 3.45 and 2.87, respectively.


Subcomponent 4: Decision-Making


These items asked participants to respond to questions that cover topics such as the decentralization of the agency, the impact of the agency’s organizational structure on decision-making and authority, and the extent to which officers are empowered to make decisions. The median scores by agency staff level across these types of items are as follows: 3.17 for the cross-agency team, 3.5 for command staff, 1.86 for supervisors, and 2.02 for officers.


Subcomponent 5: Planning and Policies


Participants were asked to rate the extent to which the community policing philosophy is incorporated into the agency’s strategic plan and mission and how well community policing has been institutionalized in policies and procedures. Only the cross-agency team and command staff were asked these questions. The median scores on this subcomponent are 3.63 and 3.13, respectively.


Subcomponent 6: Organizational Evaluations


This subcomponent included items related to the extent to which the agency incorporates community policing into its organizational performance measurement system. Questions for this subcomponent were included only in the cross-agency team and command staff versions of the survey. The median scores by agency staff level are 4.00 and 2.67, respectively.


Subcomponent 7: Transparency


In this subsection, respondents answered questions regarding the extent to which the agency is open and forthcoming about its community policing practices and intentions. These questions referred to the agency’s communication with external parties, such as the press and other government agencies, to provide critical information on agency activities. The median scores by agency staff level are: 3.38 for the cross-agency team, 2.75 for command staff, 1.97 for supervisors, and 1.72 for officers.


Organizational Structure


Participants rated the extent to which they agreed with questions relating to the three subcomponents of Organizational Structure, which are discussed in more detail below. Figure 7 shows the median value of the aggregated subcomponents for each agency staff level. As depicted in the graph, the ratings provided by the cross-agency team and the command staff were higher than those provided by supervisors and officers.




Subcomponent 1: Geographic Assignment of Officers


Participants were asked to indicate the extent to which beat assignments were long enough to facilitate relationship building with community members and to allow officers to see a measurable effect of their actions on community concerns. The median values for this subcomponent by agency staff level are: 3.00 for the cross-agency team, 2.50 for command staff, 2.13 for supervisors, and 2.31 for officers.


Subcomponent 2: Despecialization


These items referred to the extent to which the agency takes a generalist approach to community policing activities (instead of using special units) and the amount of time line officers are provided to contribute to such efforts. The median values for this subcomponent are: 2.88 for the cross-agency team, 3.06 for command staff, 1.88 for supervisors, and 2.00 for officers.


Subcomponent 3: Resources and Finances


Participants were asked to rate the extent to which the agency uses specific resources to facilitate partnerships and problem solving. The median values for this subcomponent are: 3.62 for the cross-agency team, 2.92 for command staff, 2.08 for supervisors, and 2.00 for officers.


Personnel Practices


Participants rated the extent to which they agreed with questions relating to the three subcomponents of Personnel Practices, which are discussed in more detail below. Figure 8 shows the median value of the aggregated subcomponents for each agency staff level. As depicted in the graph, the ratings provided by the cross-agency team were higher than those provided by the command staff, supervisors, and officers.





Subcomponent 1: Recruitment, Selection, and Hiring


Questions for this subcomponent related to the extent to which the agency’s recruitment, selection, and hiring practices are aligned with community policing principles. These questions were asked only of the cross-agency team and command staff. The median values were 3.85 and 2.33, respectively.


Subcomponent 2: Personnel Evaluation and Supervision


These items referred to the extent to which agency personnel are held accountable for mentoring and facilitating community-based problem solving and relationship building as well as engaging in collaborative partnerships throughout the community. The median values for this subcomponent are as follows: 4.00 for the cross-agency team, 2.38 for command staff, 1.89 for supervisors, and 2.00 for officers.


Subcomponent 3: Training


Participants were asked to indicate the extent to which community policing knowledge and skill building are incorporated into the agency’s training. The median values for this subcomponent are: 3.70 for the cross-agency team, 2.70 for command staff, 1.91 for supervisors, and 2.25 for officers.


Technology and Information Systems


Participants rated the extent to which they agreed with questions relating to the two subcomponents of Technology and Information Systems, which are discussed in more detail below. Figure 9 shows the median value of the aggregated subcomponents for each agency staff level. For the overall component of Technology and Information Systems, the ratings provided by the cross-agency team were higher than those provided by the command staff, supervisors, and officers.





Subcomponent 1: Communication/Access to Data


This subcomponent asked respondents to indicate their level of agreement with statements about the ways in which the agency uses information technology (IT) to communicate and access data. These questions referred to the extent to which IT is used to help facilitate continued dialog and promotion of community policing. The median values for this subcomponent are: 3.00 for the cross-agency team, 3.20 for command staff, 2.20 for supervisors, and 2.40 for officers.


Subcomponent 2: Quality and Accuracy of Data


Participants were asked to indicate the extent to which IT is used to ensure ease of access to critical data for problem-solving efforts and to track and organize data in a user-friendly format. The median values for this subcomponent are: 3.75 for the cross-agency team, 2.58 for command staff, 2.50 for supervisors, and 2.29 for officers.

APPENDIX 8: Resources


The following resources may be useful to police agencies as they plan for activities following the self-assessment. This list includes resources from the COPS Office, PERF, and Caliber, as well as selected other organizations, but is by no means exhaustive. For ease of review, the resources are grouped by topical area.


The Internet references cited in this publication were valid as of March 2008. Given that URLs and web sites are in constant flux, neither the authors nor the COPS Office can vouch for their current validity.


Citizen Complaints


Walker, Samuel, Carol Archbold, and Leigh Herbst. Mediating Citizen Complaints Against Police Officers: A Guide for Police and Community Leaders. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2002.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=134



Civil Rights

International Association of Chiefs of Police. Protecting Civil Rights: A Leadership Guide for State, Local, and Tribal Law Enforcement. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=251.



Consent Decrees


Davis, Robert C., Nichole J. Henderson, Janet Mandelstam, Christopher W. Ortiz, and Joel Miller. Federal Intervention in Local Policing: Pittsburgh’s Experience with a Consent Decree. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=90.


Davis, Robert C., Nichole J. Henderson, and Christopher W. Ortiz. Can Federal Intervention Bring Lasting Improvement in Local Policing? Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2005.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=23.


Davis, Robert C., Christopher W. Ortiz, Nichole J. Henderson, Joel Miller, and Michelle K. Massie. Turning Necessity into Virtue: Pittsburgh’s Experience with a Federal Consent Decree. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2002.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=217



Community Policing (General)

Chapman, Robert and Matthew Scheider. Community Policing for Mayors: A Municipal Service Model for Policing and Beyond. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=32.


Community Policing Consortium. Understanding Community Policing: A Framework for Action. Washington, D.C.: Bureau of Justice Assistance, U.S. Department of Justice, 1994.

http://www.ncjrs.gov/pdffiles/commp.pdf.


Flynn, Daniel W. Defining the “Community” in Community Policing. Washington, D.C.: Community Policing Consortium and Police Executive Research Forum, 1998. http://www.policeforum.org/upload/cp_570119206_12292005152452.pdf.


Fridell, Lorie A. and Mary Ann Wycoff, eds. Community Policing: The Past, Present, and Future. Washington, D.C.: Police Executive Research Forum, 2004.


Reuland, Melissa, Melissa Schaefer Morabito, Camille Preston, and Jason Cheney. Police-Community Partnerships to Address Domestic Violence. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=151



Community Surveying


Weisel, Deborah. Conducting Community Surveys: A Practical Guide for Law Enforcement Agencies. Washington, D.C.: Bureau of Justice Assistance and the Office of Community Oriented Policing Services, U.S. Department of Justice, 1999. http://www.ojp.usdoj.gov/bjs/pub/pdf/ccspglea.pdf.



Equipment and Facilities


Bulletproof Vest Partnership/Body Armor Safety Initiative. Washington, D.C.: Office of Justice Programs, U.S. Department of Justice.

http://www.ojp.usdoj.gov/bvpbasi/


International Association of Chiefs of Police. Police Facility Planning Guidelines: A Desk Reference for Law Enforcement Executives. Alexandria, Virginia, 2002.

http://www.theiacp.org/documents/pdfs/Publications/ACF2F3D%2Epdf.



Ethics


Kentucky Regional Community Policing Institute. Ethics for the Individual Officer: A Self-Assessment. (Training Course)

http://www.kycops.org/CourseInfo.htm


International Association of Chiefs of Police. Ethics Toolkit: Enhancing Law Enforcement Ethics in a Community Policing Environment. Washington, D.C.: International Association of Chiefs of Police and the Office of Community Oriented Policing Services, U.S. Department of Justice, 2002. http://www.theiacp.org/profassist/ethics/index.htm.



Evaluation


International Association of Chiefs of Police. Establishing and Sustaining Law Enforcement-Researcher Partnerships: Guide for Law Enforcement Leaders. Washington, D.C.: National Institute of Justice and International Association of Chiefs of Police, no date.


International Association of Chiefs of Police. Establishing and Sustaining Law Enforcement-Researcher Partnerships: Guide for Researchers. Washington, D.C.: National Institute of Justice and International Association of Chiefs of Police, no date.


Milligan, Stacy Osnick, Lorie Fridell, and Bruce Taylor. Implementing an Agency-Level Performance Management System: A Guide for Law Enforcement Executives. Washington, D.C.: National Institute of Justice and Police Executive Research Forum, 2006.

http://www.ncjrs.gov/pdffiles1/nij/grants/214439.pdf.


Moore, Mark H. and Anthony Braga. The “Bottom Line” of Policing: What Citizens Should Value (and Measure!) in Police Performance. Washington, D.C.: Police Executive Research Forum, 2003. http://www.policeforum.org/upload/BottomLineofPolicing_576683258_1229200520031.pdf.


Ward, Kristin, Susan Chibnall, and Robyn Harris. Measuring Excellence: Planning and Managing Evaluations of Law Enforcement Initiatives. Washington, D.C.: Office of Community Oriented Policing Services and ICF International, 2007.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=431.


Homeland Security


Carter, David L. Law Enforcement Intelligence: A Guide for State, Local, and Tribal Law Enforcement Agencies. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2004.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=277.


Davies, Heather J. and Gerard R. Murphy. Protecting Your Community from Terrorism, Volume 2: Working with Diverse Communities. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2004.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=168.


Davies, Heather J. and Martha R. Plotkin. Protecting Your Community from Terrorism, Volume 5: Partnerships to Promote Homeland Security. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2005.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=171.


Loyka, Stephen A., Donald A. Faggiani, and Clifford Karchmer. Protecting Your Community from Terrorism, Volume 4: The Production and Sharing of Intelligence. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2005.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=170.


Murphy, Gerard R. and Martha R. Plotkin. Protecting Your Community from Terrorism, Volume 1: Improving Local-Federal Partnerships. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=359.


Reuland, Melissa and Heather J. Davies. Protecting Your Community from Terrorism, Volume 3: Preparing for and Responding to Bioterrorism. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2004.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=169.


Scheider, Matthew, Robert E. Chapman, and Michael F. Seelman, “Connecting the Dots for a Proactive Approach,” BTS America (4)(2003): 158–162.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=245.



Labor Relations


DeLord, Ronald G. and Jerry Sanders. Police Labor-Management Relations, Vol. I: Perspectives and Practical Solutions for Implementing Change, Making Reforms, and Handling Crises for Managers and Union Leaders. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=249.


Polzin, Michael J. and Ronald G. DeLord. Police Labor-Management Relations, Vol. II: A Guide for Implementing Change, Making Reforms, and Handling Crises for Managers and Union Leaders. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=250.



Leadership


International Association of Chiefs of Police. Police Chiefs Desk Reference: A Guide for Newly Appointed Police Leaders. Washington, D.C.: Bureau of Justice Assistance, U.S. Department of Justice, 2004.

http://www.theiacp.org/research/PCDR.pdf.


International Association of Chiefs of Police. Leadership in Police Organizations (LPO) Program.

http://www.theiacp.org/cp12/pdfs/CPL_Update.pdf


Police Executive Research Forum. Senior Management Institute for Police. (Check PERF calendar for dates)

http://www.policeforum.org.

Scott, Michael. Managing for Success: A Police Chief’s Survival Guide. Washington, D.C.: Police Executive Research Forum, 1986.



Marketing


Margolis, Gary J. and Noel C. March, “Creating the Police Department’s Image,” Police Chief 71 (4)(2004): 25–27, 29–30, 33–34.


Sprafka, Harvey E., “Marketing the Smaller Agency,” Police Chief 71 (9)(2004): 20–25.


Chermak, Steven and Alexander Weiss, “Marketing Community Policing in the News: A Missed Opportunity?” NIJ Research in Practice, July 2003. Washington, D.C.: Office of Justice Programs, 2003.

http://cops.usdoj.gov/ric/ResourceDetail.aspx?RID=133.



Operations


McEwen, Tom, Deborah Spence, Russell Wolff, Julie Wartell, and Barbara Webster. Call Management and Community Policing: A Guidebook for Law Enforcement. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=22.


Organizational Change


Schneider, Andrea, Clark Kimerer, Scott Seaman, and Joan Sweeney. Community Policing in Action! A Practitioner’s Eye View of Organizational Change. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=279




Partnerships


Gordon, Mary Beth. Making the Match: Law Enforcement, the Faith Community and the Value-Based Initiative. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://cops.usdoj.gov/ric/ResourceDetail.aspx?RID=132.


Khashu, Anita, Robin Busch, and Zainab Latif. Building Strong Police-Immigrant Community Relations: Lessons from a New York City Project. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, with the Vera Institute of Justice, 2005.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=17.


Lane, Erin, John Lucera, and Rachel Boba. Inter-Agency Response to Domestic Violence in a Medium Sized City. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=114.


National Center for Victims of Crime and the Police Foundation. Bringing Victims into Community Policing. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2002.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=16.


National Center for Victims of Crime and the Police Foundation. Creating an Effective Stalking Protocol. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2002.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=45.


Police Executive Research Forum (PERF). Increasing Community-Police Partnerships to Fight Crime: A Case Study of USAID’s Grants Pen Anti-Crime Initiative in Jamaica. Washington, D.C.: USAID and PERF, 2005.




Rinehart, Tammy, Anna T. Lazlo, and Gwen O. Briscoe. Collaboration Toolkit: How to Build, Fix, and Sustain Productive Partnerships. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2001.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=236.



Personnel


Charlotte-Mecklenburg Police Department. Employee Conduct: Investigations & Discipline. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice; Charlotte, North Carolina: Charlotte-Mecklenburg Police Department, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=86.


Kochel, Tammy Rinehart, Anna T. Laszlo, and Laura B. Nickles. SRO Performance Evaluation: A Guide to Getting Results. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2005.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=240


Oettmeier, Timothy N. and Mary Ann Wycoff. Personnel Performance Evaluations in the Community Policing Context. Washington, D.C.: Community Policing Consortium and Police Executive Research Forum, 1997.

http://www.policeforum.org/library.asp?MENU=35 (then click Human Resources Issues)


Policing Organizations: Web Sites


Commission on Accreditation of Law Enforcement Agencies, Inc.

http://www.calea.org


International Association of Chiefs of Police

http://www.theiacp.org


The National Organization of Black Law Enforcement Executives

http://www.noblenational.org/


National Sheriffs’ Association

http://www.sheriffs.org


Office of Community Oriented Policing Services, U. S. Department of Justice

http://www.cops.usdoj.gov


Police Executive Research Forum

http://www.policeforum.org


Police Foundation

http://www.policefoundation.org



Problem Solving


Boba, Rachel. Problem Analysis in Policing. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice and the Police Foundation, 2003. http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=161.


Bynum, Timothy S. Using Analysis for Problem-Solving: A Guidebook for Law Enforcement. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=223.


Center for Problem-Oriented Policing.

http://www.popcenter.org/


Charlotte-Mecklenburg Police Department. Early Intervention System: A Tool to Encourage and Support High Quality Performance. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice; Charlotte, North Carolina: Charlotte-Mecklenburg Police Department, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=84.


Chavez, T. Dave, Jr., Michael R. Pendleton, and Jim Bueerman. Knowledge Management in Policing. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2005.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=124.


Clarke, Ronald V. and John E. Eck. Crime Analysis for Problem Solvers in 60 Small Steps. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2005.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=47


Clarke, Ronald V. and Herman Goldstein. Reducing Theft at Construction Sites: Lessons from a Problem-Oriented Project. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=179

Clarke, Ronald V. and Herman Goldstein. Thefts from Cars in Center-City Parking Facilities: A Case Study in Implementing Problem-Oriented Policing. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=204


Community Policing Consortium. Collaborative Problem Solving Training. Facilitator’s guide and slides.

http://www.policeforum.org.


Community Policing Consortium. Mechanics of Problem Solving Training. Facilitator’s guide and slides.

http://www.policeforum.org.


Community Policing Consortium. Supervising the Problem Solving Process Training. Facilitator’s guide and slides.

http://www.policeforum.org.

O’Shea, Timothy and Keith Nicholls. Crime Analysis in America. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2002.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=48


O’Shea, Timothy and Keith Nicholls. Crime Analysis in America: Findings and Recommendations. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=49.


Perkins, Matt, Scott Phillips, Tammy Rinehart, Karin Schmerler, and Meg Townsend. Problem-Solving Tips: A Guide to Reducing Crime and Disorder through Problem-Solving Partnerships. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 1998.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=164


Problem-Oriented Guides for Police (POP Guides). Published by the COPS Office, the POP Guides cover a wide variety of problem-specific topics to help identify potential factors and underlying causes of problems, identify known responses, and provide potential measures to assess the effectiveness of problem-solving efforts.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=392

Sampson, Rana and Michael S. Scott. Tackling Crime and Other Public-Safety Problems: Case Studies in Problem-Solving. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 1999.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=202


Schmerler, Karin, Matt Perkins, Scott Phillips, Tammy Rinehart, and Meg Townsend. A Guide to Reducing Crime and Disorder through Problem-Solving Partnerships. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=164


Scott, Michael S. Problem-Oriented Policing: Reflections on the First 20 Years. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2000.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=162


Racially Biased Policing


Fridell, Lorie, Robert Lunney, Drew Diamond, and Bruce Kubu. Racially Biased Policing: A Principled Response. Washington, D.C.: Police Executive Research Forum and the Office of Community Oriented Policing Services, U.S. Department of Justice, 2001.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=172



Recruitment, Hiring, and Selection


Aamodt, Michael G. Law Enforcement Selection: Research Summaries. Washington, D.C.: Police Executive Research Forum, 2004. http://www.radford.edu/~maamodt/LawEnforcement%20Selection/Aamodt%20-%20Research%20Summaries.pdf.


Police Executive Research Forum and International City/County Management Association. Selecting a Police Chief: A Handbook for Local Government. Washington, D.C.: International City/County Management Association and Police Executive Research Forum, 1999.


Scrivner, Ellen. Innovations in Police Recruitment and Hiring: Hiring in the Spirit of Service. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=113



Reentry


La Vigne, Nancy G., Amy L. Solomon, Karen A. Beckman, and Kelly Dedel. Prisoner Reentry and Community Policing: Strategies for Enhancing Public Safety. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=159


Urban Institute, Reentry Roundtable Meeting. Prisoner Reentry and Community Policing: Strategies for Enhancing Public Safety Working Papers. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2004. http://www.cops.usdoj.gov/mime/open.pdf?Item=1509.



Restorative Justice


Nicholl, Caroline G. Community Policing, Community Justice, and Restorative Justice: Exploring the Links for the Delivery of a Balanced Approach to Public Safety. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 1999.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=33


Nicholl, Caroline G. Toolbox for Implementing Restorative Justice and Advancing Community Policing. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 1999.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=216



Strategic Planning, Policy Development


Finn, Peter, Meg Townsend, Michael Shively, and Tom Rich. A Guide to Developing, Maintaining, and Succeeding with Your School Resource Officer Program. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2005.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=231


International Association of Chiefs of Police. National Law Enforcement Policy Center. Assists law enforcement agencies across the United States in developing and refining law enforcement policy.

http://www.theiacp.org/pubinfo/PolCtr.htm


International Association of Law Enforcement Planners. An organization for people who work in planning and research for criminal justice agencies and who share information about innovations, problems, and solutions confronting criminal justice today.

http://www.ialep.org


Kelling, George L. and Mary A. Wycoff. The Evolving Strategy of Policing: Case Studies of Strategic Change. Washington, D.C.: National Institute of Justice, U.S. Department of Justice, 2001.

http://www.ncjrs.gov/pdffiles1/nij/grants/198029.pdf.


Police Executive Research Forum. Police Department Budgeting: A Guide for Law Enforcement Chief Executives. Washington, D.C.: Police Executive Research Forum, 2002.


Society of Police Futurists International. An organization of law enforcement practitioners, educators, researchers, private security specialists, technology experts, and others involved in developing, analyzing, and interpreting long-range forecasts to anticipate and prepare for the future of law enforcement.

http://www.policefuturists.org/.


Spence, Deborah, Barbara Webster, and Edward Connors. Guidelines for Starting and Operating a New Police Department. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=104




Technology and Information Management

21st Century Solutions, Inc. Building a 3-1-1 System for Police Non-Emergency Calls: A Case Study of the Austin Police Department (Executive Summary). Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003. http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=234


21st Century Solutions, Inc. Building a 3-1-1 System for Police Non-Emergency Calls: A Process and Impact Evaluation (Austin Police Department). Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=235



Abt Associates, Inc. Information Systems Technology Enhancement Program (ISTEP): ISTEP II Case Studies, Final Report. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=117



Boba, Rachel. Introductory Guide to Crime Analysis and Mapping. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice and the Police Foundation, 2001.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=273



Dunworth, Terrence, Gary Cordner, Jack Greene, Timothy Bynum, Scott Decker, Thomas Rich, Shawn Ward, and Vince Webb. Police Department Information Systems Technology Engagement Project (ISTEP). Prepared by Abt Associates, Inc. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2000.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=116



Faggiani, Don, Bruce Kubu, and Ramona Rantala. Facilitating the Implementation of Incident-Based Data Systems. Washington, D.C.: Bureau of Justice Statistics, U.S. Department of Justice, 2005.

http://www.policeforum.org/library.asp?MENU=187.


Fridell, Lorie A. By the Numbers: A Guide for Analyzing Race Data from Vehicle Stops. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2004.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=242


Fridell, Lorie A. Understanding Race Data from Vehicle Stops: A Stakeholder’s Guide. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2005.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=220


Harris, Kelly J. and William H. Romesburg. Law Enforcement Tech Guide: How to plan, purchase, and manage technology (successfully!). Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2002.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=243


Hawkins, Dan M. Law Enforcement Tech Guide for Communications Interoperability: A Guide for Interagency Communications Projects. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=238


McEwen, Tom, Deborah Spence, Russell Wolff, Julie Wartell, and Barbara Webster. Call Management and Community Policing: A Guidebook for Law Enforcement. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=22.


McMahon, Joyce, Joel Garner, Ronald Davis, and Amanda Kraus. How to Correctly Collect and Analyze Racial Profiling Data: Your Reputation Depends on It. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2002.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=109


O’Shea, Timothy C. and Keith Nicholls. Crime Analysis in America. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=48


Police Foundation. Crime Mapping Laboratory. Integrating Community Policing and Computer Mapping: Assessing Issues and Needs among COPS Office Grantees. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice and Police Foundation, 2000.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=272


Rich, Tom, Peter Finn, and Shawn Ward. Guide to Using School COP to Address Student Discipline and Crime Problems. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2001.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=103


Romesburg, William H. Law Enforcement Tech Guide for Small and Rural Police Agencies: A Guide for Executives, Managers, and Technologists. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2005. http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=126


SEARCH: The National Consortium for Justice Information and Statistics. SEARCH focuses on helping identify and solve information management challenges facing state and local justice and public safety agencies.

http://www.search.org.


Skogan, Wesley, Susan M. Hartnett, Jill DuBois, and Jason Bennis (The Institute for Policy Research, Northwestern University). Policing Smarter Through IT: Lessons in Enterprise Implementation. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2004.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=269.


Skogan, Wesley, Susan M. Hartnett, Jill DuBois, Jason Bennis, So Young Kim, Dennis Rosenbaum, Lisa Graziano, and Cody Stephens (The Institute for Policy Research, Northwestern University, and the University of Illinois at Chicago). Policing Smarter Through IT: Learning from Chicago’s Citizen and Law Enforcement Analysis and Reporting (CLEAR) System. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=152


Walker, Samuel. Early Intervention Systems for Law Enforcement: A Planning and Management Guide. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2003.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=85


Walker, Samuel and Stacy Osnick Milligan. Supervision and Intervention with Early Intervention Systems: A Guide for Law Enforcement Chief Executives. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2005. http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=241


Walker, Samuel and Stacy Osnick Milligan. Strategies for Intervening with Officers through Early Intervention Systems: A Guide for Front-Line Supervisors. Washington, D.C.: Office of Community Oriented Policing Services, U.S. Department of Justice, 2006.

http://www.cops.usdoj.gov/ric/ResourceDetail.aspx?RID=197



Training


Police Training Officer (PTO) Program. An alternative national model for field training that incorporates community policing and problem-solving principles.

http://www.cops.usdoj.gov/Default.asp?Item=461


Regional Community Policing Institutes (RCPI). A national network providing comprehensive, innovative, cutting-edge education, training, and technical assistance to COPS Office grantees.

http://www.cops.usdoj.gov/Default.asp?Item=115


APPENDIX 9: About ICF International


ICF is a global professional services firm that partners with government and commercial clients to deliver consulting services and technology solutions in social programs, human capital management, homeland security, defense, energy, environment, and transportation. ICF International has more than 4,000 employees serving clients in the Americas, Asia, and Europe. Services for this contract were provided by the Applied Organizational Research group.

The Applied Organizational Research group is staffed by more than 80 professionals, including more than 30 Ph.D.-level staff members with significant experience in organizational assessment, personnel assessment, selection and placement, strategic workforce planning, and training and leadership development.  


To learn more about ICF International, visit www.icfi.com.


APPENDIX 10: About the Police Executive Research Forum


The Police Executive Research Forum (PERF) is a national organization of progressive law enforcement chief executives from city, county, and state agencies who collectively serve more than half of the country’s population. Established in 1976 by 10 prominent police chiefs, PERF has evolved into one of the leading police think tanks. With membership from many of the largest police departments in the country and around the globe, PERF has pioneered studies in such fields as community and problem-oriented policing, racially biased policing, multijurisdictional investigations, domestic violence, the police response to people with mental illnesses, homeland security, management concerns, use of force, and crime-reduction approaches.


PERF’s success is built on the active involvement of its members: police chiefs, superintendents, sheriffs, and other law enforcement leaders. PERF also has different kinds of memberships that allow the organization to benefit from the diverse views of criminal justice researchers, law enforcement of all ranks, and others committed to advancing policing services to all communities. As a nonprofit organization, PERF is committed to the application of research in policing and to promoting innovation that will enhance the quality of life in our communities. PERF’s objective is to improve the delivery of police services and the effectiveness of crime control through the exercise of strong national leadership, the public debate of criminal justice issues, the development of a body of research about policing, and the provision of vital management services to all police agencies.


In addition to PERF’s cutting-edge police and criminal justice research, the organization provides a wide variety of management and technical assistance programs to police agencies throughout the world. The organization also continues to work toward increased professionalism and excellence in the field through its training, leadership, and publications programs. For example, PERF sponsors the Senior Management Institute for Police, conducts searches for communities seeking police chief executives, and publishes some of the leading literature in the law enforcement field addressing the difficult issues that challenge today’s police leaders. PERF publications are used for training and promotion exams and to inform police professionals about innovative approaches to community problems.


To learn more about PERF visit www.policeforum.org.

APPENDIX 11: About the Office of Community Oriented Policing Services


Who We Are

The Office of Community Oriented Policing Services (the COPS Office) was created through the Violent Crime Control and Law Enforcement Act of 1994. As a component of the Justice Department, the COPS Office has the mission of advancing the practice of community policing as an effective strategy to improve public safety. Community policing represents a shift from more traditional law enforcement practices, moving the police from a reactive to a proactive role. By addressing the root causes of criminal and disorderly behavior, rather than simply responding to crimes once they have been committed, community policing concentrates on preventing both crime and the atmosphere of fear it creates. Additionally, community policing encourages the use of crime-fighting technology and operational strategies and the development of mutually beneficial relationships between law enforcement and the community. By earning the trust of the members of their communities and making those individuals stakeholders in their own safety, law enforcement can better understand and address the community’s needs and the factors that contribute to crime.

What We Do

The COPS Office awards grants to tribal, state, and local law enforcement agencies to hire and train community policing professionals, acquire and deploy cutting-edge crime-fighting technologies, and develop and test innovative policing strategies. COPS Office funding provides training and technical assistance to advance community policing at all levels of law enforcement, from line officers to law enforcement executives, as well as others in the criminal justice field. Because community policing is inclusive, COPS Office training also reaches state and local government leaders and the citizens they serve.

Since 1995, the COPS Office has invested $12.4 billion to help law enforcement advance the practice of community policing, and has enabled more than 13,000 state, local, and tribal agencies to hire more than 117,000 police officers and deputies. Our online Resource Information Center (RIC) offers publications, DVDs, CDs, and training materials on a wide range of law enforcement concerns and community policing topics. To date, we have distributed more than 1.1 million of these knowledge resources.

Through this broad range of programs, the COPS Office offers support in virtually every aspect of law enforcement, making America safer, one neighborhood at a time.



To learn more about the COPS Office, visit www.cops.usdoj.gov.


1 The existing assessment resources identified at the beginning of this project included the Onsite Assessment Process developed by the Western Regional Institute for Community Oriented Public Safety and funded by the U.S. Department of Justice Office of Community Oriented Policing Services (the COPS Office); an organizational survey developed by J. Kevin Ford at Michigan State University with funding from the COPS Office; and a community policing checklist in the 1994 Trojanowicz and Bucqueroux publication Community Policing: How to Get Started.

2 Office of Community Oriented Policing Services, “Community Policing.” http://www.cops.usdoj.gov/Default.asp?Item=171.

3 Office of Community Oriented Policing Services, “What Is Community Policing?” http://www.cops.usdoj.gov/default.asp?Item=36.

4 A major conceptual vehicle for helping officers to think about problem solving in a structured and disciplined way came out of work from Eck and Spelman’s report on problem solving in Newport News, Virginia (Eck and Spelman, 1987). As part of a problem-oriented policing project in Newport News, officers worked with researchers from the Police Executive Research Forum to develop a problem-solving model that could be used to address any crime or disorder problem. The result was the SARA model. Since the mid-1980s, many officers have used the SARA model to guide their problem-solving efforts. Although the SARA model is not the only way to approach problem solving, it can serve as a helpful tool. While the CP-SAT is consistent with the principles of SARA, departments do not need to be using the SARA model to make use of this tool. The problem-solving component of this tool is general enough that those using non-SARA model approaches to problem solving will find the tool equally applicable.

5 For instance, whenever one wishes to compare the average performance of two groups within a sample, he or she should consider the t-test, which assesses whether the means of the two groups are statistically different from each other. A difference of proportions can be tested with the chi-square statistic. Most of the major inferential statistics come from a general family of statistical models known as the General Linear Model (GLM), which includes the t-test, analysis of variance (ANOVA), analysis of covariance (ANCOVA), regression analysis, and many multivariate methods such as factor analysis, multidimensional scaling, cluster analysis, and discriminant function analysis. A GLM should be used to learn about the relationships among multiple variables.
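As a concrete starting point, the following sketch shows how such two-group comparisons might be run in Python with SciPy. The ratings below are invented for illustration and are not actual CP-SAT data; the tests themselves are standard SciPy routines, not part of the CP-SAT.

    # A minimal sketch of the group comparisons described in this footnote.
    from scipy import stats

    # Hypothetical ratings on one item (1-5 agreement scale) for two groups.
    officer_ratings = [2, 3, 2, 4, 2, 3, 1, 2]
    supervisor_ratings = [3, 4, 3, 2, 4, 3]

    # t-test: are the two group means statistically different?
    t_stat, p_value = stats.ttest_ind(officer_ratings, supervisor_ratings)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

    # Chi-square test for a difference of proportions, e.g., the share of
    # each group "agreeing" (rating of 3 or higher) on the item.
    agree = [sum(r >= 3 for r in officer_ratings),
             sum(r >= 3 for r in supervisor_ratings)]
    disagree = [sum(r < 3 for r in officer_ratings),
                sum(r < 3 for r in supervisor_ratings)]
    chi2, p, dof, expected = stats.chi2_contingency([agree, disagree])
    print(f"chi-square = {chi2:.2f}, p = {p:.3f}")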

6 It should be noted that inferential statistics (e.g., t-tests) are used if the groups in the data are part of a sample drawn from some larger population. If, instead, the whole population was administered the community policing self-assessment instrument (which is likely to be the case in many agencies), then whatever differences are observed between two groups are the actual population differences, and no t-tests are needed.


7 Note, though, that you will then have to weight the within-group estimates by the sampling fraction whenever you want overall population estimates.
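For example, a minimal sketch of that weighting in Python; the group sizes and means below are invented for illustration:

    # Weighted overall estimate from within-group estimates. The population
    # counts and sample means are illustrative only.
    group_sizes = {"officers": 120, "supervisors": 25, "command": 10}
    group_means = {"officers": 2.1, "supervisors": 2.4, "command": 3.2}

    total = sum(group_sizes.values())
    overall = sum(group_sizes[g] / total * group_means[g] for g in group_sizes)
    print(f"Weighted overall estimate: {overall:.2f}")  # approx 2.22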

8 Possible items include the following:

How many years have you worked with your current agency?

How many years have you worked in law enforcement (including sworn and unsworn)?

What is your rank in the department (civilian, sergeant, officer, deputy)?

Have you ever served as a sector/community police officer or equivalent position?

If yes, how many months have you held/did you hold the position?

9 Ninety-seven individuals returned the CP-SAT, and 88 percent of those individuals attempted the community partnerships module (62 officers, 17 supervisors, and 6 command staff). The totals for officers, supervisors, and command staff include the cross-agency team, which consisted of three command staff, three supervisors, three officers, and three community members.

10 Ninety-seven individuals returned the CP-SAT, and on average 93 percent of those individuals attempted some part of the problem-solving module (65 officers, 19 supervisors, and 6 command staff). The totals for officers, supervisors, and command staff include the cross-agency team, which consisted of three command staff, three supervisors, three officers, and three community members.

11 Ninety-seven individuals returned the CP-SAT, and on average 90 percent of those individuals attempted some part of the organizational transformation module (63 officers, 18 supervisors, and 6 command staff). The totals for officers, supervisors, and command staff include the cross-agency team, which consisted of three command staff, three supervisors, three officers, and three community members.




