
Census of Public Defender Offices

OMB Control Number 1121-0095

OMB Expiration Date: XX/XX/XXXX


B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Universe and Respondent Selection


The Census of Public Defender Offices (CPDO) will be administered to all eligible public defender offices in the United States and territories. Over the course of nearly 6 months (from approximately May through November 2023), the National Association for Public Defense (NAPD) led the effort to build the universe of public defender offices in all 50 states, the District of Columbia, and the 5 U.S. Territories (OMB #1121-0339).


The universe was developed by identifying all offices that are publicly funded, have a physical address, employ at least one W-2 earning attorney, and provide direct public defense representation for adults and/or juveniles who are accused of a crime or delinquency or accused in a trial court of violating conditions of a sentence. Within this definition exists a variety of funding sources, office structures, specialties of defense provided, levels of urbanicity, and geographic coverage among public defender offices. Table 1 provides an overview of the types of jurisdictions identified through the universe development process.


To conduct this data collection, NAPD divided the states and the District of Columbia into two categories: those where a single body had oversight over all public defense programs in the state and those where every county would need to be investigated individually. For the statewide systems (n = 41), NAPD contacted the statewide oversight bodies by telephone and requested that each body identify and provide contact information (office and office leader) for every public defense program that met the eligibility criteria.


For the states lacking a single oversight body (n = 15), NAPD conducted county-by-county research online and cross-checked findings with county leaders in the state to ensure that the list was complete. As necessary, NAPD called known or identified offices in the state to request leader names or physical address information. Where possible, NAPD cross-referenced the draft list with State Bar directories and/or publicly posted professional service association lists. This effort resulted in a universe list of 1,892 public defense offices as of August 2024 (see Table 1). An additional round of checks for new offices will be conducted prior to the start of data collection. BJS estimates there will be about 2,000 offices at the time of data collection.



Table 1. Jurisdiction Types Identified during Frame Building

Jurisdiction Type | Definition | N
Cluster | Office is multi-county, or serves a judicial circuit or district consisting of more than one county | 562
County | Office serves one county, an independent city/county equivalent, or a judicial district/circuit containing only one county | 1,054
Municipal, County, or State (Other) | Office is a non-profit public defender office serving indigent clients in municipal or county cases, sometimes on a contract basis across an entire state | 2
Statewide | Offices (usually in statewide systems and often specialty offices like appellate defenders, juvenile defenders, etc.) that take cases across an entire state | 44*
Territorial | Office serves one of the U.S. Territories | 8
(blank) | Jurisdiction unknown | 220
Total |  | 1,892**

*Number exceeds the number of statewide systems identified earlier due to specialty offices.

**The total represented in the table reflects the status of the frame as of August 2024.


Six to eight weeks prior to the start of the 2025 CPDO data collection, NAPD will review the list of public defender offices and reach out to statewide contacts to confirm office-level contact information in states where re-districting is underway. Throughout the remainder of the project, the office list will be updated with any leadership changes or new contact information discovered during data collection. This ongoing frame maintenance will ensure that the most current public defender leadership and office contact information is used for data collection.



2. Procedures for collecting information


Data collection for the CPDO will involve a series of communication efforts using multiple modes. Table 2 summarizes the office contacts and non-response follow up.



Table 2. Contact Procedures


Contact | Attachment | Contact Description | Timing of Contact | Details
1 | 7 | Pre-notification letter from BJS with CPDO overview (mail) | 1 week before data collection begins | Send to all offices via USPS
2 | 8 | Invitation letter with the survey link (email) | Start of data collection | Send to all offices via email
3 | 9a, 9b | Ad hoc survey link (email) | Throughout data collection | Send to offices requesting the survey link on an ad hoc basis or when sections are delegated to staff within the office
4 | 10a, 10b, 10c, 10d, 10e | Reminder message from project team and BJS (email) | 2 weeks after data collection begins, and then once per week until week 36 | Send to non-responding offices via email
5 | 11 | Reminder letter and hardcopy survey (mail) | 4 weeks after data collection begins | Send to non-responding offices via USPS mail
6 | 12 | Reminder calls from NORC (telephone) | 8 weeks after data collection begins, and every 3 weeks until the close of data collection | Call non-responding offices
7 | 13 | Reminder letter from BJS and hardcopy survey (mail) | 12 weeks after data collection begins | Send to non-responding offices via USPS mail
8 | 14 | Reminder from project team (mail) | 16 weeks after data collection begins | Send to non-responding offices via USPS mail
9 | 15 | Reminder calls from NAPD (telephone) | 16 weeks after data collection begins | Call non-responding offices
10 | 16 | Last chance communication postcard from BJS (mail) | 20 weeks after data collection begins (4 weeks before data collection closes) | Send to non-responding offices via USPS mail
11 | 17 | Thank you letter (email) | Upon survey completion | Send to each office as it completes the survey

Data collection will begin with a BJS pre-notification letter signed by the BJS Acting Director and mailed via USPS to each eligible public defender office (Attachment 7). The letter will inform each office of the upcoming CPDO data collection, the reason for the collection, the importance of the data to the public defender community, and contact information for any questions the office may have. One week after the pre-notification letter is sent, the project team will email the formal invitation letter and unique login information to each office (Attachment 8). Ad hoc requests for the survey link and office PIN, as well as question delegation requests, will be handled via email (Attachments 9a-9b). Two weeks after data collection begins, weekly email reminders will be sent to non-responding offices (Attachments 10a-10e).


It is anticipated that the web-based format will be the preferred mode of response. However, for non-responding offices, a mailing including a reminder letter (Attachment 11), a hardcopy survey, and a postage-paid business reply envelope will be sent via USPS four weeks after the initial email invitation. These materials will include a numeric case ID and bar code so returned surveys can be easily reconciled with the frame. The letter will include contact information for the data collection agent and will also inform the office that the survey can be completed by web using the provided unique PIN or by telephone, if preferred.


Ten weeks after the start of data collection, outbound telephone calls will be made to non-responding offices (Attachment 12). These calls will confirm that the office received the CPDO materials, respond to any questions and determine the respondent(s) for the office. Twenty weeks after the start of data collection, a second hardcopy survey and a reminder letter (Attachment 13) will be mailed via USPS first class mail. A postage-paid business reply envelope will be included in the packet. Four weeks after the hardcopy survey mailing, a reminder (Attachment 14) will be mailed to non-responding offices and telephone outreach from NAPD (Attachment 15) will begin. Four weeks prior to the close of data collection, a last chance postcard (Attachment 16) will be sent to non-responding offices.


The web-based survey is programmed so that sections may be delegated to others within the office. It is BJS’s understanding that contributions from several staff within the office may be needed to complete the survey, and the delegation functionality allows the primary office respondent to direct sets of questions to colleagues as needed. The process for delegation depends on how much of the form will be delegated. If the entire form is to be delegated, the respondent will share the online census link and office PIN with the designee. If a subset of questions is to be delegated, the respondent can select the delegate button within the online census and provide the name and email of the designee, and that designee will be sent an email with a unique link and a request to complete the questions. The designee will only see the questions that have been delegated and will not be able to see responses to any other questions.


At the end of the web-based survey, respondents will have the option to print their responses. Offices will also receive a thank you email (Attachment 17) upon survey completion. Offices completing the hardcopy survey will be instructed to place the completed survey form in the provided business reply envelope and return it by mail. They may also scan or photograph the completed form and send the images to the project email. If an office prefers to complete the CPDO survey by telephone, the data collection agent will enter responses directly into the web-based form on behalf of the office.


Data Processing

Upon receipt of a survey (web or hardcopy), data will be reviewed and edited, and if needed, the respondent will be contacted to clarify answers or provide missing information. Respondents who submit via web will receive real-time validation prompts when they enter missing or inconsistent data. For any items that remain unresolved after the respondent submits, NORC staff will recontact the respondent to attempt to resolve the issues. The following is a summary of the data quality assurance steps that NORC will take during the data collection and processing period:

Data Editing. NORC will reconcile missing or erroneous data through automated and manual edits of each questionnaire. In collaboration with BJS, NORC will develop a list of edits that can be completed by referring to other data provided by the respondent on the survey instrument. For example, if a screening question was left blank, but the follow-up questions were completed, a manual edit could be made to indicate the intended positive response to the screening question. Through this process, NORC can quickly identify which hardcopy cases require follow-up and indicate the items that need clarification or retrieval from the respondent.
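As an illustration of how one such automated edit rule might be specified, the following minimal Python sketch uses hypothetical item names (screener_q, followup_a, followup_b) rather than actual CPDO items; the production edit specifications will be developed jointly by NORC and BJS. The rule fills a blank screener with the implied "Yes" when any follow-up item was answered and flags the record as edited.

import pandas as pd

def apply_screener_edit(df: pd.DataFrame) -> pd.DataFrame:
    """If a screener item is blank but any follow-up item was answered,
    set the screener to the implied 'Yes' and flag the record as edited."""
    df = df.copy()
    followups_answered = df[["followup_a", "followup_b"]].notna().any(axis=1)
    needs_edit = df["screener_q"].isna() & followups_answered
    df.loc[needs_edit, "screener_q"] = "Yes"
    df["screener_edit_flag"] = needs_edit.astype(int)  # 1 = value supplied by edit rule
    return df

# Example: the third (hypothetical) record has a blank screener but answered
# follow-ups, so the edit rule supplies 'Yes' and flags it.
example = pd.DataFrame({
    "screener_q": ["Yes", "No", None],
    "followup_a": [3, None, 5],
    "followup_b": [1, None, None],
})
print(apply_screener_edit(example))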

Data Processing. When the project team identifies a potential data issue, such as missing or inconsistent answers, an NORC professional staff member will contact the respondent for clarification. Throughout the data retrieval process, NORC will document the critical questions needing retrieval (e.g., missing or inconsistent data elements), request clarification on the provided information, obtain values for missing data elements, and examine any other issues related to the respondent’s submission.

Data Entry. Respondents completing the survey via the web instrument will enter their responses directly into the online instrument. For those respondents returning the survey via hardcopy (mail or fax), data will be entered once received and determined complete. Throughout the remainder of the data collection period, NORC staff will conduct regular data frequency reviews to evaluate the quality and completeness of data captured in both the web and hardcopy modes.


3. Methods to Maximize Response


The 2007 CPDO achieved a 97% response rate among county-based offices and a 100% response rate among state public defender programs. BJS anticipates the 2025 CPDO will achieve an 85% response rate through the activities described below.


Instrument development. In developing the CPDO instrument, the project team’s priority was to make an instrument consistent with the 2007 CPDO to permit comparisons over time. The team also sought to address known problems with the wording of that instrument, to update the instrument to reflect contemporary concerns, and to add questions on matters relevant to defense practices that have emerged since the last survey.


Because the CPDO instrument is intended to be sent to defender offices across the nation, consultation with subject matter experts was an essential part of instrument preparation. The process of revising the instrument began with the project team engaging in a series of meetings to discuss and update the 2007 CPDO instrument. The team developed two lists of items: revisions it recommended BJS make immediately and revisions requiring discussion with the panel of subject matter experts.


Immediate revisions recommended by the team and accepted by BJS included:


  • Straightforward wording updates (e.g., dates were changed to a 2024 reference period, and ‘piped’ responses were added to questions in Section B);

  • Modifications to questions about attorney attrition and turnover, responding to new concerns in the field about the seriousness of this issue, and removing a question format that required respondents to perform mathematical operations;

  • Deletion of ‘identity theft’ questions that were added by BJS as an area of special interest in 2007;

  • Revisions to response options for mutual exclusivity on questions about case counting methods, horizontal representation, and timing of entry of attorneys into cases;

  • Rewriting or elimination of certain low value questions (e.g., replacement of questions on standards in place within offices with questions on office policies; replacement of questions on whether investigative services are ‘available’ to attorneys with questions on investigator staffing and salaries);

  • Addition of ‘estimate’ checkboxes to demanding questions requiring specific numerical responses.


The project team met with a subject matter expert panel in Salt Lake City, UT, in Summer 2023 to discuss issues where input from the field was especially important. As a result of the expert panel meeting, the following changes were made:


  • Enhancements to questions about oversight of offices, including the types of powers usually reserved to oversight boards;

  • Enhancements to questions about office policies, incorporating new areas in which offices may have policies;

  • Enhancement to questions about data collection systems to reflect recent developments in technology;

  • Revisions to questions regarding financial eligibility screening of defendants to account for the limited role some defenders play in these processes;

  • Elimination of ambiguous or unfamiliar terms and vocabulary (e.g., ‘alternative defender office’; ‘management attorney’);

  • Reducing the burden of questions by asking for data in challenging areas (i.e., staffing, caseloads, and expenditures) in the most straightforward ways possible.


The final instrument (Attachment 1) will be web-based and supported by various online help functions to maximize response rates. The project email and toll-free number will appear on each page of the web survey. The survey is programmed so that responses are captured in real time as they are entered, allowing the respondent to complete the survey as time permits. A delegation feature was also programmed that allows subsections of the survey to be sent to other knowledgeable staff within the office.


Outreach and recruitment efforts. Recruitment strategies for the CPDO were developed to maximize response rates. The CPDO is a complete enumeration of public defender offices in the United States, and diverse communication methods will be used to inform the public defender community of the importance of the project and how the data will be used, and to encourage participation. BJS will utilize several approaches to communicate the CPDO activities:

  • Develop pre-census administration digital marketing strategies inclusive of social media campaigns and the creation of videos/print/digital material for presentations at annual defender trainings and events. NAPD will use its own platforms and collaborate with other national and state associations in order to educate the broadest audience of public defense programs/program leadership about the goals and benefits of CPDO. (One month prior to launch.)

  • Distribute project updates via the data collection agent, NAPD’s social media platforms, the NAPD website, NAPD’s monthly newsletter, and various roundtables and convenings with other public defender advocacy and membership programs. (Throughout data collection.)

  • Engage NAPD’s Network of Leaders to promote the survey through its Virtual Town Hall for Executives, Executives listserv and Executives-oriented online and in-person events. (Throughout data collection.)

  • Communicate about the CPDO survey via the NAPD listserv, newsletter and at online leadership forums. (Throughout data collection.)

  • Vary nonresponse outreach during data collection to include communication from BJS, NORC, and NAPD.


Each of these activities will link to the CPDO project page hosted by the data collection agent. This web page lists the co-Principal Investigators, BJS staff and CPDO Expert Panel members.

Nonresponse follow-up protocols. Public defender offices that do not complete the survey will be contacted through a series of follow-up prompts reminding them to complete the survey. This approach is consistent with best survey practices and moves from the least costly to the most costly outreach. Table 2 (above) outlines the follow-up contacts. The schedule and source of contacts will remain flexible as response rates to earlier outreach efforts are closely monitored; subsequent outreach will take into account whether notable upticks in response are generated by contacts from BJS, NAPD, or NORC.


Addressing nonresponse


Weighting for Unit Non-Response


Although it is anticipated that the steps mentioned above will result in participation from 85% of offices, some offices will likely not complete the survey in a timely manner. While the CPDO is intended to be a census, the final list of responding offices is likely to be a nonrandom sample of the study population due to differential response rates across subpopulations. NORC will develop an analysis weight for each respondent through a two-step weighting adjustment procedure. The base weight is 1 for all sample members because the collection is a census. The first step is an eligibility adjustment: if any offices are found to be ineligible, their base weight will be set to zero at this step. Based on rough estimates from the 2007 CPDO, about 6% of defender offices on the current roster are expected to be ineligible for the CPDO.


The second step is a nonresponse weight adjustment step. Through this adjustment, the weight carried by non-respondents is transferred to respondents within each adjustment cell so each responding office will represent a portion of the non-responding offices and the sum of the weights will be the total number of eligible offices. To determine which variables should be used to create the adjustment cells, a non-response bias analysis to compare response rates among different subgroups will be carried out. Variables analyzed may include the office structure, state, and American Community Survey (ACS)1 variables. Variables where subgroups have the largest differential response rates will be used to define the adjustment cells for the nonresponse weight adjustment. The details of these analyses will be included in a non-response bias analysis report. To avoid introducing unnecessary weight variation, each adjustment cell must contain at least 20 cases.
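A minimal sketch of this two-step adjustment is shown below, assuming a frame file with hypothetical columns eligible (0/1), respondent (0/1), and adjustment_cell; it is illustrative only and not the production weighting specification. Within each adjustment cell, respondent weights are inflated so they sum to the number of eligible offices in that cell.

import pandas as pd

def two_step_weights(frame: pd.DataFrame) -> pd.DataFrame:
    """Step 1: census base weight of 1 for eligible offices, 0 for ineligible.
    Step 2: within each adjustment cell, transfer nonrespondent weight to
    respondents so respondent weights sum to the cell's eligible count."""
    df = frame.copy()
    df["base_wt"] = df["eligible"].astype(float)           # census base weight
    df["resp_wt"] = df["base_wt"] * df["respondent"]       # weight held by respondents
    cell_eligible = df.groupby("adjustment_cell")["base_wt"].transform("sum")
    cell_resp = df.groupby("adjustment_cell")["resp_wt"].transform("sum")
    factor = (cell_eligible / cell_resp).where(cell_resp > 0, 0.0)
    df["final_wt"] = df["resp_wt"] * factor                # nonresponse-adjusted weight
    return df

# After adjustment, final_wt for respondents sums to the number of eligible
# offices within each adjustment cell, and to the eligible total overall.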

As mentioned above, the response rates were high in 2007, with 97% for the county-based offices and 100% for the state offices. However, for the 2025 CPDO, BJS expects about an 85% response rate. Lower response rates increase the risk of bias, resulting in differences between the sample estimates and the target population's actual values. If the final response rate falls below 85%, NORC will conduct a more thorough non-response bias analysis to assess the weighting and, if necessary, improve the sample weights. This analysis will involve comparing the characteristics of respondents and non-respondents to identify any significant discrepancies, comparing the sample estimates to external sources, comparing estimates using the base weights to the adjusted weights, and evaluating survey metadata such as response dates, ranks, and lengths in relation to the response behavior.


Imputation for Item Non-Response


While high item response is anticipated, offices may leave items blank because they lack access to the information or because the data are not recorded in the office’s records management system. NORC proposes to use hot-deck and multiple imputation methods to ensure a complete data file. For items with non-response rates below 5%, imputation will not be used.

For items with non-response rates of 5-20%, hot-deck imputation is a cost-efficient method that preserves relationships between variables observed in the non-missing data. For each variable to be imputed using hot deck, the file will be sorted by variables that are correlated with the variable to be imputed. These “sort” variables will be chosen based on models in which the variable to be imputed is the dependent variable and the independent variables (sort variable candidates) are other questionnaire items or variables known for all eligible offices and considered for weighting (see above). The initial models will be formed based on subjective knowledge of which independent variables are associated with the missing variable, its response status, and the design variables. Several modeling techniques can be used, and NORC plans to use tree-based classification and regression methods. The sort variable candidates can be quantitative (continuous and discrete) or categorical, but quantitative sort variables will be converted to categorical variables during imputation so that later variables in the sort order can still affect the sorting of the file. Once the file is sorted, hot-deck imputation uses the nearest neighbor as the donor for the missing value. Each variable to be imputed will be sorted according to its own set of sort variables. This single-imputation method does, however, understate variance.
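The following minimal Python sketch illustrates the sort-and-fill logic of sequential hot-deck imputation described above; the column names (office_type, size_class, felony_cases) are hypothetical stand-ins, and the production sort variables would come from the tree-based models noted above.

import pandas as pd

def hot_deck_impute(df: pd.DataFrame, target: str, sort_vars: list) -> pd.Series:
    """Sort the file by categorical sort variables, then fill each missing value
    from the nearest preceding donor (falling back to the nearest following
    donor at the start of the sorted file)."""
    ordered = df.sort_values(sort_vars, kind="stable")
    imputed = ordered[target].ffill().bfill()
    return imputed.reindex(df.index)  # restore the original row order

# Hypothetical example: a missing caseload count is filled from the nearest
# office of the same type and size class in the sorted file.
offices = pd.DataFrame({
    "office_type": ["county", "county", "statewide", "county"],
    "size_class": ["small", "small", "large", "large"],
    "felony_cases": [120.0, None, 4800.0, None],
})
offices["felony_cases_imputed"] = hot_deck_impute(
    offices, "felony_cases", ["office_type", "size_class"]
)
print(offices)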

If there is a high non-response rate for certain items (20 to 60 percent), hot deck will not work as well due to the necessary re-use of donors. In this case, one routinely used option for variables with high non-response is multiple imputation. Multiple imputation involves building models for multiple related variables, all of which are imputed together or sequentially. The missing values are imputed several times, replacing them with plausible values to create multiple complete datasets, and each complete dataset is used to estimate the parameter of interest. The estimates will differ from each other, reflecting the randomness of drawing from the plausible values; they are then pooled to obtain one estimate and its variance. Multiple imputation can better reduce non-response bias in items with high missing data rates (20% or greater) than hot-deck imputation.
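As a rough sketch of how multiply imputed estimates could be pooled, the fragment below creates m completed datasets and combines the estimated mean of one item using Rubin's rules. It uses scikit-learn's IterativeImputer purely for illustration; the actual imputation models, items, and software are assumptions here and would be specified during data processing.

import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

def pooled_mean(X: np.ndarray, col: int, m: int = 5):
    """Impute the data matrix X m times, estimate the mean of column `col`
    in each completed dataset, and pool the results with Rubin's rules."""
    n = X.shape[0]
    means, within = [], []
    for seed in range(m):
        imputer = IterativeImputer(sample_posterior=True, random_state=seed)
        completed = imputer.fit_transform(X)
        means.append(completed[:, col].mean())
        within.append(completed[:, col].var(ddof=1) / n)  # variance of the mean
    qbar = np.mean(means)                    # pooled point estimate
    ubar = np.mean(within)                   # within-imputation variance
    b = np.var(means, ddof=1)                # between-imputation variance
    total_variance = ubar + (1 + 1 / m) * b  # Rubin's total variance
    return qbar, total_variance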

Variables with missing rates greater than 60% will not be used for analysis or appear in BJS statistical reports, but will be retained in the archived file.

Any imputed values will be merged into the final data file (and flagged) prior to delivery to BJS and subsequent archiving of the data for public use. For singly imputed items, the values will be inserted and flagged in both the internal BJS and external public data, with documentation identifying the variables that contain singly imputed values. If multiple imputation is necessary for some variables, the multiple datasets containing imputed values will be provided to BJS for both internal and external public use, along with documentation identifying the multiply imputed variables and programming code to create valid point and variance estimates. As with the singly imputed variables, multiply imputed values will be flagged.


4. Testing of Procedures


The CPDO project team (NORC, the Urban Institute, NAPD, and Dr. Andrew Davies) conducted cognitive testing of the 2024 CPDO instrument under BJS’s generic clearance (OMB No. 1121-0339) from February through June 2024.


Cognitive testing


The project team completed cognitive testing of the CPDO instrument between February and June 2024 to assess its quality and to evaluate whether offices had the information needed to respond to the draft questions. A total of 12 offices participated in a pretest of the instrument by returning completed responses for the CPDO questionnaire. Of those 12 offices, the project team conducted cognitive interviews with 11 offices. The chief public defender at the 12th office was unable to complete the cognitive interview.

BJS and the project team sought to test the instrument on various types of offices while also ensuring geographic diversity. Accordingly, the 11 offices that completed the CPDO instrument and participated in cognitive testing represented the following subsets/office types:

  1. Single county office in a county-based public defense state

  2. Multi-county office in a county-based public defense state

  3. Juvenile office

  4. Office in a U.S. territory/county equivalent

  5. Office within a statewide jurisdiction

  6. Office in a rural jurisdiction

  7. Office in an independent city

  8. Office with a mix of state/county funding

  9. Office in an urban jurisdiction

  10. Alternate/conflict defender

  11. Office with an elected public defender


The cognitive interviews focused on readability, question clarity, adequacy of response options, comprehensiveness, and respondent burden. The instrument instructions and screener questions were also considered and discussed with respondents.

The cognitive test was conducted using a hardcopy version of the instrument to focus respondent feedback on survey content rather than usability and functionality. Data collection began with an emailed invitation letter requesting participation from the office lead. As consent to participate in the cognitive test was received, each office was sent the hardcopy instrument via email and USPS mail. Completed instruments were received on a flow basis. Upon receipt of a completed instrument, a cognitive interview was scheduled to review feedback on the draft survey questions, to discuss burden and to review responses. The cognitive testing final report is included in Attachment 6.

Instructions and Screener Questions. Regarding the instructions for the CPDO, one cognitive test respondent indicated that he was confused about whether he needed to fill out the questionnaire once or twice – since in his jurisdiction, he is simultaneously the head of both a public defender office and a conflict defender office. The project team concluded that this confusion could be avoided by amending Instruction #2 (cover page) to add the following text: “If you are the head of more than one public defender office, you should receive multiple solicitations for this survey: we ask that you fill out one questionnaire for each office. If you need more copies of the questionnaire, please contact the CPDO team via email at [email protected].”

General Information. Section A of the CPDO captures general information about oversight, funding, and the total operating expenditures for each public defender office. Based on the cognitive test responses, several revisions were made as described below, and two new questions were added to this section. For item A1, the first response option was edited from “An entire state” to “An entire state or United States territory.” For items A2, A3, A5, and A6, the “don’t know” response option was eliminated. For items A5 and A6, the words “state or territory-level body” replaced “state-level board or commission.” Furthermore, in A5 and A6, response options were revised to allow for selecting yes or no for “Funded” and “Overseen” options. In addition, the following two new questions about the unionization of attorneys and staff were added (A9 and A10).

The question text for item A12 was revised to align with wording from the 2007 instrument, which clarifies for respondents that one-time costs should be excluded. Also, the word “criminal” was added so that the item reads: “provide criminal public defense services to defendants”. This change was made because some cognitive test respondents who represent other types of clients mistakenly thought they should include expenditures on that other work in the budget figure they provided for this question. Original item A13 was removed because the list of budgetary earmarks was insufficient. Finally, for what is now item A13, response choice (e) was revised from “Private funds” to “Fees charged to clients for representation” because public defenders rarely receive other private funding. The project team determined that any private funding would be better categorized under response (f) Other (please describe).

Staffing. Section B of the CPDO captures information about staffing in public defender offices, including attorney demographics. Based on cognitive test feedback, several revisions were made to item response categories and a new question was added to this section. First, for line item B1(a), a decision was made to include the chief public defender, so the new text reads: “Attorneys, including the chief public defender, with management or supervisory responsibilities over other attorneys”. As a quality control measure on the web instrument, if a respondent does not enter any value in B1(a), a soft prompt will be added asking: “Are you sure this is correct? Remember, the chief public defender should be included in these numbers.” The respondent will be able to confirm that it is, in fact, correct and continue, or the respondent could revise their response. A question was added following B1 to determine the number of hours per week an attorney must work to be considered full-time.

In response to the question on the gender of attorneys (B2), several cognitive test respondents objected to including only male/female response options. However, to date, there is a lack of best practices for proxy reporting of gender on administrative collections, and BJS has chosen to align with previous administrative surveys by asking about attorney sex. To maintain consistency, question B15 was also changed to ask about sex.


To align with the new race standards articulated in Statistical Policy Directive 15 (SPD 15), question B3 response options (g) “Middle Eastern or North African alone” and (h) “Multiracial and/or Multiethnic” were added. B16 aligns with the SPD 15 minimum race categories and uses a mark-all-that-apply option. See Part A, Question 7, for more explanation of why BJS deviated from the detailed race categories.

For item B6, which asks for the minimum and maximum salary for various staff positions, the word “annual” was added before salaries in the question for clarity.

The team also learned that the question text for item B7 regarding staff attrition was poorly worded (several cognitive testers reported that it made no sense as written). Therefore, item B7 has been revised by adding “at any time during the calendar year 2024 (January 1, 2024 – December 31, 2024)” in place of “as of December 31, 2024”.

For item B10, the question about the final authority that selects the chief public defender for the office, BJS added another response option (“Chair of state-level board or commission”) based on cognitive test feedback.

Caseloads. Section C of the CPDO census records caseload information for public defender offices. At the start of Section C, an instruction specifying the reference period has been added. Based on feedback received about the inadequacy of response options for question C1, which asks about case types or categories, another category for ‘life without parole’ was added.

Additional input from several cognitive test respondents suggested the need for another category (’Failure to pay child support’) in the list of case types. There was one cognitive test respondent who suggested adding ‘Children in Need of Supervision’ (CHINS) as a separate category, but the team declined to do this because CHINS is not a universally understood term across the country, and it is redundant with other categories. For consistency with C1, ‘life without parole’ and ‘failure to pay child support’ were added to item C3 asking for the numbers of cases received by case type.

Eligibility for Services. Section D of the census collects information from public defender offices about eligibility for services. Based on cognitive test feedback, BJS added a third response option to the first question in this section (item D1): “Not applicable – no screening process is used for persons seeking representation from our office”. Any respondent who selects ‘Not applicable’ will skip to question D5. For item D2 (financial eligibility criteria), another column (“Some courts consider, others do not”) was included to account for variety within jurisdictions in eligibility determination procedures. This change was made based on recommendations from cognitive test respondents. BJS also added the phrase “to the best of your knowledge” to the D2 question text. Finally, based on feedback from respondents, BJS rephrased question D4 to focus on how a person qualifies for counsel rather than how a person is denied counsel.

Office Resources. The final section of the census (Section E) captures information about office resources within public defender offices. Regarding question E1 about office policies, cognitive test respondents noted that two important office policies (‘leave policy’ and ‘professional development policy’) should be added to this question and are now items E1(i) and E1(m). In E6, “volume of digital discovery evidence” was added to acknowledge the demands placed on technological infrastructure by digital evidence. In addition, one respondent reported that the use of the word “communication” in item E7(h) was unclear/too vague. Therefore, BJS replaced the phrase “stores all communication” with “stores information about communication with clients” for this item, to eliminate confusion. Finally, the original theme of digital evidence in E8 was retained while the wording and response option structure was changed to align with other BJS administrative surveys capturing the types of digital evidence and storage devices reviewed by offices.

Burden. Cognitive test respondents reported an average burden of 61.6 minutes to complete the CPDO census questionnaire. The burden reported by those offices who participated in the pretest ranged from a minimum of 15 minutes to a maximum of 120 minutes.



5. Consultants on the Statistical Aspects of the Design


The Judicial Statistics Unit of BJS takes responsibility for the overall design and management of the activities described in this submission, including data collection procedures, development of the questionnaires, and analysis of the data. The following individuals were consulted on statistical aspects of the design and on collecting and analyzing the data:


Bureau of Justice Statistics Contacts

Ryan Kling, Statistician and Project Manager, (202) 704-0076

George (Ebo) Browne, PhD, Statistician, (202) 598-1395

Suzanne Strong, PhD, Chief, Judicial Statistics Unit, (202) 880-7387


Persons consulted on data collection, analysis, and methodology

William Adams, MPP, Urban Institute

Andrew Davies, PhD, Deason Criminal Justice Reform Center

Heather Hall, Consultant

Jeanette Hussemann, PhD, NORC at the University of Chicago


1 https://www.census.gov/programs-surveys/acs/data.html


