Assessment of the National Health Security Index for Public Health Preparedness


OSTLTS Generic Data Collection Request

OMB No. 0920-0879



Supporting Statement – Section B


Submitted: 6/9/2020




Program Official/Project Officer

Shoukat Qari, DVM, PhD

Senior Health Scientist

Office of Applied Research (OAR)

Office of Science and Public Health Practice (OSPHP)

Center for Preparedness and Response (CPR)

Centers for Disease Control and Prevention

(770) 488-8808

[email protected]

1600 Clifton Road Mail Stop H21-6

Room 6224

Atlanta, Georgia 30329



Section B – Data Collection Procedures


  1. Respondent Universe and Sampling Methods


The respondent universe consists of 278 respondents: 26 state, 8 territorial, and 244 local government employees who have primary responsibility for overseeing public health emergency preparedness and response (PHEP) activities for their jurisdiction. These roles include state and territorial directors of public health preparedness, as well as county, city, and regional public health preparedness coordinators. Two separate data collection methods will be used to collect data from these respondents: (1) an Online Assessment Instrument; and (2) a Semi-Structured Interview Guide Instrument.


The respondent types to be recruited for the Online Assessment Instrument include public health preparedness officials from 26 state governments, 244 local governments, and 8 territorial governments.


The respondents to be recruited for the Semi-Structured Interview Guide Instrument are a small subset (n=42) of those who complete the Online Assessment Instrument: 5 state preparedness directors, 1 territorial preparedness director, and 36 local preparedness coordinators.


Online Assessment Instrument


A total of 278 respondents will receive the Online Assessment Instrument. The selection criteria for this sample are as follows (an illustrative selection sketch appears after the list):

  • Six states will be randomly selected from each of the 4 U.S. Census regions, with 2 additional states randomly selected from the more populous Southern region. The state public health preparedness director in each of the 26 sampled states will receive the assessment.

  • U.S. local government public health agencies will be stratified into the 4 U.S. Census regions, and further stratified into metropolitan, micropolitan, and non-metropolitan area classifications based on Census definitions, for a total of 12 strata. From each of the 12 strata, 20 local health departments will be randomly selected without replacement, for a total of 240 agencies. The designated local PHEP coordinator in each selected local agency will receive the assessment.

  • The local public health preparedness director in each of the four large metropolitan local public health agencies that are directly funded by the CDC PHEP program (New York, District of Columbia, Los Angeles, and Chicago) will be purposefully selected to receive the assessment.

  • The territorial public health preparedness director in each of the 8 U.S. territories that receive CDC PHEP program funding will be purposefully selected to receive the assessment.
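
To make the sampling design above concrete, the following minimal sketch shows how the stratified random selection could be drawn. It is illustrative only: the input data structures, field names (region, metro_class), and random seed are hypothetical, and the project team's actual selection may be carried out in SAS, Stata, or R rather than Python.

import random

random.seed(2020)  # fixed seed so a given draw can be reproduced

def select_states(states_by_region, per_region=6, extra_south=2):
    """Randomly select 6 states from each of the 4 Census regions, plus 2
    additional states from the South, for a total of 26 states."""
    chosen = []
    for region, states in states_by_region.items():
        n = per_region + (extra_south if region == "South" else 0)
        chosen.extend(random.sample(states, n))
    return chosen

def select_lhds(lhd_frame, per_stratum=20):
    """Randomly select 20 local health departments (LHDs) without replacement
    from each of the 12 strata defined by 4 Census regions x 3 metropolitan
    classifications (metropolitan, micropolitan, non-metropolitan)."""
    strata = {}
    for lhd in lhd_frame:  # each lhd is a dict, e.g. {"agency_id": ..., "region": ..., "metro_class": ...}
        strata.setdefault((lhd["region"], lhd["metro_class"]), []).append(lhd)
    selected = []
    for _, members in sorted(strata.items()):
        selected.extend(random.sample(members, min(per_stratum, len(members))))
    return selected  # 12 strata x 20 agencies = 240 randomly selected LHDs

The 4 directly funded large metropolitan agencies and the 8 territorial agencies are then added purposefully to the 26 randomly selected states and 240 randomly selected local agencies, yielding the full sample of 278 respondents.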


Semi-Structured Interview Guide


Respondents from five state jurisdictions and one territorial jurisdiction will be selected to participate in interviews. These selected jurisdictions are California, Florida, Kentucky, New York, Texas, and Puerto Rico. These jurisdictions were purposefully selected to reflect diversity in geographic region and recent experience with public health emergencies. Within each jurisdiction, 7 individuals will be purposefully selected to participate in interviews, resulting in a total of 42 respondents for the semi-structured interview portion of this information collection. These respondents will be selected from the larger pool of respondents to the Online Assessment Instrument so as to achieve diversity with respect to rural and urban jurisdiction designation and familiarity and experience with Index use as reported in the Online Assessment.


The selection criteria for respondents in each state/territory are as follows:

  • The director of public health preparedness for each selected state and territory.

  • Three local public health preparedness coordinators who serve rural jurisdictions in each selected state and territory and who responded to the Online Assessment Instrument.

  • Three local public health preparedness coordinators who serve urban jurisdictions in each selected state and territory and who responded to the Online Assessment Instrument.


The total number of unique respondents to be recruited for qualitative data collection using the Semi-Structured Interview Guide Instrument will be 42 persons. Respondents will be assigned to one of three respondent subgroups corresponding to the three types of qualitative data to be collected using the Semi-Structured Interview Guide Instrument:

Subgroup 1: Index Use and Utility Subgroup – key informant interviews (18 respondents)

Subgroup 2: Core Metrics Subgroup – key informant interviews (18 respondents)

Subgroup 3: Index Tools Subgroup – key informant interviews (8 respondents)

Respondents will be assigned to one of the three subgroups based on their knowledge and use of the Index as self-reported on the Online Assessment and based on the recommendation of the state PHEP director. Each respondent will be asked to answer only the questions for their assigned subgroup and will not be asked to answer questions for other subgroups.



  2. Procedures for the Collection of Information


Data will be collected via an online assessment and key informant interviews, and respondents will be recruited through a notification (see Attachment D – Online Assessment Information Letter and Attachment F – Interview Information Letter) to the respondent universe. The notification email will explain:

  • The purpose of the data collection, and why their participation is important

  • Instructions for participating

  • The methods used to safeguard their responses

  • That participation is voluntary

  • The expected time to complete the instrument

  • Contact information for the project team


For the Online Assessment Instrument data collection process, all sampled respondents will be emailed and mailed the Online Assessment Information Letter (Attachment D) to ensure that respondents are contacted in the event of incorrect or blocked email addresses. The letter includes information on how to opt out immediately from the study via email, phone, or web link, and how to nominate an alternative respondent from the sampled agency. Approximately one week later, respondents who do not opt out will receive a second email containing the secure, encrypted REDCAP web link to the assessment. If a sampled respondent nominates an alternative respondent from their agency, this alternative respondent will receive the second email and all subsequent data collection communications. Respondents will be given an initial period of 14 days to complete the assessment and will receive 1 email reminder after the first 7 days. Non-responders after the initial 14-day period will receive a second email reminder and an additional 7 days to complete the assessment. Non-responders after 21 days will receive a third email reminder and a telephone call, along with an additional 7 days to complete the assessment. If the target response rate of 80% has not been reached by day 28, a fourth email reminder and a second telephone call reminder will be used, along with an additional 7-day response period. Data review and data cleaning will begin on or after day 35, but the data collection system will remain open to accept late responses through day 60. Sampled respondents who do not submit a usable response within 60 days will be coded as non-responders to the assessment.
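
As a minimal sketch of the follow-up timeline described above, the schedule below expresses each contact as an offset from the date the REDCAP assessment link is emailed. The function name and example start date are hypothetical, and the fourth reminder cycle applies only if the 80% response-rate target has not been met by day 28.

from datetime import date, timedelta

def assessment_followup_schedule(link_sent):
    """Key contact and processing dates for one sampled respondent,
    counted from the date the assessment link is emailed."""
    return {
        "first email reminder": link_sent + timedelta(days=7),
        "second email reminder": link_sent + timedelta(days=14),
        "third email reminder and phone call": link_sent + timedelta(days=21),
        "fourth email reminder and second call (if needed)": link_sent + timedelta(days=28),
        "data review and cleaning begin": link_sent + timedelta(days=35),
        "data collection window closes": link_sent + timedelta(days=60),
    }

# Example using a hypothetical distribution date:
for milestone, when in assessment_followup_schedule(date(2020, 9, 1)).items():
    print(milestone, when.isoformat())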


Data collection using the Online Assessment Instrument will occur using the REDCAP electronic data capture system, hosted on a secure computer server at the University of Colorado. Respondents will submit their responses through an easy-to-use, encrypted, web-based interface to the REDCAP server. Each respondent will be assigned a unique study identification number that will be used to track responses, so that no identifying information about the respondent or the jurisdiction will be collected through the REDCAP data system. Once the 60-day data collection window has ended for the assessment, the data file will be transferred from the REDCAP server to a secure data analysis server that also resides behind the university data protection firewall. Only approved members of the project team will be able to access data on the REDCAP server and the analysis server, using login and password credentials that are tightly monitored by the university data security system. The analysis file will be stored and analyzed only on secure, encrypted, and password-protected file servers and computers maintained at the university behind the data security firewall that meet all requirements for data security. All analyses will be conducted using SAS, Stata, and R statistical software. Because the project team includes members from three universities who will be involved in data analysis (University of Colorado, UCLA, and New York University), the final de-identified analytical data file will be created at the University of Colorado and then shared with team members at the two other universities, so that analyses can be performed in parallel. The REDCAP data system is deployed at all three universities and will be used to exchange the final de-identified analytical data file across the universities using a secure, encrypted protocol. After data exchange, team members at UCLA and New York University will store and analyze the de-identified data file on secure, encrypted, and password-protected servers and computers that meet requirements for data security.


Qualitative data collection using the Semi-Structured Interview Guide Instrument will begin approximately 6 months after completion of the Online Assessment data collection. Respondent identification and recruitment for the qualitative data collection will begin approximately 2 months before data collection begins. In each of the five selected states and one territory, we will use a snowball respondent selection process that begins by contacting the designated PHEP director for each state and territory (see Attachment F – Interview Information Letter) via email and phone to recruit their participation. During a follow-up phone call to confirm participation, we will identify dates to schedule the interview and solicit recommendations about additional local PHEP stakeholders to include in the qualitative data collection. The nominated respondents will then be classified based on rural or urban jurisdiction designation and based on levels of Index familiarity and use as reported on the Online Assessment. Six local respondents will then be purposefully selected for interview in each state and territory so as to achieve diversity in rural/urban designation and Index familiarity and use, giving preference to respondents who are nominated by the designated state or territorial PHEP director. Local respondents will be contacted via email and phone using the Interview Information Letter. Interviews will be conducted via telephone. All interviews will be audio-recorded, with verbatim transcripts generated from recordings. No identifying information for respondents and jurisdictions will be captured in the recordings or transcripts; instead, a unique study identification number will be assigned to each respondent and coded on each recording and transcript to enable tracking and linkage with other data sources. Electronic files for all recordings, transcripts, and interviewer notes will be maintained on encrypted and password-protected computer servers and computers in compliance with requirements for data security. These qualitative data files will be shared across project team members at the three universities using the REDCAP system, using the same procedures described above for quantitative data files. Qualitative data files will be coded and analyzed using ATLAS.ti and NVIVO text analysis software. Coded qualitative data files will be linked with the quantitative analytical file to support mixed-method analyses by using the unique study identifiers assigned to each qualitative and quantitative observation. This linkage and analysis will occur only on encrypted, password-protected servers and computers in compliance with requirements for data security.
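
The study-identifier linkage described above can be illustrated with the minimal sketch below. The record layouts and variable names (study_id, index_use_score, interview_subgroup) are hypothetical; actual linkage and analysis will occur only on the project's secure, encrypted servers using the statistical software named above.

def link_records(quantitative_rows, qualitative_rows):
    """Join de-identified assessment responses and coded interview data on the
    unique study identification number, keeping assessment records that have
    no corresponding interview (most respondents)."""
    interviews = {row["study_id"]: row for row in qualitative_rows}
    linked = []
    for quant in quantitative_rows:
        merged = dict(quant)
        merged.update(interviews.get(quant["study_id"], {}))
        linked.append(merged)
    return linked

# Example with hypothetical de-identified records:
quant = [{"study_id": "A001", "index_use_score": 3}, {"study_id": "A002", "index_use_score": 5}]
qual = [{"study_id": "A002", "interview_subgroup": "Index Tools"}]
print(link_records(quant, qual))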



  3. Methods to Maximize Response Rates and Deal with Nonresponse


Although participation in the data collection is voluntary, the project team will make every effort to maximize the rate of response. The data collection instruments were designed to allow for skipping questions based on responses to previous questions, thereby minimizing response burden.


Following the distribution of the invitation to participate in the quantitative data collection using the Online Assessment Instrument (see Attachment D – Online Assessment Information Letter), respondents will have 10 business days to complete the instrument. Those who do not respond within 10 business days will receive a reminder by email and phone call (see Attachment E – Online Assessment Follow-up Protocol) urging them to complete the instrument. Up to four cycles of reminders will be used. Those who do not respond within 10 business days of the last reminder email or phone call will be considered non-responders.


Following the distribution of the invitation to participate in the qualitative data collection using the Semi-Structured Interview Guide Instrument (see Attachment F – Interview Information Letter), the project team will contact each respondent by phone within 7 days to schedule the interview and conduct snowball identification of additional respondents. Those who do not respond within 10 business days will receive a reminder by email and phone call (see Attachment G – Interview Follow-up Protocol) urging them to complete the instrument. Respondents who are unreachable by phone or email within 10 business days of follow-up will be considered non-responders. All interviews will be scheduled for completion within 60 days of scheduling. Respondents who cancel or do not show up for scheduled interviews will be rescheduled within 10 business days. Respondents who fail to complete a rescheduled interview will be considered non-responders.





  4. Test of Procedures or Methods to be Undertaken


The estimate for burden hours is based on a pilot test of the two data collection instruments by 4 public health preparedness professionals. In the pilot test, the average time to complete the Online Assessment Instrument, including time for reviewing instructions, gathering needed information, and completing the instrument, was approximately 42 minutes (range: 32 to 52 minutes). For the purposes of estimating burden hours, the upper limit of this range (i.e., 52 minutes) is used.


The average time to complete the Semi-Structured Interview Guide Instrument, including time for reviewing instructions, gathering needed information, and completing the instrument, was approximately 57 minutes (range: 43 to 70 minutes). For the purposes of estimating burden hours, the upper limit of this range (i.e., 70 minutes) is used.



  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Shoukat Qari, DVM, PhD

Senior Health Scientist

Office of Applied Research

Center for Preparedness and Response

U.S. Centers for Disease Control and Prevention

Atlanta, GA

Phone: 770-488-8808

Email: [email protected]


Teresa Waters, PhD

Professor and Chair

Department of Health Management and Policy

College of Public Health

University of Kentucky

Lexington, KY

Phone: 859-323-7422

Email: [email protected]


Glen P. Mays, PhD, MPH

Professor and Chair

Department of Health Systems, Management and Policy

University of Colorado School of Public Health | Anschutz Medical Campus

13001 E. 17th Place, Mail Stop B119

Aurora, CO 80045

Email: [email protected]

Phone: 303-724-3759

Fax: 303-724-4495





David Eisenman, MD, MSHS

Professor and Director

Center for Public Health and Disasters, David Geffen School of Medicine, and Fielding School of Public Health, University of California-Los Angeles

200 UCLA Medical Plaza, Suite 420, Los Angeles, CA 90095

Phone: 310-794-2452

Fax: 310-794-0732

Email: [email protected]


David Abramson, PhD

Clinical Associate Professor and Director

Population Impact, Recovery and Resilience Program, College of Global Public Health, New York University

715/719 Broadway, New York, NY 10003

Phone: 212-992-6298

Email: [email protected]


Karl Ensign, MPP

Chief Program Officer

Association of State and Territorial Health Officials (ASTHO)

Arlington, VA

Phone: (571) 527-3143

Email: [email protected]


Laura Biesiadecki

Senior Director, Preparedness, Recovery and Response

National Association of County and City Health Officials

Washington, DC

Phone: 202-507-4205

Email: [email protected]



LIST OF ATTACHMENTS – Section B

  1. Attachment D – Online Assessment Information Letter

  2. Attachment E – Online Assessment Follow-up Protocol

  3. Attachment F – Interview Information Letter

  4. Attachment G – Interview Follow-up Protocol


