Challenges in Implementing CDC Immunization Funding in Health Departments


OSTLTS Generic Information Collection Request

OMB No. 0920-0879





Supporting Statement – Section B







Submitted: April 8, 2014



Program Official/Project Officer

Jack Nemecek, PhD, MBA

Public Health Advisor

Centers for Disease Control and Prevention (CDC)

National Center for Immunization and Respiratory Diseases

Immunization Services Division

Program Operations Branch

1600 Clifton Rd. N.E.                                                                         

Mailstop: A-19                       

Atlanta, GA  30333 

Telephone: 404-639-8219

Fax: 404-639-8615

[email protected]


Section B – Data Collection Procedures


  1. Respondent Universe and Sampling Methods

Respondents to the web-based instrument will be the universe of immunization program managers from the 64 state, local, and territorial health departments awarded immunization funding from three major federal funding streams. These health department awardees include 50 states, 8 territories, and 6 cities including the District of Columbia (see Attachment A – List of Health Department Awardees). The web-based instrument will be sent to the 64 health department awardees.


Respondents to the telephone interviews will be a sample of the immunization program managers who completed the web-based instrument. The goal of the interviews is to explore the responses to the web-based instrument in greater depth and gain a deeper understanding of experiences and examples within the health department environment. A stratified random sampling methodology will be used to select interviewees so that awardees who were more successful at utilizing their federal immunization funding can be compared with those who were not.


A total of 24 interviews will be conducted, with four interviewees randomly selected from each of six segments. The segments are defined by the size of the health department’s immunization program and by whether the awardee found it easy or hard to spend the money across all, or only some, of the federal funding it received, as indicated by survey responses in each of the funding sections. Each federal funding section of the web-based instrument contains a question asking whether it was very easy/easy/hard/very hard to spend that specific funding during the grant period. We recognize that some awardees may have received money from multiple funding streams and may have answered the question differently depending on the funding stream asked about. Therefore, interviewees will be stratified into the following six segments (an illustrative sketch of the selection procedure follows the list):

  1. Larger immunization programs (11+ FTE staff) who found it easy/very easy to spend the money across all funding streams (VFC, 317, special federal funding)

  2. Smaller immunization programs (10 or fewer FTE staff) who found it easy/very easy to spend the money across all funding streams (VFC, 317, special federal funding)

  3. Larger immunization programs (11+ FTE staff) who found it easy/very easy to spend the money across some of the funding streams, but hard/very hard to spend the money on other funding streams

  4. Smaller immunization programs (10 or fewer FTE staff) who found it easy/very easy to spend the money across some of the funding streams, but hard/very hard to spend the money on other funding streams

  5. Larger immunization programs (11+ FTE staff) who found it hard/very hard to spend the money across all funding streams (VFC, 317, special federal funding)

  6. Smaller immunization programs (10 or fewer FTE staff) who found it hard/very hard to spend the money across all funding streams (VFC, 317, special federal funding).
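

To make the selection procedure concrete, the following is a minimal sketch in Python of how four interviewees could be drawn at random from each segment. It is illustrative only: the field names ("fte_staff", "ease") and the summary coding of the ease-of-spending answers are assumptions for the sketch, not items from the instrument.

```python
import random

# Illustrative sketch only; not part of the approved protocol.
# Assumes each exported web-based response is a dict with hypothetical fields:
#   "fte_staff" - number of FTE immunization program staff
#   "ease"      - summary of the ease-of-spending answers across funding
#                 streams ("all_easy", "mixed", or "all_hard")
def assign_segment(response):
    size = "large" if response["fte_staff"] >= 11 else "small"
    return (size, response["ease"])

def select_interviewees(responses, per_segment=4, seed=1):
    random.seed(seed)  # fixed seed so the random draw can be reproduced
    segments = {}
    for r in responses:
        segments.setdefault(assign_segment(r), []).append(r)
    selected = []
    for members in segments.values():
        # draw up to four awardees at random from each segment
        selected.extend(random.sample(members, min(per_segment, len(members))))
    return selected
```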



  2. Procedures for the Collection of Information

Data collection will occur in two phases. Phase 1 will consist of a web-based instrument sent to 64 immunization program managers; Phase 2 will build on the earlier phase by exploring responses to the web-based instrument in more depth through telephone interviews with 24 immunization program managers.


In Phase 1, data will be collected through a web-based instrument administered using Qualtrics® and distributed to all individuals who comprise the respondent universe (immunization program managers at the 64 health department awardee sites). There will be only one response per health department awardee site.


The Qualtrics® tool will be the data collection instrument used to disseminate the questions and gather the data. This will reduce respondent burden by allowing respondents to complete the questions online at their own convenience. The data collection instrument was designed to collect the minimum information necessary for the purposes of this project (i.e., it is limited to a maximum of 36 questions, and skip patterns are employed so that respondents who skip sections are presented with fewer questions).


The Phase 2 interviews are intended to be brief and focused. They will be conducted with a sample of 24 respondents who completed the web-based instrument, to offset the limitations of closed-ended questions on the web-based tool and provide a more nuanced and comprehensive portrait of the health departments’ experiences. While the interview questions are open-ended, they build directly on the respondents’ answers to the web-based tool. Trained qualitative researchers will conduct the interviews by telephone using a semi-structured guide.


Following OMB approval, data collection will commence with the web-based instrument, announced via e-mail communication to all immunization program managers (see Attachment E – Web-based Instrument Introductory Email). The e-mail will contain a link to the web-based instrument, which is programmed in Qualtrics®, and will ask for a response to the instrument within a 2-week period (10 business days).


Two reminder emails using the same text will be sent (see Attachment F – Web-based Instrument Reminder Email). The first will be sent five business days (seven calendar days) after the initial e-mail; the second will be sent just prior to the 10th business day.


Two weeks after the close of the data collection period for the web-based instrument, a follow-up email will be sent to those respondents not selected for the interviews, thanking them for completing the web-based instrument (see Attachment G – Web-based Instrument Follow-up Email).


Upon completion of data collection for the web-based instrument, preliminary data analysis will be conducted within two weeks. After responses to the web-based instrument are examined, a sample of 24 respondents will be asked to participate in the next round of data collection, a telephone interview, to delve further into the specific challenges and facilitators related to utilizing federal immunization funding. These immunization program managers will be contacted via email thanking them for completing the web-based instrument and asking them to participate in the brief telephone interview (see Attachment H – Interview Introductory Email). Respondents will be asked to participate in an interview within a 2-week period (10 business days). If no response is received within 5 business days, a reminder email will be sent to non-responders (see Attachment I – Interview Reminder Email). At the close of the data collection period for the interviews, a follow-up email will be sent thanking respondents for their participation (see Attachment J – Interview Follow-up Email).


Data from the web-based instrument will be downloaded, cleaned, and analyzed in SPSS Version 20. Frequencies and bivariate analyses will be conducted for closed-ended questions to examine responses across all grantees or by type of grant. Open-ended questions on the instrument will be converted to text responses.
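

As a purely illustrative companion to this plan (the analysis itself will be run in SPSS Version 20), the brief sketch below shows the kind of frequency and bivariate tabulation intended, written in Python/pandas; the file name and column names are hypothetical.

```python
import pandas as pd

# Illustrative only; the actual analysis will be run in SPSS Version 20.
# "web_instrument_responses.csv", "grant_type", and "ease_317" are hypothetical
# names for the exported data file and two closed-ended items.
df = pd.read_csv("web_instrument_responses.csv")

# Frequencies for a closed-ended question across all grantees
print(df["ease_317"].value_counts())

# Bivariate tabulation: ease of spending by type of grant
print(pd.crosstab(df["grant_type"], df["ease_317"]))
```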


Qualitative data from the interviews will be imported into NVivo® qualitative data analysis software. The collected qualitative data will be coded and analyzed thematically; data analysts will identify key themes that emerge across groups of interviews by segment or other characteristics. The frequency and intensity of discussion of a specific topic will be key indicators used for extracting main themes.
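

The coding and analysis will be carried out in NVivo®; the short sketch below only illustrates the frequency indicator described above, i.e., tallying how often a coded theme appears overall and within each segment. The interview records and theme labels shown are hypothetical.

```python
from collections import Counter

# Illustrative only; coding and analysis will be performed in NVivo.
# Each record pairs an interview's segment with the theme codes applied to it
# (both the segment names and the theme labels here are hypothetical).
interviews = [
    {"segment": "large_all_easy", "themes": ["hiring delays", "carryover rules"]},
    {"segment": "small_all_hard", "themes": ["hiring delays", "late award notice"]},
]

# Frequency of each theme across all interviews (one indicator of a main theme)
overall = Counter(theme for i in interviews for theme in i["themes"])

# Frequency of each theme within each segment
by_segment = {}
for i in interviews:
    by_segment.setdefault(i["segment"], Counter()).update(i["themes"])

print(overall)
print(by_segment)
```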


  3. Methods to Maximize Response Rates and Deal with Nonresponse

For both the web-based instrument and the interviews, notification emails (see Attachments E and H) and reminder emails (see Attachments F and I) will be used to maximize response rates. The notifications and reminders will be sent to the potential respondent universe: 64 immunization program managers for the web-based instrument and a sample of 24 immunization program managers for the telephone interviews. If the person initially contacted to complete the questions is no longer with the health department or no longer in the role of immunization program manager, follow-up communication with the health department will be conducted to identify the current immunization program manager. The web-based data collection tool will be open for 10 business days, while the interviews will be conducted within a 10-day period several weeks later. We will request that both the web-based instrument and the interview be completed only by the individual receiving the email (the immunization program manager) so that only one response is received from each health department. Higher response rates will yield more reliable information; however, no scientific inferences will be made.


  4. Test of Procedures or Methods to be Undertaken

Web-based instrument: The web-based instrument was pilot tested by six (6) public health professionals who have worked in state or large local health departments, or closely with health departments, and have experience with federal immunization funding; they were thus similar in background and experience to the target respondents. In the pilot test, the average time to complete the web-based instrument, including time for reviewing instructions, gathering needed information, and completing the questions, was approximately 15-20 minutes. The upper limit of this range (20 minutes) was used for the purposes of our estimated burden hours.


Interview guide: The interview guide was pilot tested by two public health professionals who also completed the web-based instrument pilot test. In the interview pilot test, the average time to complete the telephone interview, including time for introductions, asking core questions, and asking probes, was approximately 14-16 minutes. The upper limit of this range (16 minutes) was used for the purposes of our estimated burden hours.
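

For illustration, assuming all 64 awardees complete the web-based instrument and all 24 sampled managers complete an interview, these upper limits correspond to roughly 64 x 20 minutes (about 21 hours) and 24 x 16 minutes (about 6.4 hours) of respondent burden, respectively.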


  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The following individuals were consulted to provide advice about the design of these data collection activities:


Lisa Wolff, ScD

Director, Research and Evaluation

Health Resources in Action

617-279-2240 x201

[email protected]


James W. Buehler, MD

Professor in the Department of Health Management and Policy

Drexel University School of Public Health

267-359-6019

[email protected]


The team of individuals working on the information collection, including instrument development, data collection, and data analysis, consists of contractors and members of CDC’s National Center for Immunization and Respiratory Diseases, Immunization Services Division, as listed in Table B-2.


Table B-2: Staff Responsible for Instrument Design, Data Collection and Analyses

Name | Agency | Telephone Number | Email
Jack Nemecek | CDC/OID/NCIRD | 404-639-8219 | [email protected]
Ansley Hynes | CDC/OID/NCIRD | 404-718-4520 | [email protected]
Megan Lindley | CDC/OID/NCIRD | 404-639-8717 | [email protected]
Sarah McKasson | NNPHI; Contracted by CDC/OID/NCIRD | 504-872-0773 | [email protected]
Chris Kinebrew | NNPHI; Contracted by CDC/OID/NCIRD | 504-301-9833 | [email protected]
Lisa Wolff | HRiA; Contracted by NNPHI | 617-279-2240 x201 | [email protected]
Valerie Polletta | HRiA; Contracted by NNPHI | 617-279-2240 x321 | [email protected]


The data for the web-based instrument will be analyzed using basic descriptive or bivariate analyses. The data for the interviews will be analyzed using qualitative thematic coding techniques. Because the major purpose of this data collection is program improvement, we do not anticipate the use of complex statistical techniques.


LIST OF ATTACHMENTS – Section B

Note: Attachments are included as separate files as instructed.


Attachment E - Web-based Instrument Introductory Email

Attachment F - Web-based Instrument Reminder Email

Attachment G - Web-based Instrument Follow-up Email

Attachment H - Interview Introductory Email

Attachment I - Interview Reminder Email

Attachment J - Interview Follow-up Email


