
Supporting Statement B

HIV/AIDS Bureau Enhancing Linkage of STI and HIV Surveillance Data in the Ryan White HIV/AIDS Program Evaluation Contract

OMB Control No. 0906-XXXX

New Information Collection Request

B. Collections of Information Employing Statistical Methods

The Health Resources and Services Administration (HRSA), HIV/AIDS Bureau (HAB) is requesting approval from the Office of Management and Budget (OMB) for data collection activities for the evaluation of the Enhancing Linkage of STI and HIV Surveillance Data in the Ryan White HIV/AIDS Program (Enhancing STI Linkage) demonstration project.

The purpose of the Enhancing STI Linkage demonstration project is to help participating jurisdictions match HIV and STI surveillance data and then use those matched data to enhance linkage to and re-engagement in health care and improve health outcomes for people with Human Immunodeficiency Virus (HIV) in the Ryan White HIV/AIDS Program (RWHAP). A Technical Assistance Provider (TAP) will provide tailored training and technical assistance (TA) to the four participating jurisdictions to facilitate data sharing across STI and HIV surveillance systems. The contractor will conduct a mixed-methods evaluation of the Enhancing STI Linkage demonstration project. This application seeks approval for the evaluation's data collection activities.


The evaluation will collect longitudinal data and develop comparative case studies of participating jurisdictions. This is an observational study with no comparison group.

The contractor seeks OMB approval only for the two evaluation data collection activities that will engage more than nine respondents: (1) semi-structured interviews with jurisdiction stakeholders and (2) a data end-user survey. Additionally, the evaluation will collect data from nine or fewer total respondents through semi-structured interviews with TAP representatives, jurisdiction-level aggregate statistics, jurisdiction personnel time reporting, and jurisdiction-specific documents. What follows is a description only of the activities for which OMB approval is sought, i.e., the semi-structured interviews with jurisdiction stakeholders and the data end-user survey. The other evaluation activities are briefly described in Supporting Statement A to give OMB a sense of the entire scope of the evaluation.

1. Respondent Universe and Sampling Methods

Semi-structured interviews with jurisdiction stakeholders: Semi-structured interviews will be conducted with 24 jurisdiction stakeholders in total: 12 jurisdiction TA participants (3 from each jurisdiction) and 12 policy stakeholders (3 from each jurisdiction). The contractor will work with the TAP and jurisdictions to purposively select potential participants to invite for these interviews. Respondents will be staff leading this initiative in their jurisdiction who are familiar with HIV and STI surveillance data and collection processes, as well as staff involved in using data to drive decision making and policy making. The TAP will provide the contractor with the names and email addresses of jurisdictional contacts for outreach and recruitment. The contractor will then conduct outreach and recruitment with jurisdiction contacts by email, copying the TAP to keep them informed. Interviews will use purposive sampling and are not intended to be statistically representative of, or generalizable to, all RWHAP-funded jurisdictions. If a selected potential participant is unresponsive to contact attempts or declines to participate, another potential respondent who meets the criteria above will be contacted, as applicable.


Data end-user survey: Data end-users are health department and/or clinical staff who are intended to use the linked data in their work. The contractor will collaborate with the TAP and each jurisdiction to identify a list of data end-users. It is anticipated that each jurisdiction will identify approximately 35 data end-users (140 total); however, the exact number in each jurisdiction may be smaller, depending on how the jurisdiction is organized. The contractor will work with the main point of contact (POC) in each jurisdiction to identify its data end-users and obtain a list of their names, roles, and email addresses. This list will be used to field the web-based survey.


Exhibit 1 describes the sampling approach and measures for the data sources in the information collection.


Exhibit 1. Data Sources, Sampling Approach, and Measures


Data Source: Semi-structured Interviews with Jurisdiction Stakeholders

Sampling: Purposive sample of individuals involved in TA activities within each jurisdiction.


Topics:

  • TA recipient interviews (12 total respondents; 3 per jurisdiction): Organizational processes associated with data sharing and linkage (e.g., pain or friction points in sharing data, such as stigma and confidentiality issues; facilitators of linking HIV and STI surveillance data; memoranda of understanding (MOUs), policies, and procedures in place; perceived impact of linked data on the process of care and HIV outcomes); technical processes of data sharing and linkage (e.g., common data linkage processes, procedures, and best practices); perceptions of the TA provided by the TAP; suggestions for improving linked data and data use outcomes; lessons learned during implementation; use of reports with linked data; and unanticipated costs.

  • Policy stakeholder interviews (12 total respondents; 3 per jurisdiction): Whether and how linked HIV and STI data are being used for policy and/or decision making; why linked HIV and STI data are or are not being used.

Analysis: All qualitative data will be entered into NVivo 12 to allow for standardized coding by topic and theme. Interview notes will be dual coded (i.e., coded by two independent coders, with a target Kappa coefficient >0.80) in NVivo 12 using a thematic codebook, to identify common themes and to examine divergence and convergence of themes across interviewee types and other jurisdiction-level characteristics.

Data Source: Data End-User Survey

Sampling: Census of data end-users. Assume 35 potential respondents per jurisdiction (n=140) and a 75% response rate (n=105).


Topics: Use of linked data; perceived usefulness and quality of linked data; time spent participating in activities associated with use of linked data.


Analysis: All quantitative data will be entered into SAS- or Stata-compatible databases for analysis. Quantitative analyses will use a threshold of p<0.05 for statistical significance. Analyses will include descriptive statistics and comparison of survey responses from time 1 to time 2 using tabulation or a paired-samples t-test.


Power: Assuming a binary question with a mean of 0.5 and 105 completed surveys, the anticipated cluster-adjusted 95% confidence interval with 80% power is 0.40–0.60 (an illustrative calculation follows Exhibit 1).


Generalizability: Findings will generalize only to the participating jurisdictions. Survey non-response is assumed to be missing at random.
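
To make the analytic criteria in Exhibit 1 concrete, the following minimal sketch illustrates two of the computations described above: the inter-coder Kappa check for the qualitative coding and the unadjusted 95% confidence interval for a binary survey item answered by 105 respondents. The sketch is illustrative only; the evaluation specifies NVivo 12 and SAS/Stata for production analyses, and the Python tooling and all function and variable names below are assumptions rather than part of the evaluation plan.

```python
# Illustrative sketch only; not part of the approved analysis plan.
# Assumes Python with scikit-learn available; all names are hypothetical.
import math

from sklearn.metrics import cohen_kappa_score


def intercoder_agreement(coder1_codes, coder2_codes, threshold=0.80):
    """Return Cohen's kappa for two coders and whether it meets the >0.80 criterion."""
    kappa = cohen_kappa_score(coder1_codes, coder2_codes)
    return kappa, kappa > threshold


def proportion_ci(p=0.5, n=105, z=1.96):
    """Approximate 95% CI for a binary item, before any cluster (design-effect) adjustment."""
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width


if __name__ == "__main__":
    # Hypothetical code assignments for 20 interview excerpts
    coder1 = ["barrier", "facilitator", "barrier", "policy"] * 5
    coder2 = ["barrier", "facilitator", "policy", "policy"] * 5
    kappa, meets_threshold = intercoder_agreement(coder1, coder2)
    print(f"kappa = {kappa:.2f}; meets >0.80 threshold: {meets_threshold}")

    low, high = proportion_ci()
    print(f"Unadjusted 95% CI for p = 0.5, n = 105: {low:.2f} to {high:.2f}")  # ~0.40 to 0.60
```

The unadjusted interval is roughly 0.40 to 0.60, in line with the range stated in the Power row; a cluster (design-effect) adjustment would widen it somewhat.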



2. Procedures for the Collection of Information

Semi-structured interviews with jurisdiction stakeholders: All semi-structured interviews will be conducted virtually using the Cisco WebEx videoconferencing platform. Informed consent will be obtained verbally from each participant prior to the interview. Interviews will be guided by a semi-structured interview guide, and contractor staff will take notes during each interview. Interviews will also be audio recorded using WebEx, with participants' consent. Audio recordings will be uploaded immediately after each interview to the contractor's secure servers using a secure file transfer protocol (SFTP) and then deleted from WebEx's archives. Interviews will be transcribed on encrypted computers and laptops, and transcripts will be saved directly to the secure servers.


Interviews with jurisdiction TA recipients will be conducted in each of the three years of the evaluation; each interview is expected to take 60 minutes.


Interviews with policy stakeholders will occur in the second and third years of the evaluation; each interview is expected to take 30 minutes.

Data end-user survey: A brief, web-based survey will be administered to data end-users. Survey administration will use a three-stage recruitment email approach: an initial email introduces the evaluation and alerts potential respondents to the forthcoming survey; a second email contains a live link to the web survey; and, approximately one week later, a reminder email is sent to all non-respondents.


The survey will be fielded over an eight-week period at two time points during the evaluation (in the second and third years) to capture changes in data end-users' perceptions and use of linked data over time. The web survey will be programmed using Confirmit software, which allows for the design and programming of input screens that visually guide respondents through the survey instrument and encourage accurate data entry. Beta testing and quality checks will be performed prior to launch to ensure that survey skip patterns work correctly and that data are accurately captured.
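
As a purely illustrative example of the kind of skip-pattern quality check described above (the item names, test-export file, and Python tooling are assumptions, not part of the Confirmit programming itself), exported test data could be screened to confirm that respondents routed past a follow-up item have no answer recorded for it:

```python
# Hypothetical skip-pattern check on exported test data; item names are illustrative only.
import pandas as pd


def skip_pattern_violations(df: pd.DataFrame) -> pd.DataFrame:
    """Return records where a respondent who reported not using linked data
    nevertheless has an answer recorded for the follow-up frequency item."""
    return df[(df["uses_linked_data"] == "No") & df["linked_data_frequency"].notna()]


if __name__ == "__main__":
    test_export = pd.read_csv("beta_test_export.csv")  # hypothetical test export
    violations = skip_pattern_violations(test_export)
    print(f"{len(violations)} record(s) violate the skip pattern")
```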


3. Methods to Maximize Response Rates and Deal with Nonresponse

Semi-structured interviews with jurisdiction stakeholders: Interview guides were developed with length and comprehension level in mind so that jurisdictional stakeholders can complete them in the allotted time. Non-response will be minimized by leveraging the TAP's relationships with stakeholders to identify appropriate respondents, ensure contact information is correct, and follow up with potential respondents who do not respond to the contractor's recruitment emails.


Data end-user survey: Response rates will be maximized using the following approaches. First, the contractor will engage the TAP to convey the importance of survey participation to jurisdictions to ensure the maximum possible response rate. Second, the survey will be kept brief: approximately 15 questions that respondents can complete in ten minutes. The survey design is sensitive to the fact that clinical and health department staff are busy and unlikely to complete a long, cumbersome survey. Third, survey response rate reports will be run weekly to assess completion and identify additional outreach opportunities to encourage participation, as needed. Finally, survey topics are relevant to potential respondents' daily work, which increases the likelihood that staff will participate.


A 75% response rate is expected overall and for each jurisdiction at both survey time points. For non-responding individuals, the contractor will send out an additional reminder email to contacts in that jurisdiction to encourage participation.


A nonresponse and loss-to-follow-up analysis will be performed to determine whether survey weight adjustments are needed to reduce potential bias. The criterion for weighting is a logistic regression model of nonresponse (yes/no) or loss to follow-up (yes/no) with jurisdiction and respondent type as predictors. If the model finds these predictors significant, the base weights will be adjusted to realign the sample and reduce bias in the data analysis. All weighting work will be performed using SAS® software.
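
As a hedged illustration of the adjustment logic described above (production weighting will be done in SAS; the Python code and all column names below are assumptions used only for exposition), the nonresponse model and the resulting inverse-probability-of-response weights might take the following form:

```python
# Illustrative sketch of the nonresponse-adjustment logic; not the production SAS workflow.
# Column names (responded, jurisdiction, respondent_type) are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf


def nonresponse_weights(frame: pd.DataFrame, alpha: float = 0.05) -> pd.DataFrame:
    """Fit a logistic model of response status on jurisdiction and respondent type;
    if either predictor is significant, attach inverse-probability-of-response weights."""
    model = smf.logit(
        "responded ~ C(jurisdiction) + C(respondent_type)", data=frame
    ).fit(disp=False)
    # Any significant predictor other than the intercept triggers an adjustment.
    needs_adjustment = (model.pvalues.drop("Intercept") < alpha).any()

    out = frame.copy()
    if needs_adjustment:
        response_propensity = model.predict(out)   # estimated probability of responding
        out["weight"] = 1.0 / response_propensity  # applied to responding records in analysis
    else:
        out["weight"] = 1.0                        # no adjustment needed
    return out
```

The sketch conveys only the decision rule (significant jurisdiction or respondent-type effects trigger an adjustment) and the form of the weights; the production implementation would use SAS survey procedures.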


4. Tests of Procedures or Methods to be Undertaken

Data collection procedures were pilot tested to refine wording, increase efficiency, and verify burden estimates. During pilot testing, the contractor tested and refined the semi-structured interview guide and sought input from HRSA and the TAP to ensure that interview questions were clear and relevant and that the guide could be covered in the allotted interview time. The contractor also obtained feedback on the data end-user survey instrument from several potential respondents to ensure question comprehension, relevance, appropriate length, and ease of completion. This process verified that the questions were likely to be interpreted correctly by survey respondents and that the survey could be completed in 10 minutes.


5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Sara Woody, MHA, COR Level II

Management Analyst

Data Management and Analysis Branch

Division of Policy and Data

HIV/AIDS Bureau

Health Resources and Services Administration

Phone: 301.443.3452

Email: [email protected]

Role: Oversees design of data collection plan, collection of data, and data analysis


Jane Fox, MPH

Principal Associate, Abt Associates

Phone: 617.520.3910

Email: [email protected]

Role: Principal Investigator


Leigh Evans, PhD

Associate, Abt Associates

Phone: 617.520.4504

Email: [email protected]

Role: Project Director


Jason Brinkley, PhD

Senior Associate, Abt Associates

Phone: 919.294.7745

Email: [email protected]

Role: Project Quality Advisor


Ryan Kling, MA

Principal Associate, Abt Associates

Phone: 617.349.2460

Email: [email protected]

Role: Evaluation Lead


Janet Myers, PhD MPH

Professor of Medicine, University of California, San Francisco

Phone: 415.502.1000

Email: [email protected]

Role: External Consultant/Subject Matter Expert


Auntré Hamp, MEd, MPH, LPC

Research Assistant Professor, Georgetown University Medical Center

Phone: 202-687-0385

Email: [email protected]

Role: TAP Project Director






