Supporting Statement A

Develop and Implement UCARE4LIFE Message Library

OMB Control No. 0915-XXXX

Terms of Clearance: None

  A. Justification

  1. Circumstances Making the Collection of Information Necessary

An estimated 1.2 million individuals are living with HIV in the United States (CDC, 2011a), and approximately 50,000 people are newly infected each year (Prejean et al., 2011). The burden of HIV/AIDS is high among certain populations. Young people aged 15 to 29 accounted for 39% of all new HIV infections in the United States in 2009 (CDC, 2011b). As with the epidemic as a whole, racial/ethnic minority youth and young men who have sex with men (MSM) are disproportionately affected by HIV. For example, in 2009, African Americans accounted for 65% of diagnoses of HIV infection among persons aged 13–24 years (CDC, 2013a), and in 2010, African American MSM aged 13–24 years accounted for more than half of new infections among all MSM (CDC, 2013b). HIV is also more prevalent in certain regions of the country. The highest number of adults and adolescents living with an AIDS diagnosis is in the South (CDC, 2012), and at the end of 2010, the South accounted for 45% of the estimated 33,015 new AIDS diagnoses. In 2009, 48% of all persons with a diagnosis of AIDS who died were in the South (CDC, 2012).

Unfortunately, many people diagnosed with HIV are never linked to care, and among those who are, only about 50% remain engaged in regular care (Gardner et al., 2011). The process of engagement in HIV medical care (spanning initial linkage through long-term retention) is complex and dependent on a variety of interconnected individual, interpersonal, and structural factors that are similar to those that place people at risk for HIV in the first place (Steele et al., 2007). At the individual level, people diagnosed with HIV have numerous decisions to make about seeking health care, staying in care, taking antiretroviral therapy (ART), adhering to an ART regimen sufficient to reduce viral load, and taking additional precautions to prevent transmission to others. For those taking ART in particular, retention in care is necessary to ensure ongoing receipt of ART, evaluation of the emergence of medication toxicities, and identification of treatment failure with the opportunity to switch regimens (Bodenlos et al., 2007; Geng et al., 2010). Nonadherence to ART can lead to poor clinical outcomes and significantly decreased life expectancy (Murphy et al., 2001). In addition, poor medication adherence can increase the likelihood of HIV transmission during risky sexual activity (Quinn et al., 2000).

Currently, few interventions exist to help HIV-positive people manage their own condition. Available self-management interventions are clinically based, require significant time commitments on the part of patients and clinicians, or involve considerable resources and expense (Osterberg & Blaschke, 2005). These challenges can make interventions less accessible to the people who could benefit from them and may limit their potential relevance, effectiveness, or sustainability over time (Lewis et al., 2013). Consequently, simple-to-use, easily implemented and disseminated interventions are needed for HIV self-management.

One emerging self-management approach that has shown promise is the text message-based intervention. For example, findings from a recently completed Agency for Healthcare Research and Quality (AHRQ) study with HIV-positive MSM (conducted by the contractor for this study, RTI International) showed that participation in an SMS intervention increased self-reported medication adherence among those who began the study as non-adherent and received tailored medication reminders, and produced significant improvements in clinical outcomes (i.e., viral load and CD4 count) (Lewis et al., 2013). Furthermore, intervention participants not only demonstrated increased HIV knowledge and perceived social support from baseline to follow-up, but there were also significant reductions in self-reported sexual risk behaviors among those who received sexual risk reduction messages (Uhrig et al., 2012). Overall receptivity to the messages and the intervention was high (Uhrig et al., 2012; Harris et al., 2013). Almost all participants reported that the messages were easy to understand; most trusted the information, felt it was important to have programs like this, always read the messages, and felt the messages gave good advice (Uhrig et al., 2012). Additional findings from this study indicate the feasibility and acceptability of the intervention from the perspectives of both patients and providers (Harris et al., 2013).

In response to the emerging evidence indicating the benefit of using SMS to manage HIV infection, and the need to support young people living with HIV, the MAC AIDS Fund, in partnership with the U.S. Department of Health and Human Services, Health Resources and Services Administration (HRSA), is supporting the UCARE4LIFE program. The aims of this study, “Develop and Implement UCARE4LIFE Message Library,” are to develop, test, and maintain a message library that addresses topics of HIV self-management and to develop, implement, conduct, and evaluate a pilot study that delivers text messages to racially and ethnically diverse young people who are HIV-positive and receiving care at Ryan White grantee sites in southern states with high rates of HIV infection: Alabama, Arkansas, Tennessee, Kentucky, Louisiana, Mississippi, North Carolina, and South Carolina.

  2. Purpose and Use of Information Collection

The purpose of the UCARE4LIFE study is to develop, implement, and evaluate short message service (SMS) messages, or text messages, to improve retention in care and HIV medication adherence among racially and ethnically diverse HIV-positive youth who are receiving care at select Ryan White clinics in states that have been particularly hard-hit by the HIV/AIDS epidemic (see Section A.1). The primary aims of this study are to (1) develop, test, and maintain a message library that addresses topics of HIV disease self-management and risk reduction and (2) develop, implement, conduct, and evaluate the text-messaging pilot study.

There are three respondent universes for this data collection: (1) Individuals who represent the characteristics of the intervention’s target population, (2) patients from selected clinics who participate in the text-messaging intervention study, and (3) providers from participating clinics.

  1. Individuals with characteristics similar to members of the target population will participate in small group discussions. Participants will be recruited from community-based organizations (CBOs) and HIV primary care clinics in North Carolina, where RTI is headquartered. These agencies will advertise the data collection by way of recruitment flyers (Appendix A). RTI will conduct a brief screening with those who respond to the advertisement to determine their eligibility (Appendix B). Those who are eligible will be assigned to a group based on gender, age, and language (English or Spanish). Thirty-two individuals will take part in one of eight group discussions, with four participants in each. The groups will be held in community-based settings to minimize respondent burden and to make participants feel more at ease. Each group will last about 2 hours. The focus of the data collection will be an assessment of participants’ interest in the message topics; the novelty of the messages; the extent to which the messages are motivational, credible, useful, offensive, believable, actionable, or stigmatizing; and preferences for wording, phrases, and acronyms (see discussion guide in Appendix C). The data gathered through this collection are essential in that they will inform the development and refinement of the messaging strategy.

  2. Patients who participate in the intervention study will take part in one or two data collections.

  • Web-based surveys: Five hundred intervention participants (250 cases who receive text-messages over the 9-month study period and 250 controls who receive standard care) will complete four Web-based surveys at four time points (baseline, 3 months, 6 months, and 9 months). Participants will be recruited (see intervention study recruitment flyer in Appendix D) and screened by staff from the HRSA-selected clinics (see screener in Appendix E). Eligible participants will be randomized as cases or controls via computer-generated random number assignment. All participants will be asked to complete a baseline Web-based survey using a private computer terminal at their clinic (see baseline survey in Appendix F). The intent of the baseline assessment survey is twofold. First, these data will serve to generate a user profile algorithm so that only relevant (or tailored) messages will be sent to case participants (e.g., only cigarette smokers will receive messages that encourage cessation). Second, the data will be used to establish a baseline from which to measure change over time among both cases and controls as the survey will be re-administered at 3-month intervals (at 3, 6, and 9 months). The data from the follow-up surveys (see follow-up survey in Appendix G) are critical in that they will be used to address questions related to implementation effectiveness and patient satisfaction.

  • Qualitative in-depth interviews: A subset of intervention participants will be asked to participate in qualitative in-depth interviews to more fully assess the intervention’s effectiveness and participants’ likes and dislikes about the program overall and messages in particular. These qualitative data will be summarized and aggregated in an analytic matrix to assess variations in response patterns. The findings from this data collection will be useful as a supplement to the survey data to determine the feasibility and acceptability of the program from participants’ perspectives and provide additional context for survey findings.

  3. Providers (e.g., physicians, nurses, case managers) from participating clinics will be recruited by RTI to take part in a 1-hour, in-depth interview. Provider participants will be asked questions to assess organizational context and readiness to change, implementation policies and practices, facilitators and barriers to program implementation and sustainability, implementation climate, value-fit, and recommendations for improvement (see discussion guide in Appendix H). These qualitative data will be summarized and aggregated in an analytic matrix by participant type to assess variations in response patterns. The findings from this collection are critical to understanding the extent to which the text messaging program is feasible and acceptable from providers’ perspectives.

Overall, the UCARE4LIFE study is an important contribution to meeting two primary goals of the National HIV/AIDS Strategy: increasing access to care and optimizing health outcomes, and reducing HIV-related health disparities. This data collection will allow HRSA to (1) evaluate the effectiveness of the intervention in improving retention in care and medication adherence and reducing risk behaviors among persons who are HIV positive and (2) examine the intervention’s feasibility and acceptability from patients’ and providers’ perspectives. HRSA will use this information to make critical decisions about whether to sustain the intervention and/or expand it to additional Ryan White clinics.

The findings from this study will be disseminated to other federal agencies and the public through reports prepared for HRSA by RTI. When appropriate, we will also disseminate results through peer-reviewed journal articles and conference presentations. All releases of information will be reviewed and approved by HRSA prior to release.

  3. Use of Improved Information Technology and Burden Reduction

The baseline and follow-up surveys are web-based and will ideally coincide with patients’ regularly scheduled routine care appointments to minimize the burden to respondents. The Web-based surveys comprise 76% of the total respondent burden hours, reflecting the importance of these data in evaluating intervention effectiveness. The Web-based surveys collect the minimum amount of data necessary to accomplish the goals of the evaluation and heavily rely on content used for the prior AHRQ study with HIV-positive MSM. Participants in that study did not report feeling burdened by the survey component.

Our data collection also requires that we employ qualitative research methods through the use of one-time small group discussions and interviews. The qualitative data gathered as part of this collection are crucial to understanding whether the intervention can and should be sustained and/or rolled out to other clinics. The foci of the qualitative data collections do not lend themselves to electronic reporting; hence, electronic reporting will not be utilized.

  4. Efforts to Identify Duplication and Use of Similar Information

In addition to consulting with Federal and non-Federal colleagues, a review of the literature and the Reginfo.gov website was conducted to identify duplication and use of similar information.

The literature review pointed to one U.S.-based information collection involving a text-messaging intervention with similarly aged HIV-positive youth/young adults, conducted by Dowshen and colleagues (2013). There are some important differences, however, between that study and UCARE4LIFE. The Dowshen study (1) focused on a general sample of HIV-positive youth/young adults (whereas the respondent universe for the planned study includes HIV-positive racial/ethnic minority youth/young adults) and (2) was designed to promote ART adherence only (whereas the planned study promotes ART adherence but takes it further by also focusing on the myriad other factors that contribute to adherence [e.g., social support, substance use] and retention in care, as well as on reducing HIV risk).

A review of recent information collection requests on the Reginfo.gov website did not identify any of a similar nature.

Based on this review, we have confirmed the need for the present study.

  5. Impact on Small Businesses or Other Small Entities

No small businesses will be involved in this study.

  6. Consequences of Collecting the Information Less Frequently

If these data were not collected, HRSA would be unable to determine the intervention’s efficacy and feasibility and acceptability from patients’ and providers’ perspectives. Thus, decision-makers would not have the information necessary to determine whether the intervention should be disseminated more widely. There are no legal obstacles to reduce burden.

Table A.6 shows the planned data collections, the frequency of each collection, and a justification for the stated frequency.

Table A.6 Data Collection Frequency and Justification

Data Collection (N) | Frequency (Location) | Justification for Frequency
In-person small group discussions with members of the target population who are non-intervention participants (N=32) | One time (NC-based CBOs) | Necessary to guide development/refinement of the messages.
Web-based surveys with intervention participants (N=500) | Four times (baseline and 3, 6, and 9 months) (in clinic) | The baseline survey is necessary to establish a baseline from which to measure change in key outcomes over the 9-month study period. The follow-up surveys are necessary to assess interim and long-term effectiveness and the intervening factors that may contribute to changes in key outcomes.
Interviews with a subset of intervention participants (N=100) | One time (telephone) | Necessary to explore the feasibility and acceptability of the messages and messaging approach.
Interviews with providers from participating clinics (N=30) | One time (telephone) | Necessary to assess the feasibility and acceptability of the intervention to understand issues important to wider dissemination.



  7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

Participation in the intervention will require data collection to occur more often than quarterly. Cases and controls will respond to the survey four times during the 9-month study period (see Table A.6). In addition, cases who are selected and agree will participate in an in-depth interview between months 6 and 9.

  8. Comments in Response to the Federal Register Notice/Outside Consultation

Section 8A:

  • A 60-day Federal Register Notice was published in the Federal Register on May 10, 2013, vol. 78, No. 91; pp. 27406-07 (see Appendix I). There were no public comments.

Section 8B:

HRSA has undertaken two efforts to consult with persons to obtain their views on the availability of data, frequency of collection, the clarity of instructions and record keeping, disclosure, or reporting format, and on the data elements to be recorded, disclosed, or reported. First, the HRSA Technical Point of Contact consulted with Angela Nunley from the Agency for Healthcare Research and Quality (AHRQ) to discuss the prior project RTI conducted under contract with AHRQ to develop, implement, and evaluate a text-messaging intervention with HIV-positive MSM attending an ambulatory care clinic.

In addition, HRSA has formed a UCARE4LIFE Federal Steering Committee to provide ongoing guidance for the project. Thus far, the committee has met six times and will continue to meet on a quarterly basis. The current members of the committee are shown in Table A.8B.

Table A.8B Members of the UCARE4LIFE Federal Steering Committee

Name (Title) | Affiliation | Contact Information
Anglin, Trina (Branch Chief, Adolescent Health) | HRSA, Maternal Child Health Bureau, Adolescent Health | Phone: 301.443.4291; Email: [email protected]
Applebaum, Bethany (Public Health Analyst) | HRSA, Office of Women’s Health | Phone: 301.443.1236; Email: [email protected]
Atienza, Audie (Program Director) | NIH, National Cancer Institute, Division of Cancer Control and Population Sciences | Phone: 240.276.6715; Email: [email protected]
Augustson, Erik (Program Director) | NIH, National Cancer Institute, Division of Cancer Control and Population Sciences | Phone: 240.276.6774; Email: [email protected]
Broussard, Lauren (Project Officer) | HHS, Office of Adolescent Health | Phone: 240.453.2808; Email: [email protected]
Cheever, Laura (Associate Administrator) | HRSA, HAB | Phone: 301.443.1993; Email: [email protected]
Cook, Gary (Deputy Director) | HRSA, HAB, Division of Metropolitan HIV/AIDS Programs | Phone: 301.443.9090; Email: [email protected]
Doshi, Rupali (Medical Officer) | HRSA, HAB, Office of Associate Administrator | Phone: 301.443.5313; Email: [email protected]
Endale, Hanna (Branch Chief, Central Region) | HRSA, HAB/Division of Community HIV/AIDS Programs | Phone: 301.443.1326; Email: [email protected]
Fanning, John (Senior Policy Advisor) | HRSA, HAB/Division of Community HIV/AIDS Programs | Phone: 301.443.8367; Email: [email protected]
Figueroa-Gonzalez, Margarita (Clinical Advisor) | HRSA, Office of Regional Operations | Phone: 301.443.1380; Email: [email protected]
Gomez, Miguel (Director, AIDS.gov) | HHS, Office of the Assistant Secretary for Health | Phone: 202.690.5560; Email: [email protected]
Huang, Anna (UCARE4LIFE Technical POC, Medical Officer) | HRSA, HAB/Division of Community HIV/AIDS Programs | Phone: 301.443.3995; Email: [email protected]
Kapogiannis, Bill (Program Director, Adolescent Medicine Trials Network for HIV/AIDS Intervention) | NIH, National Institute of Child Health and Human Development | Phone: 301.402.0698; Email: [email protected]
Lee, Sonia (Associate Program Director, Adolescent Medicine Trials Network) | NIH, National Institute of Child Health and Human Development | Phone: 301.594.4783; Email: [email protected]
Malitz, Faye (Senior Program Advisor) | HRSA, HAB, Division of Policy and Data | Phone: 301.443.3259; Email: [email protected]
Mansergh, Gordon (Behavioral Scientist) | CDC, National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention (NCHHSTP) | Phone: 404.639.6135; Email: [email protected]
Matoff-Stepp, Sabrina (Director) | HRSA, Office of Women’s Health | Phone: 301.443.8664; Email: [email protected]
Palow, Diana (Branch Chief, HIV Education) | HRSA, HAB, HIV Education, Division of Training and Capacity Development | Phone: 301.443.4405; Email: [email protected]
Pitman, David (Staff Assistant) | HRSA, HAB/Division of Community HIV/AIDS Programs | Phone: 301.443.0279; Email: [email protected]
Reyes, Menina (Administrative Associate) | HRSA, HAB/Division of Community HIV/AIDS Programs | Phone: 301.443.0957; Email: [email protected]
Rice, Martin (Nurse Consultant) | HRSA, Office of Rural Health Policy | Phone: 301.443.2983; Email: [email protected]
Robilotto, Susan (Clinical Consultant/Medical Officer) | HRSA, HAB, Divisions of Metropolitan & State HIV/AIDS Programs | Phone: 301.443.6554; Email: [email protected]
Robinson, Susan (Associate Director, Communication Science) | CDC, National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention (NCHHSTP) | Phone: 404.639.8025; Email: [email protected]
Ross, Polly (Director, Division of Community HIV/AIDS Programs) | HRSA, HAB/Division of Community HIV/AIDS Programs | Phone: 301.443.7602; Email: [email protected]
Sowah, Lillian (Project Officer) | HRSA, HAB/Division of Community HIV/AIDS Programs | Phone: 301.443.5671; Email: [email protected]
Thompson, Renata (Contracting Officer’s Representative and Project Officer) | HRSA, HAB/Division of Community HIV/AIDS Programs | Phone: 301.443.4364; Email: [email protected]
Wegman, Lynn (Deputy Director, Division of Community HIV/AIDS Programs) | HRSA, HAB/Division of Community HIV/AIDS Programs | Phone: 301.443.5658; Email: [email protected]


  9. Explanation of any Payment/Gift to Respondents

Incentives are intended to recognize the time burden placed on participants, encourage their cooperation, and convey appreciation for their contributions to the research. Numerous empirical studies have established that incentives can significantly increase participation rates (Abreu & Winters, 1999; Shettle & Mooney, 1999; Greenbaum, 2000). Based on the research team’s extensive experience conducting research of a similar nature, we have learned that incentives are necessary to sufficiently attract participants.

We are seeking approval to provide payment to those who participate in small group discussions, Web-based surveys, and case participant in-depth interviews. In reviewing OMB’s guidance on the factors that may justify provision of incentives to research participants,1 we have determined that the following principles apply to this data collection:

  1. Improved coverage of specialized respondents, rare groups, or minority populations: The proposed data collection includes vulnerable populations: Persons who are HIV positive and youth. HIV-positive persons are considered vulnerable because they are a stigmatized and marginalized group. Youth are considered vulnerable by virtue of their age. In addition, this study will include minority youth and young adults who also may be stigmatized and marginalized due to their race/ethnicity. To develop a culturally appropriate intervention to address the unique needs of the targeted population, it is imperative that sufficient numbers are included in the data collection. Yet, based on the study team’s prior experience conducting data collections with the targeted populations, long-term engagement can be challenging due to competing basic needs, health issues, and social and emotional vulnerabilities. Provision of an incentive is necessary to ensure that a sufficient number of respondents from the targeted population participate in all data collections.

  2. Data quality: If we are unable to recruit sufficient numbers of respondents to participate in the qualitative data collections (i.e., small group discussions and interviews with intervention participants), we will be unable to adequately test the messages to see whether they are acceptable, understandable, etc. to various segments of the target population (e.g., Spanish speakers) or to examine the overall feasibility and acceptability of the text-messaging program to inform HRSA’s rollout plans. Likewise, for the quantitative data collection (i.e., Web surveys), if we are unable to garner participation in the intervention and also encourage participants to complete the follow-up data collections, the integrity of the intervention, and thus the quality of the data, will be compromised. This is particularly applicable when we consider that this data collection will include vulnerable/hidden subgroups (see #1).

  3. Reduced survey costs: We anticipate that without the incentive as an inducement, we will need to screen more people to achieve the desired cooperation rate and conduct more extensive follow-up with intervention participants to encourage them to complete the follow-up surveys. For example, the current estimated annualized burden for the screener for the intervention study is 250 hours. Without the incentive, we expect the burden to be 325 hours, an increase of approximately 30%. Costs to respondents and the Federal government will increase accordingly.

  4. Complex study design: To examine the efficacy of the intervention, this study requires that the same participants who complete the baseline Web survey complete the follow-up Web surveys at three additional time points: at 3-, 6-, and 9-months. In addition, a subset of case participants will be invited to participate in an in-depth interview which will require scheduling another appointment.

In addition to these factors, we also considered the incentive amounts provided to participants in other studies of a similar nature. See Table A.9.1.

Table A.9.1 Incentive Amounts for Prior Studies of a Similar Nature

Publication | Study Population (Purpose) | Data Collection Type | Amount
Dowshen et al. (2012) | HIV-positive youth aged 14–29 (to explore the efficacy of a text-messaging intervention to promote ART adherence) | Survey at baseline and at weeks 6, 12, 18, and 24 | $200 ($40 per survey)
Muessig et al. (2013) | African American MSM aged 18–30 (to inform the development of a text-messaging intervention) | One-time focus group preceded by a brief survey | $50 gift card
George et al. (2012) | African American and Latino MSM aged 18–25 (to explore current texting practices and the feasibility/acceptability of text messaging as a means of conducting sexual health promotion) | One-time brief survey plus focus group | $40
Cornelius et al. (2013) | African Americans aged 13–18 (to examine the efficacy, feasibility, and acceptability of a mobile phone-based HIV prevention intervention) | Survey at baseline, 1 month, and 3 months | $50 upon completion of the 3-month survey (a)
Uhrig et al. (2012) | HIV-positive MSM (to explore the preliminary efficacy of a text-messaging intervention to promote ART adherence, retention in care, and risk reduction) | Survey at baseline and at 3 months | $55 ($25 for the baseline and $30 for the follow-up survey)

(a) Note that study participants also received a free smartphone with unlimited text messaging and web access for the 90-day text-messaging intervention period.

In Table A.9.2, we propose the incentive amounts for each data collection, keeping in mind the overarching OMB principles applicable to this study as well as the incentive amounts provided to participants in similar data collections. The calculations are based on a mean hourly wage rate of approximately $22.00, as determined by the Bureau of Labor Statistics (May 2012 National Occupational Employment and Wage Estimates United States, 2012), and, for case participants, the estimated cost of receiving the text messages (an average of $16.60 over the 9-month intervention period).

Table A.9.2 Formula for Determining Incentive Amounts for Each Data Collection

Data Collection | Total Time | Mean Hourly Wage Rate | Estimated Wage | Proposed Incentive Amount
Small group discussion | 2 hours | $22.01 | $44.02 | $50
Web-based surveys | 4 hours (1 hour per survey) | $22.01 | $88.04 | $100
In-depth interview | 1 hour | $22.01 | $22.01 | $25



  10. Assurance of Confidentiality Provided to Respondents

RTI received approval from its institutional review board (IRB) to conduct the data collections mentioned herein. RTI’s IRB granted separate approvals to conduct the small group discussions and the text-messaging intervention (encompassing Web-based surveys and in-depth interviews with case participants) and an exemption for the provider in-depth interviews. The approval notices for each component are attached in Appendices J1–J3. The specifics of the informed consent procedures for each of these components are described below.

10.1 Small Group Discussions

The consent procedure for the small group discussions differs for participants aged 15–17 (adolescents) and those aged 18–24 (adults).

10.1.1 Adolescents

RTI’s IRB granted a waiver of parental consent for the small group discussions because a requirement of parent/guardian permission would make the inclusion of adolescents nearly impossible. For some adolescents, contacting a parent/guardian could place the child at risk. Many homeless youth, for example, have suffered physical and sexual abuse in their home environments. By notifying their parents/guardians of their location, we would inadvertently be placing the adolescent at risk. Additionally, some adolescents may not be willing to participate if parental consent is required. This applies to adolescents who are homeless as well as those who live with their parents/guardians. For instance, many adolescents may not want their parents/guardians to know that they received services from a CBO.

  • Screening: To determine study eligibility, prospective adolescent participants will contact RTI by telephone to be screened (see screener in Appendix B). The prospective participant will be told that RTI is conducting a research study and that we will need to ask some personal questions to determine eligibility. After reviewing the verbal assent script in its entirety and prior to asking adolescents to provide assent, we will ask them two questions to gauge their comprehension of the verbal assent: (1) Do you have to answer all of the questions that I ask you? (Correct answer: No) and (2) Are we going to ask you about your HIV status? (Correct answer: Yes). If both questions are answered incorrectly, we will terminate the screening process. If one question is answered incorrectly, we will re-read the applicable passage from the verbal assent form and re-ask the question; if the adolescent then answers correctly, we will proceed with the screener, and otherwise we will terminate the screening process. The questions are presented in a separate module in the screener (see Section 2). We will obtain verbal assent from those who demonstrate comprehension of the data collection. Those who meet the study’s eligibility requirements will be invited to participate in a group discussion.

  • Group discussion: The assent process for adolescents attending the group discussion will be facilitated by a designated staff person from one of our partner CBOs/clinics to provide an additional measure of protection. (Note that this “advisor” will be trained by RTI personnel on his/her role prior to data collection.) The advisor will be present during the administration of the assent (Appendix K1). At the end of the reading of the assent, RTI staff will leave the room to give the adolescent and the advisor time to speak privately. After the adolescent and advisor have talked, RTI staff will reenter the room, answer any questions, and determine whether the adolescent wants to proceed with the data collection. The RTI staff person will then ask the same comprehension questions used during screening, except that “Are we going to ask you about your HIV status?” will be replaced with “Will we audio-tape the group?” If the RTI staff person determines at any point that the prospective participant appears intoxicated or under the influence of drugs and is therefore unable to provide assent, the staff person will dismiss the prospective participant. Those who comprehend the assent will be asked to sign the assent form and will be given an unsigned copy of the form to keep. The advisor will be asked to sign the assent form as a witness. We will then escort the adolescent to the group.

10.1.2 Adults

  • Screening: Prospective adult participants will contact RTI by telephone to be screened for eligibility (Appendix B). The prospective participant will be told that we are conducting a research study and that we will need to ask some personal questions to determine eligibility. We will then obtain verbal consent from the prospective participant to continue with the screener. Those who meet the study’s eligibility requirements will be invited to participate in a group discussion.

  • Group discussion: Upon arrival at the data collection site, participants will be given a consent form (Appendix K2), and the facilitator or note-taker (both RTI staff) will review it with them. If the prospective participant has any questions, they will be answered. If the prospective participant chooses to sign the consent form, he/she will be given an unsigned copy to keep and then escorted to the group session. If an RTI staff person determines that the prospective participant is acting intoxicated or under the influence of drugs and therefore unable to provide consent, the staff person will not allow the person to participate.

10.2 Text-messaging Intervention

As determined by RTI’s IRB, for the intervention study, adults and adolescents are treated similarly with regard to informed consent procedures because adolescents aged 15–17 who participate must be legally able to consent to their own treatment (emancipated minors, “mature” minors, and those who qualify to receive treatment on their own through “access statutes”) and are thus treated as adults.

10.2.1 Screening

Potential participants will call the telephone number for the onsite study coordinator listed on the recruitment flyer (Appendix D) to initiate the screening process. Once phone contact is made with a prospective participant, the study coordinator will describe the study and determine whether the person is interested in participating. If the individual is interested (and, for those aged 15–17, legally able to consent to their own treatment as mentioned above), the study coordinator will administer a screener to determine eligibility and gather basic demographic information to adequately describe the sampling frame (Appendix E).

10.2.2 Intervention

Participants who are eligible and agree to participate will be randomized to the case or control study condition after which the study coordinator will administer informed consent. There are separate consent forms for cases and controls (see Appendices L1 and L2). The consent will cover all aspects of the study (e.g., agreement to have their medical record reviewed, participate in all surveys, receive all of the messages they are eligible to receive [cases only], and willingness to be re-contacted to participate in an in-depth interview should they be selected to do so [cases only]). Participants will be advised that they can withdraw from the study without consequence.

At the time of consent, each participant will also sign an authorization for use or disclosure of health information, first names, and telephone numbers to be compliant with the Health Insurance Portability and Accountability Act (see Appendix M). Also, case participants will be asked to provide consent to be re-contacted should they be selected to participate in an in-depth interview (see end of case consent form in Appendix L1). Those who do not consent to be re-contacted are still eligible to participate in the intervention.

As mentioned, participants will be provided with information about how to withdraw from the study during the consent process, and they will also be given a pocket card with this information (Appendix N). Participants can withdraw in one of two ways. (1) They can contact the onsite study coordinator, who will notify RTI research staff; using the participant’s unique identification number, RTI will terminate the transmissions to the associated mobile phone. (2) They can send a “STOP” command via SMS by replying to any message they have received with the word “STOP” or “QUIT” in the body of their response (the commands are not case-sensitive). The text-messaging system is configured to automatically terminate any pending messages to individuals who send a stop request and to transmit a withdrawal notification via e-mail or SMS to project administrators.
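
As a simple illustration of the opt-out behavior described above, the following Python sketch checks an inbound reply for the stop keywords, cancels pending messages, and logs a withdrawal notification. It is a minimal, hypothetical example; the function and data structures are illustrative and do not represent the actual UCARE4LIFE text-messaging platform.

    # Hypothetical sketch of opt-out keyword handling (illustration only).
    OPT_OUT_KEYWORDS = {"STOP", "QUIT"}

    def handle_inbound_reply(participant_id, message_text, pending_queue, admin_log):
        """Return True if the reply triggers withdrawal from messaging."""
        words = {w.strip(".,!?").upper() for w in message_text.split()}
        if words & OPT_OUT_KEYWORDS:
            pending_queue.pop(participant_id, None)   # terminate pending messages
            admin_log.append(f"Participant {participant_id} sent a stop request.")
            return True
        return False

    # Example usage with an in-memory queue
    queue = {"P001": ["reminder 1", "reminder 2"]}
    log = []
    handle_inbound_reply("P001", "please QUIT sending these", queue, log)
    print(queue, log)   # {} ['Participant P001 sent a stop request.']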

Given the sensitivity of the subject and the age of participants, extra precautions will be put into place to help protect case participants’ privacy. To minimize the potential for a breach of privacy, we will inform participants of this risk during enrollment and encourage them to mitigate this risk by (1) not sharing their phones or messages with others, (2) password protecting their mobile devices, (3) deleting project-related text messages after reading them, (4) notifying their cell phone carrier to terminate service in case of loss, and (5) reading their text messages in private.

Cases will also be informed during enrollment that there may be a very minimal risk of disclosure in the event a third party (e.g., a parent or guardian) installs monitoring software on their cellular device. It is unclear whether such software is capable of meeting the claims made on various websites that purport to enable parents and spouses to access sensitive information on a target device. In addition to requiring a jailbroken phone, installation and monitoring via such software requires an above-average degree of technical proficiency. Another source of very minimal risk of unintentional disclosure may arise if parents review a participant’s cell phone bill and note a large number of messages originating from the text messaging system’s shortcode (72334). This number is the only means of identifying the origin of the text messages; it cannot be called and does not reply to messages sent to it.

An additional measure we are taking to protect privacy is to require cases to create a personal identification number (PIN) at study enrollment that must be entered to retrieve texts containing sensitive information that could potentially disclose a participant’s HIV status. Only messages that mention HIV specifically or topics closely related to HIV (e.g., ART) will be PIN protected, to reduce participant burden. To alert cases that a sensitive text is ready for retrieval, they will first receive a text message that says, “Your message is ready for retrieval. Enter your pin number now.” The text message containing the sensitive information will be sent to the case immediately after PIN entry. We will advise cases during the enrollment process that some of the messages will be of a sensitive nature and may inadvertently disclose their HIV status if others see the message on their phone. We will therefore strongly encourage, as mentioned above, that participants read all text messages in private and that they not share their PIN with anyone else. Cases will have the opportunity to change their PIN at any time should they be concerned about a privacy breach. To change their PIN, cases will call the onsite coordinator, and he/she will log into the clinical site’s control system (which only includes enrollee information, not data) and change the PIN manually.
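
The following Python sketch illustrates the two-step, PIN-gated delivery flow described above. It is a hypothetical example for clarity only; aside from the notification wording quoted in the preceding paragraph, the class, function, and message text are stand-ins and do not describe the actual UCARE4LIFE system.

    # Hypothetical sketch of PIN-gated delivery of sensitive messages.
    from dataclasses import dataclass, field

    @dataclass
    class Participant:
        phone: str
        pin: str
        held_messages: list = field(default_factory=list)

    def send_sms(phone, text):
        print(f"SMS to {phone}: {text}")  # stand-in for a real SMS gateway call

    def queue_message(participant, text, sensitive=False):
        if sensitive:
            # Hold the sensitive text and send only a neutral notification.
            participant.held_messages.append(text)
            send_sms(participant.phone,
                     "Your message is ready for retrieval. Enter your pin number now.")
        else:
            send_sms(participant.phone, text)

    def handle_pin_reply(participant, reply):
        # Release the oldest held message only if the PIN matches.
        if reply.strip() == participant.pin and participant.held_messages:
            send_sms(participant.phone, participant.held_messages.pop(0))

    # Example usage
    case = Participant(phone="555-0100", pin="4321")
    queue_message(case, "Remember to take your ART dose tonight.", sensitive=True)
    handle_pin_reply(case, "4321")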

If a participant notifies the site coordinator that his/her phone has been lost, the site coordinator will be able to log in to the UCARE site to temporarily suspend any pending messages intended for that participant. Participants can also e-mail the study team from a computer located at a library, internet café, or elsewhere to suspend delivery of the messages. Once the loss of hardware has been resolved, the site coordinator can restart the intervention and resume messaging.

10.2.3 In-depth case participant interviews

Prior to initiating data collection (see discussion guide with introductory script in Appendix O), participants will be told that they can choose to not answer questions and can stop the interview at any time without penalty and that none of their comments will be linked with their name or shared with their doctor or anyone at the clinic.

10.3 Provider Interviews

Prior to initiating data collection with providers, we will obtain verbal consent (Appendix H). All of the questions we plan to ask relate to participants’ daily work at their respective clinics. Further, the data will not contain identifying information.

  11. Justification for Sensitive Questions

Providers and cases who participate in an in-depth interview will not be asked sensitive questions.

Small group discussion participants will be asked for their assessment of text messages that may include information that is sexually explicit and/or related to substance use. Participants’ input on the sample messages is critical to ensuring text message content for these domains that is understandable and culturally relevant to the target population.

Intervention participants will be asked sensitive questions about sexual practices and drug and alcohol use during the Web surveys as reductions in associated behaviors are key outcomes of the intervention. Intervention participants’ answers to these kinds of questions during the baseline survey will be used to inform the tailoring algorithm for the text messages. Only cases who report recent sexual activity and/or substance use at baseline will receive related text messages as such messages are not relevant for those who are not sexually active and/or substance users. Answers to sensitive questions on the follow-up surveys will be used to monitor sexual and substance-use behaviors over time to determine the intervention’s effectiveness in reducing associated risk behaviors relative to the control group. If these questions were not asked, HRSA would be unable to determine the intervention’s impact on reducing these important risk behaviors.
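
To make the tailoring logic concrete, the sketch below shows one way baseline responses could be mapped to message modules so that, for example, only participants who report substance use receive substance-use messages. It is an illustrative Python example with hypothetical field and module names, not the actual UCARE4LIFE user-profile algorithm.

    # Illustrative tailoring rule: baseline responses determine which message
    # modules a case participant receives (hypothetical field/module names).
    def assign_message_modules(baseline):
        modules = ["appointment_reminders", "general_health"]  # assumed to go to all cases
        if baseline.get("on_art"):
            modules.append("art_adherence")
        if baseline.get("current_smoker"):
            modules.append("smoking_cessation")
        if baseline.get("recent_sexual_activity"):
            modules.append("sexual_risk_reduction")
        if baseline.get("substance_use"):
            modules.append("substance_use")
        return modules

    # Example: a case who is on ART and smokes, but reports no recent sexual
    # activity or substance use, would not receive those two message modules.
    print(assign_message_modules({"on_art": True, "current_smoker": True}))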

During the informed consent process, small group discussion and intervention participants will be advised that some questions may make them feel upset or uncomfortable and that they will be directed to an appropriate referral source should this be necessary.

During screening, small group discussion and intervention participants will also be asked to identify their race and ethnicity as the study’s target population includes HIV-positive minority youth and young adults. Race/ethnicity information will be used for the following purposes:

  1. To help ensure (to the extent possible) equal representation of English and Spanish speakers in the small group discussions.

  2. To identify participants’ preference for placement in an English or Spanish-language small group discussion among those who identify as Hispanic/Latino.

  3. To identify participants’ preference for receiving English or Spanish-language text messages among those who identify as Hispanic/Latino.

  12. Estimates of Annualized Hour and Cost Burden

12A. Estimated Annualized Burden Hours

This section summarizes the total burden hours for this information collection in addition to the cost associated with those hours. The estimates are based on the research team’s experience with the AHRQ study and with other HIV-affected groups.

The total response burden is estimated at 1,968.5 hours. Table 12.A provides detail about how this estimate was calculated. For the small group discussions, it is anticipated that 128 individuals will be screened for eligibility in order to obtain a total sample size of 32. Screening will take approximately 15 minutes per individual (32 burden hours). The discussions with the 32 participants will last 2 hours (64 burden hours).

For the study’s intervention component, 1,000 individuals will be screened to determine eligibility in order to recruit a sample of 500 participants. Screening will take approximately 15 minutes per individual (250 burden hours). Intervention participants will complete a Web survey at baseline and at 3, 6, and 9 months (4 surveys total), each of which will take approximately 45 minutes to complete (1,500 burden hours).

Up to 100 case participants will participate in a 1-hour in-depth, qualitative telephone interview (100 burden hours).

Up to 30 providers from participating clinics will take part in a 45-minute in-depth, qualitative telephone interview (22.5 burden hours).
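
As an arithmetic check, the total in Table 12.A is the sum across rows of the product of the number of respondents, the number of responses per respondent, and the average burden per response:

\[
(128 \times \tfrac{15}{60}) + (32 \times 2) + (1{,}000 \times \tfrac{15}{60}) + (500 \times 4 \times \tfrac{45}{60}) + (100 \times 1) + (30 \times \tfrac{45}{60}) = 32 + 64 + 250 + 1{,}500 + 100 + 22.5 = 1{,}968.5 \text{ hours.}
\]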

Table 12.A Estimated Annualized Burden Hours

Type of Respondent | Form Name | No. of Respondents | No. Responses per Respondent | Average Burden per Response (in Hours) | Total Burden Hours
Small group discussion participants | Screener | 128 | 1 | 15/60 | 32
Small group discussion participants | Discussion | 32 | 1 | 2 | 64
Intervention participants | Screener | 1,000 | 1 | 15/60 | 250
Intervention participants | Web-based survey | 500 | 4 | 45/60 | 1,500
Intervention participants | In-depth interview (with cases) | 100 | 1 | 1 | 100
Providers | Pilot study qualitative interviews | 30 | 1 | 45/60 | 22.5
Total | | | | | 1,968.5



12B. Estimated Annualized Burden Costs

The total respondent costs are estimated to be $44,001.69. Table 12.B provides details about how this estimate was calculated. The calculations for small group discussion and intervention participants are based on a mean hourly wage rate of approximately $22.00, as determined by the Bureau of Labor Statistics (May 2012 National Occupational Employment and Wage Estimates United States, 2012). The hourly wage rate for providers was calculated as an average of the mean hourly wages of family and general practitioners, physician assistants, registered nurses, and nurse practitioners because the composition of the provider sample is unknown at this time.
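
The total is the product of the burden hours and the hourly wage rate for each respondent type, summed across the rows of Table 12.B:

\[
(96 \times \$22.01) + (1{,}850 \times \$22.01) + (22.5 \times \$52.01) = \$2{,}112.96 + \$40{,}718.50 + \$1{,}170.23 = \$44{,}001.69.
\]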

Table 12.B Estimated Annualized Burden Costs

Type of Respondent | Total Burden Hours | Hourly Wage Rate | Total Respondent Costs
Small group discussion participants | 96 | $22.01 | $2,112.96
Intervention participants | 1,850 | $22.01 | $40,718.50
Providers | 22.5 | $52.01 | $1,170.23
Total | | | $44,001.69



  13. Estimates of other Total Annual Cost Burden to Respondents or Recordkeepers/Capital Costs

There are no other costs to respondents or record keepers.

  14. Annualized Cost to Federal Government

Given the expected period of performance, the annual cost to the Federal Government is estimated to be $604,022 (see Table A.14). This estimate includes HRSA oversight of the contract and the costs estimated by the contractor, RTI, which will carry out the data collection activities; the contractor costs cover coordination with HRSA, data collection, analysis, and reporting.


Table A.14 Government Costs

Item/Activity | Details | Average Annual Cost
HRSA oversight of contractor and project | Technical Point of Contact: 0.025 FTE; Contractual Point of Contact: 0.025 FTE | $5,900
RTI recruitment and data collection | 6,973 labor hours plus ODCs for recruitment and data collection | $479,349
RTI analysis and reporting | 1,529 labor hours plus ODCs | $118,773
Total | | $604,022

FTE = full-time equivalent; ODC = other direct cost

  15. Explanation for Program Changes or Adjustments

This is a new data collection.

  16. Plans for Tabulation, Publication, and Project Time Schedule

16.1 Plans for Tabulation

Qualitative data from the small group discussions and interviews with cases and providers will be entered into an electronic data matrix by the RTI note taker during the data collection and stored on a password-protected computer. RTI will conduct thematic or grounded theory analysis of the data to understand participants’ reactions to the sample messages or experiences with and impressions of the overall intervention in a rigorous and detailed manner. RTI and HRSA will review the preliminary data via a debriefing conference call within 1 week after each data collection is completed. RTI project staff will further analyze the data in the matrices and summarize results in separate reports for each type of data collection (small group discussions, interviews with case participants, and interviews with providers).

Quantitative data from the Web-based surveys will be analyzed in two phases: (1) Preliminary analyses of simple pre–post comparisons between participants in the case and control groups on primary outcome variables and (2) multivariable analyses of the association between the case condition and changes in outcome variables between the baseline and follow-up assessments. The first phase of data analysis will include basic summary statistics for the purposes of describing the sample, determining whether participants randomized to case/control (i.e., study) conditions differ significantly on pretest measures, and examining the distribution of the primary outcome variables. We will also compute means for continuous, normally distributed variables of interest and frequencies for categorical variables of interest, both for the entire sample and separately for each study condition. Statistical tests, such as chi-square tests and t-tests, will be conducted to evaluate preliminary differences by study condition, and any variables found to differ significantly between conditions will be evaluated as potential covariates for the analysis of primary outcome variables. In addition, the distributions of primary outcome variables will be examined to determine whether the distributional assumptions of planned analytic procedures are met.
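
As an illustration of the first-phase checks described above, the sketch below compares cases and controls on one continuous and one categorical baseline measure using standard tests. It is a hypothetical Python example with made-up variable names; it is not the project’s analysis code (which is planned using SAS procedures, as noted below).

    # Hypothetical first-phase checks: baseline equivalence of cases vs. controls
    # using a t-test (continuous measure) and a chi-square test (categorical).
    import pandas as pd
    from scipy import stats

    def baseline_comparisons(df):
        cases = df[df["condition"] == "case"]
        controls = df[df["condition"] == "control"]
        t_stat, t_p = stats.ttest_ind(cases["adherence_score"],
                                      controls["adherence_score"], equal_var=False)
        chi2, chi_p, _, _ = stats.chi2_contingency(
            pd.crosstab(df["condition"], df["current_smoker"]))
        return {"adherence_t_p": t_p, "smoker_chi2_p": chi_p}

    # Example with toy data
    toy = pd.DataFrame({
        "condition": ["case", "case", "case", "control", "control", "control"],
        "adherence_score": [70, 80, 75, 72, 78, 74],
        "current_smoker": [1, 0, 1, 0, 1, 0],
    })
    print(baseline_comparisons(toy))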

Once preliminary analyses in the first phase are complete, we will begin to develop preliminary models that assess the association between the case condition and downstream mediators and outcomes. These models will include comparisons of outcomes between study conditions at each individual time point and on repeated measurements over time (merged baseline and follow-up data). Cross-sectional comparisons will be conducted using a combination of linear and logistic regression models, and comparisons of changes from baseline to follow-up will be conducted using multilevel models to account for the lack of independence across repeated measurements. For example, our hypothesis that exposure to ART adherence messages will increase case participants’ ART adherence will be tested in a multilevel regression model in which a measure of ART adherence is specified as the dependent variable, study condition is specified as the primary independent variable, and time is accounted for as a repeated measure. These models will also include covariates for a number of background characteristics, including variables to control for pretest differences between study groups. The time-by-study-condition interaction will be tested to determine whether there is a difference in change in adherence between participants in the different study conditions. The overall goal of these models is to determine the extent to which changes in the outcomes of interest differ by study condition.
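
One possible way to write out the repeated-measures model described above (illustrative notation only, not a final model specification) is:

\[
Y_{it} = \beta_0 + \beta_1\,\text{Condition}_i + \beta_2\,\text{Time}_t + \beta_3\,(\text{Condition}_i \times \text{Time}_t) + \boldsymbol{\gamma}'\mathbf{X}_i + u_i + \varepsilon_{it},
\]

where \(Y_{it}\) is the ART adherence measure for participant \(i\) at time \(t\), \(\text{Condition}_i\) indicates assignment to the case condition, \(\mathbf{X}_i\) is a vector of baseline covariates, \(u_i\) is a participant-level random effect that accounts for the repeated measurements, and \(\varepsilon_{it}\) is residual error. The coefficient of primary interest is \(\beta_3\), the time-by-condition interaction.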

Our models will primarily be conducted among participants who complete all four surveys. However, prior to estimation of our models, we will analyze patterns of attrition among all sample participants in order to identify factors that make some respondents more likely to complete all of the surveys. As with any multi-wave cohort survey, it is expected that a certain percentage of participants who complete the baseline survey may not complete any of the three follow-up surveys. This may be particularly true for control participants as they will not be exposed to the intervention that is in part designed to improve engagement in care; thus, it is possible that control participants may miss scheduled appointments coinciding with survey administration more often than case participants. Once data collection is complete, we will analyze patterns in attrition and identify, through multivariable analyses, baseline factors (such as sample source) that are most predictive of future attrition. Once these variables are identified, they will be included in our primary analysis models as a way to control for self-selection into the cohort of respondents that complete all surveys. In addition, the impact of missing data will be minimized through the use of appropriate likelihood-based estimators incorporated in the multilevel modeling techniques we will utilize (e.g., SAS PROC MIXED and GLIMMIX). These methods yield unbiased estimates and accurate standard errors without sacrificing cases (and thus maximizing statistical power) when data are missing completely at random or predicted by other variables in a given model but independent of the potential values of the outcome itself (i.e., missing at random). If participants do not appear to be missing at random, we will use pattern mixture models to account for the informative attrition.

Table shells for planned quantitative analyses are in Appendix P.

16.2 Publication Plans

The qualitative and quantitative data gathered for this study will be summarized in reports prepared for HRSA by RTI. HRSA may publish these reports on the Internet. It is also possible that data from this study will be published in peer-reviewed manuscripts or presented at conferences; the manuscripts and conference presentations may appear on the Internet. Specific plans for peer-reviewed publications and conference presentations have not yet been developed.

16.3 Timeline

Clearance is requested for a period of 2 years. The project’s time schedule is shown in Table A.16.3.

Table A.16.3 Project Time Schedule

Activity | Timing
Conduct discussion groups | Upon IRB approval
Baseline data collection | 3 months after OMB approval
Begin delivery of text messages to case participants | 3 months after OMB approval
First follow-up data collection | 6 months after OMB approval
Second follow-up data collection | 9 months after OMB approval
Third follow-up data collection | 12 months after OMB approval
Final follow-up data collection | 15 months after OMB approval
In-depth interviews | 16 months after OMB approval
Provider in-depth interviews | 16 months after OMB approval
Quantitative data analysis | 18 months after OMB approval
Qualitative data analysis | 19 months after OMB approval
Submit final report | 22 months after OMB approval
Submit at least one manuscript | 24 months after OMB approval



  17. Reason(s) Display of OMB Expiration Date is Inappropriate

We do not seek approval to eliminate the expiration date.

  18. Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification.

