

National HIV Behavioral Surveillance System (NHBS)

OMB NO. 0920-0770






Supporting Statement B





Revision












July 25, 2022







Project Officer:

Dr. Cyprian Wejnert

Behavioral Scientist, Behavioral Surveillance Team

National Center for HIV, Hepatitis, STD, and TB Prevention

Coordinating Center for Infectious Diseases

Centers for Disease Control and Prevention

1600 Clifton Rd, NE, MS US8-4

Atlanta, Georgia 30329

Phone: (404) 639-6055

Fax: (404) 639-8640

E-mail: [email protected]

TABLE OF CONTENTS


Section B Justification


  1. Respondent Universe and Sampling Methods


  2. Procedures for the Collection of Information


  3. Methods to Maximize Response Rates and Deal with Non-response


  4. Tests of Procedures or Methods to be Undertaken


  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data



Table B1 Overview of NHBS Information Collection, by Participating Project Area (MSA), Round, and Cycle, 2003-2025


Table B2 Expected Response Rates and Sample Size, NHBS




  1. Respondent Universe and Sampling Methods


The size of the respondent universe for the National HIV Behavioral Surveillance System (NHBS) is unknown. Men who have sex with men (MSM) are estimated to be 4% of the U.S. population (Grey et al., 2016), but the number of MSM at risk of HIV is unknown, as are the sizes of the populations of persons who inject drugs and of heterosexually active persons at risk of HIV infection. Thus, it is not possible to create sampling frames for these populations.


NHBS is conducted in the metropolitan statistical areas (MSAs) with the greatest burden of HIV, since the HIV/AIDS epidemic in the United States primarily affects urban areas. NHBS project areas comprise the state and local health departments that reported the highest numbers of HIV infections diagnosed during the 3-year period 2017-2019 to CDC, with eligibility limited to one MSA or Division per health department jurisdiction. A total of 20 health departments will participate in NHBS Round 7.


Individuals chosen for inclusion in NHBS are those in populations that have the largest potential contribution to the spread of HIV: men who have sex with men (MSM), persons who inject drugs (PWID), and heterosexually active persons (HET) at increased risk of infection. NHBS is specifically designed to characterize individuals in these risk groups attending specific physical venues (e.g., bar, clubs, gatherings) or using online venues (e.g., social network or dating apps) or recruited by their peers, who agree to participate in an interview regarding HIV testing and risk behaviors, and who meet appropriate eligibility criteria. The project is not intended to yield representative data about any group except those who meet the above description (i.e., high-risk persons who meet eligibility criteria and who attend specific venues or are willing to be recruited by their peers).


In addition to basic eligibility criteria, such as living in the MSA and being 18 years of age or older, the sample of persons selected to participate in this project will vary each year depending on the specific population under investigation. During the MSM cycle, the sample will comprise men who report having sex with a man and who either attend physical or online venues in which many attendees are men who have sex with other men or were recruited to participate by a peer. In the PWID cycle, the sample will comprise men and women who have injected drugs in the past year and who were recruited to participate by a peer who injects drugs. In the HET cycle, the sample will comprise men and women who have had sex with a person of the opposite gender in the previous 12 months, are not older than 60 years, and were recruited to participate by a peer. Eligible persons will be recruited as described below in the section entitled 'Selection of Respondents'.


Staff in health departments participating in NHBS will recruit until they meet their yearly quota of 500 participants meeting the inclusion criteria listed below. Through an informed consent process, selected persons will be asked to participate in an interview. After completing the interview, respondents will be offered a free HIV test.



Respondent eligibility criteria


Participant Inclusion criteria



To be eligible, potential participants in all cycles (MSM, HET, PWID) must:


  • Be 18 years of age or older;

  • Be able to speak and understand either English or Spanish;

  • Be a resident of the Metropolitan Statistical Area;

  • Have the capacity to provide informed consent for participation;

  • Have not previously participated in the current NHBS cycle




Participants in each cycle must meet additional inclusion criteria:

  • MSM: Has had sex with another man in his lifetime

  • PWID: Has injected drugs in the past 12 months

  • HET: Has had sex with an opposite-sex partner in the past 12 months; is not older than 60 years




Participant Exclusion criteria



Participants in all cycles will be ineligible for participation if they:

  • Are younger than 18 years of age;

  • Do not reside in a selected Metropolitan Statistical Area;

  • Are unable to speak or understand English or Spanish;

  • Do not have the capacity to provide informed consent for participation;

  • Have previously participated in the current NHBS cycle




Participants in each cycle are excluded if they:


  • MSM: Did not have sex with another man in their lifetime

  • PWID: Did not inject drugs in the past 12 months

  • HET: Did not have sex with an opposite-sex partner in the past 12 months; are older than 60 years




To be effective, the operational definitions of the target populations for each cycle must identify persons at high risk of HIV infection. The definitions for MSM and PWID are based on behavioral criteria alone because HIV prevalence among these groups is high and thus anyone engaging in the relevant sexual or drug use behaviors is presumed to be at risk.


In contrast, HIV prevalence among heterosexually active persons in the U.S. is low, and therefore, an operational definition for the HET cycle based on “sexual contact with an opposite-sex partner” is not specific enough to identify heterosexually active persons at risk for HIV infection. In order to develop an operational definition for use in the NHBS HET cycle, CDC staff reviewed the literature, analyzed available data from the Supplement to HIV/AIDS Surveillance (SHAS) project (OMB 0920-0262, exp. 06/30/2004), and held a consultation with scientists in academia and public health. These efforts led to an operational definition that combines behavioral and other criteria. To be interviewed for the HET cycle, one must report sexual contact with an opposite sex partner during the past 12 months. Moreover, because HIV risk is highest in younger age groups, individuals who are older than 60 years of age are excluded. Based on previous analysis of data from NHBS-HET, a further criterion of low income (having a household income at or below 150% of the HHS poverty guidelines adjusted for geographic differences in the cost of living) is applied to determine the population of “heterosexually active persons at increased risk.” Only participants who meet these additional criteria are invited to recruit peers for NHBS-HET.
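
As an illustration only, the additional HET criteria described above amount to a simple rule: sexual contact with an opposite-sex partner in the past 12 months, age 18 to 60 years, and household income at or below 150% of the HHS poverty guidelines. The sketch below encodes that rule; the function and field names are hypothetical and do not correspond to the NHBS instrument, and the guideline lookup stands in for the household-size-specific, cost-of-living-adjusted HHS figures.

```python
# Minimal sketch (hypothetical field names, not the NHBS instrument) of the HET
# operational definition described above: opposite-sex partner in the past 12 months,
# age 18-60, and household income at or below 150% of the HHS poverty guideline.

def meets_het_increased_risk_definition(
    opposite_sex_partner_past_12m: bool,
    age_years: int,
    household_income: float,
    household_size: int,
    poverty_guideline_by_size: dict,  # HHS guideline by household size, adjusted for local cost of living
) -> bool:
    low_income = household_income <= 1.5 * poverty_guideline_by_size[household_size]
    return opposite_sex_partner_past_12m and 18 <= age_years <= 60 and low_income
```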




Selection of Respondents


The methods for NHBS were chosen based on multiple consultations with sampling methodologists, those with expertise conducting research or behavioral surveillance activities with the three populations of interest, and public health practitioners who provide services to these populations, as described in Section A8. The selection of appropriate methods to recruit representative samples of participants is complicated by the fact that population-based samples of these groups – which are marginalized, hidden, or otherwise stigmatized due to the illegal or illicit behavior of their members - are not feasible as they cannot be easily identified as members of these populations or enumerated for sampling purposes. Several guiding principles determined the selection of methods to conduct behavioral assessments with the three populations. These principles included the selection of methods that would 1) result in the most representative sample possible of each population, 2) be feasible for implementation in the heterogeneous areas to be included in NHBS, and 3) allow for standardized recruitment of the targeted number of respondents during each cycle.


NHBS uses two convenience sampling methods: venue-based, time-space sampling and respondent-driven sampling (RDS). These are methods with demonstrated ability to recruit the respective populations (Abdul-Quader et al., 2006; Diaz et al., 2001; Heckathorn, Semaan et al., 2002; Magnani et al., 2005; MacKellar et al., 1996; Mansergh et al., 2006; McFarland and Caceres, 2001; Muhib et al., 2001; Ramirez-Valles et al., 2005; Semaan et al., 2002; Valleroy et al., 2000; Wang et al., 2005).


Venue-based, time-space sampling


Venue-based, time-space sampling activities can be grouped into three components. Each component is described in more detail below. Briefly, activities in the first component include identifying the venues (or “spaces”) and times to recruit MSM. Venues are assessed by local project staff for the number of MSM in attendance at different times, logistics and feasibility of recruiting and conducting the data collection activities, and safety. Activities in the second component include constructing monthly sampling frames of accessible venues and specific day/time periods during which each venue has at least 8 MSM in attendance during an average 4-hour period. From the monthly sampling frames, project staff members randomly select a set of venues and day-time periods in two stages and schedule data collection at those venues on those days at those times on monthly calendars. Activities in the third component include conducting screening for eligibility, recruitment, data collection and HIV testing as scheduled on the monthly calendar. Activities in the third component are described fully in the section 2 below entitled, “Procedures for the Collection of Information.”


Venues eligible for consideration for the MSM cycle of NHBS are defined as public or private locations that are attended by MSM for purposes other than receiving medical care, mental healthcare, social services, or HIV/STD diagnostic testing or prevention services. Venues eligible for consideration may be physical or online and include bars, dance clubs, retail businesses, cafes and restaurants, health clubs, social and religious organizations or groups, adult bookstores and bathhouses, high-traffic street locations, parks, beaches, and special events such as gay pride festivals, raves, circuit parties, and social or dating applications. All eligible venues are assessed for accessibility of NHBS operations. Only accessible venues are included on the sampling frame. As a general principle, in order to reach sample size goals, venues included on the sampling frame are expected to yield a minimum of 8 MSM in attendance during an average 4-hour sampling event. Some venues are excluded from sampling frames due to low MSM attendance, lack of safety, or disapproval by owners or managers. The approval of venue owners or managers will be necessary to proceed with data collection in many entertainment, commercial, and online venues that are included in sampling frames. For each accessible venue, specific day-time periods are identified as being well-attended by MSM. Venue-specific-day-time periods may occur once or twice per month (e.g., a social organization that meets only once per month) or daily (e.g., a busy street corner in a gay neighborhood). Whereas the majority of the venues on sampling frames will be identified before the start of data collection, local staff members are expected to identify new venues during the data collection period and likewise to keep track of those that no longer serve MSM. New venues must be considered for inclusion in the monthly sampling frames and venues that have closed must be excluded from the sampling frames. An updated sampling frame is created each month, which includes all venues identified by the staff to be currently operating within the selected MSA. This ensures the sampling frame is as accurate as possible.


After the initial universe of venues and associated day-time periods are identified, sampling frames are constructed. Each project area constructs two sampling frames. The first frame is the venue frame. The second sampling frame is the list of venue-day-time periods (or “VDTs”) for each venue listed in the venue frame; this frame is called the “VDT frame.” On a monthly basis, venues and day-time periods are randomly selected from their respective frames and scheduled for sampling on a calendar for the upcoming month. The sampling plan is designed to optimize representation of MSM from different venues. Thus, venues are given an equal probability of selection each month and sampling is conducted without replacement, using a VDT software program described in Section A. Random selection of venue-day time periods may not be feasible or practical in some or all cities during or immediately following states of emergency (e.g. natural disasters, pandemics, etc.). In such rare situations, project areas may select venues and times that prioritize participant and staff safety.
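
The two-stage monthly selection described above can be illustrated with a brief sketch: venues are drawn from the venue frame with equal probability and without replacement, and an eligible day-time period is then drawn for each selected venue. This is a simplified analogue written for illustration, not the NHBS VDT software; the function and argument names are hypothetical.

```python
# Illustrative analogue (not the NHBS VDT application) of the monthly two-stage selection
# described above: venues are drawn with equal probability and without replacement, then
# one eligible day-time period is drawn for each selected venue.
import random

def build_monthly_calendar(venue_frame, vdt_frame, n_events, seed=None):
    """venue_frame: list of venue IDs; vdt_frame: dict mapping venue ID -> list of (day, time) periods."""
    rng = random.Random(seed)
    selected_venues = rng.sample(venue_frame, k=min(n_events, len(venue_frame)))  # stage 1: venues
    calendar = []
    for venue in selected_venues:
        day, time_period = rng.choice(vdt_frame[venue])                           # stage 2: day-time period
        calendar.append((venue, day, time_period))
    return calendar

# Example with hypothetical venues:
frame = ["bar_A", "club_B", "street_corner_C"]
vdts = {"bar_A": [("Fri", "8pm-12am")], "club_B": [("Sat", "10pm-2am")], "street_corner_C": [("Sun", "2pm-6pm")]}
print(build_monthly_calendar(frame, vdts, n_events=2, seed=1))
```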


Recruitment of men for the interview occurs at the randomly selected venues during the randomly selected day-time periods according to the monthly sampling calendar. During these events, field staff members perform three main duties− counting venue attendees, recruiting participants, and conducting interviews.


During recruitment, the field supervisor obtains a count of all men who appear to be ≥ 18 years of age who attend a physical venue or are accessible online for an online venue. Those individuals who have been counted form the pool of persons eligible for recruitment into NHBS.


At a physical venue, the field supervisor directs a staff member to approach sampled men and recruit them for participation in NHBS. At an online venue, a staff member contacts sampled men systematically, such as every fifth man on a social network app and recruits them for participation in NHBS; or sampled men contact project staff because of an online ad, post, group message, profile, or peer-referral and are then recruited for participation in NHBS. Project staff will recruit using a script similar to the following: “Hi, my name is (name) and I work for (organization). We are conducting an important health survey and I would like to ask you just a few quick questions.” If the man accepts the contact, the staff member will then let him know that he must complete an eligibility screener to determine if he is eligible to participate, and that not all selected men will be eligible. If the prospective participant agrees, the staff member will assign him to an interviewer to assess his eligibility for participation using the eligibility screener, described in section 2, “Procedures for the Collection of Information” below (Attachment 3a). Eligibility screening may occur at that time, or it may be scheduled for a later date if the prospective participant prefers. Men may be approached for recruitment publicly, but eligibility screening will always occur privately.



Respondent-Driven Sampling

Respondent-driven sampling (RDS) is used to recruit participants. RDS is a chain-referral sampling strategy similar to snowball sampling. It starts with a limited number of "seeds" who are chosen by referrals from people who know the local population well, or through outreach to areas where the population can be found. Seeds complete the study activities (eligibility screening, behavioral assessment, HIV test) and then are asked to recruit a specified number of persons (usually between 3 and 5) whom they know and who are PWID (for the PWID cycle), MSM (for the MSM cycle), or HET (for the HET cycle). Seeds who agree to recruit their peers are given 3 to 5 non-replicable coupons (Attachment 14). Coupons may be physical coupons or provided digitally. The code on each coupon is linked to 1) the Survey ID of the participant the coupon is issued to (i.e., the recruiter) and 2) the Survey ID of the participant returning the coupon (i.e., the recruit). The coupon information is entered and stored in the Coupon Manager application. These persons, in turn, come to the study field office with a valid coupon, complete the behavioral assessment, receive an HIV test if they consent, and are asked to recruit others. This recruitment process continues until the sample size has been reached. Participants receive incentives for participating, as well as for recruiting others. Starting with a small number of seeds, limiting the number of individuals each participant can recruit, and allowing a significant number of recruitment "waves" to occur (a "wave" refers to each additional generation of recruits stemming from a seed) is expected to yield a final sample whose distribution resembles the underlying eligible population living in the project area and is unbiased by the characteristics of the seeds (Heckathorn, 1997; Heckathorn, 2002).
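
The coupon bookkeeping described above, in which each non-replicable coupon code is tied to the recruiter's Survey ID when issued and to the recruit's Survey ID when redeemed, can be sketched as follows. This is a hypothetical illustration, not the Coupon Manager application; the class and method names are invented for the example.

```python
# Hypothetical sketch of the coupon linkage described above (not the Coupon Manager
# application): each non-replicable coupon code is tied to the recruiter's Survey ID when
# issued and to the recruit's Survey ID when the coupon is returned, so recruitment chains
# can be reconstructed without collecting names.
import secrets

class CouponTracker:
    def __init__(self):
        self.coupons = {}  # coupon code -> {"recruiter_id": ..., "recruit_id": ...}

    def issue(self, recruiter_survey_id, n_coupons=5):
        """Issue up to 5 coupon codes to a recruiter and record who received them."""
        codes = []
        for _ in range(n_coupons):
            code = secrets.token_hex(4)  # stand-in for a printed or digital non-replicable code
            self.coupons[code] = {"recruiter_id": recruiter_survey_id, "recruit_id": None}
            codes.append(code)
        return codes

    def redeem(self, code, recruit_survey_id):
        """Validate a returned coupon; reject unknown or already-redeemed codes."""
        record = self.coupons.get(code)
        if record is None or record["recruit_id"] is not None:
            return False
        record["recruit_id"] = recruit_survey_id
        return True
```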



Among eligible participants in the NHBS-HET cycle, only those who are of low income (i.e., having a household income at or below 150% of the HHS poverty guidelines), have not injected drugs without a prescription in the past 12 months, and if male, have not had male sex partners in the past 12 months, will be asked to recruit peers for participation. This will ensure that recruitment continues within populations most at risk for heterosexual transmission of HIV infection.




Sample size


NHBS project areas are health departments in U.S. Metropolitan Statistical Areas (MSA) with the highest prevalence of HIV. The current number of participating project areas (MSA) is 20.


On an annual basis each project area recruits and interviews a minimum of 500 eligible persons from the relevant high-risk group. During the period of this Revision request (2023-2025), the total number of respondents per year is estimated to be 10,000 for all project areas (20 MSA * 500 respondents per MSA = 10,000 respondents).


Although NHBS methods do not use probability sampling, sample size goals are based on an assumption that every element has a known nonzero probability of being sampled. If we assume NHBS to be a probability sample, then the sample size of 500 participants per project area allows local areas to estimate a proportion of 50% with precision of roughly ±5 percentage points for outcomes of interest – for example, the proportion of eligible participants who engage in unprotected sex, share needles/syringes, or have never been tested for HIV. Under the same assumption, the larger national sample of 10,000 respondents per cycle should provide adequate power and precision to evaluate most behaviors of interest overall and by the major demographic variables shown below.
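
The precision statement above can be checked with a back-of-envelope calculation that treats the project-area sample as if it were a simple random sample (an assumption, since NHBS uses non-probability methods): the half-width of a 95% confidence interval for a proportion of 50% with n = 500 is about 4.4 percentage points, roughly the ±5% cited.

```python
# Back-of-envelope check of the precision statement above, treating the sample as if it
# were a simple random sample (an assumption; NHBS uses non-probability sampling).
import math

def ci_half_width(p: float, n: int, z: float = 1.96) -> float:
    """95% confidence-interval half-width for an estimated proportion p from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(ci_half_width(0.50, 500), 3))     # ~0.044, i.e., roughly +/-5 percentage points per area
print(round(ci_half_width(0.50, 10_000), 3))  # ~0.010 for the combined national sample
```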


Table B1 provides a summary of NHBS information collection from 2003-2022, with projections for 2023-2025. Of note: the MSM samples are likely to have a higher proportion of white participants than PWID or HET; the PWID samples are likely to be older than MSM or HET samples, and predominantly male; the HET samples are likely to have high proportions of Blacks and Hispanics due to the inclusion of poverty as a factor in determining where to sample. These expectations are based on results to date.


Table B1. Overview of NHBS Information Collection, by Participating Project Area (Funded Health Department), Round, and Cycle, 2003-2025


Round 1

Round 2

Round 3

Round 4

Round 5

Round 6

Round 7*


2003-2007

2008-2010

2011-2013

2014-2016

2017-2019

2020-2022

2023-2025


MSM1

IDU1

HET1

MSM2

IDU2

HET2

MSM3

IDU3

HET3

MSM4

IDU4

HET4

MSM5

IDU5

HET5

MSM6

MSM6

PWID6

MSM7

PWID7

HET7

Atlanta (Georgia Dept. of Human Resources)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

Baltimore (Maryland Dept. of Health and Mental Hygiene)

x

x

x

x

x

x

x

x

x

x

x

 

x

x

x

x

Boston (Massachusetts Dept. of Public Health)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x





Chicago (Chicago Dept. of Public Health)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

Dallas (Texas Dept. of Health)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x





Denver (Colorado Dept. of Public Health)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

Detroit (Michigan Dept. of Community Health)

 

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

Fort Lauderdale (Florida Dept. of Health)

x

x

x

 

 

 

 

 

 

 

 

 

 

 

 

 

 





Houston (Houston Dept. of Health and Human Services)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

Indianapolis (Indiana State Dept. of Health)


















x

x

x

x

Las Vegas (Nevada Dept. of Health)

 

x

x

 

 

 

 

 

 

 

 

 

 

 

 

 

 





Los Angeles (Los Angeles County Health Dept.)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

Memphis (Tennessee Dept. of Health)

 

 

 

 

 

 

 

 

 

 

 

x

x

x

x

x

x

x

x

x

x

Miami (Florida Dept. of Health)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x





Nassau (New York State Dept. of Health)

 

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x





New Haven (Connecticut Dept. of Public Health)

 

x

x

 

 

 

 

 

 

 

 

 

 

 

 

 

 





New Orleans (Louisiana Dept. of Human Services)

 

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

New York City (NYC Dept. of Health and Mental Hygiene)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

Newark (New Jersey Dept. of Health and Senior Services)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

Norfolk (Virginia Dept. of Health)

 

x

x

 

 

 

 

 

 

 

 

x

x

x

x

x

x

x

x

x

x

Philadelphia (Philadelphia Dept. of Public Health)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

Portland (Oregon Health Authority)

 

 

 

 

 

 

 

 

 

 

 

x

x

x

x

x

x

x

x

x

x

San Diego (California Dept. of Health Services)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

San Francisco (San Francisco Dept. of Public Health)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

San Juan (Puerto Rico Health Dept.)

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

Seattle (Washington Dept. of Health)

 

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

St. Louis (Missouri Dept. of Health and Senior Services)

 

x

x

x

x

x

 

 

 

 

 

 

 

 

 

 

 





Washington, DC (DC Dept. of Health)

x

 

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

x

Total project areas

17

24

25

21

21

21

20

20

20

20

20

22

23

23

23

23

23

20

20

20

20

* A total of 20 project areas with the highest HIV prevalence are funded for NHBS Round 7 through cooperative agreements starting in fiscal year 2022.


Expected response rates

Response rates for venue-based, time-space sampling are largely dependent on how many people accept being approached for recruitment and meet the eligibility criteria; among those who do accept and are found eligible, participation rates are expected to be high (Diaz et al., 2001; Muhib et al., 2001; Valleroy et al., 2000). The response rate for the MSM cycle using venue-based, time-space sampling is expected to be approximately 70%, based on results from NHBS to date (NHBS, OMB # 0920-0770, exp. 1/31/2023). A benefit of the peer-driven sampling conducted in RDS (Heckathorn, 2002; Johnston et al., 2006; Ramirez-Valles et al., 2005; Stormer et al., 2006; Wang et al., 2004; Yeka et al., 2006) is that recruiters are told, generally speaking, what the eligibility criteria are in order that they can recruit eligible participants. For this reason, response rates for the PWID and HET cycles using RDS are expected to be higher than for venue-based sampling, approximately 90%. Results from NHBS to date support this expected response rate (OMB# 0920-0770, exp. 1/31/2023). Further details and calculations are provided in Table B2 below:

Table B2: Expected Response Rates and Sample Size, NHBS*

                        MSM Cycle                PWID Cycle                           HET Cycle
                        Screened   Participants  Screened   Participants  Recruiters  Screened   Participants  Recruiters
TOTAL                   14,000     10,000        11,000     10,000        5,000       11,000     10,000        5,000

Hispanic                3,500      2,950         2,200      2,000         1,000       2,500      2,100         1,400
Black                   4,620      3,350         3,700      3,500         2,500       5,200      4,900         1,800
White                   4,304      3,700         4,500      4,000         1,250       2,000      1,800         1,200
Other                   1,576      1,400         600        500           250         1,300      1,200         600

Male                    14,000     10,000        7,800      7,000         3,550       5,500      5,000         2,500
Female                  0          0             3,200      3,000         1,450       5,500      5,000         2,500

18–34 years of age      8,000      5,700         2,200      2,000         1,000       6,048      5,500         2,750
35 years and older      6,000      4,300         8,800      8,000         4,000       4,952      4,500         2,250



* Based on experience from NHBS, participation rates tend not to differ across race, age, and gender categories. Therefore, the expected numbers of participants by race, age, and gender have the same frequency distribution as the numbers screened by race, age, and gender.



2. Procedures for the Collection of Information



All eligibility screening and interviews will be conducted by trained project staff. Participation in the project is voluntary. Respondents may refuse to participate at all or in part. Respondents may refuse to answer questions or stop participation at any time without penalty. The approved Project Determination Form (Attachment 11) indicates that because NHBS is a routine disease surveillance activity the protocol will not be reviewed by CDC’s IRB. Each participating health department will be required to obtain approval for this project from their IRB as required by their local review and approval processes and federal regulations before data collection.


NHBS utilizes periodic data collection cycles of each at-risk population (MSM, PWID and HET) to reduce the burden on populations of interest and on the health department staff that conduct project activities. Thus, data collection for each risk group occurs once every three years.


For the MSM (venue-based, time-space sampling) cycle, each man approached will be invited to be screened for eligibility; the informed consent process will be initiated with eligible persons. During the consent process, each component of the project is described, and the person approached must indicate which component(s), if any, he agrees to participate in. These include: 1) participating in the NHBS behavioral assessment; 2) HIV testing; 3) other diagnostic testing (offered in some, but not all MSAs); and 4) storing leftover serum (offered in some, but not all MSAs). Informed consent will be obtained by having the interviewer read the consent script and indicating on the portable computer whether the person being recruited provided verbal consent. After consent is obtained, the behavioral assessment will be conducted; an HIV test will be performed for those who consent, after the behavioral assessment has been completed. Men approached may elect to participate in the behavioral assessment and not to participate in the HIV testing. Men who refuse the behavioral assessment will not be offered HIV testing.


During respondent driven sampling cycles, persons who receive a coupon (Attachment 14) to participate in NHBS will be asked to make an appointment to participate in the behavioral assessment; walk-in hours are usually available (determined locally). When a potential respondent comes to the field site or calls the project phone number, his coupon is assessed to ensure it is valid, using the Coupon Manager application described in Section A3. After the coupon is validated, the potential respondent is invited to be screened for eligibility; the informed consent process will be initiated with eligible persons. During the consent process, each component of the project is described, and the eligible person must indicate which component(s), if any, he/she agrees to participate in. These include: 1) participating in the NHBS behavioral assessment; 2) HIV testing; 3) other diagnostic testing (offered in some, but not all MSAs); and 4) storing leftover serum (offered in some, but not all MSAs). Informed consent will be obtained by having the interviewer read the consent script and indicating on the portable computer whether the person being recruited provided verbal consent. After consent is obtained, the behavioral assessment will be conducted; an HIV test will be performed for those who consent, after the behavioral assessment has been completed. Persons recruited may elect to participate in the behavioral assessment and not to participate in the HIV testing. Persons who refuse the behavioral assessment will not be offered HIV testing. Persons who present to the field staff at the office without a valid coupon will not be allowed to participate in the behavioral assessment.


After the NHBS behavioral assessment and HIV testing are completed, the interviewer asks the participant if he or she would be willing to recruit other participants, an activity for which a small incentive (approximately $10; see Section A) will be given. After a brief training on the recruitment process, those who agree to recruit their peers are given up to 5 coded, non-replicable coupons (Attachment 14). The participant is told to give one coupon to each of between 1 - 5 peers (determined locally) meeting the eligibility criteria (according to the model script in Attachment 15). Each coupon has the local NHBS project name and location(s) printed on it with a brief explanation of the project. The code on the coupon is linked to 1) the Survey ID of the participant the coupon is issued to (i.e., the recruiter) and 2) the Survey ID of the participant returning the coupon (i.e., the recruit). The coupon information is entered and stored in the Coupon Manager application. After receiving coupons and recruiter training, the participant is provided the incentive and given instructions about returning for a recruitment reward after distributing a coupon(s).


When a participant returns for his/her incentive, he will be asked questions to determine how many coupons were distributed, if anyone refused the coupons, the race or ethnicity of the persons refusing coupons, and the reasons for refusal (Attachment 3e, Recruiter Debriefing). This information will be stored in a password-protected database kept separate from but linked to the eligibility screener and behavioral assessment data by the survey ID. Race and ethnicity are commonly associated with many health outcomes in the U.S. Understanding if there are systematic patterns in coupon refusal provides information about potential bias and non-response in the sampling process.




General Procedures Applying to All Three NHBS Cycles


Mechanisms for returning HIV results to participants are determined locally; follow-up appointments are set before the participant leaves the field site or field office location.


Persons who consent to participate in the behavioral assessment will be administered a structured questionnaire (Attachment 3b-d). The questionnaire collects self-reported demographics, sexual behavior, drug use, HIV testing history, sexually transmitted infection diagnosis, and exposure and access to HIV prevention services from all respondents. The interview instrument will be programmed and will be administered using portable computers either in-person or remotely through secure videoconference or by phone.


The portable computers for data collection and laptop computers for use with Coupon Manager and for data storage after each recruitment event will be password protected and the data on them will be encrypted using standard, 128-bit encryption software. No personal identifiers will be collected or included with responses to the behavioral assessment. The behavioral assessment is expected to take approximately 24 minutes for the MSM cycle, 43 minutes for PWID and 31 minutes for HET (excluding eligibility screening).


Respondents will receive HIV prevention materials after the behavioral assessment and referrals to local HIV prevention and care services, if requested.


Quality Control

Data quality is ensured by the use of computer-assisted interviewing, interviewer training and monitoring, site visits, and data editing. Computer-assisted interviewing improves data quality in several ways:

  1. Interviewer errors are reduced because interviewers do not have to follow complex routing instructions; the computer does the routing for them.

  2. Respondent errors are also reduced. Consistency checks are programmed into the questionnaire so that inconsistent answers or out-of-range values can be corrected or explained while the behavioral assessment is in progress (a simplified example of such a check appears after this list).

  3. Use of computer-assisted interviewing also reduces coding and coding errors, which makes it possible to prepare the data for analysis faster and with fewer errors.
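
The following sketch illustrates the kind of range and consistency checks referred to in item 2. It is a simplified, hypothetical example rather than logic from the NHBS instrument; the field names and rules are invented for illustration.

```python
# Simplified, hypothetical illustration of the range and consistency checks referred to in
# item 2; the field names and rules below are invented and are not from the NHBS instrument.

def check_response(record: dict) -> list:
    """Return a list of problems to resolve before the interview moves on."""
    issues = []
    age = record.get("age")
    if age is not None and not (18 <= age <= 110):
        issues.append("age out of plausible range")
    # Consistency rule: age at first injection cannot exceed current age.
    first_injection = record.get("age_first_injected")
    if first_injection is not None and age is not None and first_injection > age:
        issues.append("age at first injection exceeds current age")
    return issues

print(check_response({"age": 25, "age_first_injected": 30}))
# ['age at first injection exceeds current age']
```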


A multi-day interviewer training occurs before the start of each cycle’s data collection. This training covers general interviewing skills, sampling and recruitment protocols, and a question-by-question review of the questionnaire to ensure interviewers understand the purpose of each question and how it should be read and coded in the portable computer. Interviewers have opportunities to practice administering the questionnaire during the training. The training also addresses interviewer integrity, underscoring the importance of collecting quality data and the consequences of inappropriate behaviors, including falsification of data. Project area staff is also trained on how to conduct recruitment procedures, such as approaching men in venues (for the MSM cycle) and training participants to recruit their peers into the study (for all cycles).


During the data collection period, interviewers are monitored by the field supervisors or other management staff. Approximately 10% of each interviewer’s interviews are monitored. Feedback is provided for areas of improvement and in cases of incorrect implementation of the protocol. Monitoring of venue-based, time-space sampling and respondent-driven sampling also includes recruitment procedures. Supervisors provide feedback on ways to help improve response rates.


CDC conducts at least one site visit to each project area per cycle. The purpose of the site visit is to monitor adherence to the NHBS protocols, observe recruitment and behavioral assessment interviews, and obtain feedback on study procedures.


In addition to the automated checks provided through the computer-assisted interview program, editing of the data is performed by CDC following extensive checking of the quality of the data files. Monthly processing allows for identification of errors in the data sets (such as incorrect identification codes or incorrect coding of other critical data elements) or incorrect local data management procedures. CDC regularly convenes conference calls with the project areas and the CDC contractor to address any issues with the data collection application and discuss administration of the behavioral assessment specifically and the project in general.


NHBS behavioral assessment instruments will not collect specific identifiers (e.g., name, address, social security number). Data are collected electronically; no paper instruments are used to collect data for NHBS.





3. Methods to Maximize Response Rates and Minimize Non-response


Response Rate Calculations


Venue-based, time-space sampling

Response rates for venue-based, time-space sampling are dependent on how many people accept the approach. Among those who do accept and are found eligible, participation rates are expected to be high (Diaz et al., 2001; Muhib et al., 2001; Valleroy et al., 2000). Based on previous studies using venue-based, time-space sampling, we expect approximately 20% of men to refuse the approach. Among those who accept the approach, 10% are expected to be ineligible. We expect that approximately 15% of men, after learning what participation in the project entails, will refuse to participate. Some data loss from the portable computers may occur, but should not affect more than 1% of case records collected. Given these estimates, to reach the target number of respondents in each city, project area staff will need to approach 850 men and screen 680 of them; if 10% are ineligible and 15% of eligible refuse to participate, 520 respondents would be expected. Even with data loss of 1%, the target number of 500 respondents would be met.


Unweighted response rates are calculated as a ratio of the number of completed cases to the number of in-scope sample cases, based on guidance from OMB (“Standards and Guidelines for Statistical Surveys,” OMB, September 2006, section 3.2.2). For NHBS-MSM, the calculations based on 500 completed surveys and using the estimated outcomes noted above result in an unweighted response rate of 67% (see Attachment 16 for calculations).
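
For illustration, the figures above (850 approached, 20% refusing the approach, 10% ineligible among those screened, 15% of eligible persons refusing, and up to 1% data loss) can be combined into a simple funnel that also computes a completed-over-in-scope ratio in the spirit of the OMB guidance, applying the screened eligibility rate to unscreened refusals. The authoritative calculation is in Attachment 16; this sketch only shows how a figure near 67% arises from the stated assumptions.

```python
# Illustrative reconstruction of the venue-based recruitment funnel using the planning
# figures above (850 approached, 20% approach refusal, 10% ineligible among those screened,
# 15% refusal among eligible men, ~1% data loss). The completed-over-in-scope ratio applies
# the screened eligibility rate to unscreened refusals, in the spirit of the OMB guidance;
# the authoritative calculation is in Attachment 16.

approached = 850
accept_rate, eligible_rate, participation_rate, data_loss = 0.80, 0.90, 0.85, 0.01

screened = approached * accept_rate                              # 680 accept and are screened
eligible = screened * eligible_rate                              # ~612 eligible
completed = eligible * participation_rate * (1 - data_loss)      # ~515 usable interviews
in_scope = eligible + (approached - screened) * eligible_rate    # eligible + estimated eligible refusals

print(round(completed), round(completed / in_scope, 2))          # ~515 completed, response rate ~0.67
```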


Respondent-driven sampling

Based on previous studies using RDS, one-half to two-thirds of persons recruited by their peers for NHBS are expected to present for eligibility screening (Heckathorn, 2002; Johnston et al., 2006; Ramirez-Valles et al., 2005; Stormer et al., 2006; Wang et al., 2005; Yeka et al., 2006). Because recruiters are instructed to invite participation of their peers who meet the general eligibility criteria, it is expected that at least 90% of those presenting at the field site for eligibility screening will be eligible (Ramirez-Valles et al., 2005). In addition, response rates among those found eligible are generally high because those who have taken the initiative to present for eligibility screening are motivated to participate. Generally, persons who are eligible and not interested in participating in the behavioral assessment will not make the effort to come to the field office with the coupon.


Expected response rate calculations are presented in Attachment 16. These calculations were computed using the methods provided in the document "Standards and Guidelines for Statistical Surveys," OMB, September 2006. The response rate calculations were based on 500 completed surveys or cases (C) per MSA and on outcome estimates from previous RDS studies; they indicate that response rates will range from 68% to 76%, depending on the rate at which persons recruited by peers present for eligibility screening.


Given that the populations targeted by NHBS are hard to reach, either because their behaviors are illegal, not socially normative, or stigmatized, probability sampling methods cannot be used for NHBS. Expectations of response rates based on probability sampling, therefore, cannot be applied to NHBS. The peer-referral sampling methods used in NHBS were developed precisely to reach hard-to-reach populations for which a sampling frame does not exist, and the expected response rates for NHBS are within the range of those achieved in other studies using these non-probability sampling methods (MacKellar et al., 1996; Thiede et al., 2001). Bias in the samples can be evaluated by measuring the extent to which sub-populations in each sample recruited within their own group – for example, the extent to which women recruited other women more frequently than they recruited men, and vice versa. Such calculations are possible via the coupon management system, which tracks who recruited whom, as well as information gathered during the interview process on the size and composition of participants' social networks. Despite the limitations, the expected response rates for NHBS are anticipated to be adequate for the purposes of describing risk behaviors of persons at high risk of HIV and understanding the prevention efforts needed in the local communities.




Methods to maximize response rates


Response rates for NHBS may be adversely affected by the sensitive nature of the questions. However, NHBS methods also offer ways to maximize response rates, as described below. Monitoring of response rates will be done through conference calls on a weekly basis with each project area and monthly with all project areas together, offering the opportunity to share strategies for improving response rates. Recruitment statistics and sample demographics will be reported to CDC on a weekly and monthly basis, respectively.


Research indicates that providing an incentive to respondents helps raise response rates for long, sensitive, in-person surveys (Kulka, 1995). An incentive is also useful for groups that are hard to reach, including those for whom conventional means of motivation may not work, such as disenfranchised populations like those recruited for NHBS. In addition, these populations (particularly MSM and PWID) have been frequently the focus of health-related data collections, in which an incentive is the norm (Thiede et al., 2001; MacKellar et al., 1996). Research has shown that financial incentives are effective at increasing response rates among female residents in minority zip codes (Whiteman et al., 2003) and among African American participants in a community-based health promotion program (Halberti et al., 2010). A meta-analysis of 95 studies published between January 1999 and April 2005 describing methods of increasing minority enrollment and retention in research studies found that incentives enhanced retention among this group (Yancy et al., 2006). Providing an incentive to NHBS respondents is critical to achieve acceptable response rates.


Incentives have been shown to be effective for promoting participation and reducing nonresponse in similar data collections that involve hidden populations or collect sensitive information. An incentive is also provided to persons who participate in CDC's HIV-related data collections among other populations; for example, the Medical Monitoring Project (OMB 0920-0740, exp. 5/31/2024), which collects sensitive information from HIV-positive persons, uses incentives to reduce nonresponse. Participants in the Medical Monitoring Project are offered $50 as an incentive for their time. Further information on the need for incentives in data collections focused on high-risk and hidden populations or collecting sensitive information is provided in section A.9.


Venue-based, time-space sampling

To maximize response rates for the venue-based sampling (VBS) cycle, the initial approach is critical. Training for interviewers will focus on effective communication (enthusiasm, rapport building in a short period of time) and ability to communicate the value of NHBS (persuasion); demonstrated motivation, persistence, and high energy are critical for successful recruiting. The training will focus on methods for averting refusals and methods to seek participation of sampled persons who are initially reluctant, including role-playing of different scenarios in which the respondent may be difficult to recruit. The basic recruitment philosophy is "respectful persistence"; interviewers are trained to know when to stop. NHBS does not use staff other than interviewers for refusal conversion.


Venue-based sampling offers the benefit of access to large numbers of the target population in a single location; however, a disadvantage is that the rate of refusal of the approach and of participation (among those who accept the contact) can be high because people attend venues for reasons other than participating in a data collection. In limited cases, respondents who are interested in participating, but are not willing or able to complete the behavioral assessment at the time they are contacted will be offered an appointment to participate on another day. Offering of these “post-event appointments” will be limited, as it is expected that “no show” rates for the appointments will be high.


Respondent-driven sampling

Because RDS is a peer-referral mechanism, the field staff has little control over sampling methods and sample accrual, other than through the recruitment of seeds. One advantage of RDS, however, is that peer referral, which implies endorsement or at least acceptance of the project by a peer, is likely to have a positive impact on response rates. To maximize the effectiveness of peer recruiting, training is provided to recruiters. Peer recruiters may help improve response rates by providing credibility and legitimacy for the project in the target population. In addition, persons recruited by a peer may be more willing to participate than if they had been recruited by someone unknown to them. Although project staff cannot make multiple contacts to boost response rates in this survey, peer recruiters are not so constrained: because they are recruiting persons known to them, they can follow up with those they have referred to the project and remind them to participate. The "dual-incentive" structure (i.e., providing additional incentives to recruiters when they successfully recruit an eligible participant) also helps to maximize response rates. Convenient location of field sites and hours of operation may also maximize response rates; field sites will be located in areas that are easy to access by public transportation and hours of operation will be set to meet the needs and schedules of the population of interest.


Prior to conducting NHBS, the field staff in each participating area will review existing data sources to determine the characteristics (e.g., race, ethnicity, age, geographic location) of the local population of interest (MSM, PWID, or HET, depending on the data collection cycle). The field staff will also obtain input on the logistics of data collection from local stakeholders and members of the local community. This input will help the local staff identify the most appropriate hours of operation and avoid barriers to participation of persons in the data collection.


Assessing Non-Response Bias


The use of an eligibility screener will allow comparison of the demographic and eligibility-related behavioral data among those who are eligible and ineligible.


The venue-based, time-space sampling method is not conducive to collecting information from those who refuse to be approached. However, information on those who accept being approached but do not consent to participate is available and can be used to compare with those who agree to participate.


To assess non-response bias from RDS, each peer recruiter returning to the field site will be asked, using the recruiter debriefing (Attachment 3e) whether anyone refused a coupon (invitation to participate), why they refused, and the race/ethnicity of those who refused. This information will be collected using a laptop computer. Following up with recruiters has improved rates of participation in other studies implementing RDS (Draus et al., 2005; Ramirez-Valles et al., 2005). However, due to the private nature of NHBS, few, if any, participants can be re-contacted by field staff. Similarly, field staff will rarely be able to initiate contact to encourage peer recruiters to distribute coupons or to ask the recruiters to report on refusals. However, when an NHBS peer recruiter initiates contact with project area staff, such as when a peer recruiter returns to the field site for rewards or calls the project area, the field staff will remind recruiters to encourage any recruits who have not yet presented for eligibility screening to do so.


In addition, peer recruiters will be debriefed about their recruitment efforts when they return to the field site for their recruiter rewards (see Attachment 3e) as described above. This information will be used to understand if certain racial (or ethnic) groups are not responding or if persons are not responding for a particular reason.


Recruitment for all data collection cycles will be monitored through on-going data reports generated weekly and monthly from the data submitted to CDC. These reports will be used during venue-based sampling to monitor approaches of participants by field staff, the number accepting and refusing the approach, the number screened, the number who completed the behavioral assessment, and the characteristics of the accruing sample. During respondent-driven sampling, reports will monitor the seed recruitment, the characteristics of seeds, general recruitment (i.e., participation rate among seeds and non-seeds who present for screening and are eligible ), the characteristics of the resulting sample, the number and length of recruitment chains, the number of recruiters who returned for rewards, the number of coupons distributed to recruiters, the number of persons who present with a coupon for eligibility screening, the number of persons refusing coupons, the race/ethnicity of those refusing coupons, and the reason coupons were refused. The field staff and CDC will use the data in these reports to identify problems with recruitment. Comparing data from the sample characteristics report with the information gathered from local data sources and stakeholders about the local at-risk populations will be used to identify subgroups of the target population whom the data collection may be missing. When a problem with response or recruitment arises during data collection, field staff will be instructed to consult with local stakeholders and members of the local target populations to identify solutions to the problem.


Generalizability


Venue-based, time-space sampling

The data collected during the VBS cycle can be weighted for generalizability. Selection probabilities are based on venue selection and day-time period selection, as well as the response rates and frequency of the respondents' attendance at venues (MacKellar et al., 1996). Thus, for the MSM cycle, data can be weighted such that they will be generalizable to men meeting the eligibility criteria who attend MSM-specific venues and reside in the selected MSAs. Although some MSM do not attend MSM-specific venues, several surveys suggest that most attend one or more types of venues included in the sampling frames (Ramirez-Valles et al., 2005; Xia et al., 2006). Thus, the inclusion of a wide range of types of venues in the sampling frame helps increase the external validity of the findings.
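
A simplified sketch of the weighting idea is shown below: a respondent's weight is roughly the inverse of the chance that his venue and day-time period were sampled and that he responded, with a further adjustment for how often he attends venues, since frequent attendees have more opportunities to be intercepted. This is an illustration under those assumptions, not the official NHBS weighting procedure, and the function and argument names are hypothetical.

```python
# Simplified sketch of the weighting idea (an assumption-laden illustration, not the official
# NHBS weighting procedure): weight a respondent by the inverse of the probability that his
# venue and day-time period were selected and that he responded, with a further downweighting
# of frequent venue attendees, who have more opportunities to be intercepted.

def vbs_weight(p_venue_selected, p_vdt_selected, response_rate, attendance_per_month):
    selection_component = p_venue_selected * p_vdt_selected * response_rate
    return 1.0 / (selection_component * attendance_per_month)

# Example: a venue drawn with probability 0.25 in a month, its day-time period with
# probability 0.5, a 70% response rate, and a respondent who attends venues about 4 times
# per month.
print(round(vbs_weight(0.25, 0.5, 0.70, 4), 1))  # ~2.9
```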



Respondent-driven sampling

The statistical theory upon which RDS is based suggests that if peer recruitment proceeds through a sufficiently large number of waves, the composition of the sample will stabilize, becoming independent of the seeds from which recruitment began, and thereby overcoming any bias the nonrandom choice of seeds may have introduced (Heckathorn, 1997; Heckathorn, 2002). (“Waves” are defined as generations of recruits stemming from a seed, i.e., from recruitment efforts of the persons the seed directly recruited and from the recruitment efforts of those the seed’s recruits recruited, etc.) The expected stable sample composition after a sufficiently large number of waves is termed “equilibrium.” Experience with RDS indicates that equilibrium can be achieved in as few as 6 waves. In NHBS-PWID during 2018, most project areas accrued as many as 15-20 waves of recruitment.


Another factor that has an impact on how quickly equilibrium can be reached is called "homophily." Homophily refers to the degree of insularity, or in-group preference for recruitment. The more insular a group, the more likely its members are to recruit others like themselves. Therefore, insularity implies a greater number of waves to reach equilibrium. In NHBS-PWID (2018), homophily – which can also be described as the measure of persons' preferences to recruit only those who are like themselves – did not exceed 0.4 (or 40%) in any MSA, meaning that PWID participants more often recruited those who were dissimilar to themselves (e.g., according to race/ethnicity or gender) than those who were similar to themselves. For example, if homophily is 0.4 for black participants, then 40% of black recruiters recruited only other blacks and 60% of black recruiters recruited PWID at random, regardless of race. The homophily statistics for NHBS are well within the anticipated bounds reported in other studies using RDS methods. Having a diverse set of seeds (according to race/ethnicity, gender, and age) will help ensure diversity of networks, which is expected to minimize the insularity of the sample.
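
Under the interpretation given above, a group with homophily H behaves as if a fraction H of its recruiters recruit only within the group while the remaining 1 - H recruit at random. If S is the observed share of a group's recruits who belong to the same group and E is the share expected under random mixing, then S = H + (1 - H)E, so H = (S - E)/(1 - E). The sketch below applies that relationship to coupon-linked recruitment data; it is an illustrative formulation, not the estimator used in NHBS analyses.

```python
# Sketch of backing homophily out of coupon-linked recruitment data under the interpretation
# above: S = H + (1 - H) * E, where S is the observed in-group recruitment share and E is the
# share expected under random mixing, so H = (S - E) / (1 - E). Illustrative only, not the
# estimator used in NHBS analyses.

def homophily(in_group_share_observed: float, in_group_share_expected: float) -> float:
    s, e = in_group_share_observed, in_group_share_expected
    return (s - e) / (1 - e)

# Example: 55% of a group's recruits are in-group, versus 25% expected under random mixing.
print(round(homophily(0.55, 0.25), 2))  # 0.4
```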


With the RDS method, the sampling frame is initially the social networks of the seeds, with the social networks of successive waves of peer recruiters added. This frame can be described by information collected from participants regarding who recruited them and information about the sizes of recruiters’ social networks. Recruitment is tracked by the use of coupons; recruiters can be linked to those they have successfully recruited using the Coupon Manager software. Information on who recruited whom is used to calculate cross-group recruitment proportions, such as described above.



4. Tests of Procedures or Methods to be Undertaken


The data collection instruments were developed using questions from previous CDC surveillance projects, such as the Medical Monitoring Project (MMP) (OMB 0920-0740, exp. 5/31/2024), the Transgender Behavioral Surveillance System (OMB No. 0920-0794, exp. 12/31/2010) and the Behavioral Assessment and Rapid Testing project (BART) (OMB No. 0920-0883, exp. 4/30/2014). External consultants helped develop and refine the specific RDS and VBS methods for each data collection cycle (Attachment 9). NHBS has used most of the questions on the eligibility screener and behavioral assessment instruments since 2008. Updates made since 2008 were informed by one or more of the following: cognitive testing conducted by NCHS (see https://wwwn.cdc.gov/QBank/Report.aspx?1089); cognitive interviews conducted internally with <10 individuals; feedback from NHBS interviewers; review of literature on measurement of specific topics; and input from subject matter experts. Prior to implementation in the field, CDC staff will test the skip patterns and responses of the data collection instruments. CDC staff will also conduct mock interviews of their CDC colleagues using the electronic interview application loaded onto portable computers. OMB will be informed of any changes to data collection procedures or instruments as quickly as possible.


5. Individuals Consulted on Statistical Aspects



Consultants on Statistical Aspects


The following individuals consulted on statistical aspects only. They are not involved in collecting or analyzing the data.


Lillian Lin, PhD

Team Leader, Statistics Team

Centers for Disease Control and Prevention

1600 Clifton Rd, NE MS E-48

Atlanta, GA 30333

Phone: (404) 639-2990

Email: [email protected]



John Karon, PhD

Statistician

Centers for Disease Control and Prevention

1600 Clifton Rd, NE MS E-48

Atlanta, GA 30333

Phone: (404) 639- 2020

Email: [email protected]



Myron Katzoff, PhD

Statistician

3311 Toledo Road Room 3117

MS P-08

Hyattsville, Maryland 20782

Phone:301-458-4307

Email: [email protected]



Steve Thompson, PhD

Department of Statistics and Actuarial Science

Simon Fraser University

8888 University Drive

Burnaby, BC V5A 1S6 CANADA

phone 604 268 6591

email [email protected]


Douglas Heckathorn, PhD

Professor, Department of Sociology

344 Uris Hall

Cornell University

Ithaca, NY 14853-7601

phone: 607.255.4368

e-mail: [email protected]


Graham Kalton

Statistician

Westat, Inc.

1650 Research Blvd.

Rockville, MD. 20850

Phone: 301-251-1500

[email protected]













Individuals Collecting and/or Analyzing Data



CDC is not directly engaged with human subjects during data collection. However, CDC Project Staff below will train health department staff in data collection methods, monitor the progress of recruitment by health department staff, and analyze the data.


CDC Project Staff

All CDC project staff can be reached at the following address and phone number:

Behavioral and Clinical Surveillance Branch

Division of HIV Prevention

Centers for Disease Control and Prevention

1600 Clifton Rd, NE MS US8-4

Atlanta, GA 30329

Phone: (404) 639-2090


Cyprian Wejnert, PhD

Team Leader, Behavioral Surveillance Team

Email: [email protected]


Monica Adams, PhD

Epidemiologist

Email: [email protected]


Christine Agnew Brune, PhD

Epidemiologist

Email: [email protected]


Amy Baugher, MPH

Epidemiologist

Email: [email protected]


Dita Broz, PhD

Epidemiologist

Email: [email protected]


Janet Burnett, MPH

Epidemiologist

Email: [email protected]


Susan Cha, PhD

Epidemiologist

Email: [email protected]

Johanna Chapin Bardales, PhD

Epidemiologist

Email: [email protected]


Paul Denning, MD, MPH

Medical Epidemiologist

Email: [email protected]


Lyssa Faucher, MPH

Epidemiologist

Email: [email protected]


Teresa Finlayson, PhD, MPH

Epidemiologist

Email: [email protected]


Dafna Kanny, PhD

Epidemiologist

Email: [email protected]


Katie Lee, MPH

Public Health Advisor

Email: [email protected]


Rashunda Lewis, MPH

Health Scientist

Email: [email protected]


Elana Morris, MPH

Epidemiologist

Email: [email protected]


Ebony Respress, MPH

Epidemiologist

Email: [email protected]


Taylor Robbins, MPH

Epidemiologist

Email: [email protected]


Catlainn Sionean, PhD

Behavioral Scientist

Email: [email protected]


Jeffery Todd, MCC

Epidemiologist

Email: [email protected]


Joseph Prejean, PhD

Chief, Behavioral and Clinical Surveillance Branch

Email: [email protected]


