Behavioral Risk Factor Surveillance System (BRFSS) Asthma Call-back Survey (ACBS)
Existing Collection in Use Without an OMB Control Number
Supporting Statement
Part B: Collections of Information Employing Statistical Methods
(0920-XXXX)
September 2017
Cathy M. Bailey, PhD, MS
Centers for Disease Control and Prevention
National Center for Environmental Health
Environmental Hazards and Health Effects
Air Pollution and Respiratory Health Branch
4770 Buford Highway, F-60
Atlanta, GA 30341
awl3@cdc.gov
770-488-3716
Table of Contents
B. Collections of Information Employing Statistical Methods
B.1 Respondent Universe and Sampling Methods
B.2 Procedures for the Collection of Information
B.3 Methods to Maximize Response Rates, Characterize Nonresponse Bias, and Deal with Nonresponse
B.4 Tests of Procedures or Methods to be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
List of Attachments
Attachment 17 Sources for the ACBS Questions
Attachment 17a Sources for the ACBS Adult Questions
Attachment 17b Sources for the ACBS Child Questions
Attachment 18 ACBS Nonresponse Bias Analysis Plan
B.1 Respondent Universe and Sampling Methods
CDC’s NCCDPHP Division of Population Health administers the BRFSS parent survey, which provides the foundation for ACBS administration and data collection (Attachments 5a–5f). Because the ACBS sample is a subset of the BRFSS, ACBS respondents are BRFSS adults, 18 years and older, in participating states who report ever being diagnosed with asthma. Some states include children, below 18 years of age, who are randomly selected subjects in the BRFSS household. In participating states, parents or guardians serve as ACBS proxy respondents for their children ever diagnosed with asthma; children do not respond directly to the ACBS questionnaire. If both the BRFSS adult respondent and the selected child in the household have asthma, only one or the other is eligible for the ACBS: the program selects the child 50 percent of the time and the adult 50 percent of the time. The ACBS enrollment process is presented in Attachment 7.
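The eligibility rule above (one household member per ACBS, with a 50/50 random split when both the adult respondent and the selected child have asthma) can be sketched as follows. This is an illustrative sketch only, not the CATI system's actual implementation; the function name and arguments are ours.

```python
import random

def select_acbs_respondent(adult_has_asthma, child_has_asthma, rng=random):
    """Illustrative sketch of the ACBS enrollment rule described above.

    If both the BRFSS adult respondent and the selected child have
    asthma, only one is eligible, chosen with equal (50/50) probability.
    Returns "adult", "child", or None (household not ACBS-eligible).
    """
    if adult_has_asthma and child_has_asthma:
        # 50 percent child, 50 percent adult
        return "child" if rng.random() < 0.5 else "adult"
    if adult_has_asthma:
        return "adult"
    if child_has_asthma:
        return "child"
    return None
```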
The landline sample for each state is based on a disproportionate stratified sample (DSS) design in which telephone numbers are assigned to two separately sampled strata based on the presumed density of residential (non-business) telephone numbers [1]. The cell phone sample for each state is randomly selected from lists of all working cell phone numbers. The BRFSS sampling process is described in the BRFSS Data User Guide (http://www.cdc.gov/brfss/data_documentation/pdf/userguidejune2013.pdf).
ACBS State-tailored Samples:
The initial sampling for each state is drawn as part of the BRFSS sampling process (http://www.cdc.gov/brfss/annual_data/2013/pdf/Overview_2013.pdf), in which an independent sample is drawn for each state. The size of each state’s ACBS sample varies with its BRFSS sample size and can be estimated as follows.
Using State X’s 2013 BRFSS sample size of 5,000 adults:
Assuming lifetime prevalence (weighted percentage) in State X is 13 percent for 2013 and the sample size is 5,000 adults, the number of adults eligible for the ACBS in State X will be 650 (13% x 5,000). Thirteen percent lifetime prevalence is the weighted population average based on 37 participating states.
Considering State X’s CASRO response rate of 45 percent, about 293 (45% x 650) adults will complete the ACBS.
For children, a similar calculation is used:
Approximately one-third (33 percent) of the 5,000 adults will have children in the household (33% x 5,000 = 1,650).
Applying State X’s child lifetime asthma prevalence (weighted percentage) of 11.8 percent, about 195 of the 1,650 children (11.8% x 1,650) will have lifetime asthma.
Applying the 45 percent response rate, State X would have approximately 88 (45% x 195) completed child interviews.
To ensure an adequate number of responses for weighting purposes, a state must have at least 75 completed child interviews. If a state has fewer than 75 child records, the child data are not weighted and are excluded from the public release file. In that case, the data are combined with data from subsequent years and then weighted.
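The State X estimate above can be reproduced with a short calculation. This is an illustrative sketch of the arithmetic only; the parameter defaults (33 percent of adults with children, 11.8 percent child lifetime prevalence, 75-interview weighting threshold) come from the example above, and actual state estimates would use state-specific inputs.

```python
def estimate_acbs_completes(brfss_adults, adult_prevalence, response_rate,
                            child_share=0.33, child_prevalence=0.118,
                            min_child_completes=75):
    """Sketch of the State X sample-size arithmetic described above."""
    # Adults: lifetime prevalence x sample, then the response rate
    eligible_adults = brfss_adults * adult_prevalence
    adult_completes = eligible_adults * response_rate
    # Children: ~1/3 of adults have children; apply child prevalence
    children = brfss_adults * child_share
    eligible_children = children * child_prevalence
    child_completes = eligible_children * response_rate
    # Child data are weighted only with at least 75 completed interviews
    weight_child_data = child_completes >= min_child_completes
    return round(adult_completes), round(child_completes), weight_child_data

# State X example: 5,000 BRFSS adults, 13% adult lifetime prevalence,
# 45% CASRO response rate
adults, children, weight = estimate_acbs_completes(5000, 0.13, 0.45)
```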
ACBS Response Rates for 2013: See Attachment 11 for ACBS 2013 response rates by State/Territory.
B.2 Procedures for the Collection of Information
ACBS data collection follows all standard BRFSS data collection protocols (such as call attempts and assigning dispositions to cases). Data collection for the ACBS must meet the guidelines and data quality criteria established for the annual statewide survey.
ACBS Summary of Steps, Roles, and Responsibilities:
The ACBS steps, roles, and responsibilities are described below.
CDC provides the ACBS questionnaire annually. Each year, CDC compiles states’ requests for questionnaire modifications and sends them to the questionnaire work group, composed of state epidemiologists, which reviews and votes on proposed changes or new questions. Historical questionnaire revisions are described in the Construction of the Annual Questionnaire section below. All states use the exact same ACBS questionnaire. CDC also produces data processing layouts.
Information collection is conducted by telephone interview. CDC provides Computer-Assisted Telephone Interviewing (CATI) programming to states for their use. States may opt to use their own CATI programming software.
ACBS awardees are responsible for field operations and determine how their data will be collected within the BRFSS and ACBS guidelines. States may collect data using in-house calling centers, hire vendors using RFP procedures, or contract with universities. The data collector is the same for BRFSS and the ACBS. Data collectors must develop and maintain procedures to ensure respondents’ privacy, assure and document the quality of the interviewing process, and supervise and monitor the interviewers. Files containing phone numbers must be maintained separately from any files containing responses.
States submit de-identified data files to CDC on a monthly or quarterly basis for cleaning and weighting. CDC returns clean, weighted data files to the state of origin for its use. Through the BRFSS website, CDC also makes cleaned subsets of state data files available for public use, along with information about the source of sample (landline or cell phone), weighting, and any restrictions on publication or use of the data.
ACBS Screener and Permission for Adults and Children:
The ACBS subset is selected from the BRFSS respondent pool. All BRFSS adults, 18 years and older, in participating states who report ever being diagnosed with asthma are eligible for the ACBS. If a child is the selected sample member for the ACBS, the interview (Attachment 5f) is conducted with the most knowledgeable parent or guardian in the household; persons under age 18 are not interviewed directly. The core BRFSS respondent must be the parent/guardian of the selected child. If the BRFSS respondent is not the parent/guardian of the selected child, an ACBS survey for that child is not conducted (e.g., a core BRFSS respondent who is an adult sibling of the selected child, but not the child’s guardian, cannot transfer the child ACBS to the child’s parent/guardian). The reason is that the core BRFSS data must also come from the parent/guardian of the selected child. However, the parent/guardian of the child can transfer the interview to the Most Knowledgeable Person (MKP) and grant this person permission to complete the interview. All states inform the BRFSS respondent of the request to participate in the ACBS during the BRFSS interview. CDC provides states with a template of recommended wording for the question requesting permission to call the respondent back within the next two weeks. Because IRB requirements differ by state, states may require slight changes in the wording of this question. The ACBS screeners are provided in Attachments 5a–5d.
Questionnaire:
The ACBS questionnaire has two versions, one for eligible adult respondents (Attachment 5e) and one for eligible child respondents (Attachment 5f). The ACBS was pre-tested as the National Asthma Survey (NAS) in 2003 and administered in three states in 2005, 25 states in 2006, 35 states in 2007, 37 states in 2008, 37 states in 2009, and 40 states in 2010; it has been running consecutively for 12 years. California and Puerto Rico provide a Spanish translation of each instrument. Questionnaire changes are provided in Attachment 9.
The NAS initial questionnaires were designed by the program (NCEH) and further refined based on many questions included in other national surveys (National Health Interview Survey [NHIS], National Health and Nutrition Examination Survey [NHANES], and BRFSS) to facilitate comparison and because many of these questions had already undergone extensive testing. Some questions were written to measure progress toward Healthy People 2010 goals and to translate the National Asthma Education and Prevention Program (NAEPP) guidelines into practice. Cognitive testing and the series of four pretests (in 2003) are described in Attachment 16 (page 6), Design and Operation of the National Asthma Survey.
In 2003 and early 2004, NAS data collection proved complex and expensive. In 2004, the program therefore considered using the BRFSS to identify respondents with asthma for further interviewing on a call-back basis. In 2005, three asthma grantee states (Minnesota, Michigan, and Oregon) participated in the ACBS pilot. The NAS questionnaire was modified to eliminate items already on the BRFSS and to add a few questions to meet states’ needs. A listing of sources for questions is provided in Attachments 17a–17b.
ACBS Call/Interview Guidelines:
All standard BRFSS data collection protocols (such as call attempts and assigning dispositions to cases) are followed. Data collection for the ACBS typically starts by February 1. The ACBS (Attachments 5e–5f) is typically conducted within two weeks of the BRFSS interview completion date; it may be conducted earlier, including immediately after the BRFSS survey if the respondent is willing.
The BRFSS protocol suggests up to 15 calling attempts for each landline phone number and up to 8 for each cell phone number in the sample, depending on state regulations for calling and outcomes of previous calling attempts. Some states make calling attempts over the totals suggested by the BRFSS protocol. Although states may have some flexibility in distribution of calling times, in general, surveys are conducted using the following calling occasions:
Conduct 20 percent of the landline interviews on weekdays (prior to 5:00 pm)
Conduct 80 percent of the landline interviews on weeknights (after 5:00 pm) and weekends
Conduct cell phone interviews during all three calling occasions (weekday, weeknight, and weekend), with approximately 30 percent of cell phone calls made on weekend calling occasions
Change schedules to accommodate holidays and special events
Make weeknight calls just after 5:00 pm
Make callbacks during hours that are not scheduled for other interviews, generally on weekdays
With the exception of verbally abusive respondents, eligible persons who initially refuse to be interviewed may be contacted at least one additional time and given the opportunity to be interviewed. Preferably, this second contact will be made by a supervisor or a different interviewer. Some states have regulations on whether refusals should be called again.
Adhere to respondents’ requests for specific callback times whenever possible
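The calling-occasion percentages above can be expressed as target call counts. The sketch below is illustrative only; the function name, occasion labels, and rounding are ours, and states may distribute calls differently.

```python
def occasion_targets(n_landline_calls, n_cell_calls):
    """Illustrative target call counts per calling occasion, using the
    percentages above: 20% of landline calls on weekdays, 80% on
    weeknights/weekends, and ~30% of cell calls on weekends."""
    return {
        "landline_weekday": round(0.20 * n_landline_calls),
        "landline_weeknight_weekend": round(0.80 * n_landline_calls),
        "cell_weekend": round(0.30 * n_cell_calls),
    }
```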
ACBS Call Disposition:
States are required to assign a final disposition to every ACBS-eligible BRFSS respondent. Each telephone number in the CDC-provided sample must be assigned a final disposition code to indicate the result of calling the number:
A completed or partially completed interview or
A determination that:
A BRFSS respondent was eligible to be included but an interview was not completed, or
A BRFSS respondent was ineligible or their eligibility could not be determined.
The call disposition procedures that the ACBS follows are similar to the BRFSS procedures (Attachment 10), although some additions and adaptations are required for the ACBS. ACBS interviews are considered complete (COIN) if the respondent finishes the entire interview or progresses through Section 8 (medication) of the ACBS interview. ACBS interviews are refusals if the respondent refuses participation either at the BRFSS interview or at the time of the ACBS interview. Terminations are interviews that begin the ACBS but are terminated before Section 8 is completed.
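The outcome rules above (complete through Section 8, refusal, termination) can be sketched as a simple classifier. The function and the "noninterview" label are illustrative assumptions of ours; the actual ACBS disposition codes are defined in Attachments 10 and 12.

```python
def acbs_disposition(refused, last_section_completed):
    """Illustrative classifier for the ACBS interview outcomes above.

    An interview is complete (COIN) once the respondent progresses
    through Section 8 (medication); an interview started but ended
    before Section 8 is a termination; a refusal at either the BRFSS
    or ACBS stage is a refusal.
    """
    if refused:
        return "refusal"
    if last_section_completed >= 8:
        return "complete"      # COIN: finished or reached Section 8
    if last_section_completed > 0:
        return "termination"   # started, ended before Section 8
    return "noninterview"      # assumed label for never-started cases
```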
Final disposition codes are then used to calculate response rates, cooperation rates, and refusal rates (Attachment 11). The distribution of individual disposition codes and the rates of cooperation, refusal, and response are published annually in the Summary Data Quality Reports. The ACBS uses standards set by the American Association of Public Opinion Research (AAPOR) [2] and Council of American Survey Research Organizations (CASRO) [3] to determine disposition codes and response rates.
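As a rough illustration of how final disposition counts feed a response rate, the sketch below computes a CASRO-style rate in which cases of unknown eligibility are assumed eligible at the rate observed among resolved cases. This is a simplified approximation of ours, not the exact formulas used for the ACBS, which are provided in Attachment 12.

```python
def casro_response_rate(completes, eligible_noninterview,
                        unknown_eligibility, ineligible):
    """Simplified CASRO-style response rate sketch.

    Units of unknown eligibility are assumed eligible in the same
    proportion as cases whose eligibility was resolved.
    """
    resolved = completes + eligible_noninterview + ineligible
    known_eligible = completes + eligible_noninterview
    eligibility_rate = known_eligible / resolved
    estimated_eligible = known_eligible + eligibility_rate * unknown_eligibility
    return completes / estimated_eligible
```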
ACBS Procedures to Promote Data Quality and Comparability:
ACBS follows the BRFSS procedures to promote data quality and comparability, with minor revisions. In order to maintain consistency across states and allow for state-to-state comparisons, the BRFSS sets standard protocols for data collection which all states are encouraged to adopt with technical assistance provided by CDC. The following items are included in the ACBS survey protocol:
All states must ask the ACBS questions without modification. Interviewers may not offer information to respondents on the meaning of questions, words or phrases beyond the interviewer instructions provided by CDC and/or the state BRFSS coordinators.
Interviewers should be trained specifically for the ACBS.
Systematic, unobtrusive electronic monitoring is a routine and integral part of monthly survey procedures for all interviewers. States may also use callback verification procedures to ensure data quality. Unless electronic monitoring of 10 percent of all interviews is being routinely conducted, a 5 percent random sample of each month’s interviews must be called back to verify selected responses for quality assurance.
General calling rules, listed below, are established by the BRFSS and states are encouraged to adhere to them whenever possible. It is understood that the calling rules are not universally applicable to each state.
All cellular telephone numbers must be hand-dialed.
If possible, calls made to non-English speaking households and assigned the interim disposition code of 5330 (household language barrier), should be attempted again with an interviewer who is fluent in the household language (e.g. Spanish).
States should maximize calling attempts as outlined in BRFSS. The maximum number of attempts (15 for landline telephone and 8 for cellular telephone) may be exceeded if formal appointments are made with potential respondents.
Calling attempts should allow for a minimum of 6 rings and up to 10 rings if not answered or diverted to answering devices.
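The attempt limits above, including the appointment exception, can be sketched as a simple check. The function and limit table are illustrative assumptions, not part of the BRFSS or ACBS software, and states may apply different maxima.

```python
# Maximum calling attempts suggested by the BRFSS protocol
MAX_ATTEMPTS = {"landline": 15, "cell": 8}

def may_attempt_call(phone_type, attempts_so_far, has_appointment=False):
    """Return True if another calling attempt is permitted: up to 15 for
    landline and 8 for cell, exceeded only for a formal appointment
    made with a potential respondent."""
    if has_appointment:
        return True
    return attempts_so_far < MAX_ATTEMPTS[phone_type]
```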
B.3 Methods to Maximize Response Rates, Characterize Nonresponse Bias, and Deal with Nonresponse
The ACBS uses a number of techniques to maximize response rates and address nonresponse. These include providing the interview in languages other than English, establishing callback protocols designed to convert refusals, and alternating the times and days of calling attempts. States obtain permission from BRFSS respondents during the BRFSS survey to call them back. Experienced interviewers are used for callbacks when respondents initially refuse to take part in the survey. Hard refusals (where potential respondents state that they are not interested in completing the interview) are not called back.
States must maintain training for all interviewers involved in the ACBS. Issues related to response rates are discussed in large annual meetings of the data collectors. Data collectors also participate in monthly conference calls organized by the CDC to discuss best practices, and share experiences.
The CDC has conducted a number of pilot studies in recent years to identify methods that might improve response rates and alleviate potential nonresponse bias. These have included comparing response patterns between the landline and cell phone samples of the Behavioral Risk Factor Surveillance System (BRFSS) and Asthma Call-back Survey (ACBS) and assessing how the number of lag days between the BRFSS and ACBS interviews affects ACBS response rates. These studies indicated that, in BRFSS interviews, cell phone respondents are more likely than landline respondents to agree to participate in the ACBS. They also found that the ACBS cell phone sample has a lower response rate than the landline sample and that the ACBS response rate decreases as lag days increase. CDC staff presented these findings at professional meetings.
Comparisons of demographic response patterns indicate that nonresponse among males and respondents aged 24-34 improved when cell phones were introduced in 2012 and 2013. At professional meetings, CDC staff present findings of research on weighting methods that are appropriate to correct, to the extent possible, for demographic differences between survey participants and the BRFSS asthma population of the state. To date, the most effective adjustment for nonresponse in the ACBS has been the inclusion of increasing proportions of cell phone interviews, which were introduced in 2011 and required for all states in 2015.
Although response rates for telephone surveys are declining overall, the ACBS maintains a relatively high response rate compared with other telephone-based surveys (the median CASRO rate in 2013 was 45.03%, a joint BRFSS and ACBS response rate). Response rates, cooperation rates, and refusal rates for the ACBS are calculated and published annually using standards set by the Council of American Survey Research Organizations (CASRO) [3]. See the 2013 Asthma Call-Back Survey (ACBS) Summary Data Quality Report for specific details (Attachment 14, http://www.cdc.gov/brfss/acbs/2013/pdf/sdqreportacbs_13.pdf). Response rate tables for the ACBS, including response rates for each state/area participating in the ACBS, can be found in Attachment 11. Sample disposition codes and the formulas used to calculate response rates, cooperation rates, and refusal rates are provided in Attachment 12. To communicate the potential extent of nonresponse bias and its impact on the published dataset and accompanying web tables, a plan for analyzing ACBS nonresponse bias is described in Attachment 18.
B.4 Tests of Procedures or Methods to be Undertaken
Tests of procedures and methods are carried over from the BRFSS. ACBS data collection procedures have been adapted over time to meet the needs of the data collection process and to maximize response rates while minimizing respondent burden. Pretesting the questionnaire every year is not necessary because changes are not made every year. As this document indicates, subject matter experts from CDC and other federal agencies, state health department representatives, and survey experts are involved in the process of question development. Some of the questions included in the ACBS appear on the National Health Interview Survey. The use of identical or similar questions is advantageous because it allows researchers to make comparisons across different samples, different geographic areas, or over time.
The ACBS was piloted as the National Asthma Survey (NAS) in 2003. Cognitive testing and the series of four pretests (in 2003) are described in more detail in Attachment 16 (page 6), Design and Operation of the National Asthma Survey.
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Table A.8.1 lists the individuals involved in the data collection partnership with the CDC Division of Population Health, National Center for Chronic Disease Prevention and Health Promotion, the program responsible for implementing the BRFSS and the ACBS. Table A.8.2 in Supporting Statement A lists the state BRFSS coordinators, the state health department staff responsible for oversight of the ACBS in their respective states. States may collect data using in-house calling centers, hire vendors using RFP procedures, or contract with universities.
The ACBS History and Analysis document provides information on the background, design, data collection and processing, and analytical issues for the ACBS. See Attachment 15 (http://www.cdc.gov/brfss/acbs/2013/pdf/acbs_2013.pdf) .
CDC personnel and asthma grantees were involved in a workgroup responsible for documenting statistical, analysis, and reporting aspects of the ACBS in a detailed user’s manual covering all sections of the survey. This manual is available by request at [email protected]. See Attachment 13 for the list of workgroup participants and the table of contents of the ACBS data analysis manual. Workgroup members also selected a set of analytic tables for dissemination on the CDC website (adults: https://www.cdc.gov/brfss/acbs/2013_tables_LLCP.html; children: https://www.cdc.gov/asthma/acbs/acbstables.htm).
[1] Behavioral Risk Factor Surveillance System Overview, 2013. Accessed December 7, 2015. http://www.cdc.gov/brfss/annual_data/2013/pdf/overview_2013.pdf
[2] The American Association for Public Opinion Research. 2011. Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys. 7th edition. http://www.aapor.org/AM/Template.cfm?Section=Standard_Definitions2&Template=/CM/ContentDisplay.cfm&ContentID=3156
[3] The Council of American Survey Research Organizations. 2013. Code of Standards and Ethics for Market, Opinion, and Social Research. Accessed December 7, 2015. http://c.ymcdn.com/sites/www.casro.org/resource/resmgr/code/september_2013_revised_code.pdf?hhSearchTerms=%22casro+and+response+and+rate%22