
Attachment 15


2011-2013
Behavioral Risk Factor Surveillance System
Asthma Call-Back Survey
History and Analysis Guidance

National Asthma Control Program

Version 1.0.0
3/23/2015



ACKNOWLEDGEMENT


The Asthma Call-back Survey (ACBS) is funded by the National Asthma Control Program (NACP) in the Air Pollution and Respiratory Health Branch of the National Center for Environmental Health (NCEH). The ACBS is jointly administered with the Division of Population Health, National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP).


NCEH and NCCDPHP greatly appreciate the efforts of the BRFSS staff in each ACBS participating state.




Paul Garbe, DVM, MPH

Chief, Air Pollution and Respiratory Health Branch

Division of Environmental Hazards and Health Effects

National Center for Environmental Health, CDC

MS F-60

4770 Buford Highway

Atlanta GA 30341


Phone: (770) 488-3700

Fax: (770) 488-1540

E-mail: [email protected]


 

Machell G. Town, Ph.D.

Acting Branch Chief, Population Health Surveillance Branch

Division of Population Health

National Center for Chronic Disease Prevention and Health Promotion

Centers for Disease Control and Prevention

1600 Clifton Road NE Mail Stop E97

Atlanta, GA 30333 USA


Phone: (404) 498 - 0503

Fax: (404) 498 - 0585

E-mail: [email protected]






















Asthma Call-Back Survey

History


What is health surveillance?


The cornerstone of CDC’s work has always been surveillance. Public health surveillance is the ongoing, systematic collection, analysis, interpretation, and dissemination of data regarding a health-related event for use in planning and delivering public health action to reduce morbidity (disease) and mortality (death) and to improve health. Data disseminated by a public health surveillance system can be used for immediate public health action, program planning and evaluation, and formulating research hypotheses. Examples of ways surveillance data are used include:


  • guiding immediate action in a public health emergency

  • measuring the prevalence of a disease

  • identifying populations at high risk for disease

  • monitoring disease outbreaks

  • planning, implementing and evaluating programs to prevent and control disease, injury and adverse exposure

  • monitoring behavior that increases health risk

Why do we need asthma surveillance?


Asthma is one of the nation's most common and costly chronic conditions, affecting over 37 million Americans at some time in their lives [17]. An estimated 7.8 million adults and 3.5 million children had an asthma attack in the past twelve months [17]. The cost of asthma is estimated to be over $30 billion a year. Asthma can also be life threatening; over 3,600 people die from asthma each year [18].


CDC’s National Asthma Control Program plays a critical role in addressing the health risk. The program funds states, cities, and school programs to help them improve surveillance of asthma, train health professionals, educate individuals with asthma and their families, and explain asthma to the public. The National Asthma Control Program is a function of the Air Pollution and Respiratory Health Branch (APRHB) of the Division of Environmental Hazards and Health Effects in the National Center for Environmental Health.


Although much has been learned in recent years about asthma management and control, the information still needs to be put into sound public health practice. Managing asthma requires a long-term, multifaceted approach, including patient education, behavior changes, asthma trigger avoidance, pharmacological therapy, and frequent medical follow-up. Asthma data need to be available at the state and local level to direct and evaluate interventions undertaken by asthma control programs located in state health departments. Improved tracking for asthma is critical for planning and evaluating efforts to reduce the health burden from this disease.


In addition, Healthy People 2020 (HP2020) goals include 8 objectives that require asthma surveillance data. (http://www.healthypeople.gov/hp2020/Objectives/TopicArea.aspx?id=43&TopicArea=Respiratory+Diseases)



What is the history of asthma surveys at CDC?


The National Center for Health Statistics surveys collect data on asthma prevalence, asthma-related deaths (mortality), and several indirect indicators of asthma-related illness (morbidity), such as hospitalizations. These data provide a good basis for analyzing national trends, but not all can be analyzed by state, and they do not become publicly available until 2 to 3 years after collection.


State health agencies have the primary role of targeting resources to reduce behavioral health risks and the diseases that may result from those behaviors. To make asthma data more useful, APRHB saw the need to expand existing data systems and develop new systems in order to make data available at a state or local level, make data available more quickly, and provide more detailed asthma data.


In 1984, CDC established the Behavioral Risk Factor Surveillance System (BRFSS), a state-based system of health surveys administered and supported by the Division of Community Health, National Center for Chronic Disease Prevention and Health Promotion. Beginning with 15 states in 1984, the BRFSS is now conducted in all states, the District of Columbia, and three territories. The BRFSS is a telephone survey that obtains information on health risk behaviors, clinical preventive health practices, and health care access, primarily related to chronic disease and injury, from a random, representative sample of adults in each state. States use the data to identify emerging health problems, establish and track health objectives and develop and evaluate public health policies and programs. Many states also use BRFSS data to support health-related legislative efforts.


In 2000, APRHB added questions about current and lifetime asthma prevalence to the core BRFSS survey. Since 2001, states have had the option of adding an adult asthma history module to their survey, and in 2005 a child asthma prevalence module was added (which requires the use of the child selection module as well). However, many states do not choose to add these modules because of cost or the more pressing need for other health-related data.


Using the BRFSS for collection of additional information on asthma met two of APRHB’s three objectives to improve asthma surveillance. First, the BRFSS is local, providing data for state and metropolitan areas in 50 states, the District of Columbia and 3 territories. Second, it is timely; data are available within 6 months from the end of the calendar year of data collection.


The third APRHB surveillance objective is to increase the content detail for asthma surveillance data. Efforts to meet this objective began in 1998 when APRHB began creating a new survey with very detailed asthma content, called the National Asthma Survey (NAS). A number of pilot tests of the survey were conducted in 2001 and 2002. The first survey used the State and Local Area Integrated Telephone Survey, an independent survey mechanism that is an off-shoot of the National Immunization Program survey at CDC. The NAS survey complemented and extended survey work from the National Health Interview Survey, National Health and Nutrition Examination Survey and the BRFSS. It added depth to the existing body of asthma data, helped to address critical questions surrounding the health and experiences of persons with asthma, and in addition, could provide data at the state and local level.


In 2003 and early 2004, data were collected by the NAS in a national sample and in 4 state samples, but this proved to be a complex and costly process. So, in 2004, APRHB considered using the BRFSS as a way to identify respondents with asthma for further interviewing on a call-back basis. The BRFSS includes a much larger sample size in each area than the NAS did. Respondents who answered yes to questions about current or lifetime asthma during the BRFSS interview would be eligible for the subsequent asthma survey.


In 2005, the original NAS questionnaire was modified to eliminate items already on the BRFSS and to add some content requested by the individual states. The BRFSS provided respondents for the call-back survey in three asthma grantee states (Minnesota, Michigan, and Oregon) for the call-back pilot. APRHB increased the size of each state’s BRFSS sample to 10,000 respondents, hoping to obtain at least 1,000 respondents with asthma to call back. However, this increase in sample size was very expensive, costing an additional $500,000 per state. Consequently, since 2006 the state BRFSS sample has not been increased for the Asthma Call-back Survey (ACBS).


States that plan to conduct the ACBS among adults with asthma identified during the BRFSS no longer need to add the Adult Asthma Module to the BRFSS, since the questions on the call-back survey provide even more detailed answers. However, if states wish to include children in the call-back survey, they must also include both BRFSS child modules: the child selection module and the child prevalence module.


The ACBS has been implemented through the BRFSS every year since 2006, and the number of states or territories participating has increased each year. Beginning with the 2011 survey, the weighting methodology for the BRFSS was changed significantly, and cell phone samples were added to the traditional landline phone samples. The new weighting methodology, iterative proportional fitting (also known as "raking"), replaced the poststratification weighting method used with previous BRFSS datasets. Because of these two methodological changes, data from 2010 and earlier are not comparable to data from 2011 and later. Since the ACBS is methodologically linked to the BRFSS, ACBS data are subject to the same two changes. Consequently, ACBS data from 2010 and earlier should not be compared or combined with ACBS data from 2011 and later.


In addition, while the BRFSS initiated cell phone samples in 2011, not all ACBS participating states included the cell phone sample in the ACBS. In 2011, only 6 of the 40 states included the cell phone sample in the ACBS. Because the majority of states doing the ACBS used only the landline samples, the landline sample weight was used to produce the ACBS weight and only landline data were included in the 2011 public release file. The cell phone sample data from the 6 states that included cell phone samples in the 2011 ACBS were not released publicly.


In 2012, 22 areas used both Landline and Cell Phone (LLCP) samples for the ACBS while 15 areas used only Landline (LL) samples. Consequently, for 2012, separate public release data files were produced for LL only data and also for LLCP data. The 2012 LL only public release data file included data from the landline interviews in all the areas that met data quality standards (adult data from 37 areas and child data from 10 areas). The 2012 LLCP public release data files included landline and cell phone data that met data quality standards from the subset of areas that did both LLCP samples (adult data from 22 areas and child data from 10 areas).


In 2013, for adult data, 28 areas used both Landline and Cell Phone (LLCP) samples for the ACBS while 10 areas used only Landline (LL) samples. Consequently, for adults, separate public release data files were produced for LL only data and for LLCP data. The adult LL only public release data file includes data from the landline interviews in all the areas that met data quality standards (adult data from 38 areas). The adult LLCP public file includes LLCP data that met data quality standards from the subset of areas that did both LLCP samples (adult data from 28 states/areas). For child data, 25 areas used LLCP samples for the ACBS while 7 areas used only LL samples. The child LLCP public release data file includes LLCP data that met data quality standards from the subset of areas that did both LLCP samples (child data from 10 areas). For the child LL only sample, Kansas was the only state that met the minimum requirement of 75 completes; an LL-only data file containing only Kansas was not released publicly.


Data from the ACBS 2011 landline only file are methodologically comparable to data from the 2012 and later landline only file, but are not comparable to the ACBS LLCP data. Data from the ACBS 2012 LLCP file are methodologically comparable to ACBS 2013 and later LLCP files only.


Questionnaires, tables, data files and documentation for the ACBS can be accessed here: http://www.cdc.gov/brfss/acbs/index.htm


Funds are available for the ACBS from APRHB yearly through the BRFSS cooperative agreement. Any state or territory can apply for funds to implement the ACBS. States must include both child modules to include children in the call-back survey. The BRFSS sample size will not be increased for the ACBS. In order to produce a sufficient number of respondents for detailed analysis, it is recommended that a state conduct the ACBS for 2 consecutive years at the very least. States participating each year beginning with 2011 are shown in the following table.


Participating States by Year

| State                | 2011 LL   | 2011 CP   | 2012 LL   | 2012 LLCP | 2013 LL   | 2013 LLCP |
|----------------------|-----------|-----------|-----------|-----------|-----------|-----------|
| Alabama              | 2011 AC*  |           | 2012 AC*  |           | 2013 A    |           |
| Alaska               |           |           |           |           |           |           |
| Arizona              | 2011 AC*  |           | 2012 AC*  |           | 2013 AC   |           |
| Arkansas             |           |           |           |           |           |           |
| California           | 2011 AC   |           | 2012 AC*  | 2012 AC   | 2013 AC   | 2013 AC   |
| Colorado             |           |           |           |           |           |           |
| Connecticut          | 2011 AC   |           | 2012 AC   |           | 2013 AC   | 2013 AC   |
| Delaware             |           |           |           |           |           |           |
| District of Columbia | 2011 AC*  |           | 2012 AC*  |           | 2013 AC   |           |
| Florida              | 2011 A    |           | 2012 A    |           | 2013 A    |           |
| Georgia              | 2011 AC*  |           | 2012 AC*  |           | 2013 AC   | 2013 AC   |
| Hawaii               | 2011 AC*  |           | 2012 AC*  | 2012 AC   | 2013 AC   | 2013 AC   |
| Idaho                |           |           |           |           |           |           |
| Illinois             | 2011 AC*  |           | 2012 AC*  | 2012 AC*  | 2013      | 2013      |
| Indiana              | 2011 AC   |           | 2012 A    | 2012 A    | 2013 AC   | 2013 AC   |
| Iowa                 | 2011 AC*  |           | 2012 A    | 2012 A    | 2013 A    | 2013 A    |
| Kansas               | 2011 AC   |           | 2012 AC   |           | 2013 AC   |           |
| Kentucky             | 2011 A*   |           | 2012 C    |           | 2013 A    |           |
| Louisiana            | 2011 AC*  |           | 2012 AC*  |           | 2013 AC   |           |
| Maine                | 2011 AC*  |           | 2012 AC*  |           | 2013 AC   |           |
| Maryland             | 2011 AC   |           | 2012 A*C  |           | 2013 AC   | 2013 AC   |
| Massachusetts        | 2011 AC*  |           | 2012 AC*  |           | 2013 AC   |           |
| Michigan             | 2011 AC   | 2011 AC   | 2012 AC   | 2012 AC   | 2013 AC   | 2013 AC   |
| Minnesota            |           |           |           |           |           |           |
| Mississippi          | 2011 AC*  |           | 2012 AC*  | 2012 AC   | 2013 AC   | 2013 AC   |
| Missouri             | 2011 AC*  |           | 2012 AC*  | 2012 AC*  | 2013 AC   | 2013 AC   |
| Montana              | 2011 AC*  |           | 2012 AC*  | 2012 AC*  | 2013 AC   | 2013 AC   |
| Nebraska             | 2011 AC   |           | 2012 AC   | 2012 AC   | 2013 AC   | 2013 AC   |
| Nevada               | 2011 A    |           | 2012 A    | 2012 A    | 2013 A    | 2013 A    |
| New Hampshire        | 2011 A    |           | 2012 AC*  | 2012 AC*  | 2013 AC   | 2013 AC   |
| New Jersey           | 2011 AC   |           | 2012 AC   |           | 2013 AC   |           |
| New Mexico           | 2011 AC   |           | 2012 AC*  | 2012 AC   | 2013 AC   | 2013 AC   |
| New York             | 2011 AC*  |           | 2012 AC*  | 2012 AC*  | 2013 AC   | 2013 AC   |
| North Carolina       | 2011 A    |           | 2012 A    |           | 2013 A    |           |
| North Dakota         | 2011 AC*  |           | 2012 AC*  |           |           |           |
| Ohio                 | 2011 AC   |           | 2012 AC   | 2012 AC   | 2013 AC   | 2013 AC   |
| Oklahoma             | 2011 AC*  |           | 2012 AC*  | 2012 AC*  | 2013 AC   | 2013 AC   |
| Oregon               | 2011 AC*  | 2011 AC   | 2012 AC*  | 2012 AC*  | 2013 AC   | 2013 AC   |
| Pennsylvania         | 2011 AC   | 2011 AC   | 2012 AC   | 2012 AC   | 2013 AC   | 2013 AC   |
| Rhode Island         | 2011 AC*  |           | 2012 AC*  |           | 2013 AC   |           |
| South Carolina       |           |           |           |           |           |           |
| South Dakota         |           |           |           |           |           |           |
| Tennessee            |           |           |           |           |           |           |
| Texas                | 2011 AC   | 2011 AC   | 2012 AC*  | 2012 AC   | 2013 AC   | 2013 AC   |
| Utah                 | 2011 AC   | 2011 AC   | 2012 A*   | 2012 A*   | 2013 A    | 2013 AC   |
| Vermont              | 2011 AC*  |           | 2012 A    | 2012 A    | 2013 A    | 2013 AC   |
| Virginia             |           |           |           |           |           |           |
| Washington           | 2011 AC   |           | 2012 AC   | 2012 AC   | 2013 AC   | 2013 AC   |
| West Virginia        | 2011 AC*  |           | 2012 AC*  |           | 2013 AC   | 2013 AC   |
| Wisconsin            | 2011 AC*  | 2011 AC   | 2012 AC*  | 2012 AC*  | 2013 AC   | 2013 AC   |
| Wyoming              |           |           |           |           |           |           |
| Guam                 |           |           |           |           |           |           |
| Puerto Rico          | 2011 AC*  |           | 2012 A*C* | 2012 A*C* | 2013 A    | 2013 A    |
| U.S. Virgin Islands  |           |           |           |           |           |           |

LL = Landline sample; LLCP = Landline and Cell Phone sample. No public use file was produced from the 2011 cell phone (CP) samples.

Adult/Child: "A" indicates the state collected adult data; "C" indicates the state collected child data.

* Adult/Child data marked with an asterisk are not included in the public use file for technical reasons. See Data Anomalies.

Asthma Call-back Survey

Analysis Guidelines


The Asthma Call-back Survey is conducted approximately 2 weeks after the Behavioral Risk Factor Surveillance System (BRFSS) interview. BRFSS respondents who report ever being diagnosed with asthma are eligible for the asthma call-back. If a state includes children in the BRFSS and the randomly selected child has ever been diagnosed with asthma, then the child is eligible for the asthma call-back. If both the selected child and the BRFSS adult in a household have asthma, one of the two is randomly selected for the ACBS (a 50/50 split).


The BRFSS is a cross-sectional surveillance survey currently involving 53 reporting areas [1,2]. BRFSS questionnaires, data, and reports are available at www.cdc.gov/brfss.


From the parent survey (BRFSS), the ACBS inherits a complex sample design and multiple reporting areas. These factors complicate the analysis of the ACBS. Some states vary from both BRFSS and ACBS protocol. These variations should be considered prior to analysis of these data. Information on the BRFSS deviations can be found in the document titled Comparability of Data which can be accessed at:

http://www.cdc.gov/brfss/annual_data/annual_data.htm after selecting an individual survey year.


A. Data anomalies and deviations from sampling frame and weighting protocols


Several states did not collect ACBS data for all 12 months of the year. This may be an issue when investigating seasonal patterns in the data. States missing interviews for more than 3 months are noted below. States missing 6 or more months of ACBS interviews are excluded from the public use data file.


  • 2013

    • District of Columbia collected only 2 complete months of adult interviews. Because asthma outcomes have seasonal variation, the data do not represent the same time period (spread over 12 months) as the data for the other states do. Data from the District of Columbia are not included in the adult response rate tables or in the adult public use files.

    • Alabama did not collect adult ACBS interviews in April, May, June and January 2014.

    • California did not collect adult and child ACBS interviews in February, March, April, May, June, July, December and January 2014.

    • Florida did not collect adult ACBS interviews in February, March, April, May, June and July.

    • Louisiana did not collect adult and child ACBS interviews in February, March, April, May and November.

    • Nebraska did not collect adult and child ACBS interviews in February, March, April, May, June and July.

    • Oklahoma did not collect adult and child ACBS interviews in February, March, April, June, December and January 2014.

    • Oregon did not collect adult ACBS interviews in April, May, June and January 2014.

    • Utah did not collect adult and child ACBS interviews in February, March, April, May, June, July and August.

    • Washington did not collect adult and child ACBS interviews in February, March, April and May.

    • West Virginia did not collect adult ACBS interviews in June, July, August, November and December.

    • Wisconsin did not collect adult and child ACBS interviews in February, March, April, May, June, July and August.

    • Puerto Rico did not collect adult ACBS interviews in February, March, April, May, and June.

    • Massachusetts did not collect child ACBS interviews in March, April, May, September and October.


  • 2012

    • Maryland collected only 5 complete months of adult interviews. Because asthma outcomes have seasonal variation, the data do not represent the same time period (spread over 12 months) as the data for the other states do. Data from Maryland are not included in the adult response rate tables or in the adult public use files.

    • Puerto Rico collected only 3 complete months of both adult and child interviews. Because asthma outcomes have seasonal variation, the Puerto Rico data do not represent the same time period (spread over 12 months) as the data for the other states do. Data from Puerto Rico are not included in the adult or child response rate tables or in the adult or child public use files.

    • Utah collected only 3 complete months of adult interviews. Because asthma outcomes have seasonal variation, the Utah data do not represent the same time period (spread over 12 months) as the data for the other states do. Data from Utah are not included in the adult response rate tables or in the adult public use files.

    • California did not collect adult ACBS interviews in May, August, September and January 2013.

    • Louisiana did not collect child ACBS interviews in February, March, September, and January 2013.

    • Missouri did not collect child ACBS interviews in February, March, April, and October.

    • New Mexico did not collect child ACBS interviews in February, March, April, May, October, and November.

    • Oklahoma did not collect adult ACBS interviews in August, November, December, and January 2013. Oklahoma did not collect child ACBS interviews in February, March, August, November, and December.

    • Washington did not collect adult ACBS interviews in February, November, December and January 2013. Washington did not collect child ACBS interviews in February, November, December, and January 2013.

    • West Virginia did not collect child ACBS interviews in February, March, October, and January 2013.


  • 2011

    • Kentucky only collected Adult 2011 ACBS data in November 2011, December 2011 and January 2012 for BRFSS interviews conducted throughout the 2011 calendar year. Because asthma outcomes have seasonal variation, the Kentucky data do not represent the same time period (spread over 12 months) as the data for the other states do. Data for Kentucky were not included in the public release file or in the response rate tables.

    • Nevada did not collect Adult ACBS data from January through April.

    • Puerto Rico did not collect Adult ACBS data from January through May.

    • Puerto Rico only collected Child 2011 ACBS data in October 2011, November 2011, December 2011 and January 2012. Because asthma outcomes have seasonal variation, the child data for Puerto Rico do not represent the same time period (spread over 12 months) as the child data for the other states do. Child data for Puerto Rico were not included in the public release file or in the response rate tables.

    • Louisiana did not collect Child ACBS data in January, February, March, August and October.

    • Illinois did not collect Adult or Child ACBS data in January, February, May and June.

    • Two child records in Indiana and one child record in Pennsylvania could not be weighted or included in the public use file because the child's age increased above age 18 years between the BRFSS interview and the ACBS interview.



Several states varied from ACBS protocol in ways that affected the weighting procedures.



  • 2012

    • North Carolina (adults) and Massachusetts (adults and children) only did the call-back for their version 1 sample.

    • Maine (adults and children) only did the call-back for their version 2 sample.

    • Kentucky did not collect adult ACBS data, but did collect child data.

    • Both landline and cell phone samples were used for both the BRFSS and the ACBS for adults in the following 24 areas: California, Hawaii, Illinois, Indiana, Iowa, Michigan, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, New York, Ohio, Oklahoma, Oregon, Pennsylvania, Texas, Utah, Vermont, Washington, Wisconsin and Puerto Rico. Data for 22 of these 24 areas are included in the Landline Cell Phone (LLCP) public use file. Data from just the landline sample for 22 of these 24 areas are also included in the Landline (LL) public use file. Data from Utah and Puerto Rico were not included in either file due to the limited number of months of data collection.

    • Combined Landline Cell Phone (LLCP) data for children from Illinois, Missouri, Montana, New Hampshire, New York, Oklahoma, Oregon, and Wisconsin  are not included in the public release file because there were too few records (<75) to produce reliable weights. Child data from Puerto Rico are not included in the public release file due to the limited number of months of data collection. Indiana, Iowa, Nevada, Utah, and Vermont did not collect child data.

    • Combined Landline Cell Phone (LLCP) data: for adults in California, Iowa, New York, Nevada, Ohio, Oregon, Texas, and Wisconsin, and for children in New Mexico, Pennsylvania, Texas, and Washington, more than 10% of the BRFSS records for respondents with asthma had no recorded information about call-back participation.  For these states, weighting was done using a Modified Adjustment Factor method.

    • While both Landline and Cell Phone samples were used for the BRFSS, only the Landline sample was used for ACBS in the following 16 areas: Alabama, Arizona, Connecticut, DC, Florida, Georgia, Kansas, Louisiana, Maine, Maryland, Massachusetts, New Jersey, North Carolina, North Dakota, Rhode Island, and West Virginia. Data for 15 of these 16 areas are included in the Landline (LL) public use file. Data from Maryland were not included due to the limited number of months of data collection.

    • Landline (LL) data for children from Alabama, Arizona, California, DC, Georgia, Hawaii, Illinois, Louisiana, Maine, Massachusetts, Mississippi, Missouri, Montana, New Hampshire, New Mexico, New York, North Dakota, Oklahoma, Oregon, Rhode Island, Texas, West Virginia, and Wisconsin  are not included in the Landline public release file because there were too few records (<75) to produce reliable weights. Child data from Puerto Rico are not included in the public release file due to the limited number of months of data collection. Florida, Indiana, Iowa, Nevada, North Carolina, Utah, and Vermont did not collect child data.

    • Landline (LL) only data: for adults in California, Georgia, Massachusetts, Ohio, Rhode Island, and Wisconsin and for children in Maryland and Pennsylvania, more than 10% of the BRFSS records for respondents with asthma had no recorded information about call-back participation.  For these states, weighting was done using a Modified Adjustment Factor method.

  • 2011

    • Georgia child ACBS data could not be weighted due to a processing error that did not allow weighting the BRFSS child data. Since no BRFSS child weight was produced, the subsequent ACBS weight (which relies on the BRFSS weight) could not be calculated. Child data for Georgia were not included in the public release file or in the response rate tables.

    • Child data for Alabama, Arizona, DC, Hawaii, Illinois, Iowa, Louisiana, Maine, Massachusetts, Mississippi, Missouri, Montana, New York, North Dakota, Oklahoma, Oregon, Rhode Island, Vermont, West Virginia, and Wisconsin are not included in the public release file because there were too few records (<75) to produce reliable weights.

    • North Carolina (adults), Massachusetts (adults and children) and New York (adults and children) only did the call-back for the version 1 sample.

    • Maine (adults and children) only did the call-back for the version 3 sample.

    • For adults in Alabama, California, Florida, Georgia, Indiana, Nevada, New Jersey, and Wisconsin, and for children in Indiana, New Mexico, Nebraska, and Texas, more than 10% of the BRFSS records for respondents with asthma had no recorded information about call-back participation. For these states, weighting was done using a Modified Adjustment Factor method.

    • Both landline and cell phone samples were used for both the BRFSS and the ACBS in the following six states: Michigan, Oregon, Pennsylvania, Texas, Utah, and Wisconsin.

    • While both landline and cell phone samples were used for the BRFSS, cell phone samples were not used for the ACBS in the following 33 areas: Alabama, Arizona, California, Connecticut, DC, Florida, Georgia, Hawaii, Illinois, Indiana, Iowa, Kansas, Louisiana, Maine, Maryland, Massachusetts, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Jersey, New Mexico, New York, North Dakota, Ohio, Oklahoma, Rhode Island, Vermont, Washington, West Virginia, and Puerto Rico.


For additional information on weighting the ACBS records, refer to the document “Asthma Call-back Weighting Methods” which can be requested from NCEH/EHHE/APRHB ([email protected]).


B. Other limitations of the data


  • The Institutional Review Board (IRB) in some states required that asthma be mentioned when the BRFSS respondent was asked to participate in the ACBS. Other states required that asthma not be mentioned. Some state IRBs required that BRFSS respondents be specifically asked if their BRFSS responses could be linked to their ACBS responses. Other state IRBs did not. If a state required active consent to link the responses from the two interviews, the PERMISS variable on the data file will be coded 1 for yes. If consent was denied, the ACBS was not conducted and there will be no record in the file. Wording for specific consent scripts can be obtained from each participating state.


  • Several states ask the ACBS consent questions directly after the asthma questions in the core of the BRFSS survey. Other states ask the consent questions at the end of the BRFSS interview.

  • Approximately 3% of the ACBS interviews are completed in the calendar year following the BRFSS interview. The variable IYEAR_F identifies the year of the call-back interview.


Information about survey disposition codes, item non-response, complete and incomplete designation can be found in the ACBS Summary Data Quality Report. Similar information about the BRFSS can be found in the BRFSS Summary Data Quality Report which can be accessed at http://www.cdc.gov/brfss/technical_infodata/quality.htm



C. Data file and record issues


Data file


  • When the intent of an analysis is to compare those with asthma to those who do not have asthma, the appropriate file to use is the BRFSS file. The sample size is larger and the responses to BRFSS questions are available for all respondents with asthma and without asthma.

  • When the intent of an analysis is to compare subpopulations of those with asthma, the appropriate file to use is the call-back file.


Data record


  • The ACBS record for a respondent consists of the entire BRFSS interview record followed by the ACBS data. There is no need to merge the ACBS data with data from the BRFSS interview. The ACBS codebook, however, does not include the BRFSS portion of the data. BRFSS codebooks can be accessed at: http://www.cdc.gov/brfss/annual_data/annual_data.htm after selecting an individual survey year.

Skip patterns


  • The Asthma Call-back questionnaire has multiple and complex skip patterns. Each of the skip patterns has been coded into subsequent questions using individual value codes to identify the source response that caused the question to be skipped. These additional codes do not appear in the questionnaire, but are in the codebook. This skip coding allows the analyst to clearly determine an existing skip pattern and easily decide the denominator appropriate for any given analysis or statement without tracing skip patterns in the questionnaire. For more information on coding skip patterns see the document “Coding Skip Patterns,” which can be requested from NCEH/EHHE/APRHB ([email protected]).

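To make the denominator logic concrete, here is a minimal sketch in Python with pandas. All column names and value codes below are hypothetical (the actual skip codes appear only in the codebook); the point is simply that skip codes are excluded from the denominator rather than traced through the questionnaire.

    import pandas as pd

    # Hypothetical extract: Q2 is asked only of respondents who answered 1
    # ("yes") to Q1. Suppose the codebook assigns 7 = don't know, 9 = refused,
    # and a skip code 97 to records routed around Q2 by their Q1 response.
    acbs = pd.DataFrame({
        "Q1": [1, 1, 2, 1, 9],
        "Q2": [1, 2, 97, 7, 97],
    })

    # Denominator for a Q2 percentage: respondents actually asked Q2,
    # excluding the skip code and the don't-know/refused codes.
    asked_q2 = acbs.loc[~acbs["Q2"].isin([97, 7, 9])]
    pct_yes = (asked_q2["Q2"] == 1).mean() * 100
    print(f"Percent 'yes' among those asked Q2: {pct_yes:.0f}%")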


Calculated variables


  • Not all of the variables that appear on the public use data set are taken directly from the ACBS questionnaire. CDC prepares a large set of calculated variables which are added to the actual questionnaire responses. The vast majority of the variables on the ACBS file are calculated variables. The calculated variables are created for the user's convenience. The procedures for the calculated variables vary in complexity; some only combine codes from one or two questions, while others require sorting and combining selected codes from multiple variables.


  • At the time of the call-back interview, the respondent is asked to confirm the responses to the two asthma questions from the BRFSS interview. Not all respondents agree with the responses that were recorded from the initial interview.


  • The calculated combined call-back asthma variables _CUR_ASTH_C and _EVER_ASTH_C are not identical to the BRFSS asthma variables ASTHNOW and ASTHMA3 (CASTHNO2 and CASTHDX2 for children) or the BRFSS adult calculated variables _CASTHM1 and _LTASTH1.

  • The combined call-back variables _CUR_ASTH_C and _EVER_ASTH_C use the BRFSS responses when the respondent agreed with them and the responses at the time of the call-back interview when the respondent did not agree with the BRFSS responses.


When using call-back data, the combined variables (_CUR_ASTH_C and _EVER_ASTH_C) should be used rather than the BRFSS interview variables.
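
A minimal sketch of this combination rule in Python with pandas follows. The confirmation flag and call-back column names here are hypothetical stand-ins, not the actual ACBS variable names:

    import numpy as np
    import pandas as pd

    # Hypothetical columns: ASTHNOW is the BRFSS current-asthma response,
    # CONFIRMED is 1 when the respondent agreed with it at the call-back,
    # and CB_ASTHNOW is the answer given at the call-back when they disagreed.
    df = pd.DataFrame({
        "ASTHNOW":    [1, 2, 1, 2],
        "CONFIRMED":  [1, 1, 2, 2],
        "CB_ASTHNOW": [np.nan, np.nan, 2, 1],
    })

    # The rule described above: keep the BRFSS answer when confirmed,
    # otherwise take the answer from the call-back interview.
    df["CUR_ASTH_COMBINED"] = np.where(df["CONFIRMED"] == 1,
                                       df["ASTHNOW"], df["CB_ASTHNOW"])
    print(df)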


  • For further details regarding these and other calculated variables, refer to the document entitled “Calculated Variables for the Asthma Call-back Survey,” which can be requested from NCEH/EHHE/APRHB ([email protected]).


Questionnaire changes


  • There were no changes to the ACBS questionnaire between 2010 and 2011.

  • 2012 ACBS questionnaire changes included:

    • The name for INH_MED 25 was changed to Flexhaler

    • Three inhaler medication questions were deleted (ILP01, ILP02, and ILP07)

    • Response categories for inhaler medication question ILP03 were changed

    • The question PILLX (how long taking a specific pill) was deleted and PILL01 (on daily use) was added

    • Three questions on nebulizers were added

    • Some skip patterns and help screens were revised in the medication section

    • The content of the Work-related asthma section was completely revised

    • The time reference period for the activity limitation variable was changed from 12 months to 30 days

  • 2013 ACBS questionnaire changes included:

    • Inhaler medications Brethaire, Intal, and Tilade were deleted since all have been discontinued

    • Inhaler medications Alvesco and Dulera were added

    • Nebulizer medications Combivent Inhalation Solution and Perforomist/Formoterol were added


D. Estimation procedures


Statistical issues


  • Record weights


Unweighted data on the ACBS represent the actual responses of each respondent before any adjustment is made for variation in respondents' probability of selection, disproportionate selection of population subgroups relative to the state's population distribution, or nonresponse. To produce the ACBS final weight, the BRFSS final weight is adjusted for loss of sample between the BRFSS interview and the ACBS interview. Weighted ACBS data represent results that have been adjusted to compensate for nonresponse at the BRFSS interview and at the ACBS interview. For further details regarding the ACBS final weight, refer to the document entitled "Asthma Call-back Weighting Methods," which can be requested from NCEH/EHHE/APRHB ([email protected]).


Use of the ACBS final weight is essential when analyzing these data. If weights are not used, the estimates produced will be biased.
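
The effect of weighting can be seen in a small sketch (Python with numpy; the records and weight values are invented for illustration):

    import numpy as np

    # y = 1 if the respondent reports the outcome of interest; w = the ACBS
    # final weight for the record (e.g., LLCPWT_F). Values are illustrative.
    y = np.array([1, 0, 0, 1, 0, 1])
    w = np.array([850.0, 1200.0, 400.0, 2300.0, 950.0, 600.0])

    unweighted = y.mean() * 100                 # simple percentage of records
    weighted = np.average(y, weights=w) * 100   # sum(w*y) / sum(w)
    print(f"Unweighted: {unweighted:.1f}%  Weighted: {weighted:.1f}%")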


2013


In 2013, the majority of states implementing the ACBS included both the BRFSS landline sample (LL) and the BRFSS cell phone (CP) sample. However, a significant minority included only the BRFSS landline (LL) sample. Therefore public use files were released for adult Landline (LL) only samples and for Landline and Cell Phone (LLCP) samples. The LL only file includes landline data from all participating states that met data quality standards. The LLCP file includes landline and cell phone data from the subset of states that included both the landline and the cell phone samples and met data quality standards.


The ACBS child file must have a minimum of 75 completes to produce a reliable child weight. Among states that collected only a landline (LL) sample, Kansas was the only state that met the minimum requirement. An LL-only data file containing only Kansas was not released publicly. The public use file for children was released for the combined Landline and Cell Phone (LLCP) samples (including 10 areas that met data quality standards).

ACBS Landline (LL) file

In the 2013 ACBS LL only file, the ACBS final weight was produced using the BRFSS landline only weight (_LANDWT for adults and _CLANDWT for children). The 2013 ACBS LL only final weight variables are:

  • LANDWT_F for adults

  • CLANDWT_F for children


ACBS Landline Cell Phone (LLCP) file

In the 2013 ACBS LLCP file, the ACBS final weight was produced using the BRFSS landline cell phone weight (_LLCPWT for adults and _CLLCPWT for children). The 2013 ACBS LLCP final weight variables are:

  • LLCPWT_F for adults

  • CLLCPWT_F for children


2012


In 2012, the majority of states implementing the ACBS included both the BRFSS landline sample (LL) and the BRFSS cell phone (CP) sample. However, a significant minority included only the BRFSS landline (LL) sample. Therefore public use files were released for Landline only (LL) samples and for combined Landline Cell Phone (LLCP) samples. The LL only file includes landline data from all participating states that met data quality standards. The LLCP file includes landline and cell phone data from the subset of states that included both the landline and the cell phone samples and met data quality standards.


ACBS Landline (LL) file

In the 2012 ACBS LL only file, the ACBS final weight was produced using the BRFSS landline only weight (_LANDWT for adults and _CLANDWT for children). The 2012 ACBS LL only final weight variables are:

  • LANDWT_F for adults

  • CLANDWT_F for children


ACBS Landline Cell Phone (LLCP) file

In the 2012 ACBS LLCP file, the ACBS final weight was produced using the BRFSS landline cell phone weight (_LLCPWT for adults and _CLLCPWT for children). The 2012 ACBS LLCP final weight variables are:

  • LLCPWT_F for adults

  • CLLCPWT_F for children


2011


In 2011, the majority of states implementing the ACBS only included the BRFSS landline sample, therefore the ACBS final weight was produced using the BRFSS landline only weight (_LANDWT for adults and _CLANDWT for children).


In the 2011 ACBS file, the final weight variables are:

  • LANDWT_F for adults

  • CLANDWT_F for children.


In 2011, six states (Michigan, Oregon, Pennsylvania, Texas, Utah, and Wisconsin) implemented the ACBS using both the BRFSS landline sample and the BRFSS cellphone sample. ACBS data for these six states were also weighted using the combined landline and cellphone samples and the BRFSS final weight for the combined landline cellphone samples. These data were not released publicly. The six states can be contacted for access to the weighted data using both landline and cellphone samples. State-specific rules regarding data release may allow or prevent access to these data.


  • Variances


The procedures for estimating variances described in most statistical texts and used in most statistical software packages are based on the assumption of simple random sampling (SRS). However, the data collected in the ACBS are obtained through a complex sample design; therefore, the direct application of standard statistical analysis methods for variance estimation (including standard errors and confidence intervals) and hypothesis testing (p-values) may yield misleading results.

Computer programs that take such complex sample designs into account are available. SAS, SUDAAN, Epi Info, SPSS and STATA are among those suitable for analyzing these data.


  • SAS SURVEYMEANS, SURVEYFREQ, SURVEYLOGISTIC, and SURVEYREG can be used for tabular and regression analyses [3].

  • SUDAAN can be used for tabular and regression analyses and also has additional options [4].

  • Epi Info's C-sample can be used to calculate simple frequencies and two-way cross-tabulations [5].

  • SPSS Complex Samples can be used to produce frequencies, descriptives, cross-tabulations, and ratios as well as estimate general linear, logistic, ordinal, and Cox regression models [6].

  • STATA can produce cross-tabulations, means, logit and general linear regression models [7].


When using these software products, users must specify that the sample design is "With Replacement" and also specify the stratum variable (_STSTR), the primary sampling unit (_PSU), and the record weight (LLCPWT_F, CLLCPWT_F, LANDWT_F, or CLANDWT_F), all of which are on the public use data file.


For more information on calculating variance estimates using SAS, see the SAS/STAT User's Guide, Version 13 [3]. For information about SUDAAN, see the SUDAAN User's Manual, Release 11.0 [4]. For information about Epi Info, see Epi Info, Version 7.0 [5]. For information about SPSS, see the SPSS Complex Samples Manual [6]. For information about STATA, see the Survey Data Reference Manual [7].
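
To illustrate what those design statements do, the following is a hand-rolled sketch in Python of the with-replacement, Taylor-linearized variance of a weighted prevalence, using _STSTR, _PSU, and a final weight. It is for intuition only, with invented data; in practice the survey procedures listed above should be used.

    import numpy as np
    import pandas as pd

    # Toy ACBS-like extract: stratum, PSU, final weight, 0/1 outcome.
    df = pd.DataFrame({
        "_STSTR": [1, 1, 1, 1, 2, 2, 2, 2],
        "_PSU":   [1, 1, 2, 2, 3, 3, 4, 4],
        "WT":     [900.0, 1100.0, 700.0, 1300.0, 500.0, 1500.0, 800.0, 1200.0],
        "Y":      [1, 0, 1, 1, 0, 0, 1, 0],
    })

    wsum = df["WT"].sum()
    p = (df["WT"] * df["Y"]).sum() / wsum        # weighted prevalence

    # Taylor-linearized score for a ratio mean, summed to PSU totals.
    df["u"] = df["WT"] * (df["Y"] - p) / wsum
    psu = df.groupby(["_STSTR", "_PSU"])["u"].sum().reset_index()

    # With-replacement stratified variance: for each stratum h with n_h PSUs,
    # add n_h/(n_h - 1) * sum over PSUs of (u_hi - mean_h)^2.
    var = 0.0
    for _, g in psu.groupby("_STSTR"):
        n_h = len(g)
        var += n_h / (n_h - 1) * ((g["u"] - g["u"].mean()) ** 2).sum()

    se = np.sqrt(var)
    print(f"p = {p:.3f}, SE = {se:.3f}, "
          f"95% CI = ({p - 1.96*se:.3f}, {p + 1.96*se:.3f})")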


Analytic issues


  • Sample size


Although the overall number of respondents in the ACBS is more than sufficiently large for statistical inference purposes, subgroup analyses (including state level analysis) can lead to estimates that are unreliable. Consequently, users need to pay particular attention to the subgroup sample when analyzing subgroup data, especially within a single data year or geographic area. Small sample sizes may produce unstable estimates. Reliability of an estimate depends on the actual unweighted number of respondents in a category, not on the weighted number. Interpreting and reporting weighted numbers that are based on a small, unweighted number of respondents can mislead the reader into believing that a given finding is much more precise than it actually is.


ACBS follows a rule of not reporting or interpreting point estimates based on fewer than 50 unweighted respondents (e.g., percentages based upon a denominator of fewer than 50) or for which the relative standard error is greater than 30%. For this reason, and to protect the confidentiality of these data, the FIPS county code is not included on the ACBS public use data record.
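
A minimal sketch of this reporting rule in Python (the function simply restates the two thresholds above):

    # Suppress an estimate when its unweighted denominator is below 50 or its
    # relative standard error (SE / estimate, as a percentage) exceeds 30%.
    def suppress(estimate: float, se: float, n_unweighted: int) -> bool:
        if n_unweighted < 50:
            return True
        rse = se / estimate * 100 if estimate else float("inf")
        return rse > 30.0

    print(suppress(estimate=0.12, se=0.05, n_unweighted=80))  # True: RSE ~ 42%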



  • Aggregating data over time


  • When data from one time period are insufficient, data from multiple periods can be combined as long as the prevalence of the factor of interest did not substantially change during one of the periods. One method that can be used to assess the stability of the prevalence estimates is as follows [8] (a worked sketch of the test step appears after the list):


  1. Compute the prevalence for the risk factor for each period.

  2. Rank the estimates from low to high.

  3. Identify a statistical test appropriate for comparing the lowest and the highest estimates at the 5% level of significance. For example, depending on the type of data, a t-test, or the sign test might be appropriate.

  4. Test the hypothesis that prevalence is not changing by using a two-sided test in which the null hypothesis is that the prevalences are equal.

  5. Determine whether the resulting difference could be expected to occur by chance alone less than 5% of the time (i.e., test at the 95% confidence level).

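A minimal sketch of steps 3-5 using a two-sample z-test (one reasonable choice of test; the estimates and standard errors below are invented, and in practice would come from a design-aware survey procedure):

    import numpy as np

    # Lowest- and highest-ranked period prevalences with standard errors.
    p_low, se_low = 0.083, 0.006
    p_high, se_high = 0.095, 0.007

    # Two-sided test of H0: the two prevalences are equal.
    z = (p_high - p_low) / np.sqrt(se_low**2 + se_high**2)
    print(f"z = {z:.2f}; reject H0 at the 5% level if |z| > 1.96")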

  • When combining multiple years of ACBS data for the purpose of subgroup analysis, the final weight will need adjusting and the file year will need to be added as an additional stratum on the complex design specification. When combining multiple years of data for the purpose of examining trends, however, reweighting is not appropriate. For more information on reweighting combined years see the document "Reweighting Combined Files," which can be requested from NCEH/EHHE/APRHB ([email protected]). A rough illustrative sketch follows.

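As a sketch of what such a reweighting can look like, one common rescale-and-restratify approach is shown below in Python with pandas. This is an assumption for illustration, not necessarily the method in the NCEH document:

    import pandas as pd

    # Invented two-year extract with final weights and design variables.
    y2012 = pd.DataFrame({"YEAR": 2012, "_STSTR": [1, 2], "_PSU": [1, 2],
                          "LLCPWT_F": [1000.0, 1500.0]})
    y2013 = pd.DataFrame({"YEAR": 2013, "_STSTR": [1, 2], "_PSU": [3, 4],
                          "LLCPWT_F": [900.0, 1600.0]})
    combined = pd.concat([y2012, y2013], ignore_index=True)

    # Rescale the final weight by the number of combined years (hypothetical
    # adjustment) and fold the file year into the stratum identifier.
    n_years = combined["YEAR"].nunique()
    combined["WT_COMBINED"] = combined["LLCPWT_F"] / n_years
    combined["STSTR_YR"] = (combined["YEAR"].astype(str) + "_" +
                            combined["_STSTR"].astype(str))
    print(combined)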


  • Analyzing subgroups

  • Provided that the prevalence of risk factors did not change rapidly over time, data combined for two or more years may provide a sufficient number of respondents for additional estimates for population subgroups (such as age/sex/race subgroups or state populations). Before combining data years for subgroup analysis, it is necessary to determine whether the total number of respondents will yield the precision needed, which depends upon the intended use of the estimate. For example, greater precision would be required to justify implementing expensive programs than that needed for general information only.


The table below shows the sample size required for each of several levels of precision, based on a calculation in which the estimated risk factor prevalence is 50% and the design effect is 1.5.

| Precision desired | Sample size needed |
|-------------------|--------------------|
| 2%                | 3600               |
| 4%                | 900                |
| 6%                | 400                |
| 8%                | 225                |
| 10%               | 144                |
| 15%               | 64                 |
| 20%               | 36                 |


Precision is indicated by the width of the 95% confidence interval around the prevalence estimate. For example, precision of 2% indicates that the 95% confidence interval is plus (+) or minus (-) 2% of 50%, or 48% to 52%. As shown in the table, to yield this high a level of precision, the sample size required is about 3,600 persons. When a lower level of precision is acceptable, the sample size can be considerably smaller.
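
The table follows from the usual sample-size formula n = deff * z^2 * p(1 - p) / d^2, where d is the half-width of the confidence interval. A small Python sketch (using z^2 = 3.84, i.e., the 1.96 critical value squared and rounded, which reproduces the table exactly):

    # Required sample size at prevalence p, design effect deff, and
    # half-width d of the 95% confidence interval.
    def sample_size(d: float, p: float = 0.5, deff: float = 1.5,
                    z2: float = 3.84) -> int:
        return round(deff * z2 * p * (1 - p) / d**2)

    for d in (0.02, 0.04, 0.06, 0.08, 0.10, 0.15, 0.20):
        print(f"{d:.0%} precision -> n = {sample_size(d)}")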


  • The design effect is a measure of the complexity of the sampling design that indicates how the design differs from simple random sampling. It is defined as the variance for the actual sampling design divided by the variance for a simple random sample of the same size [8,9]. For most risk factors in most states, the design effect is less than 1.5. If it is more than 1.5, however, sample sizes may need to be larger than those shown in the table above.


  • The standard error of a percentage is largest at 50% and decreases as a percentage approaches 0% or 100%. From this perspective, the required sample sizes listed in the table above are conservative estimates. They should be reasonably valid for percentages between 20% and 80%, but may significantly overstate the required sample sizes for smaller or larger percentages.




E. Advantages and disadvantages of telephone surveys


  • Compared with face-to-face interviewing techniques, telephone interviews are easy to conduct and monitor and are cost efficient. However, telephone interviews have limitations. Telephone surveys may have higher levels of non-coverage than face-to-face interviews because some U.S. households cannot be reached by telephone. While approximately 94.1% of households in the United States have telephones, a number of studies have shown that the telephone and non-telephone populations are different with respect to demographic, economic, and health characteristics [10,11,12]. Although the estimates of characteristics for the total population are unlikely to be substantially affected by the omission of the households without telephones, some of the subpopulation estimates could be biased. Telephone coverage is lower for population subgroups such as blacks in the South, people with low incomes, people in rural areas, people with less than 12 years of education, people in poor health, and heads of households under 25 years of age [13]. However, poststratification adjustments for age, race, and sex, and other weighting adjustments used for the BRFSS and ACBS data minimize the impact of differences in noncoverage, undercoverage, and nonresponse at the state level.


Despite the above limitations, prevalence estimates from the BRFSS correspond well with findings from surveys based on face-to-face interviews, including studies conducted by the National Institute on Alcohol Abuse and Alcoholism, CDC's National Center for Health Statistics, and the American Heart Association [8,14]. A summary of methodological studies of the BRFSS can be found at: http://www.cdc.gov/brfss/publications/methodology/data_qvr.htm


Surveys based on self-reported information may be less accurate than those based on physical measurements. For example, respondents are known to underreport weight. Although this type of potential bias is an element of both telephone and face-to-face interviews, the underreporting should be taken into consideration when interpreting self-reported data. However, when measuring change over time, this type of bias is likely to be constant and is therefore not a factor in trend analysis.


  • With ongoing changes in telephone technology, there are more and more households that have cellular telephones and no traditional telephone lines in their homes. These households are presently not in the sampling frame for the BRFSS, which may bias the survey results, especially as the percentage of cellular-telephone-only households continues to increase [15,16]. The BRFSS is continuing to study the impact of cellular phones on survey response and the feasibility of various methods for data collection to complement present survey methods [1].



REFERENCES


1. Mokdad AH, Stroup DF, Giles WH. Public health surveillance for behavioral risk factors in a changing environment. Recommendations from the Behavioral Risk Factor Surveillance Team. MMWR Recomm Rep. 2003; 52:1-12.

2. Holtzman D. The Behavioral Risk Factor Surveillance System. In: Blumenthal DS, DiClemente RJ, editors. Community-based Health Research: Issues and Methods. New York: Springer Publishing Company Inc; 2004.p. 115-131.

3. http://support.sas.com/documentation/cdl/en/statug/63962/HTML/default/viewer.htm#statug_introsamp_sect007.htm.

4. http://www.rti.org/sudaan/page.cfm/SUDAAN_Eleven_Examples

5. Dean AG, Dean JA, Coulombier D, Brendel KA, Smith DC, Burton AH, Dicker RC, Sullivan K, Fagan RF, Arner TG. Epi Info, Version 6.0: A Word processing, Database, and Statistics Program for Public Health on IBM-compatible Microcomputers. Atlanta: Centers for Disease Control and Prevention; 1995.

6. SPSS Inc. SPSS Complex Samples 15.0. Chicago, IL: SPSS Inc; 2006.

7. STATA Press. Survey Data Reference Manual; 2009.

8. Frazier EL, Franks AL, Sanderson LM. Behavioral Risk Factor Data. In: Using Chronic Disease Data: A Handbook for Public Health Practitioners. Atlanta: Centers for Disease Control and Prevention; 1992; 4.1-1.17.

9. Groves RM. Survey Errors and Survey Costs. New York: John Wiley and Sons; 1989.p. 265, 271-272.

10. Groves RM, Kahn RL. Surveys by Telephone: A National Comparison with Personal Interviews, New York: Academic Press; 1979.

11. Banks MJ. Comparing health and medical care estimates of the phone and nonphone populations. Proceedings of the American Statistical Association, Survey Research Methods Section.; 1983; p. 569-574.

12. Thornberry OT, Massey JT. Trends in United States Telephone Coverage Across Time and Subgroups. In: Groves, RM et al editors. Telephone Survey Methodology. New York: John Wiley & Sons; 1988: p. 25-49.

13. Massey JT, Botman SL. Weighting Adjustments for Random Digit Dialed Surveys. In: Groves, RM et al editors. Telephone Survey Methodology. New York: John Wiley & Sons; 1988; p.143-160.

14. Nelson DE, Powell-Griner E, Town M, Kovar MG. A comparison of national estimates from the National Health Interview Survey and the Behavioral Risk Factor Surveillance System. Am J Public Health. 2003; 93:1335-1341.

15. Link MW, Mokdad AH. Leaving answering machine messages: do they increase response rates for RDD surveys? Int J Public Opin Res. 2004;482.

16. Link MW, Mokdad AH, Town M, Roe D, Weiner J. Improving response rates for the Behavioral Risk Factor Surveillance system: use of lead letters and answering machine messages. Proceedings of the American Statistical Association, Survey Methodology Section [CD-ROM]. Alexandria, VA: 2004; p. 141-148.

17. http://www.cdc.gov/asthma/nhis/2013/data.htm

18. http://www.cdc.gov/nchs/fastats/asthma.htm

