
Behavioral Risk Factor Surveillance System (BRFSS) Asthma Call-back Survey (ACBS)

BRFSS History and Analysis Guidance

OMB: 0920-1204













2016


Behavioral Risk Factor Surveillance System


Asthma Call-back Survey


History

and

Analysis Guidance





National

Asthma

Control

Program






Version 1.0.0

05/1/19



ACKNOWLEDGMENT


The Asthma Call-back Survey (ACBS) is funded by the National Asthma Control Program (NACP) in the Asthma and Community Health Branch of the National Center for Environmental Health (NCEH). The state health departments jointly administer the ACBS with the National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP), Division of Population Health (DPH).


The NCEH and the NCCDPHP greatly appreciate the efforts of the BRFSS staff in each ACBS-participating state.



Josephine Malilay, PhD

Branch Chief

Asthma and Community Health Branch

Division of Environmental Health Science and Practice

National Center for Environmental Health

Centers for Disease Control and Prevention

4770 Buford Hwy, NE

Mailstop F-60

Atlanta, GA 30341


Phone: (770) 488-3465

E-mail: [email protected] 


 

Machell G. Town, PhD

Branch Chief

Population Health Surveillance Branch

Division of Population Health

National Center for Chronic Disease Prevention and Health Promotion

Centers for Disease Control and Prevention

1600 Clifton Road NE

Mail Stop E97

Atlanta, GA 30333 USA


Phone: (770) 488-4681

E-mail: [email protected]






















Asthma Call-back Survey

History


What is public health surveillance?


Public health surveillance is the ongoing, systematic collection, analysis, interpretation, and dissemination of data regarding a health-related event for use in planning and delivering public health action to reduce morbidity (disease) and mortality (death) and to improve health. Data disseminated by a public health surveillance system can help in the formulation of research hypotheses, as well as aid the following actions:


  • guiding immediate action in a public health emergency

  • measuring the prevalence of a disease

  • identifying populations at high risk for disease

  • monitoring disease outbreaks

  • planning, implementing and evaluating prevention/control strategies for diseases, injuries, and adverse exposures

  • monitoring behavior that increases health risk

Why do we need asthma surveillance?


Asthma is one of the nation’s most common and costly chronic conditions. It will affect more than 43 million US residents during their lifetime and generates an estimated $30 billion annually in asthma-related costs1. In the past 12 months, about 9.1 million adults and 3.3 million children had an asthma attack, which can be life threatening1. More than 3,500 people die from asthma-related complications each year2.


Managing asthma and reducing the burden of this disease requires a long-term, multifaceted approach that includes patient education, behavior changes, asthma trigger avoidance, pharmacological therapy, frequent medical follow-up, and the development of best practices that put the findings of asthma-related research into sound public health practice. In this way, disease-related data also can help state and local health departments evaluate their asthma control programs and maintain their successful interventions.

Furthermore, Healthy People 2020 (HP2020) goals include 8 objectives that require asthma surveillance data. (https://www.healthypeople.gov/2020/topics-objectives/topic/respiratory-diseases/objectives)

CDC’s National Asthma Control Program plays a critical role in addressing the health risks that US residents face from this disease. The program funds states, cities, and schools to improve asthma surveillance, train health professionals, raise public awareness, and educate individuals with asthma and their families. The National Asthma Control Program is a function of the Asthma and Community Health Branch (ACHB), Division of Environmental Health Science and Practice in the National Center for Environmental Health.




What is the history of asthma surveys at CDC?


Surveys conducted by the National Center for Health Statistics have long collected data on asthma prevalence, asthma-related deaths (mortality), and several indirect indicators of asthma-related illness (morbidity), such as hospitalizations. These data provide a good basis for analyzing national trends, but not all of them can be analyzed at the state level.


State health agencies acquire and use resources to reduce behavioral health risks and the diseases that may result from them. ACHB saw the need to expand existing data systems and develop new systems in order to make data available at a state or local level, to do so more quickly, and provide asthma data with more detail.


In 1984, CDC established the Behavioral Risk Factor Surveillance System (BRFSS), a state-based system of health surveys administered and supported by the Division of Community Health, in the National Center for Chronic Disease Prevention and Health Promotion. Beginning with 15 states in 1984, the BRFSS is now conducted in all states, the District of Columbia, and participating US territories. The BRFSS is a telephone survey that obtains information on health risk behaviors, clinical preventive health practices, and health care access, primarily related to chronic disease and injury, from a random, representative sample of noninstitutionalized adults in each state. States use the data to identify emerging health problems, establish and track health objectives, and develop and evaluate public health policies and programs. Many states also use BRFSS data to inform health-related legislative efforts.


In 2000, ACHB added questions about current and lifetime asthma prevalence to the core BRFSS survey. Since 2001, states have also had the option of adding an adult asthma history module to their survey, and in 2005 a child asthma prevalence module was included in the questionnaire (which requires the use of the child selection module as well). Many states, however, do not choose to add these modules because of cost or because they have more pressing needs for other health-related data.


Using the BRFSS to collect additional information on asthma met two of ACHB’s three objectives to improve asthma surveillance. First, the BRFSS provides state-level data for all 50 states, the District of Columbia, and participating US territories. Second, it is timely; data become available soon after the end of the calendar year of data collection.


The third ACHB surveillance objective is to increase the content detail for asthma surveillance data. Efforts to meet this objective began in 1998, when ACHB began creating a new survey with very detailed asthma content, called the National Asthma Survey (NAS)3. A number of pilot tests of the survey were conducted in 2001 and 2002. The first survey used the State and Local Area Integrated Telephone Survey, an independent survey mechanism that was an offshoot of the National Immunization Program survey at CDC. The NAS complemented and extended survey work from the National Health Interview Survey, the National Health and Nutrition Examination Survey, and the BRFSS. It added depth to the existing body of asthma data, helped address critical questions surrounding the health and experiences of persons with asthma, and could provide data at the state and local level.


In 2003 and early 2004, data were collected by the NAS in a national sample and in four state samples, but this proved to be a complex and costly process. So, in 2004, ACHB considered using the BRFSS as a way to identify respondents with asthma for further interviewing on a call-back basis. The BRFSS includes a much larger sample size in each area than that of the NAS. Respondents who answered “Yes” to questions about current or lifetime asthma during the BRFSS interview would be eligible for the subsequent asthma survey.


In 2005, the original NAS questionnaire was modified to eliminate items already on the BRFSS and to add some content requested by the individual states. The BRFSS provided respondents for the call-back survey in three asthma grantee states (Minnesota, Michigan, and Oregon) for the call-back pilot. ACHB increased the size of each state’s BRFSS sample to 10,000 respondents, hoping to obtain at least 1,000 respondents with asthma to call back. This increase in sample size, however, was very expensive, costing an additional $500,000 per state. Consequently, since 2006, the state BRFSS sample has not been increased for the Asthma Call-Back Survey (ACBS).


States that plan to conduct the ACBS among adults with asthma identified during the BRFSS no longer need to add the Adult Asthma Module to the BRFSS, since the questions on the call-back survey provide even more detailed answers. Nevertheless, if states wish to include children in the call-back survey, they must also include both BRFSS child modules: Random Child Selection and Childhood Asthma Prevalence module.


The ACBS has been implemented through the BRFSS every year since 2006, and the number of states or territories participating has increased each year. Beginning with the 2011 survey, the BRFSS weighting methodology was changed significantly, and cell phone samples were added to the traditional landline samples4. The new weighting methodology, iterative proportional fitting (also known as “raking”), replaced the poststratification weighting method that had been used with previous BRFSS data sets. Because of these two methodological changes, data from 2010 and earlier are not comparable with data from 2011 and later. Since the ACBS is methodologically linked to the BRFSS, ACBS data are also subject to these changes. Consequently, ACBS data from 2010 and earlier should not be compared or combined with ACBS data from 2011 and later.


In addition, while BRFSS initiated cell phone samples in 2011, not all ACBS-participating states included the cell phone sample in the ACBS. In 2011, only 6 of the 40 states included the cell phone sample in the ACBS. Because the majority of states doing the ACBS used only the landline samples, the landline sample weight was used to produce the ACBS weight and only landline data were included in the 2011 public-release file. The cell phone sample data from the 6 states that included cell phone samples in the 2011 ACBS were not released publicly5.


Detailed information for ACBS data from 2011-2015 can be found in the document titled “History and Analysis Guidance” at: https://www.cdc.gov/brfss/acbs/index.htm.


Data from the ACBS 2011 landline-only file are methodologically comparable to data from the 2012 and later landline-only files, but are not comparable to the ACBS LLCP data. Data from the ACBS 2012 LLCP file are methodologically comparable only to the ACBS 2013 and later LLCP files. From 2015 forward, ACBS public-release files include only states collecting both landline and cell phone samples for both adults and children6.


As in 2015, the 2016 ACBS protocol required states to collect both landline and cell phone (LLCP) samples for both adults and children. The adult LLCP public-release file includes the 32 states/territories that met data quality standards among those that collected both LLCP samples (i.e., adult data from 32 states/territories are included in the public-release file). For children, 29 states/territories collected data, but only 10 met the data quality standards among those that collected both LLCP samples (i.e., 10 states/territories are included in the public-release file).



Questionnaires, tables, data files, and documentation for the ACBS can be accessed on its website: http://www.cdc.gov/brfss/acbs/index.htm


Funds are available for the ACBS from ACHB yearly through the BRFSS cooperative agreement. Any state or territory can apply for funds to implement the ACBS. States must include both the child asthma prevalence module and the random child selection module to include children in the call-back survey. The BRFSS sample size will not be increased for the ACBS. In order to produce a sufficient number of respondents for detailed analysis, it is recommended that a state conduct the ACBS for at least 2 consecutive years. States participating in the 2016 ACBS are shown in the following table.


2016 Participating States

States            Adult LLCP    Child LLCP
Arizona           Included      *
California        Included      *
Connecticut       Included      Included
Florida           Included      Included
Georgia           Included      *
Hawaii            Included      *
Illinois          Included      *
Indiana           Included      *
Iowa              Included      DNCCD
Kansas            Included      *
Maine             Included      Included
Massachusetts     Included      *
Michigan          Included      Included
Minnesota         Included      Included
Mississippi       DNCAD         *
Missouri          Included      *
Montana           Included      *
Nebraska          Included      Included
Nevada            Included      DNCCD
New Hampshire     Included      DNCCD
New Jersey        Included      *
New Mexico        Included      *
New York          Included      Included
Ohio              Included      Included
Oklahoma          Included      *
Oregon            Included      *
Pennsylvania      Included      *
Rhode Island      Included      *
Texas             Included      DNCCD
Utah              Included      Included
Vermont           Included      *
Wisconsin         Included      *
Puerto Rico       Included      Included

Table Legend: LLCP = Landline and Cell Phone Combined Sample
Included = Included in public use file
DNCAD = Did Not Collect Adult Data
DNCCD = Did Not Collect Child Data
* = Not included in the public use file due to having < 75 completes (see Data Anomalies).

Asthma Call-back Survey

Analysis Guidelines


The Asthma Call-back Survey (ACBS) is conducted approximately 2 weeks after the Behavioral Risk Factor Surveillance System (BRFSS) interview. BRFSS respondents who report ever being diagnosed with asthma are eligible for the asthma call-back. If a state includes children in the BRFSS and the randomly selected child has ever been diagnosed with asthma, then the child is eligible for the asthma call-back. If both the selected child and the BRFSS adult in a household have asthma, then one or the other is eligible for the ACBS (50/50 split).


BRFSS now collects data in all 50 states as well as the District of Columbia and three U.S. territories. BRFSS questionnaires, data, and reports are available at www.cdc.gov/brfss. The most recent BRFSS data user guide can be found at: https://www.cdc.gov/brfss/data_documentation/PDF/UserguideJune2013.pdf.


From the parent survey (BRFSS), the ACBS inherits a complex sample design and multiple reporting states/territories. These factors complicate the analysis of the ACBS. Some states vary from both BRFSS and ACBS protocol. These variations should be considered prior to analysis of these data. Information on the BRFSS deviations can be found in the document titled Comparability of Data which can be accessed at:

https://www.cdc.gov/brfss/annual_data/2016/pdf/compare_2016.pdf.


A. Data anomalies and deviations from sampling frame and weighting protocols


Several states did not collect ACBS data for all 12 months of the year. This may be an issue when investigating seasonal patterns in the data. States with more than 3 months of missing interviews are noted below. States missing more than 6 months of ACBS interviews are excluded from the public use data file.


  • 2016

    • California, Oklahoma, Oregon and Mississippi missed 6 or more months of interviews. Because asthma outcomes have seasonal variation, their data do not represent the same time period (spread over 12 months) as the data for other states.

    • Florida did not collect adult and child ACBS Interviews in January, February and March.

    • Illinois did not collect child ACBS interviews in January, September, October, and December. Illinois is not included in the child public release file.

    • Massachusetts did not collect child ACBS interviews in January, April, June and July. Massachusetts is not included in the child public release file.

    • Missouri did not collect adult and child ACBS interviews in January, February and March. Missouri is not included in the child public release file.

    • Montana did not collect adult and child ACBS interviews in January, February, March, and April; child interviews were also not collected in August. Montana is not included in the child public release file.

    • New Mexico did not collect child ACBS interviews in January, February and March. New Mexico is not included in the child public release file.

    • Wisconsin did not collect child ACBS interviews in January, February, and September. Wisconsin is not included in the child public release file.






Several states varied from ACBS protocol in ways that affected the weighting procedures.


  • 2016

    • Massachusetts conducted the call-back only for its version 1 sample for the child data set. Massachusetts is not included in the child public release file.

    • New York conducted the call-back only for its version 2 sample for the adult and child data sets.

    • Iowa, New Hampshire, Nevada and Texas did not collect child ACBS data.

    • Mississippi did not collect adult ACBS data.

    • Combined Landline and Cell Phone (LLCP) child data for Arizona, California, Georgia, Hawaii, Illinois, Indiana, Kansas, Massachusetts, Missouri, Mississippi, Montana, New Jersey, New Mexico, Oklahoma, Oregon, Pennsylvania, Rhode Island, Vermont, Wisconsin were not included in the public-release data file because of having too few records (<75) to produce reliable weights.

    • For combined Landline and Cell Phone (LLCP) adult data, the weighted states Connecticut, Florida, Georgia, Hawaii, Iowa, Massachusetts, Minnesota, New Jersey, New Mexico, Ohio, Rhode Island, Texas, Utah, and Wyoming had more than 10% of BRFSS records for respondents with asthma without recorded information about call-back participation. For these states, weighting was done using a Modified Adjustment Factor method.

    • For combined Landline and Cell Phone (LLCP) child data, the weighted states Connecticut, Florida, Maine, Minnesota, New York, Ohio, Puerto Rico, and Utah had more than 10% of BRFSS records for respondents with asthma without recorded information about call-back participation. For these states, weighting was done using a Modified Adjustment Factor method.



For additional information on weighting the ACBS records, refer to the document “Asthma Call-Back Weighting Methods” which can be requested from NCEH/EHHE/ACHB ([email protected]).


B. Other limitations of the data


  • The Institutional Review Board (IRB) in some states required that asthma be mentioned when the BRFSS respondent was asked to participate in the ACBS. Other states required that asthma not be mentioned. Some state IRBs required that BRFSS respondents be specifically asked if their BRFSS responses could be linked to their ACBS responses. Other state IRBs did not. If a state required active consent to link the responses from the two interviews, the PERMISS variable on the data file will be coded one (1) for yes. If consent was denied, the ACBS was not conducted and there will be no record in the file. Wording for specific consent scripts can be obtained from each participating state.


  • Several states ask the ACBS consent questions directly after the asthma questions in the core of the BRFSS survey. Other states ask the consent questions at the end of the BRFSS interview.

  • Approximately 3% of the ACBS interviews are completed in the calendar year following the BRFSS interview. The variable IYEAR_F identifies the year of the call-back interview.


Information about survey disposition codes, item nonresponse, and complete and incomplete designations can be found in the ACBS Summary Data Quality Report. Similar information about the BRFSS can be found in the BRFSS Summary Data Quality Report, which can be accessed at https://www.cdc.gov/brfss/acbs/2016/pdf/sdq_report_acbs_16-508.pdf



C. Data file and record issues


Data file


  • When the intent of an analysis is to compare those with asthma with those who do not have asthma, the appropriate file to use is the BRFSS file. The sample size is larger and the responses to BRFSS questions are available for all respondents with asthma and without asthma.

  • When the intent of an analysis is to compare subpopulations of those with asthma, the appropriate file to use is the call-back file.


Data record


  • The ACBS record for a respondent consists of the entire BRFSS interview record followed by the ACBS data. There is no need to merge the ACBS data with data from the BRFSS interview. The ACBS codebook, however, does not include the BRFSS portion of the data. BRFSS codebooks can be accessed at: http://www.cdc.gov/brfss/annual_data/annual_data.htm after selecting an individual survey year.


Skip patterns


  • The Asthma Call-back questionnaire has multiple and complex skip patterns. Each of the skip patterns has been coded into subsequent questions using individual value codes to identify the source response that caused the question to be skipped. These additional codes do not appear in the questionnaire, but are in the codebook. This skip coding allows the analyst to clearly determine an existing skip pattern and easily decide the denominator appropriate for any given analysis or statement without tracing skip patterns in the questionnaire. For more information on coding skip patterns see the document “Coding Skip Patterns,” which can be requested from NCEH/DEHSP/APRHB ( [email protected] ).



Calculated variables


  • Not all of the variables that appear on the public use data set are taken directly from the ACBS questionnaire. CDC prepares a large set of calculated variables that are added to the actual questionnaire responses. The vast majority of the variables on the ACBS file are calculated variables. The calculated variables are created for the user's convenience. The procedures for the calculated variables vary in complexity; some only combine codes from one or two questions, while others require sorting and combining selected codes from multiple variables.


  • At the time of the call-back interview, the respondent is asked to confirm the responses to the two asthma questions from the BRFSS interview. Not all respondents agree with the responses that were recorded from the initial interview.


  • The calculated combined call-back asthma variables _CUR_ASTH_C and _EVER_ASTH_C are not identical to the BRFSS asthma variables ASTHNOW and ASTHMA3 (CASTHNO2 and CASTHDX2 for children) or the BRFSS adult calculated variables _CASTHM1 and _LTASTH1.

  • The combined call-back variables _CUR_ASTH_C and _EVER_ASTH_C use the BRFSS responses when the respondent agreed with them and the ACBS responses at the time of the call-back interview when the respondent did not agree with the BRFSS responses.




When using call-back data, the combined variables (_CUR_ASTH_C and _EVER_ASTH_C) should be used rather than the BRFSS interview variables.


  • For further details regarding these and other calculated variables, refer to the document entitled “Calculated Variables for the Asthma Call-back Survey,” which can be requested from NCEH/EHHE/ACHB ([email protected]).




Questionnaire changes


  • There were no changes to the ACBS questionnaire for 2014 through 2016.

  • 2013 ACBS questionnaire changes included:

    • Inhaler medications Brethaire, Intal, and Tilade were deleted since all have been discontinued

    • Inhaler medications Alvesco and Dulera were added

    • Nebulizer medications Combivent Inhalation Solution and Perforomist/Formoterol were added

  • 2012 ACBS questionnaire changes included:

    • The name for INH_MED 25 was changed to Flex Haler

    • Three inhaler medication questions were deleted (ILP01, ILP02, and ILP07)

    • Response categories for inhaler medication question ILP03 were changed

    • The question PILLX (how long taking a specific pill) was deleted and PILL01 (on daily use) was added

    • Three questions on nebulizers were added

    • Some skip patterns and help screens were revised in the medication section

    • The content of the Work-related asthma section was completely revised

    • The time reference period for the activity limitation variable was changed from 12 months to 30 days


D. Estimation procedures


Statistical issues


  • Record weights


Unweighted data on the ACBS represent the actual responses of each respondent before any adjustment is made for variation in respondents' probability of selection, disproportionate selection of population subgroups relative to the state's population distribution, or nonresponse. To produce the ACBS final weight, the BRFSS final weight is adjusted for loss of sample between the BRFSS interview and the ACBS interview. Weighted ACBS data represent results that have been adjusted to compensate for nonresponse at the BRFSS interview and at the ACBS interview. For further details regarding the ACBS final weight, refer to the document entitled “Asthma Call Back Weighting Method,” which can be requested from NCEH/DEHSP/ACHB ([email protected]).


Use of the ACBS final weight is essential when analyzing these data. If weights are not used, the estimates produced will be biased.


In 2016, all states implementing the ACBS included the BRFSS Landline and Cell Phone (LLCP) sample. Therefore, the adult public use file was released for the combined Landline and Cell Phone (LLCP) sample only. The LLCP file includes landline and cell phone data from the subset of states that included both the landline and the cell phone samples and met data quality standards.


The ACBS child files must have a minimum of 75 completes to produce a reliable child weight. The public use file for children was released for the combined landline and cell phone samples (including the 10 states/territories that met data quality standards).


ACBS Landline and Cell Phone (LLCP) file

In the 2016 ACBS LLCP file, the ACBS final weight was produced using the BRFSS landline cell phone weight (_LLCPWT for adults and _CLLCPWT for children). The 2016 ACBS LLCP final weight variables are:

  • LLCPWT_F for adults

  • CLLCPWT_F for children


  • Variances


The procedures for estimating variances described in most statistical texts and used in most statistical software packages are based on the assumption of simple random sampling (SRS). The data collected in the ACBS, however, are obtained through a complex sample design; therefore, the direct application of standard statistical analysis methods for variance estimation (including standard errors and confidence intervals) and hypothesis testing (p-values) may yield misleading results.

Computer programs that take such complex sample designs into account are available. SAS, SUDAAN, Epi Info, SPSS and STATA are among those suitable for analyzing these data.


  • SAS SURVEYMEANS, SURVEYFREQ, SURVEYLOGISTIC, and SURVEYREG can be used for tabular and regression analyses.

  • SUDAAN can be used for tabular and regression analyses and also has additional options6.

  • Epi Info's C-sample can be used to calculate simple frequencies and two-way cross-tabulations.

  • SPSS Complex Samples can be used to produce frequencies, descriptive analysis, cross-tabulations, and ratios as well as estimate general linear, logistic, ordinal, and Cox regression models.

  • STATA can produce cross-tabulations, means, logit and general linear regression models9.


When using these software products, users must specify that the sample design is “With Replacement” and also specify the stratum variable (_STSTR), the primary sampling unit (_PSU), and the record weight (LLCPWT_F for adults, CLLCPWT_F for children).
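
As an illustration, a minimal SAS sketch of this design specification is shown below. The data set name acbs2016 is a placeholder for the analyst's own ACBS analysis file; the design variables and the adult LLCP weight are those named above, and PROC SURVEYFREQ uses with-replacement (Taylor series) variance estimation by default.

   /* Weighted prevalence of current asthma among adult ACBS respondents,            */
   /* accounting for the complex sample design.                                      */
   proc surveyfreq data=acbs2016;         /* acbs2016 = analyst's ACBS file (placeholder)  */
      strata  _ststr;                     /* sample design stratum                         */
      cluster _psu;                       /* primary sampling unit                         */
      weight  llcpwt_f;                   /* ACBS adult LLCP final weight                  */
      tables  _cur_asth_c / cl;           /* combined current-asthma variable with 95% CIs */
   run;

For a child-level analysis, CLLCPWT_F would replace LLCPWT_F as the weight variable.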



Analytic issues


  • Sample size


Although the overall number of respondents in the ACBS is more than sufficient for statistical inference purposes, subgroup analyses (including state-level analyses) can lead to estimates that are unreliable. Consequently, users need to pay particular attention to the subgroup sample size when analyzing subgroup data, especially within a single data year or geographic area. Small sample sizes may produce unstable estimates. The reliability of an estimate depends on the actual unweighted number of respondents in a category, not on the weighted number. Interpreting and reporting weighted numbers that are based on a small, unweighted number of respondents can mislead the reader into believing that a given finding is much more precise than it actually is.


ACBS follows a rule of not reporting or interpreting point estimates based on fewer than 50 unweighted respondents (e.g. percentages based upon a denominator of < 50) or for which the Relative Standard Error is greater than 30%. For this reason, and to protect confidentiality of these data, the FIPS County code is not included on the ACBS public use data record.
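
As a sketch only, the reporting rule can be checked in a few lines of SAS once an estimate, its standard error, and its unweighted denominator have been obtained from a design-based procedure; the values below are hypothetical placeholders.

   /* Hypothetical suppression check based on the ACBS reporting rule.               */
   data suppression_check;
      est   = 0.12;                       /* placeholder prevalence estimate              */
      se    = 0.045;                      /* placeholder standard error                   */
      n_unw = 42;                         /* placeholder unweighted denominator           */
      rse      = 100 * se / est;          /* relative standard error, in percent          */
      suppress = (n_unw < 50) or (rse > 30);   /* 1 = do not report or interpret          */
   run;

   proc print data=suppression_check; run;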



  • Aggregating data over time


  • When data from one time period are insufficient, data from multiple periods can be combined as long as the prevalence of the factor of interest did not change substantially during any of the periods. One method that can be used to assess the stability of the prevalence estimates is as follows10 (an illustrative sketch appears after the list):


  1. Compute the prevalence for the risk factor for each period.

  2. Rank the estimates from low to high.

  3. Identify a statistical test appropriate for comparing the lowest and the highest estimates at the 5% level of significance. For example, depending on the type of data, a t-test, or the sign test might be appropriate.

  4. Test the hypothesis that prevalence is not changing by using a two-sided test in which the null hypothesis is that the prevalence estimates are equal.

  5. Determine whether the resulting difference could be expected to occur by chance alone less than 5% of the time (i.e., test at the 95% confidence level).
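
As an illustration of steps 3–5, one possible choice is a normal-approximation z-test on the difference between the lowest and highest period estimates. The prevalence estimates and standard errors below are hypothetical placeholders, assumed to come from a design-based procedure (such as PROC SURVEYFREQ) run separately for each period.

   /* Hypothetical comparison of the lowest and highest period prevalence estimates. */
   data stability_check;
      p_low  = 0.086;  se_low  = 0.004;   /* placeholder lowest-period estimate and SE    */
      p_high = 0.094;  se_high = 0.005;   /* placeholder highest-period estimate and SE   */
      z       = (p_high - p_low) / sqrt(se_low**2 + se_high**2);  /* two-sample z statistic */
      p_value = 2 * (1 - probnorm(abs(z)));                       /* two-sided p-value      */
      /* If p_value >= 0.05, the difference could be expected by chance alone, and
         combining the periods is reasonable under this check.                       */
   run;

   proc print data=stability_check; run;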


  • When combining multiple years of ACBS data for the purpose of subgroup analysis, the final weight needs to be adjusted, and the file year needs to be added as an additional stratum in the complex design specification (a hedged sketch follows below). When combining multiple years of data for the purpose of examining trends, however, reweighting is not appropriate. For more information on reweighting combined years, see the document “Reweighting Combined Files,” which can be requested from NCEH/DEHSP/ACHB ([email protected]).
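
The sketch below assumes, for illustration only, the common convention of dividing each year's final weight by the number of years combined; the data set names acbs2015 and acbs2016 and the rescaling step are assumptions, and the “Reweighting Combined Files” document should be consulted for the ACBS-specific method.

   /* Hypothetical sketch: combine two adult ACBS LLCP years for subgroup analysis.  */
   data acbs_combined;
      set acbs2015 (in=a) acbs2016;       /* placeholder single-year analysis files          */
      if a then fileyear = 2015;          /* data file year, used as an additional stratum   */
      else      fileyear = 2016;
      llcpwt_adj = llcpwt_f / 2;          /* assumed rescaling: divide by number of years    */
   run;

   proc surveyfreq data=acbs_combined;
      strata  fileyear _ststr;            /* file year added as an extra stratification level */
      cluster _psu;
      weight  llcpwt_adj;
      tables  _cur_asth_c / cl;
   run;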



  • Analyzing subgroups

  • Provided that the prevalence of risk factors did not change rapidly over time, data combined for two or more years may provide a sufficient number of respondents for additional estimates for population subgroups (such as age/sex/race subgroups or state populations). Before combining data years for subgroup analysis, it is necessary to determine whether the total number of respondents will yield the precision needed, which depends upon the intended use of the estimate. For example, greater precision would be required to justify implementing expensive programs than that needed for general information only.


The table below shows the sample size required for each of several levels of precision, based on a calculation in which the estimated risk factor prevalence is 50% and the design effect is 1.5.

Precision desired    Sample size needed
2%                   3600
4%                   900
6%                   400
8%                   225
10%                  144
15%                  64
20%                  36


Precision is indicated by the width of the 95% confidence interval around the prevalence estimate. For example, precision of 2% indicates that the 95% confidence interval is plus (+) or minus (-) 2% of 50%, or 48% to 52%. As shown in the table, to yield this high a level of precision, the sample size required is about 3,600 persons. When a lower level of precision is acceptable, the sample size can be considerably smaller.
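
The table values follow from the standard sample size formula n = deff × z² × p(1 − p) / E², with p = 0.50, deff = 1.5, and z = 1.96 for a 95% confidence interval; the table rounds the results. A short SAS data step reproducing the calculation:

   /* Required sample size for a 95% CI half-width E, assuming p = 0.50 and deff = 1.5. */
   data sample_size;
      p = 0.50; deff = 1.5; z = 1.96;     /* assumed prevalence, design effect, 95% z value */
      do e = 0.02, 0.04, 0.06, 0.08, 0.10, 0.15, 0.20;
         n = deff * z**2 * p * (1 - p) / e**2;   /* required unweighted sample size         */
         output;
      end;
   run;

   proc print data=sample_size; var e n; run;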


  • The design effect is a measure of the complexity of the sampling design that indicates how the design differs from simple random sampling. It is defined as the variance for the actual sampling design divided by the variance for a simple random sample of the same size8,9. For most risk factors in most states, the design effect is less than 1.5. If it is more than 1.5, however, sample sizes may need to be larger than those shown in the table above.


  • The standard error of a percentage is largest at 50% and decreases as a percentage approaches 0% or 100%. From this perspective, the required sample sizes listed in the table above are conservative estimates. They should be reasonably valid for percentages between 20% and 80%, but may significantly overstate the required sample sizes for smaller or larger percentages.



E. Advantages and disadvantages of telephone surveys


  • Compared with face-to-face interviewing techniques, telephone interviews are easy to conduct and monitor and are cost efficient, but telephone interviews do have limitations. Telephone surveys may have higher levels of noncoverage than face-to-face interviews because some U.S. households cannot be reached by telephone. While approximately 94.1% of households in the United States have telephones, a number of studies have shown that the telephone and non-telephone populations are different with respect to demographic, economic, and health characteristics10,11. Although the estimates of characteristics for the total population are unlikely to be substantially affected by the omission of the households without telephones, some of the subpopulation estimates could be biased. Telephone coverage is lower for population subgroups such as blacks in the South, people with low incomes, people in rural areas, people with less than 12 years of education, people in poor health, and heads of households under 25 years of age12. Nevertheless, poststratification adjustments for age, race, and sex, and other weighting adjustments used for the BRFSS and ACBS data minimize the impact of differences in noncoverage, undercoverage, and nonresponse at the state level.


Despite the above limitations, prevalence estimates from the BRFSS correspond well with findings from surveys based on face-to-face interviews, including studies conducted by the National Institute on Alcohol Abuse and Alcoholism, CDC's National Center for Health Statistics, and the American Heart Association8. A summary of methodological studies of BRFSS can be found at: https://www.cdc.gov/brfss/publications/data_qvr.htm


Surveys based on self-reported information may be less accurate than those based on physical measurements. For example, respondents are known to underreport weight. Although this type of potential bias is an element of both telephone and face-to-face interviews, the underreporting should be taken into consideration when interpreting self-reported data. When measuring change over time, this type of bias is likely to be constant and is therefore not a factor in trend analysis.


  • With ongoing changes in telephone technology, there are more and more households that have cellular telephones and no traditional telephone lines in their homes. These households are currently not in the sampling frame for the BRFSS, which may bias the survey results, especially as the percentage of cellular-telephone-only households continues to increase11. The BRFSS is continuing to study the impact of cellular phones on survey response and the feasibility of various methods for data collection to complement present survey methods3.






REFERENCES


1. https://www.cdc.gov/asthma/nhis/2016/data.htm

2. http://www.cdc.gov/nchs/fastats/asthma.htm

3. https://www.cdc.gov/nchs/slaits/nas.htm

4. Centers for Disease Control and Prevention (CDC). (2012). “Methodologic changes in the Behavioral Risk Factor Surveillance System in 2011 and potential effects on prevalence estimates.” Morbidity and Mortality Weekly Report (MMWR) 61(22): 410–413. Available at: https://www.cdc.gov/mmwr/preview/mmwrhtml/mm6122a3.htm.

5. https://www.cdc.gov/brfss/acbs/2015_documentation.html

6. https://www.cdc.gov/brfss/acbs/2011_documentation.html

7. https://www.cdc.gov/brfss/acbs/2016/pdf/sdq_report_acbs_16-508.pdf

8. Frazier EL, Franks AL, Sanderson LM. (1992) Behavioral Risk Factor Data. In: Using Chronic Disease Data: A Handbook for Public Health Practitioners. Atlanta: Centers for Disease Control and Prevention; 4.1-1.17

9. Kish, Leslie (1965). Survey Sampling. New York: Wiley.

10. Groves RM, Kahn RL. (1979). Surveys by Telephone: A National Comparison with Personal Interviews. New York: Academic Press.

11. Blumberg, S.J., Luke, J.V. (2018). “Wireless Substitution: Early Release of Estimates From the National Health Interview Survey, January–June 2018.” Atlanta, GA: US Department of Health and Human Services, Centers for Disease Control and Prevention. Available at: https://www.cdc.gov/nchs/data/nhis/earlyrelease/wireless201812.pdf.


12. American Association for Public Opinion Research (AAPOR). (2017). “The Future of U.S. General Population Telephone Research.” Available at: https://www.aapor.org/Education-Resources/Reports/The-Future-Of-U-S-General-Population-Telephone-Sur.aspx
