Request for genIC

GenIC Submitted Under

Cognitive Testing and Pilot Testing for the National Center for Chronic Disease Prevention and Health Promotion

OMB Control Number: 0920-1291    Expiration: 3/31/2023


  1. Date: June 1, 2020

  2. Name, CIO/Program: Carol Pierannunzi, NCCDPHP/DPH/PHSB

  3. Title of Study: Feasibility Testing for Collection of BRFSS Supplemental Data Using Web-Based Methods

  4. Study Type: Pilot Testing



  5. Purpose of Study:

The purpose of this pilot is to test the feasibility of collecting supplemental/comparison data for the BRFSS. Two web-based methods of data collection will be included: (1) recruiting random digit dialing (RDD) cell phone sample participants using current BRFSS sampling processes and asking respondents to complete the survey via an SMS text link (push-to-web), and (2) conducting data collection using an available internet panel (a group of individuals who have agreed in advance to participate in a series of surveys). The 2021 BRFSS Core questions will be used for all data collection (see Attachment 1). Ten states will be included in the test (NY, TX, WA, ID, IL, LA, FL, IA, NM, and RI).



The information from this study will be used to:

  1. Assess the quality and validity of data collected across survey modes (traditional RDD, RDD push-to-web, and internet panel) during concurrent data collection periods.

  2. Evaluate differences in responses (including item refusal, demographic comparisons, and health outcomes) by type of question.

  3. Produce methodological paradata reports, including (but not limited to) selection bias; mode differences in sampling and response by item; differences in responses by type of question (sensitive, long, complicated skip patterns); cost per completed survey; time to completion; response rates; and other response data, including time of text invitation, day of response, survey stops/restarts, and device used to respond.

  4. Produce state and substate population estimates for all indicators in the final dataset, adjusted by mode (if necessary) and weighted to the most recent American Community Survey population totals (see the weighting sketch after this list).

  5. Compare state-level differences in cost effectiveness, response rates, and data quality (such as item nonresponse) among respondents who complete the survey via traditional RDD, RDD push-to-web, and internet panel.

  6. Determine whether data collected by web-based methods can be incorporated into the BRFSS dataset and whether statistical adjustment calculations for mode of response are necessary.
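
As a point of reference for item 4 above, the following is a minimal sketch of a simple post-stratification weighting step, assuming hypothetical strata (state by age group) and illustrative population totals; the stratum definitions, values, and code are not part of the study protocol, and the study's actual weighting and mode-adjustment procedures are not described in this request.

```python
# Minimal sketch (assumed approach, not the study's actual weighting code) of
# post-stratification: each respondent's weight is the population total for
# their stratum divided by the number of respondents in that stratum.
from collections import Counter

# Hypothetical ACS population totals by stratum (state x age group); illustrative only.
acs_totals = {("RI", "18-44"): 400_000, ("RI", "45+"): 450_000}

# Hypothetical respondent records.
respondents = [
    {"state": "RI", "age_group": "18-44"},
    {"state": "RI", "age_group": "18-44"},
    {"state": "RI", "age_group": "45+"},
]

# Count respondents per stratum, then assign weight = population total / count.
counts = Counter((r["state"], r["age_group"]) for r in respondents)
for r in respondents:
    stratum = (r["state"], r["age_group"])
    r["weight"] = acs_totals[stratum] / counts[stratum]

for r in respondents:
    print(r)  # the two 18-44 records each get weight 200,000; the 45+ record gets 450,000
```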

No data from the study will be used for population prevalence estimation or rigorous analysis of health data.

Overall results will be used to assess methods for improving data quality and efficiency in the BRFSS data collection process. Testing data collection modes is standard practice for large-scale data collection systems such as the BRFSS and other systems housed within NCCDPHP. This experiment will provide information on the feasibility of collecting BRFSS data by RDD push-to-web and internet panel. It is currently unknown whether the proportion of respondents recruited via RDD push-to-web or an internet panel will be sufficient to merit new procedures and development of a web-based instrument as part of the standard BRFSS protocol. Results from the experiment will direct further experimentation in BRFSS sampling and data collection protocols.



  6. Respondent Characteristics:

A total of 14,000 respondents will be recruited (8,000 for RDD push-to-web and 6,000 for the internet panel). We plan to screen 20,000 people to obtain the 14,000 respondents. Respondents must be U.S. residents, 18 years of age or older, living in a private residence within one of the 10 states included in the pilot study. The states included in the project (NY, TX, WA, ID, IL, LA, FL, IA, NM, and RI) have diverse characteristics in terms of urban/rural regions, population/race/ethnicity heterogeneity, and population size (large and small states).



  7. Study Methods:

Data will be collected via a web-based survey (RDD push-to-web) and an internet panel. The data will then be compared with ongoing surveillance data collected using identical questions.

  8. Recruitment and Incentives:

Modified protocols of the BRFSS RDD sample for cell phones will be used to identify and recruit potential participants for the RDD push-to-web (see Attachment 2). A commercially available sample from an internet panel will also be used. Potential participants will be recruited from 10 states with diverse characteristics in terms of urban/rural regions, population/race/ethnicity heterogeneity, and population size (large and small states). Incentives will be provided by the internet panel contractor to panel respondents. No incentives will be provided to respondents via RDD push-to-web. Attachment 3 provides an example of the recruitment text for persons who are included in a commercially available internet panel.



  9. Personally Identifiable Information (PII):

General demographic characteristics of respondents will be collected and associated with paradata on the questions posed to participants and their responses. RDD sample files will include phone numbers. Respondents will not disclose their names or addresses as part of the question process. CDC will not retain any PII and will not maintain sample files of phone numbers. Respondents’ phone numbers and other sample information will be kept in files separate from response files and will not be connected to responses. After completion of the feasibility test, sample files will be destroyed. Contractor(s) and CDC will be the only entities with access to the dataset(s).


A summary report of the Feasibility Testing for Collection of BRFSS Supplemental Data Using Web-Based Methods will be provided to state health departments in the states included in the test. Information in summary form may be used for presentations on methodology, but combined datasets from participating states will not be provided. Results may also be used to prepare and present methodological research papers at professional conferences or for peer reviewed journals. No data from the feasibility test will be used to produce prevalence estimates or analyze public health status.



  10. Informed Consent/Voluntary Participation: Detail methods for ensuring informed and voluntary participation here (e.g., steps taken to obtain informed consent, how the information will be secured).

During the initial screening for RDD push-to-web, an interviewer will obtain informed consent. Potential participants will be informed that their telephone number was randomly selected and that participation in the study is completely voluntary. The interviewer will explain the nature of the study and approximately how long the survey will take. Potential participants will be told that they do not have to answer any question they do not want to and that they can stop the survey at any time. They will be informed that their survey responses will not be connected to any personal information and that, if they have any questions, they can call the survey point of contact. Since commercially available internet panels consist of voluntary participants, there will not be a need to obtain additional informed consent from those potential participants.



No PII will be collected. All telephone numbers in the RDD push-to-web method will be retained by a contractor for use in the feasibility experiment, and the files will then be destroyed. The PII section above details how information will be secured.



  11. Analysis of Data. Explain how data will be analyzed.

Analyses will include calculating cooperation rates, completion rates, overall response rates, and the distribution of demographic characteristics for persons completing the web-based survey for the RDD push-to-web and internet panel modes. Data for each mode will be evaluated for differences in responses (including item refusal, demographic comparisons, and health outcomes) by type of question. We will also compare state and mode differences in cost effectiveness, response rates, and data quality (such as item nonresponse) among respondents who complete the survey via traditional RDD, RDD push-to-web, and the internet panel.
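
As an illustration of the rate calculations described above, the following is a minimal sketch using assumed disposition categories loosely based on AAPOR-style definitions; the function names, categories, and example values are hypothetical and are not taken from the study protocol, whose exact formulas may differ.

```python
# Minimal sketch (not the study's actual analysis code) of per-mode cooperation,
# completion, and response rates and a simple item-nonresponse share.

def rates(completes, partials, refusals, noncontacts, unknown_eligibility=0):
    """Return simple cooperation, completion, and response rates (illustrative formulas)."""
    contacted = completes + partials + refusals
    eligible = contacted + noncontacts + unknown_eligibility
    return {
        "cooperation_rate": completes / contacted if contacted else 0.0,
        "completion_rate": completes / (completes + partials) if (completes + partials) else 0.0,
        "response_rate": completes / eligible if eligible else 0.0,
    }

def item_nonresponse(responses, missing_codes=("REFUSED", "DON'T KNOW", None)):
    """Share of items with a refused/don't-know/blank answer in one respondent record."""
    missing = sum(1 for v in responses.values() if v in missing_codes)
    return missing / len(responses) if responses else 0.0

# Hypothetical example values for one mode (not actual pilot results).
print(rates(completes=5_000, partials=600, refusals=1_400, noncontacts=1_000))
print(item_nonresponse({"GENHLTH": 2, "SMOKE100": "REFUSED", "AGE": 44}))
```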



  12. Collection Timeline.



The project timeline spans 12 months and includes the following milestones, in approximate order:

  - Project planning
  - Internet panel programming
  - CATI/web programming
  - Interviewing via RDD push-to-web
  - Internet panel data collection
  - Data merging
  - Analysis
  - Dataset production
  - Report writing

  13. Burden Table.



Estimates of Annualized Hour and Cost Burden

Type of Respondents | Stage of Survey Administration | Number of respondents | Number of responses per respondent | Average burden per response (in hours) | Total burden (in hours)
General U.S. Adult Population | Screening for respondents via RDD push-to-web | 20,000 | 1 | 0.1 | 2,000
General U.S. Adult Population | Respondents via Internet panel | 6,000 | 1 | 0.28 | 1,680
General U.S. Adult Population | Respondents via RDD push-to-web | 8,000 | 1 | 0.28 | 2,240
Total | | | | | 5,920



Table 12B. Estimated Annual Cost Burden

Stage of Survey Administration | Single Administration Burden Hours | Average Hourly Rate* | Total Single Administration Cost Burden
Screening for respondents via RDD push-to-web | 2,000 | $25.72 | $51,440
Surveying respondents via Internet panel | 1,680 | $25.72 | $43,210
Surveying respondents via RDD push-to-web | 2,240 | $25.72 | $57,613
Total | 5,920 | | $152,263

*Based upon the average hourly earnings from the Bureau of Labor Statistics May 2019 National Occupational Employment and Wage Estimates (available at https://www.bls.gov/oes/current/oes_nat.htm).
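
For reference, the following is a minimal sketch (not part of the formal submission) showing how the hour and cost figures in the two tables above follow from the stated inputs; the respondent counts, burden per response, and the $25.72 hourly rate are taken directly from the tables, and the code itself is illustrative only.

```python
# Minimal sketch reproducing the burden arithmetic in the tables above:
# total hours = respondents x responses per respondent x hours per response,
# cost = hours x average hourly wage ($25.72, BLS May 2019 national average).

HOURLY_RATE = 25.72

rows = [
    # (stage, respondents, responses per respondent, hours per response)
    ("Screening via RDD push-to-web", 20_000, 1, 0.10),
    ("Respondents via Internet panel", 6_000, 1, 0.28),
    ("Respondents via RDD push-to-web", 8_000, 1, 0.28),
]

total_hours = 0.0
total_cost = 0.0
for stage, n, resp, hrs in rows:
    burden_hours = n * resp * hrs      # e.g. 20,000 x 1 x 0.10 = 2,000 hours
    cost = burden_hours * HOURLY_RATE  # e.g. 2,000 x $25.72 = $51,440
    total_hours += burden_hours
    total_cost += cost
    print(f"{stage}: {burden_hours:,.0f} hours, ${cost:,.0f}")

# Prints 5,920 hours and about $152,262; Table 12B shows $152,263 because it
# sums the per-row cost figures after rounding to whole dollars.
print(f"Total: {total_hours:,.0f} hours, ${total_cost:,.0f}")
```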








