OMB No. 0920-0972
Use of Smartphones to Collect Information about Health Behaviors:

Feasibility Study


New


Supporting Statement: Part B











Program official/project officer: Shanta Dube

Office on Smoking and Health

Centers for Disease Control and Prevention

Tel: (770) 488-6287

Email: [email protected]




April 8, 2013

Table of Contents


Section B Collections of Information Employing Statistical Methods


B-1 Respondent Universe and Sampling Methods

B-2 Procedures for the Collection of Information

B-3 Methods to Maximize Response Rates and Deal with Nonresponse

B-4 Tests of Procedures or Methods to Be Undertaken

B-5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data




Appendices

Appendix A Authorizing Legislation

Appendix B1 Federal Register Notice

Appendix B2 Comments in Response to the Federal Register Notice

Appendix C1 Screener/CATI Recruitment

Appendix C2 CATI Informed Consent

Appendix D Initial CATI Survey

Appendix E First Web Survey Follow-up for Smartphone Users

Appendix F Second Web Survey Follow-up for Smartphone Users

Appendix G First Text Message Survey Follow-up for non-Smartphone Users

Appendix H Second Text Message Survey Follow-up for non-Smartphone Users


COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Respondent Universe and Sampling Methods

The universe for the study is English-speaking U.S. residents aged 18-65. Major goals of the study are to evaluate the extent to which this universe can be covered by cell phone users in general and smartphone users in particular. The cell phone sample will be a national random-digit-dial (RDD) sample of telephone numbers drawn from a frame of known cell phone exchanges. We will purchase the cell phone RDD sample from MSG.

The primary goal of the sampling plan is to recruit, via an initial CATI survey, 900 respondents to the smartphone/text message follow-up surveys. Based on recent experience with other studies, we expect 63% of cell phone RDD respondents to be smartphone users and 37% to be traditional/feature phone users, and we expect 80% to be between 18 and 65 years old and able to complete the survey in English. Of those eligible, we expect 70% to agree to participate in the follow-up surveys. Thus, we will need to screen approximately 1,990 initial CATI respondents to identify 1,590 eligible respondents and recruit 1,115 respondents who agree to participate in the follow-up surveys. Of these, approximately 700 will be smartphone users who will complete the follow-up surveys via the Web (survey invitations will be sent by text message to respondents’ smartphones), and we will select a random subset of 200 traditional/feature phone users to complete the follow-up surveys by text message.
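For reference, the screening and recruitment assumptions above reduce to a simple calculation. The following sketch (in Python, with illustrative variable names) applies the stated percentages and is provided only to make the cascade of sample sizes explicit.

    # Sample-size cascade implied by the assumptions stated above (illustrative only)
    screened = 1990                 # initial CATI respondents screened
    eligibility_rate = 0.80         # aged 18-65 and able to complete the survey in English
    recruitment_rate = 0.70         # share of eligibles who agree to the follow-up surveys
    smartphone_share = 0.63         # share of cell phone RDD respondents with smartphones

    eligible = screened * eligibility_rate              # ~1,590
    recruited = eligible * recruitment_rate             # ~1,115 agree to follow-up
    smartphone_recruits = recruited * smartphone_share  # ~700 complete via Web
    text_message_subset = 200                           # random subset of feature phone users

    print(f"Eligible: {eligible:.0f}")
    print(f"Recruited to follow-up: {recruited:.0f}")
    print(f"Smartphone (Web) follow-up: {smartphone_recruits:.0f}")
    print(f"Text message follow-up: {text_message_subset}")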

The follow-up survey cooperation rates are unknown (they are an important study outcome), but because respondents will receive incentives and the surveys are very short, we anticipate that 85% of the recruited sample will participate in at least one follow-up survey and that 70% will participate in both. Table 1 shows the estimated sample sizes for important study comparisons based on these assumptions.

Table 1: Levels of Analysis and Estimated Sample Sizes

Evaluating | Comparing | Approx. N | To | Approx. N
Cell phone coverage | 18-65 year old RDD cell respondents | 1,590 | RDD non-respondents (from Census and NHIS) | N/A
Smartphone coverage | Smartphone users | 907 | Non-smartphone cell users | 533
Smartphone non-response | Smartphone users who agree to follow-up | 700 | Smartphone users who do not agree | 207
All cell type (smartphone/text message) non-response | Smartphone and text message recipients recruited for follow-up | 900 | Participants who refuse/are not asked to participate in follow-up | 440
Smartphone and telephone interview responses | Smartphone users who answer at least one weekly survey | 595 | Smartphone users who answered the initial survey | 1,002
Smartphone and telephone interview responses | Smartphone + text respondents who answer at least one weekly survey | 765 | Smartphone + text respondents who answered the initial survey | 1,590
Smartphone vs. text message survey data quality (responses and item non-response) | Smartphone users who answer at least one weekly survey | 595 | Text message respondents who answer at least one weekly survey | 170



B.2 Procedures for the Collection of Information

The pilot survey will consist of an initial CATI interview followed by two follow-up Web surveys with smartphone respondents and two follow-up text message surveys with traditional/feature phone respondents.

The Smartphone Study sample will consist of a national RDD sample of phone numbers from cell phone and cell/landline exchanges. The exchanges originate from the Telcordia® TPM™ Data Source. The cell phone exchanges and mixed-use exchanges are identified by exchange type.

Initial interview. Respondents will be contacted by trained interviewers. Respondents who are aged 18-65 will be asked simple demographic questions and questions about tobacco use and quit attempts (Appendix D). The questions will be drawn or adapted from the National Adult Tobacco Survey (OMB No. 0920-0828, exp. 10/31/2010). Participants who have smartphones will be invited to complete Web follow-up surveys. A sample of participants with traditional/feature phones will be invited to complete follow-up surveys via text message. This phase will allow us to measure the coverage error associated with restricting the sample to smartphone holders and will set the stage for comparing data quality of RDD and smartphone question responses.

Smartphone surveys. One goal of the study is to compare data collected from the follow-up surveys with smartphone users (Appendices E and F) to data collected from the initial CATI interview. The topics of the smartphone interview, therefore, will be similar to those of the initial interview: tobacco use and quit attempts. Another goal is to test the utility and quality of data collected via several short smartphone interviews. That is, part of the utility of the smartphone interview is in researchers’ ability to use it like a diary, sending invitations to complete frequent, short interviews.

Text message surveys. Another goal of the study is to compare data collected directly through text messages to data collected via smartphones. Since text messages are limited in length, and an opt-out message must be included, the text message questions will be shorter (Appendices G and H) than the smartphone interview questions, but their content will be similar.

Telephone interviewing procedures

Managing Call Attempts: Each call attempt will be given a minimum of five rings. Careful management of sample allocation and scheduling of interview sessions will ensure adequate coverage of residential households, with a maximum of six attempts for unresolved telephone numbers. Persistent “ring, no answer” numbers will be attempted a minimum of four times, at different times of day and on different days of the week.

Conducting the Interview: A screener will be conducted at the beginning of each call (Appendix C1). The screener consists of: (1) verification of the phone number; (2) verification of private residence; (3) verification that the respondent is between 18 and 65 years of age; and (4) verification that the respondent is not driving or engaging in another activity that could jeopardize safety.

Attempting Call-backs: The calling system optimizes queuing for definite call-backs by continuously comparing station sample activity and the index of definite call-back records. When a definite appointment time arrives, the system finds the next available station and delivers the record as the next call. The call history screen that accompanies each record informs the interviewer that the next call is a definite appointment and describes the circumstances of the original contact. Callbacks to cell phone users will be limited to one additional refusal attempt after an initial refusal because refusal conversion attempts are less successful on cell phones than landlines. Nonetheless, scheduling and honoring callbacks is critical to achieving high response rates.
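The definite call-back queuing described above can be thought of as a time-ordered queue of appointment records. The following sketch (Python, with hypothetical record and function names; not the actual CATI vendor system) illustrates the concept.

    import heapq
    from datetime import datetime

    # Conceptual sketch of definite call-back queuing (illustrative only).
    # Records with a definite appointment are queued by appointment time and
    # released to the next available station when that time arrives.
    callback_queue = []  # min-heap ordered by appointment time

    def schedule_callback(record_id, appointment_time, note):
        """Add a definite call-back appointment to the queue."""
        heapq.heappush(callback_queue, (appointment_time, record_id, note))

    def next_due_callback(now=None):
        """Return the next record whose appointment time has arrived, if any."""
        now = now or datetime.now()
        if callback_queue and callback_queue[0][0] <= now:
            return heapq.heappop(callback_queue)
        return None

    # Example usage with a hypothetical record
    schedule_callback("R-1042", datetime(2013, 4, 10, 18, 30),
                      "Respondent asked to be called back after work")
    due = next_due_callback(datetime(2013, 4, 10, 18, 31))
    if due:
        appointment_time, record_id, note = due
        print(f"Deliver record {record_id} to the next available station: {note}")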

Managing Interrupted Interviews: Interrupted interviews with receptive respondents will be restarted using a definite call-back strategy. A definite call-back for an exact time can be set and the interview can begin where it left off. If the interviewer who began the survey is available at the prescribed time, the system will send the call back to that station.

Recording Call Dispositions: Dispositions of each call attempt on all records in the sample will be automatically stored in the CATI system. This provides a complete call history for each record in the sample. The call history is displayed on the interviewer’s screen during each new attempt.

At the start of the initial CATI interview, the interviewer will read the informed consent (Appendix C2) to each participant. The consent form describes the interview, provides information on whom to contact with questions about any aspect of the study, and indicates that participation is completely voluntary and that participants can refuse to answer any question or discontinue the interview at any time without penalty or loss of benefits. The interviewer will enter a code via the keyboard to signify that the participant was read the informed consent script and agreed to participate.

Smartphone interviewing procedures

Invitation to participate: We will send invitations to participate by text message to smartphone respondents who agreed during the initial CATI interview to participate in the follow-up surveys. Each text message will contain introductory text, a link to the survey, and a message regarding how to opt out of the survey:

CDC/ICF Macro Smartphone Study: http://mysurvey.icfsurveys.com?id=abc123. To unsubscribe, text “STOP” to //destination//.
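As an illustration, the invitation above, with its embedded unique identifier (the Master ID described in the next paragraph), could be assembled along the following lines. The identifier-generation scheme and function names in this sketch are assumptions, and the opt-out destination is left as the placeholder from the template.

    import uuid

    # Illustrative construction of a survey invitation text message.
    # The message template is taken from the study design above; the Master ID
    # generation scheme shown here is an assumption.
    BASE_URL = "http://mysurvey.icfsurveys.com"

    def build_invitation(master_id: str, opt_out_destination: str) -> str:
        """Assemble the invitation containing the respondent's unique survey link."""
        link = f"{BASE_URL}?id={master_id}"
        return (f"CDC/ICF Macro Smartphone Study: {link}. "
                f'To unsubscribe, text "STOP" to {opt_out_destination}.')

    # A randomly generated, unique Master ID for one recruited respondent
    master_id = uuid.uuid4().hex[:8]
    print(build_invitation(master_id, "//destination//"))  # destination left as a placeholder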

Survey administration. The link embedded in the text message invitation will direct respondents to the survey (Appendix E and Appendix F) and also provide the website with their randomly-generated, unique Master ID. Smartphone screens are small, and they may be portrait- or landscape-oriented. The surveys will be formatted to fit on these screens. Specifically:

  • Survey questions and response categories will be short so that scrolling is not required.

  • Only one question will appear on each screen.

  • There will be no grids or questions that require zooming.

  • Focus group results may also be used to refine question presentation.


Text message interviewing procedures

Text message opt in. We will send invitations to participate by text message to traditional/feature phone respondents who agreed during the initial CATI interview to participate in the follow-up surveys. Before we send multiple messages (one message per survey item) to a cell phone number, it is appropriate to gain a second opt-in from the phone owner. Respondents will be asked to reply “yes” to a phone text message to enroll in the study.

Survey questions. Survey questions (Appendix G and Appendix H) will be sent one at a time. An automatic program will be used to check survey responses and send follow-up questions when a compliant response is received (e.g., “YES”, “yes”, etc.). Responses will also be visually reviewed at the end of each day to determine whether any respondent sent a compliant response that was not recognized by the automatic system (e.g., a “yes” with unexpected punctuation or spelling).
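The automatic compliance check described above, together with the opt-out handling described in the following paragraph, could work along these lines. This is an illustrative sketch only; the exact patterns accepted by the production system may differ.

    import re

    # Illustrative classification of incoming text message replies.
    # Compliant "yes"/"no" replies are accepted regardless of case or simple
    # punctuation; "STOP" opts the respondent out; anything else is held for
    # the end-of-day visual review described above.
    YES_PATTERN = re.compile(r'^\s*"?y(es)?"?\s*[.!]?\s*$', re.IGNORECASE)
    NO_PATTERN = re.compile(r'^\s*"?no?"?\s*[.!]?\s*$', re.IGNORECASE)
    STOP_PATTERN = re.compile(r'^\s*stop\s*[.!]?\s*$', re.IGNORECASE)

    def classify_reply(text: str) -> str:
        """Classify an incoming reply from a text message survey participant."""
        if STOP_PATTERN.match(text):
            return "opt_out"        # remove respondent from further messages
        if YES_PATTERN.match(text):
            return "yes"            # send the next survey question
        if NO_PATTERN.match(text):
            return "no"             # send the next survey question
        return "needs_review"       # held for daily visual review

    for reply in ["YES", "yes", " Yes.", "STOP", "maybe later"]:
        print(reply, "->", classify_reply(reply))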

Opt out. The automatic system and visual review will also identify responses of STOP, the code that respondents will be instructed to use to opt out of the survey.

Incentives. Incentives for the text message surveys will be based on the same point system used in the smartphone surveys, but respondents will only have the option of receiving Amazon.com gift codes.

Quality Control

Maintaining the integrity of the data at each phase is a priority for the smartphone pilot. Table 2 shows the specific steps we will take to ensure that the data and the final analysis are accurate.

Table 2: Quality Control Plan

Survey Step

Quality Control Procedures

Testing of CATI and internet programs

  • Visually review each question and its response options (100%)

  • Use the “skip check” program and randomly generated data to check every possible path through the survey (100%)

  • Use the skip check program to validate all collected data (every 24 hours)

CATI quality assurance

  • Monitor at least 10% of all interviews (10% sample)

  • Monitor each interviewer at least once per week (100%)

  • Assign supervisors to manage a team of no more than 10 interviewers (100%)

  • Participate in daily briefing call with Command Center (100%)

  • Review call center shift reports and internal project tracking reports daily (100%)

Internet data collection quality assurance

  • Project management staff phone numbers “seeded” in text message invitations (100%)

  • Visual review of % in-survey break-offs (daily)

Text message data collection quality assurance

  • Project management staff conduct a complete pilot survey to test all automatic systems before the survey is conducted with respondents (100%)

  • Project management staff phone numbers “seeded” in text message invitations and questions (100%)

  • Daily visual review of responses from participants to ensure that all messages are correctly recognized by automatic systems

Preparation of data files

  • Assign a final disposition to each record (100%)

  • Produce frequency tabulations of every question and variable to detect missing data or errors in skip patterns (100%)
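The “skip check” testing listed in Table 2 can be illustrated as follows. The miniature instrument, question names, and routing rules in this sketch are hypothetical and stand in for the actual CATI/Web instrument.

    import random

    # Illustrative skip-check sketch: generate random responses, route them
    # through hypothetical skip logic, and verify the resulting paths.
    # Hypothetical instrument: question -> (valid answers, routing rule)
    SURVEY = {
        "Q1_smoke_now": (["yes", "no"],
                         lambda a: "Q2_quit_attempt" if a == "yes" else "END"),
        "Q2_quit_attempt": (["yes", "no"], lambda a: "END"),
    }

    def simulate_path(rng):
        """Walk the instrument with random answers and return the visited path."""
        path, question = [], "Q1_smoke_now"
        while question != "END":
            answers, route = SURVEY[question]
            answer = rng.choice(answers)
            path.append((question, answer))
            question = route(answer)
        return path

    rng = random.Random(0)
    for _ in range(5):
        path = simulate_path(rng)
        # Basic routing check: Q2 should be reached only after a "yes" to Q1
        asked_q2 = any(q == "Q2_quit_attempt" for q, _ in path)
        assert asked_q2 == (path[0][1] == "yes"), f"Skip error in path: {path}"
        print(path)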


B.3 Methods to Maximize Response Rates and Deal with Nonresponse

Two different kinds of response rates are used in CATI studies. We will calculate response rates using industry-approved AAPOR formulas. The Cooperation Rate (CR) is the proportion of completed interviews among all eligible units in which a respondent was selected and actually contacted; non-contacts are excluded from the denominator. This rate is based on contacts with households containing an eligible respondent. For the initial CATI survey, we expect to attain a CR of 65% to 85%, with a mean of 70% to 75%. A Response Rate (RR) is an outcome rate with the number of completed interviews in the numerator and an estimate of the number of eligible units in the sample in the denominator. For the initial CATI survey, we expect to attain an RR of 25% to 40%, with a mean of 30% to 35%.
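For illustration, the two rates can be computed as follows. The function definitions below are simplified (the study will use the full AAPOR disposition-based formulas), and the counts shown are hypothetical.

    # Simplified illustration of the Cooperation Rate (CR) and Response Rate (RR).
    def cooperation_rate(completes, eligible_contacted):
        """CR: completed interviews divided by eligible units in which a respondent
        was selected and actually contacted (non-contacts excluded)."""
        return completes / eligible_contacted

    def response_rate(completes, known_eligible, unknown_eligibility, e):
        """RR: completed interviews divided by an estimate of the number of eligible
        units in the sample, where e is the assumed eligibility rate among cases of
        unknown eligibility."""
        return completes / (known_eligible + e * unknown_eligibility)

    # Hypothetical counts for illustration only
    print(f"CR = {cooperation_rate(1115, 1590):.0%}")           # ~70%
    print(f"RR = {response_rate(1115, 1590, 2300, 0.8):.0%}")   # ~33%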

To maximize cell phone survey response rates, we have kept the initial interview as short as is feasible. Dialing attempts will be spaced across daytime, weeknight, and weekend calling periods. Cell phone numbers will be dialed up to a maximum of six times, and we will expand calling hours both earlier and later to facilitate the completion of surveys with respondents who request a callback outside of normal calling hours.

To provide respondents with easily accessible information about the study, we will provide a project menu of Interactive Voice Response (IVR) options, so that respondents who wish to learn more about the study or verify its legitimacy may access a study-specific IVR system via a study-dedicated toll-free number.

The Web survey (for smartphone users) and text message survey (for traditional/feature phone users) response rates are an important outcome of this feasibility study. To ensure that respondents do not incur uncompensated costs for data or text messages, they will be given a payment of up to $10 for their participation.

Evaluating non-response is an important part of the present study. We will evaluate coverage and non-response error on tobacco use, sex, race, and ethnicity. See Table 1 for the specific planned comparisons.

B.4 Tests of Procedures or Methods to be Undertaken

The purpose of this study is to evaluate procedures for collecting data via Web survey and text messages with smartphone and traditional/feature phone users, respectively. On April 5, 2012, two focus groups of four participants each were conducted to evaluate the feasibility of using smartphones for health and behavior data collection. During these groups, participants were asked questions to discern what kinds of questions, and how many, could be asked in a survey completed by smartphone or text message; what type of incentive would be needed; and what other barriers to participation existed. Information gained from these focus groups included:

  • The incentive amount and how interesting a project is are strong motivators to participate;

  • Perceived burden of survey completion (including the inclusion of multiple choice vs. open-ended questions) is a barrier to participation;

  • The use of contingent incentives is an effective way to increase compliance over time; and

  • Most respondents are willing to answer sensitive questions on a smartphone.

In addition, the focus groups revealed important new information regarding barriers to smartphone survey participation, including:

  • Individuals with Android and BlackBerry devices face usability challenges that may affect their willingness to complete a survey on a smartphone;

  • Many individuals have smartphones provided through their work and would not be willing to answer survey questions on a work phone;

  • Although all participants were active smartphone users, many perceived their smartphone as less secure than their laptop or desktop computer;

  • Many participants do not shop online and would not be motivated by online incentives such as ZashPay or PayPal; and

  • Some participants would be very reluctant to participate in surveys that would require scanning products with their smartphones.

Feedback from the focus groups will be used to help develop the content and format of the surveys. Prior to collecting data for the initial CATI interview, a pretest of 20 interviews will be conducted to sharpen the wording of certain survey questions and confirm the empirical estimate of respondent burden. The smartphone and text message surveys will be beta tested by ICF staff using their own smartphones and feature phones. The test will provide information on elements such as survey login procedures and the display of survey questions on screens, as these may differ by carrier and by phone type.

B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Statistical aspects of the study have been reviewed by the individuals listed below.

Statistical Review

Shanta Dube, MPH, PhD

Office on Smoking and Health

National Center for Chronic Disease Prevention (NCCDPHP)

Centers for Disease Control and Prevention (CDC)

4770 Buford Highway, MS K-50

Atlanta, GA 30341

770-488-6287

[email protected]


Sean Hu, MD, MS, DrPH

Office on Smoking and Health

National Center for Chronic Disease Prevention (NCCDPHP)

Centers for Disease Control and Prevention (CDC)

4770 Buford Highway, MS K-50

Atlanta, GA 30341

770.488.5845

[email protected]


Frederica Conrey, PhD

ICF Macro

126 College Street

Burlington, VT 05401

802.264.3745

[email protected]


Data Collection

The representative of the contractor responsible for conducting the planned data collection is:

Naomi Freedner, MPH

ICF Macro

26 College Street

Burlington, VT 05401

[email protected]

802.264.3730


Data Analysis

Analysis of data will be conducted by the individuals listed below.

William Robb, MS

ICF Macro

26 College Street

Burlington, VT 05401

[email protected]

802.264.3740


Shanta Dube, MPH, PhD

Office on Smoking and Health

National Center for Chronic Disease Prevention (NCCDPHP)

Centers for Disease Control and Prevention (CDC)

4770 Buford Highway, MS K-50

Atlanta, GA 30341

770-488-6287

[email protected]


Sean Hu, MD, MS, DrPH

Office on Smoking and Health

National Center for Chronic Disease Prevention (NCCDPHP)

Centers for Disease Control and Prevention (CDC)

4770 Buford Highway, MS K-50

Atlanta, GA 30341

770.488.5845

[email protected]
