Use of Smartphones to Collect Information about Health Behaviors:
Feasibility Study
New
Supporting Statement: Part A
Program official/project officer: Shanta Dube
Office on Smoking and Health
Centers for Disease Control and Prevention
Tel: (770) 488-6287
Email: [email protected]
June 19, 2012
Table of Contents
Section A
A-1 Circumstances Making the Collection of Information Necessary
A-2 Purpose and Use of the Information Collection
A-3 Use of Improved Information Technology and Burden Reduction
A-4 Efforts to Identify Duplication and Use of Similar Information
A-5 Impact on Small Businesses or Other Small Entities
A-6 Consequences of Collecting the Information Less Frequently
A-7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
A-8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A-9 Explanation of Any Payment or Gift to Respondents
A-10 Assurance of Confidentiality Provided to Respondents
A-11 Justification for Sensitive Questions
A-12 Estimates of Annualized Burden Hours and Costs
A-13 Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers
A-14 Annualized Cost to the Government
A-15 Explanation for Program Changes or Adjustments
A-16 Plans for Tabulation and Publication and Project Time Schedule
A-17 Reason(s) Display of OMB Expiration Date is Inappropriate
A-18 Exceptions to Certification for Paperwork Reduction Act Submissions
Appendices
Appendix A Authorizing Legislation
Appendix B1 Federal Register Notice
Appendix B2 Comments in Response to the Federal Register Notice
Appendix C1 CATI Screener
Appendix C2 CATI Informed Consent Statement
Appendix D CATI Recruitment
Appendix E First Web Survey Follow-up for Smartphone Users
Appendix F Second Web Survey Follow-up for Smartphone Users
Appendix G First Text Message Survey Follow-up for non-Smartphone Users
Appendix H Second Text Message Survey Follow-up for non-Smartphone Users
Use of Smartphones to Collect Information about Health Behaviors: Feasibility Study
Background
This is a new Information Collection Request. The National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP), Centers for Disease Control and Prevention (CDC), requests a one-year approval from the Office of Management and Budget to conduct a feasibility study examining the use of smartphones to collect information about health behaviors.
Despite the high level of public knowledge about the adverse effects of smoking, tobacco use remains the leading preventable cause of disease and death in the U.S., resulting in approximately 443,000 deaths annually. During 2005-2010, the overall proportion of U.S. adults who were current smokers declined from 20.9% to 19.3%. Despite this decrease, smoking rates remain well above the Healthy People 2010 target of reducing adult smoking prevalence to 12%, and the decline in prevalence was not uniform across the population.i In addition, smoking has been estimated to cost the United States $96 billion in direct medical expenses and $97 billion in lost productivity each year.ii
One of the highest priorities emanating from the American Recovery and Reinvestment Act of 2009 is tobacco control and cessation programs. In addition, the Family Smoking Prevention and Tobacco Control Act gave the Food and Drug Administration new authority to regulate tobacco products, and the Children’s Health Insurance Program Reauthorization Act of 2009 included increases in Federal excise taxes on tobacco products. These developments reinforce the importance of timely collection of data related to tobacco usage.
CDC’s Office on Smoking and Health (OSH) created the National Tobacco Control Program (NTCP) in 1999 to encourage coordinated efforts nationwide to reduce tobacco-related diseases and deaths. The program provides funding and technical support to state and territorial health departments for comprehensive tobacco control programs. The four goals of the NTCP are to: (1) prevent initiation of tobacco use among young people; (2) eliminate nonsmokers’ exposure to secondhand smoke; (3) promote quitting among adults and young people; and (4) identify and eliminate tobacco-related disparities. The four components of the NTCP are: (1) population-based community interventions; (2) counter-marketing; (3) program policy/regulation; and (4) surveillance and evaluation.
As the communications landscape in the United States evolves, survey research practice is evolving to keep up. Almost 30 percent of adults now use wireless phones exclusively, and almost 20 percent more use their wireless phones for most or all of their calls (Blumberg & Luke, 2010). Cell phone samples are therefore used increasingly to supplement the information collected through random digit dialing (RDD) of landline telephone numbers.
Thus far, however, most of these innovations have been reactive; in other words, they have been changes made in an effort to head off an emerging source of coverage error or a precipitous drop in response rates. Attempts to leverage new technologies to improve the quality and flexibility of survey data collection have been less common. In the past few years, however, cell phones have become more than just landline replacements; they have changed how we expect to communicate with and experience the world.
In 2007, Apple launched the iPhone™, which began a major trend toward internet-enabled cell phone use. By the end of June 2009, an estimated 42 percent of US consumers owned a smartphone. In the third quarter of 2009, almost half of new phones sold were smartphones. One estimate suggests that almost 70 percent of American adults use non-voice phone technologies including web and text messaging (Federal Communications Commission, 2010).
The evolution of completely new, completely mobile communications technologies provides a unique opportunity for innovation in public health surveillance. Text messaging and smartphone web access are immediate, accessible, and confidential, a combination of features that could make smartphones ideal for the ongoing research, surveillance, and evaluation of risk behaviors and health conditions, as well as targeted dissemination of public health messages or other information. Mobile communications technologies have potential immediate applicability to CDC’s tobacco prevention and control activities.
Broadly, this project will focus on process outcomes: identifying and evaluating the process of conducting surveys by smartphone; outcome rates (response rate, cooperation rate, refusal rate, and contact rate); quantitative risk-behavior outcomes used to evaluate the quality of the data collected, such as comparisons of mode-specific item non-response, measurement error, and coverage error; and the value of the surveys. To do this, we propose to conduct a feasibility study with adults ages 18 to 65 that consists of an initial cell phone enrollment interview and two follow-up surveys conducted via smartphone or via text message on conventional cell phones.
The legal justification for the survey may be found in Section 301 of the Public Health Service Act (42 USC 241).
This study will collect information on alcohol and tobacco use, attempts to quit tobacco use, and exposure to forces promoting and discouraging tobacco use. Data on alcohol and tobacco use are generally regarded as being no greater than minimally sensitive. Therefore, the data collection will have little or no effect on the respondent’s privacy. Nevertheless, safeguards will be put in place to ensure that all collected data remain private.
The proposed study involves a minimum amount of information in identifiable form (IIF). Respondents will be recruited through random digit dial, and neither names nor addresses will be collected. The data collection contractor, ICF Macro, will have access to respondents’ phone numbers for recruitment purposes and e-mail addresses in order to contact them for participation in the follow-up surveys and to provide incentive payments. Please refer to section A.10 for a description of how data will be de-identified prior to transmission to CDC.
Overview of the Data Collection System
The purpose of this data collection is to evaluate the utility of smartphones for collecting data. One hundred percent of the recruitment and initial interviews will be conducted on cell phones using Computer Assisted Telephone Interviewing (CATI). Seventy-four percent of the two follow-up surveys will be conducted online via respondents’ smartphones. Twenty-six percent of the follow-up surveys will be conducted via text message with respondents who have feature phones rather than smartphones.
This data collection comprises the following instruments: the CATI screener (Appendix C1); the CATI informed consent statement (Appendix C2); the CATI recruitment interview (Appendix D); two Web survey follow-ups for smartphone users (Appendices E and F); and two text message survey follow-ups for non-smartphone users (Appendices G and H).
Items of Information to be Collected
Process evaluation as well as outcome evaluation data will be collected. Risk behavior questions will be drawn from previous OMB-approved data collections, such as the Youth Risk Behavior Survey (OMB No. 0920-0493, exp. 11/30/2011) and the National Adult Tobacco Survey (OMB No. 0920-0828, exp. 10/31/2010). The following topics will be addressed in the initial CATI and/or follow-up surveys:
Process Evaluation
Type of cell phone
Type of cell phone plan (prepaid, unlimited minutes, etc.)
Delay between text invitation and initial login
Volume and type of Helpdesk questions
Percent of survey break-offs and time-outs
Percent of text-based opt-outs
Difficulties encountered by staff with different kinds of smartphones “seeded” into the sample
Percent of PayPal email bouncebacks
Percent of respondents reporting difficulties obtaining the incentive
Outcome Rate Evaluation
Response rate
Cooperation rate
Refusal rate
Contact rate
Risk Behavioral Outcome Evaluation
Alcohol consumption
Tobacco use
Attempts to quit tobacco use, and
Exposure to forces promoting and discouraging tobacco use
Identification of Websites and Website content Directed at Children under 13 Years of Age
All respondents will be over the age of 18. When a potential respondent is contacted for the CATI interview, he or she first completes a screener to determine study eligibility (see Appendix C1). Specifically, the person answering the telephone is asked whether he or she has been reached on a cellular telephone and is aged 18 or older. If the person is not on a cellular telephone, or is less than 18 years old, the interview is terminated. There will be no websites with content directed at children under 13 years of age. A Web survey will be developed for the smartphone follow-up surveys. This survey will only be accessible to study participants by entering a unique ID, which will be sent via email to an address provided by the participant. Once a survey has been completed using a specific ID, the survey cannot be accessed again. IDs will be generated randomly and will be non-sequential, making it highly unlikely that a respondent could input another participant’s ID and gain access to the survey.
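The access-control scheme described above (random, non-sequential, single-use IDs) can be sketched as follows. This is an illustrative sketch only, not the contractor's implementation; the alphabet, ID length, and function names are assumptions.

```python
import secrets
import string

# Assumed ID format: 8 characters from a 36-symbol alphabet gives
# 36**8 (~2.8 trillion) possibilities, so guessing a valid ID is unlikely.
ALPHABET = string.ascii_uppercase + string.digits

def generate_ids(n, length=8):
    """Generate n unique, random, non-sequential survey IDs."""
    ids = set()
    while len(ids) < n:
        ids.add("".join(secrets.choice(ALPHABET) for _ in range(length)))
    return ids

def redeem(valid_ids, used_ids, entered_id):
    """Grant access only for a valid, not-yet-used ID; once a survey
    is completed with an ID, that ID cannot be used again."""
    if entered_id in valid_ids and entered_id not in used_ids:
        used_ids.add(entered_id)
        return True
    return False
```

Using a cryptographic source of randomness (`secrets`) rather than a sequential counter is what makes one respondent's ID uninformative about any other's.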
The overall goals of the feasibility study are to determine the following:
Quality of the solution: How smartphones should be used to collect the best possible quality data.
Technological feasibility: Whether and under what circumstances smartphones can be used to collect population-based public health and behavior data.
Cost effectiveness: How much such a data collection would cost compared to more conventional data collection approaches.
The data collected by this method differ in nature from those collected by traditional public health surveys. Advances in smartphone technology would make smartphones a potential tool for public health surveillance. We would continually monitor new innovations and adjust our system to take advantage of the latest technology available. Other future improvements may include the addition of voice capability and provision of survey materials in Spanish and other languages. A more comprehensive and thorough understanding of the goals above would help CDC better prepare for public health surveillance in the future.
Quality of the Solution
Perhaps the most important goal of the evaluation is to determine whether and how smartphones or text messages to cell phones can be leveraged to yield “good” data. Obtaining high quality data is a matter of minimizing survey error. For the purposes here, it is useful to distinguish between error associated with the sample and error associated with survey responses. From our perspective “error” can mean a systematic bias in the results or random noise in the data.
Young people and renters, for instance, are heavy cell phone users, but older Americans are much less likely to have adopted a wireless-only lifestyle (Blumberg & Luke, 2010). Statistically adequate coverage of a specific, young population might be feasible, but coverage of all American adults might not. The study will include respondents ages 18-65 in order to evaluate population coverage of both younger and older respondents.
In the pilot survey, we will evaluate quality of:
Sample. We will evaluate unit non-response by comparing people who opt in to those who do not, people who have smartphones to those who do not, and people who comply with the smartphone surveys to those who do not.
Data. We will compare item non-response in the telephone, smartphone, and text message surveys.
To determine how smartphone surveys can be implemented, we will conduct a process evaluation as part of the feasibility study to document lessons learned. Individual technologies grow through commercial innovation and are often not standard across carriers or phone or plan types. For smartphone surveys to work, we need to answer questions like:
What is the maximum message length we can send to phones?
What is the best way to send messages? Is there a standard mechanism used by several carriers?
How does the text message appear on the phone screen?
How does the internet browser on the phone operate? Does the survey need to be customized for the browser?
What is the delay between text invitation and initial login?
What volume and types of Helpdesk questions are received?
What percent of surveys end in break-offs or time-outs?
What percent of respondents opt out by text?
What difficulties do staff encounter with the different kinds of smartphones “seeded” into the sample?
What percent of PayPal emails bounce back?
What percent of respondents report difficulties obtaining the incentive?
We will evaluate the relative cost-effectiveness of the different survey types being piloted (CATI interviews, smartphone surveys, and text message surveys). For instance, we might discover that the cost of conducting a series of brief surveys via smartphone (i.e., a diary study) is $x per question and that the item response rate on sensitive questions is y%. A similar study conducted retrospectively by cell phone might cost $a per question and achieve b% response. Even if the smartphone cost is higher ($x > $a), if the ratio of cost to quality ($x:y%) is lower, then smartphone research might be preferable to conventional methods. We will evaluate this cost:quality ratio for smartphone vs. conventional RDD cell phone surveys with 18-65 year olds and, if cell sizes are large enough, for subpopulations including smokers, 18-34 year olds, and minorities.
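The cost:quality comparison above can be made concrete with a short sketch. The dollar figures and response rates below are purely illustrative assumptions, not estimates from the study.

```python
def cost_quality_ratio(cost_per_question, item_response_pct):
    """Dollars per question per percentage point of item response;
    a lower ratio means more data quality per dollar spent."""
    return cost_per_question / item_response_pct

# Hypothetical figures: smartphone costs more per question ($x > $a)
# but achieves a much higher item response rate on sensitive items.
smartphone = cost_quality_ratio(0.50, 90.0)    # $x = $0.50, y = 90%
conventional = cost_quality_ratio(0.30, 50.0)  # $a = $0.30, b = 50%

# Here the smartphone ratio is lower despite the higher per-question
# cost, so under these assumed figures smartphone research would be
# preferable to the conventional mode.
```
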
CDC will use the results of the study to create a Standard Operating Procedure (SOP) for implementing smartphone and text message research. This SOP will contain details on the circumstances under which smartphone research can produce high quality data, the circumstances under which it is affordable compared to telephone methodology, and the specific methods by which a smartphone survey can be carried out.
This study will collect information on alcohol and tobacco use, attempts to quit tobacco use, and exposure to forces promoting or discouraging tobacco use. Data on alcohol and tobacco use are generally regarded as being no greater than minimally sensitive. Therefore, the data collection will have little or no effect on the respondent’s privacy. However, because data on alcohol use will be collected from respondents younger than 21 years of age, safeguards will be put in place to ensure that all collected data remain private. First, at the onset of the initial CATI interview, interviewers will ask respondents whether they are in a place where they can answer questions privately. In addition, although the questions touch on sensitive topics (for example, “In the past year, how often did you drink any type of alcoholic beverage?”), participants’ responses will not divulge any sensitive information; responses will typically be yes, no, a numeric value, or a frequency (every day, sometimes, never, etc.). For the follow-up surveys, respondents will enter responses directly into their cell phones. Data from the initial CATI interview will be stored on secure servers, and data from the follow-up surveys will be transmitted to the contractor, ICF, via secured data transmission.
The proposed study involves a minimum amount of information in identifiable form (IIF). Respondents will be recruited through random digit dial, and neither names nor addresses will be collected. The data collection contractor will have access to respondents’ phone numbers for recruitment purposes and e-mail addresses in order to contact them for participation in the follow-up surveys and to provide incentive payments. Please refer to section A.10 for a description of how data will be de-identified prior to transmission to CDC.
One hundred percent of the initial CATI information collection involves the use of Computer Assisted Telephone Interviewing (CATI) to reduce burden to the respondent and permit electronic collection and submission of responses. The follow-up information collection will also be electronic; 26% will be conducted via text messages and 74% will be conducted via Web survey through a link provided via text message to respondents’ smartphones.
Collecting data with smartphones has been the subject of some research. A project from the University of Washington called the Open Data Kit (http://opendatakit.org/) uses open source software to collect and compile survey data on smartphones in support of surveys in developing nations, but the focus of this project is surveys conducted by interviewers in the field for which data are recorded on a smartphone.
Several studies in the past two years have examined smartphone surveys. In 2011, the Annual Meeting of the American Association for Public Opinion Research hosted a symposium on smartphone and iPad data collection at which several researchers presented work on implementing surveys in these contexts. These studies, however, focused on the design and presentation of smartphone surveys; none specifically addressed response and coverage bias, major concerns of the present research. In evaluating data quality and cost, the present research represents an important extension of this prior work, which has focused on best practices.
No small businesses will be involved in this data collection.
Because the proposed study is a feasibility study, this data collection will be conducted only once. As noted above, CDC will use the results of the study to create a Standard Operating Procedure (SOP) for implementing smartphone and text message research. There are no legal obstacles to reducing the burden.
There are no special circumstances with this information collection request. This request fully complies with the guidelines of 5 CFR 1320.5.
A 60-day Notice was published in the Federal Register on February 27, 2012 (see Appendix B1), Vol. 77, No. 38, pp. 11545-11547. One public comment was received and acknowledged (see Appendix B2).
Because of the limited scope of this feasibility study, CDC did not consult with persons outside the agency on the design of the study.
Study participants will be paid up to $10 for their participation in the feasibility study. This amount was determined based on the time commitment and effort asked of respondents; it reimburses respondents for the data they use in completing the surveys and will help prevent panel attrition. The estimated maximum cost to participants is $4.40, which would be covered in full by the proposed incentive payment.
Specifically, smartphone pilot participants will be asked to participate in three separate surveys (an initial phone interview and two follow-up smartphone surveys), so they represent a panel of participants. Effectively incentivizing a smartphone panel is difficult. Ideally, the survey will not collect respondent addresses, since preserving respondents’ sense of anonymity is an important part of obtaining accurate data. Without addresses, however, we can transmit financial incentives in only two ways: using an internet payment service such as PayPal as a go-between, or sending an electronic gift code to the cell phone. We will offer all respondents the choice between these options. For PayPal reimbursement, respondents will need to provide email addresses and have a PayPal account. If they prefer not to provide their email addresses, we will send them Amazon.com gift codes via text message. The former is more like cash; the latter is easier and preserves privacy. Which of these options respondents choose is itself an outcome of interest.
In conventional surveys, prompt payment of incentives helps promote response, but in this case, the small amounts involved make incremental payment infeasible. A $3 or $4 Amazon.com gift code does not have much real value. Paying the entire incentive at the end of the survey cycle seems ideal. However, keeping respondents engaged with the incentive is important. We will use a “points” system like that employed by some internet panels. Each survey that respondents participate in will gain them three or four points, for a maximum of 10 points for completion of all three surveys. The survey login screen for each survey will display the respondent’s current number of points allocated. At the end of the last survey, respondents will be reimbursed for their participation at one dollar per survey point (with a minimum of $3 for respondents who completed the initial phone interview but did not participate in either of the follow-up surveys).
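The points scheme above can be sketched as follows. The per-survey point split is an assumption (3 + 3 + 4 = 10); the text states only "three or four points" per survey, a 10-point maximum, and a $3 minimum for respondents who completed only the initial interview.

```python
# Assumed allocation: 3 points for the initial CATI interview and for
# the first follow-up, 4 for the second, totaling the stated 10 points.
POINTS = {"initial_cati": 3, "follow_up_1": 3, "follow_up_2": 4}

def incentive_dollars(completed):
    """Reimburse $1 per point at the end of the survey cycle.

    Under this assumed split, a respondent who completed only the
    initial interview earns 3 points, matching the stated $3 minimum.
    """
    return sum(POINTS[survey] for survey in completed)
```
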
CDC has determined that this activity does not constitute research with human subjects and therefore does not require IRB approval. Prior to initiating data collection, the project director will train all interviewers.
This study will collect information on tobacco use and alcohol use in the survey component. The data are being gathered to determine whether it is feasible to collect this information (historically collected by mail, telephone, or in person surveys) through internet surveys on smartphones.
This submission has been reviewed by staff in CDC’s Information Collection Review Office, who determined that the Privacy Act does not apply. Respondent names and addresses will not be collected for the smartphone feasibility study.
Precautions will be taken in how the data are handled to prevent a breach of security. Survey data and identifying information about respondents (phone number and email address) will be handled in ways that prevent unauthorized access at any point during the study. To maintain security, the telephone number and email address of the respondent will be excluded from the final data set. If reports or tabular data are submitted, the data will be reviewed to determine whether subjects could be identified from small cell counts. If there is the potential for identification (a cell count of fewer than 30 records), the data in those cells will be removed. Respondents will be told during the initial screener that the information they provide will be kept private. All interviewers will be required to sign a non-disclosure agreement on the date of hire, which will be reinforced at training.
Verbal consent will be elicited from participants. Before the telephone interview, the interviewer will read the informed consent script (see Appendix C2) to each participant. The consent script describes the study, the types of questions that will be asked on the actual surveys, the risks and benefits of participation, and participants’ rights, and it provides information on whom to contact with questions about any aspect of the study. The consent script also indicates that participation is completely voluntary and that participants can refuse to answer any question or discontinue the study at any time without penalty or loss of benefits. The interviewer will enter a code via the keyboard to signify that the participant was read the informed consent script and agreed to participate.
Participation in the survey is voluntary. Interviewers will tell respondents that “Any information you give me will be treated in a secure manner and will not be disclosed, unless otherwise compelled by law." Interviewers will also tell respondents: "I will not ask for your last name, address, or other personal information that can identify you. You do not have to answer any question you do not want to, and you can end the study at any time.”
The surveys (Appendix D: CATI Recruitment; Appendix E: First Web Survey Follow-up for Smartphone Users; Appendix F: Second Web Survey Follow-up for Smartphone Users; Appendix G: First Text Message Survey Follow-up for non-Smartphone Users; Appendix H: Second Text Message Survey Follow-up for non-Smartphone Users) ask about general tobacco and alcohol use, and include demographic questions such as respondent’s race and ethnicity.
While an individual may be sensitive about answering questions about tobacco and alcohol use, the items are, for the most part, not of a sensitive nature and are commonly found in surveys of health behavior. Data on tobacco use are generally regarded as being no greater than minimally sensitive. Therefore, the data collection will have little or no effect on the respondent’s privacy. Nevertheless, safeguards will be put in place to ensure that all collected data remain private. There are no questions concerning illegal drug use or other criminal acts. There are no questions about emotionally charged experiences such as parental or sexual abuse. Race and ethnicity questions will conform to OMB standards.
The total average time to recruit, screen, and conduct the CATI interview is eight minutes or less. Approximately 30-60 seconds are needed to introduce the survey and screen for an eligible respondent (Appendix C1, CATI Screener), one minute is devoted to the informed consent process, and seven minutes are required to complete the interview (Appendix D, CATI Recruitment). Thereafter, each respondent will be routed to two follow-up surveys. Slightly different versions of the follow-up survey will be used for smartphone users (Appendices E and F) and non-smartphone users (Appendices G and H). The estimated burden per response for all follow-up surveys is three minutes. We expect that 70% of the respondents completing the initial CATI interview will agree to and complete the first follow-up survey, and that 85% of those completing the first follow-up will complete the second. Prior to fielding the study, the data collection contractor will pre-test instruments and procedures with 20 respondents, for a total estimated pre-test burden of three hours.
The total estimated annualized burden for the Smartphone Study and associated support activities is 236 hours.
Estimated Annualized Burden Hours
Type of Respondents | Form Name | Number of Respondents | Number of Responses per Respondent | Avg. Burden per Response (in hours) | Total Burden (in hours)
Adults Aged 18 to 65, All cell phone users | Pre-test (CATI Screener/CATI Recruitment) | 20 | 1 | 8/60 | 3
 | CATI Screener | 1,990 | 1 | 1/60 | 33
 | CATI Recruitment | 995 | 1 | 7/60 | 116
Adults Aged 18 to 65, Smartphone Users | First Web Survey Follow-up for Smartphone Users | 697 | 1 | 3/60 | 35
 | Second Web Survey Follow-up for Smartphone Users | 592 | 1 | 3/60 | 30
Adults Aged 18 to 65, Non-smartphone Users | First Text Message Survey Follow-up for non-Smartphone Users | 200 | 1 | 3/60 | 10
 | Second Text Message Survey Follow-up for non-Smartphone Users | 170 | 1 | 3/60 | 9
Total | | | | | 236
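The arithmetic behind the 236-hour total can be checked with a short script: respondents times responses times minutes per response, rounded half-up to whole hours as in the burden table.

```python
# (respondents, responses per respondent, minutes per response),
# taken directly from the Estimated Annualized Burden Hours table.
rows = [
    (20, 1, 8),    # pre-test (CATI screener/recruitment)
    (1990, 1, 1),  # CATI screener
    (995, 1, 7),   # CATI recruitment
    (697, 1, 3),   # first Web follow-up (smartphone users)
    (592, 1, 3),   # second Web follow-up (smartphone users)
    (200, 1, 3),   # first text message follow-up
    (170, 1, 3),   # second text message follow-up
]
# int(x + 0.5) rounds half-up (170 * 3 / 60 = 8.5 -> 9, as tabulated).
total_hours = sum(int(n * k * m / 60 + 0.5) for n, k, m in rows)  # 236
```
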
The total estimated cost to respondents is $5,428, using $23 per hour as the average hourly wage (http://www.bls.gov/ncs/ocs/sp/nctb1475.pdf). Additional information is provided in the following table.
Estimated Annualized Burden Costs
Type of Respondents | Form Name | No. of Respondents | Total Burden (in hours) | Avg. Hourly Wage Rate | Total Annualized Respondent Costs
Adults Aged 18 to 65, All cell phone users | Pre-test (CATI Screener/CATI Recruitment) | 20 | 3 | $23 | $69
 | CATI Screener | 1,990 | 33 | $23 | $759
 | CATI Recruitment | 995 | 116 | $23 | $2,668
Adults Aged 18 to 65, Smartphone Users | First Web Survey Follow-up for Smartphone Users | 697 | 35 | $23 | $805
 | Second Web Survey Follow-up for Smartphone Users | 592 | 30 | $23 | $690
Adults Aged 18 to 65, Non-smartphone Users | First Text Message Survey Follow-up for non-Smartphone Users | 200 | 10 | $23 | $230
 | Second Text Message Survey Follow-up for non-Smartphone Users | 170 | 9 | $23 | $207
Total | | | | | $5,428
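The respondent cost total is simply the burden hours for each activity multiplied by the $23 average hourly wage:

```python
WAGE = 23  # average hourly wage, per the BLS source cited above
burden_hours = [3, 33, 116, 35, 30, 10, 9]  # from the burden table
total_cost = sum(h * WAGE for h in burden_hours)  # $5,428
```
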
There are no capital, startup, operational, or maintenance costs to respondents other than the costs of cell phone talk minutes and data.
The total contract award to the data collection contractor, Macro International, Inc., is $95,000 over a 12-month period. These costs cover the activities in Table A-14 below.
Additional costs will be incurred indirectly by the government in personnel costs of staff involved in oversight of the study and in conducting data analysis. It is estimated that 2 CDC employees will be involved for approximately 10% of their time (for federal personnel 100% time = 2,080 hours annually). The two salaries are $50.03 and $56.48 per hour. The direct annual costs in CDC staff time will be approximately $22,154 annually.
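The personnel arithmetic above works out as follows: 10% of a 2,080-hour federal year is 208 hours per employee, at each employee's hourly rate.

```python
HOURS = int(0.10 * 2080)   # 208 hours per employee per year
rates = [50.03, 56.48]     # hourly salaries of the two CDC staff
personnel = sum(round(rate * HOURS) for rate in rates)  # $22,154
total_with_contract = personnel + 95_000                # $117,154
```
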
The total annualized cost for the study over a 12-month period, including the contract cost and federal government personnel cost is $117,154.
Type of Cost | Description | Cost
CDC Personnel | 10% of GS-14 @ $104,062/year | $10,406
 | 10% of GS-14 @ $117,478/year | $11,748
 | Subtotal, CDC Personnel | $22,154
Contractual Costs | Conduct focus groups, instrument design, survey programming, data collection and cleaning, analysis, report writing | $95,000
Total | | $117,154
This is a new, one-time data collection.
Schedule
Phase | Activity | Timeframe
Planning I | CDC approves work plan | Prior to OMB approval
 | Project kickoff |
 | Draft study design plan |
 | Existing knowledge review |
Focus Groups | Focus group recruiting begins | Prior to OMB approval
 | Develop focus group moderator guide |
 | Conduct 2 focus groups* |
 | Draft focus group report |
Planning II | Study design plan update | 1 week following OMB approval
Pilot | Program data collection instruments | 1 week following OMB approval
 | Data collection begins | 2 weeks following instrument programming
 | Data collection ends | 6 weeks after data collection begins
Reporting | Draft process evaluation/survey methods report | 1 week after data collection ends
 | Draft outcomes evaluation and cost analysis | 4 weeks after data collection ends
 | Final report | 6 weeks after data collection ends
*Combined, the 2 focus groups will have no more than 9 participants.
The evolution of mobile communications technologies provides a unique opportunity for innovation in public health surveillance. Text messaging and smartphone web access are immediate, accessible, and confidential, a combination of features that could make them ideal for ongoing research, surveillance, and evaluation of risk behaviors and health conditions. We will explore the perceived feasibility, advantages, and disadvantages of conducting a population-based survey via smartphone. A deeper understanding of the factors that promote and hinder participation will be useful in creating a population-based smartphone pilot survey. We will determine and describe technological feasibility: whether and under what circumstances smartphones can be used to collect population-based public health and behavior data. In addition, response, contact, cooperation, and refusal rates will be calculated separately for the smartphone Web survey and the text message survey. As mobile communications continue to evolve, a better understanding of how smartphones can be used to collect data on risk behaviors and health conditions is critical to public health surveillance and evaluation efforts.
The results of the study will be presented at the 2012 and 2013 American Association for Public Opinion Research annual meetings.
The OMB expiration date will be displayed on all data collection instruments.
There are no exceptions to the certification statement.
i CDC. Vital signs: current cigarette smoking among adults aged ≥18 years, United States, 2005-2010. MMWR 2011;60(35):1207-1212.
ii CDC. Smoking-attributable mortality, years of potential life lost, and productivity losses, United States, 2000-2004. MMWR 2008;57:1226-1228.