Evaluation of the Parents Speak Up National Campaign: National Media Tracking Surveys

OMB: 0990-0345

B. Collection of Information Employing Statistical Methods

B.1 Respondent Universe and Sampling Methods

The current Knowledge Networks panel consists of approximately 40,000 adults actively participating in research. The Web-enabled panel closely tracks the U.S. population in terms of age, race, Hispanic ethnicity, geographic region, employment status, and other demographic characteristics. The panel is recruited through random-digit dialing (RDD) and comprises both Internet and non-Internet households.

The PSUNC National Media Tracking Survey will be conducted as a longitudinal online survey of a large nationwide Web-enabled panel of parents of 10- to 14-year-olds in the United States. Data collection will begin in fall 2009, during the fall media flight of the national television campaign, with a baseline survey of approximately 2,000 parents selected from the Knowledge Networks panel. The sample will be selected using probability methods, with sample design weights based on the composition of the sample. To further reduce the effects of non-sampling error, non-response and post-stratification weighting adjustments will be applied to the sample.
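The post-stratification step can be illustrated with a minimal sketch. The demographic cells, benchmark shares, and function below are hypothetical placeholders for illustration only; the actual PSUNC weights will be computed by Knowledge Networks against its panel and population benchmarks.

```python
# Minimal sketch of post-stratification weighting. The cells and benchmark
# shares are illustrative placeholders, not PSUNC values.

def poststratify(respondents, benchmarks):
    """Weight each respondent so the weighted sample matches known
    population proportions within each demographic cell."""
    counts = {}
    for r in respondents:
        counts[r["cell"]] = counts.get(r["cell"], 0) + 1
    n = len(respondents)
    # Weight = population share of cell / sample share of cell.
    weights = {cell: benchmarks[cell] / (counts[cell] / n) for cell in counts}
    return [weights[r["cell"]] for r in respondents]

# Hypothetical benchmark shares (sum to 1 across cells).
benchmarks = {"mother_hs": 0.30, "mother_college": 0.25,
              "father_hs": 0.25, "father_college": 0.20}
sample = ([{"cell": "mother_hs"}] * 40 + [{"cell": "mother_college"}] * 20 +
          [{"cell": "father_hs"}] * 25 + [{"cell": "father_college"}] * 15)
w = poststratify(sample, benchmarks)
print(round(sum(w), 1))  # weighted n equals the unweighted n: 100.0
```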

Study participants who complete the baseline survey will be surveyed again in follow-up surveys beginning in 2010. The timing of the longitudinal follow-up surveys will be contingent on the timing of the campaign’s media flights. If possible, the follow-up surveys will occur during or shortly after the spring and fall media flights in each year from 2010 through 2013. Each survey will be approximately 20 minutes in length and will contain the same items that have already been asked in the PSUNC Efficacy Study survey. The media tracking survey will, however, include additional questions on parents’ self-reported exposure to PSUNC ads, using measures of “confirmed awareness” (i.e., exposure demonstrated by respondents describing specific campaign messages). These measures will provide a nationwide estimate of awareness of the PSUNC ads. The sample for each longitudinal follow-up survey will consist of 500 parents in the spring and 500 in the fall, for a total of 1,000 interviews per year. Each follow-up survey will be refreshed with new participants to replace those who drop out of the study, keeping sample sizes constant for each survey over time. Planned sample sizes for each national survey, by year, are provided elsewhere in this submission.

We conducted power analyses to determine the optimal sample size for detecting statistically significant differences between treatment and control groups. Because fathers and mothers are expected to differ in parent-child communication, and possibly in survey response rates, they will be assessed separately. Because no published data are available on expected baseline and follow-up rates of father-child communication or survey response, power calculations are based on data for mothers. The frequency with which parents report having spoken to their children about sex serves as the primary outcome measure, and responses were dichotomized as “often” versus “not often” (“sometimes,” “seldom,” or “never”) for the purposes of the power calculations. Power calculations were based on the comparison between parents who report exposure to the campaign and those who do not. Several assumptions were made about population parameters. First, we assumed a 0.7 correlation between outcomes measured at baseline and at 2-year follow-up for the same respondent. Although there is little definitive information about the true correlation over 24 months, evidence from studies of parent involvement in other teen risk behaviors suggests that the correlation is no stronger than assumed here (Wills, Sandy, Yaeger, & Shinar, 2001). Second, because mothers will be sampled separately from fathers, we assumed that outcomes for different respondents are uncorrelated. Third, we assumed that 16% of mothers will report communicating often with their child about waiting to have sex at baseline and that 22% will report this at 2-year follow-up, as reported by Klein et al. (2005). Each of these assumptions is conservative, resulting in larger required sample sizes for our evaluation. The assumed change from 16% to 22% at 2-year follow-up safeguards against the possibility that PSUNC public service announcement messages produce very small effects in the short term.
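The sample-size logic can be sketched under these stated assumptions. The sketch below uses a standard two-proportion formula with a (1 − ρ²) variance deflation to reflect baseline adjustment at the assumed 0.7 correlation; this is one common approximation, and the submission does not state the exact formula used, so the result is illustrative rather than the project’s actual computation.

```python
# Approximate per-group n for detecting a shift from 16% to 22% in the
# proportion of mothers reporting frequent communication, with two-sided
# alpha = 0.05, 80% power, and a (1 - rho^2) variance deflation for
# baseline adjustment at rho = 0.7. Illustrative only, not the project's
# actual calculation.
from scipy.stats import norm

def n_per_group(p1, p2, rho, alpha=0.05, power=0.80):
    z_a = norm.ppf(1 - alpha / 2)   # 1.96 for two-sided alpha = 0.05
    z_b = norm.ppf(power)           # 0.84 for 80% power
    var = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_a + z_b) ** 2 * var / (p2 - p1) ** 2
    return n * (1 - rho ** 2)       # baseline adjustment shrinks the variance

print(round(n_per_group(0.16, 0.22, rho=0.7)))  # roughly 340 per group
```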

All assumptions guiding our power analysis were intended to err in favor of a larger sample size, safeguarding against a worst-case scenario in which effects are difficult to detect. These assumptions increase our confidence that message effects even smaller than those found by previous prevention programs would be reasonably detectable with the sample sizes we identified.

As noted earlier, our sample design is based on conservative assumptions about survey response. Our estimates of longitudinal retention rates should therefore be viewed as “worst case” scenarios that, if they hold true, would still ensure sufficient sample sizes to reasonably detect small message effects. We estimate that at least 75% of mothers who complete the baseline survey will be retained to complete the spring and fall 2011 surveys. The sample will be refreshed each wave to replace participants who drop out, maintaining a constant, representative sample in each wave.
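The arithmetic of the refreshment step, assuming the worst-case 75% wave-to-wave retention and the planned 500 completes per follow-up wave, is simple to sketch:

```python
# Refreshment needed to hold each follow-up wave at its target size,
# under the worst-case 75% retention assumption stated above.
target = 500          # planned completes per follow-up wave
retention = 0.75      # assumed wave-to-wave retention

returning = int(target * retention)  # 375 prior-wave completers retained
refresh = target - returning         # 125 new panelists needed per wave
print(returning, refresh)            # 375 125
```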

Exhibit 8 shows longitudinal retention rates for prior Knowledge Networks studies of various lengths. While we assume follow-up survey response rates as low as 75%, the average follow-up completion rate across the prior Knowledge Networks studies listed in Exhibit 8 is over 90%, with baseline-to-follow-up retention averaging approximately 80% across follow-ups ranging from 3 months to 3 years (these averages are recomputed in the sketch following Exhibit 8). We expect similar retention patterns for this study.

Exhibit 8. Longitudinal Completion and Retention Rates for Prior Knowledge Networks Studies

| Project                   | Institution/Client         | Sample                       | Survey | Time from Baseline | Follow-up Survey Completion Rate (%) | Baseline to Follow-up Retention Rate (%) |
|---------------------------|----------------------------|------------------------------|--------|--------------------|--------------------------------------|------------------------------------------|
| Stress and Trauma Survey  | UC Irvine                  | 18+ General Pop              | Wave 7 | 3 years            | 94                                   | 48                                       |
| Menopausal Women Survey   | RTI                        | Female 40-65                 | Wave 2 | 2 years            | 89                                   | 71                                       |
| National Seafood Study    | NOAA                       | 18+ Primary Grocery Shoppers | Wave 2 | 4 months           | 90                                   | 97                                       |
|                           |                            |                              | Wave 3 | 8 months           | 92                                   | 94                                       |
| Chronic Opioid Survey     | RTI                        | 18+ General Pop              | Wave 2 | 3 months           | 95                                   | 96                                       |
| National Health Follow-Up | University of Pennsylvania | 18+ General Pop              | Wave 2 | 1 year             | 90                                   | 78                                       |
| 2004 Election Survey      | Ohio State University      | 18+ General Pop              | Wave 3 | 7 months           | 77                                   | 84                                       |
| 2004 Biotech Survey       | Northwestern University    | 18+ General Pop              | Wave 3 | 11 months          | 96                                   | 75                                       |


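As a quick check, the summary figures cited in the text can be recomputed directly from the rates in Exhibit 8:

```python
# Recompute the Exhibit 8 summary statistics cited in the text.
completion = [94, 89, 90, 92, 95, 90, 77, 96]  # follow-up completion rates (%)
retention = [48, 71, 97, 94, 96, 78, 84, 75]   # baseline-to-follow-up retention (%)

print(sum(completion) / len(completion))  # 90.375 -> "over 90" percent
print(sum(retention) / len(retention))    # 80.375 -> approximately 80 percent
```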

B.2 Procedures for the Collection of Information

In partnership with Knowledge Networks, we will select a sample of 2,000 parents or parent surrogates (e.g., stepmother, grandfather, foster parent) of children aged 10 to 14. When the study is assigned to sampled panel members, they will receive notice in their password-protected e-mail account that the survey is available for completion. Nonrespondents will receive two e-mail reminders from Knowledge Networks requesting their participation. The surveys will be self-administered and accessible at any time of day for a designated period, and participants can complete the survey only once. Mothers and fathers will be selected separately to avoid biasing the sample, and male and female participant screeners will be used to determine study eligibility. Eligible participants are English-speaking parents or parent surrogates of children aged 10 to 14. Informed consent will be sought from parents for participation in the Web survey; parents will consent by selecting the appropriate link on the Web screen. Members may leave the panel at any time, and receipt of the Web TV and Internet service is not contingent on completion of the study.
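The eligibility logic described above can be sketched as follows; the field names are hypothetical, and the actual male and female screener instruments are separate documents in this submission.

```python
# Minimal sketch of the eligibility screening logic. Field names are
# hypothetical placeholders, not items from the actual screeners.

def is_eligible(panelist):
    """English-speaking parent or parent surrogate of a child aged 10-14."""
    return (panelist["speaks_english"]
            and panelist["is_parent_or_surrogate"]
            and any(10 <= age <= 14 for age in panelist["child_ages"]))

print(is_eligible({"speaks_english": True,
                   "is_parent_or_surrogate": True,
                   "child_ages": [8, 12]}))  # True: the 12-year-old qualifies
```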

An incentive of 20,000 Knowledge Networks bonus points (equivalent to $20 cash) will be offered to participants who complete each survey. Parents may be difficult to engage in a survey about this sensitive topic without a small incentive. The incentive is intended to recognize the time burden placed on participants, encourage their cooperation, and convey appreciation for contributing to this important study over nine data collection periods. A detailed description of Knowledge Networks’ panel recruitment methodology is provided with this submission.

B.3 Methods to Maximize Response Rates and Deal with Nonresponse

The following procedures will be used to maximize cooperation and achieve the desired high response rates:

• Recruitment is conducted through the Knowledge Networks Web-enabled panel, for which survey response rates average 70% to 75%.

• An incentive of 20,000 Knowledge Networks bonus points (equivalent to $20 cash) will be offered to participants who complete each survey.

• An attempt will be made to locate participants who leave the Knowledge Networks panel before the end of the study. These efforts will include mailing refusal conversion materials designed to persuade participants to complete the study; Knowledge Networks may also contact attriting participants by telephone for refusal conversion.

• Knowledge Networks will provide a toll-free telephone number to all sampled individuals and invite them to call with any questions or concerns about any aspect of the study.

• Knowledge Networks data collection staff will work with RTI project staff to address concerns that may arise.

B.4 Tests of Procedures or Methods to be Undertaken

Knowledge Networks implemented an eight-case pilot test of the survey instrument for the OMB-approved PSUNC Parent Efficacy Study. This survey is virtually identical to the instrument that will be used for the PSUNC National Media Tracking Surveys. The purpose of the pilot test was twofold: (1) to assess technical aspects and functionality of the survey instrument and (2) to identify areas of the survey that were either unclear or difficult to understand. The primary difference between the pilot test instrument and the PSUNC National Media Tracking Surveys is that the media tracking surveys include a few additional questions that ask about parent awareness of and reaction to PSUNC ads they may have seen. These additional questions have been validated and used in a number of other similar studies and thus do not need to be piloted.

Pilot test data collection was conducted during July 2006. Eligible participants came from a convenience sample of Knowledge Networks panel members who are parents or parent surrogates (e.g., stepmother, grandfather, foster parent) of children aged 10 to 14. Participants self-administered the baseline Web survey at home on personal computers. To obtain eight completed questionnaires, Knowledge Networks invited a total of 14 panelists to participate. Parents selected for the study received an e-mail message from Knowledge Networks alerting them that they had a survey assignment, and nonrespondents received two e-mail reminders requesting their participation. Mothers and fathers were selected separately, and participant screeners were used to determine study eligibility. Participants were administered the core efficacy study survey instrument, including questions on parent-child communication, attitudes and beliefs, perceptions of child sexual activity, parental involvement and monitoring, and demographics.

Nine parents were initially contacted by e-mail. After the two e-mail reminders from Knowledge Networks, additional parents were added to the sample, for a total of 14 parents contacted. Among those invited, 1 parent was found to be ineligible and 8 parents completed the survey, for a 57% response rate.
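The disposition arithmetic implied by these figures suggests that the 57% rate was computed as completes divided by all invited panelists:

```python
# Pilot test disposition. The cited 57% corresponds to completes / invited;
# excluding the ineligible case would instead give about 62%.
invited, ineligible, completed = 14, 1, 8
print(round(100 * completed / invited))                 # 57
print(round(100 * completed / (invited - ineligible)))  # 62
```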

In addition to the core questionnaire items from the PSUNC public service announcement survey, the pilot test instrument included items assessing participants’ comfort in answering the questions, the seriousness and honesty of their answers, the length of the survey, and their reactions to the overall instrument and to specific questions. Responses generally suggest that pilot test participants understood the survey and answered honestly. Although all eight participants indicated that there were no specific questions they did not feel comfortable answering, five nonetheless reported feeling very uncomfortable answering questions on the survey overall. Participant comfort was explored further in the post-survey debriefings (discussed below). A summary of findings from these questionnaire items is provided in Exhibit 9.

Exhibit 9. Pilot Test Responses to Questions about Survey Instrument

| Pilot Test Item                                                        | Response               | Frequency |
|------------------------------------------------------------------------|------------------------|-----------|
| Were there any questions that you did not feel comfortable answering?   | No                     | 8         |
| How seriously did you answer the questions on this survey?              | Very Seriously         | 8         |
| How honestly did you answer the questions on this survey?               | Very Honestly          | 8         |
| What did you think about the length of the survey?                      | About right            | 8         |
| Were there any questions that you didn’t understand?                    | Yes                    | 1         |
|                                                                          | No                     | 7         |
| How comfortable did you feel answering questions on this survey?        | Very uncomfortable     | 5         |
|                                                                          | Somewhat uncomfortable | 1         |
|                                                                          | Somewhat comfortable   | 2         |



Analyses of the pilot test data also indicated no significant technical problems with the survey instrument. No questions had unexplained missing data; there were no outlier values; all response options were labeled correctly; and all skip patterns functioned correctly. These findings suggest that there were no logic or non-response problems with the survey, that respondents were routed appropriately through the survey based on their answers, and that the data were accurately recorded. We also separately analyzed each question that allowed verbatim responses to check whether the specified response options adequately covered the answers participants could give. Analysis of the verbatim data indicated that verbatim responses were generally unnecessary, as participants provided answers already available in the pre-coded lists specified in the survey.
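Checks of this kind can be sketched as follows; the variable names and the single skip rule shown are hypothetical examples, not items from the actual instrument.

```python
# Sketch of the data-quality checks described above: missing data,
# out-of-range values, and skip-pattern logic. Variable names and the
# skip rule are hypothetical.

def check_record(rec):
    problems = []
    # A core gate item should never be blank.
    if rec.get("talked_about_sex") is None:
        problems.append("missing: talked_about_sex")
    freq = rec.get("talk_frequency")
    # Range check on a categorical item coded 1-4 (often ... never).
    if freq is not None and freq not in (1, 2, 3, 4):
        problems.append("out of range: talk_frequency")
    # Skip pattern: frequency is asked only when the gate item is "yes".
    if rec.get("talked_about_sex") == "yes" and freq is None:
        problems.append("missing: talk_frequency (expected after 'yes')")
    if rec.get("talked_about_sex") == "no" and freq is not None:
        problems.append("skip violation: talk_frequency answered after 'no'")
    return problems

print(check_record({"talked_about_sex": "no", "talk_frequency": 2}))
# ["skip violation: talk_frequency answered after 'no'"]
```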

The pilot test also included respondent debriefings, conducted by telephone in August 2006, aimed at illuminating participants’ thought processes and further identifying areas of the survey that were unclear or difficult to understand. The pilot test instrument contained two questions assessing whether participants would be willing to participate in a follow-up telephone debriefing and, if so, which days and times they preferred. Knowledge Networks then contacted willing participants by telephone to schedule appointments. Two eligible participants were identified and interviewed. The debriefing consisted of a brief series of questions about the participants’ impressions of the survey: its ease of use, the sensitivity of the questions, its length, and any aspects that were difficult to understand.

The post-survey participant debriefings also indicated relatively few problems with the survey. Both debrief participants indicated that the survey was easily understood and did not contain any words or phrases that were unfamiliar to them. Each respondent also indicated that the survey instructions were always clear and there were never doubts about what to do in order to proceed through the survey. Neither of the debrief participants had any pre-formed thoughts about what type of organization or group was funding the survey. Each respondent also indicated that the survey was not overly long or burdensome and neither participant felt uncomfortable answering any of the questions (i.e., none of the survey questions were too sensitive for them).

Based on the findings of the pilot test, the survey appears to function as intended and is not overly burdensome, sensitive, or difficult to understand. Therefore, no substantive revisions were made to the survey instrument as a result of pilot testing.

B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The agency official responsible for receiving and approving contract deliverables is:

Allison Roper

240-453-2806

[email protected]

Office of Population Affairs/DHHS

1101 Wootton Parkway, Suite 700

Rockville, MD 20852


The persons who designed the data collection are:

W. Douglas Evans, PhD

202-416-0496

[email protected]

The George Washington University

School of Public Health and Health Services

2175 K Street, NW, Suite 700

Washington, DC 20037


Kevin C. Davis, MA

919-541-5801

[email protected]

RTI International

3040 Cornwallis Rd

Research Triangle Park, NC 27709


The person who will collect the data is:

J. Michael Dennis, PhD

650-289-2000

[email protected]

Knowledge Networks, Inc.

1350 Willow Road, Suite 102

Menlo Park, CA 94025



The person who will analyze the data is:

Jonathan Blitstein, PhD

919-541-7313

[email protected]

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

References

Abma, J. C., Martinez, G. M., Mosher, W. D., & Dawson, B. S. (2004). Teenagers in the United States: Sexual activity, contraceptive use, and childbearing, 2002. Hyattsville, MD: National Center for Health Statistics.

Abreu, D. A., & Winters, F. (1999). Using monetary incentives to reduce attrition in the Survey of Income and Program Participation. Proceedings of the Survey Research Methods Section of the American Statistical Association.

Albert, B., Lippman, L., Franzetta, K., Ikramullah, E., Keith, J. D., Shwalb, R., et al. (2005). Freeze frame: A snapshot of America’s teens. Washington, DC: National Campaign to Prevent Teen Pregnancy.

DuRant, R. H., Wolfson, M., LaFrance, B., Balkrishnan, R., & Altman, D. (2006). An evaluation of a mass media campaign to encourage parents of adolescents to talk to their children about sex. Journal of Adolescent Health, 38, 298.e1-298.e9.

Klein, J. D., Sabaratnam, P., Pazos, B., Auerbach, M. M., Havens, C. G., & Brach, M. J. (2005). Evaluation of the Parents as Primary Sexuality Educators program. Journal of Adolescent Health, 37, S94-S99.

Maynard, R., Trenholm, C., Devaney, B., Johnson, A., Clark, M., Homrighausen, J., et al. (2005). First-year impacts of four Title V, Section 510 abstinence education programs. Princeton, NJ: Mathematica Policy Research, Inc.

O’Rourke, D., Chapa-Resendez, G., Hamilton, L., Lind, K., Owens, L., & Parker, V. (1998). An inquiry into declining RDD response rates part I: Telephone survey practices. Survey Research, 29, 1-16.

Shettle, C., & Mooney, G. (1999). Monetary incentives in U.S. government surveys. Journal of Official Statistics, 15, 231-250.

The National Campaign to Prevent Teen Pregnancy. (2003). With one voice 2003: America’s adults and teens sound off about teen pregnancy. Washington, DC: The National Campaign to Prevent Teen Pregnancy.

Wills, T. A., Sandy, J. M., Yaeger, A., & Shinar, O. (2001). Family risk factors and adolescent substance use: Moderation effects for temperament dimensions. Developmental Psychology, 37, 283-297.

