Women’s Preventive Health Services Survey (WPHSS)
Supporting Statement – Section B
Submitted: February 6, 2017
Program Official/Project Officer
Jacqueline Miller, MD
Medical Officer
Division of Cancer Prevention and Control
National Center for Chronic Disease Prevention and Health Promotion
Centers for Disease Control and Prevention
4770 Buford Highway, Mailstop F-76
Atlanta, GA 30341
Phone: 770-488-5061
Fax: 770-488-3230
E-mail: [email protected]
Section B – Collections of Information Employing Statistical Methods
1. Respondent Universe and Sampling Methods
2. Procedures for the Collection of Information
3. Methods to Maximize Response Rates and Deal with Nonresponse
4. Test of Procedures or Methods to be Undertaken
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Attachment 1. Breast and Cervical Cancer Mortality Prevention Act of 1990
Attachment 2. 60-Day FRN
Attachment 3. Survey instrument with consent (English)
Attachment 4. Initial Invitation letter (Bilingual)
Attachment 5. Web instructions (Bilingual)
Attachment 6. Survey screen shots (English)
Attachment 7. Survey screen shots (Spanish)
Attachment 8. Phone script for non-responders
Attachment 9. Reminder postcard (Bilingual)
Attachment 10. Final invitation letter (Bilingual)
Attachment 11. Letter for states to share contact information
Attachment 12. Phone script for states to share contact information
Attachment 13. Mid-year thank you letter (Bilingual)
Attachment 14. Year 2 Invitation letter (Bilingual)
Attachment 15. Year 3 Invitation letter (Bilingual)
Attachment 16. NORC IRB approval
Attachment 17. Sampling instructions
The respondent universe for the WPHSS is uninsured or underinsured women who were previously screened through the National Breast and Cervical Cancer Early Detection Program (NBCCEDP) but now have health insurance coverage. The NBCCEDP serves women at or below 250% of the federal poverty level. Women aged 40-64 are eligible for breast cancer screening; the program may also serve women under age 40 who have breast symptoms. Women aged 21-64 are eligible for cervical cancer screening. During the past five years, the NBCCEDP has served more than 1.6 million women across the U.S., with approximately 900,000 receiving Pap testing and 1 million receiving mammograms. The demographic distribution of women served is approximately 40% white, 15% black, 30% Hispanic, 4% American Indian/Alaska Native, 7% Asian/Pacific Islander, and 3% other.
To be eligible for the study, women must meet the following eligibility criteria:
Previously received screenings through NBCCEDP
English or Spanish speaking
Age 30-62
Additionally, women must meet one of the following screening criteria:
Received a Pap test through an NBCCEDP grantee not less than 1 year but not more than 4 years before study implementation, OR
Received a Pap/HPV co-test through an NBCCEDP grantee not less than 3 years but not more than 5 years before study implementation, OR
Received a mammogram through an NBCCEDP grantee not less than 1 year but not more than 3 years before study implementation.
To identify eligible women, NORC at the University of Chicago will work with CDC and the six states that have agreed to participate in this study: Maryland, Mississippi, North Carolina, Nebraska, West Virginia, and Washington. These states were selected based on their size, the diversity of the populations that they serve, and their ability to share key contact information with NORC for the study. Seven other states (California, Texas, Wisconsin, Florida, New Mexico, New York, and South Dakota) were contacted about participating but declined for various reasons, including an inability to share contact information, a lack of available staffing in the state program, participation in another study, and concerns about the survey being administered during a gubernatorial election. NORC has reached out to the participating states to provide an overview of the study, discuss logistics, and identify potential challenges, such as staffing availability once the study begins. These programs will identify potentially eligible women from their databases and obtain the women's consent to share their contact information with NORC.
States will select at random women who meet the minimum eligibility and screening criteria from their program databases. The states will then reach out to these women individually via telephone or a mailed letter to obtain their consent to share their information with NORC. Some states have requested telephone contact and others mail contact, based on prior experience with their patient populations. Each state will provide NORC a list of women who have agreed to share their contact information and to be contacted by NORC for the study.
NORC will employ a multi-mode approach to data collection, using computer-assisted web interviewing (CAWI) and computer-assisted telephone interviewing (CATI). First, women will be contacted to complete an on-line survey about the preventive health services they have received (Attachment 3). Women will receive an invitation letter to participate in the study through an on-line survey (Attachment 4) with web instructions (Attachment 5). At the first step of the on-line survey, women will be screened to determine whether they have enrolled in an insurance program (Attachment 3). Only those who currently have insurance will be eligible to continue with the main survey instrument in English or Spanish (Attachments 6 and 7).
Approximately two weeks after the invitation letter is mailed, women who have not completed the web-based survey or refused participation will receive a reminder call and an offer to complete the survey using the CATI instrument (Attachment 8). Women who have not completed the survey over the subsequent eight weeks will be sent a reminder postcard (Attachment 9). This will be followed two weeks later by a final invitation letter containing the original materials (Attachment 10).
The goal of the sample design is to achieve 1,500 completed surveys in Year 3, which would yield an expected margin of error of +/- 2 percentage points for an estimate of 15 percent; this corresponds to 250 completed interviews per state. This goal is expected to support analysis by socio-demographic groups such as state, age, race, Hispanic ethnicity, income, and preventive screening type (cervical/breast). The targeted sample numbers at each stage are based on the Year 3 goals. To achieve the Year 3 targeted completes, NORC has estimated the response rates at the different stages of contact (rate of agreement to participate in the survey, screener completion rate, eligibility rate, interview completion rate for Year 1, and year-to-year retention for Years 2 and 3 of the survey) based on the literature and past experience with survey administration. Table 1 below shows the assumed rates and associated sample sizes for the data collection across all states combined, as well as for each state individually. We assume a year-to-year retention rate of 67.5%, both from Year 1 to Year 2 and from Year 2 to Year 3. To achieve the target of 1,500 women with completed surveys in Year 3, we will need 2,222 women to complete the survey in Year 2 and 3,292 women to complete the survey in Year 1.
To achieve the target for Year 1, we must determine the initial number of women to sample, accounting for the rate at which women can be contacted, the rate at which they complete the screener questions that determine eligibility, the actual eligibility rate, and the interview completion rate. Based on NORC's past experience with surveys using targeted lists of eligible people, we assume an initial contact rate of 80% (see Table 1). Of the women we are able to reach, we assume 85% will complete the screener questions. Of those who complete the screener, we estimate 40% will meet our eligibility criteria. Of those who are eligible, we assume 85% will complete the full survey. Therefore, to achieve our Year 1 target, we would need a sample of 14,240 women, of whom we expect to contact 80% (11,392); of whom 85% (9,683) will complete the screener; of whom 40% (3,873) will be eligible for the survey; of whom 85% (3,292) will complete the Year 1 interview. We will sample these 14,240 women across the six states equally and expect approximately the same response rates by state.
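As a cross-check on the figures above, the cascade can be reproduced by applying the assumed rates in sequence. The short Python sketch below is illustrative only (the stage labels are ours, not terms from the survey systems), rounding to whole women at each stage:

```python
# Assumed rates, applied in sequence to the initial sample of 14,240 women.
rates = [
    ("Contact rate", 0.80),
    ("Screener completion rate", 0.85),
    ("Eligibility rate", 0.40),
    ("Year 1 interview completion rate", 0.85),
    ("Year 2 retention rate", 0.675),
    ("Year 3 retention rate", 0.675),
]

n = 14240  # initial sample across all six states combined
for stage, rate in rates:
    n = round(n * rate)  # round to whole women at each stage
    print(f"{stage}: {n:,}")
# Contact rate: 11,392
# Screener completion rate: 9,683
# Eligibility rate: 3,873
# Year 1 interview completion rate: 3,292
# Year 2 retention rate: 2,222
# Year 3 retention rate: 1,500
```

Each stage's count matches the totals cited in the text, ending at the Year 3 goal of 1,500 completed surveys.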
Table 1. Response Rates and Associated Sample Sizes

Stage | Rate | Sample Size Across All States | Sample Size by State
Initial Sample from States | -- | 14,240 | 2,373
Contact Rate | 80% | 11,392 | 1,899
Screener Completion Rate | 85% | 9,683 | 1,614
Eligibility Rate | 40% | 3,873 | 646
Year 1 Interview Completion Rate | 85% | 3,292 | 549
Year 2 Retention Rate | 67.5% | 2,222 | 370
Year 3 Retention Rate | 67.5% | 1,500 | 250
Table 2 shows the expected survey completes in total for the baseline survey at the time of recruitment and for the two follow-up years for the recruited panel, along with the associated margins of error. The first row indicates the number of women expected to complete the screener. Because the screener itself asks a set of substantive questions, we have also included the margin of error for women who are screened in Year 1.
Table 2. Survey Completes by Year

 | Year 1 Baseline/Panel Recruitment | MOE* (Percentage Points) | Year 2 Survey | MOE (Percentage Points) | Year 3 Survey | MOE (Percentage Points) | Total Across Years | MOE (Percentage Points)
Total Screener Completes Across States | 9,683 | +/- 0.7 | NA | NA | NA | NA | 9,683 | +/- 0.7
Total Completes Across States | 3,292 | +/- 1.3 | 2,222 | +/- 1.6 | 1,500 | +/- 2.0 | 7,014 | NA
*The margin of error (MOE) is calculated for an estimate of p=15% at the 0.05 significance level. A design effect of 1.2 is assumed due to differential sampling and weighting.
Again, based on our Year 3 goal of 1,500 completed surveys, the target number of completes across all participating states is approximately 3,292 for the first year. Given this sample size, an estimate of p=15% (e.g., the percent of women who have no co-pay for cancer screening) from 3,292 interview completes will have a margin of error (MOE) of plus or minus 1.3 percentage points.
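These margins of error follow the standard formula MOE = z * sqrt(deff * p(1-p)/n). The Python sketch below is a check on the arithmetic, not project code; it reproduces the Year 1 through Year 3 values in Table 2, and the screener figure of +/- 0.7 appears to correspond to the same formula with no design effect applied:

```python
import math

def moe(n, p=0.15, deff=1.2, z=1.96):
    """Margin of error, in percentage points, for an estimated
    proportion p from n completes with design effect deff,
    at the 0.05 significance level (z = 1.96)."""
    return 100 * z * math.sqrt(deff * p * (1 - p) / n)

for n in (3292, 2222, 1500):
    print(f"n = {n}: +/- {moe(n):.1f} percentage points")
# n = 3292: +/- 1.3 percentage points
# n = 2222: +/- 1.6 percentage points
# n = 1500: +/- 2.0 percentage points
```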
The participating states will need to make initial contact with women meeting the study criteria and gain consent for sharing their contact information with NORC for possible participation in the study. This initial outreach will be conducted by either phone or mail (Attachment 11 and 12).
NORC has developed the methodology for the sample frame that can be implemented by each state (Attachment 17). NORC will work with the states to create an accurate sampling frame. From this sampling frame, a stratified simple random sample will be selected. Depending on the state demographic distribution of the program population, a stratified sample design may or may not be needed. Some states may have higher proportions of minority populations of interest (e.g., Hispanic population), and may require high sampling rates for those populations which would result in a stratified sampling design.
Two domains of interest are race/ethnicity and rural residence. Many states serve a strong mix of racial/ethnic groups, and a simple random sample should obtain a sufficient mix across this domain. In states where the women served in rural areas exceed 5.0% of the program population or 500 women, a stratified sample will be implemented with two strata: women served in rural areas and women served in other areas.
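The rural stratification rule can be stated as a simple threshold check. The function below is an illustrative sketch (its name and inputs are ours, not taken from the sampling instructions in Attachment 17):

```python
def needs_rural_stratum(n_rural, n_total):
    """Return True when a state's frame should be split into two strata
    (women served in rural areas vs. women served in other areas), per
    the rule: rural women exceed 5.0% of the program population, or
    exceed 500 women."""
    return n_rural / n_total > 0.05 or n_rural > 500

# 400 rural women out of 6,000 served is 6.7% -> stratify
print(needs_rural_stratum(400, 6000))   # True
# 100 rural women out of 5,000 served is 2.0% -> simple random sample
print(needs_rural_stratum(100, 5000))   # False
```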
As shown in tables 1 and 2, NORC expects nonresponse at different stages of the data collection process. Weighting will be implemented to account for the nonresponse encountered. Analyses will be conducted to determine the best procedures to employ for weighting.
NORC will perform all data collection activities for this project. Employing a multi-mode approach, NORC will collect data using computer-assisted web interviewing (CAWI) and computer-assisted telephone interviewing (CATI), with the primary emphasis on questionnaire completion via a secure web-based survey instrument. The integrated web/CATI system will allow a respondent to begin in one mode but complete in another without sacrificing data quality.
Working with both CDC and NORC, the states will contact women who are eligible for this survey, based on the criteria noted above, and ask for their agreement to have their contact information shared for the study (Attachments 11 and 12). Each woman's agreement is required before NORC can contact her for survey implementation. After the state outreach, a list will be developed containing the women's names, contact information, and screening information. In Year 1 of data collection, NORC will begin by mailing an invitation letter to each woman on the final list requesting her participation (Attachment 4). The invitation letter will contain the web link for the survey, the respondent's unique Personal Identification Number (PIN), and instructions that describe how to access the web survey. Respondents will first answer a short set of screener questions to confirm eligibility (Attachment 3). Those who are eligible will proceed to the main survey instrument (Attachments 6 and 7).
Also included with the invitation letter is a promise of a $10 post-survey award, in the form of a prepaid gift card, upon completion of the survey by eligible respondents. While this initial letter will invite respondents to complete the survey over the web, respondents may also call the study's toll-free line at any time to complete the survey over the phone.
Informed Consent. Prior to participating in the study, all respondents must provide their consent to participate (Attachments 6 and 7). The consent will be programmed in both modes of the survey and is critical to ensuring the protection of respondents' rights as research participants, answering any questions they may have about the study and their participation in it, and informing respondents of the voluntary nature of participation. Once consent is obtained, the survey will proceed. Throughout all interactions with respondents, NORC will communicate the purpose of the study and thoroughly explain the expectations for participation.
Telephone data collection. NORC data collection specialists will begin contacting non-respondents via telephone approximately two weeks after the web invitation mailing. This contact will serve to establish personal contact, verify receipt of the invitation letter, answer questions, and offer respondents the option to complete the survey over the phone (Attachment 8). NORC anticipates telephone data collection to span eight weeks.
Prior to the start of telephone dialing, telephone interviewers will undergo project-specific training. The training will inform the telephone interviewers about the scope of the project, any challenges they may encounter, and strategies to overcome these challenges. The training will also emphasize gaining cooperation techniques, including refusal conversion.
The case management system used for this project will deliver cases automatically to the telephone interviewers based on a pre-determined set of calling rules. Multiple call attempts will be made to women over the course of data collection, with attempts made at various times of the day. The interviewers can also set an appointment with a respondent to begin the survey at a time that is convenient for her.
Final invitation mailings. For any sampled respondents who have not completed the survey after the eight-week telephone period, a reminder postcard will be sent (Attachment 9). This will be followed two weeks later with a final invitation letter that will contain the original mail materials and a note that data collection will be ending soon (Attachment 10). Respondents will be allowed two final weeks to complete the survey via web before the round is considered closed.
Included within all mailings (invitation letter and final mailings) will be a toll-free number and e-mail address so that respondents may contact the project with questions or for assistance.
Year 2 will follow procedures similar to Year 1. Prior to the start of Year 2, NORC and CDC will meet to review the Year 1 protocol and questionnaire and decide whether any modifications to the collection methodology are necessary. For example, CDC and NORC will review response rates by mode to determine whether it would be beneficial to start subsequent years of data collection with CATI rather than CAWI. CDC will seek OMB approval through a change request for any modifications. NORC will mail an interim letter to participating women. This letter will differ slightly from the Year 1 letter in that it will thank the respondent for her previous participation (Attachment 13). In Year 2, NORC will send an invitation letter for the follow-up survey that includes a promise of an additional $15 post-survey award upon completion of the survey (Attachment 14).
Year 3 Data Collection Approach
In the final year of data collection, NORC will follow the same procedures as in Years 1 and 2. Materials will reference that this is the final round of the study and encourage respondents to complete the survey to ensure that the results are as meaningful as possible by retaining as many original participants as possible. In Year 3, the invitational letter will include a promise of an additional $20 upon completion of the survey (Attachment 15).
Table 3 below outlines the data collection schedule for the project.
Table 3. Data Collection Schedule

[Gantt-style schedule table spanning Weeks 1-24, with rows marking the timing of: invitation letter mailed, telephone reminder, reminder postcard, final invitation letter, and data collection.]
NORC will utilize the following procedures to maximize cooperation and participation in the study.
Incentives
In Year 1 of data collection, respondents will receive an invitation letter requesting their participation in the study (Attachment 4). The invitation letter will contain a promise of a $10 gift card upon completion of the survey. Years 2 and 3 will follow a similar post-award model. In Year 2, the post-award amount will increase to $15. Year 3 will utilize a $20 post-award.
Considerable research has been conducted on the effectiveness of incentives to increase response in research studies. A literature review of the varying techniques researchers used to increase response rate found that studies that utilized an incentive – whether prepaid or promised – had increased response compared with studies that did not offer incentives1.
Other studies and meta-analyses of existing research support the use of incentives to elicit response. Church (1993) noted an increased response when incentives were promised to respondents upon completion2. In a review of 292 studies, Edwards et al. (2002) found that questionnaire response doubled when monetary incentives were offered to respondents3.
Offering incentives to respondents also aids respondent retention. NORC at the University of Chicago conducts a large, national longitudinal survey that provides monetary incentives post-interview and experiences high retention rates between waves of data collection. Because women in the WPHSS will be surveyed once a year over a three-year period, respondent retention is important. Göritz, Wolff, and Goldstein (2008) found that multi-year surveys offering monetary incentives retained more respondents year to year than studies that did not offer a monetary incentive4. Therefore, this study will offer increasing incentive amounts in each year of data collection to maintain respondent participation.
Multimode data collection
Providing respondents the option to complete the survey online via the web instrument or over the phone allows the respondent to complete the survey in a mode that is most convenient and comfortable for her. Each mode will be offered in English and Spanish, which also maximizes participation.
The web and CATI instruments will be programmed with all requisite skip logic, which reduces overall respondent burden.
Strategic Mailings/Prompts
As mentioned previously, participants will be contacted via mail first, with a follow-up phone call two weeks later. Cases will be delivered to trained telephone interviewers for dialing according to a pre-determined set of calling rules. These calling rules will be structured so the respondents are being called at various times of the day, without being a nuisance. CATI supervisors will monitor and review contact attempts to ensure cases are not being over-dialed or under-dialed.
A reminder postcard and final invitation letter will be sent to non-respondents during the final weeks of data collection to elicit response.
Locating and Tracking Respondents
A small percentage of respondents will require locating, prompted by the return of advance materials or a telephone contact attempt that terminates with a non-working number. NORC is prepared to implement the locating efforts routinely used on similar projects. NORC uses a wide range of tools, such as directory assistance, national change-of-address databases, national credit bureau databases, and Targus, a computerized locating service that uses a large database of information to match addresses and telephone numbers. We also use Accurint®, a widely accepted locate-and-research tool available to government, law enforcement, and commercial customers.
During each survey round, NORC will ask the respondent to confirm or update her contact information. Collecting this information at the time of the interview should mitigate the need for increased locating in future rounds.
Between Round Reminder Letter
Because respondents from Year 1 will be interviewed two more times over the subsequent two years, it is important to keep them engaged between rounds so they remain committed to the project and ready to respond when contacted again. To maintain this engagement, NORC will mail a letter to Year 1 respondents thanking them for their participation and letting them know the next round will begin soon (Attachment 13). In addition to reminding respondents of their importance to the study and nurturing a sense of belonging, these mailings will be marked "address service requested" so that the Post Office will forward mail to new locations for respondents who have moved and will notify NORC of the new address.
Technical Assistance
NORC will provide a toll-free number to all sampled individuals that they can use to contact NORC with any questions or concerns about the study. NORC will also provide an e-mail address that respondents can use for this same reason.
NORC will work with project staff at CDC to address any concerns that may arise and to resolve any barriers to participation.
While every effort will be made to obtain a completed interview from every woman in our selected sample, our response rate may be less than 80%, especially when year-to-year attrition is taken into account. If our interview completion rate is less than 80% at any stage, a nonresponse bias analysis will be conducted. Using the data available from the states, demographics from the initial sampling frame will be compared to those of the completed interviews (Year 1, Year 2, Year 3). These demographic comparisons will be made on both unweighted and weighted data, as weighting may account for some differences seen in the unweighted data. This analysis will reveal any potential nonresponse bias that might be introduced into the final results.
Non-Response Bias
NORC will review data collection efforts and determine if there are any differential response rates between demographic groups (e.g., race/ethnicity). If there are noticeable differences in response rates between domains of interest, targeted outreach and survey follow-up may be implemented to increase response rates as needed. Continued follow-up is a standard protocol for our telephone interview process. At the end of each year of the survey, NORC will review the differences in demographics between the respondents and nonrespondents and determine if any adjustments in the data collection protocol should be made (e.g., increase the number of call-backs to women).
Cognitive interviewing is a process for pre-testing survey instruments such that problematic questions may be identified and adjusted before fielding the questionnaire to the full sample. Both the English language and Spanish language versions of the instrument were cognitively tested to assess the clarity of questions and to ensure respondents could provide answers in the way the survey intended.
Cognitive Test of the English Language Instrument
CDC contracted with SciMetrika, LLC5, which conducted a pre-test of the English language instrument with nine low-income women from central North Carolina counties. Participants in the pre-test received a $50 gift card incentive for their participation in the cognitive interview.
All interviews were completed in person with a trained cognitive interviewer. Respondents read and responded to each question, providing a rating of clarity (clear or unclear) and difficulty in answering. If a respondent indicated that a question was not clear or difficult to answer, the interviewer followed up with additional questions to probe what made the question unclear or difficult to answer. Questions that were identified as unclear or difficult to answer were reviewed and revised, as approved by CDC.
Cognitive Test of the Spanish Language Instrument
NORC conducted a pre-test of the Spanish instrument. The primary goal of the pre-test was to ensure that Spanish-speaking respondents understood the Spanish language items in the same manner that English-speaking respondents understood the corresponding English language survey items. NORC also assessed equivalency of terms and concepts across English and Spanish.
Nine women participated in the pre-test of the Spanish language instrument. The respondents ranged in age from 27 to 56 and represented a variety of Hispanic origins. All interviews were conducted in person. Respondents were compensated $40 for participating in the pre-test. Trained cognitive interviewers who were also native Spanish speakers conducted the interview.
The pre-test followed a two-question format for each item to determine which items required further probing for the Spanish-language participants: each respondent completed the survey and was asked whether each item was clear or unclear and whether it was easy or difficult to answer. Any items rated as unclear or difficult to answer were probed to ascertain what made the item unclear or difficult. The cognitive interviewers probed not only item content and vocabulary, but also clarity of translation and issues related to syntax or cultural inequivalence.
Based on the results of the pre-test, translations for several items were updated to more common, recognizable terms or phrases. In addition, the term for "Refused" was rendered as "No Contestó" on the Spanish instrument.
The following individuals inside the agency have been consulted on the design and statistical aspects of this information collection as well as plans for data analysis:
Amy DeGroff, PhD, MPH
Program Evaluator
Division of Cancer Prevention and Control
National Center for Chronic Disease Prevention and Health Promotion
Centers for Disease Control and Prevention
770-488-2415
Chunyu Li, PhD
Health Economist
Division of Cancer Prevention and Control
National Center for Chronic Disease Prevention and Health Promotion
Centers for Disease Control and Prevention
770-488-4866
Ketra Rice, PhD, MS
Prevention Effectiveness Fellow
Division of Cancer Prevention and Control
National Center for Chronic Disease Prevention and Health Promotion
Centers for Disease Control and Prevention
770-488-4241
Lindsay Gressard, MEd, MPH
Presidential Management Fellow
Division of Cancer Prevention and Control
National Center for Chronic Disease Prevention and Health Promotion
Centers for Disease Control and Prevention
770-488-3111
The following individuals outside of the agency have been consulted on the questionnaire development, statistical aspects of the design, and plans for data analysis:
Shawn Hirsch, MPH
Statistician/Project Manager
SciMetrika, LLC
919-354-5266
Additionally, the table below lists the NBCCEDP grantees who participated in discussions of study logistics for identifying and contacting previous NBCCEDP clients and in reviewing the questionnaire.
Table 4. State NBCCEDP representatives
Name | State
Cheley Grigsby | Alaska
Jennifer Roberts | Alaska
Emily Wozniak | Arizona
Beverly Sato | California
Monica Brown | California
Shannon Lawrence | Colorado
Dawn Henninger | Maryland
Leah Merchant | Montana
Lisa Troyer | Montana
Heather LeBlanc | New York
Heather Dacus | New York
Maggie Gates | New York
Paulette DeLeonardo | North Dakota
Kristin Kane | Oregon
Karen Cudmore | South Dakota
Travis Duke | Texas
Gale Johnson | Wisconsin
Carol A. Blanks | Connecticut
Hope Wood | Florida
Melody Stafford | Kentucky
E.J. Siegl | Michigan
Melissa D. Leypoldt | Nebraska
Libby Bruggeman | New Mexico
Debi Nelson | North Carolina
Brenda Di Paolo | Rhode Island
GeorgeAnn Grubb | West Virginia
Nicole Lukas | Vermont
Megan Celedonia | Washington
The following individuals will conduct data collection and analysis activities:
Michael Meit, MA, MPH
Project Director
NORC at the University of Chicago
301-634-9324
Stephanie Poland, MA
Survey Director
NORC at the University of Chicago
312-759-4261
Elizabeth Ormson
Statistician
NORC at the University of Chicago
301-634-9475
Megan Heffernan, MPH
Research Analyst
NORC at the University of Chicago
301-634-9412
1 Yu, J. & Cooper, H. (1983). A quantitative review of research design effects on response rates to questionnaires. Journal of Marketing Research, 20, 36-44.
2 Church, A.H. (1993). Estimating the effect of incentives on mail survey response rates: A meta-analysis. Public Opinion Quarterly, 57(1), 62-79.
3 Edwards, P., Roberts, I., Clarke, M., DiGuiseppi, C., Pratap, S., Wentz, R., & Kwan, I. (2002). Increasing response rates to postal questionnaires: Systematic review. British Medical Journal, 324(7347), 1183-1185.
4 Göritz, A.S., Wolff, H.-G., & Goldstein, D.G. (2008). Individual payments as a longer-term incentive in online panels. Behavior Research Methods, 40(4), 1144-1149.
5 SciMetrika conducted this work under Contract #200-2008-27889, Task Order 0028.