Supporting Statement B VRS Survey


Peace Corps Conversion Loss Survey

OMB: 0420-0544

B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Respondent Universe and Sampling Methods
An online survey will be conducted among 1,200 Peace Corps applicants and returned Peace Corps
Volunteers, with 300 from each of the following segments:
1) Inquire – complete an initial inquiry but do not begin an application within 12 months
2) Begin application – begin an application but do not submit it or move it forward
3) Submit complete application – submit a complete application but then elect not to proceed, either
by stopping communication or by actively withdrawing during the review process
4) Returned Peace Corps Volunteers – completed Peace Corps service in the past two years
Including returned Peace Corps Volunteers in the study will provide information about what is working
in the application process and will help guide strategies for correcting the conversion loss. To obtain
1,200 completed responses (900 Peace Corps applicants and 300 returned Peace Corps Volunteers), we
plan to use a sample ratio of 25:1 for applicants and 10:1 for returned Volunteers. At least 20 percent of
three of the four lists (we do not have demographic information for the Inquire list) will be non-white, to
ensure ethnic representation mirroring the current diversity of the Peace Corps' application pool. A
sample of this group will be selected to receive invitations to the survey.
While there is no recent survey of returned Peace Corps Volunteers on which to base the expected
response rate, previous surveys of the Peace Corps Volunteer population can serve as a guide. The
most recent survey of returned Peace Corps Volunteers reported a 54 percent response rate (Juanita
Graul, Survey of Returned Peace Corps Volunteers, December 1996, Peace Corps, Washington, D.C.).
The 2008 and 2009 surveys of active Peace Corps Volunteers reported 60 percent and 71 percent
response rates, respectively (Peace Corps 2008 Volunteer Survey: Global Report, Peace Corps Office of
Strategic Information, Research and Planning; Peace Corps 2009 Volunteer Survey: Global Report, Peace
Corps Office of Strategic Information, Research and Planning, in progress). We therefore propose a list
ratio of 10:1 for returned Volunteers. Because we do not have historical information on the expected
response rate of applicants, we propose a list ratio of 25:1 to ensure completion of the study.
Over the course of the study, a dynamic segmentation will be created of those who opt out, based on
their level of participation in the application process. Specifically, the conversion loss segmentation
study will explore candidate demographics, Peace Corps factors (characteristics of the Peace Corps that
influence drop-off), and Peace Corps recruitment factors (specific activities in the recruitment process
that influence opting out).
We will draw sample for the study from three databases to cover the four audience segments of this
study.
The databases are:
1) Inquiry database – 110,000 names available within a one-year window for those who inquire but do
not apply
2) Application database – 75,000 names available within a one-year window for those who start an
application
3) Returned Volunteer database – 7,000 names available within a two-year window
The samples are:
1) 7,500 names of those who complete an inquiry but do not begin an application, with a one-year
window (drawn from the inquiry database)
2) 7,500 names of those who begin an application but do not submit a completed one, with a one-year
window (drawn from the application database)
3) 7,500 names of those who submit an application but then elect not to proceed (drawn from the
application database)
4) 3,000 names of returned Volunteers, with a two-year window (drawn from the returned Volunteer
database)
Please note that because some participants move from the inquiry database to the application
database, there is some overlap between the lists. However, we will pull sample within a more targeted
time frame to increase the relevance of the study to respondents, to collect more timely and relevant
data, and to increase response rates.
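The list sizes above follow directly from the target completes and the list ratios. A minimal arithmetic sketch, using only figures stated in this document (segment labels are shorthand, not database field names):

```python
# Sampling arithmetic from this document: 300 completed surveys per segment,
# with list ratios of 25:1 (applicants) and 10:1 (returned Volunteers).
TARGET_PER_SEGMENT = 300
APPLICANT_RATIO = 25
RPCV_RATIO = 10

invites = {
    "inquire": TARGET_PER_SEGMENT * APPLICANT_RATIO,             # 7,500
    "begin_application": TARGET_PER_SEGMENT * APPLICANT_RATIO,   # 7,500
    "submit_application": TARGET_PER_SEGMENT * APPLICANT_RATIO,  # 7,500
    "returned_volunteers": TARGET_PER_SEGMENT * RPCV_RATIO,      # 3,000
}

total_invites = sum(invites.values())      # 25,500 names pulled in total
total_completes = 4 * TARGET_PER_SEGMENT   # 1,200 completed responses
```

Note that all list sizes fall well within the available database counts (110,000 inquiries, 75,000 applications, 7,000 returned Volunteers).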
2. Procedures for the Collection of Information
A 20-minute survey is recommended to cover the full range of individual, Peace Corps, and recruitment
process characteristics. A 20-minute survey will accommodate roughly 40 questions. The survey will be
completed in English. The survey will include demographic questions, responses to which will always be
viewed in aggregate and never on an individual basis (see the attached survey and an example screen
shot of a programmed grid question).
Peace Corps will work with an outside vendor (Ogilvy Public Relations Worldwide) to conduct the survey.
As detailed above, the sample for this study will be provided by Peace Corps, including participant email
addresses. Beyond Peace Corps recruitment status, demographics, and email address, no personally
identifiable information will be included in the lists. Based on database frameworks, Peace Corps will
provide separate lists of inquirers, applicants, and returned Volunteers. The lists will initially include
names and email addresses; after the lists are de-duped using email addresses, names will be erased so
that the final lists used for the survey invitation will not include names. The lists are de-duped to ensure
respondents are contacted only once for inclusion in the survey. The lists will be provided in Excel
format. Following completion of the study, the electronic lists will be destroyed/deleted.
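As an illustration only (not the vendor's actual tooling), the de-duplication step described above, matching on email address and then dropping names, could be sketched as:

```python
# Hypothetical sketch: de-duplicate contact records by email address, then
# discard names so the final invitation list carries no names.
def dedupe_and_drop_names(records):
    """records: iterable of (name, email) pairs -> list of unique emails."""
    seen = set()
    emails = []
    for name, email in records:
        key = email.strip().lower()  # normalize so Jane@X.org == jane@x.org
        if key not in seen:
            seen.add(key)
            emails.append(key)       # the name is intentionally dropped here
    return emails

sample_records = [
    ("Jane Doe", "jane@example.org"),
    ("J. Doe", "Jane@Example.org"),  # duplicate email, different name form
    ("Sam Lee", "sam@example.org"),
]
print(dedupe_and_drop_names(sample_records))
# ['jane@example.org', 'sam@example.org']
```

Normalizing case before comparison matters here: the same respondent often appears with differently capitalized email addresses across the inquiry and application databases.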
Based on the conversion rate from inquiry to application, we recommend increasing the list ratio to 25:1
to ensure completion of the study. A lower ratio of 10:1 can be used for recently returned Volunteers.
We recommend that the sample frame for this study include the past one to two years of initial
applicants and returned Volunteers to provide an up-to-date assessment of the current recruitment
process.
This robust sample and design will provide the depth needed for the advanced analytics necessary to
isolate the factors that most strongly influence conversion loss and to ascertain the motivations that
drive conversion. Analytics employed may include segmentation, regression and correlation, predictive
modeling, mapping, and conversion profiling. In addition, a sample size of 1,200 will ensure a low margin
of error. Results will be tested at the 95 percent confidence level, a rigorous statistical standard: if the
study were repeated, the results would not be expected to fluctuate by more than 2.8 percentage points
in either direction for the total sample (n = 1,200), or 5.7 percentage points for the subgroups (n = 300).
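The quoted margins of error match the standard worst-case formula for a proportion, z·sqrt(p(1−p)/n), with z = 1.96 and p = 0.5. A quick check:

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Worst-case margin of error for a proportion at ~95% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(100 * margin_of_error(1200), 1))  # 2.8 percentage points (total)
print(round(100 * margin_of_error(300), 1))   # 5.7 percentage points (subgroup)
```

Setting p = 0.5 maximizes p(1−p), so these figures are an upper bound that holds for any survey proportion.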
3. Methods to Maximize Response Rates and Deal with Nonresponse
In order to generalize the results of the survey to the Peace Corps applicant population as a whole, we
will ensure that our sample selection contains approximately the same demographic percentages as the
applicant population. Our study design takes several measures to increase the likelihood of an adequate
response rate. Because more than 95 percent of Peace Corps applications are completed online, this
methodology will allow us to collect the most representative sample of the Peace Corps inquirer
universe. Further, this methodology will ensure quality and accurate data collection while providing the
greatest privacy to respondents and placing the least time burden on them. Survey deployment and
submission will be automated using an online electronic tool, which will allow respondents to provide
information from their own homes without the inconvenience of returning a paper survey through the
mail. We will contact only Peace Corps applicants and returned Peace Corps Volunteers who have been
active in the past two years, which should also maximize the response rate. Finally, because we plan to
use list ratios of 25:1 (applicants) and 10:1 (returned Volunteers), we anticipate successfully surveying
the total of 1,200 respondents.
We will take several steps to ensure the survey results represent the demographic profile of the Peace
Corps applicant and returned Volunteer pool. First, we will pull a random sample from each database.
For databases that have ethnicity tagged, we will oversample these targets. We will then re-ask the
demographic profile questions in the survey and track responses against the known demographic profile
matrix. We will monitor the sample while in the field and can make adjustments if the sampling is not
performing as needed; this could include sending reminders to male respondents or requesting
additional sample based on ethnicity. Finally, if needed, we will weight the data on the back end to
mirror the demographic profile matrix. For example, we know that 60 percent of Volunteers are female
and 40 percent are male, and that women respond to surveys more quickly than men. We will monitor
sample collection to ensure it mirrors the desired gender profile. If, at the close of the survey, the final
sample is 70 percent female and 30 percent male, we can weight the data to the appropriate
percentages.
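The back-end weighting described above amounts to dividing each group's population share by its sample share. A minimal sketch using the document's own figures (60/40 population profile, hypothetical 70/30 final sample):

```python
# Post-stratification weighting sketch: known population 60% female / 40% male,
# hypothetical final sample 70% / 30%.
population = {"female": 0.60, "male": 0.40}
final_sample = {"female": 0.70, "male": 0.30}

# Each respondent's weight = population share / sample share for their group.
weights = {g: population[g] / final_sample[g] for g in population}
# weights["female"] ~= 0.857, weights["male"] ~= 1.333

# Applying the weights restores the target profile (up to rounding).
weighted = {g: final_sample[g] * weights[g] for g in final_sample}
# weighted ~= {"female": 0.60, "male": 0.40}
```

The same calculation extends to the ethnicity cells of the demographic profile matrix, provided each cell has enough respondents for its weight to be stable.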

4. Tests of Procedures and Methods to be Undertaken
For content testing, Peace Corps evaluated draft questionnaires with Peace Corps recruitment staff,
returned Volunteers, and Millennials to test whether the questions were worded effectively for easy
understanding and phrased appropriately to gather the intended data. The results of this testing helped
shape the final questionnaire sent to OMB.
For operational testing, the survey will be launched as a slow start, meaning a limited sample is sent out
and data are collected. The data are then processed to ensure that respondents are moving accurately
through the survey and to evaluate the time it takes the various segments to complete it. If any issues
arise, the survey's operation will be revised and prepared for full fielding. Once these checks are
complete, the full field of the study will begin. If any changes are made to the survey during the slow
start, the data collected during that period will be discarded. In addition, both Peace Corps and Ogilvy
will pre-test the survey programming with in-house staff prior to approving it for the field. Further, our
research partner has reviewed the survey for length and comprehension.
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Peace Corps Staff:
Linda Isaac, (202) 692-2205
Dorothy Sales, (202) 692-2206
Ogilvy Staff:
Heidi D’Agostino, (212) 880-5428
Vickie Jones, (202) 729-4179
Nancy Accetta, (202) 729-4167
Note: Ogilvy will work with OnResearch in the execution of the quantitative survey. OnResearch
follows the CASRO Code of Standards and Ethics for Survey Research and the ICC/ESOMAR
International Code on Market and Social Research guidelines. OnResearch will maintain the respondent
lists, including email addresses, through the course of the research and will then destroy/delete the lists.

Online Grid Example: (see the attached screen shot of a programmed grid question)