Attachment G
2013 SDR Contacting Protocol Experiments Results

2013 Survey of Doctorate Recipients (SDR)
OMB: 3145-0020

Survey of Doctorate Recipients

FINAL REPORT

2013 Survey of Doctorate Recipients
Contacting Protocol Experiments Results

OCTOBER 31, 2014

PRESENTED TO:
Steve Proudfoot, COR
National Science Foundation
4201 Wilson Blvd, Room 965S
Arlington, VA 22230

PRESENTED BY:
Ipek Bilgen
Shana M. Brown
Evan Nielsen
Karen Grigorian
NORC at the University of Chicago
55 East Monroe Street
30th Floor
Chicago, IL 60603
(312) 759-4000
(312) 759-4004

This document was prepared by NORC at the University of Chicago for the National Science
Foundation (NSF) under Task Order Number NSFDACS1247266 of the NSF Contract Number
GS10F0033M. Please contact NORC for further information regarding this document.

Table of Contents

Section 1: SDR Data Collection Protocols and Introduction to the Contacting Protocol Experiments ......... 3
  1.1  Description of the Study and 2013 Data Collection ......... 3
  1.2  Data Collection Protocol ......... 3
  1.3  Contacting Protocol Experiments ......... 5
Section 2: "Email-Only" Experiment ......... 7
  2.1  Theoretical Justification and Hypothesis ......... 7
  2.2  Experimental Design and Sample Stratification ......... 8
  2.3  Contacting Protocol and Procedures ......... 9
  2.4  Analysis and Results ......... 9
Section 3: Green Appeal Experiment ......... 13
  3.1  Theoretical Justification and Hypothesis ......... 13
  3.2  Experimental Design and Sample Stratification ......... 13
  3.3  Contacting Protocol and Procedures ......... 14
  3.4  Analysis and Results ......... 15
Section 4: Start New Cohorts in Web Experiment ......... 21
  4.1  Theoretical Justification and Hypothesis ......... 21
  4.2  Experimental Design and Sample Stratification ......... 22
  4.3  Contacting Protocol and Procedures ......... 22
  4.4  Analysis and Results ......... 23
Section 5: Summary of Results ......... 30
  5.1  Summary of Results ......... 30
References ......... 32
Appendix A: Email Only Experiment Materials ......... A-1
Appendix B: Green Appeal Experiment Materials ......... B-1
Appendix C: Start New Cohort in Web Experiment Materials ......... C-1


Section 1: SDR Data Collection Protocols and Introduction to
the Contacting Protocol Experiments

1.1 Description of the Study and 2013 Data Collection

The Survey of Doctorate Recipients (SDR), sponsored by the National Science Foundation (NSF) and the
National Institutes of Health, gathers information from individuals who have obtained doctoral degrees
in the science, engineering, or health (SEH) fields from U.S. institutions. This panel study has been
conducted every two years since 1973 on a nationally representative cohort of SEH research doctorate
recipients, following sample members throughout their careers from year of doctorate degree award
until age 76. Each survey cycle, a sample of SEH doctoral degree earners from the two most recent academic
years is added to the SDR from another NSF-sponsored survey, the Survey of Earned Doctorates (SED).
As new sample members are added each round, a group of cases is statistically selected to be cut from
the sample. The SDR seeks responses from sample members regardless of employment status, citizenship
status, or where they live and work. The goal of the SDR is to provide
policymakers and researchers with high-quality data and analyses for making informed decisions related
to the educational achievement and career movement of doctoral scientists and engineers trained in
U.S. educational institutions.
The 2013 SDR data collection began in mid-February 2013, and active data collection ended in August
2013. While the data collection end date was not finalized at the time of experiment development, the
2013 data collection period turned out to be considerably shorter than any data collection period since
the 1997 cycle, and was 14 weeks shorter than the prior data collection period conducted for the 2010
cycle.

1.2 Data Collection Protocol

The SDR has been a fully tri-mode survey since 2006, offering the survey via a self-administered online
survey (Web), computer-assisted telephone interview (CATI), and self-administered mail questionnaire
(SAQ) sent via the U.S. Postal Service (USPS). The Web survey has become increasingly popular with
sample members, as the study targets a population that is technologically savvy and for whom Internet
access is generally readily available. Because many SDR sample members use computers and access the
Internet at home or work, they have established personal and professional email addresses, making
electronic outreach and survey administration a viable option for a large portion of the sample. In the
2010 SDR, over 62 percent of completes were obtained by Web survey, an increase of just over 5
percentage points from 2008. Accordingly, the percentage of sample members completing by mail
questionnaire has decreased, falling from nearly 31 percent in 2008 to just over 26 percent in 2010. The
Web survey mode offers the convenience of self-administration, thus lessening administration costs, and
captures data directly as respondents enter it, eliminating the time lag of mailing and data entry
associated with paper surveys and improving quality by removing the opportunity for data entry error.
In the 2013 cycle, the SDR offered a critical item only (CIO) survey in the Web mode for the first time.
The CIO is a survey version comprised of a subset of questions identified as critical for analysis and was
accessible only at the end of the field period. Prior to 2013, the SDR CIO survey was offered only via
telephone.
The SDR uses a mixed-mode strategy in which sample members are divided into three start modes with
the following data collection protocols, upholding the principles of Dillman’s Tailored Design Method
which builds upon social exchange theory to maximize response rates (Dillman, Smyth, and Christian,
2009).
• The SDR "mail SAQ start group" first receives an advance letter, followed by a questionnaire mailing,
  a thank you/reminder postcard, a follow-up questionnaire mailing, a prompting letter and email, and
  finally a prompt made by a telephone interviewer. The purpose of the mail start protocol is to ask
  the sample members to complete the paper questionnaire; non-respondents after the postcard
  mailing are offered the other completion modes.

• The SDR "CATI start group" first receives an advance letter, then telephone calls from interviewers
  to conduct the survey via CATI, a prompting letter and email, and finally the questionnaire mailing,
  postcard follow-up, and follow-up questionnaire mailing. The purpose of the CATI start group
  protocol is to ask the sample members to complete the SDR by telephone; non-respondents after
  this request are offered the other completion modes.

• The SDR "Web start group" first receives an initial contact by both letter and email, which include
  the URL to the Web survey and the personal identification number (PIN) and password information.
  The first contact is followed by a complementary contact that includes a prompting letter and
  email, telephone prompting, and finally the questionnaire mailings as described previously. The
  purpose of the Web start group protocol is to ask the sample members to complete the Web survey;
  non-respondents after this request are offered the other completion modes.

Based on previous research conducted by the SDR on mode preference (Hoffer et al., 2006), the
standard procedure for the SDR is to start cooperative sample members in the mode they prefer. At the
end of the SDR questionnaire, respondents are asked, "How would you like to complete future rounds of
this survey?" and select one of four options: (1) A questionnaire sent in the mail, (2) A Web
questionnaire on the Internet, (3) A telephone interview, or (4) No preference. If the respondent selects
one of the three modes, this is considered an explicit mode preference. If the sample member selects
the fourth option or does not answer the question, an implicit preference is assumed, namely the mode
in which the sample member completed the survey. Thus, both sample members who indicated a mail
mode preference in the 2010 SDR and sample members who completed the 2010 SDR via mail with no
future mode preference would have been considered to have a "mail mode preference" for the 2013
SDR and subsequently started in the mail mode, following the associated contacting protocol.
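As an illustration of this assignment rule, a minimal sketch in Python follows; the function and field names (pref_2010, completion_mode_2010) are hypothetical and are not part of the SDR production systems.

    def start_mode_2013(pref_2010, completion_mode_2010):
        """Assign the 2013 start mode for a cooperative 2010 respondent.

        pref_2010: explicit preference reported in 2010 ("mail", "web", or
        "cati"), or None if "No preference" was chosen or the item was skipped.
        completion_mode_2010: mode in which the 2010 survey was completed.
        """
        # Explicit preference: one of the three modes was selected.
        if pref_2010 in ("mail", "web", "cati"):
            return pref_2010
        # Implicit preference: fall back to the 2010 completion mode.
        return completion_mode_2010

    # Example: a 2010 mail completer with no stated preference starts in mail.
    assert start_mode_2013(None, "mail") == "mail"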


New cohort and nonrespondent panel cases are started in the mode indicated as the most effective in
previous SDR research (Hoffer et al., 2006).

1.3 Contacting Protocol Experiments

Each round of the SDR has the challenge of simultaneously maintaining a longitudinal panel sample,
varying in past response from cooperative to refusal, and obtaining the participation of a new cohort of
sample members. The SDR has a long history of conducting experiments in order to further its goal of a
high response rate with quality data, obtained within a reasonable data collection period. The 2013 SDR
included three contacting protocol experiments to gain efficiencies and increase Web participation; this
report describes these experiments and discusses the results.
“Email-Only” Experiment
The first 2013 SDR contacting experiment tested whether sample members could receive their first
request for Web participation via email only, as opposed to the previous practice of sending both an
email and a letter contact. In the SDR, many cooperative panel members provide an email address in the
prior survey round. For many cooperative sample members (i.e., individuals who completed the survey
in the previous round) who prefer the Web mode, the SDR has on file both an active, primary email
address and a USPS mailing address; thus many receive the initial request for Web survey participation
via both an email and a letter contact. Both contact messages provide the Web survey URL as well as a
PIN and password to access the survey (see Appendix A). In the 2010 SDR, about 85 percent of sample
members completing the survey provided an email address; among Web completers, that percentage
rises to 90 percent. We hypothesized that the list of email addresses the SDR has on record for
cooperative sample members is near-to-complete and can be used effectively to recruit the cooperative
panel members who explicitly or implicitly prefer the Web mode. Because these sample members are
cooperative, technology-savvy, and likely to check their email daily, the SDR decided to test whether the
project could save resources, and reduce the perceived burden of two simultaneous participation
requests, by sending only an email message for the initial contact instead of both a USPS letter and an
email. Hence, we theorized that cooperative panel cases identified to start in the Web mode would
respond as well to a single email contact as to the email plus USPS letter combination used in prior
rounds. The 2013 SDR therefore incorporated an experiment to test whether reducing letter mailings to
this group could realize substantial savings for the project without any decrease in response. This
experiment is explained further in Section 2 and referred to as the "Email-only" experiment throughout
the remainder of this report.
Green Appeal Experiment
The second 2013 SDR contacting experiment attempted to determine whether sample members with a
mail mode preference could effectively be started in the Web mode in the subsequent round if provided
with a "green," or environmentally responsible, reason to do so. Web surveys have not yet taken
precedence over other traditional data collection modes (such as mail, telephone, and face-to-face)
when surveying the general U.S. population. However, for surveys that have adequate sample frames
with email contact information and target populations with higher education levels, the Web mode has
been found to be suitable and can be incorporated in the data collection design (Millar and Dillman,
2011). Considering that SDR panel sample members are highly educated, technology-savvy, and frequent
users of email, we hypothesized that 2010 SDR cooperative panel sample members who prefer mail yet
have an email address on file would complete the survey via the Web when initially asked to use the
Web mode in the 2013 SDR and provided a reasonable rationale for doing so. This experiment is
explained further in Section 3 and referred to as the "green appeal" experiment throughout the
remainder of this report.
Start New Cohorts in Web Experiment
The third 2013 SDR contacting experiment tested whether sample members new to the SDR (i.e., the
SDR new cohort) who had portable email addresses available from the Doctorate Records File (DRF)1
could be successfully started in the Web mode, rather than in the mail mode as done in previous rounds.
A portable email address is one that generally has no institutional affiliation and is obtained from one of
the main search engine sites (e.g., Gmail from Google) or an Internet service provider (e.g., Comcast).
These email accounts are generally accessible from any Web browser with a password; hence they are
portable, or accessible regardless of location and carried along through geographical or career-related
moves. Communicating with sample members via a portable email address, as opposed to an email
address affiliated with their graduate educational institution, increases the likelihood that the email
address is still valid and regularly checked after graduation. As noted, the SDR possesses an adequate
frame for Internet users and targets technology-savvy and highly educated individuals. The project could
save resources and increase the data collection pace if some new cohort respondents could be
successfully steered to the Web to complete the SDR questionnaire online instead of through the mail.
Moreover, especially in the U.S., younger cohorts are more likely to use their personal email, rather than
their mailing address, as one of their main means of communication. This experiment tests whether
sample members with portable email addresses will complete the survey via the Web mode more
quickly when they are assigned to the Web start survey protocol rather than to the standard mail start
contacting protocol. This experiment is discussed in Section 4 and referred to as the "new cohort"
experiment throughout the remainder of this report.
We conclude the report in Section 5 with a summary of the findings of the 2013 SDR contacting
experiments as well as recommendations for future survey cycles.

1 The DRF is a dataset comprised of all earned doctorates granted by regionally accredited U.S. universities, in all
fields, from 1920 to the present. These data are collected via the federally sponsored Survey of Earned Doctorates
(SED). More information about the SED and its sponsors can be found at
http://www.nsf.gov/statistics/srvydoctorates/.


Section 2: “Email-Only” Experiment

2.1 Theoretical Justification and Hypothesis

One of the main advantages of Web surveys in comparison to other data collection modes (such as mail,
telephone, and face-to-face surveys) is that once the Web survey is set up, there are minimal additional
costs associated with surveying more sample members (Groves et al., 2009). The Web survey mode is
considered the least costly of the traditional mode options, especially for larger sample sizes and
technology-savvy populations; mail is now considered a relatively high-cost mode due to the rapid and
consistent increase in postage, printing, and processing costs (Dillman, Smyth, and Christian, 2009).
Moreover, Web surveys allow researchers to obtain analyzable data faster than mail because of the
immediate delivery of responses and the lack of data entry. However, the response rates obtained from
Web-only surveys are traditionally lower than those of mail, telephone, and face-to-face surveys for the
general U.S. population (Stern, Bilgen, and Dillman, 2014; Bilgen and Stern, 2011). The suitability of Web
surveys depends heavily on the target population. While a Web-only contacting methodology does not
provide response rates comparable to traditional modes, offering a Web option first and then following
up with mail has provided response rates comparable to offering a mail-only option (Dillman et al.,
2009). Hence, in surveys like the SDR, in which the target population consists of technology-savvy
individuals who are frequent users of the Internet, respondents can be successfully encouraged to
complete the survey via the Web. Moreover, in surveys that target technology-savvy individuals,
empirical evidence shows that Web surveys achieve response rates comparable to mail surveys when
the same mail follow-up treatment is used for the Web and mail starting groups (Kaplowitz, Hadlock, and
Levine, 2004).
Accordingly, the increase in Web survey use and the low costs associated with recruiting individuals
online make Internet recruitment strategies appealing for a contacting protocol. A combination of email
and mail recruitment strategies has also been shown to increase Web response rates in technology-savvy
populations (Millar and Dillman, 2011). As noted, many cooperative SDR panel members provided an
email address in the prior survey round; thus the SDR has a near-to-complete list of email addresses for
the cooperative panel members who prefer the Web mode. Hence, this experiment examines whether
the SDR can take full advantage of the comprehensive email address list on record for these sample
members and obtain their completes via an initial contact through email only, one of the most
cost-effective recruitment strategies. Therefore, in the Email-only experiment we tested the
effectiveness of an initial email-only contacting strategy to determine whether the SDR can use this
method to recruit cooperative sample members who prefer the Web mode, saving costs without
sacrificing timely response or response rate.
Hypothesis: Sample members in the Web start mode who receive only a single initial email contact will
respond to the survey at similar rates as those who receive the standard USPS letter-plus-email initial
contact in the 2013 SDR, since all subsequent follow-up contacts for nonrespondents in the treatment and
control groups will be the same and include USPS contacts. Consequently, the initial email-only contact
group will require less effort and cost than the standard protocol to achieve comparable survey yield
rates.

2.2 Experimental Design and Sample Stratification

The eligible sample for this experiment consisted of panel members who completed the survey in 2010,
had both an active, primary email and USPS mailing address on record, and who either (1) indicated in
the 2010 SDR a Web mode preference or (2) completed the 2010 SDR via Web and had no preference.
The number of cases eligible for the experiment was 21,972.
Sample stratification is generally recommended on variables that could be correlated with the
treatment and the outcomes. Randomization should prevent any systematic imbalance between
characteristics of the treatment and control cases; nonetheless, stratification provides additional
assurance that chance variations do not occur and confound the estimation of the treatment effect. For
this experiment, our main concern was to ensure that the propensity to cooperate and respond was
balanced between the treatment and control groups. A number of proxy indicators of the underlying
propensity were used as stratifying variables: (1) whether the case was eligible for the 2010 SDR
late-stage incentive and was actually an incentivized late-stage nonrespondent in the 2010 SDR, with
late-stage responders assumed to be less cooperative than earlier responders; (2) newer versus older
respondents as measured by doctorate completion year and age; (3) sample member location
designated as in the U.S. or outside the U.S., with regional breakdown as appropriate; and (4) recently
cooperative in the 2010 cycle versus continuously cooperative in the 2006-2010 cycles.
Serpentine sorting was used to maximize the similarity of adjacent cases during assignment. After
sorting, we assigned every other case to the treatment group and the remainder to the control group.
This achieves an implicit stratification so that there is no systematic difference between the treatment
and control groups on the stratifying variables.
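A minimal sketch of this serpentine sort and alternating assignment follows, assuming the stratifying variables are available as columns of a case-level pandas data frame; the column names and the simplified recursive implementation are illustrative, not the SDR production procedure.

    import pandas as pd

    def serpentine_sort(df, keys, ascending=True):
        """Simplified recursive serpentine sort: sort by the first key, then
        reverse the sort direction of the remaining keys in each successive
        group, so that adjacent cases are as similar as possible."""
        if not keys:
            return df
        key, rest = keys[0], keys[1:]
        pieces, direction = [], ascending
        ordered = df.sort_values(key, ascending=ascending)
        for _, grp in ordered.groupby(key, sort=False):
            pieces.append(serpentine_sort(grp, rest, ascending=direction))
            direction = not direction  # flip direction for the next group
        return pd.concat(pieces)

    # Hypothetical stratifying variables for a handful of cases.
    cases = pd.DataFrame({
        "late_stage_2010": [0, 1, 0, 1, 0, 1, 0, 1],
        "phd_cohort":      ["new", "new", "old", "old", "new", "old", "new", "old"],
        "location":        ["US", "US", "non-US", "US", "non-US", "US", "US", "non-US"],
    })

    ordered = serpentine_sort(cases, ["late_stage_2010", "phd_cohort", "location"])
    # Alternate assignment down the sorted list: every other case to treatment.
    ordered["group"] = ["treatment" if i % 2 == 0 else "control"
                        for i in range(len(ordered))]
    print(ordered)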
Eligible cases were randomly assigned to the following treatment and control groups:
Control group: The initial contact to cooperative panel sample members in Web start mode was
the standard SDR Web start mode contacting protocol that includes the notice USPS letter and the
corresponding notice email.
Treatment group: The initial contact to cooperative panel sample members in the Web start
mode was limited to the notice email contact; the accompanying USPS notice letter was not sent.
All eligible cases had an equal probability of being assigned to each of the two experimental groups. Of
the 21,972 eligible cases, half - or 10,986 cases - were assigned to the control group and half were
assigned to the treatment group. The texts of the initial notice in both the USPS letter and email forms
are included in Appendix A. While the text of both emails is very similar, the control email includes an
additional statement that references the USPS letter.


2.3 Contacting Protocol and Procedures

The contacting protocols for the treatment and control group are summarized in Figure 2.1. The only
difference in the contacting protocols is the first contact; as noted, the treatment group did not receive
the initial USPS contact letter and the email did not contain the statement referencing the letter.

Figure 2.1. Contacting Protocol by Experiment Group for the "Email-only" Experiment

Treatment Group - Initial Email Only
  Week 1:      Initial Contact Email (20-Feb-13)
  Week 2:      Follow-up Letter (28-Feb-13)
  Week 3:      Follow-up Email (5-Mar-13)
  Weeks 5-7:   CATI prompt and follow-up (19-Mar-13 to 3-Apr-13)
  Week 8:      Questionnaire Mailing #1 (10-Apr-13)
  Week 9:      Thank you/Reminder Postcard (17-Apr-13)
  Week 12:     Questionnaire Mailing #2 (8-May-13)

Control Group - Initial Letter and Email
  Week 0:      Initial Contact Letter (15-Feb-13)
  Week 1:      Initial Contact Email (20-Feb-13)
  Week 2:      Follow-up Letter (28-Feb-13)
  Week 3:      Follow-up Email (5-Mar-13)
  Weeks 5-7:   CATI prompt and follow-up (19-Mar-13 to 3-Apr-13)
  Week 8:      Questionnaire Mailing #1 (10-Apr-13)
  Week 9:      Thank you/Reminder Postcard (17-Apr-13)
  Week 12:     Questionnaire Mailing #2 (8-May-13)

2.4 Analysis and Results

Analysis focused on the comparison of treatment and control groups for differences in survey yield rate,
days to complete, and level of effort, measured by the average number of USPS mailings sent, telephone
contacts administered, and email messages sent. Survey yield rate differences were assessed using
chi-square tests for the categorical response/nonresponse outcome, while t-tests were used for the
continuous days-to-respond and level-of-effort outcomes.
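A rough sketch of these two comparisons using scipy follows; the counts and distributions below are illustrative placeholders, not the SDR results.

    import numpy as np
    from scipy import stats

    # Chi-square test of completion (completed / not completed) by group,
    # using a 2x2 table of case counts (illustrative values).
    table = np.array([[9200, 390],    # control: completed, not completed
                      [9280, 340]])   # treatment
    chi2, p_yield, dof, _ = stats.chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p = {p_yield:.3f}")

    # Two-sample t-test of a continuous outcome (e.g., days to complete).
    rng = np.random.default_rng(0)
    days_control = rng.normal(33.5, 20.0, size=9200)      # hypothetical data
    days_treatment = rng.normal(37.8, 20.0, size=9280)
    t, p_days = stats.ttest_ind(days_control, days_treatment, equal_var=False)
    print(f"t = {t:.2f}, p = {p_days:.4g}")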
Analysis Case Set
To analyze the results of this experiment as accurately as possible, we excluded from analysis those
cases that were unlikely to have experienced the initial contact. The experiment sample includes the
cases sampled for the experiment as detailed in Section 2.2 above. The analysis-eligible sample (outlined
in Table 2.1) excludes around 12 percent of the selected cases. Excluded cases were flagged as having a
valid email and mailing address at the time of experiment sample selection, but it was later discovered
that the addresses for these cases were missing a vital piece of information. Because these cases went
into locating after the start of data collection and/or ended the data collection period as a locating
problem, they are excluded from the analysis-eligible case set for the "Email-only" experiment. Table 2.1
shows the number of cases selected for the experiment and the sample determined eligible for analysis.

Table 2.1. "Email-only" Experiment Sample and Analysis Eligibility

2013 Experiment Group                        Experiment Sample    Analysis Eligible
Control USPS letter-plus-email contact            10,986               9,587
Treatment "Email-only" contact                    10,986               9,618
Total (Experiment cases)                          21,972              19,205

Survey Yield and Days to Complete
Table 2.2 outlines the survey yield rates by the end of the data collection period for the analysis sample.
Consistent with the hypothesis, the exclusion of the USPS notice letter at the initial contact did not
affect the overall survey yield rate (p=0.20).

Table 2.2. "Email-only" Experiment Survey Yield Rates by the End of the Field Period

                                             Analysis    2013 Survey (Percentage of Analysis Sample)
2013 Experiment Group                        Sample        Completed       Not Completed
Control USPS letter-plus-email contact        9,587          96.1%             3.9%
Treatment "Email-only" contact                9,618          96.5%             3.5%
Mean Difference (Control - Treatment)          N/A           -0.4%*             0.4%
* not significant, p > 0.1

However, we suspect that the non-significant difference between the treatment and control groups was
due to the extensive follow-up efforts during the data collection period. Hence, we analyzed the weekly
survey yield rates over the course of data collection. At week 2, the treatment group's survey yield rate
lagged behind the control group's by 7.5 percentage points. Figure 2.2 shows that the inclusion of the
USPS initial contact letter led to higher survey yield rates between weeks 0 and 9; not surprisingly, the
largest difference is observed between weeks 0 and 2.


Figure 2.2. "Email-only" Experiment Response over Time (Cumulative days to complete by experiment group)

[Line chart: cumulative completes (0 to 12,000) by days to complete (0 to 184 days) for the USPS letter-plus-email contact (Control) and "Email-only" contact (Treatment) groups; weeks 2 and 9 are marked.]

While the survey yield rates do not differ at the end of the data collection period, the control group
(which received both the USPS contact letter and the initial contact email) produced more timely
completes at the beginning and middle of the data collection period than the treatment group (which
received only the initial contact email). Accordingly, the overall mean days to complete was about 4 days
shorter in the control group than in the treatment group (see Table 2.3). Days to complete was
calculated as the number of days it took for a respondent to complete the survey by any mode after
being sent the first contact that allowed them to complete the survey; for both groups, this was the
initial notice email.
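A minimal sketch of how the days-to-complete measure and the cumulative response curves in Figure 2.2 could be derived from case-level records; the data frame and column names (group, first_contact_date, complete_date) are hypothetical.

    import pandas as pd

    # Hypothetical case-level records; in practice one row per sample member.
    cases = pd.DataFrame({
        "group":              ["control", "control", "treatment", "treatment"],
        "first_contact_date": pd.to_datetime(["2013-02-20"] * 4),
        "complete_date":      pd.to_datetime(["2013-03-01", "2013-04-15",
                                              "2013-03-10", None]),
    })

    # Days to complete: days from the first contact offering the survey
    # (the initial notice email for both groups) to survey completion.
    completes = cases.dropna(subset=["complete_date"]).copy()
    completes["days_to_complete"] = (
        completes["complete_date"] - completes["first_contact_date"]
    ).dt.days

    # Mean days to complete by experiment group (cf. Table 2.3).
    print(completes.groupby("group")["days_to_complete"].mean().round(1))

    # Cumulative completes by days to complete (cf. Figure 2.2).
    curve = (completes
             .groupby(["group", "days_to_complete"]).size()
             .groupby(level="group").cumsum()
             .unstack("group")
             .ffill())
    print(curve)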

Table 2.3. "Email-only" Experiment Mean Days to Complete

2013 Experiment Group                        Overall Completes    Mean Days to Complete
Control USPS letter-plus-email contact             9,149                 33.5
Treatment "Email-only" contact                     9,278                 37.8
Mean Difference (Control - Treatment)               N/A                  -4.3*
* p < .0001


Level of Effort and Associated Costs
Lastly, obtaining a higher number of completed surveys earlier in the data collection period leads to
lower levels of effort, and thus lower costs, for the control group in comparison to the treatment group.
Concerning follow-up effort, we can look at level of effort as defined by the number of emails sent,
telephone calls made, and mailings sent to the sample members. While both the treatment and control
groups started in the Web mode, both eventually received contacts via all methods (email, telephone,
and USPS mail) if they did not respond to the initial contacts. Table 2.4 indicates that cases assigned to
the control group received significantly fewer emails and telephone calls on average, while treatment
cases received significantly fewer USPS mailings on average.
Table 2.5 shows the mean cost of contacts by experiment group. Costs of contacts include CATI
prompting, administering the full and CIO CATI surveys, questionnaire and letter mailings, and batch
emails. Email contacts cost the least, while telephone contacts are the most expensive contacting
method. Hence, the costs and level of effort depend on the number of completes achieved at the
beginning of the data collection period. While in this specific experiment the cost per complete was
slightly higher for the treatment group than the control group (not significant), for a smaller sample the
treatment group would have been less expensive, as the total CATI costs would have been lower. Hence,
the design of the contacting strategy for other projects depends not only on the target population and
sample characteristics but also on the size of the contacted sample.
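A rough sketch of how a per-case contact cost could be tallied from the level-of-effort counts; the unit costs and column names are hypothetical placeholders, since the SDR's actual unit costs are not reported here.

    import pandas as pd

    # Hypothetical unit costs per contact, in dollars.
    UNIT_COST = {"email": 0.10, "usps_mailing": 2.50, "telephone_call": 5.00}

    def contact_cost(row):
        """Sum unit cost times number of contacts for one case."""
        return (row["n_emails"] * UNIT_COST["email"]
                + row["n_usps_mailings"] * UNIT_COST["usps_mailing"]
                + row["n_telephone_calls"] * UNIT_COST["telephone_call"])

    # Group means from Table 2.4 used as stand-ins for case-level counts.
    cases = pd.DataFrame({
        "group":             ["control", "treatment"],
        "n_emails":          [1.9, 2.0],
        "n_usps_mailings":   [2.3, 1.9],
        "n_telephone_calls": [2.7, 3.0],
    })
    cases["contact_cost"] = cases.apply(contact_cost, axis=1)
    print(cases.groupby("group")["contact_cost"].mean().round(2))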

Table 2.4. "Email-only" Experiment Mean Level of Effort (Emails, telephone calls, USPS mailings)

                                                          Mean Number of Contacts
                                             Analysis               USPS      Telephone
2013 Experiment Group                        Sample      Emails    Mailings     Calls
Control USPS letter-plus-email contact        9,587       1.9        2.3         2.7
Treatment "Email-only" contact                9,618       2.0        1.9         3.0
Mean Difference (Control - Treatment)          N/A       -0.12*      0.42*      -0.29*
* significant at p<.0001

Table 2.5. "Email-only" Experiment Mean Cost of Contacts by Experiment Group

2013 Experiment Group                        Analysis Sample    Mean Cost of Contacts
Control USPS letter-plus-email contact            9,587              $20.65
Treatment "Email-only" contact                    9,618              $21.47
Mean Difference (Control - Treatment)              N/A               -$0.82*
* not significant, p > 0.1


Section 3: Green Appeal Experiment

3.1 Theoretical Justification and Hypothesis

In surveys which target the general U.S. population, mail-only surveys or mail surveys followed by email
follow-ups requesting Web survey participation tend to yield higher response rates than Web surveys
followed by mail (Cantor et al., 2010; Converse et al., 2008; Shih and Fan, 2007). However, studies that
specifically examine response rate differences across demographic groups have found that the
differences between mail and Web surveys are not significant among individuals with higher
education levels (Shih and Fan, 2007). Further, the advantages of Web surveys include, but are not
limited to, speed, reduced costs, reduced data entry errors, ease of administration, and enhanced
questionnaire design capabilities (Groves et al., 2009; Dillman et al., 2009). While Web-only contacting
methodology does not provide comparable response rates to traditional modes, offering a Web option
first then following up with traditional modes has provided comparable response rates to offering a
combination of more traditional modes in technology-savvy populations (Millar and Dillman, 2011).
Due to the lack of adequate sampling frames for Internet surveys as well as the limited population which
has ready access and routinely uses the Internet, Web surveys still do not take precedence over other
traditional modes when surveying the general U.S. population. However, for surveys which have
adequate sample frames with email contact information, and target populations with higher education
levels, the Web mode is suitable and can be incorporated in the data collection design (Dillman et al.,
2009).
Moreover, leverage-salience theory suggests that when a respondent is faced with a survey request, the
probability of response depends on both the leverage and the salience of a given survey attribute
(Groves, Singer, and Corning, 2000). Leverage refers to how much the sample member values a certain
attribute of the survey (such as an incentive or the survey topic), while salience refers to the emphasis
placed on that attribute in the survey request. Therefore, by emphasizing a socially responsible "green
appeal" in the contact materials, we increase the salience of this attribute, which may improve response
in the Web mode, and possibly response rates overall.
Hypothesis: 2010 SDR cooperative panel sample members who explicitly or implicitly prefer the USPS
mail mode and who provided an email address will complete the survey via Web if they receive a
first contact via email requesting they use the Web as a “green” alternative to a paper
questionnaire.

3.2 Experimental Design and Sample Stratification

The eligible sample for this experiment consisted of 2010 SDR cooperative panel sample members, who
had both an active, primary USPS mailing address and email address on record, and who either (1)
indicated a mail mode preference in the 2010 SDR or (2) completed the 2010 SDR via mail and had no preference.
The eligible sample excluded individuals who completed the 2010 SDR on the Web and indicated a mail
mode preference, and individuals who were subjected to the new “reluctant” mail start mode protocol
(see the 2013 SDR Survey Operations Plan for more information on the reluctant protocol). The number
of cases eligible for the experiment was 6,431.
The sample design incorporated stratification in order to create a balanced allocation between
treatment and control groups. The stratification design was similar to that implemented in the "Email-only" experiment. The following stratification variables were used, in order: (1) newer
versus older respondents as measured by doctorate completion year and age; (2) whether the case was
eligible for the 2010 SDR late-stage incentive and was actually an incentivized late-stage nonrespondent
in the 2010 SDR, with late-stage cases assumed to be less cooperative than earlier responders; (3)
sample member location designated as in the U.S. or out of the U.S. with regional breakdown as
appropriate; (4) recently cooperative in the 2010 cycle versus continuously cooperative in the 2006-2010
cycles; and (5) pre-2010 completion mode. Serpentine sorting was utilized to maximize the
similarity of adjacent cases during assignment. After sorting, every other case was assigned to the
treatment and control groups. This achieved an implicit stratification so that there was no difference
between the treatment and control groups on the control variables.
Eligible cases were randomly assigned to the following treatment and control groups:
Control group: The control group followed the standard mail start mode contacting protocol,
starting with a questionnaire mailing.
Treatment group: Cases selected for the treatment group were assigned to a contacting protocol
that paralleled cases in the Web start mode. Treatment cases received a first contact via email,
with an accompanying USPS letter, emphasizing the “green” reasons to complete the survey
online instead of receiving a paper questionnaire.
All eligible cases had an equal probability of being assigned to each experimental group (treatment
versus control group). Of the 6,431 cases, 3,216 were assigned to the control group and 3,215 to the
treatment group.

3.3 Contacting Protocol and Procedures

The multi-contact data collection protocol for the treatment and control groups is illustrated in Figure
3.1. The four tailored “green” contacting materials for the treatment group are included in Appendix B.
All other contacting materials used for the treatment group were the same as those used for the
standard Web start mode protocol. The control group followed the standard mail start mode protocol
using the contacting materials developed for past cooperative sample members; the advance letter and
first questionnaire cover letter are also included in Appendix B.


Figure 3.1. Green Appeal Contacting Schedules by Experiment Group

Treatment Group - Web Start Mode
  Week 0:       Initial Contact Letter with "green appeal"* (15-Feb-13)
  Week 1:       Initial Contact Email with "green appeal"* (20-Feb-13)
  Week 2:       Follow-up Letter with "green appeal"* (28-Feb-13)
  Week 3:       Follow-up Email with "green appeal"* (5-Mar-13)
  Weeks 5-7:    CATI prompt and follow-up (19-Mar-13 to 3-Apr-13)
  Week 8:       Questionnaire Mailing #1 (10-Apr-13)
  Week 9:       Thank you/Reminder Postcard (17-Apr-13)
  Week 12:      Questionnaire Mailing #2 (8-May-13)

Control Group - Mail Start Mode
  Week 0:       Advance Letter (11-Feb-13)
  Week 1:       Questionnaire Mailing #1 (19-Feb-13)
  Week 2:       Thank you/Reminder Postcard (25-Feb-13)
  Week 5:       Questionnaire Mailing #2 (21-Mar-13)
  Week 7:       Prompting Letter w/Web access (5-Apr-13)
  Week 8:       Prompting Email w/Web access (9-Apr-13)
  Week 9:       CATI prompt (15-Apr-13 to 18-Apr-13)
  Weeks 10-12:  CATI follow-up (21-Apr-13 to 11-May-13)

* Contact material tailored as part of the experiment treatment.

3.4 Analysis and Results

Analysis focused on the comparison of treatment and control groups for differences in survey yield, days
to respond, and level of effort, measured by the number of USPS mailings sent, telephone contacts
administered, and email messages sent to each case. Since the treatment group for this experiment
started in a different mode, experienced a different data collection protocol schedule than the control
group, and was intended to "convert" sample members to the Web, this section also examines the 2013
completion mode and the mode preference expressed for 2015. Survey yield rate differences were
assessed using chi-square tests for the categorical response/nonresponse outcome, while t-tests were
used for the continuous days-to-respond and level-of-effort outcomes.
Analysis Case Set
To analyze the results of this experiment as accurately as possible, we excluded from analysis those
cases that were unlikely to have experienced the initial protocol. The experiment sample includes the
cases sampled for the experiment as detailed in Section 3.2. From the selected
experiment sample, 301 control cases and 365 treatment cases were excluded from analysis because
these were locating problems either at the time of the initial contact or at some point during the data
collection period. These cases were unlikely to have received the scheduled protocol as outlined for the
experiment groups and were not good candidates for outcome analysis. Table 3.1 details the number of
cases selected for the experiment and the sample eligible for analysis.

Table 3.1. Green Appeal Experiment Sample and Analysis Eligibility

2013 Experiment Group            Experiment Sample    Analysis Eligible
Control Mail Start Mode               3,216                2,915
Treatment Web Start Mode              3,215                2,850
Total (Experiment cases)              6,431                5,765

Survey Yield and Days to Complete
Looking at survey yield rates at the end of the 26-week data collection period, Table 3.2 shows that both
the control and treatment groups had high survey yield rates (95.1 and 94.8 percent, respectively). A
chi-square analysis shows no significant difference between the survey yield rates of the two experiment
groups at the end of the data collection period.

Table 3.2. Green Appeal Experiment Survey Yield Rates by the End of the Data Collection Period

                                         Analysis    2013 Survey (Percentage of Analysis Sample)
2013 Experiment Group                    Sample        Completed       Not Completed
Control Mail Start Mode                   2,915          95.1%             4.9%
Treatment Web Start Mode                  2,850          94.8%             5.2%
Mean Difference (Control - Treatment)      N/A            0.3%*           -0.3%
* not significant, p > 0.1

As data collection period lengths may change from cycle to cycle, as happened for the 2013 SDR, it is
important to look at other measures in addition to the survey yield rate at the end of the data collection
period, such as response over the course of the data collection period, days to complete, level of effort,
and survey completion mode and future mode preference.
Figure 3.2 plots the days to complete for experiment cases that completed the 2013 SDR, and Table 3.3
details the mean days to complete by experiment group. Days to complete was calculated as the
number of days it took for a respondent to complete the survey by any mode after being sent the first
contact that allowed them to complete the survey. For control cases, this first contact was the first
questionnaire mailing (the advance letter only notified the sample member that the 2013 SDR was
starting and did not make an explicit request to complete the survey or provide a means to do so); for
treatment cases, this first contact was the initial email.
Figure 3.2 and Table 3.3 both show that cases in the treatment group provided the quickest response.
Through week 5, the treatment group showed a higher number of completes than the control group.
The treatment group's mean days to complete was significantly lower (by over 6 days) than the control
group's. Because the 2013 SDR asks respondents to consider a specific reference date (1 February 2013)
when answering certain questions about their occupation and additional educational achievement,
responses received closer to that reference date are assumed to be more accurate, because respondents
do not have to think as far back to recall what they were doing. In addition, the sooner a sample
member completes the survey, the less follow-up, with its associated costs, is required.

Figure 3.2. Green Appeal Experiment Response over Time (Cumulative days to complete by experiment group)

[Line chart: cumulative completes (0 to 3,500) by days to complete (1 to 148 days) for the Mail Start Mode (Control) and Web Start Mode (Treatment) groups.]


Table 3.3. Green Appeal Experiment Mean Days to Complete

2013 Start Mode                          Overall Completes    Mean Days to Complete
Control Mail Start Mode                        2,772                 44.7
Treatment Web Start Mode                       2,636                 38.5
Mean Difference (Control - Treatment)           N/A                   6.2*
* p < .0001

Level of Effort and Associated Costs
Concerning follow-up effort, we can look at level of effort as defined by the number of emails sent,
telephone calls made, and mailings sent to the sample members. The treatment and control groups
started in different modes and received specific modes of contact in a different order as shown in Figure
3.1. However, sample members eventually received contacts via all methods (email, telephone, and
USPS mail) if they did not respond to the initial contacts. Table 3.4 indicates that cases assigned to the
treatment group received significantly more emails and telephone calls on average, while control cases
received significantly more USPS mailings on average.
Referring again to the contacting schedule in Figure 3.1, the last contact received by the control group is
prompting via telephone, while for treatment cases it is the questionnaire mailings. The findings in
Table 3.4 correlate with the starting modes: on average, the groups have more contacts from the mode
they started in and the fewest from the mode they ended in under the contacting protocol. Because the
mail start protocol for the control group begins with two questionnaire mailings, and the Web start
treatment protocol ends with the questionnaire mailings, the control group received more
questionnaires than the treatment group. As a result, mailing costs are higher for the control group. In
the same vein, the Web start treatment group had more calls and thus higher telephone costs.
Table 3.5 shows that the treatment group actually had higher mean costs when it came to the cost of
the full suite of contacts (e.g., batch emails; letter, postcard, and questionnaire USPS mailings; and
telephone prompting). This is due to the greater volume and associated higher costs for telephone
prompting for that group. However, the story changes when looking at overall mean cost of data
collection. This cost is calculated to include the full suite of contacts plus any survey administration
costs (e.g., interviewer labor costs for CATI completes and Business Reply Envelope (BRE) postage for
returned questionnaires). Because the control group has a higher percentage of respondents
completing by mail survey (Table 3.6), the control group ends up with a higher mean overall cost of data
collection (Table 3.5). Also, though these costs are not represented in Table 3.5, surveys obtained via
mail have additional costs associated with receipt control, pre-key editing, and data entry.


Table 3.4. Green Appeal Experiment Mean Level of Effort (Emails, telephone calls, USPS mailings)

                                                          Mean Number of Contacts
                                         Analysis             Overall USPS   Telephone   Questionnaire
2013 Experiment Group                    Sample      Emails     Mailings       Calls       Mailings
Control Mail Start Mode                   2,915        0.6         3.2           2.1          1.6
Treatment Web Start Mode                  2,850        1.8         2.5           3.2          0.6
Mean Difference (Control - Treatment)      N/A        -1.3*        0.7*         -1.1*         1.0*
* p<.0001

Table 3.5. Green Appeal Experiment Mean Cost of Contacts by Experiment Group

                                                             Mean Cost of    Mean Overall Cost of
2013 Experiment Group                    Analysis Sample     Contacts(a)     Data Collection(b)
Control Mail Start Mode                       2,915            $22.05             $35.50
Treatment Web Start Mode                      2,850            $24.61             $28.98
Mean Difference (Control - Treatment)          N/A             -$2.56*             $6.52**
* p=0.0015
** p<0.0001
(a) Costs include batch emails; letter, postcard, and questionnaire USPS mailings; and telephone prompting.
(b) Costs include outreach as well as administration (i.e., CATI interviewer labor for telephone interviews and BRE postage for completed mail surveys); excludes incentives.

Completion Mode and Mode Preference
Since one of the goals of this experiment was to encourage previously cooperative sample members
who preferred the mail mode to complete in the Web mode, it is important to analyze both the
completion mode (or end mode) and future mode preference by experiment group. Table 3.6 shows
the percentage of completes in each mode by experiment group. The majority of cases completed in
the mode in which they started; in particular, most treatment cases, all of whom preferred mail,
completed via the Web. This demonstrates that sample members who prefer the mail mode can be
convinced to complete via the Web, as hoped.
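A minimal sketch of how an end-mode-by-group distribution such as Table 3.6 could be tabulated; the data frame and column names (group, end_mode) are hypothetical.

    import pandas as pd

    # Hypothetical completed-case records: one row per completed survey.
    completes = pd.DataFrame({
        "group":    ["control", "control", "treatment", "treatment", "treatment"],
        "end_mode": ["mail", "mail", "web", "web", "cati"],
    })

    # Row-percentage crosstab of end mode by experiment group (cf. Table 3.6).
    table = pd.crosstab(completes["group"], completes["end_mode"],
                        normalize="index")
    print((table * 100).round(1))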


Table 3.6. Green Appeal Experiment End Mode Comparison by Experiment Group

                                         Completes     2013 End Mode(a) (Percentage of Completes)
2013 Experiment Group                    (any mode)       Web        Mail       CATI
Control Mail Start Mode                    2,772          14.9%      80.8%      4.3%
Treatment Web Start Mode                   2,703          75.1%      18.0%      7.0%
Mean Difference (Control - Treatment)       N/A          -60.2%      62.8%     -2.7%
(a) Web CIOs are considered Web completes and CATI CIOs are considered CATI completes.

Table 3.7 shows the mode preference expressed for 2015 by 2013 SDR respondents, by experiment
group and 2013 end mode. In both experiment groups, the majority of respondents stated a future
mode preference that was the same as the mode in which they completed, whether that was mail or
Web. This experiment was meant to test whether sample members with a mail mode preference could
be convinced to complete via the Web and, taking that a step further, to complete via the Web in future
rounds. The mode preferences of the treatment Web completers suggest that they are likely to do so: a
majority of those respondents (71.9 percent) prefer the Web for the future.

Table 3.7. Green Appeal Experiment Mode Preference by Start and End Mode

                                                Mode Preference for 2015 (Percentage of Completes)
                                                                         CATI / No Preference /
2013 Start Mode / End Mode      Completes(a)      Mail        Web           Not Answered
All Experiment Cases               5,391          45.8%      38.8%             15.5%
Control Mail Start Mode            2,721          64.8%      20.6%             14.6%
  Mail End Mode                    2,241          74.7%      12.2%             13.1%
  Web End Mode                       376          14.4%      71.3%             14.4%
  CATI End Mode                      104          33.7%      19.2%             47.1%
Treatment Web Start Mode           2,670          26.4%      57.3%             16.4%
  Mail End Mode                      486          74.7%       6.6%             18.7%
  Web End Mode                     2,004          14.3%      71.9%             13.8%
  CATI End Mode                      180          30.0%      31.1%             38.9%
(a) Web CIO and CATI CIO completes are excluded from this mode preference analysis because they were not asked the item.


Section 4: Start New Cohorts in Web Experiment

4.1 Theoretical Justification and Hypothesis

As discussed previously, one of the main disadvantages of Web surveys is the lack of an adequate
sampling frame for surveys that target the general U.S. population. However, the tailored design method
suggests that some design elements (such as different modes, incentives, and survey topics) are more
attractive and suitable for some audiences than for others (Dillman, Smyth, and Christian, 2009).
Therefore, if these design elements are used appropriately, correct design decisions can minimize the
response burden while increasing survey response and decreasing survey costs (Groves et al., 2009; De
Leeuw et al., 2008). In many studies that use an address-based sampling design, respondents are sent an
invitation letter that provides a URL. This increases the complexity of the task, as sample members (1)
need access to a computer and the Internet (which also decreases coverage), and (2) must go to their
computers, enter the URL provided in the letter, type in the access codes also provided in the letter, and
then complete the Web questionnaire.
In these examples, the survey invitation is provided via mail. The transition from a mail invitation to a
Web questionnaire makes the respondent's task more complex and burdensome than completing the
paper questionnaire. Moreover, it has been argued that the potential respondent may set aside the
survey invitation until the next time they are on the computer and online; however, this delay may
result in sample members forgetting about the survey by the time they are next online (Anderson and
Tancreto, 2012). In surveys that possess an adequate frame for Internet surveys (such as an adequate
list of email addresses) and target a technology-savvy population with a general preference for the Web
mode (such as recent doctorate recipients), starting respondents in the Web mode can potentially
increase response rates while decreasing respondent burden and survey costs.
In the SDR, the default starting mode for new cohort sample members is the mail mode, with the
questionnaire cover letter also containing instructions on how to complete the survey via Web. This was
based on previous research completed in the 2003 SDR (Hoffer et al., 2006), when the Web mode was
just being introduced in the SDR, emails were obtained from fewer respondents by the SED, and perhaps
Web surveys and daily email usage were less ubiquitous. A decade later, the SDR possesses an adequate
frame for Internet users and targets technology-savvy and highly educated individuals. The project
could save resources and increase the data collection pace if some new cohort respondents could be
successfully steered to the Web to complete the SDR questionnaire online instead of through the mail.
We suspect that email contacts will be more successful and the data collection pace increased if the new
cohort sample members with portable email addresses (i.e., not tied to their graduate educational
institution) are initially asked to complete the survey via the Web mode rather than via the mail mode.


Hypothesis: New cohort sample members with portable email addresses will complete the survey via
the Web mode more quickly, and with fewer costly mail and CATI contacts, when they are assigned
to the Web start survey protocol rather than to the standard mail start contacting protocol.

4.2 Experimental Design and Sample Stratification

The eligible sample consisted of 2013 SDR new cohort sample members who provided non-".edu"
portable email addresses as well as a mailing address. The number of cases eligible for the experiment
was 1,806: 894 sample members from the 2010 SED cohort and 912 from the 2011 SED cohort.
The sample design incorporated stratification in order to create a balanced allocation between the
treatment and control groups. The stratification design was similar to that implemented in the
previously discussed experiments, but because the cases were new to the SDR, fewer stratification
variables were available for use. The following stratification variables were used, in order: (1) sample
component (national or international); (2) sample member location designated as in the U.S. or outside
the U.S., with regional breakdown as appropriate; and (3) mode in which the individual completed the
2010 or 2011 SED questionnaire. Serpentine sorting was utilized to maximize the similarity of
adjacent cases during assignment. After sorting, every other case was assigned to the treatment and
control groups. This achieved an implicit stratification so that there was no difference between the
treatment and control groups on the control variables.
Eligible cases were randomly assigned to the following treatment and control groups:
Control group: The control group followed the standard mail start mode contacting protocol,
starting with a questionnaire mailing.
Treatment group: Cases selected for the treatment group were assigned to a contacting protocol
that paralleled cases in the Web start mode, starting with an initial letter and email introducing
the survey and offering Web survey access.
All eligible cases had an equal probability of being assigned to either experimental group (treatment or
control group). Of the 1,806 cases, 903 were assigned to the control group (mail start contacting
protocol) and 903 to the treatment group (Web start protocol).

4.3 Contacting Protocol and Procedures

The multi-contact data collection protocol for the treatment and control groups is illustrated in Figure
4.1. The two tailored new cohort Web start contacting materials are included in Appendix C. All other
contacting materials used for the treatment group were the same as those used for the standard Web
start mode protocol. The control group followed the standard mail start mode protocol using the
contacting materials developed for new cohort sample members; the advance letter and first
questionnaire cover letter are included in Appendix C. While the initial questionnaire cover letter for
the control group in the “green appeal” experiment focused on only the paper questionnaire as the
potential completion mode, the cover letter for the control group in the new cohort experiment offered
SDR Web survey access in addition to the mail survey as a potential completion mode. As mentioned,
the inclusion of this information in the new cohort cover letter was based on the 2003 mode
experiment outcomes, which showed that this approach was effective for the new cohort (Hoffer et al., 2006).

Figure 4.1. New Cohort Contacting Schedules by Experiment Group

Treatment Group - Web Start Mode
  Week 0:      Initial Contact Letter tailored to new cohort*     15-Feb-13
  Week 1:      Initial Contact Email tailored to new cohort*      20-Feb-13
  Week 2:      Follow-up Letter                                   28-Feb-13
  Week 3:      Follow-up Email                                    5-Mar-13
  Weeks 5-7:   CATI prompt and follow-up                          19-Mar-13 to 3-Apr-13
  Week 8:      Questionnaire Mailing #1                           10-Apr-13
  Week 9:      Thank you/Reminder Postcard                        17-Apr-13
  Week 12:     Questionnaire Mailing #2                           8-May-13

Control Group - Mail Start Mode
  Week 0:      Advance Letter                                     11-Feb-13
  Week 1:      Questionnaire Mailing #1                           19-Feb-13
  Week 2:      Thank you/Reminder Postcard                        25-Feb-13
  Week 5:      Questionnaire Mailing #2                           21-Mar-13
  Week 7:      Prompting Letter w/Web access                      5-Apr-13
  Week 8:      Prompting Email w/Web access                       9-Apr-13
  Week 9:      CATI prompt                                        15-Apr-13 to 18-Apr-13
  Weeks 10-12: CATI follow-up                                     21-Apr-13 to 11-May-13

* Contact material tailored as part of the experiment treatment.

4.4   Analysis and Results

Analysis focused on the comparison of treatment and control groups for differences in survey yield, days
to respond, and "level of effort," measured by the number of USPS mailings sent, telephone contacts
administered, and email messages sent to each case. Since the treatment group for this experiment
started in a different mode and experienced a different data collection schedule than the control group,
this section also examines the 2013 completion mode and mode preference for 2015, similar to the
analysis done for the green appeal experiment. Survey yield rate differences were assessed using chi-square
tests for the categorical response/nonresponse outcome, while t-tests were used for the
continuous days-to-respond and level-of-effort outcomes.
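As a rough illustration of these tests (not the project's actual code), the comparisons could be run with scipy on a case-level data set. The DataFrame `cases` and its column names are assumptions for the sketch.

```python
# Hypothetical sketch: chi-square test for survey yield, t-test for days to complete.
# `cases` is an assumed case-level DataFrame with columns:
#   group ("control"/"treatment"), completed (0/1), days_to_complete (NaN if not complete).
import pandas as pd
from scipy import stats

def compare_groups(cases: pd.DataFrame) -> dict:
    # Chi-square test on the completed / not-completed counts by experiment group.
    counts = pd.crosstab(cases["group"], cases["completed"])
    chi2, p_yield, dof, _ = stats.chi2_contingency(counts)

    # Two-sample t-test on days to complete, restricted to completes.
    ctrl = cases.loc[(cases["group"] == "control") & (cases["completed"] == 1), "days_to_complete"]
    trt = cases.loc[(cases["group"] == "treatment") & (cases["completed"] == 1), "days_to_complete"]
    t_stat, p_days = stats.ttest_ind(ctrl, trt, equal_var=False)

    return {"yield_p": p_yield, "days_to_complete_p": p_days}
```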
Analysis Case Set
In order to most accurately analyze the results of this experiment, we chose to exclude from analysis
those cases that were unlikely to have experienced the initial protocol. The experiment sample includes
the cases sampled for the experiment as detailed in Section 4.2 above. From the selected experiment
sample, 283 control cases and 285 treatment cases were excluded from analysis because they were
locating problems either at the time of the initial contact or at some point during the data
collection period. These cases were unlikely to have received the scheduled protocol as outlined for the
experiment groups and were not good candidates for outcome analysis. Table 4.1 details the number of
cases selected for the experiment and the sample eligible for analysis.

Table 4.1.   New Cohort Experiment Sample and Analysis Eligibility

  2013 Experiment Group               Experiment Sample    Analysis Eligible
  Control Mail Start Mode                     903                 620
  Treatment Web Start Mode                    903                 618
  Total (Experiment cases)                  1,806               1,238

Prior to an examination of the experiment results by experiment group, it is useful to evaluate the utility
and accuracy of the starting email and USPS addresses of new cohort experiment sample members,
regardless of experiment group. For cases that had been found during the field period or were never in
locating, we compared the 2013 starting contacting information to the final 2013 contacting
information. Since these cases were part of the new cohort, the starting contacting information was
that which was reported in the SED. The final contacting information was that which was reported in
the 2013 survey (for those that provided it) or that which was determined to be the best contacting
information from the 2013 round. Our goal was to evaluate the stability of the portable emails for these
sample members compared to USPS addresses.
Table 4.2 shows that, regardless of whether or not a case went into locating in the 2013 round, a higher
proportion of cases ended the round with the same email than with the same USPS address they started
with. For cases never in locating in 2013, 72.9 percent of sample members started and ended the round
with the same email, compared to 55.6 percent for the USPS address. The difference is even more
pronounced for cases that went into locating at some point in the field period: 42.5 percent started and
ended the round with the same email, compared to 16.4 percent for USPS addresses. This demonstrates
the stability of these portable email addresses from the SED relative to USPS addresses and reinforces
the value of sending both emails and mailings simultaneously to get full coverage and make the most of
the starting contacting information.

Table 4.2.   New Cohort Experiment Evaluation of 2013 Starting Emails and USPS Addresses

                                               Starting contacting data compared to final contacting data (b)
                                                            (Percentage of Experiment Sample)
                                  Experiment            Email                         USPS
  2013 Locating Status (a)          Sample         Same      Different         Same      Different
  Found, ever in locating              426        42.5%        57.5%          16.4%        83.6%
  Never in locating                  1,238        72.9%        27.1%          55.6%        44.4%

  (a) This analysis excludes cases that finalized as locating problems. Elsewhere in this report, analyzed cases only included cases
      never in locating, but this analysis examines those cases as well as those that were located during the field period for
      illustrative purposes.
  (b) "Different" is defined as (1) a final address that was not the same as the starting address or (2) a no longer valid starting
      address (no final address available).
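A minimal tabulation along the lines of Table 4.2 might look like the following sketch; the column names (start_email, final_email, start_address, final_address, ever_in_locating) are assumptions for illustration.

```python
# Hypothetical sketch of the Table 4.2 stability comparison.
import pandas as pd

def percent_same(df: pd.DataFrame) -> pd.Series:
    out = {}
    for field in ["email", "address"]:
        start, final = df[f"start_{field}"], df[f"final_{field}"]
        same = final.notna() & (start == final)   # started and ended the round the same
        out[field] = round(100 * same.mean(), 1)  # percent of the experiment sample
    return pd.Series(out)

# Percent "same" computed separately for cases ever in locating vs. never in locating:
# table_4_2 = frame.groupby("ever_in_locating").apply(percent_same)
```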

Survey Yield and Days to Complete
Looking at survey yield rates at the end of the 26-week data collection period, Table 4.3 shows that both
the control and treatment groups had high survey yield rates (97.4 and 96.4 percent, respectively). A
chi-square analysis shows no significant difference between the survey yield rates of the two experiment
groups at the end of the data collection period.

Table 4.3.   New Cohort Experiment Survey Yield Rates by the End of the Data Collection Period

                                                          2013 Survey (Percentage of Analysis Sample)
  2013 Experiment Group                  Analysis Sample      Completed       Not Completed
  Control Mail Start Mode                      620              97.4%              2.6%
  Treatment Web Start Mode                     618              96.4%              3.6%
  Mean Difference (Control - Treatment)        N/A               1.0%*             -1.0%

  * not significant, p > 0.1

As noted in Section 3 for the green appeal, when evaluating an experiment to determine if a particular
group of sample members can be started more effectively in one mode compared to another, it is
important to look at other measures in addition to the survey yield rate at the end of the data collection
period, such as response over the course of the data collection period, days to complete, level of effort,
and survey completion mode and future mode preference.
Figure 4.2 plots the days to complete for experiment cases that completed the 2013 SDR, and Table 4.4
details the mean days to complete by experiment group. Days to complete was calculated as the number
of days it took for a respondent to complete the survey by any mode after being sent the first contact
that allowed them to complete the survey. For control cases, this first contact was the first
questionnaire (the advance letter only notified the sample member that the 2013 SDR was starting and
did not make an explicit request to complete the survey or provide a means to do so). For treatment
cases, this first contact was the initial email.
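A sketch of this days-to-complete calculation, assuming per-case contact and completion dates with hypothetical column names, is shown below.

```python
# Hypothetical sketch of the days-to-complete calculation described above.
import numpy as np
import pandas as pd

def days_to_complete(cases: pd.DataFrame) -> pd.Series:
    # Treatment cases are counted from the initial contact email; control cases
    # from Questionnaire Mailing #1 (the advance letter offered no way to respond).
    first_request = pd.Series(
        np.where(cases["group"] == "treatment",
                 cases["initial_email_date"], cases["questionnaire1_date"]),
        index=cases.index,
    )
    return (cases["complete_date"] - first_request).dt.days
```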
Figure 4.2 and Table 4.4 both show that cases in the new cohort experiment treatment group provided
the quickest response. Throughout data collection, the treatment group showed a higher number of
completes than the control group. However, as each group began experiencing the final scheduled
contact in its respective main protocol in weeks 8 and 9, the weekly rates of completion began to
converge. The treatment group had a significantly lower mean days to complete (nearly 17 fewer days)
than the control group. As mentioned previously, because the 2013 SDR asks respondents to consider a
specific reference date (1 February 2013) when answering certain questions about their occupation and
additional educational achievement, responses received closer to that reference date are assumed to be
more accurate because respondents do not have to think as far back to recall what they were doing. In
addition, the sooner a sample member completes the survey, the less follow-up, with its associated
costs, is required.

Figure 4.2. New Cohort Experiment Response over Time (Cumulative days to complete by experiment group)

[Figure: cumulative number of completes (0 to roughly 700) plotted against days to complete (0 to 177)
for the Mail Start Mode (Control) and Web Start Mode (Treatment) groups.]

Table 4.4.   New Cohort Experiment Mean Days to Complete

  2013 Experiment Group                  Overall Completes    Mean Days to Complete
  Control Mail Start Mode                       604                    44.9
  Treatment Web Start Mode                      596                    28.2
  Mean Difference (Control - Treatment)         N/A                    16.7*

  * p < 0.0001

Level of Effort and Associated Costs
Concerning follow-up effort, we can look at level of effort as defined by the number of emails sent,
telephone calls made, and mailings sent to the sample members. The treatment and control groups
started in different modes and received specific modes of contact in a different order, as shown in Figure
4.1. However, sample members eventually received contacts via all methods (email, telephone, and
USPS mail) if they did not respond to the initial contacts. Table 4.5 indicates that cases assigned to the
treatment group received significantly more emails and telephone calls on average, while control
cases received significantly more USPS mailings on average, mostly due to additional questionnaire
mailings.
Referring back to the contacting schedule in Figure 4.1, the last contact received by the control group is
prompting via telephone, and for treatment cases it is the questionnaire mailings. The findings in Table 4.5
correspond to the starting modes: on average, each group had more contacts from the mode in which it
started and the fewest from the mode in which it ended, per the contacting protocol. Because the mail start
protocol for the control group begins with two questionnaire mailings, and the Web start treatment
protocol ends with the questionnaire mailings, the control group received more questionnaires than the
treatment group. As a result, mailing costs are higher for the control group. In the same vein, the Web
start treatment group had more calls and thus higher telephone costs.
Table 4.6 shows that the control group had a higher mean cost of contacts (e.g., batch emails; letter,
postcard, and questionnaire USPS mailings; and telephone prompting). This is due to the average
numbers of questionnaire mailings and telephone prompts, both of which are higher-cost contacts. The
difference in the cost of contacts between the experiment groups was not significant, however.
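For illustration of how a per-case cost of contacts could be rolled up into the group means compared here, the sketch below uses entirely hypothetical unit costs and column names; the report does not publish its per-contact cost assumptions.

```python
# Hypothetical sketch: mean cost of contacts by experiment group.
import pandas as pd

# Assumed (not actual) unit costs per contact type, in dollars.
UNIT_COST = {"emails": 0.05, "letters": 1.00, "postcards": 0.75,
             "questionnaires": 5.00, "cati_calls": 8.00}

def mean_cost_by_group(effort: pd.DataFrame) -> pd.Series:
    # `effort` holds per-case counts of each contact type plus the experiment group.
    per_case_cost = sum(effort[kind] * price for kind, price in UNIT_COST.items())
    return per_case_cost.groupby(effort["group"]).mean()
```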

Table 4.5.   New Cohort Experiment Mean Level of Effort (Emails, Telephone Calls, USPS Mailings)

                                                                Mean Number of Contacts
                                         Analysis              Telephone   Overall USPS   Questionnaire
  2013 Experiment Group                   Sample     Emails      Calls        Mailings       Mailings
  Control Mail Start Mode                   620        0.7        1.4           3.3             1.6
  Treatment Web Start Mode                  618        1.9        2.1           2.2             0.4
  Mean Difference (Control - Treatment)     N/A       -1.2*      -0.7**         1.1*            1.2*

  * p < 0.0001
  ** p = 0.0018

Table 4.6.   New Cohort Experiment Mean Cost of Contacts by Experiment Group

  2013 Experiment Group                  Analysis Sample    Mean Cost of Contacts
  Control Mail Start Mode                      620                $18.17
  Treatment Web Start Mode                     618                $16.42
  Mean Difference (Control - Treatment)        N/A                $ 1.52*

  * not significant, p > 0.1

Completion Mode and Mode Preference
In order to more specifically assess whether the Web start protocol and the Web survey are the best fit
for the SDR new cohort with portable emails, it is important to look at the mode in which these sample
members completed the survey (end mode) and what they prefer for future rounds. Table 4.7 shows
the percentage of completes in each mode by experiment group. Regardless of the start mode, the
majority of new cohort sample members in the experiment completed the 2013 SDR via the Web;
however, sample members starting in the Web overwhelmingly completed via the Web (93.0 percent),
compared to approximately two-thirds of the control group.

Table 4.7.   New Cohort Experiment End Mode Comparison by Experiment Group

                                                          2013 End Mode (a) (Percentage of Completes)
  2013 Experiment Group                  Completes (any mode)     Web        Mail        CATI
  Control Mail Start Mode                        604             67.9%      29.1%        3.0%
  Treatment Web Start Mode                       596             93.0%       3.0%        4.0%
  Mean Difference (Control - Treatment)          N/A            -25.1%      26.1%       -1.0%

  (a) Web CIOs are considered Web completes and CATI CIOs are considered CATI completes.

Table 4.8 shows the mode preference expressed for the 2015 cycle by 2013 SDR respondents, by
experiment group (start mode). In both the control and treatment groups, the majority of respondents
stated a future mode preference of Web (73.7 and 79.3 percent, respectively). The detail showing the
mode preference of experimental cases by completion mode is not shown here, as it was for the green
appeal experiment, because of the small number of cases completing in modes other than the Web.
New cohort cases, regardless of start mode, were likely to complete via the Web and to express a
preference for the Web for the subsequent round of the SDR.

Table 4.8.   New Cohort Experiment Mode Preference by Start and End Mode

                                                      Mode Preference for 2015
  2013 Experiment Group           Completes      Mail        Web       CATI / No Preference / Not Answered
  All Experiment Cases               1,173        9.2%      76.5%                   14.3%
  Control Mail Start Mode              589       12.4%      73.7%                   13.9%
  Treatment Web Start Mode             584        6.0%      79.3%                   14.7%

Section 5: Summary of Results

5.1   Summary of Results

The information learned in the 2013 SDR contacting experiments will assist in determining the SDR data
collection protocol for cooperative sample members in the future.
“Email-only” Experiment
Because many cooperative panel members who prefer the Web mode have both an email address and a
USPS address on file, they are eligible to receive the initial Web start contact in both email and letter
form. We hypothesized that cooperative panel cases identified to start in the Web mode would respond
as well to a single email contact as to the email plus USPS letter combination used in prior rounds.
The results indicate that survey yield rates at the end of the study did not significantly differ
between the group that initially received a USPS letter and a notice email and the group that
initially received an email-only contact; however, the control group took significantly fewer days to
complete and had more completes in the first 9 weeks. While both experiment groups ultimately
responded to the survey equally well because of the follow-up contacts, the level-of-effort and cost
analyses revealed that the single initial email contact group did not require less effort and cost than the
standard protocol to achieve comparable response rates, as we had predicted. The savings gained by not
sending the initial USPS letter were therefore lost because more follow-up contact was required later.
Green Appeal Experiment
We hypothesized that cooperative SDR panel members who prefer to complete the survey by mail
would complete the survey by Web if provided a “green” reason to do so. The preceding analysis shows
that, even though all sample members selected for this experiment preferred the Mail mode
going into the 2013 SDR, the treatment led a majority within that group to complete by Web and to
indicate that they would prefer the Web for the next survey round. By the end of the data collection period,
there was no significant difference in rate of completion between the control and treatment groups;
however, cases in the Web treatment took fewer days to complete on average.
New Cohort Experiment
This experiment sought to test whether new cohort cases with portable emails could be successfully
started in the Web mode. Our analysis showed that new cohort sample members started in the Web
took fewer days to complete than those started in the mail group; although by the end of the field
period, there was no significant difference in the survey yield rates. The outcomes of experiment
demonstrated that new cohort sample members with portable emails can successfully be started in the
Web mode and in fact, even those that are started in the Mail group, are highly likely to complete and
prefer the Web.

Recommendations
Based on the contacting experiments implemented in 2013 and discussed here, we suggest a continued
focus on encouraging more people to complete the SDR in the Web mode, which yields quicker results
with fewer follow-up contacts. Specifically, we suggest that the initial contact for the Web start group
continue to include the complementary email and USPS letter contact. Further, we suggest that email
prompts sent after the initial contact also be accompanied by a complementary USPS letter where an
address is available. We also suggest encouraging more cooperative sample members to complete the
Web survey, even if they prefer the mail, using a green appeal or an equally salient reason for using the
Web mode. Other focused "appeals" may be explored in the future in order to maintain the positive
response. Finally, we suggest that new cohort sample members with portable emails be started in the
Web, as it appears to be a desirable and effective completion mode for that group.

References

Cantor, D., Han, D., Brick, P. D., Sigman, R., & Aponte, M. (2010). Findings from a Two-Phase Mail
Survey for a Study of Veterans. JSM Proceedings, Section on Survey Research Methods.
Converse, P. D., Wolfe, E. W., Huang, X., & Oswald, F. L. (2008). Response Rates for Mixed-Mode
Surveys Using Mail and E-mail/Web. American Journal of Evaluation, 29(1), 99-107.
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, Mail, and Mixed-Mode Surveys: The Tailored
Design Method. Hoboken, NJ: John Wiley and Sons.
Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2009). Survey
Methodology, 2nd edition. Hoboken, NJ: John Wiley and Sons.
Groves, R. M., Singer, E., & Corning, A. (2000). Leverage-Saliency Theory of Survey Participation:
Description and an Illustration. Public Opinion Quarterly, 64, 299-308.
Hoffer, T. B., Grigorian, K., Sederstrom, S., & Selfa, L. (2006). 2003 Survey of Doctorate Recipients:
Mode Experiment Analysis Report.
Kaplowitz, M. D., Hadlock, T. D., & Levine, R. (2004). A Comparison of Web and Mail Survey Response
Rates. Public Opinion Quarterly, 68(1), 94-101.
Shih, T-H., & Fan, X. (2007). Comparing Response Rates in Email and Paper Surveys: A Meta-Analysis.
Educational Research Review, 4, 26-40.

Appendix A: Email Only Experiment Materials

Control Group Initial Letter
February 15, 2013
Dear Dr. [LAST NAME],
We are requesting your participation in the 2013 Survey of Doctorate Recipients (SDR), sponsored by
the National Science Foundation (NSF) and the National Institutes of Health. You contributed to this
unique study of science, engineering, and health doctorate holders in the past. Thank you for your valuable
participation. Your answers provided government, business, and academic institutions with crucial
information concerning the availability of highly educated personnel in a variety of fields.
Please continue to participate by completing the survey online.
To access the 2013 SDR survey,
please go to the following secure URL address:

https://websurvey.norc.org/2013sdr

You will need a unique Personal Identification Number (PIN) and Password to access the survey.
Your PIN and Password are:

PIN:

[WEBPIN]

Password: [WEBPWD]
The information you provide will be collected by NORC at the University of Chicago, the survey contractor
conducting SDR on our behalf. These data will be kept strictly confidential and safeguarded in accordance
with the Privacy Act of 1974 and the Confidential Information Protection and Statistical Efficiency Act of
2002. Some participants find it easier to access the online survey from an email message, so we are also
sending you access to the survey to the email address we have on record for you.
If you have any questions regarding the survey, please contact NORC via the toll-free number or email
address listed below. Staff are available from 9 a.m. to 9 p.m. (Central Time) to assist you.
Thank you in advance for contributing to the SDR. We look forward to receiving your online survey.
Sincerely,

John R. Gawalt
Director
National Center for Science and Engineering Statistics
National Science Foundation

Control Group Initial Email
SUBJECT: A Request from the National Science Foundation
SENT: February 18/19/20, 2013
Dear Dr. [LNAME],
We are requesting your participation in the 2013 Survey of Doctorate
Recipients (SDR), sponsored by the National Science Foundation (NSF) and the
National Institutes of Health (NIH). You contributed to this unique study of
science, engineering, and health doctorate holders in the past - thank you!
The SDR provides government, business, and academic institutions with crucial
information concerning the availability of highly educated personnel in a
variety of fields. Results from earlier studies are available at the NSF
website listed below.
To access the 2013 SDR, please go to the following secure URL address:
https://websurvey.norc.org/2013sdr
Once there, enter your unique Personal Identification Number (PIN) and
Password to access the survey:
PIN: [WEBPIN]

Password: [WEBPWD]

The data you provide will be kept strictly confidential and safeguarded in
accordance with the Privacy Act of 1974 and the Confidential Information
Protection and Statistical Efficiency Act of 2002. On behalf of the NSF and
NIH, we have also sent survey access information to your mailing address, in
case this email address is no longer active.
If you have any questions regarding the survey, please contact NORC toll-free
at 1-800-685-1663 or respond to this email. Staff are available from 9 a.m.
to 9 p.m. (Central Time) to assist you. Or you can link to the SDR
Frequently Asked Questions at:
http://www.norc.org/sdr/sdr_faq.htm
Thank you in advance for contributing to the SDR. Your participation is
needed to help ensure the validity and accuracy of the survey data. We look
forward to receiving your online survey.
Sincerely,
Karen H. Grigorian
SDR Project Director
NORC at the University of Chicago
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
For assistance completing the survey:
Call NORC at 1-800-685-1663, email [email protected] or visit
www.norc.uchicago.edu/sdr.
For more information about the survey:
Visit www.nsf.gov/statistics/srvydoctoratework or write NSF at 4201 Wilson
Blvd, Suite 965, Arlington, Virginia 22230

Treatment Group Initial Email
SUBJECT: A Request from the National Science Foundation
SENT: February 18/19/20, 2013
Dear Dr. [LNAME],
We are requesting your participation in the 2013 Survey of Doctorate
Recipients (SDR), sponsored by the National Science Foundation (NSF) and the
National Institutes of Health. You contributed to this unique study of
science, engineering, and health doctorate holders in the past - thank you!
The SDR provides government, business, and academic institutions with crucial
information concerning the availability of highly educated personnel in a
variety of fields. Results from earlier studies are available at the NSF
website listed below.
To access the 2013 SDR, please go to the following secure URL address:
https://websurvey.norc.org/2013sdr
Once there, enter your unique Personal Identification Number (PIN) and
Password to access the survey:
PIN: [WEBPIN]

Password: [WEBPWD]

The data you provide will be kept strictly confidential and safeguarded in
accordance with the Privacy Act of 1974 and the Confidential Information
Protection and Statistical Efficiency Act of 2002.
If you have any questions regarding the survey, please contact NORC toll-free
at 1-800-685-1663 or respond to this email. Staff are available from 9 a.m.
to 9 p.m. (Central Time) to assist you. Or you can link to the SDR
Frequently Asked Questions at:
http://www.norc.org/sdr/sdr_faq.htm
Thank you in advance for contributing to the SDR. Your participation is
needed to help ensure the validity and accuracy of the survey data. We look
forward to receiving your online survey.
Sincerely,
Karen H. Grigorian
SDR Project Director
NORC at the University of Chicago
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
For assistance completing the survey:
Call NORC at 1-800-685-1663, email [email protected] or visit
www.norc.uchicago.edu/sdr.
For more information about the survey:
Visit www.nsf.gov/statistics/srvydoctoratework or write NSF at 4201 Wilson
Blvd, Suite 965, Arlington, Virginia 22230.

Appendix B: Green Appeal Experiment Materials

Control Group Advance Notice Letter
February 11, 2013
Dear Dr. [LAST NAME]:
In a few days, we will be requesting your participation in the 2013 Survey of Doctorate Recipients
(SDR), sponsored by the National Science Foundation (NSF) and the National Institutes of Health.
You have contributed to this unique study of doctorate holders in the past. Thank you—your
answers provided government, business, and academic institutions with crucial information
concerning the availability of highly educated personnel in a variety of fields.
The SDR has been conducted biennially since 1973 and is the only source of data on the
careers of science, engineering, and health doctorate holders from U.S. academic
institutions. The value of the information obtained over the years with the help of participants like
you is immeasurable. Your involvement in this ongoing effort helps to ensure the validity and
accuracy of the survey data. Results from earlier studies are available at the NSF website listed
below.
We will be sending you the questionnaire in the mail and will ask you to complete the 2013 SDR at
that time. NORC at the University of Chicago is the survey contractor conducting this survey on
our behalf. The letter accompanying the questionnaire will explain more about this survey and our
reasons for contacting you.
If you do not receive a questionnaire within two weeks or have any questions regarding this survey,
please contact NORC via the toll-free number or email address listed below. Staff are available from
9 a.m. to 9 p.m. (Central Time) to assist you.
I would greatly appreciate your continued participation in this significant effort.
Sincerely,

John R. Gawalt
Director
National Center for Science and Engineering Statistics
National Science Foundation

Control Group First Questionnaire Cover Letter
February 19, 2013
Dear Dr. [LAST NAME],
Thank you for your participation in the 2010 Survey of Doctorate Recipients (SDR). Over
the past 40 years, you and other SDR respondents have contributed to an irreplaceable collection of
information about doctorate holders. Your responses helped academic and government institutions
anticipate shortages in personnel, as well as make decisions about graduate student support and
funding for research and development. The information you provided has also proven valuable for
students who want to learn about the relationship between graduate education and careers.
At this time, we are asking for your participation in the 2013 SDR. The SDR is being
conducted by NORC at the University of Chicago on behalf of the National Science Foundation
(NSF) and the National Institutes of Health. Regardless of your employment situation—whether
you are working in or out of your doctoral field, are seeking employment, are retired, or are in
another situation—your response is vital. Important measures from this study include the number
of doctorate holders working, what fields they are working in, and how their career patterns change
over time. We can only learn this from you. Results from earlier studies are available at the NSF
website listed below.
Please complete the enclosed questionnaire and return it in the postage-paid envelope to
NORC.
If you have any questions about the survey, please contact NORC via the toll-free number or email
address listed below. Staff are available from 9 a.m. to 9 p.m. (Central Time) to assist you.
Thank you for your continued participation. We look forward to receiving your completed
survey.
Sincerely,

John R. Gawalt
Director
National Center for Science and Engineering Statistics
National Science Foundation

Treatment Group Initial Letter
February 15, 2013
Dear Dr. [LAST NAME],
We are requesting your participation in the 2013 Survey of Doctorate Recipients (SDR), sponsored by
the National Science Foundation (NSF) and the National Institutes of Health. You contributed to this
unique study of science, engineering, and health doctorate holders in the past. Thank you for your valuable
participation.
We are asking you to complete the 2013 SDR online rather than a form sent via the mail to promote a more
efficient and eco-friendly way to participate in the 2013 SDR. We hope you will support the NSF's efforts to
conserve resources.
Please continue to participate by completing the survey online.
To access the 2013 SDR survey,
please go to the following secure URL address:

https://websurvey.norc.org/2013sdr

You will need a unique Personal Identification Number (PIN) and Password to access the survey.
Your PIN and Password are:

PIN:

[WEBPIN]

Password: [WEBPWD]
The information you provide will be collected by NORC at the University of Chicago, the survey contractor
conducting SDR on our behalf. These data will be kept strictly confidential and safeguarded in accordance
with the Privacy Act of 1974 and the Confidential Information Protection and Statistical Efficiency Act of
2002.
Some participants find it easier to access the online survey from an email message, so we are also sending
you access to the survey to the email address we have on record for you. If you have any questions
regarding the survey, please contact NORC via the toll-free number or email address listed below. Staff are
available from 9 a.m. to 9 p.m. (Central Time) to assist you.
Thank you in advance for contributing to the SDR. We look forward to receiving your online survey.
Sincerely,

John R. Gawalt
Director
National Center for Science and Engineering Statistics
National Science Foundation

Treatment Group Initial Email
SUBJECT: A Request from the National Science Foundation
SENT: February 18/19/20, 2013
Dear Dr. [LNAME],
We are requesting your participation in the 2013 Survey of Doctorate Recipients
(SDR), sponsored by the National Science Foundation (NSF) and the National
Institutes of Health (NIH). You contributed to this unique study of science,
engineering, and health doctorate holders in the past - thank you!
We are asking you to complete the 2013 SDR online rather than a paper form sent
via the mail in an effort to promote a more efficient and eco-friendly way to
participate in the 2013 SDR. Even when a small portion of SDR participants choose
the online survey over the questionnaire, significant resources are conserved.
To access the 2013 SDR, please go to the following secure URL address:
https://websurvey.norc.org/2013sdr
Once there, enter your unique Personal Identification Number (PIN) and Password to
access the survey:
PIN: [WEBPIN]

Password: [WEBPWD]

The data you provide will be kept strictly confidential and safeguarded in
accordance with the Privacy Act of 1974 and the Confidential Information
Protection and Statistical Efficiency Act of 2002. On behalf of the NSF and NIH,
we have also sent survey access information to your mailing address, in case this
email address is no longer active.
If you have any questions regarding the survey, please contact NORC toll-free at
1-800-685-1663 or respond to this email. Staff are available from 9 a.m. to 9
p.m. (Central Time) to assist you. Or you can link to the SDR Frequently Asked
Questions at:
http://www.norc.org/sdr/sdr_faq.htm
Thank you in advance for contributing to the SDR. Your ongoing participation is
needed to help ensure the validity and accuracy of the survey results. We look
forward to receiving your online survey.
Sincerely,
Karen H. Grigorian
SDR Project Director
NORC at the University of Chicago
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
For assistance completing the survey:
Call NORC at 1-800-685-1663, email [email protected] or visit
www.norc.uchicago.edu/sdr.
For more information about the survey:
Visit www.nsf.gov/statistics/srvydoctoratework or write NSF at 4201 Wilson Blvd,
Suite 965, Arlington, Virginia 22230.

Treatment Group Follow-up Letter
February 28, 2013
Dear Dr. [LAST NAME],
We would like your help in completing the 2013 Survey of Doctorate Recipients (SDR). The SDR is
being conducted by NORC at the University of Chicago on behalf of the National Science Foundation (NSF)
and the National Institutes of Health.
The validity and accuracy of this study depend on your participation. Your response is vital wherever
you live, and whether you are working in or out of your doctoral field, are seeking employment, are retired, or
are in another situation. Because you were scientifically selected for the SDR, we cannot substitute any other
person for you.
Please take a few minutes to complete this important survey online using the access information
below. We are asking you to complete the survey online in an effort to make the SDR a more sustainable
and environmentally-friendly program. If you would rather complete this survey on the telephone or via a
paper questionnaire, please let us know by contacting the toll-free number or email address listed below. For
your convenience, this information has also been sent to the email address we have on record for you.
To access the 2013 SDR survey,
please go to the following secure URL address:

https://websurvey.norc.org/2013sdr

You will need a unique Personal Identification Number (PIN) and Password to access the survey.
Your PIN and Password are:

PIN:

[WEBPIN]

Password:

[WEBPWD]

The data you provide will be kept strictly confidential and safeguarded in accordance with the Privacy Act of
1974 and the Confidential Information Protection and Statistical Efficiency Act of 2002. Results from earlier
studies are available from the NSF website listed below.
If you have any questions regarding the survey, please contact NORC via the toll-free number or email
address listed below. Staff are available from 9 a.m. to 9 p.m. (Central Time) to assist you.
Thank you in advance for your cooperation in this important effort.
Sincerely,

John R. Gawalt
Director
National Center for Science and Engineering Statistics
National Science Foundation
P.S. If you have already completed the 2013 survey, thank you so much for your time!

Treatment Group Follow-up Email
SUBJECT: A Participation Request from the National Science Foundation
SENT: March 4/5, 2013
Dear Dr. [LAST NAME],
Please participate in the 2013 Survey of Doctorate Recipients (SDR). We are asking
you to complete the survey online in an effort to make the SDR a more sustainable
and environmentally-friendly program.
The SDR is being conducted by NORC at the University of Chicago on behalf of the
National Science Foundation (NSF) and the National Institutes of Health. Your
response will have a significant impact on the overall accuracy and ultimate
usefulness of the results.
To access the 2013 SDR survey, please go to the following secure URL address:
https://websurvey.norc.org/2013sdr
Use your unique Personal Identification Number (PIN) and Password to access the
online survey:
PIN: [WEBPIN]

Password: [WEBPWD]

For your convenience, we have also sent this information to your mailing address.
If you have any questions regarding the survey or would rather complete this
survey on the telephone or via a hardcopy questionnaire, please let us know by
contacting NORC toll-free at 1-800-685-1663 or responding to this email. Staff
are available from 9 a.m. to 9 p.m. (U.S. Central Time) to assist you. Or you can
link to the SDR Frequently Asked Questions at:
http://www.norc.org/sdr/sdr_faq.htm
Thank you in advance for your contribution to this important effort. We look
forward to receiving your questionnaire.

Sincerely,
Karen H. Grigorian
SDR Project Director
NORC at the University of Chicago
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
For assistance completing the survey:
Call NORC at 1-800-685-1663, email [email protected] or visit
www.norc.uchicago.edu/sdr.
For more information about the survey:
Visit www.nsf.gov/statistics/srvydoctoratework or write NSF at 4201 Wilson Blvd,
Suite 965, Arlington, Virginia 22230.

Appendix C: Start New Cohort in Web Experiment Materials

Control Group Advance Notice Letter
February 11, 2013
Dear Dr. [LAST NAME]:
In a few days, we will be requesting your participation in the 2013 Survey of Doctorate
Recipients (SDR), sponsored by the National Science Foundation (NSF) and the National
Institutes of Health. The SDR has been conducted biennially since 1973 and is the only source of
data on the careers of science, engineering, and health doctorate holders from U.S. academic
institutions. The survey, and the data it obtains, provides government, business, and academic
institutions with crucial information concerning the availability of highly educated personnel in a
variety of fields.
You were scientifically selected from a database that contains the name and degree information for
all individuals earning a science, engineering, or health doctorate in the U.S. As such, we cannot
substitute any other person for you. Your involvement in this ongoing effort will help ensure the
validity and accuracy of the survey data. Results from earlier studies are available at the NSF website
listed below.
We will be sending you the questionnaire in the mail and will ask you to participate in the 2013 SDR
at that time. NORC at the University of Chicago is the survey contractor conducting this survey on
our behalf. The letter accompanying the questionnaire will explain more about this survey and our
reasons for contacting you. All information you provide will be kept strictly confidential and
safeguarded in accordance with the Privacy Act of 1974 and the Confidential Information
Protection and Statistical Efficiency Act of 2002.
If you do not receive a questionnaire within two weeks or have any questions regarding this survey,
please contact NORC via the toll-free number or email address listed below. Staff are available from
9 a.m. to 9 p.m. (Central Time) to assist you.
I would greatly appreciate your cooperation in this significant effort.
Sincerely,

John R. Gawalt
Director
National Center for Science and Engineering Statistics
National Science Foundation

Control Group First Questionnaire Cover Letter
February 19, 2013
Dear Dr. [LAST NAME],
I am requesting your participation in the 2013 Survey of Doctorate Recipients (SDR), sponsored by
the National Science Foundation (NSF) and the National Institutes of Health.
You were scientifically selected from a database that contains the name and degree information for all
individuals earning a research doctorate in the U.S. As such, we cannot substitute any other person for
you. Your involvement in this effort will help ensure the validity and accuracy of the survey results.
Please complete the enclosed questionnaire and return it in the postage-paid envelope to NORC at the
University of Chicago—or complete an online survey by following the directions below.
To access the 2013 SDR survey online,
please go to the following secure URL address:

https://websurvey.norc.org/2013sdr

You will need a unique Personal Identification Number (PIN) and Password to access the survey.
Your PIN and Password are:

PIN:
[WEBPIN]
Password: [WEBPWD]

The information you provide will be collected at NORC. These data will be kept strictly confidential and
safeguarded in accordance with the Privacy Act of 1974 and the Confidential Information Protection and
Statistical Efficiency Act of 2002. If you have any questions about the survey, please contact NORC via the
toll-free number or email address listed below. Staff are available from 9 a.m. to 9 p.m. (Central Time) to
assist you.
We would greatly appreciate your cooperation in this significant effort. We look forward to receiving
your completed survey.
Sincerely,

John R. Gawalt
Director
National Center for Science and Engineering Statistics
National Science Foundation

Treatment Group Initial Letter
February 15, 2013
Dear Dr. [LAST NAME],
We are requesting your participation in the 2013 Survey of Doctorate Recipients (SDR), sponsored by
the National Science Foundation (NSF) and the National Institutes of Health. This survey of people who
have earned research doctorates in the United States has been conducted biennially since 1973, and provides
government, business, and academic institutions with crucial information concerning the availability of highly
educated personnel in a variety of fields. The SDR is the only source of data on this important
population.
You have been scientifically selected from a database that contains the name and degree information for all
individuals earning a science, engineering, or health doctorate in the U.S. We cannot substitute any other
person for you.
Please participate by completing the survey online.
To access the 2013 SDR survey,
please go to the following secure URL address:

https://websurvey.norc.org/2013sdr

You will need a unique Personal Identification Number (PIN) and Password to access the survey.
Your PIN and Password are:

PIN:
[WEBPIN]
Password: [WEBPWD]

The information you provide will be collected by NORC at the University of Chicago, the survey contractor
conducting SDR on our behalf. These data will be kept strictly confidential and safeguarded in accordance
with the Privacy Act of 1974 and the Confidential Information Protection and Statistical Efficiency Act of
2002.
Some participants find it easier to access the online survey from an email message, so we are also sending
you access to the survey to the email address we have on record for you. If you have any questions
regarding the survey, please contact NORC via the toll-free number or email address listed below. Staff are
available from 9 a.m. to 9 p.m. (Central Time) to assist you.
Thank you in advance for contributing to the SDR. We look forward to receiving your online survey.
Sincerely,

John R. Gawalt
Director
National Center for Science and Engineering Statistics
National Science Foundation

Treatment Group Initial Email
SUBJECT: A Request from the National Science Foundation
SENT: March 4/5, 2013
Dear Dr. [LNAME],
We are requesting your participation in the 2013 Survey of Doctorate Recipients
(SDR), sponsored by the National Science Foundation (NSF) and the National
Institutes of Health (NIH). This survey of people who have earned research
doctorates in the United States has been conducted biennially since 1973 and
provides government, business, and academic institutions with crucial information
concerning the availability of highly educated personnel in a variety of fields.
The SDR is the only source of data on this important population.
You have been scientifically selected from a database that contains the name and
degree information for all individuals earning a science, engineering, or health
doctorate in the U.S. We cannot substitute any other person for you, and ask you
to participate whether you are currently working in your chosen field or not.
To access the 2013 SDR, please go to the following URL address:
https://websurvey.norc.org/2013sdr
Once there, enter your unique Personal Identification Number (PIN) and Password to
access the survey:
PIN: [WEBPIN]
Password: [WEBPWD]
The data you provide will be kept strictly confidential and safeguarded in
accordance with the Privacy Act of 1974 and the Confidential Information
Protection and Statistical Efficiency Act of 2002. On behalf of the NSF and NIH,
we have also sent survey access information to your mailing address, in case this
email address is no longer active.
If you have any questions regarding the survey, please contact NORC toll-free at
1-800-685-1663 or respond to this email. Staff are available from 9 a.m. to 9
p.m. (Central Time) to assist you. Or you can link to the SDR Frequently Asked
Questions at:
http://www.norc.org/sdr/sdr_faq.htm
Results from earlier studies are available at the NSF website listed below.
Thank you in advance for contributing to the SDR. We look forward to receiving
your online survey.

Sincerely,
Karen H. Grigorian
SDR Project Director
NORC at the University of Chicago
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
For assistance completing the survey:
Call NORC at 1-800-685-1663, email [email protected] or visit
www.norc.uchicago.edu/sdr.
For more information about the survey:
Visit www.nsf.gov/statistics/srvydoctoratework or write NSF at 4201 Wilson Blvd,
Suite 965, Arlington, Virginia 22230.


File Typeapplication/pdf
AuthorProudfoot, Steven L
File Modified2015-06-19
File Created2015-06-19

© 2024 OMB.report | Privacy Policy