2008 Survey of Doctorate Recipients (SDR)
OMB: 3145-0020
B. Collection Of Information Employing Statistical Methods
1. Respondent Universe and Sampling Methods
The population for the 2008 SDR will be selected from approximately 795,000 individuals in the Doctorate Records File (DRF), which is a census of research doctorates awarded by U.S. institutions since 1920. The DRF is compiled through the annual Survey of Earned Doctorates (SED).
The sample design for 2008 will be consistent with the sample redesign developed and implemented in the 2003 SDR and retained in the 2006 SDR. To be eligible for the 2008 SDR target population, individuals must:
1) have received a doctoral degree in science, engineering, or health from a U.S. institution between 1958 and the 2007 academic year;
2) have indicated on the SED a plan to stay in the U.S. after receiving the doctorate;
3) be under age 76; and
4) be living in the U.S. as of October 1, 2008 (the new survey reference date).
For 2008, a sample will be selected from the new 2006-2007 doctoral cohort groups and added to
the longitudinal sample (which covered graduates through 2005) that is conveyed from cycle to
cycle. To offset this new cohort addition and to limit the overall sample size, a maintenance cut will
be performed on the longitudinal sample.
There are two types of SEH doctorate recipients who have been excluded from the eligible SDR
sample frame: 1) non-U.S. citizens who reported plans in the SED to leave the U.S. after earning
their science, engineering or health doctorate were considered permanently ineligible for SDR
sample selection; 2) non-U.S. citizens who were selected into the SDR but who had been found to
reside outside of the U.S. for two or more survey cycles were also considered permanently
ineligible for the SDR. In both cases, it is possible that the individuals assumed to reside outside the
U.S. were actually living in the U.S. on the survey reference date and should thus have been
classified as eligible. These additional cases will be asked to participate in the 2008 SDR as a
separate subsample group, named the International Survey of Doctorate Recipients (ISDR), thus
making the overall SDR sample more inclusive and representative. The sample size for the ISDR is
approximately 2,600. These cases will be fielded alongside the 40,000 cases in the 2008 SDR sample and will be subject to the same sampling procedures, data collection protocol, and data processing treatment.
The targeted overall weighted response rate for the 2008 SDR is 85 percent. The plan for maximizing the response rate is presented in Section 3.
2. Statistical Procedures
As mentioned in the previous section, the 2008 sample design will be consistent with the 2003 and
2006 SDR sample designs. Stratification variables for the sample include: demographic group,
field of doctorate, and sex. The demographic group is a composite variable recording disability
status, race/ethnicity, and citizenship at birth (U.S. or foreign).
The 2008 SDR sample will be selected using sampling strata based on a multi-way cross of the
stratification variables. (See Attachment 5 – 2008 SDR/ISDR Sample Strata and Sample
Allocation Table). For 2008, a sample will be selected from the new 2006-2007 doctoral cohort
groups and added to the longitudinal sample that is conveyed from year to year. To offset this new
cohort addition and to limit the overall sample size, a maintenance cut will be performed on the
longitudinal sample. The SDR sample size and sample design ensure NSF will maintain the ability
to produce the small demographic/degree field estimates that are needed for the Congressionally
mandated report on Women, Minorities and Persons with Disabilities in Science and Engineering
(see 42 U.S.C. 1885d). The 2008 ISDR sample is, like the SDR, drawn from the DRF. However,
the stratification, with ten strata defined by race/ethnicity and gender, is simpler than that for the
SDR. Within each stratum, the frame will be sorted by degree field for implicit stratification prior
to systematic selection.
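As an illustration of this selection approach, the sketch below shows systematic selection from a frame that has been sorted by degree field within a stratum. It is a minimal sketch in Python; the record layout, field names, and sample size are assumptions for illustration only, not the actual SDR/ISDR frame or allocation.

    import random

    def systematic_sample(frame, n_sample, sort_key, seed=None):
        """Sort the frame by the implicit stratification key, then take every
        k-th record from a random start, where k = frame size / sample size."""
        rng = random.Random(seed)
        ordered = sorted(frame, key=sort_key)      # implicit stratification
        interval = len(ordered) / n_sample         # sampling interval k
        start = rng.uniform(0, interval)           # random start in [0, k)
        picks = [int(start + i * interval) for i in range(n_sample)]
        return [ordered[p] for p in picks]

    # Illustrative use within one ISDR stratum: sort by degree field, then select.
    stratum_frame = [
        {"id": 1, "degree_field": "chemistry"},
        {"id": 2, "degree_field": "physics"},
        {"id": 3, "degree_field": "biology"},
        {"id": 4, "degree_field": "engineering"},
        {"id": 5, "degree_field": "mathematics"},
        {"id": 6, "degree_field": "psychology"},
    ]
    sample = systematic_sample(stratum_frame, n_sample=3,
                               sort_key=lambda r: r["degree_field"], seed=42)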
Estimates from the 2008 SDR/ISDR will be based on standard weighting procedures. As with sample selection, the weighting adjustments will be made separately for the old and new cohorts. Each case will have a base weight, defined as the inverse of its probability of selection into the 2008 SDR/ISDR sample; this base weight will reflect the differential sampling rates across strata. For the old cohorts, the base weight will be equal to the final weight from the previous survey cycle. The final analysis weights will be calculated in three stages:
1) First, a base weight will be calculated for every case in the sample to account for its selection
probability under the sample design.
2) Second, an adjustment for unknown eligibility will be made to the base weight by distributing
the weight of the unknown eligibility cases to the known eligibility cases proportionately to the
observed eligibility rate within each adjustment class.
3) Third, an adjustment for nonresponse will be made to the adjusted base weight to account for
the eligible sample cases for which no response was obtained.
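As a rough illustration of these three stages, the following Python sketch applies them within a single weighting adjustment class. The field names, statuses, and data structure are assumptions for illustration, not the actual SDR weighting specifications; in practice the adjustments are computed within each adjustment class rather than over the whole file.

    def analysis_weights(cases):
        """Illustrative three-stage weighting for one adjustment class:
        base weight, unknown-eligibility adjustment, nonresponse adjustment."""
        # Stage 1: base weight = inverse of the selection probability.
        for c in cases:
            c["w"] = 1.0 / c["p_select"]

        # Stage 2: redistribute the weight of unknown-eligibility cases
        # to the cases whose eligibility is known.
        known = [c for c in cases if c["status"] in ("eligible", "ineligible")]
        unknown_w = sum(c["w"] for c in cases if c["status"] == "unknown")
        known_w = sum(c["w"] for c in known)
        factor = (known_w + unknown_w) / known_w
        for c in cases:
            c["w"] = c["w"] * factor if c["status"] != "unknown" else 0.0

        # Stage 3: transfer the weight of eligible nonrespondents
        # to the eligible respondents.
        eligible = [c for c in cases if c["status"] == "eligible"]
        resp_w = sum(c["w"] for c in eligible if c["responded"])
        elig_w = sum(c["w"] for c in eligible)
        for c in eligible:
            c["w"] = c["w"] * elig_w / resp_w if c["responded"] else 0.0
        return cases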
Replicate Weights. A set of replicate weights based on the Balanced Repeated Replication (BRR)
method will also be constructed. The entire weighting process applied to the full sample will be
applied separately to each of the replicates to produce a set of replicate weights for each record.
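A minimal sketch of the replicate-weight construction step follows, assuming variance strata with two half-sample units each and a small Hadamard matrix; production BRR typically uses a larger Hadamard matrix, may apply Fay's adjustment, and re-runs the full set of weighting adjustments on each replicate. The variable names below are illustrative assumptions.

    import numpy as np

    # Illustrative 4x4 Hadamard matrix; row r defines the half-sample pattern for replicate r.
    H = np.array([[ 1,  1,  1,  1],
                  [ 1, -1,  1, -1],
                  [ 1,  1, -1, -1],
                  [ 1, -1, -1,  1]])

    def brr_replicate_weights(full_weights, variance_stratum, half_sample_unit):
        """full_weights: array of full-sample weights.
        variance_stratum: stratum index (0..3 here) for each case.
        half_sample_unit: 0 or 1, the case's half-sample within its stratum.
        Returns an array of shape (number of replicates, number of cases)."""
        replicates = []
        for row in H:
            # In each replicate, one half-sample per stratum is doubled, the other zeroed.
            keep = np.array([row[s] == (1 if u == 0 else -1)
                             for s, u in zip(variance_stratum, half_sample_unit)])
            replicates.append(np.where(keep, 2.0 * np.asarray(full_weights), 0.0))
        return np.array(replicates)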
Standard Errors. The BRR method will be used to estimate the standard errors of the 2008 SDR
estimates as in the past. The variance of a survey estimate based on any probability sample may be
estimated by the method of replication. This method requires that the sample selection, the
collection of data, and the estimation procedures be independently carried through (replicated)
several times. The dispersion of the resulting estimates then can be used to measure the variance of
the full sample.
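In the usual BRR formulation (a standard textbook expression, not quoted from the SDR documentation), if \hat{\theta} is the full-sample estimate and \hat{\theta}_r is the estimate computed with the r-th of R sets of replicate weights, the variance estimate and standard error are:

    \hat{V}_{BRR}(\hat{\theta}) = \frac{1}{R} \sum_{r=1}^{R} \left( \hat{\theta}_r - \hat{\theta} \right)^2,
    \qquad
    SE(\hat{\theta}) = \sqrt{\hat{V}_{BRR}(\hat{\theta})}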
3. Methods to Maximize Response
Maximizing Response Rates
The weighted response rate for the 2006 SDR was 79 percent. Extensive locating efforts, follow-up survey procedures, and targeted data collection protocols will be used to maximize the survey response rate, with the goal of maintaining at least an 80 percent response rate and reaching the 85 percent target in 2008. Additionally, monetary incentives are also being planned, building
on experiments conducted in the 2003 and 2006 rounds. Once the details of these plans are
finalized, NSF will submit a proposal for OMB approval.
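For reference, a weighted response rate of the kind targeted above is a base-weighted ratio of responding cases to eligible sample cases. The sketch below is illustrative only; the field names are assumptions, and the official SDR rate may handle unknown-eligibility cases differently.

    def weighted_response_rate(cases):
        """Sum of weights of eligible respondents divided by the sum of
        weights of all eligible cases (illustrative sketch)."""
        resp_w = sum(c["w"] for c in cases if c["eligible"] and c["responded"])
        elig_w = sum(c["w"] for c in cases if c["eligible"])
        return resp_w / elig_w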


Contact information for the sample members, and for people likely to know their whereabouts, obtained from the 2006 SDR and from the 2006 and 2007 SED, will be used to locate sample members in 2008.
The U.S. Postal Service's (USPS) automated National Change of Address (NCOA) database will be used to update addresses for the sample. The NCOA database, which is updated at least biweekly, incorporates all change-of-name/address orders submitted to the USPS nationwide; vendors also maintain up to 36 months of historical records of previous address changes. The NCOA will also be used to track persons who have moved from their previous address by the time of the 2008 survey. The names and addresses of mail nonrespondents will be matched to the most recent NCOA address updates through a vendor who appends telephone numbers.
The locating efforts will also utilize a specially trained locating team that has proven successful at searching for and finding nonrespondents with problem addresses or telephone numbers. Their locating strategy will include contacting employers, educational institutions, and alumni associations, as well as online publication searches, change-of-address searches, Directory Assistance, and administrative record searches. In addition to the last known address, locators have past contact information available as far back as 2001. Locators will also have access to contact names and addresses given by respondents in past survey rounds, where available. An automated commercial telephone number matching service and the national death registry will also be used.
As described above, NORC will continue to incorporate the Web mode in the data collection
protocol to improve both data completeness and sample members’ satisfaction.
A core set of contact materials (Prenotice Letter, Thank You/Reminder Postcard, and Cover Letters
accompanying the SAQ) will be used in mailings to the SDR sample members. These contact
materials will be tailored to address the particular issues or concerns of the sample groups to whom
they are targeted. Tailoring will be based on cohort (2006 Panel member versus new cohort),
response in the past round, citizenship, and expressed mode preference. NORC will also utilize
email versions of the contacting materials for sample members with email addresses on file.
NORC will conduct extensive CATI follow-ups for those sample members who do not submit a
completed questionnaire via a paper or Web form. The CATI Interviewing team will include
Refusal Avoidance and Conversion specialists who have a proven ability to work with sample
members to obtain their consent and participation.
4. Testing of Procedures
Because data from all three SESTAT surveys are combined into a unified data system, the surveys
must be closely coordinated to provide comparable data from each survey. Most questionnaire
items in the three surveys are the same.
Although there will be no new questions in the 2008 SDR questionnaire, all content items in the SESTAT questionnaires underwent extensive review and testing before they were included in the final version. The changes made to the questionnaires resulted from a variety of activities, including an extensive review of the entire content of each SESTAT survey questionnaire and additional research on specific items to inform the final decisions on item placement and wording. Content evaluation and testing activities for the 2003 and 2006 surveys included:


• External and internal consultation with questionnaire design experts on questionnaire layout and
formatting to improve user-friendliness and minimize respondent reporting errors;
• External consultation on improving the messages in the survey contact materials; and
• A two-stage pretest of the survey questionnaires consisting of mail and telephone.
All of these activities contributed to the development of the questions in the 2008 SDR
questionnaire.
Survey Questionnaire Review and Research
The SESTAT survey questionnaire items are divided into two types of questions: core and module.
Core questions are defined as those considered to be the base for all three SESTAT surveys. These
items are essential for sampling, respondent verification, basic labor force information, and/or
robust analyses of the science and engineering workforce in the SESTAT integrated data system.
They are asked of all respondents each time they are surveyed, as appropriate, to establish the
baseline data and to update the respondents’ labor force status and changes in employment and
other demographic characteristics. Module items are defined as special topics that are asked less
frequently on a rotational basis of the entire target population or some subset thereof. Module items
tend to provide the data needed to satisfy specific policy, research or data user needs.
After identifying the core and module items that would be included in the SESTAT surveys, SRS
reviewed and identified content items needing improvement, and engaged in research to craft new
questions. SRS conducted separate studies on six core items, and one study on a module for the
2003 survey questionnaires. The core item research covered the following topics on the SESTAT
questionnaires: employer’s main business, academic positions, academic institutions, work
activities, marital status, and degrees earned abroad. Based on the external consultations (See
Section A.8), a study was conducted to develop a module to capture more information on
postdoctoral employment histories in the SDR, which was included on the 2006 SDR.
The core item research resulted in some wording changes to those questions on the SESTAT
questionnaires, and a revision of how the occupation code frame is presented. The module research
led to the addition of a series of questions on postdoctoral employment for up to three postdoctoral
positions in the 2006 SDR questionnaire. The 2008 SDR questionnaire will not include any questions that have not been fielded previously.
For 2008, the SDR questionnaire content will be revised from 2006 as follows:
• Survey reference date changed from April 1, 2006 to October 1, 2008.
• Removed a 2006 module on collaborative activities (it has not yet been decided if this will be
rotated back in at a future time).
• Rotated out a module on postdoctoral history, which was asked in 1995 and 2006.
• Rotated in a module on second job (status, job description, job category, relatedness of second job to highest degree), which was asked from 1993 to 2001.
• Rotated in a module on the respondent’s and spouse’s areas of technical expertise, which was asked from 1993 to 2003.
A complete list of questions proposed to be added, dropped, or modified in the 2008 SDR
questionnaire is included in Attachment 6.


The 2008 SDR questionnaire retains all content changes that were tested and implemented in the 2006
SESTAT questionnaires. In 2005, SRS conducted an extensive pretest under a generic clearance
(OMB No. 3145-0174) that consisted of two phases: (1) two rounds of in-depth cognitive
interviews, and (2) a small-scale field test of the mail questionnaires.
Pretest Phase I – Cognitive interviews
Mathematica Policy Research, Inc. (MPR) and the U.S. Census Bureau (Survey Research Division)
were contracted to conduct in-depth cognitive interviews on the 2006 SDR and the other two
SESTAT survey questionnaires. Cognitive interviews were conducted in two waves, with the
waves being scheduled during the same time period at MPR and the Census Bureau. MPR tested
the full-length questionnaires for the three surveys, while the Census Bureau was asked to focus on
the employment section of the NSCG (which is the same as is used in the SDR). In addition to the
questionnaires, the cognitive interviews were also used to test improvements to the cover letters for
the 2006 survey administration.
The first round of cognitive interviews was conducted between February 2 and February 25, 2005.
During this period MPR and Census Bureau each interviewed 30 respondents. The second round of
cognitive interviews was conducted between March 25 and May 2, 2005. MPR interviewed 40
respondents (28 in-person and 12 via telephone) and the Census Bureau interviewed 30
respondents. Based on the results of the cognitive interviews, MPR and NSF worked together to
develop a series of experiments to test in the mail portion of the pretest.
Pretest Phase II – Mail Field Test
The field test consisted of two mailings of SDR and the other two SESTAT surveys with a reminder
postcard in between; no further nonresponse follow-up was conducted due to time constraints. The
NSCG mail pretest included a sample of 1,500 cases selected from a commercial list of 5,000 names of bachelor’s degree holders between the ages of 21 and 75, with address, sex, age, and occupation information. To mimic the proportion of science and engineering cases from the 1995 NSCG,
MPR selected 15 percent of the cases from computer occupations, 20 percent from engineering
occupations, and 65 percent from other occupations for a total of 1,500 sample members. Each
sample member was randomly assigned to one of four control or experimental groups.
Pretest questionnaires were mailed on June 24, 2005 using first class mail. Although mailing a
reminder was not part of the original pretest plan, a postcard reminder was sent to all nonrespondents because of the low response (12 percent) to the first mailing. The postcard was mailed
on July 20, 2005, and boosted the response rate by about 2 percentage points, for a 14 percent cumulative overall response rate to the first mailing across all three SESTAT surveys. A second mailing was sent on August 3, 2005, in a Priority Mail envelope with a cover letter urging participation by a “respond by” date. Mail returns were accepted until August 26, 2005. The final response rate to the NSCG mail pretest was about 25 percent; the final response rate across all three surveys was 27 percent.
The primary goal of the field pretest was to test the various recommended questionnaire changes
from the cognitive interviews. Specific test conditions were incorporated to obtain research data
that might further improve the questionnaires. These are described below:
1) Testing the placement of the sample person’s name and address label on the questionnaire
(front versus back cover).
2) Testing the Field of Study and Job Category Code Lists in a new format.
3) Testing a different approach to “anchoring” the reference date in the employment questions.
4) Testing a new wording and format of the principal employer type question.
In addition, the experimental versions of the questionnaires had small wording and formatting
changes for some questions of interest such as work activity categories, employer name and
location, supervising, etc. The control versions of the questionnaire retained the same wording for
most questions of interest and Field of Study/Job Category Code Lists used in 2003. Testing the
label placement by the presence versus absence of the content changes created a two-by-two design,
shown in the table below.
Mail Pretest Design

                      Content, Anchor, and Code List
  Address Label       Old Content (Control)        New Content (Experimental)
  Back                Questionnaire Version 1      Questionnaire Version 3
  Front               Questionnaire Version 2      Questionnaire Version 4

The mail pretest also included testing of a new 2006 module on the method and means of collaboration; the use of “Yes/No” response options in the few remaining questions that had used the “Mark All That Apply” response options in 2003; and the relocation of the part-time employment questions to a different section along with a revision of the work-related training reasons, to fine-tune the measurement of the concepts for these two items.
Based on the mail pretest results, decisions were made to keep the sample person’s name and address label on the front cover of the questionnaire; use the revised wording and format of the employer sector question; use the new Field of Study/Job Category Code Lists; no longer use the “Mark All That Apply” response option; and not use the reference week “anchoring” question but instead use consistent question wording in all references to the principal job.
Survey Contact Materials
Survey contact materials will be tailored to best fit sample members’ need for information about the SDR and to gain their cooperation. Materials requesting sample member participation via the Web survey will include instructions for accessing the survey online. As in the 2003 and 2006 SDR, NSF and NORC will develop 2008 SDR letterhead stationery that includes project and NSF website information, and NORC’s project toll-free telephone line, USPS address, and email address. Additionally,
the stationery will contain a watermark that shows the survey’s logo to help brand the
communication for sample members for ease of recognition.
Questionnaire Layout
SRS has previously engaged the services of Dr. Don Dillman to further improve the visual
presentation of the 2003 and 2006 SESTAT questionnaires. An SRS staff member with expertise in
visual design theory was also involved in this process. The suggested revisions to the
questionnaires included the standardization and consistent use of formatting, placement of
instructions, and placement of Privacy Act notices. Items whose format requires the respondent to review a long list before reporting a response were also revised from previous versions to make the selection process easier for respondents.
Web-Based Survey Instrument
Because of technological improvements and the wide proliferation of Internet users, offering a Web
option to SDR respondents has become both feasible and desirable. The Web mode has the
potential to become a valuable asset to the survey with regard to decreased cost and enhanced
respondent satisfaction. In the 2003 SDR, this new mode was carefully introduced to avoid having
a negative impact on the response rate or the high data quality that the SDR project has realized
over the years.
The 2008 SDR will maintain the same functionality and software design as used in the 2003 and
2006 survey rounds. However, due to questionnaire changes, it will be necessary to recode some
portions of the instrument. This development will take place during Summer 2008, and full testing
of the reinstated questions as well as the entire instrument will be completed during August and
September 2008.
2006 SDR Survey Methodology Tests
Contacting Experiments Analysis
This report details the three contacting experiments that were implemented during the 2006 SDR.
The three experiments included in the report are 1) the Brochure Experiment, 2) the Cover Letter
Experiment, and 3) the Endorsement Letter Experiment. Each of these is briefly detailed below.
A. The Brochure Experiment
The Brochure Experiment was developed to help determine the most effective means of gaining
cooperation of new cohort sample members. In the 2006 SDR, the new cohort sample consisted of
the three most recent SED cohorts, 2003, 2004 and 2005. Because the 2005 SED cohort was not
available until after the start of the 2006 SDR field period, the new cohort sample was selected and
fielded in two stages. New cohort cases sampled from 2003 and 2004 SED were available at the
start of the data collection effort, and these cases are referred to as the first stage new cohort
sample. This contacting experiment was conducted on the first stage new cohort sample so that the
results could be used to help inform the best way to contact the second stage new cohort sample. In
the past, the SDR sent a Frequently Asked Questions (FAQ) brochure to all new cohort sample
members with an advance letter at the start of data collection. The FAQ brochure is a tri-fold
brochure that addresses sample members’ concerns about survey participation. Thus, the FAQ
brochure with an advance letter served as the control treatment for this experiment. The two
treatments were 1) including a Flyer brochure, which was a shorter, less detailed brochure with the
same advance letter instead of the FAQ, and 2) sending the advance letter without any type of
brochure.
Summary results: Excluding the FAQ brochure from the new cohort mailing (treatment 2) had a significantly positive effect on the response rate.
B. The Cover Letter Experiment
The purpose of the Cover Letter Experiment was to test whether sample members who previously
refused to participate in the survey would respond better to different versions of the questionnaire
cover letter. More specifically, we wanted to see whether this group of past refusers would respond
better to cover letters with an “authoritative” appeal or an “altruistic” appeal. The SDR
traditionally utilizes an altruistic appeal in its letters and NORC was interested to see whether a
firmer tone would be more effective in persuading past-refusers to participate in the survey.
Summary results: The response rate for the group receiving the “authoritative” letter was slightly lower than for the group receiving the “altruistic” letter, but the difference was not significant.
C. The Endorsement Letter Experiment
The Endorsement Letter Experiment sought to increase response to a questionnaire mailing sent to
all nonrespondents to the initial 2006 contact (whether by mail, CATI or web). The endorsement
letters were included in a questionnaire mailing along with a cover letter. The endorsement letters
were from 10 different professional organizations encouraging sample members in their particular
field to participate in the 2006 SDR (e.g., an endorsement letter from the American Psychological
Association was sent to psychology doctorates). The results of similar mailings in the 2003 round,
when no endorsement letters were sent, were used as a control for this experimental treatment.
Summary results: Including an endorsement letter with the SDR mailings had a significantly
negative impact on response.
Mode Assignment Analysis
The 2003 SDR included a starting mode experiment and the questionnaire included a mode
preference question. In 2006 SDR, three different data collection modes were available at the start of
data collection. The three different starting modes were 1) a paper self-administered questionnaire
sent in the mail (SAQ), 2) a computer-assisted telephone interview (CATI), and 3) a self-administered
online survey (Web). Using mode preference information reported in the 2003 SDR and
response information from the 2003 SDR mode experiments, the 2006 selected sample was assigned
to various starting mode data collection protocols. Old cohort sample members who responded to the
2003 SDR were stratified by explicit (their stated preference) or implicit (if no stated preference, the
mode by which they responded) mode preference, and the cases were assigned to start mode
accordingly. Explicit responses were determined by the answer to the mode preference question on
the 2003 SDR survey; for those that did not respond to the preference question or indicated no
preference, implicit preference was defined as the mode they used to complete the 2003 SDR. 2003
SDR non-respondents were assigned a starting mode based on analysis conducted on the 2003 data
which indicated that past refusals are more likely to cooperate if started in the SAQ mode and other
non-respondents were most likely to cooperate if started in the Web mode. All new cohort members
were assigned to the CATI mode; this decision was also based on analysis conducted on the 2003
SDR data. Those sample members that were living abroad and who had not completed the 2003 SDR
were started in the Web mode to decrease mailing costs for sample members most likely to be
ineligible for the 2003 SDR. Those without any physical or e-mail address were started in CATI.
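The starting-mode assignment rules described in this paragraph can be summarized in a short decision sketch. The ordering of the conditions and the field names below are assumptions made for illustration; the actual 2006 assignment rules were applied to the sample as described above.

    def assign_starting_mode(case):
        """Illustrative sketch of the 2006 SDR starting-mode assignment rules."""
        if case.get("new_cohort"):
            return "CATI"
        if case.get("responded_2003"):
            # Honor the explicit preference stated in 2003; otherwise fall back
            # to the implicit preference (the mode used to complete the 2003 SDR).
            return case.get("stated_preference") or case.get("mode_2003")
        # Remaining cases are 2003 SDR nonrespondents.
        if not case.get("has_postal_address") and not case.get("has_email"):
            return "CATI"
        if case.get("living_abroad"):
            return "Web"
        if case.get("refused_2003"):
            return "SAQ"
        return "Web"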
The Mode Assignment Analysis report documents the results of the 2006 SDR starting mode
assignments. The 2006 SDR results were compared to the results from the 2003 SDR at the case
level for the panel, and in the aggregate for the panel nonrespondents and the new cohort. The
outcomes of interest include response rates, level-of-effort indicators, response time intervals, and
data quality measures. Analysis examined these outcomes by demographic variables of sex,
citizenship, doctorate field, ethnicity/race and age, and also by locating status.
Summary results: Assessment of the 2006 SDR mode assignment for particular groups revealed the
following about the 2006 data collection approach:
• Honoring explicit and implicit mode preference was an effective strategy for the 2006 SDR. While following this strategy did not affect the response rate or the number of contacts required to achieve that response, it did improve both the time to respond and the quality of the data provided.

• Assigning panel members who refused to cooperate in the prior survey cycle to the mail starting mode kept the response from this type of case consistent from 2003 to 2006. However, it required a greater level of effort to maintain the response rate for this group of cases. And while the response rate is generally low for this group, the data provided by panel members who were converted was of a higher quality in 2006.

• Locating problem cases were assigned to the Web starting mode in the 2006 SDR. This strategy appears to be the most effective for yielding a positive response with a lower level of contacting effort, based on an analysis of the number of times these cases needed to be contacted.

• Ineligibles and other nonresponse panel cases from the 2003 SDR were also assigned to the Web starting mode in the 2006 SDR. The 2006 SDR data collection results for this small group of cases are less clear. The response rate improved for the ineligible cases in 2006 and remained static for the other nonresponse cases. However, the level of effort required to achieve these results increased considerably. While this is understandable for the ineligible cases, which are largely emigrants who have returned to the U.S., it is not clear why it should increase for other nonresponse cases. Potentially, the other nonresponse cases would respond more readily in another mode.

• Finally, new cohort cases were assigned to the CATI starting mode protocol in the 2006 SDR. This approach worked well for the new cohort cases that were missing sampling stratification variables from the SED, but did not appear to work as well for the new cohort cases with complete stratification variables. While the time to respond was decreased for the new cohort cases overall, unweighted response rates dropped slightly and item nonresponse increased.

Incentive Experiment Analysis
In the current environment of declining response rates, many survey researchers have begun to use
incentives to increase response rates. In the 2003 SDR, a late-stage data collection experiment showed
that offering a pre-paid incentive not only significantly increased response, but also yielded
significantly higher quality data. In the 2006 SDR, the research team implemented a follow-up
controlled experiment to determine the most efficacious time to offer a pre-paid $25 incentive to nonrespondents late in the field period after the full protocol of contacting attempts had been executed.
The incentive experiment design included four different sample groups which were selected on
September 12, 2006. At that time the main SDR sample had achieved an unweighted response rate
of 61.7%. The incentive experiment groups were selected and identified in the following way:
• Early Control – 500 cases were sent a gaining cooperation letter and email message on September 22, 2006 and followed up with a telephone call approximately one week later.
• Early Incentive – 5,000 cases were sent a $25 pre-paid check and a gaining cooperation letter and email message on September 22, 2006 and followed up with a telephone call approximately one week later.

• Late Control – 500 cases were selected; of these, 433 remained pending on October 17, 2006. The remaining pending cases were sent a gaining cooperation letter and email message on October 23, 2006 and followed up with a telephone call approximately one week later.

• Late Incentive – 2,600 cases were selected; of these, 2,217 remained pending on October 17, 2006. The remaining pending cases were sent a $25 pre-paid check and a gaining cooperation letter and email message on October 23, 2006 and followed up with a telephone call approximately one week later.
Summary results: Sample members receiving incentives, whether Early or Late, had higher completion, cooperation, and response rates than those not receiving the money. While the Late groups caught up to the Early groups, their response came later in the field period. Concerning cost, the incentive experiment supported the 2003 finding that pre-paid incentives are a cost-effective gaining cooperation strategy. Few sample members cashed their incentive check without completing a survey.
Web Screen Experiments Analysis
One challenge of Web questionnaires is the presentation of long lists of response options,
particularly those that cannot be fit well within the confines of a single computer screen. One such
problematic question in the SDR is the work activity question. In the 2003 SDR Web
questionnaire, this item was presented on a single page but respondents were obliged to scroll down
the screen in order to view all response options. The 2003 data indicated that the scroll-down
requirement may have affected responses, as evidenced in the relatively low frequencies for the
response options in the middle of the range (which may have been skipped over too quickly in
some Web interfaces) among the Web respondents compared to the paper and CATI respondents.
This experiment examines the effect of two different presentations of the work activities question in
the 2006 SDR Web instrument. The cases assigned to the Web starting mode data collection
protocol were scientifically assigned to a treatment or control group. The treatment group saw a
compact preview screen before the work activity question that summarized all fourteen work
activity response options on a single screen before being presented with the work activity question
in the scroll-down format. The control group did not have a preview screen, but followed the same
protocol used in 2003 and went directly to the work activity question. In addition to the
comparisons between the treatment and control group cases that completed the Web version of the
survey, we compare the results for the work activity question from respondents who completed the
paper version of the survey to the Web respondents.
Summary results: The most important lesson from the 2006 SDR Web preview screen experiment is that SDR respondents can effectively navigate the standard Web questionnaire without a mode effect on long-list questions. A preview screen listing all options on a single screen before a long-list question does not appear to have a positive effect on the overall number of responses.

Survey Methodology Tests to be Undertaken
NSF plans to conduct additional methodological tests in the current and future rounds of the survey, under the burden hours in this survey clearance, to reduce burden and increase the utility of the survey for the 2010 SDR survey cycle. Proposals for these additional tests are still under consideration and will be submitted for OMB approval.
5. Contacts for Statistical Aspects of Data Collection
SRS Chief Statistician Stephen Cohen has overall responsibility for the statistical aspects of the survey. Consultation on statistical aspects of the sample design was provided by Brenda Cox (703-875-2983, Senior Staff, Battelle) and Rachel Harter (312-759-4025, Statistics and Methods Vice President, NORC). At NSF, the contacts for statistical aspects of data collection are Nirmala Kannankutty (703-292-7797, SDR Project Manager) and Stephen Cohen (703-292-7769, SRS Chief Statistician).
