APPENDIX J
Recent Changes to the NSCG Design
The SESTAT data system and its component surveys have incorporated changes over the past
two decades to introduce efficiencies into the survey processing and to better align with the data
needs of policymakers and researchers. Substantial changes to the NSCG have occurred during
the planning for the 2010, 2013, and 2015 survey cycles. These changes are described below.
• Implementation of the Rotating Panel Design
Prior to 2010, a new NSCG sample was drawn from the decennial census long form after each decennial census. A subset of this long-form-based sample, namely the S&E population, was then interviewed every two to three years throughout the decade as part of the NSCG sample. Because the long form occurred only once a decade, it was not possible to refresh the NSCG sample during the decade. As a result, the long-form-based NSCG sample suffered from increasing undercoverage of recent graduates and recent immigrants throughout the 1990s and 2000s. Furthermore, by following only the S&E population in subsequent survey cycles, the NSCG could not provide complete information on individuals entering or exiting the S&E workforce.
After the 2000 decennial census, the Census Bureau discontinued the long form and
introduced the American Community Survey (ACS). In response to this change, NSF
commissioned a CNSTAT panel to examine proposed sample design options for the NSCG
based on the ACS, as opposed to the long form. The CNSTAT panel issued a 2008 report
with recommendations on the NSCG sample design for the 2010 survey cycle and beyond.[23]
Using recommendations from this 2008 CNSTAT report, NCSES introduced a new rotating
panel sample design for the NSCG in the 2010 survey cycle to take advantage of the annual
nature of the ACS. In this rotating panel design, the NSCG selects a new sample every
survey cycle from the most recent ACS and follows the cases for four survey cycles. After
the fourth cycle, the cases rotate out of the NSCG and are replaced by a newly selected panel
of cases from the most recent ACS. Once the design is fully implemented, each NSCG survey cycle will include four panels of sample cases, with each panel originating from a different ACS year.
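To make the rotation concrete, here is a minimal sketch, assuming a new panel enters each survey cycle and remains for four cycles as described above; the ACS year labels are purely illustrative, not the actual NSCG schedule.

    # Minimal sketch of the four-panel rotation described above.
    # The ACS year labels are illustrative, not the actual NSCG schedule.
    from collections import deque

    panels = deque(maxlen=4)  # a panel stays in the sample for four survey cycles
    for acs_year in [2009, 2011, 2013, 2015, 2017]:
        panels.append(f"ACS {acs_year}")  # oldest panel rotates out once four are in sample
        print(f"Cycle drawing on ACS {acs_year}: panels in sample = {list(panels)}")

Once four panels are present, each new panel displaces the oldest one, mirroring how a newly selected ACS panel replaces the panel that has completed its fourth cycle.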
Through this rotating panel design and the selection of a new sample every NSCG survey
cycle, the NSCG is now able to address the undercoverage of recent graduates and recent immigrants that existed in the past. Furthermore, by changing the design to provide
coverage of the entire college graduate population every survey cycle, the NSCG is now able
to examine the trends of individuals entering and exiting the S&E workforce.
The 2015 NSCG survey cycle continues the implementation of the NSCG rotating panel
design by including approximately 42,000 new sample cases from the 2013 ACS and 93,000
returning sample cases from the 2013 NSCG (originating from the 2009 ACS, 2011 ACS,
and the 2010 NSRCG). Full implementation of the NSCG four-panel rotating panel design
will occur in the 2017 survey cycle. Once the rotating panel design is fully implemented, each survey cycle will see the addition of approximately 32,500 to 40,000 cases from the most recent ACS to offset the rotating out of the oldest NSCG panel.
One feature of the rotating panel design that differs from the previous NSCG design is that
nonrespondents will be followed in subsequent survey cycles if they had responded in the
initial survey cycle. The decision to follow nonrespondents is intended to minimize the
potential for nonresponse bias in our NSCG survey estimates. In the 2015 NSCG,
approximately 7,000 of the 93,000 returning sample cases were nonrespondents in the 2013
NSCG survey cycle.
• Discontinuation of the NSRCG and Expansion of the NSCG Young Graduates Sample
In the 1989 CNSTAT report that led to the establishment of the current SESTAT design, the
CNSTAT panel recommended the implementation of a biennial survey to address the
undercoverage of recent graduates inherent in the long-form-based design of the NSCG. This recommendation led to the creation of the NSRCG. As a result, throughout the 1990s and 2000s, the NSRCG provided SESTAT with coverage of recent bachelor’s or
master’s degree recipients in SEH degree fields from U.S. educational institutions.
In the 2010 survey cycle, the NSCG began selecting its sample from the ACS and, through its rotating panel design, became able to provide coverage of the recent graduates
population throughout the decade. With this increased coverage available through the
NSCG, NCSES conducted an evaluation to investigate the possibility of a SESTAT design
change that would include discontinuing the NSRCG and using the NSCG, with an expanded
sample of young graduates, to provide coverage of this recent graduates population.
After reviewing our evaluation results and carefully considering the feedback received from
extensive outreach efforts with the S&E community, NCSES decided to discontinue the
NSRCG after the 2010 survey cycle. A major impetus for this decision was that the NSRCG
was no longer needed to fill the recent college graduate coverage gaps of SESTAT. Instead,
the NSCG, through the use of the ACS-based sampling frame and its rotating panel design,
provides ongoing coverage of the recent college graduates population. Other factors
considered in this decision were the limited use of the NSRCG as a standalone data file and
the cost savings associated with discontinuing the NSRCG and with simplifying the SESTAT
integration processes. NCSES expanded the sample of young college graduates in the NSCG
beginning in the 2013 survey cycle to allow analysts to continue detailed investigation of the
recent college graduate population. This oversampling of young college graduates will
continue in the 2015 NSCG.
• Web First Data Collection Strategy
The 2010 NSCG survey cycle marked the introduction of a web data collection mode to
complement the mail questionnaire and computer-assisted telephone interviewing (CATI)
options that had existed in previous survey cycles. Through an experiment conducted during
the 2010 NSCG survey cycle, we found that a ‘web first’ approach (i.e., offering the web data collection mode as the initial response option) produced final response rates that exceeded, or were not statistically different from, the final response rates for the mail first and CATI first
approaches. In addition, by conducting a detailed evaluation of the data collection costs in
the 2010 NSCG experiment, we determined that the web first approach achieved these
impressive response results at a much lower cost per respondent (approximately $50 per
respondent in the web first approach versus $65 in the mail first approach and $75 in the
CATI first approach). Finally, the research showed that the majority of respondents tended
to respond in the initially offered mode. This finding held across all three treatment groups –
web first, mail first, and CATI first.
Given the positive findings from the 2010 NSCG mode effects experiment, the 2013 NSCG
used the web first data collection approach as its default data collection path (i.e., the data
collection path offered to most cases) and found that the percentage of 2013 NSCG new
respondents who completed the survey by web (83%) exceeded the corresponding rate in the 2010 NSCG (69%). Given the continued success of the web
first approach, it will continue to be used as the default data collection path in the 2015
NSCG.
• Survey Content
As part of the 2015 NSCG planning effort, NCSES conducted developmental work on new
questionnaire items to capture information on alternative credentials, including industry-recognized certifications, occupational licenses, and educational certificates. As a starting point for this developmental work, NCSES drew on the extensive research on this topic conducted by the Interagency Working Group on Expanded Measures of Enrollment and Attainment (GEMEnA). Through interactions with the GEMEnA group, NCSES
identified an initial set of questions on certifications, licenses, and educational certificates to
consider as possible additions to the NSCG questionnaire.
The consideration of these additions to the NSCG questionnaire involved two evaluation
phases. The first phase consisted of outreach to the NCSES survey stakeholders to discuss
whether the initial set of questions provided adequate information for policy and research
needs. These discussions with survey stakeholders occurred at NCSES-sponsored workshops
held in August 2013 and January 2014. The second evaluation phase involved cognitive
testing of the proposed questionnaire items. The cognitive testing was performed by the
Census Bureau’s Center for Survey Measurement. The findings from the two evaluation
phases led to slight revisions to the question wording that addressed both conceptual
concerns (identified through the outreach discussions) and measurement concerns (identified
through the cognitive interviews). At the completion of the evaluation, NCSES decided to
add a new NSCG questionnaire section to collect information on certifications and licenses.
In addition, although questions on educational certificates were also examined as part of this
content evaluation, NCSES decided not to include these questions on the 2015 NSCG
questionnaire because the cognitive interview findings indicated that respondents had difficulty with the concept of an educational certificate.
Finally, in addition to the cognitive interview testing of the certification, license, and
educational certificate questions, the Census Bureau’s Center for Survey Measurement also
conducted an expert review and cognitive interviews for the full set of NSCG questionnaire
items. The expert review and additional cognitive interviews resulted in minor question
wording revisions to numerous items throughout the NSCG questionnaire. The specific
revisions made to the NSCG questionnaire since the 2013 survey cycle are documented in
Appendix F.
• Discontinuing the Use of Priority Mail for the Reminder Survey Invitation
In past NSCG survey cycles, sample cases were mailed a survey invitation packet at the
beginning of data collection to introduce the survey and encourage response. At the fifth
week of data collection, a reminder survey invitation packet was mailed to nonrespondents.
Following the data collection implementation best practice of varying the look of each
contact attempt, the reminder survey invitation packet has historically been sent via priority
mail. This priority mailing allows the reminder packet to be sent in an envelope that is
different from the envelope used with the survey invitation packet. While the priority
mailing allows more timely delivery of materials, it is expensive compared to first class postage rates. For example, mailing a standard business envelope via first class costs approximately $0.45, whereas the same-sized envelope sent via priority
mailing costs approximately $5.00. In a typical NSCG survey cycle, we mail reminder
packets via priority mail to approximately 70% of our sample cases. With a sample size
usually in the range of 140,000, this means that reminder packets are mailed via priority mail
to approximately 100,000 cases. The postage associated with this priority mail reminder
packet would be nearly $500,000 (versus $45,000 if the reminder packets are mailed first
class).
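As a rough check on these figures, a minimal sketch in Python of the postage arithmetic, using the approximate rates and counts quoted above:

    # Back-of-the-envelope postage comparison using the approximate figures
    # quoted above; these are illustrative values, not exact program costs.
    sample_size = 140_000
    reminder_share = 0.70        # ~70% of sample cases receive a reminder packet
    priority_rate = 5.00         # approximate priority mail postage per envelope
    first_class_rate = 0.45      # approximate first class postage per envelope

    reminder_cases = round(sample_size * reminder_share)  # 98,000, cited as ~100,000
    print(f"Priority mail total:  ${reminder_cases * priority_rate:,.0f}")     # ~$490,000
    print(f"First class total:    ${reminder_cases * first_class_rate:,.0f}")  # ~$44,100
    print(f"Postage gap per case: ${priority_rate - first_class_rate:.2f}")

Note that the roughly $4.55 postage gap per envelope is before accounting for the extra nonresponse follow-up that first class mailing requires; the experiment described below found a net saving of about $2 per case.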
Given the extremely high cost of the priority mailing, the 2013 NSCG included an
experiment to examine whether it is possible to achieve our NSCG response, data quality,
and estimation goals while using a first class mailing rather than a priority mailing for the
reminder survey invitation. The 2013 NSCG priority mail experiment included a control
group that received the priority mail reminder packet and two treatment groups – one group
that received the reminder packet mailed first class in a different colored (brown) envelope
and one group that received the reminder packet in a first class mailing using a white
envelope with an overprint stating the importance of the survey response.
The examination of the experiment results showed that the mailing type (priority envelope,
first class brown envelope, first class white envelope with overprint) did not have any
significant impact on final response rate, representativity, or the key survey estimates. The
mailing type, however, did have a significant impact on the data collection cost. The priority
mail postage increases the overall survey cost without a full return on investment, as
measured by response. Although the brown and overprint envelopes require more nonresponse follow-up than the priority mail, the cost of this additional follow-up does not outweigh the postage savings from avoiding the priority mailing. Using first
class mail for the brown or overprint envelopes is, on average, less expensive than the
priority mailing by about $2 per case.
Since the use of the non-priority envelopes (brown or overprint) reduced the overall data
collection cost without any adverse impact on final response, representativity, or the survey
estimates, we will discontinue the use of priority mail for the reminder survey invitation in
the 2015 NSCG. As a result, the 2015 NSCG reminder survey invitation will be mailed
using a brown envelope via first class mail.
• Implementing Results from the 2013 NSCG Incentive Experiments
Motivated by the success of the 2010 NSCG late stage incentive in combination with the
desire to optimize incentive usage in the NSCG data collection efforts, NCSES included two
monetary incentive experiments in the 2013 NSCG: an incentive timing experiment and an
incentive conditioning experiment. These experiments examined the impact that incentive timing and past incentive usage have on response, data quality, and cost. The incentive in both studies was a $30 prepaid debit card. The debit cards had a six-month usage period, after which they expired and any unused funds were returned to the Census Bureau and NCSES.
For the 2013 NSCG incentive timing experiment, we tested the offering of incentives to new
sample cases that were classified as “highly influential.” Cases with large sampling weights
and a low response/locating propensity were considered highly influential cases for the
experiment. The experiment tested the effectiveness of the incentive offer occurring at four
different points during the 26-week data collection period – week 1, week 8, week 12, and
week 23. The results for the four treatment groups (incentive offered at week 1; incentive
offered at week 8; incentive offered at week 12; incentive offered at week 23) were
compared against the results for the control group that did not receive an incentive. When
response, cost, and data quality factors were examined, it was determined that offering the incentive at week 1 to the highly influential cases provided the best results across these criteria. As a
result, we plan to offer a $30 prepaid debit card incentive to highly influential new sample
cases at week 1 of the 2015 NSCG data collection effort.
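For illustration, a minimal sketch of the kind of classification rule described above for flagging highly influential cases; the threshold values and example data are hypothetical, not the actual NSCG criteria:

    # Hypothetical illustration: flag cases with large sampling weights and low
    # response/locating propensity as "highly influential." The cutoffs below
    # are invented for this sketch, not the NSCG's actual rules.
    WEIGHT_CUTOFF = 500.0      # hypothetical "large sampling weight" threshold
    PROPENSITY_CUTOFF = 0.30   # hypothetical "low propensity" threshold

    def is_highly_influential(weight: float, propensity: float) -> bool:
        """A case is highly influential when its absence would move the estimates."""
        return weight > WEIGHT_CUTOFF and propensity < PROPENSITY_CUTOFF

    cases = [(1, 820.0, 0.22), (2, 150.0, 0.18), (3, 640.0, 0.55)]
    flagged = [cid for cid, w, p in cases if is_highly_influential(w, p)]
    print(flagged)  # [1]: large weight and low propensity; cases 2 and 3 each fail one test

Cases flagged in this way would receive the $30 debit card at week 1 rather than later in the collection period.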
The 2013 NSCG incentive conditioning experiment examined the impact that receiving an
incentive in a previous survey cycle has on future survey response. In the 2010 survey cycle,
hard-to-enumerate cases and recent college graduates received monetary incentives to
encourage response. In the 2013 NSCG survey cycle, these past incentive recipients were
assigned to one of three treatment groups – no incentive, $30 debit card incentive at week 1,
$30 debit card incentive at week 23. When response, cost, and data quality factors were
examined, it was determined that offering the incentive at week 1 to past incentive recipients provided the best results across these criteria. As a result, we plan to offer a $30 prepaid
debit card incentive to past incentive recipients at week 1 of the 2015 NSCG data collection
effort.
• Adaptive Design Experimentation
Adaptive design provides a framework for tailoring data collection strategies during the data
collection process, in response to conditions on the ground. Shrinking budgets for survey
operations as well as an increase in reluctance in the general population to participate in
surveys have resulted in declining response rates. This is happening simultaneously with the
desire on the part of data users to be able to make reliable estimates for smaller
subpopulations. Adaptive design seeks to improve survey representation even in the face of
falling response rates as a way to maintain data quality while controlling costs.
Given the potential benefits of adaptive design techniques, we included an adaptive design
experiment in the 2013 NSCG to explore whether it was possible to implement adaptive design interventions and whether there were any limitations to implementation. The 2013
NSCG adaptive design experiment included 4,000 new sample cases where flexibility was
given to the NSCG operations staff to determine the most appropriate data collection
approach for these cases.
The main take-away from the 2013 NSCG adaptive design experiment is that it is
operationally possible to monitor data as it is collected and use the monitoring results to
incorporate adaptive design techniques into the NSCG data collection effort. Data
monitoring and data processing staff in the 2013 NSCG were able to make decisions
regarding data collection interventions and create files that provided necessary information to
data collection staff in order to implement those interventions. In addition, when examining
data quality and cost, the effectiveness of adaptive design in general, and mode switching in particular, looks promising. Despite a small but statistically significant reduction in response rate (4.4% in this case), there was no significant difference between the adaptive design treatment group and the control group in either representativity or final key estimates. Finally, the data collection cost for the adaptive design treatment group was approximately $5 less per case
than for the control group.
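As a purely illustrative sketch of the kind of intervention such monitoring can drive, consider a simple mode-switch rule; the review week, propensity cutoff, and escalation order below are hypothetical, not the NSCG's actual decision rules:

    # Hypothetical mode-switch intervention. The review week, cutoff, and
    # escalation order are invented for illustration only.
    REVIEW_WEEK = 8
    SWITCH_CUTOFF = 0.25  # predicted propensity to respond in the current mode

    def next_mode(current_mode: str, week: int, responded: bool, propensity: float) -> str:
        """Escalate web -> mail -> CATI for unlikely responders at the review point."""
        escalation = {"web": "mail", "mail": "CATI", "CATI": "CATI"}
        if not responded and week >= REVIEW_WEEK and propensity < SWITCH_CUTOFF:
            return escalation[current_mode]
        return current_mode

    print(next_mode("web", week=9, responded=False, propensity=0.12))  # -> mail
    print(next_mode("web", week=9, responded=False, propensity=0.60))  # -> web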
However, these results should be taken with caution given the small sample size of the 2013
NSCG adaptive design experiment. Rather than considering adaptive design in the NSCG to
be an unmitigated success, more research should be completed with a larger sample size for
both statistical and operational reasons. Statistically, a larger sample size would give R-indicators and other data monitoring tools more stability (particularly given the differential weights in the NSCG), thereby narrowing their confidence intervals; the standard R-indicator is sketched after this paragraph. Operationally, a follow-up
experiment would allow the NSCG operations staff to more formally codify possible data
collection interventions and when their use is appropriate. Therefore, the 2015 NSCG plans
to include a follow-up adaptive design experiment with a larger sample size than the
experiment in the 2013 NSCG.
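For reference, the sample-based R-indicator commonly used in the representativity literature (e.g., Schouten, Cobben, and Bethlehem) can be sketched as follows; the notation follows that literature rather than NSCG-specific documentation:

    R(\hat{\rho}) = 1 - 2\, S(\hat{\rho}), \qquad
    S(\hat{\rho}) = \sqrt{ \frac{1}{N - 1} \sum_{i \in s} d_i \left( \hat{\rho}_i - \bar{\rho} \right)^2 }

where \hat{\rho}_i is the estimated response propensity of sample case i, d_i is its design weight, \bar{\rho} is the weighted mean propensity, and N is the population size. R ranges from 0 to 1, with values near 1 indicating a respondent pool close to representative; highly variable design weights, as in the NSCG, inflate the sampling variability of S and hence of R, which is why a larger experimental sample would stabilize the indicator.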
[23] National Research Council, Committee on National Statistics. 2008. Using the American Community Survey for the National Science Foundation’s Science and Engineering Workforce Statistics Programs. Washington, DC: The National Academies Press.