NCES Cognitive, Pilot, and Field Test Studies System

OMB: 1850-0803


Volume I:


National Adult Training and Education Survey (NATES) 2013 Pilot Test




Request for Clearance


OMB# 1850-0803 v.72


















September 18, 2012








Justification

Since the advent of the Great Recession, the relationship between the attainment of a postsecondary credential and job prospects has come sharply into focus in the national conversation (BLS, 2012; Career One Stop, n.d.; Carnevale et al., 2010; Carnevale et al., 2012; Julian and Kominski, 2011). The stock of human capital is seen as a resource for matching employer needs to workforce skills, and the development of new human capital is seen as a driver of new economic growth. Traditional postsecondary degrees comprise only a portion of the education and training that American adults pursue to find jobs. Policy makers and researchers need data on non-degree credentials that have value in the labor market: industry-recognized certifications, professional licenses, and subbaccalaureate educational certificates that prepare adults for work. To understand how adults acquire work-related credentials, policy makers and researchers also need data on the enrollment and participation of out-of-school youth and adults in education and training to acquire the skills they need to be successful at work. Such training may include non-credit instruction in community colleges, formal on-the-job training, and adult basic skills education.


At the request of the Council of Economic Advisors (CEA), the Office of Management and Budget (OMB), and the Office of the Under Secretary of Education (OUS), the National Center for Education Statistics (NCES) is leading a federal effort to improve data collection on the education and training that youth and adults need to prepare for jobs and to contribute to the growth of the economy. The Interagency Working Group on Expanded Measures of Enrollment and Attainment (GEMEnA) consists of senior representatives from the Bureau of the Census, the Bureau of Labor Statistics (BLS), the National Science Foundation (NSF), as well as CEA, NCES, OMB, and OUS. With GEMEnA’s expert guidance and support, NCES has embarked upon a multi-faceted effort to apply best practice survey development principles towards the goal of establishing government-wide consensus on the best measures of the participation in and credentialing of education and training for work. To provide guidance and support to GEMEnA’s work, NCES is convening a panel of experts on education, the workforce, and the economy. The expert panel will meet for the first time in fall of 2012 and a summary will be posted on the NCES website (web address to be determined).


GEMEnA's portfolio includes four main strands of work:


  1. Establish and deploy a core set of survey items related to the prevalence and key characteristics of industry-recognized certifications and licenses. Based on item development work that included focus groups, cognitive interviews, and a nationally-representative pilot study with a seeded convenience sample of known credential holders, GEMEnA has recommended a set of survey items believed to be valid for the measurement of industry-recognized certifications and professional licenses. According to GEMEnA’s first pilot study, the 2010 Adult Education and Training Study (ATES), 29.7% of the U.S. adult population age 18 and older had at least one certification or license, yet the federal statistical system does not currently collect data on this type of credential.1 This work is documented briefly in a January 2012 FCSM paper (http://www.fcsm.gov/12papers/Boivin_2012FCSM_VII-B.pdf) and is the subject of a forthcoming technical report from NCES. The expert panel will provide guidance to GEMEnA on the types of research and policy questions that may be answered by deploying these questions in federal household surveys of out-of-school youth and adults. In the meantime, the Census Bureau is fielding a module containing tested credentialing items in the fall of 2012 for release in fall of 2013. These data will provide the first official statistics from the federal government on the prevalence of industry-recognized certifications and professional licenses. The content of the module can be seen at http://www.reginfo.gov/public/do/PRAViewIC?ref_nbr=201202-0607-002&icID=182115. In addition, NCES is deploying certification/license items in its longitudinal surveys of out-of-school youth and adults, and BLS is considering placement of certification/license items in its household studies. GEMEnA will continue to seek opportunities to recommend validated survey items in appropriate federal household data collections, and the guidance of the expert panel will help to focus those efforts.

  2. Continue development of a core set of survey items related to the prevalence and key characteristics of subbaccalaureate educational certificates. Because of sampling frame issues in the seeded sample of certificate holders, the ATES pilot study provided inconclusive evidence about the validity of survey items on educational certificates, a type of credential reported by 13.6% in the 2010 ATES pilot study. Subbaccalaureate educational certificates represent the completion of a program of study at an educational institution and are a key source of education and training for many occupations. The attainment of an educational certificate of at least one year in duration may be counted towards the administration’s goal of having all Americans complete at least one year of education beyond high school. Because the labor market value of educational certificates has become part of the national conversation (Carnevale et al., 2012), NCES has funded and GEMEnA is guiding a multi-year effort to focus best practice survey development on educational certificates. This will include focus groups of potential respondents with credentials in predominantly male, predominantly female, and mixed gender professions. Details on the focus groups may be found in the OMB package (http://0-edicsweb.ed.gov.opac.acc.msmc.edu/browse/browsecoll.cfm?pkg_serial_num=4904). After analyzing the focus group findings, GEMEnA will develop a plan for conducting cognitive interviews with newly-developed or adapted items. The cognitive interviews will be followed by a new nationally-representative pilot study that will include an expanded and cleaned sample frame of certificate holders. Based on the results of the pilot study, GEMEnA will make recommendations for next steps. GEMEnA will keep the expert panel apprised of its progress on this strand of work and will seek guidance as needed on particular aspects of the work.

  3. Consider new measures of participation in education and training designed to prepare out-of-school youth and adults for work. Federal household surveys have collected information on enrollment in regular and non-regular schooling among adults for decades. NCES’s National Household Education Surveys Program (NHES) collected nationally-representative data on adult participation in various kinds of educational experiences, including enrollment in formal postsecondary education, vocational training, and adult basic skills education. Because of low response rates to the CATI methodology, the NHES adult education module was halted after 2005. The October Supplement to the CPS has provided school enrollment information for children and adults for decades. Longitudinal studies of out-of-school youth and adults (from NCES and BLS) often have enrollment and participation questions. A renewed policy interest in the development of human capital among U.S. adults and a proliferation of work training programs in the public and private sectors warrant a fresh examination of how and why we collect data on enrollment and participation. Extant data sources such as the October CPS Supplement and the NCES longitudinal studies will benefit from a freshening of standard survey items to reflect new education realities and policy interests. This strand of work will also support the development of a new federal household survey focusing on education, training, and credentials for work, of which this clearance is a pilot. GEMEnA is beginning the effort to improve federal statistical data on enrollment and participation in education and training for work by defining the scope of the work and developing a plan for addressing data needs. The expert panel will guide this effort by identifying key research and policy questions and helping to focus GEMEnA’s efforts on high priority data needs.


  4. Support NCES in the development of a new household study focused on education, training, and credentials for work. The purpose of the new household study is to go beyond a measure of the prevalence of credentials and a few of their characteristics. The goal is to produce a high quality data collection that will serve both research and policy audiences by addressing questions concerning the relationships among credential attainment, participation in education and training for work, and employment characteristics. The pilot test covered in this clearance is a step towards that goal. GEMEnA plans to seek guidance from the expert panel as the survey development process continues.


In planning for a new household study on education, training, and credentials for work and in support of GEMEnA’s fourth strand of work described above, NCES seeks clearance for a National Adult Training and Education Survey (NATES) Pilot Test. The purpose of the NATES Pilot Test is to evaluate the feasibility of using a mail topical survey to conduct the new household study. The NATES Pilot Test will provide an opportunity to examine response rates at both the unit and item level and includes a nonresponse bias study based on in-person interviews with a sample of nonrespondents. The NATES Pilot Test will be conducted in the winter and spring of 2013 and will involve approximately 10,000 households. The data will be collected by the U.S. Census Bureau.


Clearance is requested by October 3, 2012 in order to complete the final formatting of questionnaires, the frame and sample preparation, and other pre-collection preparation.


Need for Information

Attaining a postsecondary credential has become increasingly important for securing opportunities to get high-return jobs in the United States in the 21st century. However, NCES has traditionally collected data only on postsecondary certificates and degrees awarded through credit-bearing instruction at traditional institutions of higher education that participate in Title IV federal student aid programs. These comprise only a portion of the subbaccalaureate education and training that American adults seek and complete to learn the skills they need to find and keep good-paying jobs.


The NATES will provide a means to investigate issues related to education, training, and credentials for work that cannot be adequately studied through the Center’s institution-based data collection efforts. While the NATES Pilot Test is not being conducted to make survey estimates, the data gathered will allow for examination of the empirical properties of potential survey measures.


Purposes and Uses of the Data

The data collected in the NATES Pilot Test will be used to evaluate the feasibility of conducting a household mail survey to capture information about education, training, and credentialing for work, with a particular focus on response rates and response bias. It will not be used to generate official national estimates of the population with certifications and certificates. Information gathered from this pilot will be used to make recommendations for methodological approaches and survey measures to be fielded in future adult education, training, and credentials data collections, including possible administration with NHES.


Use of Improved Information Technology

The NATES Pilot Test will be conducted using methods and techniques similar to those recently fielded in the NHES. The self-administered questionnaires will be implemented in the Census Bureau’s form reading software, which offers a wide range of capabilities. The following features are important for the NATES Pilot Test:


Forms Design. Form templates can classify each data field as a text entry, choice, signature, or image zone. Completed hardcopy forms can be processed without manual data entry or in conjunction with manual entry.

Data Capture. The system extracts data according to rules established for each questionnaire template. Completed NATES forms will be processed with a combination of electronic data extraction using optical mark recognition (OMR) for check box responses and manual data entry in a system called key from image (KFI).

Verification. Extracted data are subject to validation according to project specifications. The KFI data are independently verified during a second keying (KFI) phase with differences adjudicated by a third party.

Receipt Control. The system will provide for automatic receipt control in a flexible manner that will be used to produce status reports that allow ongoing monitoring of the survey’s progress.

Efforts to Identify Duplication

Senior policy officials in the Departments of Education and Labor, foundations including the Gates Foundation and Lumina, and research organizations such as the Georgetown Center for Education and the Workforce have recognized that there is a lack of valid statistical information on the prevalence of industry-recognized certifications and education certificates, and have called for the development of new data sources. A series of meetings during the fall of 2009 launched a broad effort to begin to define and enumerate these credentials. NCES conducted a review of research literature and data collections since the work of a previous Interagency Committee in 2000, from which NCES developed a bank of existing survey items on certifications (completed 11/2009) and education certificates (completed 1/2010). This research found no surveys that adequately capture comprehensive data on adult training and education.


Consultations Outside the Agency

As noted above, this Pilot Study is part of four strands of work guided by GEMEnA, which has met monthly since October 2009 and consists of senior staff from the Bureau of the Census, the Bureau of Labor Statistics, the Council of Economic Advisors, the National Center for Education Statistics, the National Science Foundation, the Office of Management and Budget, and the Office of the Under Secretary of Education. Current members as of 9/2012 are:


Council of Economic Advisors

Chinhui Juhn


Census Bureau

Bob Kominski

Stephanie Ewert


Bureau of Labor Statistics

Dori Allard

Harley Frazis


National Science Foundation

Dan Foley

John Finamore

Office of Management and Budget

Shelly Martinez


Department of Education – Office of the Under Secretary

Jon O’Bergh


National Center for Education Statistics

Sharon Boivin

Lisa Hudson

Kashka Kubzdela

Matthew Soldner

Sarah Crissey


Many members of GEMEnA provided valuable feedback in the development of NATES Pilot Study survey content. Bob Kominski and Harley Frazis provided particular recommendations and reviews for the employment section. Rose Kreider from Census and Frank Gallo from the Employment and Training Administration (ETA) of the Department of Labor provided suggestions and comments on the demographic questions in the survey. Ronald Johnson of ETA provided expert advice on formal apprenticeship programs. Steve Rose from the Georgetown University Center on Education and the Workforce commented on an early version of the certificate section.


Payments to Respondents

To encourage response and to thank respondents for their time and information, advance cash incentives will be used, as was done successfully in the 2012 NHES: $5 will be sent to the experimental Screener cases (discussed below) and $15 will be sent to each household with the main survey.


Assurance of Confidentiality

The following text will be included on the NATES questionnaires:


We are authorized to collect this information by Section 9543, 20 U.S. Code. You do not have to provide the information requested. However, the information you provide will help the Department of Education’s ongoing efforts to learn more about the educational experiences of adults. There are no penalties should you choose not to participate in this study. Your answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (Section 9573, 20 U.S. Code). Your responses will be combined with those from other participants to produce summary statistics and reports.


According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this voluntary information collection is OMB 1850-0803. Approval expires 9/30/2013. The time required to complete this information collection is estimated to average 15 minutes2 per response, including the time to review instructions, gather the information needed, and complete and review the information collection. If you have any comments concerning the accuracy of the time estimate(s), suggestions for improving the form or reducing burden, or have comments or concerns regarding the status of your individual submission of this form, write to: Sharon Boivin, National Center for Education Statistics, U.S. Department of Education, 1990 K Street NW, Washington, DC 20006. Do not return the completed form to this address.


Sensitive Questions

The NATES Pilot Test is a voluntary survey. No persons are required to respond to it and respondents may decline to answer any question in the survey.


Questions that ask about personal and household income may be considered sensitive by some respondents. However, measures of income are important because education attainment is statistically associated with income and the empirical properties of the survey measures may differ for people with different income levels.


Estimated Response Burden

Table 1.  Estimated response burden for the NATES Pilot

Instrument | Unit1 sample size | Expected response rate (%) | Expected number of completed units2 | Avg. unit completion time (mins) | Total burden hours
Mail Screener | 500 | 60 | 270 | 3 | 14
Topical Survey3 | 10,000 | 65 | 5,850 (8,775 individual responses) | 22.5 | 2,194
Nonresponse Survey | 1,670 | 60 | 1,000 | 5 | 83
Total | NA | NA | 7,120 (10,045 individual responses) | NA | 2,291

1 Unit = Household

2 Approximately 10 percent of the address sample is expected to be undeliverable.

3 Based on NCES testing, it takes an average of 15 minutes for a person to complete the questionnaire. Each sampled unit will receive either 3 individual questionnaires or a booklet that includes the full set of questionnaire items for up to three eligible household members. The average unit completion time is estimated to be 22.5 minutes based on the expectation that on average responding units will complete one and a half questionnaires.


The administration times for the Screener and topical survey are based on practice administrations. It is expected that it will take the respondent about 15 minutes to answer the individual items contained in the questionnaire designed for a single adult. Since each sampled unit will receive questionnaires for three adults, the average unit response burden is estimated to be 22.5 minutes based on the expectation that on average responding units will complete one and a half questionnaires. It is expected that respondents will take about 3 minutes, on average, to complete the Screener. Additionally, the in-person nonresponse follow-up (discussed below) is expected to take approximately 5 minutes.


The cost to respondents for the total hour burden is estimated to be $48,775, that is, $21.29 per hour for 2,291 burden hours. The hourly rate is based on the average for all civilian workers from the 2010 National Compensation Survey (http://www.bls.gov/ncs/ocs/sp/nctb1475.pdf). There are no other costs to respondents and no recordkeeping requirements associated with the NATES Pilot Test.
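The burden arithmetic above (the rows of Table 1, the 2,291-hour total, and the $48,775 cost estimate) can be reproduced with a short illustrative script; the per-row rounding to whole hours is an assumption inferred from the published totals:

```python
# Reproduce Table 1's burden hours and the total respondent cost estimate.
rows = {
    # instrument: (expected completed units, avg. unit completion time in minutes)
    "Mail Screener": (270, 3),
    "Topical Survey": (5_850, 22.5),
    "Nonresponse Survey": (1_000, 5),
}

# Burden hours per instrument, rounded to whole hours as in Table 1.
burden_hours = {name: round(units * minutes / 60)
                for name, (units, minutes) in rows.items()}
total_hours = sum(burden_hours.values())       # 14 + 2,194 + 83 = 2,291

HOURLY_RATE = 21.29  # average for all civilian workers, 2010 National Compensation Survey
total_cost = round(total_hours * HOURLY_RATE)  # about $48,775
```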


Cost to the Federal Government

The total cost of the NATES Pilot Test data collection to the government is approximately $1,200,000. This includes all direct and indirect costs of the data collection.


Project Schedule

January 2013: Initial mailing

February–April 2013: Nonresponse follow-ups by mail

May–June 2013: Nonresponse bias study field collection

August 2013: Complete processing activities


Statistical Methodology

The NATES Pilot Test is an address-based sample covering the 50 states and the District of Columbia. It will be conducted from January through July 2013. Households will be randomly sampled as described below, and a questionnaire will be administered by mail to each adult household respondent.


1. Sampling Households

A nationally representative sample of 10,000 addresses will be used. For the 2012 NHES, a nationally representative sample of addresses was drawn in a single stage from a file of residential addresses maintained by a vendor, based on the United States Postal Service (USPS) Computerized Delivery Sequence File (CDSF); the 10,000 NATES addresses will be drawn from 48,000 unused addresses in that NHES sample draw. The NATES sample will be stratified by ethnicity, with an oversample in areas with relatively larger Black and Hispanic populations. In addition, the sample will be stratified by the percentage of the population in poverty. Addresses in areas defined as high poverty areas (where 20 percent or more of the area is at or below the poverty level) will be sampled at a slightly higher rate than those in other areas.


The NATES plan includes an initial mailing with a reminder postcard and up to three nonresponse follow-up mailings. The first mailing will contain a letter, three topical questionnaires (separately or in booklet form), a postage paid envelope for each questionnaire (three envelopes for the cases receiving three topical questionnaires and one envelope for the cases receiving a booklet), and a $15 cash incentive (three $5 bills). The reminder postcard will be mailed to all sample households one week after the first mailing. The three subsequent mailings will each contain a letter, three topical questionnaires (separately or in booklet form), and either one or three postage-paid return envelopes. The third mailing will be sent via FedEx while the other mailings will be sent via the United States Postal Service. The follow-up mailings do not include a cash incentive.


An additional 500 households will receive the experimental NHES screener. This screener experiment will test the feasibility of enumerating all household members (adults and children) on a short household screening instrument. It uses the same form as the 2012 NHES household Screener, which enumerated only children in the household (see section 6 below).


2. Within-Household Sampling

The sample will consist of all eligible adults in each sampled household. To be eligible, an adult must be age 16 to 65 and not enrolled in high school. If no one in the household meets the eligibility criteria, the household will be instructed to mark a box on one questionnaire indicating that fact and to return that one questionnaire. If a household has more than three eligible adults, respondents can call the toll-free number on the questionnaire to request more instruments.


Sampling all adults per household has the advantage of allowing a single stage survey administration, which is expected to increase response. It also minimizes the need for post-survey statistical adjustments to account for within-household sampling.


3. Expected Yield

As described above, questionnaires will be sent to each of the 10,000 sampled addresses (units). It is expected, based on experience from the NHES, that about 10 percent of addresses (or a total of 1,000 addresses) will be undeliverable. An expected unit response rate of 65 percent is assumed. This unit response rate would yield 5,850 completed units and approximately 8,775 respondents. For the screener experiment, an expected response rate of 60 percent is assumed, yielding 270 completed screeners. Table 2 summarizes the expected numbers of completed interviews for the NATES Pilot Test.
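The yield figures follow directly from the stated assumptions (10 percent undeliverable, 65 percent unit response for the topical survey, 60 percent for the screener, and one and a half completed questionnaires per responding unit); an illustrative sketch using integer arithmetic:

```python
# Expected yield for the topical survey under the stated assumptions.
sampled = 10_000
deliverable = sampled - sampled * 10 // 100          # 10% undeliverable -> 9,000 addresses
completed_units = deliverable * 65 // 100            # 65% unit response -> 5,850 households
individual_responses = completed_units * 3 // 2      # 1.5 questionnaires/unit -> 8,775

# Expected yield for the screener experiment (60% response assumed).
screener_sampled = 500
screener_completes = (screener_sampled - screener_sampled * 10 // 100) * 60 // 100  # 270
```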


Table 2.  Expected numbers sampled and expected numbers of completed Screeners and extended topical surveys in the NATES Pilot Test

Survey | Expected number of sampled units (households) | Expected number of completed questionnaires
Household Screeners | 500 | 270
Topical Surveys | 10,000 | 8,775


4. Sample Size Requirements

The key objective of the NATES Pilot Test is to assess response rates and response bias. However, in order to make sure key subgroups are represented, Table 3 shows the expected numbers of completed extended topical surveys for subgroups defined by credential status, education attainment, and age.



Table 3.  Estimated population distribution and expected numbers of completed extended topical surveys for key subgroups

Characteristic | Estimated percent of adult population | Expected number of completed interviews
Overall | 100 | 8,775
Credential status | |
  Has certificate, certification, or license1 | 24 | 2,106
Education attainment | |
  High school diploma or less | 66 | 5,792
  Associates degree, some college, other less than BA | 6 | 527
Age | |
  18-30 | 23 | 2,018
  31-45 | 29 | 2,545
  46 and older | 48 | 4,212
By education attainment: Less than Bachelor’s degree | |
  Has certificate, certification, or license1 | 14 | 1,229
  Hispanic | 11 | 965
  Nonwhite, not Hispanic | 14 | 1,229

1 The percentage of adults who reported their occupation has legal or professional requirements for continuing education or training.

NOTE: Expected numbers of completed interviews were calculated by applying the estimated percent of the adult population (accurate to the hundredths) to the total expected sample size.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Adult Education for Work-Related Reasons Survey of the National Household Education Surveys Program (NHES), 2005.
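As the NOTE to Table 3 indicates, each expected count is the subgroup's estimated share of the adult population applied to the 8,775 expected respondents. An illustrative sketch using the rounded whole-number percentages shown in the table (the hundredths-precision percentages used for the official calculation may shift some counts by one):

```python
# Expected completed interviews per subgroup: percent of adults x 8,775 respondents.
TOTAL_RESPONDENTS = 8_775
subgroup_pct = {
    "Has certificate, certification, or license": 24,
    "High school diploma or less": 66,
    "Age 18-30": 23,
    "Age 46 and older": 48,
}
expected = {name: round(TOTAL_RESPONDENTS * pct / 100)
            for name, pct in subgroup_pct.items()}
```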



5. Estimation Procedures

There are no plans to release survey estimates from the NATES Pilot Test; the aim of the Pilot Test is to provide a large-scale methodological evaluation of survey methods. However, weights will be computed so that preliminary estimates of population characteristics can be generated; these preliminary estimates will aid in evaluating the accuracy of the survey measures for future data collections.


6. Additional Design Features

The NATES Pilot Test will contain special methodology and design features to address household research concerns for the future use of the NATES items. These features include:


Household Screener Test. A sample of 500 households will receive a modified NHES screener questionnaire, using the same mailing procedures that were used in the 2012 NHES administration. Once a questionnaire is returned, the case is considered complete. These households do not get a topical questionnaire. The purpose of this test is to compare response rates achieved for this sample to response rates achieved in the 2012 NHES to determine if there is any negative effect on overall response of requesting information about adults as well as children in the household. Following NHES procedures, the screener test will consist of an advance letter mailing, followed by a screener mailing, reminder postcard mailing, and three follow-up letters to nonrespondents. Each follow-up letter will include a screener questionnaire. The first screener mailing will include a $5 incentive.


Nonresponse Bias Study. About 1,670 household addresses will be selected from non-responding addresses after the mail operation for further nonresponse follow-up. The purpose of the additional follow-up is to attempt to measure the extent, if any, of nonresponse bias in the survey data collected by the main mail operation. A short questionnaire (including a question on the number of household members and a limited set of questions about one eligible adult’s educational experience) will be administered in-person. Census will conduct an analysis of the characteristics of sampled people who responded to the original NATES questionnaire and analysis of those who responded to the in-person contact. Census will also perform response propensity modeling and other bias analyses to assess the extent of nonresponse bias in the survey estimates.


The nonresponse bias analysis sample will be based on nonresponse in clusters of counties, so that Census can recruit field representatives within clusters. This will be more cost effective for interviewing than if the sample were spread over all counties in the survey. The number of counties in the nonresponse bias follow-up will be approximately 200. The nonresponse bias study target is 1,000 completed interviews. The study assumes that 60 percent of followed-up cases will yield completed interviews, accounting for nonresponse and for cases that will not be eligible for the follow-up; hence, the approximate sample size is expected to be 1,670. Census will select about 50 clusters of several counties each, mainly in rural areas where there is less sample, and single-county clusters in urban areas, which have more sample. Within each cluster, Census will preselect approximately 9 cases.
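The 1,670-address figure follows from inflating the target of 1,000 completed interviews by the assumed 60 percent completion rate; an illustrative sketch (rounding up to a multiple of ten is an assumption made here to match the published figure):

```python
import math

# Sample size needed for the nonresponse bias study.
target_completes = 1_000
assumed_completion_rate = 0.60  # stated in the design above

required = target_completes / assumed_completion_rate  # about 1,666.7 addresses
sample_size = math.ceil(required / 10) * 10            # rounded up -> 1,670
```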


Booklet Design Test. The study will include a test that will allow for evaluation of response rates when the main survey is administered as a single booklet format versus 3 separate questionnaires with 3 separate postage-paid envelopes. Approximately half of the households will receive each type of mailing. There is sufficient power within the planned sample design to evaluate the main effect on response rates of booklet versus separate questionnaires, and we may be able to add some high level interaction terms as well. Because there is no methodological literature on the effect of separate questionnaires versus a booklet, we felt it was important to evaluate this effect and learn more about its size and impact.

7. Methods for Maximizing Response Rates

The NATES design incorporates a number of features to maximize response rates; in particular, it will use the same follow-up methods and incentive amounts that were successfully used in the 2012 NHES.


Total Design Method/Respondent Friendly Design. This approach combines the least expensive and most effective methods available, beginning with the least labor-intensive mode and progressing to increasingly costly modes as needed. While this places an emphasis on the efficient use of resources, these procedures also create a respondent-friendly approach that uses design attributes, a scheduled sequence of contacts, and survey mode to motivate and encourage survey participation. Surveys that take advantage of respondent-friendly design have demonstrated increases in survey response (Dillman, Smyth, and Christian, 2008; Dillman, Sinclair, and Clark, 1993).


Engaging Respondent Interest and Cooperation. The content of respondent letters and frequently asked questions (FAQs) will be focused on communicating the legitimacy and importance of the study. Interviewer training for the Nonresponse Bias Analysis Study will focus on strategies for communicating the importance and legitimacy of the survey and gaining cooperation.


Nonresponse Follow-up. The data collection protocol includes several stages of nonresponse follow-up prior to releasing cases to the Field Representatives to administer the nonresponse bias study. In addition to the numbers of contacts, changes in follow-up method (mail, FedEx) are designed to capture the attention of potential respondents and are based on the recent successful 2012 NHES administration.


Incentives. Incentives will be used for the main survey and the household screener experiment. A prepaid incentive of $15 will be used in the main survey and $5 will be used for the household Screener (as was done in the 2012 NHES).


Telephone Questionnaire Assistance. A toll free telephone number will be listed on the questionnaire for respondents to call to request more mail materials and to receive answers to questions regarding the questionnaire.


Cognitive Laboratory Research for the NATES Pilot

The NATES Pilot was preceded by 14 in-person cognitive interviews on the main survey materials, conducted with adults from the Washington, DC metro area. Participants were selected based on their self-reported credential status, education attainment, and age. Changes to the survey letter and questions were made based on these interviews, including:


  • Revising language that asks for all household members to complete the survey

  • Revising the skip patterns and directions in the household screener

  • Shortening the letter that will accompany the questionnaires

  • Revising response categories and terminology in several questions; for example, using the word “issued” instead of “awarded” in the certification section, adding “vocational” to descriptions of types of schools, using arrows to indicate “other (specify)” boxes, and creating emphasis by underlining or capitalizing words.


Individuals Responsible for Study Design and Performance

The people listed below participated in the study design and are responsible for the collection and analysis of the data.

Sharon Boivin, NCES 202/502-7627

Sharon Stern, Census Bureau 301/763-5638


References

Bureau of Labor Statistics. (2012, March 23). Education pays. Retrieved from http://www.bls.gov/emp/ep_chart_001.htm.


Career One Stop. (n.d.). More education means more money. Retrieved August 7, 2012 from http://www.careerinfonet.org/finaidadvisor/earnings.aspx?nodeid=21.


Carnevale, A., Rose, S., and Hansen, A. (2012). Certificates: Gateway to Gainful Employment and College Degrees. Washington, DC: The Georgetown University Center on Education and the Workforce. Available online at http://cew.georgetown.edu/certificates/.

Carnevale, A., Smith, N., and Strohl, J. (2010). Help Wanted: Projections of Jobs and Education Requirements Through 2018. Washington, DC: The Georgetown University Center on Education and the Workforce. Available online at http://cew.georgetown.edu/jobs2018/.

Dillman, D.A., Sinclair, M.D., and Clark, J.R. (1993). Effects of questionnaire length, respondent-friendly design, and difficult questions on response rates for occupant-addressed Census mail surveys. Public Opinion Quarterly, 57, 289-304.

Dillman, D.A., Smyth, J.D., and Christian, L.M. (2008). Internet, mail, and mixed mode surveys: The Tailored Design Method. New York: Wiley.

Julian, T. and Kominski, R. (2011). Education and Synthetic Work-Life Earnings Estimates. Suitland, MD: U.S. Bureau of the Census.

U.S. Department of Labor. (2011). Occupational Earnings in the United States, 2010. Washington, D.C.: Bureau of Labor Statistics. Available online at http://www.bls.gov/ncs/ocs/sp/nctb1475.pdf.



1 Estimates from the ATES pilot study are not considered official statistics.

2 For the booklet, which is essentially three individual questionnaires, this sentence will say 20 minutes instead of 15 minutes.


