
OMB Control No. 0693-0033

NIST Generic Clearance for Program Evaluation Data Collections

Expiration Date: 10/31/2012


REQUEST FOR CLEARANCE TO CONDUCT A SURVEY FOR THE NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY (NIST) OF RECENT FORMER NRC NIST POSTDOCS (WITHIN THE PAST FIVE YEARS)


FOUR STANDARD SURVEY QUESTIONS


1. Explain who will be surveyed and why the group is appropriate to survey.


The National Institute of Standards and Technology (NIST) proposes to conduct a survey to assess the NIST National Research Council (NRC) Postdoctoral Research Associateship Program (RAP). Westat has been contracted to carry out the assessment of the Program. This two-year, multi-method study seeks to examine the postdoctoral experiences and career trajectories of NRC NIST postdocs.


Recent former NRC NIST postdocs who completed their NRC NIST postdoc appointment within the past five years will be surveyed using the Survey of Recent Former NRC NIST Postdocs (Within the Past Five Years).


This group was selected because input from these respondents will help us answer questions about the postdoctoral experiences and career trajectories of NRC NIST postdocs.


The assessment of the NRC NIST Postdoctoral Research Associateship Program has two major objectives:


The first objective is to conduct a program evaluation that assesses the organizational benefits, if any, that accrue to NIST as a result of RAP. This objective has three major goals: to determine (1a) whether RAP is an effective recruitment tool that converts talented postdocs into permanent employees; (1b) whether, and how, RAP advances NIST programs and its mission; and (1c) whether the outreach efforts of NIST are effective in attracting high-quality candidates to apply for and accept RAP awards. As shown in Chart 1 below, the survey will explore goals 1a-c.


The proposed survey will use a quasi-experimental design to construct a meaningful comparison group from qualified but denied NIST applicants. This design approach has the clear advantage of being methodologically rigorous as well as practical for data collection. It also reduces selection bias by using individuals who, like the awardees, demonstrated an interest in the program, motivation to apply, and strong enough qualifications to be recommended for inclusion. Goal 1c will be explored by comparing the survey results of NRC NIST applicants to those of recent former NRC NIST postdocs (see Chart 1).


The second objective is to examine the perceived and measurable program benefits for the NRC NIST postdocs. The two major goals of this objective are to determine: (2a) the usefulness of participation for advancing the careers and productivity of the NRC NIST postdocs; and (2b) participants’ perceptions of the effects of program participation. As shown in Chart 1, the survey will explore goals 2a and 2b.


The primary purpose of this survey data collection is to gain a better understanding of how the program is being implemented and what its impact is on the sponsoring institution and participating individuals. NIST will use the data to assess the extent and effects of programmatic activities and associated outcomes of these activities. All information collected will be used to provide analytical and policy support to NIST’s International and Academic Affairs Office (IAAO), helping NIST make decisions about future program initiatives to improve postdoctoral training.


Chart 1.—Goals of Study and Links to Survey of Recent Former NRC NIST Postdocs (Within the Past Five Years)


Goal/research question | Indicator
1a. Is RAP beneficial to NIST as a recruitment tool? | Experiences of recent former NRC NIST postdocs
1b. Beneficial to NIST programs and mission? | Experiences of recent former NRC NIST postdocs
1c. Effectiveness of NIST outreach efforts to attract high quality candidates? | Quality of recent former NRC NIST postdocs versus NRC NIST applicants
2a. What are the career trajectories of NRC NIST postdocs and how do they compare to non-RAP awardees? | Reports of career activities of recent former NRC NIST postdocs versus NRC NIST applicants
2b. What is the attractiveness of the NIST RAP program? | Subjective list of benefits and concerns (e.g., relationship with advisor, career opportunities)



2. Explain how the survey was developed including consultation with interested parties, pre-testing, and responses to suggestions for improvement.


Development work was done to identify appropriate and relevant items for the survey. This work included conducting a literature review and a search of existing survey instruments. Several expert panel sessions were conducted with current NRC NIST postdocs and advisors in November 2010 to identify key issues to address and terminology to use in all of the surveys. Multiple drafts of the recent former NRC NIST postdoc survey were reviewed internally by project team members for item relevance and clarity. Pretest calls were conducted with nine former NRC NIST postdocs (including one recent former postdoc) in January and February of 2011 to improve the surveys. Respondents reviewed and completed the survey and were then asked, during follow-up phone interviews, about the clarity and relevance of each survey item (e.g., whether they could answer each question without too much burden, and how long it took to complete the survey). After the follow-up, the survey was reviewed and revised based on the respondents’ feedback.


The information collected in this study represents the minimum effort required to assess the benefits of the NRC NIST Postdoctoral Research Associateship Program. It is expected that the survey will take 30 minutes to complete. This estimate is based on feedback received from the pretests of the survey instrument (see Chart 2 for the survey response burden reported by respondents). Please note that the full version of the survey was pretested with all of the former postdocs (1 recent former postdoc who finished within the past 5 years and 8 former postdocs who finished more than 5 years ago). Recent former postdocs who finished within the past 5 years will complete the revised, full version of the pretested survey. As a result, the response burden estimates from that pretest, shown below, are applicable to this sample.



Chart 2.—Survey Response Burden from Pretests of All Former Postdocs: 1 Recent Former Postdoc (Within the Past 5 Years) and 8 Former Postdocs (More Than 5 Years Ago)


 | Number of respondents | Mean burden (in minutes) | Range (in minutes)
Survey response burden | 9 | 28 | 10-45

Note: Several respondents gave ranges for completion times (e.g., 20-25 minutes). The average and range provided here are based on the high end of the range (e.g., 25 minutes from the example above).



3. Explain how the survey will be conducted, how customers will be sampled if fewer than all customers will be surveyed, expected response rate, and actions your agency plans to take to improve the response rate.


Stratified simple random samples will be used with the goal of achieving the final sample of 87 recent former NRC NIST postdocs. The sample will be explicitly stratified by program termination date¹ and implicitly stratified by specialty research field². The samples will be proportionally allocated over the four to five strata for each of these stratification variables.
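
For illustration, a minimal sketch of how proportional allocation over strata might be carried out; the stratum labels follow the specialty research fields in footnote 2, and the counts shown are hypothetical placeholders rather than actual frame counts:

import math

# Hypothetical stratum sizes for illustration only; actual counts come from the
# NRC NIST program lists.
strata_sizes = {
    "physical sciences": 420,
    "engineering": 260,
    "mathematical and computer sciences": 150,
    "biological, biomedical, and health sciences": 180,
    "all others": 72,
}

def proportional_allocation(strata_sizes, total_sample):
    """Allocate total_sample across strata in proportion to stratum size."""
    population = sum(strata_sizes.values())
    shares = {k: total_sample * n / population for k, n in strata_sizes.items()}
    alloc = {k: math.floor(v) for k, v in shares.items()}
    # Hand out any remaining slots to the strata with the largest fractional parts.
    remainder = total_sample - sum(alloc.values())
    for k in sorted(shares, key=lambda k: shares[k] - alloc[k], reverse=True)[:remainder]:
        alloc[k] += 1
    return alloc

print(proportional_allocation(strata_sizes, total_sample=207))
# Members within each stratum would then be drawn by simple random sampling.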





Updating Respondent Contact Information


We will need contact information for respondents in order to administer the surveys and telephone interviews. This information will include e-mail addresses, telephone numbers (work or home), and work or home addresses.


We will use lists that were compiled as the postdoctoral program progressed from 1980 to 2011 to create the sample of recent former NRC NIST postdocs who completed the program in the past five years. The lists contain candidates’ names, addresses, and other contact information, such as phone numbers and/or email addresses, but some of this information is missing or outdated. Updating the lists in their entirety would be costly and time-consuming. Therefore, the lists will be updated on a sample basis: a random sample large enough to yield a sufficient number of candidates for the survey will be selected (see Sample Selection and Expected Response Rate below for a description of how the selected sample sizes were determined). This approach helps to control cost while maintaining the ability to generalize the survey results.


Chart 3 below shows the number of respondents for whom there is contact information. As seen in the chart, there is contact information for 130 respondents; 77 respondents will need updated contact information.


Chart 3.—Selected Sample With Contact Information, by Type of Respondent


Population size | Selected sample size | Selected sample with contact information | Selected sample needing contact information
1,082 | 207 | 130 | 77


Westat will employ its own hourly staff to trace contact information. These staff will include individuals with tracing experience, and they will be trained in a range of tracing techniques by one of the Westat team members working on the NIST project. Based on an experiment with small samples, it is expected that accurate contact information will be obtained for approximately 60 percent of the recent former postdocs on the list.


Sample Selection and Expected Response Rate


Chart 4 shows information on the selected, field, and final sample sizes for all recent former postdocs. Westat assumes that the response rate among those for whom good contact information can be obtained is approximately 70 percent. The final target sample size is 87 respondents. Given this final target sample size and applying the expected response rate of 70 percent, the selected sample size and the field sample size are calculated as shown in Chart 4, along with the expected precision for an estimate of a population proportion of 50 percent (the proportion customarily used to calculate sample size for an expected precision)³. Based on these calculations, it was determined that a selected sample size of 207 respondents is needed.


Chart 4.—Various Sample Sizes and Expected Response Rate

Population size (N) | Selected sample size (A) | Correct contact information rate (%) | Field sample size (B) | Response rate (%) | Final sample size (n)
1,082 | 207 | 60 | 124 | 70 | 87

(A) The selected sample size is the total number of respondents selected for inclusion in the study.

(B) The field sample consists of those respondents for whom we have, or expect to find, contact information.
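
The sample sizes in Chart 4 can be checked by working backward from the final target through the two loss stages (contact tracing and response); a minimal sketch of that arithmetic, using the rates stated above:

final_target = 87        # desired number of completed surveys
contact_rate = 0.60      # assumed rate of obtaining accurate contact information
response_rate = 0.70     # assumed response rate among those with usable contact information

# Work backward through both loss stages to get the selected sample size.
selected_sample = final_target / (contact_rate * response_rate)   # 87 / 0.42 = 207.1 -> 207
# The field sample is the portion of the selected sample expected to have usable contact info.
field_sample = round(selected_sample) * contact_rate               # 207 * 0.60 = 124.2 -> 124

print(round(selected_sample), round(field_sample))                 # -> 207 124, as in Chart 4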


Chart 5 shows the estimated burden hours to complete the survey. It is estimated that the total annual burden will be 44 hours.


Chart 5.—Estimated Annual Burden Hours for Survey of Recent Former NRC NIST Postdocs (Within the Past Five Years)


Number of respondents | Burden per respondent (in minutes) | Total annual burden (in hours)
87 | 30 | 44
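
The total in Chart 5 follows directly from the two entries before it; a quick check of the arithmetic:

import math

respondents = 87
minutes_per_respondent = 30

total_minutes = respondents * minutes_per_respondent   # 2,610 minutes
total_hours = total_minutes / 60                        # 43.5 hours
print(math.ceil(total_hours))                           # -> 44 hours, as reported in Chart 5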


Survey Administration


Data will be gathered using a self-administered online survey of approximately 87 recent former NRC NIST postdocs. Respondents will have varying types of contact information. Respondents with a valid e-mail address will be sent an e-mail message one week prior to survey administration to (1) notify them about the survey study and (2) verify their e-mail address. During survey administration, another e-mail message will be sent to invite them to participate in the study. This e-mail message will include an attached cover letter that requests the participation of each respondent and introduces the purpose and content of the survey. The cover letter includes instructions on how to access and complete the web version of the survey, as well as contact information in case of queries. Information about the option to complete a traditional paper version of the survey will also be included in the letter. For respondents who have a telephone number, a call will be made to obtain their e-mail address, and an e-mail invitation will then be sent. For respondents who have a work or home address, a letter will be mailed to request their e-mail address, and an e-mail invitation will then be sent. The information needed to complete the survey will be made readily available to respondents.


To guard against an unexpectedly low response rate, a reserve sample equal to 20 percent of the main sample, or 42 respondents, will be selected.


To improve response rates, follow-up efforts will be used to encourage survey completion. Reminder e-mails will be sent to all non-respondents beginning two weeks after initiation of the survey. A second reminder will be sent to all non-respondents three weeks after initiation. Telephone follow-up for nonresponse will begin about four weeks after initiation of the survey. Experienced telephone interviewers trained in nonresponse conversion will conduct up to four follow-up calls.



4. Describe how the results of the survey will be analyzed and used to generalize the results to the entire customer population.


The present study will use a mixed-methods approach to analyze the survey data collected. Closed-ended survey items will be analyzed quantitatively. Data will first be analyzed descriptively and then used in multivariate statistical models to examine the study questions.

Qualitative techniques will be employed to analyze open-ended survey items. Content analysis will be used to code data thematically, and pattern codes will be generated for the final descriptive analysis. Evaluators will first examine the data individually and then convene to compare and contrast observed patterns. The frequencies of agreed-upon patterns will be included in the final report.


The final report will be delivered to the IAAO at NIST upon the conclusion of the project (approximately 24 months from launch). This report will include a summary of all the data collected and analyzed during the assessment, as well as conclusions that address each of the research questions and project components.

1 Program termination date has four strata: (1) 5 years or less after participation in NRC/NIST; (2) 6-9 years after NRC/NIST; (3) 10-19 years after NRC/NIST; and (4) 20-30 years after NRC/NIST.

2 Specialty research field has five strata: (1) physical sciences; (2) engineering; (3) mathematical and computer sciences; (4) biological, biomedical, and health sciences; and (5) all others that include other fields, unclear fields, and missing.

3 To calculate the expected precision, an assumption was made on the design effect. Even though equal probability samples will be selected, the final respondent sample will have unequal weights after nonresponse adjustment. Since the starting weights are equal before nonresponse adjustment, the design effect is expected to be moderate (the larger the variability of the weights, the larger the design effect), and so, a value of 1.2 for the design effect is assumed. Because the population sizes are not large, a hyper-geometric distribution was used. Under the hyper-geometric distribution, the standard error (se) of a proportion estimate for a population proportion (P) is given by the following formula:


se = sqrt[ (P × Q / n) × (N − n) / (N − 1) ]        (1)

where N is the population size, n is the final sample size, and Q = 1 - P. However, a sample with unequal weights will not be as efficient as a simple random sample (i.e., with equal weights), and thus, the design effect > 1. Therefore, to use formula (1), the sample size was adjusted using the design effect by replacing n by n' = n / d, where d is an assumed value of the design effect. The design effect adjusted sample size n' is referred to as the effective sample size. For the preparation of Chart 4, formula (1) with n' and P = 0.5 was used.
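
As an illustration of this footnote, a minimal sketch of the expected-precision calculation using N = 1,082, n = 87, P = 0.5, and the assumed design effect of 1.2; the finite-population form of formula (1) applied here is reconstructed from the definitions above:

import math

N = 1082     # population size
n = 87       # final sample size
P = 0.5      # assumed population proportion (the customary conservative choice)
Q = 1 - P
d = 1.2      # assumed design effect reflecting unequal weights after nonresponse adjustment

# Effective sample size after adjusting for the design effect.
n_eff = n / d                                            # 72.5

# Formula (1): standard error of a proportion under the hypergeometric
# (finite-population) formulation, evaluated at the effective sample size.
se = math.sqrt((P * Q / n_eff) * (N - n_eff) / (N - 1))

# Half-width of an approximate 95 percent confidence interval (expected precision).
print(f"se = {se:.3f}; 95% half-width = {1.96 * se:.3f}")   # roughly 0.057 and 0.111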



