Supporting Statement

NIST Generic Clearance for Program Evaluation Data Collections

OMB Control No. 0693-0033

Expiration Date: 10/31/2012


Request for Clearance to Conduct a Survey for the National Institute of Standards and Technology (NIST) of National Research Council (NRC) NIST Postdoc Non-Awardees



FOUR STANDARD SURVEY QUESTIONS



1. Explain who will be surveyed and why the group is appropriate to survey.


The National Institute of Standards and Technology (NIST) proposes to conduct a survey to assess the NIST National Research Council (NRC) Postdoctoral Research Associateship Program (RAP). Westat, the contractor engaged to carry out the assessment of the Program, will administer the survey. This two-year, multi-method study examines the postdoctoral experiences and career trajectories of NRC NIST postdocs.


The survey will be conducted with NRC NIST Non-Awardees using the Survey of NRC NIST Non-Awardees.

This group was selected because input from these respondents, who applied to the program but did not receive awards, will help answer questions about the postdoctoral experiences and career trajectories of NRC NIST postdocs.


The assessment of the NRC NIST Postdoctoral Research Associateship Program has two major objectives:


The first objective is to conduct a program evaluation that assesses the organizational benefits, if any, that accrue to NIST as a result of RAP. This objective has three major goals: to determine (1a) whether RAP is an effective recruitment tool that converts talented postdocs into permanent employees; (1b) whether, and how, RAP advances NIST programs and its mission; and (1c) whether the outreach efforts of NIST are effective in attracting high-quality candidates to apply for and accept RAP awards. As shown in Chart 1 below, the survey will explore goals 1a-c.


The proposed survey will use a quasi-experimental design to construct a meaningful comparison group from qualified but denied NIST applicants. This design approach has the clear advantage of being rigorous in design as well as practical in data collection. It also mitigates selection bias by using individuals who, like the awardees, demonstrated an interest in the program, motivation to apply, and strong enough qualifications to be recommended for inclusion. Goal 1c will be explored by comparing the survey results of NRC NIST applicants to those of NRC NIST Non-Awardees (see Chart 1).

The second objective is to examine the perceived and measurable program benefits for the NRC NIST postdocs. The two major goals of this objective are to determine: (2a) the usefulness of participation for advancing the careers and productivity of the NRC NIST postdocs; and (2b) participants' perceptions of the effects of program participation. NIST is also interested in obtaining information about why successful applicants decline the RAP award. As shown in Chart 1, the survey will explore goals 2a and 2b.


The primary purpose of this survey data collection is to gain a better understanding of how the program is being implemented and what its impact is on the sponsoring institution and participating individuals. NIST will use the data to assess the extent and effects of programmatic activities and associated outcomes of these activities. All information collected will be used to provide analytical and policy support to NIST’s International and Academic Affairs Office (IAAO), helping NIST make decisions about future program initiatives to improve postdoctoral training.


Chart 1.—Goals of Study and Links to Survey of Non-Awardees

Goal/research question | Indicator
1a. Is RAP beneficial to NIST as a recruitment tool? | Experiences of NRC NIST Non-Awardees
1b. Is RAP beneficial to NIST programs and mission? | Experiences of NRC NIST Non-Awardees
1c. How effective are NIST outreach efforts in attracting high-quality candidates? | Quality of NRC NIST Non-Awardees versus NRC NIST applicants
2a. What are the career trajectories of NRC NIST postdocs and how do they compare to non-RAP awardees? | Reports of career activities of NRC NIST Non-Awardees versus NRC NIST applicants
2b. What is the attractiveness of the NIST RAP program? | Subjective list of benefits and concerns (e.g., relationship with advisor, career opportunities)


2. Explain how the survey was developed including consultation with interested parties, pre-testing, and responses to suggestions for improvement.


Development work was done to identify appropriate and relevant items for the survey. This work included conducting a literature review and a search of existing survey instruments. Multiple drafts of the NRC NIST Non-Awardees survey were reviewed internally by project team members for item relevance and clarity.


The information collected in this study represents the minimum effort required to assess the benefits of the NRC NIST Postdoctoral Research Associateship Program. The survey is expected to take 30 minutes to complete. This estimate is based on feedback from pretests of the related survey instruments for current, recent former, and former postdocs.




3. Explain how the survey will be conducted, how customers will be sampled if fewer than all customers will be surveyed, expected response rate, and actions your agency plans to take to improve the response rate.


Stratified simple random samples will be used with the goal of achieving the final sample of 333 respondents. The sample will be explicitly stratified by program termination date [1] and implicitly stratified by specialty research field [2]. The samples will be proportionally allocated over the four to five strata for each of these stratification variables.
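As a rough illustration of the proportional allocation this implies, consider the following minimal Python sketch; the frame, field names, and stratum labels are hypothetical stand-ins for the actual candidate lists (implicit stratification by research field, typically achieved by sorting within strata before a systematic draw, is omitted):

```python
import random

# Hypothetical frame: each candidate record carries the explicit
# stratification variable (program termination date category, footnote 1).
random.seed(42)
frame = [{"id": i, "term_stratum": random.randrange(4)} for i in range(2076)]

total = len(frame)
selected = []
for stratum in range(4):
    members = [rec for rec in frame if rec["term_stratum"] == stratum]
    # Proportional allocation: each stratum contributes in proportion to its size.
    k = round(1110 * len(members) / total)
    selected.extend(random.sample(members, k))

print(len(selected))  # approximately 1,110 across the four strata
```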


Updating Respondent Contact Information


We will need contact information for respondents in order to administer the surveys and telephone interviews. This information will include e-mail addresses, telephone numbers (work or home), and work or home addresses.


We will use lists that were compiled as the postdoctoral program progressed from 1980 to 2011 to create the sample of NRC NIST Non-Awardees. The lists contain candidates' names, addresses, and other contact information, such as phone numbers and/or e-mail addresses, but some of this information is unavailable or outdated. Updating the lists in their entirety would be costly and time-consuming. Therefore, the lists will be updated on a sample basis: a random sample large enough to yield a sufficient number of candidates for an interview will be selected (see Sample Selection and Expected Response Rate below for a description of how the selected sample sizes were determined). This approach helps control costs while maintaining the ability to generalize the survey results.


Chart 2 below shows the number of sampled candidates for whom contact information is available. As the chart shows, all 1,110 selected sample members will need updated contact information.


Chart 2.—Selected Sample with Contact Information

Population size | Selected sample size | Selected sample with contact information | Selected sample needing contact information
2,076 | 1,110 | 0 | 1,110


Westat will employ its own hourly staff to trace contact information. This staff will include individuals with tracing experience who will be trained in a multitude of tracing techniques by one of the Westat team members working on the NIST project. Based on experiments with small samples, it is expected that accurate contact information will be obtained for approximately 50 percent of the people on the list (the rate assumed in Chart 3).

Sample Selection and Expected Response Rate


Chart 3 shows information on selected, field, and final sample sizes for all non-awardees. Westat assumes that the response rate among those for whom good contact information can be obtained is approximately 60 percent. The final target sample size is 333 respondents. Given this target and the expected rates above, the selected sample size and the field sample size are calculated as shown in Chart 3, along with the expected precision for an estimate of a population proportion of 50 percent (the proportion customarily used to calculate sample size for an expected precision) [3]. Based on these calculations, the selected sample size needed is 1,110 respondents.
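The arithmetic linking these sample sizes is a simple chain of rate adjustments; a minimal Python sketch using the rates from Chart 3:

```python
# Working backward from the target number of completed surveys (Chart 3).
target_final = 333     # final sample size (n)
response_rate = 0.60   # expected response rate among fielded cases
contact_rate = 0.50    # expected rate of obtaining correct contact information

field_sample = target_final / response_rate    # 555 cases must be fielded
selected_sample = field_sample / contact_rate  # 1,110 cases must be selected

print(round(field_sample), round(selected_sample))  # -> 555 1110
```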


Chart 3.—Various Sample Sizes and Expected Response Rate

Population size (N) | Selected sample size [A] | Correct contact information rate (%) | Field sample size [B] | Response rate (%) | Final sample size (n)
2,076 | 1,110 | 50 | 555 | 60 | 333

[A] The selected sample size is the total number of respondents selected for inclusion in the study.
[B] The field sample consists of those respondents for whom we have, or expect to find, contact information.


Chart 4 shows the estimated burden hours to complete the survey. The total annual burden is estimated at 167 hours (333 respondents × 30 minutes ≈ 166.5 hours, rounded to 167).


Chart 4.—Estimated Annual Burden Hours for Interview with NRC NIST Non-Awardees

Number of respondents | Burden per respondent (in minutes) | Total annual burden (in hours)
333 | 30 | 167


Survey Administration


Data will be gathered using a self-administered online or paper survey of approximately 333 respondents. Respondents will have varying types of contact information. Respondents with a valid e-mail address will be sent an e-mail message one week prior to survey administration to (1) notify them about the survey and (2) verify their e-mail address. During survey administration, another e-mail message will be sent inviting them to participate in the study. This message will include an attached cover letter that requests each respondent's participation and introduces the purpose and content of the survey. The cover letter includes instructions on how to complete the web version of the survey, as well as contact information in case of queries. Information about the option to complete a traditional paper version of the survey will also be included in the letter.

For respondents who have only a telephone number, a call will be made to obtain an e-mail address, after which an e-mail invitation will be sent. For respondents who have only a work or home address, a letter will be mailed requesting an e-mail address, and an e-mail invitation will then be sent. The information needed to complete the survey will be made readily available to respondents.
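A minimal sketch of this contact-mode escalation (Python; the record fields and helper functions are hypothetical placeholders, not part of any actual survey system):

```python
# Hypothetical sketch of the contact-mode logic described above.

def send_email_invitation(email: str) -> None:
    print(f"e-mail invitation sent to {email}")

def call_to_obtain_email(phone: str):
    print(f"calling {phone} to request an e-mail address")
    return None  # placeholder: address collected during the call, if any

def mail_letter_requesting_email(address: str) -> None:
    print(f"letter requesting an e-mail address mailed to {address}")

def initial_contact(respondent: dict) -> None:
    """Choose the first outreach step based on available contact information."""
    if respondent.get("email"):
        send_email_invitation(respondent["email"])
    elif respondent.get("phone"):
        email = call_to_obtain_email(respondent["phone"])
        if email:
            send_email_invitation(email)
    elif respondent.get("address"):
        mail_letter_requesting_email(respondent["address"])

initial_contact({"phone": "301-555-0100"})  # example: phone-only respondent
```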


The survey is scheduled to be administered in spring 2011.


To counter unexpectedly low response, a reserve sample with a sample size of 20 percent of the main sample, or 160 respondents, will be selected.


To improve response rates, reminder e-mails will be sent to all non-respondents beginning two weeks after initiation of the survey. A second reminder will be sent to all non-respondents three weeks after initiation. Telephone follow-up for nonresponse will begin about four weeks after initiation of the survey. Experienced telephone interviewers trained in nonresponse conversion will conduct up to four follow-up calls.


4. Describe how the results of the survey will be analyzed and used to generalize the results to the entire customer population.


The present study will use a mixed-methods approach to analyze the survey data collected. Closed-ended survey items will be analyzed quantitatively: data will first be analyzed descriptively and then used in multivariate statistical models to examine the study questions.
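As a rough illustration of this two-step quantitative analysis, a minimal Python sketch with invented data (the variables and model specification are hypothetical, not the study's actual analysis plan):

```python
import numpy as np
import statsmodels.api as sm

# Invented data: a binary survey outcome, group membership, and years since
# program termination for 300 hypothetical respondents.
rng = np.random.default_rng(0)
group = rng.integers(0, 2, 300)       # 0 = non-awardee, 1 = awardee (invented)
years_out = rng.integers(1, 30, 300)  # years since program termination (invented)
outcome = rng.integers(0, 2, 300)     # binary survey outcome (invented)

# Descriptive pass: outcome proportions.
print(np.bincount(outcome) / outcome.size)

# Multivariate pass: logistic model of the outcome on group and years out.
X = sm.add_constant(np.column_stack([group, years_out]))
model = sm.Logit(outcome, X).fit(disp=0)
print(model.params)
```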

Qualitative techniques will be employed to analyze open-ended survey items. Content analysis will be used to code data thematically, and pattern codes will be generated for the final descriptive analysis. Evaluators will first examine the data individually and then convene to compare and contrast observed patterns. The frequencies of agreed-upon patterns will be included in the final report.
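A minimal sketch of the final frequency tally for agreed-upon pattern codes (Python; the codes shown are invented examples):

```python
from collections import Counter

# Invented example: thematic codes assigned to open-ended responses after
# evaluators reconcile their individual coding passes.
agreed_codes = [
    ["mentoring", "career_growth"],
    ["career_growth"],
    ["stipend", "mentoring"],
]

# Frequency of each agreed-upon pattern code, for the final descriptive report.
frequencies = Counter(code for response in agreed_codes for code in response)
print(frequencies.most_common())
# e.g. [('mentoring', 2), ('career_growth', 2), ('stipend', 1)]
```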


The final report will be delivered to the IAAO at NIST upon the conclusion of the project (approximately 24 months from launch). The final report will include a summary of all the data collected and analyzed during the assessment, as well as conclusions that address each of the research questions and project components.







[1] Program termination date has four strata: (1) 5 years or less after participation in NRC/NIST; (2) 6-9 years after NRC/NIST; (3) 10-19 years after NRC/NIST; and (4) 20-30 years after NRC/NIST.

[2] Specialty research field has five strata: (1) physical sciences; (2) engineering; (3) mathematical and computer sciences; (4) biological, biomedical, and health sciences; and (5) all others, including other fields, unclear fields, and missing values.

[3] To calculate the expected precision, an assumption was made about the design effect. Even though equal-probability samples will be selected, the final respondent sample will have unequal weights after nonresponse adjustment. Since the starting weights are equal before nonresponse adjustment, the design effect is expected to be moderate (the larger the variability of the weights, the larger the design effect), and so a value of 1.2 is assumed for the design effect. Because the population sizes are not large, a hypergeometric distribution was used. Under the hypergeometric distribution, the standard error (se) of an estimate of a population proportion (P) is given by the following formula:

$se = \sqrt{\dfrac{N - n}{N - 1} \cdot \dfrac{PQ}{n}}$    (1)

where N is the population size, n is the final sample size, and Q = 1 - P. However, a sample with unequal weights will not be as efficient as a simple random sample (i.e., one with equal weights), and thus the design effect > 1. Therefore, to use formula (1), the sample size was adjusted for the design effect by replacing n with n' = n / d, where d is the assumed value of the design effect. The design-effect-adjusted sample size n' is referred to as the effective sample size. For the preparation of Chart 3, formula (1) with n' and P = 0.5 was used.
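As a worked check of formula (1) with the design-effect adjustment, a short Python sketch using the values above (the resulting 95 percent half-width is our derived figure, not taken from the source):

```python
from math import sqrt

# Values from Chart 3 and footnote 3.
N = 2076       # population size
n = 333        # final sample size
d = 1.2        # assumed design effect
P = 0.5        # population proportion used for precision calculations
Q = 1 - P

n_eff = n / d  # effective sample size n' = n / d (= 277.5)

# Standard error under the hypergeometric distribution, with n replaced by n'.
se = sqrt((N - n_eff) / (N - 1) * P * Q / n_eff)
print(round(se, 4))         # ~0.0279
print(round(1.96 * se, 3))  # ~0.055, i.e. roughly +/-5.5 percentage points
```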



