2014 Pilot Early Career Doctorates Survey
Supporting Statement Part B
OMB: 3145-0235

B. Collection of Information Employing Statistical Methods

B.1 Universe and Sampling Procedure

The Pilot ECD Survey is a survey of individuals who completed their doctorate degree within the past 10 years, regardless of whether the degree was earned in the U.S. or abroad, and who currently work at a U.S. academic institution, federally funded research and development center (FFRDC), or the National Institutes of Health Intramural Research Program (NIH IRP). While the primary goal of this pilot survey is to estimate the total count of ECDs in these settings, good estimates of the number of postdocs and of the number of ECDs holding doctorate degrees earned outside the U.S. are also needed, as these are subpopulations on which data are extremely limited. Consequently, the sampling procedures for the pilot survey have been designed to maximize the precision of these estimates.

For the Pilot ECD Survey, a two-stage sampling plan will be implemented. The first stage will be a stratified sample of institutions: U.S. academic institutions, FFRDCs, and NIH IRPs. For the Pilot ECD Survey, the frame for the U.S. academic institutions will be the 572 institutions participating in the NSF-NIH Survey of Graduate Students and Postdoctorates in Science and Engineering (GSS). While membership in the GSS, FFRDC, or NIH IRP group will represent the primary stratification in the institutional sample, the GSS institutions will be further stratified by type of institution, defined using Carnegie classification and the existence of a medical school within the institution. The institutional-level sample will include 160 GSS institutions selected with probability proportional to size (PPS), where size will be estimated by a composite measure based on the number of ECDs, postdoctoral appointees (postdocs), and non-U.S. doctorate degree holding ECDs at each institution. All 40 FFRDCs and all 25 NIH IRPs will be sampled with certainty.
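
As a concrete illustration of the first-stage selection, the sketch below draws a PPS systematic sample using a composite size measure. The frame fields, the equal composite weights, and the function names are assumptions for illustration only; the actual composite measure and selection software are not specified here. Note that under systematic PPS an institution whose composite size exceeds the sampling interval can be selected more than once, which is how the multiple hits for very large institutions described in the next paragraph arise.

```python
# Minimal sketch of first-stage PPS systematic selection using a composite
# measure of size. The frame fields ("ecds", "postdocs", "foreign_ecds") and
# the equal composite weights are assumptions for illustration only.
import random

def composite_size(inst, w_ecd=1.0, w_postdoc=1.0, w_foreign=1.0):
    """Assumed composite size combining ECD, postdoc, and non-U.S.-doctorate counts."""
    return (w_ecd * inst["ecds"]
            + w_postdoc * inst["postdocs"]
            + w_foreign * inst["foreign_ecds"])

def pps_systematic_sample(frame, n):
    """Select n institutions with probability proportional to composite size
    using systematic sampling on a randomly ordered frame. Institutions whose
    size exceeds the sampling interval can be selected more than once."""
    frame = sorted(frame, key=lambda _: random.random())   # random sort order
    sizes = [composite_size(inst) for inst in frame]
    interval = sum(sizes) / n
    start = random.uniform(0, interval)
    targets = [start + k * interval for k in range(n)]
    selections, cum, t_idx = [], 0.0, 0
    for inst, size in zip(frame, sizes):
        cum += size
        while t_idx < len(targets) and targets[t_idx] < cum:
            selections.append(inst)      # multiple hits possible for large units
            t_idx += 1
    return selections

# Example: draw 160 GSS institutions from a hypothetical frame of 572 records,
# e.g. frame = [{"id": 1, "ecds": 900, "postdocs": 250, "foreign_ecds": 120}, ...]
# sample = pps_systematic_sample(frame, 160)
```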

In the second stage, a sample of potential ECDs will be selected from each of the responding institutions and asked to complete a 30-minute questionnaire. An average of 40 prospective respondents will be selected from each of the GSS institutions and FFRDCs. Some large GSS institutions will be sampled more than once given the highly skewed distribution of ECDs across academic institutions, and thus will have a within-institution sample size larger than 40. For the Pilot ECD Survey, we plan to partition those large institutions sampled more than once into component schools (e.g., medical school and graduate school) such that 40 prospective respondents will be selected from each school. Proportional (rather than equal) allocation will be used within the FFRDC stratum to limit the burden on small FFRDCs and to keep the selection probabilities of the potential ECDs within the FFRDCs approximately equal. Once the allocation has been computed, we may wish to set a maximum sample size in each institution to limit survey burden in the largest institutions.
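
The sketch below illustrates one way to implement proportional allocation of a fixed second-stage sample with an optional per-institution cap, as described for the FFRDC stratum. The counts, the cap value, and the rounding rule are illustrative assumptions rather than the actual allocation procedure.

```python
# Illustrative proportional allocation of a fixed second-stage sample across
# units, with largest-remainder rounding and an optional per-unit cap. The
# FFRDC names, counts, and cap value in the example are hypothetical.

def proportional_allocation(sizes, total_n, cap=None):
    """Allocate total_n sample members across units in proportion to size.
    An optional cap truncates large units; the truncated portion is not
    redistributed in this simple sketch."""
    total = sum(sizes.values())
    exact = {u: total_n * s / total for u, s in sizes.items()}
    alloc = {u: int(exact[u]) for u in sizes}               # floor allocation
    leftover = total_n - sum(alloc.values())
    # Give the remaining sample members to the largest fractional remainders.
    for u in sorted(sizes, key=lambda u: exact[u] - alloc[u], reverse=True)[:leftover]:
        alloc[u] += 1
    if cap is not None:
        alloc = {u: min(n, cap) for u, n in alloc.items()}
    return alloc

# Example (hypothetical ECD counts per FFRDC):
# proportional_allocation({"ffrdc_a": 300, "ffrdc_b": 150, "ffrdc_c": 50}, 1600, cap=120)
```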

Because many of the NIH IRPs have fewer than 40 ECDs, and to reduce the burden on the NIH coordinator,¹ we will sample 250 ECDs across all NIH IRPs, allocated proportionate to the number of ECDs listed within each of the intramural research programs. Table 3 shows the expected allocation of the Pilot ECD Survey sample by institutional strata.

Table 3. Sampling Strata and Estimated Sample Sizes for the Pilot ECD Survey

| Institutional strata | Institutions on frame | Estimated ECDs¹ | Estimated postdocs² | Estimated ECDs w/ foreign doctorates³ | Stage 1 sample institutions | Stage 2 sample members |
| --- | --- | --- | --- | --- | --- | --- |
| GSS institutions | 572 | 223,142 | 62,726 | 39,761 | 160 | 6,400 |
|   Institutions with medical schools | 135 | 133,699 | 49,042 | 28,582 | 95 | 3,800 |
|   Highly active research institutions | 113 | 47,932 | 12,910 | 8,046 | 35 | 1,400 |
|   Doctorate-granting institutions | 130 | 19,966 | 688 | 1,744 | 15 | 600 |
|   Master’s-granting institutions | 194 | 21,845 | 87 | 1,390 | 15 | 600 |
| FFRDCs | 40 | 8,600 | 2,294 | 998 | 40 | 1,600 |
| NIH Intramural Research Programs | 25 | 3,840 | 3,650 | 1,920 | 1⁴ | 250 |
| Total | 637 | 235,582 | 68,670 | 42,679 | 201 | 8,250 |

¹ The number of ECDs is not known. The estimated number is derived from the number of postdocs and NFRs reported in the GSS, the number of postdocs reported in the Survey of Postdocs at Federally Funded Research and Development Centers (FFRDC Postdoc Survey), the number of assistant and associate professors reported in IPEDS, and the total number of employees reported by FFRDCs on their websites. The NIH IRP estimate is based on the counts of postdocs reported in the FFRDC Postdoc Survey and discussion with NIH staff regarding the number of non-postdoc ECDs within the NIH IRP. This count was highly correlated (r = 0.965) with the count of known or likely ECDs reported by the 56 institutions that participated in the ECD Project methodological study.

² Average counts of postdocs reported in the 2010–12 GSS and FFRDC Postdoc surveys.

³ The number of ECDs with foreign doctorates is not known. Institution-level estimates are derived as a function of the proportion of postdocs with non-U.S. doctorate degrees reported in the GSS and the proportions of respondents from the ECD Project methodological study reporting non-U.S. doctorates.

⁴ The NIH Office of Intramural Training & Education will provide a list of ECDs for all 25 intramural research programs.


Postdocs will be oversampled among GSS institutions and FFRDCs such that approximately 40% of the sample is postdocs (compared to the estimated 28% in the population). Since postdocs are more likely than other ECDs to have earned their doctorate outside the United States, oversampling postdocs will also increase the number of non-U.S.-degreed ECDs in the sample. We estimate that this oversampling will increase the unequal weighting design effect in the GSS stratum by about 7%, but it will improve coverage of these key subpopulations and enhance comparability between postdocs and non-postdocs and between U.S.-degreed and non-U.S.-degreed ECDs.
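
A rough check of the unequal weighting impact, using Kish's 1 + CV² factor for two weighting classes (postdocs and non-postdocs) at the population and sample shares given above, yields a factor of about 1.06. This is broadly consistent with the roughly 7% increase cited; the small difference presumably reflects the exact rates and strata used in the actual design. The calculation below is illustrative only.

```python
# Illustrative check of the unequal-weighting design effect implied by
# oversampling postdocs to ~40% of the sample when they are ~28% of the
# population, using Kish's factor n * sum(w_i^2) / (sum(w_i))^2 applied to
# two weighting classes. Shares are taken from the paragraph above.

def uwe_design_effect(pop_shares, sample_shares):
    """Kish unequal-weighting factor for weighting classes with the given
    population and sample proportions (relative weights = pop / sample)."""
    weights = [p / s for p, s in zip(pop_shares, sample_shares)]
    mean_w = sum(s * w for s, w in zip(sample_shares, weights))
    mean_w2 = sum(s * w * w for s, w in zip(sample_shares, weights))
    return mean_w2 / mean_w ** 2

deff_w = uwe_design_effect(pop_shares=[0.28, 0.72], sample_shares=[0.40, 0.60])
print(f"Unequal weighting design effect: {deff_w:.2f}")   # about 1.06
```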

Expected eligibility rates, design effect, and expected precision of final count estimates

In the 2012 ECD Project methodological study (hereafter called the methodological study), 39 of the 56 participating institutions were able to positively identify ECDs working at their institution using doctorate degree and year of doctorate indicators in their administrative databases. The remaining 17 institutions were unable to positively identify all ECDs working in their institution because they lacked one or both of these indicators in their administrative databases. Instead, these institutions worked with study staff to review the job titles used in their institution, identify those that might include ECDs, and provide listings of all individuals within the selected job titles. Study staff then reviewed these lists and, based on job title, age, and other information provided by the institution, assigned all individuals to one of three sampling strata: (1) likely ECDs, (2) possible ECDs, and (3) unlikely ECDs. For the methodological study, we sampled as evenly as possible across these strata within each institution. For example, in an institution with many potential ECDs, 14 would have been selected from the ‘likely’ stratum and 13 each from the ‘possible’ and ‘unlikely’ strata. As shown in Table 4, variation within the lists resulted in more ‘likely ECDs’ being sampled than ‘possible ECDs’ and more ‘possible ECDs’ being sampled than ‘unlikely ECDs.’


Table 4. Institutions Providing ECD Listings, Sample Members, and Response and Eligibility Rates Achieved in the ECD Methodological Study, by ECD Sampling Strata

| Can ID ECDs? | Yes | No | Total |
| --- | --- | --- | --- |
| Institutions | 39 | 17 | 56 |
|   GSS | 23 | 17 | 40 |
|   FFRDC | 15 | 0 | 15 |
|   NIH IRP | 1 |  | 1 |
| Sample members | 1,437 | 678 | 2,115 |
|   GSS | 869 | 678 | 1,547 |
|   FFRDC | 488 | 0 | 488 |
|   NIH IRP | 80 |  | 80 |

| Individuals, by ECD sampling stratum | Known | Likely | Possible | Unlikely | Total |
| --- | --- | --- | --- | --- | --- |
| % of sample by ECD status | 100% | 40.1% | 32.0% | 27.9% |  |
| Sampled | 1,437 | 272 | 217 | 189 | 2,115 |
| Respondents | 1,066 | 194 | 137 | 96 | 1,493 |
| Response rate* | 74.2% | 71.3% | 63.1% | 50.8% | 70.6% |
| Confirmed ECDs | 1,043 | 179 | 99 | 18 | 1,339 |
| Stratum eligibility rate: respondents | 97.8% | 92.3% | 72.3% | 18.8% | 89.7% |
| Stratum eligibility rate: overall sample | 72.6% | 65.8% | 45.6% | 9.5% | 63.3% |

* AAPOR RR2, based on the number of respondents that provided sufficient information to determine whether they were an ECD.

Table 4 also shows that response and eligibility rates varied substantially by ECD sampling stratum. Of the sample members from institutions that could identify their ECDs (i.e., the “Known” stratum), 74.2% responded to our invitation and 97.8% of the respondents met the ECD criteria. In comparison, only 50.8% of the individuals sampled in the ‘unlikely ECD’ stratum responded to our invitation, and only 18.8% of these respondents met the ECD criteria. The ‘likely ECD’ stratum was similar to the ‘known ECD’ stratum, and the ‘possible ECD’ stratum fell in between the ‘likely ECD’ and ‘unlikely ECD’ strata. Overall, 70.6% of the methodological study sample responded with enough information to verify their ECD status, yielding an overall sample eligibility rate of 63.3%.


Table 5 shows the sample design for the second stage of data collection for the Pilot ECD Survey. In order to maximize the projected precision of the estimated total number of ECDs for the budgeted sample size, we will sample individuals from institutions that cannot positively identify all ECDs from their administrative data as follows: 70% from the ‘likely ECD’ stratum, 25% from the ‘possible ECD’ stratum, and 5% from the ‘unlikely ECD’ stratum. Assuming that one third (33%) of GSS institutions and 5% of FFRDCs will not be able to positively identify their ECDs, and further assuming that the achieved eligibility rates will be similar to, but slightly lower than, those in the methodological study, this sampling design yields an expected overall eligibility rate of 92.0%.
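
The short calculation below reproduces the expected overall eligibility rate from the planned stratum sample sizes and the assumed stratum eligibility rates shown in Table 5 below; it is a check of the arithmetic, not part of the estimation procedure.

```python
# Check of the expected overall eligibility rate implied by the planned
# stratum sample sizes and assumed stratum eligibility rates (Table 5).
strata = {               # stratum: (sample size, expected eligibility rate)
    "known":    (6010, 0.96),
    "likely":   (1568, 0.92),
    "possible": (560,  0.65),
    "unlikely": (112,  0.10),
}

expected_ecds = sum(n * rate for n, rate in strata.values())
total_sample = sum(n for n, _ in strata.values())
print(f"Expected ECDs: {expected_ecds:,.0f}")   # ~7,587 (Table 5 rounds each stratum, giving 7,588)
print(f"Expected overall eligibility rate: {expected_ecds / total_sample:.1%}")   # 92.0%
```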


Table 5. Sample Sizes and Eligibility Rates Projected for the Pilot ECD Survey

| Can ID ECDs? | Yes | No | Total |
| --- | --- | --- | --- |
| Institutions | 145 | 56 | 201 |
|   GSS | 106 | 54 | 160 |
|   FFRDC | 38 | 2 | 40 |
|   NIH IRP | 1 |  | 1 |
| Sample members | 6,010 | 2,240 | 8,250 |
|   GSS | 4,240 | 2,160 | 6,400 |
|   FFRDC | 1,520 | 80 | 1,600 |
|   NIH IRP | 250 |  | 250 |

| Individuals, by ECD sampling stratum | Known | Likely | Possible | Unlikely | Total |
| --- | --- | --- | --- | --- | --- |
| % of sample by ECD status | 100% | 70.0% | 25.0% | 5.0% |  |
| Sample size | 6,010 | 1,568 | 560 | 112 | 8,250 |
| Expected eligibility rate | 96% | 92% | 65% | 10% |  |
| Expected ECDs | 5,770 | 1,443 | 364 | 11 | 7,588 |
| Expected overall eligibility rate |  |  |  |  | 92.0% |


Data collection will continue through the end of the data collection period (January 31, 2015). Response rates will be calculated using the American Association for Public Opinion Research (AAPOR) standard definitions and OMB-recommended guidelines. Assuming a within-institution correlation of 0.01 and an overall design effect of 1.94, the 95% confidence interval for the estimated total number of ECDs is expected to be within plus or minus 0.80% of the estimated total (i.e., CI = +/-0.80% at alpha = 0.05).
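
As an illustration of how the stated design effect might decompose, the sketch below combines a standard clustering component, 1 + (m - 1) * rho, with the weighting component implied by the overall value of 1.94, and derives the effective sample size. The decomposition is an assumption made for exposition; the +/-0.80% confidence interval quoted above rests on the full two-stage variance model, which this sketch does not attempt to reproduce.

```python
# Illustrative decomposition of the overall design effect into a clustering
# component and an implied weighting/other component, plus the effective
# sample size. The split shown here is an assumption for exposition only.

n_sample = 8250          # planned second-stage sample
n_clusters = 201         # stage-1 institutions
icc = 0.01               # assumed within-institution correlation
deff_overall = 1.94      # overall design effect stated above

avg_cluster_size = n_sample / n_clusters
deff_cluster = 1 + (avg_cluster_size - 1) * icc     # clustering component
deff_weights = deff_overall / deff_cluster          # implied weighting/other component
n_effective = n_sample / deff_overall               # effective sample size

print(f"Average cluster size:        {avg_cluster_size:.1f}")   # ~41
print(f"Clustering design effect:    {deff_cluster:.2f}")        # ~1.40
print(f"Implied weighting component: {deff_weights:.2f}")        # ~1.39
print(f"Effective sample size:       {n_effective:,.0f}")        # ~4,253
```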

B.2 Description of Survey Methodology and Statistical Procedures

As noted above, the Pilot ECD Survey has a two-stage sample design, covering institutions and individual ECDs, and correspondingly a two-stage data collection effort. The first stage of data collection involves contacting institutions to elicit their cooperation in compiling a list of all ECDs working at the institution. This process begins with the contractor identifying potential List Coordinators (LCs) based on GSS experience. With this information, a package seeking institutional participation will be sent to the HA at each institution: the university president at GSS institutions, the center director at FFRDCs, and, for the NIH IRP, the Director of the Office of Postdoctoral Services (OPS) in the Office of Intramural Training and Education (OITE). A follow-up call will then be made to HAs to clarify why participation is important and to explain further what it will mean for the institution.

Once cooperation has been established with the HA, the LC is contacted to begin the process of compiling the list of ECDs. Communications between the LC and an institutional liaison (IL) are ongoing while the list is being compiled. Initial communications are made to establish a relationship and elicit cooperation, and follow-up communications are made to clarify the task requirements and to prompt late-responding LCs. Once the requested list of ECDs and accompanying data are provided to the contractor, the lists are reviewed for completeness and compared to the number of ECDs expected based on institutional responses to the GSS, FFRDC Postdoc Survey, and IPEDS data collections, as well as institutional websites. Project staff will follow up with LCs until the list data are deemed final; the lists are then prepared for sampling, and the second stage of data collection begins.

The second stage of data collection begins with a pre-notification sent by either the institutional HA or NSF. For all institutions where HAs agree, pre-notification will be sent by the HA. However, it is known from the methodological study that not all HAs will agree to this request; in these cases, HAs will be asked to provide a letter of support to be sent with a pre-notification letter from NSF. An invitation e-mail will be sent within three days of pre-notification. Subsequent contacts are contingent upon completion status; follow-up contacts will cease once a survey is completed, once it is determined from either the survey questions or communications from the prospective respondent that the person is ineligible, or once the ECD provides a firm refusal to participate.

Following the Total Design Method (TDM) approach, nonrespondents will receive a number of follow-up communications. In the order of their use, these include: two reminder e-mails, a reminder call, a third reminder e-mail, a mailed reminder letter with an accompanying brochure and letter of support if one was provided by the institutional HA, a call by a computer-assisted telephone interviewer (CATI), and a final e-mail reminder. A final communication in the form of a thank you e-mail will be sent to all survey respondents.

A more detailed description of the contact strategies for both stages of data collection is provided in Attachment A. Copies of the contact materials can be found in Attachment B (institutional contacts) and Attachment C (ECD contacts).

B.2.1 Imputation for Item Nonresponse in the Pilot ECD Survey

Imputation is planned for the following five key variables: gender, citizenship, country of degree origin, field of study, and salary. Most of these are needed to establish national estimates of the total ECD population and important subpopulations. Items will be imputed in order from least to greatest percentage of missing data using a weighted hot-deck procedure (Cox, 1980) as implemented in the SUDAAN IMPUTE procedure. An imputation flag or indicator variable will be created and placed on the data file for each variable that is imputed.
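
For illustration only, the sketch below shows the basic logic of a weighted hot-deck: within imputation classes, donor values are drawn with probability proportional to the donors' survey weights, and an imputation flag is set. This is a simplified stand-in, not the SUDAAN IMPUTE procedure, and the class variables, field names, and seed are assumptions. In production, class definitions and donor-use limits would follow the specifications implemented in SUDAAN.

```python
# Simplified illustration of weighted hot-deck imputation within imputation
# classes. This is not the SUDAAN IMPUTE procedure, only a sketch of the idea:
# donors are drawn with probability proportional to their survey weights.
import random
from collections import defaultdict

def weighted_hot_deck(records, target, class_vars, weight="weight", seed=2014):
    """Impute missing values of `target` by drawing a donor value within each
    imputation class, with selection probability proportional to the donor's
    survey weight. Adds a `<target>_imputed` flag (0/1) to every record."""
    rng = random.Random(seed)
    donors = defaultdict(lambda: ([], []))          # class key -> (values, weights)
    for r in records:
        if r.get(target) is not None:
            key = tuple(r[v] for v in class_vars)
            donors[key][0].append(r[target])
            donors[key][1].append(r[weight])
    flag = target + "_imputed"
    for r in records:
        r[flag] = 0
        if r.get(target) is None:
            values, weights = donors[tuple(r[v] for v in class_vars)]
            if values:                              # leave as missing if no donor exists
                r[target] = rng.choices(values, weights=weights, k=1)[0]
                r[flag] = 1
    return records

# Example (hypothetical fields): impute salary within gender-by-field classes.
# weighted_hot_deck(respondents, target="salary", class_vars=["gender", "field"])
```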

B.3 Methods Used To Maximize Response Rate

A 71.8% response rate (AAPOR RR2) was achieved at the second stage of data collection (i.e., among sample members) for the methodological study. For this response rate calculation, a partial interview was defined as any sample member who logged into the survey or contacted NSF outside of the survey as a result of the survey invitation. This definition was used because one of the objectives of the methodological pilot was to determine the feasibility of reaching ECDs: logging in to the survey or responding directly to the survey invitation provided evidence that contacting the sample member was possible. However, since the primary objective for the Pilot ECD Survey is to produce national estimates of the number of ECDs in academic institutions, FFRDCs, and the NIH IRP, a partial interview will need to include responses to all questions required to determine ECD status. Using this stricter definition of a partial interview, a 70.6% response rate was achieved in the methodological study (as shown in Table 4 above).

Given the objective of producing national estimates and the more constrained definition of a partial interview, a number of steps will be used in the Pilot ECD Survey design to maximize response rates. These include the steps used during the methodological study as well as new steps aimed at further maximizing response.

Institutional stage

  • A request for participation sent to institution HAs in a FedEx package that includes a letter and brochure detailing what participation will require, along with a survey participation form.

  • An introductory call and information packet sent to the LC to confirm the appropriateness of their nomination and to elicit cooperation.

  • Follow-up e-mails and phone calls to LCs to offer help and seek updates on list compilation progress.

ECD stage

  • Pre-notification from the HA. For GSS institutions and FFRDCs, the type of pre-notification depends on the level of participation to which the HA agrees (i.e., the protocol group an institution is in). Pre-notification can include an e-mail sent directly by the HA or a letter of support that the HA provides to the Pilot ECD Survey contractor, which will be enclosed with a pre-notification letter from NSF.

  • Reminder e-mails to nonrespondents describing the importance of survey participation. All e-mails will include a hyperlink with embedded and secure login credentials making it easy for the prospective respondent to access the web survey.

  • A reminder phone call to nonrespondents aimed at ensuring that the ECD has received previous e-mail communications and at determining whether the prospective respondent has encountered any problems preventing them from completing the web survey.

  • A reminder mailing to nonrespondents. Two versions of the reminder letter will be used. The version sent to each ECD will be contingent upon whether pre-notification was sent by the HA or NSF. When HAs have provided them, a second letter of support will be included in the reminder mailing.

  • Telephone interview calls. A second phone call will be made to nonrespondents during which the telephone interviewer will seek to elicit immediate participation via a telephone interview.

With a few minor modifications, the methods to maximize response at the institutional level are the same as or similar to those used in the methodological study. An overview/next steps e-mail was added to the institutional contact protocol (see Attachment B), as this summary helped LCs in the methodological study understand what was needed of them and sped up response from institutions. An important change to the contacts in the ECD-level data collection is the addition of mail and phone follow-ups (see Attachment C). The inclusion of these contacts follows the TDM approach, which empirical research indicates significantly improves response rates.

B.4 Testing of Procedures

The methodological study had two primary objectives: to create a methodology for building an ECD sampling frame from the databases within the sampled institutions and to test different strategies for contacting and recruiting sample members to complete a 30-minute questionnaire. A secondary objective of the methodological study was to field test the survey questionnaire with ECDs. This section reviews those methodological tests.

B.4.1 Test of Building Sampling Frame

The first step in the methodological study was to determine the feasibility of building a sampling frame of ECDs. The methodology for this step was modeled on O*Net, an establishment study for which the contractor has achieved high response rates over a period of ten years (e.g., establishment response rate of 76.1% and employee response rate of 65.0%). Protocols were designed to maximize institutional participation while minimizing the burden on institutional respondents. To achieve these two goals, the methodology leveraged contacts developed through the NSF-NIH Survey of Graduate Students and Postdoctorates in Science and Engineering (GSS). When possible, the contractor identified a contact at institutions based on experience in the GSS and recommended that person to the HA as the person to perform the tasks necessary to compile a list of ECDs at the institution (in the methodological study this person was referred to as the point of contact [POC]). Follow-ups with the HA were made when necessary to seek institutional participation. Once a relationship was established with the POC, follow-up contacts were made to both assist the POC with any questions about the task and, when needed, prompt the POC about task completion.

Of the 81 institutions in the methodological study, 74 (or 91.4%) agreed to participate in the study. Due to time limitations within the methodological study (institutions were asked to provide their listings within 2 months), only 56 of the 81 (69.1%) sampled institutions were able to compile and provide NSF with a list of ECDs at the institution. It is expected that in the Pilot ECD Survey a higher institutional participation rate will be achieved due to the extended data collection period.

As shown in Table 4 above, 39 of the 56 institutions that provided ECD listings were able to positively identify the ECDs working in their institutions based on type and year of highest degree information maintained within their administrative databases, and 1,437 of the 2,115 sample members were sampled from these known ECD lists. For the 17 institutions that could not identify ECDs based on their administrative data, NSF requested listings for job titles that might include ECDs, and the contractor estimated the likelihood that each listed person was an ECD based on available data. Of the 678 individuals sampled from these lists, 272 (40.1%) were sampled as likely ECDs, 217 (32.0%) as possible ECDs, and 189 (27.9%) as unlikely ECDs. The eligibility rate of respondents by ECD sampling stratum (known, likely, possible, and unlikely) shows that institutions that had type and year of highest degree on their data files were able to accurately identify the ECDs within their institutions: 97.8% of respondents within these institutions were ECDs, with ineligibility typically due to incorrect year of degree information on the institution’s files.

B.4.2 Test of Contacting ECDs

The second step in the methodological study was to contact ECDs to seek survey participation. The methodology for contacting ECDs was, in part, modeled on TDM. An experiment was included in the methodological study to determine the effectiveness of pre-notification and final appeal communications coming from an institutional HA versus NSF; these were the two original experimental groups in the study. However, during data collection it was also determined that some institutions had policies or preferences that were inconsistent with the study design. One policy problematic for the design was that three institutions would not share employee information without notifying the employee first. In response to this requirement, an opt-out/opt-in treatment group was created for the study. For two institutions, an institutional authority sent an e-mail to sample members notifying them of their selection and asking them to reply by a specified date if they did not want to participate; if the sample member did not opt out, the institution would provide the contact information. One institution sent an opt-in version of the HA pre-notification e-mail that asked those interested in participating to contact the survey staff directly or to let the HA (or POC) know so that the institution could provide NSF with their contact information. Design adjustment was also required for the HA experimental group when institutions had policies forbidding HAs from sending e-mails soliciting survey participation or when HAs did not respond to NSF’s request to send the pre-notification e-mail.

After permission and lists were provided to the contractor, prospective respondents were sent a pre-notification from an HA, another institutional figure, or NSF. Subsequently, they were sent an invitation, three reminder e-mails, and a final appeal communication (an e-mail from the HA or NSF). Following the TDM approach, each follow-up contact used a slightly different plea for participation and an increased sense of urgency. As noted above, given the objective of determining the ability to reach ECDs, a partial complete was defined as any access to the web survey, and the response rate reported is AAPOR RR2. This definition allowed for an assessment of the success of contacting ECDs.

In the methodological study, the HA treatment resulted in a significantly higher response rate than the NSF treatment: 74.8% of sample members notified by an HA responded, compared to 68.5% of those notified by NSF. The opt-out/opt-in treatments elicited the lowest response rate, at 57.5%. As a result, all institutions in the Pilot ECD Survey will start in the HA treatment group and will be moved to other treatment groups only when dictated by the institution’s response in stage 1.

Experimental Designs for the Pilot ECD Survey

Two experiments are proposed for the Pilot ECD Survey. The first experiment will assess the effectiveness of different subject lines for the e-mail contacts. Empirical evidence suggests that e-mail subject lines can affect response rates (see Couper, 2008, for a summary). We will experimentally vary the subject lines, increasing the sense of urgency they convey and the prominence of the survey sponsor. As shown in Table 6, a control condition and two treatment conditions are specified for this experiment, which varies the detail of the subject line as well as the prominence of the sponsor. Experimental Groups #1 and #2 provide greater detail and an increasing sense of urgency compared to the control group, and Experimental Group #2 emphasizes the National Science Foundation as the sponsor and deemphasizes the survey title. We hypothesize that sample members in both experimental groups will respond at a higher rate than those in the control group, and that those in Experimental Group #2 will respond at the highest rate.


Table 6. E-mail Subject Line Experimental and Control Conditions

| E-mail contact | Control | Experimental Group #1 | Experimental Group #2 |
| --- | --- | --- | --- |
| Login credentials e-mail | NSF Early Career Doctorates Survey | NSF Early Career Doctorates Survey – Your Login Credentials | National Science Foundation ECD Survey – Your Login Credentials |
| Reminder e-mail #1 | NSF Early Career Doctorates Survey | NSF Early Career Doctorates Survey – Reminder | National Science Foundation ECD Survey – Reminder |
| Reminder e-mail #2 | NSF Early Career Doctorates Survey | NSF Early Career Doctorates Survey – Your Help Needed | National Science Foundation ECD Survey – Your Help Needed |
| Reminder e-mail #3 | NSF Early Career Doctorates Survey | NSF Early Career Doctorates Survey – Please Respond | National Science Foundation ECD Survey – Please Respond |


To minimize institutional effects, individuals will be sampled within institutions and randomly assigned to one of the three treatment groups. Sample members will be assigned as evenly as possible across treatment groups, yielding 2,750 sample members per group. Assuming a within-institution correlation of 0.01, a difference in the ECD response rate of 5% can be detected with 80% power at alpha = 0.05 when the response rate is 60% in one of the three experimental groups. Response rates for individuals in this experiment will be assessed one week after the third e-mail is sent.


The second experiment will start after the initial experiment ends and will test the hypothesis that a high-priority mailing will improve response rates more than a regular first-class mailing. In the proposed ECD contact protocol, the seventh contact (and fifth reminder) is a mailing to all non-responding ECDs. This mailing will include an appeal from NSF, a survey brochure, and a letter of support from an authority figure in the sample member’s institution (e.g., the university president or chancellor). Based on Dillman’s Total Design Method (2007), letters sent via high-priority mail differ in the packaging, mode of delivery, and speed with which they are delivered. This makes the letter more noticeable to the recipient before they open the package and is expected to lend greater importance and legitimacy to the survey request. However, in the context of this survey, the change in mode of delivery alone may be sufficient to gain the sample member’s attention. If regular mail results in response rates comparable to the special delivery method, substantial savings could be gained in future waves of the survey.


Based on the completion rates within the methodological study, we assume 60% of sample members will have completed their survey by the third e-mail reminder, leaving 40%, or 3,300 sample members, with no response or an incomplete response. With two equal groups of 1,650, and assuming a within-institution correlation of 0.01, a difference in the ECD response rate of 4% can be detected with 80% power at alpha = 0.05 when the response rate is 10% in one of the two experimental groups. The response rates for individuals in the second experiment will be assessed at two points in time: one week following the receipt of the mailing and at the end of data collection.
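
As a rough check on the two power statements above, the sketch below applies a conventional normal-approximation test of two proportions with a simple design-effect adjustment for clustering, 1 + (m - 1) * icc. The per-institution cluster sizes per treatment group are assumptions (roughly 40/3 for the subject-line experiment and 3,300/201/2 for the mailing experiment), and the planning calculations may have used a different variance model; under these assumptions both designs come out above the stated 80% power.

```python
# Hedged check of the two power statements above using a normal-approximation
# two-proportion test with a cluster design-effect adjustment. Cluster sizes
# per treatment group are rough assumptions, not figures from the design.
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def two_prop_power(p1, p2, n_per_group, icc, cluster_size):
    """Approximate power for a two-sided two-proportion test at alpha = 0.05,
    deflating each arm's sample size by a cluster design effect."""
    deff = 1 + (cluster_size - 1) * icc
    n_eff = n_per_group / deff
    se = math.sqrt(p1 * (1 - p1) / n_eff + p2 * (1 - p2) / n_eff)
    z_alpha = 1.96                      # two-sided alpha = 0.05
    return norm_cdf(abs(p1 - p2) / se - z_alpha)

# Experiment 1: 2,750 per group, 60% vs. 65% response, ~13 sample members per
# institution per group (40 per institution split across 3 groups, an assumption).
print(f"Subject-line experiment, power to detect 60% vs. 65%: "
      f"{two_prop_power(0.60, 0.65, 2750, 0.01, 40 / 3):.0%}")

# Experiment 2: 1,650 per group, 10% vs. 14% response, ~8 nonrespondents per
# institution per group (3,300 across 201 institutions and 2 groups, an assumption).
print(f"Priority-mail experiment, power to detect 10% vs. 14%: "
      f"{two_prop_power(0.10, 0.14, 1650, 0.01, 3300 / 201 / 2):.0%}")
```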



B.4.3 Field Test of the Questionnaire

A secondary objective of the methodological study was to field test the questionnaire. Overall, the field test met burden expectations. Prior to data collection, it was estimated that survey completion time would be approximately 30 minutes. The average completion time for the methodological study was 36 minutes: 33 minutes for those who completed the survey in one session and 40 minutes for those who completed it in multiple sessions.

B.4.4 Changes to the ECD Questionnaire

No further changes to the ECD questionnaire are anticipated. The questionnaire (Attachment E) was revised following the methodological study to omit or improve problematic items and to include a few items from the Organisation for Economic Co-operation and Development (OECD) Careers of Doctorate Holders (CDH) survey.

B.5 Names and Telephone Numbers of Individuals Consulted

The individuals consulted on Pilot ECD Survey technical and statistical issues are listed in Table 7.


Table 7. Individuals Consulted on Pilot ECD Survey Technical and Statistical Issues

| Name | Affiliation | Telephone number |
| --- | --- | --- |
| Ms. Kelly Phou, ECD Survey Manager | National Science Foundation, NCSES, Arlington, VA | 703-292-7422 |
| Ms. Emilda Rivers, HRS Program Director | National Science Foundation, NCSES, Arlington, VA | 703-292-7773 |
| Mr. John R. Gawalt, Director, NCSES | National Science Foundation, NCSES, Arlington, VA | 703-292-7776 |
| Ms. Jeri Mulrow, Deputy Director, NCSES | National Science Foundation, NCSES, Arlington, VA | 703-292-4784 |
| Dr. Stephen Cohen, Chief Statistician | National Science Foundation, NCSES, Arlington, VA | 703-292-7769 |
| Dr. Wan-Ying Chang, Mathematical Statistician | National Science Foundation, NCSES, Arlington, VA | 703-292-2310 |
| Mr. Darius Singpurwalla, Mathematical Statistician | National Science Foundation, NCSES, Arlington, VA | 703-292-7793 |
| Ms. Rebecca Morrison, Survey Methodologist | National Science Foundation, NCSES, Arlington, VA | 703-292-7794 |
| Ms. Jennifer Sutton, Research Training Coordinator | National Institutes of Health, Bethesda, MD | 301-435-2686 |
| Dr. Lori Conlan, Director, Career Services Center | National Institutes of Health, Bethesda, MD | 301-435-7231 |
| Ms. Cathee Johnson Phillips, Executive Director | National Postdoctoral Association, Washington, DC | 202-326-6427 |
| Dr. Jodi Yellin, Director, Science Policy | American Association of Medical Colleges, Washington, DC | 202-828-0485 |
| Dr. Irena Tartatkovsky, Senior Science Policy Analyst | American Association of Medical Colleges, Washington, DC | 202-862-6134 |
| Ms. Roxanne Murray, Director, HR & Administration | Association of American Universities, Washington, DC | 202-408-7500 |
| Mr. Peter Einaudi, Project Director | RTI International, Research Triangle Park, NC | 919-541-8765 |
| Dr. Paul Biemer, Senior Survey Methodologist | RTI International, Research Triangle Park, NC | 919-541-6056 |
| Ms. Laura Burns Fritch, Survey Methodologist | RTI International, Research Triangle Park, NC | 919-990-8318 |
| Mr. Brian Head, Survey Methodologist | RTI International, Research Triangle Park, NC | 919-485-5511 |
| Dr. Sara Wheeless, Mathematical Statistical Task Leader | RTI International, Research Triangle Park, NC | 919-541-5891 |
| Dr. Patricia Green, Senior Advisor | RTI International, Chicago, IL | 312-456-5260 |
| Mr. Bob Steele, Systems Development Task Leader | RTI International, Research Triangle Park, NC | 919-316-3836 |




¹ The Director of the Office of Postdoctoral Services (OPS) at the NIH Office of Intramural Training and Education (OITE) will provide NSF with a listing of all ECDs in the NIH IRP and send initial notifications to the ECDs selected to participate in the survey.
