Report on 2008 Generic Clearance Activities


NCSES-Generic Clearance of Survey Improvement Projects for the National Center for Science and Engineering Statistics


OMB: 3145-0174



May 18, 2009



Ms. Shelly Wilkie Martinez

Office of Management and Budget

New Executive Office Building

Room 10201

Washington, DC 20503



Dear Shelly:


In December 2006, the National Science Foundation’s Division of Science Resources Statistics (SRS) generic clearance for survey improvement projects (OMB Number 3145-0174) was extended for three years, with the stipulation that SRS submit annual reports describing the work conducted under the clearance. Between January 1, 2008 and December 31, 2008, SRS conducted 12 studies under projects approved by OMB for this generic clearance. The projects covered four current surveys as well as exploratory work for the Postdoc Data Project and Microbusiness R&D. The report that follows describes the work conducted for these studies during calendar year 2008.


The methods employed in these generic clearance activities included face-to-face interviews, focus groups, web surveys, email surveys, and telephone interviews. The interviews for the projects used cognitive testing methods to elicit unbiased information about how respondents perceive items such as definitions, instructions, questions, and response categories.


The number of respondent burden hours used during 2008 in the 12 studies described below totaled 1,009 hours. A breakdown by method shows the following: 455 hours for face-to-face or phone interviews, 245 hours for web surveys, 242 hours for two email surveys, and 63 hours for focus group participation. These hours cover the 2,052 individuals who participated in these studies. For some studies, the amount of time was as little as four minutes and for others as much as five hours. Typical participation for these individuals ranged from 15 minutes for the web surveys to one hour for face-to-face interviews during cognitive testing.


Brief descriptions of the generic clearance activities, by survey or initiative, along with their findings follow.



Academic R&D Survey


The Academic R&D Survey is an annual survey of U.S. colleges and universities that spent at least $150,000 on R&D in science and engineering. A redesign of the survey is currently in progress with Westat as the SRS contractor for the project. SRS conducted four projects during 2008 under the SRS generic clearance in support of the redesign project.


  1. Site visits to explore issues for survey redesign


SRS and Westat staff conducted site visits at 16 institutions to discuss topics for the survey redesign. The site visits used two sets of questions tailored to the officials invited. Officials from the Office of the Vice President for Research, the sponsored programs office, and institutional research were asked about topics including the structure of the institution’s research enterprise and their use of NSF and other data for tracking academic R&D. Officials from accounting and budget offices (typically including the current respondent for the Academic R&D survey) were asked about recordkeeping issues and the institution’s ability to collect the types of data suggested by participants in the data users workshop and the expert panel for the redesign effort. The findings from these visits were used to draft a survey instrument.


The 16 site visits averaged five hours in duration. A total of 48 university employees attended the interviews, for 240 burden hours.


  2. Email survey exploring the need to add new fields


In July 2008, SRS and Westat conducted a brief email survey of a subset of Academic R&D survey respondents. The purpose of the study was to collect details on emerging and interdisciplinary research fields that might be considered for addition to the current survey taxonomy. Respondents with the highest frequencies for reporting “other” categories in response to the current survey were asked to provide a breakdown of the disciplines they included in these categories. The survey was tested by sending the questionnaire via email to eight respondents and contacting them by telephone to discuss their impressions of the task and the wording of the questions. The final email survey of 185 respondents obtained a 77 percent response rate, resulting in responses from 143 respondents. SRS considered the results of this study in deciding whether to redesign categories now or wait for the results of ongoing SRS taxonomy work. The study findings will also be incorporated into SRS-wide taxonomy revisions.


SRS estimates that the 143 respondents spent no more than 60 minutes each to provide a response. Hence, the total burden on respondents was 148 hours for the email survey (5 hours for pretesting with 8 respondents and 143 hours for the survey).


  3. Community college R&D survey


SRS and Westat staff conducted pretesting of a brief screening survey of U.S. community colleges to measure R&D performed at these institutions. The original plans included a survey of 349 community colleges. During pretesting activities, however, multiple strategies failed to locate any institutions with activities that fit the survey definition of R&D. Subsequently, SRS dropped the full implementation of the screening survey. As a result of this work, SRS decided not to expand the Academic R&D survey population to include community colleges.


SRS visited two institutions and conducted interviews lasting about two hours at each location. During these interviews, the survey concept and definitions were discussed. Seven community college officials attended the two interviews. In addition, four pretesting interviews were conducted by phone to test the survey instrument. Each pretest interview included one college official and averaged one hour in duration.


The study used 14 burden hours for the two interviews (two hours times seven people) and four burden hours for pretesting four respondents by phone (one hour each), for a total of 18 burden hours.


  4. Cognitive testing of redesigned questionnaire


In December 2008, SRS and Westat staff conducted the first round of cognitive testing for the revised questionnaire during visits to four institutions. Those interviewed included the current survey respondent as well as representatives from applicable offices that help provide survey data such as the accounting or budget office, sponsored programs office, and institutional research office. An additional 12 site visits were conducted in January and February of 2009. The findings from all 16 interviews will be reported in next year’s generic clearance report.


An estimated 15 individuals attended the four site visit interviews during calendar year 2008. The cognitive interviews averaged 4 hours in duration, for an estimated 60 burden hours.



National Survey of Recent College Graduates (NSRCG)


The NSRCG, conducted every two to three years, is one of three surveys used by SRS to measure the U.S. science and engineering workforce.


In 2008, SRS completed the third and final round of 14 cognitive interviews on issues surrounding survey questions on the community college experience. The interviews were conducted by the staff of Washington State University’s Social and Economic Sciences Research Center. The goal of the study was to determine whether respondents understood the questions and whether the questions measured concepts as intended. Three types of questions were investigated. First, the interviews asked about questions previously included on all three surveys, such as attending community college and obtaining an associate’s degree. Second, the interviews looked at past questions on reasons for attending community college. Third, the interviews covered new questions on community college attendance. The three rounds of cognitive interviews helped identify respondents’ issues with these questions. Using results from all three rounds of interviews (20 interviews were conducted in 2007), SRS fine-tuned question stems and response category options for the 2008 NSRCG.


Fourteen interviews lasting one hour each were conducted during the 2008 round of cognitive interviews, for a total of 14 burden hours. 



Survey of Earned Doctorates (SED)


The SED collects data annually on all individuals receiving research doctoral degrees (e.g., Ph.D. rather than M.D.) from accredited U.S. institutions. SRS contracts with NORC to manage the survey operations. SRS conducted two projects in 2008 to improve the SED under the generic clearance.


  1. Cognitive interviews for questionnaire revisions


SRS and NORC conducted 20 cognitive interviews with recent respondents to explore several issues for the 2010 SED questionnaire. The interview protocol covered the entire SED survey questionnaire, with some items targeted for additional probing. These included the consent wording on the front cover of the survey, the terms “interdisciplinary” and “tuition remission,” postgraduation salary questions, time-to-degree questions, and the presence of bar codes. Results of the interviews led to several adjustments for the 2010 survey form.


Twenty cognitive interviews were conducted in calendar year 2008. The interviews lasted approximately 90 minutes each, for 30 burden hours.


  2. Web survey


SRS collaborated with NORC to develop a brief web survey of user data needs. Three types of data users comprised the sample for the web survey. First, SRS included 543 deans of institutional units that administer the SED. Second, SRS included 31 data users who requested race/ethnicity/gender (REG) tables annually. Third, SRS included 297 data users who downloaded the 2006 REG tables via the SRS website between June and September 2008. The survey asked about aspects of their use of the SED data, including frequency of use; federal, state, or institutional reporting requirements for such information; the types of data aggregation they perform; their preferences for aggregations; and their use of SED reports. Of the 871 individuals invited to participate in the web survey, 373 (43 percent) responded. SRS used these results to redesign selected data tables in the interagency Summary Report and the REG tables.


It took approximately 15 minutes for each respondent to complete the web survey. A total of 373 data users responded, for a burden of 94 hours.



Survey of Graduate Students and Postdoctorates in Science and Engineering (GSS)


The GSS is a survey of U.S. colleges and universities that have at least one graduate student or one postdoc in science, engineering, and health fields. The survey requests aggregated information by school and department for graduate students, postdocs, and nonfaculty researchers. SRS currently contracts with RTI for both survey operations and redesign activities. In 2008, SRS conducted two projects under the generic clearance to redesign the survey instruments for the 2008 GSS survey cycle.


1. Cognitive interviews


RTI conducted three rounds of cognitive interviews with 24 GSS respondents to test the 2007 web survey. RTI methodologists asked participants to complete the survey in the same manner that they did for the 2007 GSS data collection. The methodologists observed their actions, noting which features were used and when errors occurred. The methodologists also asked scripted and impromptu probes to elicit more information from the participants about their experience with the survey. Findings from the cognitive interviews were used to design the 2008 web survey.


The cognitive interviews used 1.5 burden hours for each of the 24 respondents in Rounds 1 and 3 (36 hours) and one burden hour for each of the eight people in Round 2 (8 hours), for a total of 44 burden hours.


2. Usability testing


After changes to the 2007 version of the questionnaire and web survey, three rounds of usability interviews were conducted in preparation for the 2008 GSS. These interviews tested the 2008 improvements to the layout and wording of the GSS questionnaire and the navigation of the web survey. RTI conducted the three rounds of testing in Washington, DC; Research Triangle Park, NC; and Chicago, IL, with 32 participants. The first two rounds of testing evaluated the PDF questionnaire as a paper prototype for the web survey. Respondents were shown both a landscape and a portrait version of the paper questionnaire to determine their preferences. Respondents liked the new portrait layout, which was incorporated into the 2008 GSS. Findings from the first two rounds of testing were used to revise the web survey that was tested in the final round. Findings from the third round were incorporated into the final 2008 survey instrument.


The GSS usability tests required one hour of burden on average for the 16 respondents in Rounds 1 and 2 (16 hours) and 1.5 burden hours for eight respondents in Round 3 (12 hours). This totals 28 burden hours for the three rounds of usability tests.


Microbusiness R&D Exploratory Work


SRS is exploring whether very small companies conduct R&D or otherwise engage in innovation-seeking activities. This initial work examined whether such firms can be surveyed as part of NSF’s annual Business R&D and Innovation Survey (BRDIS), conducted jointly with the U.S. Census Bureau, or whether they should be covered in a separate, tailored data collection. Currently, companies with fewer than five employees are excluded from the BRDIS sample. In spring/summer 2008, SRS worked with survey methodologists from the Energy Information Administration to identify, recruit, and interview independent companies with fewer than five employees. Twenty-five cognitive interviews were conducted with companies in the vicinity of Washington, DC; Philadelphia, Pennsylvania; and Boulder, Colorado. The interviews explored the ability of very small companies to answer R&D questions from the larger survey as well as new questions on innovation. The results of these interviews were used in constructing a separate questionnaire for microbusinesses. SRS plans to conduct preliminary testing of the questionnaire in 2009.


The interviews took between 45 and 75 minutes. A total of 25 burden hours is estimated for this project, assuming an average of one hour for each of the 25 visits.



Postdoc Data Project (PDP)


The PDP is a multi-year project to examine the feasibility of collecting data on the activities of postdoctoral researchers (postdocs) across all employment sectors in the United States. In 2008, SRS conducted two projects under the generic clearance to continue to pursue the measurement of postdocs.


  1. Focus Groups with Postdocs


For the first project, SRS conducted four focus groups with postdocs: two groups with postdocs holding non-U.S. PhDs and two with postdocs holding U.S. PhDs. Participants received an incentive payment of $50. The purpose of the focus groups was to explore measures of postdoc activities and prioritize topics for a possible survey of postdocs. Several topics were found to be important to postdocs, including opportunities within the postdoc position, funding support, balancing the advisor’s needs with the postdoc’s career development, and transitioning from a postdoc to a permanent position. For the most part, foreign- and U.S.-degreed postdocs gave similar ratings to these topics. In addition, postdocs with foreign degrees gave high ratings to topics including whether a postdoc leads to increased opportunities for academic positions in their home country, how postdocs can improve English language skills, and whether an academic postdoc appointment is preferable considering the visa delays associated with industry postdoc appointments. Focus group participants selected publishing papers and obtaining a permanent job as the best measures of success for postdoc appointments. Findings from the focus groups are being used in designing future data collection instruments.



The four focus groups averaged 110 minutes in length, resulting in a total burden of 63 hours (110 minutes times 34 participants = 3,740 minutes).

  2. List Assessment Interviews (LAIs)

For the second project, SRS conducted three LAIs, designed as quick-turnaround surveys of purposive samples to assess specific lists as sources for a survey frame of postdocs. One LAI investigated a list of establishments; two investigated lists of individuals. The three LAIs are described below.


a. LAI 1: Nonprofit Organization Lists: IRS Form 990

SRS conducted a short survey to assess IRS Form 990 as a potential source for a frame of nonprofit establishments that employ postdocs. Of the 1,007 nonprofit organizations selected, 26 percent (263 organizations) completed the survey by web or by phone interview. However, 263 of these organizations were ineligible for the LAI. Reasons for ineligibility included lack of R&D for most organizations (89 percent), lack of nonprofit status for a handful of respondents, and lack of employment of postdocs. Given the very high ineligibility rate for R&D activities of those completing the first stage interview, SRS discontinued this project and is pursuing other approaches to locate nonprofit organizations that employ postdocs.


The 263 organizations that responded to the survey took 4 minutes, on average, to complete it. This resulted in approximately 18 burden hours (4 minutes times 263 respondents = 1,052 minutes).


b. LAI 2: Individual Postdoc Lists from the NSF Grant Database


SRS conducted a short web survey to assess the NSF Grants Database as a potential source for a frame of individual postdocs. SRS surveyed 207 academic and non-academic postdocs identified from the database. Of the 131 respondents (63 percent), three-fifths (60 percent) reported themselves as postdocs in 2006. Findings from the survey showed that “temporary status” was the single best criterion to distinguish postdocs from non-postdocs and “intention to provide training in research skills” was the second-best criterion.



The survey took, on average, 20 minutes to complete. SRS used approximately 44 burden hours (20 minutes times 131 respondents = 2,620 minutes).

c. LAI 3: Individual Postdoc List from Organizations


SRS conducted a short web survey of persons appearing on lists of postdocs identified in previous PDP efforts to assess the lists as a potential source for a frame of individual postdocs. Sources of these lists included academic organizations, a national professional association, and other establishments. The survey asked the individual about job characteristics and other attributes in order to establish the respondent’s match to PDP’s definition of a postdoc. Of the 3,724 web invitations sent, 911 individuals (24 percent) responded. The study findings showed that just over half of those respondents considered themselves to be postdocs.


The survey took, on average, 12 minutes to complete. SRS used approximately 182 burden hours (12 minutes times 911 respondents = 10,932 minutes) on this project.



The 2008 calendar year was a high point for the number of projects under the SRS generic clearance. Through a variety of methodological tasks, SRS staff sought to improve the quality of current and potential SRS data collections. SRS expects that future activities under the generic clearance will continue to contribute to increasing data quality and improving the survey experience for respondents.


Sincerely yours,

Fran Featherston

Division of Science Resources Statistics

National Science Foundation


cc:

S. Plimpton (IRM/DAS)

L. Carlson (SBE/SRS)

M. Frase ( ‘’ )

J. Jankowski ( ‘’ )

N. Leach ( ‘’ )

S. Cohen ( ‘’ )


