
SUPPORTING STATEMENT

Evaluation of the National Science Foundation’s Science, Engineering, and Education for Sustainability (SEES) Portfolio of Programs


OMB Control Number 3145-NEW



A. Justification


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


This ICR is for a new data collection associated with the Evaluation of the National Science Foundation’s (NSF) Science, Engineering, and Education for Sustainability (SEES) Portfolio. This supporting statement seeks approval to conduct a new data collection using an Internet survey of principal investigators (PIs) who received NSF funding from the SEES portfolio of programs or from similar non-SEES NSF programs. The SEES portfolio encompasses 17 cross-directorate programs with varied target grantee audiences.


NSF has contracted with the Manhattan Strategy Group to conduct an evaluation of the SEES portfolio. The evaluation is designed to determine NSF’s success in achieving program- and portfolio-level goals. The evaluation seeks specifically to measure the success of SEES in terms of:

  1. the development of new knowledge and concepts that advance the overarching goal of a sustainable human future,

  2. new and productive connections made between researchers in a range of disciplines, and

  3. the development of a workforce capable of meeting sustainability challenges.


The overall SEES evaluation includes three tasks: a historical review (Task 2), comparative analyses (Task 3), and a network analysis (Task 4). The Internet survey under this clearance is necessary for Tasks 3 and 4, and will address the following research questions:


  1. Research Question Guiding the Comparative Analyses (Task 3): Do activities conducted and programs developed under the SEES Portfolio achieve different outcomes than similar non-SEES NSF programs?


  a. Comparative Analysis of SEES and non-SEES Programs

    • How do SEES and select non-SEES programs compare when examining project-level characteristics such as award size, number of applicants, and project duration?

    • How do SEES and select non-SEES programs compare when examining research objectives, outcomes, and impacts of projects awarded?

    • How do SEES and select non-SEES programs compare when examining the composition of project teams and project collaborators?

  b. Comparative Analysis: Understanding the Added Value of the SEES Portfolio

    • In what ways, if at all, has the SEES portfolio increased public understanding of sustainability issues?

    • To what extent has the SEES portfolio increased outreach and interest in environmental sustainability issues from the research and education community?

    • To what extent, if at all, has SEES been covered in the media?

  c. Comparative Analysis: Comparison of Workforce Development and Training Activities

    • What are the career trajectories and pathways of PIs, co-PIs, postdocs, or students in SEES and comparable non-SEES projects?

    • What post-award education and training do PIs, co-PIs, postdocs, or students undertake in SEES and comparable non-SEES projects?

    • To what extent do PIs, co-PIs, postdocs, or students in SEES and comparable non-SEES projects remain or plan to remain in science and engineering employment?


  2. Research Question Guiding the Network Analysis: Do SEES programs foster connections and collaborations among researchers in various fields of sustainability science and engineering?

  a. Collaboration Indicators

    • Do the collaboration networks of investigators change pre/post participation in SEES?

    • How do the collaborations of SEES investigators vary by the characteristics of projects and PIs?

  b. Influence of Participation in the SEES Program on Individuals’ Networks

    • To what extent, if at all, have SEES PIs and co-PIs developed an interdisciplinary network?

    • Do the collaborative ties of investigators increase and/or persist after receiving a SEES award?

  c. Comparison of Networking Activities of SEES and Non-SEES Individuals

    • Do SEES investigators collaborate more and have more interdisciplinary collaborations than similar investigators funded by other NSF programs?

    • How do the collaborations of SEES investigators differ in character from those of similar investigators funded by other NSF programs?




2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


The survey results, along with the totality of the SEES Evaluation, will be used by NSF to assess the success of the SEES portfolio in (1) achieving different outcomes in sustainability science, research, and engineering as compared to other investments made across the Foundation, and (2) fostering the development of an interdisciplinary network of researchers in sustainability science, as compared with similar non-SEES NSF programs that fund research and education initiatives.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


The survey of SEES and comparable non-SEES PIs will be conducted using Internet survey software provided by SurveyMonkey. The Internet delivery system will allow respondents to take the survey at any time within the data collection window. The online survey allows respondents to save responses and return to the survey later to finish, giving them the flexibility to choose the best time to complete the instrument. It also automates the flow of the instrument through skip patterns and questions dependent on responses to previous questions. Respondents will receive invitations and reminders to complete the survey via e-mail. Completion rates will be tracked in real time. The Evaluation Team will follow up via telephone with respondents who do not complete the instrument after two e-mail reminders, in an attempt to achieve the expected response rate of 80 percent.
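
To make the follow-up protocol concrete, the sketch below shows one way the tracking logic could be implemented. It is a minimal illustration only: the data structure and function names are ours, and nothing here is drawn from SurveyMonkey’s actual API.

```python
from dataclasses import dataclass

@dataclass
class Invitee:
    """Tracks one PI's progress through the survey follow-up protocol."""
    email: str
    completed: bool = False
    reminders_sent: int = 0

def next_action(invitee: Invitee) -> str:
    # Completed surveys need no further contact.
    if invitee.completed:
        return "none"
    # Up to two e-mail reminders are sent first.
    if invitee.reminders_sent < 2:
        return "email_reminder"
    # After two e-mail reminders, non-completers receive a telephone call.
    return "phone_followup"

# Example: a PI who has not completed the survey after two e-mail
# reminders is flagged for telephone follow-up.
pi = Invitee(email="pi@example.edu", reminders_sent=2)
assert next_action(pi) == "phone_followup"
```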


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


Grant application packages—including grant team profiles, proposals and annual reports by grantees, and internal directorate and program documentation—will be used to gather a significant portion of the data used for the overall evaluation of the SEES portfolio of programs.


Still, NSF does not systematically collect information on grantees beyond the end of the award. Data on issues central to the evaluation of SEES outcomes cannot be drawn from the extant data the Foundation collects. Outcome measures in research projects often reach maturity only after the grant ends; for instance, the length of time between getting research funded and publishing the results commonly exceeds the duration of the NSF project. In addition, important and unique SEES goals target the development of interdisciplinary networks of collaboration among scholars to tackle science and engineering questions indispensable to sustainability. These networks cannot be measured exclusively through extant documentation. Similarly, data on post-award workforce development need to be collected to determine the effect of SEES on the career pathways taken by grantees.


The questions included in the survey instrument were kept to the minimum set of topics and indicators that could not be addressed via extant documentation. We conducted an extensive mapping of the data needed against alternative sources of information to ensure that the survey includes only questions that cannot be answered with data already collected or recorded.


5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


No small entities will be involved in this study.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


This data collection will occur one time only. Without the survey of PIs, NSF would not be able to assess whether the SEES investment achieved its goals related to workforce development, career pathways, and interdisciplinary collaboration to address the pressing research issues in science and engineering germane to sustainability. The Foundation would be unable to get a complete understanding of the accomplishments and shortfalls of the SEES portfolio approach to sustainability research. Not collecting this information means that the effectiveness of the SEES portfolio cannot be determined.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* requiring respondents to report information to the agency more often than quarterly;

* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* requiring respondents to submit more than an original and two copies of any document;

* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


The request fully complies with the regulations. None of the above special circumstances apply.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported. Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


The 60-day notice soliciting public comments was published in the Federal Register on May 29, 2015 (FR Doc. 2015-13041). No comments were received in response to the notice.


An evaluation advisory group was convened to provide consultation on all aspects of the Evaluation of the SEES Portfolio of Programs. The advisory group reviewed evaluation documents, including the evaluation plan and drafts of each component of the evaluation completed to date to provide comments and recommendations as well as insight into available data sources to address the research questions and sub-questions posed by the SEES evaluation.

Additionally, a pretest of the survey instrument was conducted with eight PIs receiving SEES funding to verify the response time and to ensure the clarity and relevance of the survey questions.



9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


No incentives will be used in this data collection.



10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


Reports prepared for this study will summarize findings across the sample and will not associate responses with specific individuals. No personally identifiable information will be shared with anyone outside the study team.


Responses to this data collection are voluntary. Respondents will be fully informed about the purpose of this study and neither the names of respondents nor their affiliations will be included in any reports. Completed surveys will be maintained in a password-protected database. Comments made through the survey will not be attributed to specific individuals in the report or any other publications resulting from this project.


All researchers working with the survey data will take the following precautions to ensure the privacy of all data collected:


  • All staff on the project will be instructed in the privacy requirements of the study and will sign statements affirming their obligation to maintain privacy;

  • Data files for analysis will contain no personal identifiers for program participants;

  • Analysis and publication of study findings for the participant survey will be in terms of aggregated statistics only; and

  • Any quotations from responses to open-ended questions used in public reporting will be reviewed to ensure that the identity of the respondent cannot be ascertained.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


NSF has long worked to promote the inclusion of women and underrepresented minorities in the science and engineering research community. NSF’s Committee on Equal Opportunities in Science and Engineering (CEOSE) was established by Congress in 1980 to “address the problems of growth and diversity in America’s STEM workforce.”[1] CEOSE reports biannually to Congress[2] on NSF’s efforts to broaden the participation of women, underrepresented minorities, and people with disabilities. The Foundation’s efforts extend to sponsoring education initiatives that encourage an interest in science among youth as early as their K-12 education.


For this reason, the Foundation is interested in the effects of SEES on the inclusion of women and underrepresented minorities in its sustainability grants. As a result, the survey instrument (Appendix B) includes questions regarding the race and gender of researchers in the respondents’ network of most important collaborators.


12. Provide estimates of the hour burden of the collection of information. The statement should:

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under “Annual Cost to Federal Government.”


The Paperwork Reduction Act requires the agency to account for the amount of burden that it is placing on the public when seeking information on behalf of the Federal government. This burden is measured in terms of hours (see Table A-1) and includes the following activities:


  • Reviewing instructions;

  • Using technology to collect, process, and disclose information;

  • Adjusting existing practices to comply with requirements;

  • Searching data sources;

  • Completing and reviewing the response; and

  • Transmitting or disclosing information.


Respondents to this collection of information are PIs who received grants from SEES and comparable non-SEES sustainability programs. Approximately 430 individuals will receive the survey via e-mail. The survey will occur one time only. In pretesting the survey instrument with eight PIs, the average time taken to complete the instrument was 45 minutes. The burden computations presented in the table below assume an 80 percent response rate.[3]


Table A-1: Estimated Hour and Annual Cost Response Burden

Survey of PIs

  Data Collection Activity   Number of respondents   Time to complete questionnaire (minutes)   Annual burden (hours)   Cost
  Respondents                344                     45                                          258                     $15,676.08
  Non-respondents            86                      2                                           2.87                    $174.18
  Total                      430                     -                                           260.87                  $15,850.26

Note: Hourly wages were calculated using an annual wage of $126,390 (the BLS estimate of the 90th-percentile annual wage for engineering professors in the US), for an hourly wage of $60.76. Retrieved from http://www.bls.gov/oes/current/oes251032.htm
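
For transparency, the figures in Table A-1 follow directly from the stated assumptions (430 invitees, an 80 percent response rate, 45 minutes per respondent, 2 minutes per non-respondent, and an hourly wage of $60.76). The short calculation below reproduces them; it is a minimal sketch, and the variable names are ours rather than part of the official burden worksheet. Small differences in the final cents reflect rounding.

```python
# Reproduces the Table A-1 burden and cost estimates from the stated assumptions.
HOURLY_WAGE = round(126_390 / 2_080, 2)   # BLS annual wage / 2,080 hours = $60.76

total_invited = 430                       # PIs receiving the survey
response_rate = 0.80                      # expected response rate (Welch & Barlau, 2012)

respondents = round(total_invited * response_rate)      # 344
nonrespondents = total_invited - respondents            # 86

resp_hours = respondents * 45 / 60        # 45 minutes each -> 258.0 hours
nonresp_hours = nonrespondents * 2 / 60   # 2 minutes each  -> ~2.87 hours

resp_cost = resp_hours * HOURLY_WAGE      # ~= $15,676.08
nonresp_cost = nonresp_hours * HOURLY_WAGE  # ~= $174.18

print(f"Total burden: {resp_hours + nonresp_hours:.2f} hours, "
      f"cost ${resp_cost + nonresp_cost:,.2f}")
```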


13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collections services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


No other costs to respondents or record keepers are anticipated.


14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.


Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.


The overall cost of this research to the Federal Government is presented in Exhibit A-2. Three-year overall costs total $281,304.29.

Exhibit A-2. Overall Cost to the Federal Government

  Category                              Base Year Costs   Option Year 1 Costs   Option Year 2 Costs   Total
  Instruments                           $26,575.35        -                     -                     $26,575.35
  OMB Clearance                         $7,958.30         -                     -                     $7,958.30
  Data Collection/Analysis/Reporting    $0                $114,453.13           $132,317.51           $246,770.64
  TOTAL                                 $34,533.65        $114,453.13           $132,317.51           $281,304.29


The overall costs to the Federal Government presented above cover survey design, including sampling plan and instrument development, data collection activities, data analysis, and reporting. The contract includes a base year for designing, planning, and piloting the survey, with option years 1 and 2 for survey implementation and analysis.


15. Explain the reasons for any program changes or adjustments reported on the burden worksheet.


This is a new data collection.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


The exhibit below details all deliverables associated with the Survey of PIs discussed in this ICR. The timetable lists the due dates for the drafts and final versions of the following reports: Task 3.A, Comparative Analysis of SEES and Non-SEES Programs; Task 3.C, Value of SEES as a Portfolio of Programs Indicators Report; Task 3.D, Comparison of Workforce Development and Training in SEES and Non-SEES Programs; Task 4.A, Collaboration Indicators Report; Task 4.B, Influence of Participation in SEES; Task 4.C, Comparison of Networking Activities between SEES and Non-SEES Participants; and the Final Evaluation Report of the SEES Evaluation.


Exhibit A-3. Timetable for Entire Project

  Task                                                                                                      Due Date
  Task 3.A: Draft Report on Comparative Analysis of SEES and Non-SEES Programs                              02/05/2016
  Task 3.A: Final Report on Comparative Analysis of SEES and Non-SEES Programs                              10/03/2016
  Task 3.C: Draft Report on Value of SEES as a Portfolio of Programs Indicators Report                      03/03/2016
  Task 3.C: Final Report on Value of SEES as a Portfolio of Programs Indicators Report                      11/04/2016
  Task 3.D: Draft Report on Comparison of Workforce Development and Training in SEES and Non-SEES Programs  08/04/2016
  Task 3.D: Final Report on Comparison of Workforce Development and Training in SEES and Non-SEES Programs  02/03/2017
  Task 4.A: Draft Collaboration Indicators Report                                                           02/05/2016
  Task 4.A: Final Collaboration Indicators Report                                                           06/03/2016
  Task 4.B: Draft Report on Influence of Participation in SEES                                              11/04/2016
  Task 4.B: Final Report on Influence of Participation in SEES                                              03/06/2017
  Task 4.C: Draft Report on Comparison of Networking Activities between SEES and Non-SEES Participants      12/02/2016
  Task 4.C: Final Report on Comparison of Networking Activities between SEES and Non-SEES Participants      04/03/2017
  Draft Evaluation Report                                                                                   08/04/2017
  Final Evaluation Report (with accompanying database)                                                      12/01/2017


The evaluation analysis will also use supplemental information from extant data collected through NSF’s administrative system related to SEES and comparable non-SEES project characteristics. These data will be combined with primary survey data (from the proposed data collection) to examine the outcomes of SEES.


The survey data will allow the research team to supplement the administrative data gathered on SEES and comparable non-SEES projects by asking respondents to elaborate on their project characteristics, team composition, project outputs, and contributions, as well as their current and future employment. The survey will consist of open- and closed-ended questions. Open-ended questions will be coded and analyzed using qualitative data analysis software. A structured coding procedure will be used to identify initial themes or trends within responses, as well as relationships among the themes. Responses to closed-ended questions will be tallied using the survey software and exported to statistical software for analysis. We will first generate descriptive statistics to summarize the characteristics of SEES and comparable non-SEES projects and provide baseline tests of group differences. We will then model program effects using regression models that control for project- and program-level characteristics, testing program effects with more statistical precision; a sketch of this modeling step appears below.
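
As a concrete illustration of the modeling step, the sketch below fits a regression of a project outcome on a SEES indicator with project-level controls. It is illustrative only: the file name and column names (is_sees, n_publications, award_size, duration_years, team_size) are hypothetical placeholders for variables that would be constructed from the survey and administrative data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per project, combining survey
# responses with NSF administrative data.
df = pd.read_csv("projects.csv")

# Baseline group difference: descriptive statistics of the outcome
# for SEES versus non-SEES projects.
print(df.groupby("is_sees")["n_publications"].describe())

# Regression of the outcome on SEES participation, controlling for
# project-level characteristics (award size, duration, team size).
model = smf.ols(
    "n_publications ~ is_sees + award_size + duration_years + team_size",
    data=df,
).fit()
print(model.summary())
```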


The survey will also allow for a network analysis to address the research questions regarding collaboration activities of PIs in SEES and non-SEES programs. Network analysis is a set of descriptive quantitative methodologies designed to evaluate the interconnections among a given set of social actors.


There are two types of network analysis: neighborhood and egocentric analysis. At the "neighborhood" level, a set of NSF awardees are conceptualized as actors, and any two are considered "connected" if they are senior investigators on the same NSF-funded grant. The interconnectedness of the network is then considered. For the SEES evaluation, researchers will examine network "components" (i.e., clusters of nodes strung together to form a single structure, like a molecule). Analyses for the neighborhood evaluation of SEES will consist of statistical comparisons of rates of network density and interdisciplinarity to test for significant differences between the SEES and non-SEES samples of PIs. The statistical test will be based on the ratio of the two rates (e.g., the network density or interdisciplinarity rates for SEES and non-SEES PIs). This ratio has a known variance that can be used to calculate a standard error.[4] This standard error will help us evaluate whether the rate ratio is different from one, i.e., whether the interconnectivity of the network is significantly reduced when SEES investigators are removed. The analysis will then examine the SEES sample to identify characteristics of projects that are significantly related to the rate of interdisciplinary collaboration.

In "egocentric" analysis, the ego is at the center of the network; in a survey context, the ego is the survey respondent. This "ego" is connected to several "alters" about whom we want information. Involvement in SEES is then an "ego-level" variable, which the evaluation team will use to test differences between SEES and non-SEES PIs that are plausibly due to network involvement.
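
To illustrate the rate-ratio test described above, a standard large-sample approach (in the spirit of the variance formulas in Flanders, 1984) works on the log scale, where the variance of the log ratio of two independent Poisson-distributed counts is approximately the sum of the reciprocal counts. The sketch below, using purely illustrative numbers, computes the rate ratio and a 95 percent confidence interval; the test asks whether the interval excludes one.

```python
import math

# Illustrative counts of interdisciplinary collaborations and numbers
# of PIs observed in each group (hypothetical values, not study data).
sees_events, sees_n = 120, 200    # SEES PIs
comp_events, comp_n = 80, 200     # comparison (non-SEES) PIs

rate_ratio = (sees_events / sees_n) / (comp_events / comp_n)

# Large-sample approximation: Var(log RR) ~= 1/events_1 + 1/events_2.
se_log_rr = math.sqrt(1 / sees_events + 1 / comp_events)

# 95% confidence interval on the ratio scale; a ratio "different from
# one" means the interval excludes 1.
lo = math.exp(math.log(rate_ratio) - 1.96 * se_log_rr)
hi = math.exp(math.log(rate_ratio) + 1.96 * se_log_rr)
print(f"RR = {rate_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```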



17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


OMB approval information for the data collection, including the expiration date, will be displayed at the beginning of the survey questionnaire. The following statement will be attached to the data collection instrument:


“The OMB control number for this project is 3145-XXXX. Public reporting burden for this collection of information is estimated to average 45 minutes per respondent, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspects of this collection of information, including suggestions for reducing this burden to Suzanne H. Plimpton, Reports Clearance Officer, National Science Foundation, 4201 Wilson Boulevard, Suite 1265, Arlington, Virginia 22230 or send email to [email protected].”


18. Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions.”


There are no exceptions to the certification statement.


[1] Committee on Equal Opportunities in Science and Engineering. 2011-2012 Biannual Report to Congress: Broadening Participation in America’s STEM Workforce. Retrieved from http://www.nsf.gov/od/iia/activities/ceose/reports/Full_2011-2012_CEOSE_Report_to_Congress_Final_03-04-2014.pdf

[2] National Science Foundation, Office of Integrative Activities (OIA), Committee on Equal Opportunities in Science and Engineering (CEOSE). Retrieved from http://www.nsf.gov/od/iia/activities/ceose/

[3] Welch and Barlau (2012) conducted a mail survey of PIs of completed NSF education grants and achieved an 81 percent response rate. Retrieved from http://www.colorado.edu/ibs/decaproject/pubs/Survey%20nonresponse%20issues%20Implications%20for%20ATE%20PIs%20researchers%20%20evaluators.pdf

[4] Flanders, W.D. (1984). Approximate variance formulas for standardized rate ratios. Journal of Chronic Diseases, 37(6), 449-453.


