
Experimental Program to Stimulate Competitive Research Jurisdictional Survey

Responses

OMB: 3145-0225


Summary Recommendation: At the most basic level, the promised attachments were not found in ROCIS. This submission needs an overview of both the program and the information collection to provide a more coherent understanding of the effort to be undertaken. It is not clear from the current version how each part fits with the others, or even what the purpose of the collection is. The survey and protocol appear to be in draft form without the necessary PRA information.

Attachments have been modified in response to comments.

NOTE: Responses to the points raised are in italics and red font.

Introduction:

Q: Please provide more information, particularly context and purpose, of this program.

A: The entire introduction of Supporting Statement A has been changed to provide more contextual information. Changes appear in highlighted text in sections A1 and A2.

Q: Is this a university-based program? A program administered by research institutes? Please explain.

A: The funding mechanism for the EPSCoR program is found in the revised Supporting Statement A.1, specifically in the section beginning at the bottom of page 6.

Q: What are the research and/or policy questions driving this study? What are the tangible benefits of funding? What is the quality of research that is produced from funded projects?

A: Specific policy questions driving the study are found in the revised Supporting Statement A.2, specifically at the bottom of page 8 and top of page 9.

Q: Does the third evaluation (historical) overlap with the first evaluation (active EPSCoR programs)?

A: The two approaches are partially overlapping with respect to the subject material, but are wholly separate in terms of process and timing. Both cover the current NSF EPSCoR program – the Academies panel as part of a current assessment of the active EPSCoR and EPSCoR-like programs (NSF, NIH, DOE, NASA, USDA) and this study as part of a historical assessment of NSF’s effort. At the same time, the two studies adopt different approaches. The Academies panel uses the NAS process, based heavily in consensus statement of the expert panelists. This study will have greater opportunity for primary data collection and secondary data analysis. This study is a two-year effort, with the report due in December 2013. The Academies panel likely will be concluded earlier.

Q: Please explain why some jurisdictions are exempted from standard reporting practices.

A: No jurisdictions are exempted from standard reporting practices. The statement, “A specific rationale is to collect data needed from jurisdictions which are not a part of standard reporting practices, and which will provide a more nuanced understanding of activities carried out under the auspices of EPSCoR-supported research and related activities,” contained a misplaced clause: “which are not a part of standard reporting practices” modifies “data,” not “jurisdictions.” What is meant is that the rationale is to collect supplemental data from the jurisdiction, not to collect data from supplemental jurisdictions.

Q: Please present a visual depiction of how the various aspects of the evaluation fit together for the purpose of greater clarity.

A: See Figure 1 on page 14 of Supporting Statement A.

Q: Please distinguish between Project Directors and Program Administrators.

A: Project Directors are the principal investigators/EPSCoR overall leaders. They are either senior academics or leaders of state EPSCoR offices. Program Administrators are administrative staff. They typically do not hold faculty positions. (See footnote #2, page 11 of Supporting Statement A).

A.1. Circumstances Making the Collection of Information Necessary:

Site selection criteria:

Q: Please confirm or correct our understanding that all grantees form the universe for this collection and that a census approach will be taken, so that all will be invited to participate. Please clarify whether eligible, but not funded, programs will be included as well, as Supporting Statement A suggests.

A: All grantees form the universe for this collection, except for Missouri and Guam. Missouri has submitted an EPSCoR RII Track-1 proposal (under review). Guam has recently become EPSCoR-eligible but has not yet submitted an RII Track-1 EPSCoR proposal. See Supporting Statement B, page 5.

Q: Why are so many awardees excluded from possibly participating? (SS B.1)

A: The Research Infrastructure Improvement Awards are the largest single funding category within the EPSCoR program. Moreover, the Research Infrastructure Improvement awards are intended to build sustainable research capacity at a jurisdiction level (unlike either co-funding of standard NSF grants or the workshops), and so only these awards fall under the evaluation objective and primary study questions. Within the RII awards, the focus of the study is on the Track 1 Research Improvement Awards. The Track 2 and C2 awards are recent – they were made using ARRA funding – and so the results of RII EPSCoR support will not be known before the evaluation is complete. See Supporting Statement B, pages 5-6.

Site selection process:

Data collection process:

A.2. Purpose and Use of the Information Collection:

Q: Please explain the overarching purpose of this evaluation, to what ends the results will be applied, and how it will be used by NSF and the field at large (if at all).

A: Response to this question is provided in the revised Supporting Statement A.2.

A.3 Use of Improved Information Technology and Burden Reduction:

Q: Please define what appropriate information technology is.

A: Information technology to be used is described in the revised Supporting Statement A.3. The survey will be fielded online, using the Qualtrics survey software platform.

Q: Please describe how the qualitative data will be analyzed, presumably with specialized software?

A: Content analysis of qualitative data will be performed using NVivo qualitative data analysis software.

A.4. Efforts to Identify Duplication and Use of Similar Information:

Q: Please explain what statistical data from other public and private sources include.

A: Statistical data to be used are described in the revised Section A.4, on pages 12-14.

A.5. Impact on Small Businesses or Other Small Entities:

A.6. Consequences of Collecting the Information Less Frequently:

A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5:

A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency:

A.9. Explanation of Any Payment or Gift to Respondents:

A.10 Assurance of Confidentiality to Participants:

Q: Why is there only a promise of “open-ended survey questions should be reported in aggregate form”? Is there a possibility the data will not be reported in aggregate form?

A: Section A.10 has been revised. Data will be reported in aggregate form.

Q: Does the NSF statute allow for the granting of confidentiality as stated?

A: Yes. See language at the end of Section A.10.

Q: Please clarify or revise the following sentence: “Responding to the questions and the resulting division of labor therein will be divided and aggregated under the direction of the Project Director”. Please explain what this means.

A: The survey will be sent to each Project Director. Each jurisdiction has its own individual processes for storing historical data regarding EPSCoR awards, and so a variety of paths may be followed for completing the survey. In some jurisdictions, the Project Director may him/herself complete the survey; in others the Program Administrator or other EPSCoR staff (e.g., Education specialists, evaluators) may assist the Project Director; in other jurisdictions, members of the state committee, former EPSCoR participants, or other faculty members may become involved. As each jurisdiction is unique, discretion regarding how to best complete the survey is being left to the Project Director. Explanation is incorporated into revision to Section A.10.

A.11 Justification for Sensitive Questions:

A.12. Estimates of Annualized Burden Hours and Costs:

Q: The submission implies that jurisdictions that were eligible but were ultimately denied funding will participate in this effort. Please confirm or correct our understanding.

A: There are no jurisdictions that are eligible but ultimately denied funding. All jurisdictions with EPSCoR RII funding will participate. Explanatory text added to Supporting Statement A.12.

Q: Is 100% response realistic? Is this effort mandatory for grantees?

A: The effort is not mandatory for grantees, but it is nevertheless expected that all jurisdictions will participate. NSF has been publicizing the upcoming survey for nearly a year, and all Project Directors have been consulted regarding the survey’s design. RII Track-1 project teams have, as part of the Programmatic Terms and Conditions of their awards, the expectation to cooperate with NSF EPSCoR program evaluation activities. Based upon past experience, we therefore believe that the response rate will be near unity. Explanatory text added to Supporting Statement A.12.

A.13. Estimates of Other Total Annual Costs to Respondents and Record Keepers:

A.14. Annualized Cost to the Federal Government:

Q: The cost seems severely underestimated for 29 hours of interviews.

A: Each interview requires one hour of discussion, for a total of 29 hours. The hourly rate was estimated at $38.94; multiplying the two yields the estimated cost of $1,129.26. This is the figure in the updated Table A.12.
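For reference, the calculation behind that figure, using only the numbers stated above, is simply:

\[
29 \text{ interviews} \times 1 \text{ hour/interview} \times \$38.94/\text{hour} = \$1{,}129.26
\]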

A.15 Explanation for Program Changes or Adjustments:

A.16 Plans for Tabulation and Publication and Project Time Schedule:

Q: Please update the project schedule.

A: Table A.16 updates the schedule.

Q: Please explain what is meant by the sentence that begins with “Given the high degree of qualitative…”

A: As the sentence was not clear, it was split into two and slightly re-worded. The section (highlighted in Supporting Statement A.16) states, “Given the high degree of qualitative information to be collected, only basic descriptive statistical analyses will be conducted on a subset of close-ended questions. The primary emphasis of the survey is to enable in-depth analysis of qualitative data obtained through open-ended questions.”

Q: Who is the expert panel? What input should they provide?

A: The Supporting Statement erroneously referred to the contractor’s internal peer review processes; there is no “expert panel.” The statement was re-written as, “a draft report on the survey findings will be developed and reviewed through the contractor’s internal peer-review process by May 2013.”

Q: How will the final report be incorporated into NSF’s grant and portfolio management processes?

A: A statement as to how the final report will be incorporated into NSF’s grant and portfolio management processes was included in Supporting Statement A.2, at the top of page 9. As described, changes may range from redefinition of program goals and objectives to changes to reporting and administrative processes.

B.1. Collection of Information Employing Statistical Methods:

Q: Please justify the multiple exclusions.

A: As explained in the revised Supporting Statement A.1, the Research Infrastructure Improvement Awards (RII) are the largest single funding category within the EPSCoR program and are intended to build sustainable research capacity at a jurisdiction level (unlike either co-funding of standard NSF grants or the workshops). As a result, only RII awards fall under the evaluation objective and primary study questions. Within the RII awards, the focus of the study is on the Track 1 Research Improvement Awards. The Track 2 and C2 awards are recent – they were made using ARRA funding – and so the results of EPSCoR support will not be known before the evaluation is complete.

B.2 Describe the Procedures for the Collection of Information:

Q: No Attachments are located in ROCIS.

A: Updated attachments are provided.

Q: How will the data be analyzed?

A: Data collected through the survey will be analyzed using content analytic methods in NVivo, a qualitative data management and analysis software package for the social sciences.

B.3. Describe methods to maximize response rates and to deal with issues of non-response:

Q: Please describe alternate modes or approaches to combat non-response. Using the same mode three times consecutively while expecting a different response seems less than optimal.

A: In response to this comment, the strategy for combating non-response has been modified. First, a pre-notification letter announcing the survey will be sent via US mail on NSF letterhead. A week after this first mailing, the contractor will follow up with an email containing the embedded weblink. Subsequent email reminders to non-respondents and those with incomplete surveys will be sent at different times and on different days of the week in order to accommodate different work schedules. After three email reminders, a reminder phone call will be made directing respondents back to the weblink. See revised text in Supporting Statement B.3.

Q: Perhaps the team should take lessons from other online survey efforts and include successful components of these prior works, such as a bar across the top showing progress.

A: A percentage completion bar has been added to the survey. See Attachment 1.

B.4 Describe any tests of procedures or methods to be undertaken:

Q: How many PDs were contacted for comment?

A: All 29 PDs (the full universe) were informed at the Spring PA/PD meeting held at NSF that the evaluation effort would include a survey. Each PD was given the opportunity to comment during an open session, and opportunities for individual feedback were also provided. Should the survey be approved in advance of the next PA/PD meeting, to be held in January 2013, the contractor will be invited to present the survey and to discuss any issues related to its completion.

Instruments and consent materials:

The Paperwork Reduction Act of 1995 (44 USC §3506) and implementing regulations (5 CFR 1320.5) require that the authority for the collection (who and under what statute), purpose, use, voluntary nature (or whether it is a mandatory collection), and privacy offered (and under what statute; if no statute, private to the extent permitted by law), if any, be conveyed to the participant of a study, in addition to explaining the length of the study. Please ensure that the PRA burden statement, OMB control number, and expiration date are also provided on all study materials viewed by the participant. This is currently not the case. Please make sure that interview participants hear/read/receive the same information. Please send revised materials that incorporate this information.

A: See updated Attachments. The statements are now included at the start of the survey and appear in the screenshots, along with the OMB control number, which appears at the top of each survey page. Additionally, the expiration date will be added to the survey header. See Attachment 1.

Survey

Please include an introduction, a thank-you, the expected completion time, etc.

A: Incorporated onto survey face page. See Attachment 1.

Please include sample screen shots with the next submission.

A: See Attachment 1.

Please provide the protocols for the interviews.

A: Attached.

Question 7 does not seem to flow logically from Question 6, though that may be the intention.

A: Question 6a has been reworded to, “How are decisions made about choosing research themes…” This makes the link from Question 6 to Question 7.

There is no reason for Questions 10 through 13 to differ. Simply use the word ‘institution’ and there is no need to create four different versions of the question.

A: There are two reasons for creating different versions of the question. First, an EPSCoR RII award may involve any (or all) of these institution types; asking the question separately is intended to ensure completeness of response, so that all institutions participating in EPSCoR where academic infrastructure has been strengthened will be addressed. Second, it is expected that the nature of capacity development may differ from institution type to institution type; creating separate boxes for response will facilitate analysis.

Please justify why there are no follow-up questions to Question 15, such as number or why/why not.

A: There is not an expectation that RII funding will necessarily lead to the establishment of independent research entities. There is, however, a follow-up question (15a) which seeks to understand how such new centers or institutes will be sustained financially post-award as a measure of sustainability.

Protocol

Question 4 seems better suited for the survey; please justify why this is on the interview protocol.

A: The question will instead be posed to each state committee chair as “Please describe the structure of your state committee,” and the various structures will be used as prompts should they be needed.

Question 5’s formatting hides the 12 parts of the question. Please revise.

A: The following are types of activities in which state committees can play a role: (a) writing, updating, and maintaining a state S&T plan; (b) serving as a managing agent for RII awards; (c) developing RII proposals; (d) selecting or replacing RII PDs; (e) coordinating with other federal EPSCoR programs in the jurisdiction; (f) facilitating coordination across institutions of higher learning in the jurisdiction; (g) facilitating coordination with other stakeholders; (h) managing other jurisdiction-level R&D funding programs; or (i) managing jurisdiction-level training programs intended to broaden participation.

During the interviews, each State Committee will be asked about the role that the State Committee plays in each of these activities.



Question 7b implies meaningful follow-up questions, such as on what basis are state committee members appointed, etc.

A: An additional follow-up question has been added to address the basis on which state committee members are appointed.

Additional Comments received Thursday 11/15/2012:

Further Comments on 201208-3145-002 Experimental Program to Stimulate Competitive Research (EPSCoR) Survey

 

  • The “EPSCoR Background and Description” text at the beginning of the response memo does a better job of explaining the program and the evaluation than the existing SS text, and should be inserted into the revised SS before Part A.1.



A: Text included in Supporting Statement A.1.

  • Part A.1, on the necessity of the collection, is not very strong, in large part because its first paragraph had to explain (albeit not clearly) the program. It would be better revised to discuss the need for a comprehensive evaluation of the program, building on the existing second paragraph of A.1 and the response to the passback question on the potential overlap between this study and the NAS study. That said:

    • It is surprising that the SS only references the 1988 authorization of EPSCoR (42 USC 1862g; P.L. 100-570) in passing, yet never mentions or describes the effective reauthorization of the program in 2011 (under the America COMPETES Reauthorization Act of 2010, P.L. 111-358; specifically, 42 USC 1862p-9). Surely, a more effective need/justification statement could be crafted based on that legislative language’s emphasis on assessment and on documenting efforts, accomplishments, and improvements. That language also directly authorizes the NAS study and explicitly makes it a broader review of all EPSCoR-like programs, which the memo text hints at (but which is important to clarify in justifying the need for an NSF-specific evaluation).

A: Language added to Supporting Statement A.1 making reference to the reauthorization as the justification for the survey.

    • The “National Academies of Science” reference in the existing A.1 is invalid; it should either be “National Academy of Sciences” or the “National Academies”. The memo’s statement that “the Academies process uses the NAS process, based heavily in consensus statement of the expert panelists” reads disparagingly, and the revised text would better distinguish between the broader, high-level review of the NAS study and the more program-specific evaluation NSF is conducting (that the NSF review will “have greater opportunity for primary data collection and secondary data analysis” is a good and valid point).


A: Reference to National Academies has been removed from Supporting Statement A.1.

  • The response memo references (by combining it with a few others) but doesn’t directly answer a basic question---what exactly is an EPSCoR “jurisdiction”? (Is it strictly a state-awardee function, as suggested by the mentions of some non-participants? Is it universities or groups of universities? Is it some combination?)

A: As mentioned above, the definition of “jurisdiction” has been incorporated into Supporting Statements A and B. A “jurisdiction” is a U.S. state, a commonwealth (Puerto Rico), or a territory (the U.S. Virgin Islands).

  • And, with that, the added text in the memo on the data analysis may muddle things a bit. The memo proposes to do regression analyses after constructing a dataset from a variety of public and private sources, including an EPSCoR-participant dummy variable. It’s unclear what the rows of this dataset are (states?) and, if so, how some of these will be constructed. For instance, for Web of Science publications, would you be trying to count relevant articles by any author from a particular state? Other variables on the list are clearly state-level attributes (e.g., state population) but others are university-level attributes (e.g., Carnegie university quality rankings), hence the confusion.



A: Regression analyses will occur at two levels: the state level and the university level. In the state-level analyses, each row would be a state-year combination. Dependent variables would include the percentage of NSF funding going to the jurisdiction, the percentage of publications with at least one author in the jurisdiction, etc. In the state-level regressions, however, independent variables may include the number of Research One/Research Two institutions in the jurisdiction (based on Carnegie rankings), which is a state-level rollup of university-level data. In the university-level analyses, each row would be an institution-year combination. Dependent and independent variables would be constructed using university-level information (e.g., NCSES data).

Specifically with respect to using publications as a dependent variable, searching on the author fields (addresses of each individual author; address of the corresponding author) provides both the number of publications with any author in the U.S. and the number of publications with one or more authors in each state.
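To make the structure of the state-level dataset concrete, a minimal, hypothetical sketch of a state-year panel regression of the kind described above follows. The variable names, synthetic values, library choices (pandas and statsmodels), and model specification are illustrative assumptions, not the contractor’s actual analysis plan; the EPSCoR variable is shown as a 0-1 dummy but could equally be a count (e.g., number of faculty hired with EPSCoR funds).

```python
# Hypothetical sketch only: synthetic data and an assumed model specification.
import pandas as pd
import statsmodels.formula.api as smf

# Each row is a state-year combination, as described in the response above.
panel = pd.DataFrame({
    "state":        ["KS", "KS", "VT", "VT", "CA", "CA"],
    "year":         [2009, 2010, 2009, 2010, 2009, 2010],
    # 0-1 EPSCoR participation dummy; a count (e.g., EPSCoR-hired faculty)
    # could be substituted once the template data are validated.
    "epscor":       [1, 1, 1, 1, 0, 0],
    # Dependent variable: percentage of NSF funding going to the jurisdiction.
    "pct_nsf_fund": [0.40, 0.50, 0.30, 0.35, 11.20, 11.50],
    # State-level rollup of university-level Carnegie data.
    "r1_r2_insts":  [2, 2, 1, 1, 18, 18],
    "population_m": [2.8, 2.8, 0.6, 0.6, 37.0, 37.3],
})

# Ordinary least squares with the EPSCoR indicator plus state-level controls.
model = smf.ols("pct_nsf_fund ~ epscor + r1_r2_insts + population_m",
                data=panel).fit()
print(model.summary())
```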

  • That section on proposed data analysis also asserts two roles for the proposed survey: first, that it would be an opportunity for “jurisdictions” to “validate and correct data” that would enter these models and, second, that it would “collect information regarding other outcome variables of interest.” The SS asserts, and the survey instrument certainly corroborates, that the survey isn’t collecting any quantitative variable (in the sense that respondents aren’t providing any count/expense/etc. data). Accordingly, (1) it’s not clear that the survey can play the “validate and correct” role and (2) it would seem that it might lead to the construction of some other dummy variables, but that is about all. Is that correct?



A: As described in Supporting Statement A, Sections A.4 (page 14) and A.11, a template containing each state’s material collected from progress reports and proposals will be provided to the awardees. The template will include:

  1. The list of individuals identified as faculty members having been hired using EPSCoR funds

  2. The list of graduate students and postdoctoral researchers having participated in EPSCoR

  3. The list of publications attributed to RII awards

  4. The list of patents, patent applications, and licenses associated with EPSCoR funding.

  5. The list of new degree programs created as a result of EPSCoR



The purpose of providing the information in this fashion is to allow each EPSCoR state to edit or otherwise update a pre-assembled list rather than to ask each state to provide data de novo (although states that maintain their own databases could simply export them into the template if they prefer). The template thus does allow the states to validate and correct data.

  • While the templates do not themselves ask for counts/year or counts/iteration, they do provide the basis for collecting quantifiable information that could be incorporated into the regression analyses mentioned above. Examples include:

    • Using number of faculty hired during an EPSCoR award rather than a 0-1 EPSCoR participation variable

    • Using number of graduate students participating in EPSCoR rather than a 0-1 EPSCoR participation variable

    • Using number of EPSCoR-supported publications rather than a 0-1 EPSCoR participant variable.



  • The memo’s stated rationale for excluding several EPSCoR awardee types should be incorporated into the SS, and the statement should be made strongly that this evaluation is primarily intended to describe/analyze the experiences of RII Track-1 grantees.



A: Changes to Supporting Statement A (especially in A.1 and A.2) clarify the point that the RII awards, specifically the Track-1 awards, are the focus of the evaluation.

  • The memo restates, but does not appear to answer, the point about including the OMB control number and PRA burden statement on materials viewed by the respondents. That information does not appear on the questionnaire and interview protocol in the original package, and certainly does not appear in the contact emails and such included in this batch of new documents.



A: See updated Attachments.
