NSF INCLUDES Network Member Survey

OMB: 3145-0256

Supporting Statement Part B. STATISTICAL METHODS

B.1. Potential Respondent Universe

There are 4,750 National Network members (i.e., all those who have registered for the Online Community, includesnetwork.org, at least six months prior to the survey administration). Of those 4,750 members, 797 have adjusted their subscription settings to “no email” and will not be contacted to participate in the survey. The potential respondent universe is therefore 3,953.


The National Network comprises individuals who are interested in or working directly to broaden participation in STEM. Some of these individuals are NSF INCLUDES grantees; others have received other NSF awards or pursue broadening participation in STEM with support from other sources, including grants from federal, state, philanthropic, or other entities. Some are themselves representatives of these various types of funders, such as program officers at NSF, other Federal agencies, and private foundations. These groups interact in and with the National Network in different ways, and the Coordination Hub provides them different types of supports. As such, the National Network survey has skip logic for three groups: one set of questions for all National Network members, regardless of role (Group C); additional questions for members who are currently implementing NSF INCLUDES-funded projects (Group B); and a few additional questions for those implementing NSF INCLUDES Alliance or Collaborative Change Consortia projects (Group A).


Participation in the National Network is required of NSF INCLUDES-funded projects and strongly encouraged for other related projects/organizations that work toward similar goals. Members are individuals; there are no organizational memberships in the National Network (see Part A.1 for description of different groups of National Network members).


There is variation across National Network members in level and form of participation: some members engage actively in the Online Community and attend in-person events, while less engaged individuals may only read listserv emails from the Coordination Hub. Providing different entry points and opportunities, so that each member can access the experience they want from the National Network regardless of funding or level of interest in active participation, has been an intentional choice by the Coordination Hub. The National Network was designed with a low barrier to entry (e.g., asking for minimal information from members when they register for the Online Community) and is meant to provide different kinds of supports depending on members’ needs and interests. As a result of this design decision, the Coordination Hub has some information on who makes up the National Network (whether a member is funded, what type of work they engage in, and how active they are), but that information does not cover everyone who is considered part of the National Network. Moving forward, the Coordination Hub will undertake efforts to gather additional membership information to create a better picture of the full National Network.


Prior administrations of the survey, beginning in 2021, resulted in an overall response rate of between 21 and 22% each year. However, given the wide variation in levels of engagement across the full National Network membership, analyses conducted over the last several years provide evidence that the response rate is higher among more active members: the Coordination Hub estimates that among members who are more active in the Online Community, response rates are around 40-50%.
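
Applied to the potential respondent universe of 3,953, these figures imply a rough expected yield; as a simple illustration:

    0.22 × 3,953 ≈ 870 expected responses overall

This back-of-the-envelope figure is illustrative only; actual yield will depend on the engagement mix of the membership at the time of administration.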


B.2. Procedures for the Collection of Information

B.2.1 Overview of National Network Survey

The National Network Survey is intended to assess the health, development, expansion, and impact of the National Network so that the Coordination Hub can provide responsive services and supports. The Coordination Hub has included a number of program implementation and demographic questions to help understand whether respondents are similar to the National Network population with respect to grantee/non-grantee status, organization type, type of grant, geography, etc. This is a cross-sectional, not a longitudinal, survey; respondents’ information will not be linked over time.


The electronic survey instrument consists of 29 questions to National Network members about (a) their engagement with and the value they derive from the National Network, and (b) how they understand and operationalize collaborative infrastructure and systems change in their work. Not all questions will be asked of each respondent; based on level and type of NSF INCLUDES funding, skip logic routes respondents into the three groups described in Section B.1. Group A (those implementing NSF INCLUDES Alliance or Collaborative Change Consortia projects) will receive all 29 questions; Group B (those implementing other NSF INCLUDES-funded projects) will receive 28 questions; and Group C (all other National Network members, regardless of role) will receive 22 questions.
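
As an illustration of this routing, the sketch below expresses the three-group skip logic in Python. The question identifiers and the split of grantee-specific items are hypothetical placeholders; the fielded survey implements this branching natively in Qualtrics.

    # Illustrative sketch of the three-group skip logic. Question IDs and the
    # assignment of grantee-specific items are hypothetical placeholders.
    ALL_QUESTIONS = [f"Q{i}" for i in range(1, 30)]  # 29 items in total

    GROUP_C = ALL_QUESTIONS[:22]   # all National Network members
    GROUP_B = ALL_QUESTIONS[:28]   # members implementing NSF INCLUDES-funded projects
    GROUP_A = ALL_QUESTIONS        # Alliance / Collaborative Change Consortia projects

    def questions_for(group: str) -> list[str]:
        """Return the question set shown to a respondent in the given group."""
        return {"A": GROUP_A, "B": GROUP_B, "C": GROUP_C}[group]

    assert len(questions_for("A")) == 29
    assert len(questions_for("B")) == 28
    assert len(questions_for("C")) == 22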


B.2.2 Survey Administration Procedures

The Coordination Hub will administer the survey annually to all National Network members (i.e., all those who have registered for the Online Community, includesnetwork.org, at least six months prior to the survey administration and have not adjusted their subscription settings to “no email”). The survey will be administered electronically, with skip logic to direct respondents to the questions most relevant to them, such as questions specifically for funded members (see Sections A.12, B.1, and B.2.1).


Approximately two weeks prior to launching the survey, the Coordination Hub’s evaluation team will post an announcement in the Online Community stating that the survey is forthcoming and that National Network members should expect an email containing their individual survey link, a description of the survey’s purpose, and its potential benefits. In collaboration with the Coordination Hub, the evaluation team will email each potential respondent and launch the survey with a four-week response window.
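
For illustration, the sketch below derives the full administration timeline from these procedures, using an arbitrary placeholder launch date; it also reflects the three weekly reminders described in Section B.3.

    from datetime import date, timedelta

    # Illustrative administration timeline; the launch date is a placeholder.
    launch = date(2025, 3, 3)

    announcement = launch - timedelta(weeks=2)                    # pre-announcement in the Online Community
    reminders = [launch + timedelta(weeks=k) for k in (1, 2, 3)]  # weekly follow-ups (see Section B.3)
    close = launch + timedelta(weeks=4)                           # end of the four-week response window

    print(f"Announcement: {announcement}")
    print(f"Launch:       {launch}")
    for i, reminder in enumerate(reminders, start=1):
        print(f"Reminder {i}:   {reminder}")
    print(f"Close:        {close}")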


B.3. Methods to Maximize Response Rate and Minimize Non-Response Rate

Based on prior survey administrations, the Coordination Hub anticipates an overall response rate of approximately 22% from all National Network members (i.e., all those who have registered for the Online Community, includesnetwork.org, at least six months prior to the survey administration and have not adjusted their subscription settings to “no email”). While response rates of 30-40% are typical for online surveys (see “Response rates of online surveys in published research: A meta-analysis”), rates tend to be lower for large and diffuse communities such as the National Network. The Coordination Hub and its evaluation team are implementing the strategies detailed below to increase this rate and to understand representativeness and potential non-response bias.


  1. In addition to the benefits of using Qualtrics outlined in Section A.3 and the efforts to streamline and shorten the survey outlined in Section A.1, respondents will have four weeks to complete the survey. After the initial invitation, the evaluation team will follow up weekly for three weeks (a total of three reminders) to encourage non-respondents to complete the survey. This follow-up will occur through direct email to National Network members as well as general reminders in the Online Community. Copies of the pre-survey, initial survey, and follow-up survey email scripts are included in this submission.


  2. As noted in Section B.1, engagement varies across members, and prior analyses provide evidence that response rates vary accordingly; this variation is expected to continue in future administrations. Because membership has historically been structured with a low barrier to entry, the Coordination Hub has limited membership information with which to establish accurate response rates for sub-groups relative to the full National Network. However, prior analyses give the Coordination Hub reason to believe that response rates are higher for members who are more actively engaged in the National Network; one analysis points to a response rate closer to 40-50% for those who are active in the Online Community.


  3. The Coordination Hub’s evaluation team will compare response rates across different groups (e.g., funded vs. un-funded members, different member engagement groups, members at minority-serving institutions (MSIs), etc.) against known information about the representation of these groups in the National Network, to understand potential gaps in the data or limitations of the survey findings. If non-response from certain sub-groups of members is significant, the Coordination Hub’s evaluators may weight responses from those sub-groups to produce more representative estimates (see the weighting sketch after this list).


  4. The Coordination Hub and its evaluation team will continue to pursue procedural improvements such as (1) collecting and maintaining standard demographic information for each National Network member, (2) defining and assessing active National Network membership, both to understand the response rate of active members and to potentially create a more targeted list for future survey administrations, and (3) developing and implementing a comprehensive plan to enhance communications with National Network members.
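
If weighting proves necessary (item 3 above), one standard approach is cell weighting, in which each sub-group's weight is the ratio of its population share to its respondent share. The sketch below illustrates the computation; all counts are hypothetical placeholders, not actual National Network figures.

    # Cell-weighting sketch for non-response adjustment. All counts are
    # hypothetical placeholders, not actual National Network membership data.
    population_counts = {"funded": 1200, "unfunded": 2753}  # known membership by group
    respondent_counts = {"funded": 450, "unfunded": 420}    # completed surveys by group

    N = sum(population_counts.values())
    n = sum(respondent_counts.values())

    # weight = (population share of group) / (respondent share of group)
    weights = {
        group: (population_counts[group] / N) / (respondent_counts[group] / n)
        for group in population_counts
    }

    for group, weight in weights.items():
        print(f"{group}: weight = {weight:.3f}")
    # Applying these weights makes the weighted respondent totals reproduce
    # the population proportions for each group.

More refined adjustments (e.g., raking across several member characteristics at once) follow the same underlying principle.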


B.4. Test of Procedures or Methods to Be Undertaken

B.4.1 Data Preparation

Survey responses will be collected using the online tool Qualtrics. After survey administration is complete and the Qualtrics survey is closed, the Coordination Hub’s evaluation team will begin data preparation by cleaning responses and creating sub-groups of respondents for comparison: grouping members by type of INCLUDES funding (i.e., into Groups A and B), by the focus populations of their broadening participation (BP) efforts, by the level of the system at which members work, and so on. These groupings facilitate clearer analysis and larger comparison groups for means comparisons.
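
The sketch below illustrates the cleaning and sub-grouping steps on a small, hypothetical extract; the column names and values are placeholders, since the actual Qualtrics export fields will differ.

    import pandas as pd

    # Hypothetical survey extract; column names and values are placeholders.
    responses = pd.DataFrame({
        "respondent_id": [101, 102, 103, 104],
        "funding_type": ["Alliance", "Other INCLUDES", "None", "Alliance"],
        "focus_population": ["K-12", "Undergraduate", "K-12", "Faculty"],
    })

    # Cleaning: drop duplicate and empty records.
    clean = responses.drop_duplicates(subset="respondent_id").dropna().copy()

    # Map funding type onto the survey's skip-logic groups (A/B/C).
    group_map = {"Alliance": "A", "Other INCLUDES": "B", "None": "C"}
    clean["group"] = clean["funding_type"].map(group_map)

    # Sub-group sizes by group and focus population, for later comparisons.
    print(clean.groupby(["group", "focus_population"]).size())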

B.4.2. Quantitative Descriptive and Correlational Analyses of Likert and Checkbox Survey Items

The Coordination Hub’s evaluation team will conduct initial descriptive analysis of survey data to understand response patterns on the items and scales for the whole sample and for specific subgroups. These descriptive analyses will examine means, standard deviations, and frequencies for each item and scale for the overall sample. The evaluators will also conduct a series of cross-tabulations and statistical tests to examine survey responses for different categories of respondents, such as by type of grantee, participation level, and organizational type. The evaluators will present these results in a series of data tables and graphical representations.
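
For illustration, the sketch below runs these descriptive and cross-tabulation steps on a hypothetical Likert item; the variable names, categories, and values are placeholders.

    import pandas as pd
    from scipy.stats import chi2_contingency

    # Hypothetical responses to one 5-point Likert item; values are placeholders.
    df = pd.DataFrame({
        "grantee_type": ["Alliance", "Alliance", "Other", "Other", "None", "None"],
        "value_rating": [5, 4, 4, 3, 2, 4],
    })

    # Descriptives for the overall sample: mean, standard deviation, frequencies.
    print(df["value_rating"].agg(["mean", "std"]))
    print(df["value_rating"].value_counts().sort_index())

    # Cross-tabulation by respondent category, with a chi-square test of independence.
    table = pd.crosstab(df["grantee_type"], df["value_rating"])
    chi2, p, dof, expected = chi2_contingency(table)
    print(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

With real data, expected cell counts should be checked before relying on the chi-square result, and an exact alternative used where counts are sparse.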


B.4.3. Qualitative Coding of Open-Ended Survey Items

To prepare open-ended survey data for analysis, the Coordination Hub’s evaluation team will create a codebook guided by the survey questions. The codebook will contain codes identified a priori to align with the question content, as well as emergent codes that capture important themes arising from the data. Each code will include a brief definition and examples. The evaluators will refine the codebook based on its application to the first 20 responses coded, a process that typically provides enough evidence as to whether the codebook applies well to the range of responses. To ensure the coding is rigorous and reliable, the evaluation team will conduct inter-rater reliability training with all members of the team. During the training, the team will engage in iterative rounds of coding the same set of responses, using discussion to resolve differences. Coders will be deemed reliable when they reach 80% agreement with the other team members. Themes will be summarized and presented both narratively and graphically.
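
A minimal sketch of the 80% agreement check between two coders follows; the code labels are hypothetical placeholders.

    # Percent agreement between two coders on the same set of responses.
    # The code labels are hypothetical placeholders.
    coder_1 = ["barrier", "support", "support", "barrier", "other", "support"]
    coder_2 = ["barrier", "support", "barrier", "barrier", "other", "support"]

    agreements = sum(a == b for a, b in zip(coder_1, coder_2))
    agreement_rate = agreements / len(coder_1)

    print(f"Agreement: {agreement_rate:.0%}")  # 83% in this example
    print("Reliable" if agreement_rate >= 0.80 else "Additional training needed")

Simple percent agreement does not correct for chance agreement; a chance-corrected statistic such as Cohen's kappa could supplement this check if desired.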


B.5. Names and Telephone Numbers of Consultants

Alice Opalka, Coordination Hub Co-Lead Evaluator, (206) 239-8919, [email protected]

Cecilia Borges-Farfan, Coordination Hub Co-Lead Evaluator, (425) 773-3502, [email protected]

The NSF Eddie Bernice Johnson INCLUDES Coordination Hub: [email protected]

