Evaluation of the Robert Noyce Teacher Scholarship Program



SRI Noyce Evaluation - 30-day OMB notice

Supporting Statement Part B


Supporting Statement Part B: Collection of Information Employing Statistical Methods

B.1. Respondent Universe and Selection Methods

The study plan includes interviews and focus groups to capture the diverse perspectives of stakeholders across all four Noyce funding tracks: Scholarships and Stipends, Teaching Fellowships, Master Teaching Fellowships, and Noyce Funded Research. As such, the respondent universe for this study includes all Noyce grantees funded between 2013 and 2023. Data collection activities may include participants from all four tracks (see the table below).


Qualitative data collection. The qualitative data collection includes interviews and focus groups that will support the development of case studies and network analysis. We plan to conduct case studies with up to 10 Noyce hub grantees. The case studies will include the awardee institution, its partner organizations, Noyce project staff, project participants, and partner schools. Our primary data collection method will be interviews with key stakeholders, including the Noyce project Principal Investigator (PI), co-PI(s), program staff, and teachers. These data will also support a Noyce network analysis to understand how Noyce-funded individuals and organizations are connected and the quality of their interactions. We also plan to conduct an additional 15 interviews with Noyce principal investigators and to involve an additional 90 teachers and scholars in focus groups.

The proposed strata for this project are funding year and Noyce funding track, as shown in the table below, which summarizes grant counts by year and track.

 

Award Year     Track 1   Track 2   Track 3   Track 4   Capacity Building   Total
2013 Grants    32        8         0         0         6                   46
2014 Grants    41        9         0         0         13                  63
2015 Grants    35        5         0         15        8                   58
2016 Grants    38        5         3         23        2                   71
2017 Grants    69        2         3         9         11                  94
2018 Grants    43        0         8         16        11                  78
2019 Grants    46        3         4         4         10                  64
2020 Grants    34        1         7         29        17                  88
2021 Grants    30        2         3         35        17                  87
2022 Grants    40        4         5         30        24                  103
2023 Grants    38        1         10        28        12                  89



B.2. Procedures for the Collection of Information

Qualitative methods

The role of the qualitative methods in the evaluation is to gain insight into individuals’ experiences in Noyce-funded projects, perceptions of benefits and outcomes that may not be captured in other kinds of data, and relationships and interactions with other individuals or institutions that inform sensemaking and decision-making within projects, organizations, and networks. These insights will help us articulate explanations for mechanisms and trends in the quantitative findings. They will also provide illustrative examples in the form of summaries of individuals’ experiences, descriptions of projects, and quotes. Furthermore, qualitative analysis is critical for foregrounding people’s experiences, perspectives, and voices so that broad findings statements do not overshadow participants’ lived realities.

Below, we describe sampling, data collection, analysis, and outcomes for the case studies, qualitative network analysis (QNA), and interviews and focus groups. The table below summarizes the anticipated outcomes and data sources for these methods.

Table 5. Anticipated outcomes and data sources for qualitative methods

Outcomes

  • Explanations for impacts on recruitment
  • Examples of strategies and activities that have been effective
  • Explanations for impacts on teacher knowledge of instruction informed by understanding and being responsive to individual students’ experiences and needs
  • Explanations for impacts on the creation of opportunities and leveraging the skills and experiences different kinds of individuals bring
  • Explanations for impacts on retention
  • Examples of strategies and activities that have been effective for retention
  • Explanations for and descriptions of structures and/or systems that establish and support social networks that effectively enhance STEM teacher preparation and professional development regionally
  • Explanations for quantitative findings on impacts of capacity building grants on PIs and institutions
  • Descriptions of how capacity building grants benefit PIs and institutions
  • Explanations for quantitative findings on impacts of capacity building grants on the creation of opportunities and leveraging the skills and experiences different kinds of individuals bring
  • Descriptions of how capacity building grants enhance the creation of opportunities and leveraging the skills and experiences different kinds of individuals bring
  • Explanations for and illustrations of the more common challenges encountered by awardees, with particular attention to challenges to the creation of opportunities and leveraging the skills and experiences different kinds of individuals bring
  • Explanations for and illustrations of both common and unusual or innovative solutions to challenges developed by awardees, with particular attention to the creation of opportunities and leveraging the skills and experiences different kinds of individuals bring
  • Case study descriptions of institutionalization strategies and practices
  • Descriptions of sustainability indicators
  • Descriptions of sustainability outcomes
  • Explanations for sustainability outcomes
  • Descriptions of effective institutionalization strategies

Data

  • Annual & final reports
  • Case studies
  • Interviews
  • Focus groups

Case Studies

SRI will conduct up to 10 case studies, each focused on a Noyce hub (an awardee institution, its partner organizations, Noyce project staff, project participants, and partner schools). We will conduct semi-structured interviews with the Noyce project PI and co-PIs, program staff, participants, and others identified through the initial set of interviews as important to the project or hub or as having important information about them.

Data and Sampling

We will gather data from 10 Noyce hubs. Data collection will primarily involve interviews with key stakeholders, including the Noyce project Principal Investigator (PI), co-PI(s), program staff, and teachers. Additionally, we will collect relevant award artifacts to supplement our understanding of each project's context and implementation. The interviews will be conducted using semi-structured interview protocols with questions and follow-up prompts designed to address the relevant evaluation questions.

To select the 10 cases, we will use a maximum variation sampling approach (Flyvbjerg, 2006) to capture a wide range of perspectives and experiences. The approach involves identifying Noyce hubs representing a range of key characteristics, including:

  • Project Tracks

  • Project goals and strategies focused on the creation of opportunities and leveraging individuals’ skills and experiences

  • Impactful strategies regarding recruitment; teacher knowledge of student-centered instruction and pedagogical knowledge; the creation of opportunities and leveraging individuals’ skills and experiences; and retention, including the roles of scholarships, scholarship repayments, and salary supplements

  • Prior capacity building project grants

  • Project and PI challenges and solutions

  • Geographical locations

The data informing the selection process will come from the analysis of annual and final reports. Data pertinent to the defining characteristics of potential cases will be organized in a structured database that captures the key characteristics listed above. Each potential case will be profiled on these variables, creating a matrix that allows visual comparison and contrast of cases across the identified characteristics. This structured approach ensures a transparent and replicable selection process, enabling the research team to identify the most varied cases efficiently. The database will be reviewed and updated throughout the selection phase to reflect new information, ensuring that the final selection of cases encompasses the maximum variation desired for the study. The organized data will serve not only as the foundation for case selection but also as a resource for understanding the context and nuances of each selected case, supporting deeper analysis and richer insight into the projects under study.
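To make the selection procedure concrete, the brief Python sketch below illustrates one way a characteristics matrix could support maximum variation selection. The hub names, fields, and values are hypothetical placeholders, and the greedy coverage rule is only one simple possibility, not the evaluation's prescribed method.

# Illustrative sketch only: hypothetical hub names, fields, and values.
# Each row mirrors an entry in the structured database of key characteristics.
candidates = [
    {"hub": "Hub A", "track": "Track 1", "prior_capacity_grant": True,  "region": "Southeast"},
    {"hub": "Hub B", "track": "Track 2", "prior_capacity_grant": False, "region": "Midwest"},
    {"hub": "Hub C", "track": "Track 4", "prior_capacity_grant": True,  "region": "West"},
    {"hub": "Hub D", "track": "Track 1", "prior_capacity_grant": False, "region": "Northeast"},
]

def select_max_variation(pool, n):
    """Greedily pick n candidates, each adding the most characteristic
    values not yet represented in the selected set."""
    selected, covered = [], set()
    remaining = list(pool)
    while remaining and len(selected) < n:
        best = max(
            remaining,
            key=lambda c: sum((k, v) not in covered for k, v in c.items() if k != "hub"),
        )
        selected.append(best)
        covered.update((k, v) for k, v in best.items() if k != "hub")
        remaining.remove(best)
    return selected

for case in select_max_variation(candidates, 3):
    print(case["hub"], case["track"], case["region"])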

We will interview up to 7 adults per hub, for a maximum of 70 individuals across the 10 hubs. Interviewees will include the PI and co-PI(s), Noyce project staff, teachers, and other adults suggested by the PI. All participants will complete an informed consent form. We will interview each individual once or twice (twice only if additional information or clarification is needed), for up to 60 minutes, using a semi-structured interview protocol. Interviews will be conducted via Zoom or a similar video conferencing tool, or by phone. We will audio record interviews of participants who consent to be audio recorded. Audio recordings of interviews will be transcribed using a professional service. For participants who decline to be audio recorded, we will take notes.

Analysis & Outcomes

Qualitative data analysis will be conducted using MAXQDA software (VERBI Software, 2020), facilitating systematic coding and organization of interview transcripts and other textual data (Saldaña, 2016). We will conduct both within-case and across-case qualitative analysis to identify patterns, themes, and variations across the different Noyce projects (Miles et al., 2020). Through this analysis, we aim to provide detailed explanations about the mechanisms underlying project implementation and outcomes, drawing connections with quantitative and other qualitative findings (Yin, 2017). We will also use data from the case studies (interviews and artifacts) to analyze the social networks, network dynamics, and implications for effective preparation of STEM teachers in the Noyce hubs. Case study interview protocols will include questions and probes related to individuals and organizations that interviewees are connected to and the types and qualities of interactions those connections involve (e.g., receiving information, asking for help, social support).
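As an illustration of how interview responses could be structured for the qualitative network analysis, the sketch below records each reported connection as an edge with an interaction type and perceived quality and then tallies interaction types for a hub; the names, interaction types, and ratings are hypothetical examples, not actual data.

# Illustrative sketch only: hypothetical respondents, ties, and ratings.
from collections import Counter

reported_ties = [
    {"source": "PI", "target": "Partner district liaison",
     "interaction": "receiving information", "quality": "high"},
    {"source": "Teacher 1", "target": "Program staff",
     "interaction": "asking for help", "quality": "medium"},
    {"source": "Teacher 2", "target": "Teacher 1",
     "interaction": "social support", "quality": "high"},
]

# Tally how often each interaction type appears among the hub's reported ties.
interaction_counts = Counter(tie["interaction"] for tie in reported_ties)
for interaction, count in interaction_counts.most_common():
    print(f"{interaction}: {count}")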

The case study methodology will culminate in the development of comprehensive case profiles and a summary of cross-case findings, elucidating the unique characteristics and commonalities across the studied projects regarding the anticipated outcomes listed in Table 5 above. To enrich our findings and provide context, we will incorporate quotes and stories from the interviews to illustrate key insights and experiences. These outcomes will serve to triangulate with and complement quantitative data, offering a nuanced understanding of the impact and effectiveness of Noyce-funded projects.

Interviews & Focus Groups

To capture the rich insights and perspectives of stakeholders involved in Noyce-funded projects, we will conduct a series of interviews and focus groups. Our methodology prioritizes meaningful engagement with awardees, scholars, and teachers, ensuring a comprehensive understanding of project dynamics and outcomes.

Data and Sampling

We plan to conduct a total of up to 15 interviews with PIs and 15 focus groups with scholars and teachers. To select participants, we will, as in the case studies, use a maximum variation sampling approach (Flyvbjerg, 2006) to capture diverse perspectives and experiences across multiple project contexts. Our goal is to conduct interviews with PIs and co-PIs from approximately 15 Noyce awardees, and one focus group interview with scholars and teachers from each of those awardees. We will recruit a maximum of 6 scholars and teachers for each focus group, for a maximum total of 105 interview and focus group participants. We will identify awardees with projects representing a range of characteristics, including:

  • Project Tracks

  • Project goals and strategies focused on the creation of opportunities and leveraging individuals’ skills and experiences

  • Impactful strategies regarding recruitment; teacher knowledge of student-centered instruction and pedagogical knowledge; the creation of opportunities and leveraging individuals’ skills and experiences; and retention, including the roles of scholarships, scholarship repayments, and salary supplements

  • Prior capacity building project grants

  • Project and PI challenges and solutions

  • Geographical locations

The data informing the selection process will come from the analysis of annual and final reports. Data pertinent to the defining characteristics of potential awardees will be organized in a structured database that captures the key characteristics listed above. Each potential awardee will be profiled on these characteristics, creating a matrix that allows visual comparison and contrast of awardees across the identified characteristics. This structured approach ensures a transparent and replicable selection process, enabling the research team to identify the most varied awardees efficiently. The database will be reviewed and updated throughout the selection phase to reflect new information, ensuring that the final selection of awardees encompasses the maximum variation desired for the study. The organized data will serve not only as the foundation for awardee selection but also as a resource for understanding the context and nuances of each selected awardee, supporting deeper analysis and richer insight into the projects under study.

We will identify 15 Noyce awardees as primary candidates, as well as 15 awardees with similar characteristics as secondary candidates. We will first attempt to recruit the PIs of the 15 primary candidate awardees. If a PI declines to participate, we will reach out to the PI of a secondary awardee with similar characteristics, until we have reached our total of 15 awardees.
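The recruitment fallback amounts to pairing each primary candidate with a similar secondary candidate and substituting only when the primary declines, as in the minimal sketch below (awardee names and declines are hypothetical; in practice, outreach continues until 15 awardees are recruited).

# Illustrative sketch only: hypothetical awardees and declines.
primary_candidates = ["Awardee P1", "Awardee P2", "Awardee P3"]
secondary_candidates = ["Awardee S1", "Awardee S2", "Awardee S3"]  # matched on similar characteristics
declined = {"Awardee P2"}  # PIs who do not agree to participate

recruited = []
for primary_awardee, backup_awardee in zip(primary_candidates, secondary_candidates):
    if primary_awardee not in declined:
        recruited.append(primary_awardee)
    else:
        # Fall back to the secondary candidate matched to this primary candidate.
        recruited.append(backup_awardee)

print(recruited)  # ['Awardee P1', 'Awardee S2', 'Awardee P3']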

All participants will complete an informed consent form. We will interview each PI once, for up to 60 minutes, using a semi-structured interview protocol. We will also use a semi-structured protocol for the focus groups, which will take up to 60 minutes. Interviews and focus groups will be conducted via Zoom or similar video conference tool. We will audio record interviews of PIs who consent to be audio recorded. All focus groups will be audio recorded. Focus group participants who decline to be audio recorded will not participate in the focus group. Audio recordings of interviews and focus groups will be transcribed using a professional subscription service. For PIs who decline to be audio recorded, we will take notes.

Analysis & Outcomes

Qualitative coding will be employed to systematically analyze interview and focus group transcripts using MAXQDA software (Saldaña, 2016; VERBI Software, 2020). This coding process will enable the identification and organization of key themes and insights emerging from the data (Miles et al., 2020). Thematic analysis will complement the coding process, allowing for a nuanced exploration of recurring patterns, connections, and variations within the qualitative data.
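As a simple illustration of how coded segments can be organized for thematic analysis (independent of MAXQDA, whose project files are not reproduced here), the sketch below groups hypothetical coded excerpts by code and reports code frequencies; the codes and excerpts are invented placeholders.

# Illustrative sketch only: hypothetical codes and excerpts, not actual data.
from collections import defaultdict

coded_segments = [
    ("recruitment", "The stipend made it possible for me to switch into teaching."),
    ("retention", "Monthly cohort meetings kept me connected to other Noyce teachers."),
    ("recruitment", "Our partner district helped us identify strong STEM majors."),
]

segments_by_code = defaultdict(list)
for code, excerpt in coded_segments:
    segments_by_code[code].append(excerpt)

# Report how many excerpts were tagged with each code.
for code, excerpts in sorted(segments_by_code.items()):
    print(f"{code}: {len(excerpts)} excerpt(s)")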

The outcomes of the interview and focus group analysis will include detailed descriptions of identified themes, providing valuable context and depth to triangulate with and illustrate case study and quantitative findings (see Table 5 for a list of anticipated outcomes). Furthermore, quotes and stories extracted from the interviews and focus groups will be incorporated into reports to vividly illustrate key findings and amplify the voices of stakeholders. These qualitative narratives will enhance the overall richness and depth of our study findings.


B.3. Methods to Maximize Response Rates and the Issue of Nonresponse

We will identify 10 Noyce hubs as primary candidates for the case studies, as well as 10 hubs with similar characteristics as secondary candidates. We will first attempt to recruit the 10 primary candidates for the study. For any primary candidate hub that declines to participate, we will reach out to a secondary candidate hub with similar characteristics, until we have reached our total of 10 hubs.

To recruit participants, we will email hub PIs using an NSF-approved email template that describes the study and what participation will entail. If a PI agrees to participate, we will conduct an initial, brief interview to gather contextual information and to identify key hub participants the PI recommends we interview. We will then ask PIs to invite potential participants using an NSF-approved email template that we will provide.


To recruit interview and focus group participants, we will email PIs using an NSF-approved email template that describes the study and what participation will entail. If a PI agrees to participate, we will conduct an initial, brief interview (up to 15 minutes) to gather contextual information and to identify scholars and teachers the PI recommends for participation in the focus group. We will then ask PIs to invite potential participants using an NSF-approved email template that we will provide.


B.4. Tests of Procedures

Interview Protocol Development 

To develop the interview protocols, we first generated an Interview Question Bank, a spreadsheet of interview questions aligned with the evaluation questions and key constructs (as described in the Evaluation Conceptual Framework) for different subgroups of participants (PIs and co-PIs, teachers, scholars, and project staff). The purpose of the question bank was to facilitate protocol customization, that is, the process of determining which questions to include in each interview protocol. This evaluation covers a wide range of topics, yet we want to keep respondent burden to a minimum. Therefore, to optimize our time with each interviewee, we included only essential questions in each protocol. The interview questions address eight outcome categories based on the evaluation conceptual framework, study plan, and data collection and analysis plan (see deliverables 2.1, 2.2, and 2.3, respectively). For each outcome category, we developed one or more research constructs. For each construct, we developed one or more interview questions and determined which interviewee type or types each question was relevant to. In some cases, we also created additional versions of a question, varying its focus or phrasing to make it relevant to different interviewee types. From the Interview Question Bank, we derived three interview protocols: one for Noyce PIs and co-PIs, one for teachers and scholars, and one for program staff.
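To show how the Interview Question Bank supports protocol customization, the sketch below filters a small, hypothetical bank by interviewee type; the constructs, questions, and respondent labels are placeholders rather than items from the actual bank.

# Illustrative sketch only: hypothetical constructs, questions, and respondent types.
question_bank = [
    {"construct": "Recruitment strategies",
     "question": "How does your project recruit scholars?",
     "respondents": {"PI/co-PI", "project staff"}},
    {"construct": "Retention",
     "question": "What has helped you stay in teaching?",
     "respondents": {"teacher/scholar"}},
    {"construct": "Network connections",
     "question": "Who do you turn to for help with the project?",
     "respondents": {"PI/co-PI", "project staff", "teacher/scholar"}},
]

def build_protocol(bank, interviewee_type):
    """Return the questions in the bank relevant to one interviewee type."""
    return [item["question"] for item in bank if interviewee_type in item["respondents"]]

print(build_protocol(question_bank, "teacher/scholar"))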

Interview Pilot 

We piloted the protocols to collect information about whether questions were relevant and meaningful to interviewees, whether questions elicited the kinds of responses intended, and what sequence of questions made the most sense for different types of interviewees. Two SRI evaluators piloted the protocols at the 2024 Noyce Summit on July 15-17, 2024. To recruit participants, the evaluators invited conference attendees during the Town Hall meeting, the Scholar resource fair, and SRI’s poster presentations to participate in one-on-one interviews with the evaluation team. Attendees could report their interest via a Google Form accessible through a QR code and participate either during the conference or via Zoom afterward. Five conference attendees participated in interviews during the conference: four PIs and one scholar. We emailed three additional attendees who had indicated interest, inviting them to participate in pilot interviews via Zoom after the conference. We have, to date, interviewed one of them, bringing the total number of pilot interviews to six (five PIs and one teacher/scholar). We obtained participants’ consent using a form approved by SRI’s Institutional Review Board, which also approved the Interview Question Bank.

We asked pilot interviewees to speak with us for 30 minutes, knowing that the full protocols would take longer but being considerate of participants’ time during a busy conference. We also planned to distribute different questions across participants so that, collectively, the pilot interviews would cover all protocol questions. In practice, participants were generous with their time and feedback, and the pilot interviews lasted approximately 60 minutes.

Following the Noyce Summit, the evaluation team met to discuss interviewers’ experiences and what modifications to make to the protocols. Overall, the evaluators found that the protocols worked well and the questions generated meaningful information across the key topics. Data collected through this pilot exercise confirmed our assumption that protocols need to be customized based on information about the awardee, project, and other contextual factors, as there was significant variation across projects in focus and strategies. Specific updates to the protocols included:

  • Changing the expected time requirements for some questions. For example, the opening questions about the context of the interviewee’s campus and the goals of their Noyce award required more time to answer than originally expected.

  • Making some questions or prompts more specific. For example, changing the prompt “Tell me about your role and background” to “Tell me a little bit about where you are in your education and how you got the Noyce scholarship.”

  • Updating words and phrases to be more relevant to participants (e.g., using the term “learning activities” instead of “professional development”).

  • Reordering topics to create a better conversational flow and to improve how topics build on one another. For example, moving up the Network Connections & Interactions topic, since those questions provide additional background and context about the Noyce project, which helps interviewers tailor questions and prompts in subsequent sections.



B.5. Consultants

Not applicable.



  
