Evaluation of the Robert Noyce Teacher Scholarship Program

Responses to OMB Informal Review of NSF's Evaluation of the Robert Noyce Teacher Scholarship Program

OMB: 3145-0217

April 2011


Response to OMB Questions: NSF Robert Noyce Teacher Scholarship Program


  1. Research Question #2:  Deals primarily with perceptions of the Noyce program and its participants by people inside and outside the program.  We do have a slight concern with this, given that in the past NSF has proposed using such subjective questions to ostensibly measure impact and has rarely tested these questions for reliability or validity.  We would encourage pretesting and appropriate interpretations.


Perceptions and Impact. Data gathered under Research Question 2 about Noyce stakeholders’ perceptions of the Noyce Program will be used solely for descriptive purposes. This information will provide additional context to help explain the impacts identified under Research Question 5, which examines the impacts of the receipt of a Noyce award on the IHEs’ production of certified STEM teachers who take teaching jobs in high-need districts and on those teachers’ persistence in teaching.


Activities to Ensure Reliability and Validity. This study has taken several steps to ensure the reliability and validity of its surveys.


  • Instrument design was informed by the survey development and analysis conducted under a prior research study. The current study team reviewed the surveys developed and validated in a study completed in spring 2010 by Dr. Frances Lawrenz, University of Minnesota. The Lawrenz study conducted both exploratory and confirmatory factor analysis of key items on the recipient survey.  The current survey builds upon the findings that emerged from the prior study and includes questions related to the specific factors that were found to characterize the Noyce program experiences.1 The prior research also suggests strong support for the internal consistency of the factors, as indicated by Cronbach’s alpha (Liou & Lawrenz, 2009); an illustrative definition of this statistic appears after this list.


  • Wherever possible, the current survey incorporates items from national surveys. The survey draws from sources such as the National Center for Education Statistics’ 2007-08 Schools and Staffing Survey (SASS) and the National Science Foundation’s (NSF) 2006 National Survey of Recent College Graduates (NSRCG). Some items have been adapted from these national instruments to ask for the more specific information needed to understand the scholars’ educational and employment backgrounds and teachers’ perceptions of school climate.


  • Study design documents and survey instruments were reviewed by the study’s Evaluation Advisory Committee (EAC) to ensure that the design and instruments are appropriately tailored to the study’s questions and audiences. The EAC consists of university-based researchers whose expertise focuses on teacher preparation (particularly in STEM areas).


  • Surveys for each of the respondent groups were pilot tested. The study team conducted pilot tests to ensure that items were clear and understandable to respondents and that language was unambiguous. Based on responses to the pilot tests, the study team revised each of the surveys. The study team solicited pilot feedback from the full range of potential respondents (e.g., Principal Investigators, STEM faculty, K-12 principals, and Noyce recipients) and, as a result, clarified ambiguous language and item formats, eliminated items that were not meaningful, and revised response scales.


  • The theory of change informed the development of additional topics that were important to measure but were not already assessed by prior survey items. To ensure that the descriptive data produced from this study will have face validity, items were developed to map to a theory of change model that articulates the linkages between Noyce program components (including program activities for preservice and inservice teachers, STEM faculty involvement, and the service obligations in high-need districts) and the expected outcomes for Noyce recipients, teachers, and students.


  • The study will compare survey results for relevant items to annual monitoring data entered by PIs. Finally, to ensure that items have been properly interpreted, we will compare survey results to data entered into the annual monitoring system about related activities (note that the data entered into the monitoring system are more generic and less individualized than the survey data, which allow respondents to report individual experiences).
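As referenced in the first bullet above, Cronbach’s alpha was the statistic used in the Liou & Lawrenz (2009) analysis to gauge the internal consistency of the survey factors. For reference, the conventional definition of the statistic is shown below; this is the standard formula, not a reproduction of the prior study’s computations.

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
\]

where \(k\) is the number of items composing a factor, \(\sigma^{2}_{Y_i}\) is the variance of item \(i\), and \(\sigma^{2}_{X}\) is the variance of the total (summed) score. Values closer to 1 indicate stronger internal consistency among the items in the factor.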



  2. Research Question #5:  NSF must make sure that this goal is done well, since these are the data that will tell whether or not the basic goal of the program is being achieved.  Without good data from #5, the results from the other goals 1-4 are pretty meaningless. We look forward to the details of this in order to assess whether or not the goals of #5 will be met.


The planned impact analyses will focus on both teacher and student outcomes. Quasi-experimental research design approaches will be used to examine the causal impacts on each of the outcomes described below. The study design and analysis plan were reviewed with the study’s Evaluation Advisory Committee, including expert methodologists. The teacher impact analysis uses a short interrupted time series with comparison units design, which was described in Appendix B of the original submission (August 2010) and is reiterated and revised in the Appendix B appended to this response.


The student impact analyses were not described in the original submission, since they were added more recently through a contract modification. The updated Appendix B included in this package adds a description of the student impact analysis. Both the teacher and student impact analyses are briefly described below.


One of the primary questions that will be addressed by the analysis of teacher outcomes is: How does an institution of higher education’s (IHE’s) receipt of a Noyce grant affect its production of certified or licensed STEM teachers? Our approach to this question seeks to determine whether receipt of a Noyce grant causes IHEs to produce greater numbers of certified STEM teachers than they would have produced if they had not received Noyce grants. Our proposed quasi-experimental approach uses a difference-in-differences design. This approach is also known as a “pre-post with comparison group design” with multiple measurements at pre and post, and is computationally similar to a “short interrupted time series” design.  For additional details, see Appendix B in the attachments.
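To illustrate the logic of the design described above, a simplified difference-in-differences specification is sketched below. This is a minimal illustration only; it assumes a single pre/post contrast, whereas the analysis described in Appendix B uses multiple pre- and post-award measurements and may include additional covariates and year effects.

\[
Y_{it} = \beta_0 + \beta_1\,\text{Noyce}_i + \beta_2\,\text{Post}_t + \beta_3\,(\text{Noyce}_i \times \text{Post}_t) + \varepsilon_{it}
\]

where \(Y_{it}\) is the number of certified STEM teachers produced by IHE \(i\) in year \(t\), \(\text{Noyce}_i\) indicates whether the IHE received a Noyce grant, \(\text{Post}_t\) indicates years after the award, and \(\beta_3\) is the impact estimate: the change in production among Noyce grantees relative to the change among comparison IHEs over the same period.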



For student outcomes, the study will conduct a pilot study that examines the impact of being taught by a Noyce-supported teacher on students’ math (or science) achievement scores.2 This approach essentially tests the effect of a classroom intervention, specifically being taught by a Noyce teacher versus being taught by a non-Noyce teacher. The study’s quasi-experimental matched-comparison group design will identify schools with one or more Noyce teachers, identify one or more matched comparison classes within each school taught by non-Noyce teachers, and ultimately compare (within school) the spring student achievement scores of Noyce and non-Noyce teachers’ students. The differences between the scores of the students of Noyce and non-Noyce teachers would be aggregated over schools to produce an overall impact estimate. The analytic models would control for student-level pre-test scores (scores from the prior year) and any other student-level demographic data that are available (e.g., free or reduced-price lunch eligibility, limited English proficiency status, special education status). While the pilot study will not be powered to detect small effects, it will demonstrate the feasibility of using the proposed design in a larger-scale study that could be powered to detect small but educationally meaningful effects.
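The following is an illustrative sketch of the kind of within-school model implied by the description above; the exact specification will depend on the data available and the final analysis plan in Appendix B.

\[
Y_{ijs} = \beta_0 + \beta_1\,\text{Noyce}_{js} + \beta_2\,\text{Pretest}_{ijs} + \boldsymbol{\gamma}'\mathbf{X}_{ijs} + \mu_s + \varepsilon_{ijs}
\]

where \(Y_{ijs}\) is the spring achievement score of student \(i\) in classroom \(j\) in school \(s\), \(\text{Noyce}_{js}\) indicates whether the classroom is taught by a Noyce teacher, \(\text{Pretest}_{ijs}\) is the prior-year score, \(\mathbf{X}_{ijs}\) is a vector of available student-level demographic covariates, and \(\mu_s\) is a school effect (fixed or random) that confines the Noyce/non-Noyce comparison to within schools. The within-school differences captured by \(\beta_1\) are then aggregated over schools to form the overall impact estimate.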


Surveys and interviews are designed to collect information to address Questions 1-4 of the program evaluation.  The types of information collected by these methods provide a broad understanding of the program that can both inform program improvement and assist in the development of nuanced, contextualized interpretations of the impact data.  For example, the previous evaluation found that the scholarship was instrumental in the decision to pursue a career in teaching for those STEM majors who had not previously considered teaching as a career.  This finding will be further examined in the current study.  Another area of interest is the relationship between participation in Noyce-supported activities and teacher outcomes (e.g., recruitment and retention).  Developing a broad understanding of how participation in Noyce-supported activities influences teachers’ commitment to teaching in high-need districts can inform program-level decisions on the appropriate balance of funds between (1) financial support (e.g., scholarships, stipends, and fellowships) and (2) programmatic activities aimed at supporting the professional development of teachers (e.g., field experiences in high-need settings).

An important aspect of the collection of information to answer Questions 1-4 is the examination of (1) personal and professional reasons for teaching or not teaching in high-need districts and (2) reasons for staying in or leaving teaching after the completion of service requirements.  This type of information will also inform the design of future studies on retention in teaching.  As with the quasi-experimental component of the evaluation, findings and results from this component will be qualified in terms of the strengths and limitations of the methods.



References

Liou, P. Y., Lawrenz, F., et al. (2009). University of Minnesota Evaluation of the Robert Noyce Teacher Scholarship Program, Final Report Section Two: Factor Analysis of the Evaluation Questionnaire. Minneapolis, MN: University of Minnesota.


1 Dr. Lawrenz is also a consultant on the current study; she has reviewed our surveys, and her comments have been incorporated into the current survey instruments.

2 Only students of teachers who teach subjects/grades that are assessed in their respective states/districts will be included in the study.
