Evaluation of a District Wide Implementation of a Professional Learning Community Initiative

OMB: 1850-0906


OMB Package Part B: Evaluation of a District-Wide Implementation of a Professional Learning Community Initiative


September 2013

Part B. Collection of Information Employing Statistical Methods

This submission is a request for approval of data collection activities that will be used to support the Mid-Atlantic Regional Educational Laboratory (REL MA) Evaluation of a District-Wide Implementation of a Professional Learning Community Initiative. The study is being funded by the Institute of Education Sciences (IES), U.S. Department of Education (ED), and is being implemented by ICF International and its subcontractor, Rutgers University’s Center for Effective School Practices.


This study aims to address the need for systematic information about district-wide implementation of professional learning communities as a critical element in improving teacher quality and instruction, thereby contributing to increased student achievement. ED seeks OMB clearance to administer an online survey to the population of teachers participating in school-based professional learning communities and to interview principals face to face about the context and their perceptions, pre- and post-implementation. Data collection from teachers will focus on what the professional learning communities do, how they operate, and to what extent they produce the outcomes expected of them as framed by six conceptual attributes of professional learning communities and five specific tasks. Data collection from principals will focus on contextual information about school culture and conditions such as resources that support implementation. Teachers and principals will also provide their reflections on the challenges of implementing professional learning communities and their suggestions for improvement. The analysis will enable comparisons among professional learning communities within and across schools, and between teachers’ pre-implementation expectations and post-implementation experiences. Study findings are expected to inform both theory and practice related to implementation of professional learning communities.


The primary research questions to be addressed in the study are listed below. The questions were formulated with direct input from the leadership of the West Chester Area School District (WCASD) to ensure that the study meets their specific needs.


  • Research Question 1: How do PLC teams implement the study district’s PLC tasks? That is, what collaborative teamwork routines do they develop to achieve the key attributes of PLCs and the key tasks listed in the study district’s PLC materials?

  • Research Question 2: How do the study district’s teachers and principals evaluate their experience with the program? Is their actual PLC experience the same or different from what they expected it to be before the implementation of the program? How do teachers’ evaluation of the program and of the implementation compare with those of principals?

  • Research Question 3: What specific artifacts (e.g., essential learning targets, standardized common assessments, and systematic interventions) are produced by each PLC team in the district?



  1. Respondent Universe and Sampling Methods

A nationally representative sample of districts is not feasible or necessary for this study. Due to the specific request for this study from the superintendent of the study district and the special circumstances of that district’s carefully planned district-wide implementation, the universe for this study is the West Chester Area School District in Chester County, PA. Within the study district, all teachers in all grades (N=930) are expected to participate in professional learning communities starting in 2013-14. The district superintendent will request that all teachers in all grades (N=930, or 100%) complete a post-implementation online survey in May-June, 2014. The District will strive for a better than 85% response rate from teachers.



Data for the study will be collected through a post-implementation online survey of all teachers (N=930) in all of WCASD’s 16 schools, and through semi-structured interviews with the population of principals post-implementation (N=16). REL MA will design and administer the post-implementation survey of teachers. REL MA will also design and collect the post-implementation interview data from principals. All teachers and principals will be involved in data collection activities because PLCs reside in individual schools and depend to a large extent on school-level supporting conditions. Teachers are the members of PLCs; it is their expectations and perceptions we wish especially to discern. This study will enable a fine-grained assessment of their experiences pre- and post-implementation as well as how their experience compares to that of other team members (to this end we will use a unique identifier to link individual teachers to PLC teams). Principals are included because they are on-the-scene observers and gatekeepers who control access to critical resources for implementation, and because they are key stakeholders who are deeply invested in the outcomes of teacher quality and student achievement. Importantly, because PLCs in WCASD will be given complete freedom to organize their work routines, it is very likely that teams’ composition and work routines (and therefore the PLC experience of individual teachers and principals) will vary within and among schools in the district. Therefore, it is not feasible to limit data collection to a sample of teachers and administrators in light of the goal of providing adequate representation of their experiences with implementation.


Although this study collects data in a single district in response to a specific request from that district’s superintendent, we believe that the knowledge and insights generated through this study will inform a much larger community of educational stakeholders both regionally and nationally. This research project provides a rare opportunity to examine comprehensive, thoughtful district-wide implementation of PLCs – for teachers in all grade levels from kindergarten through 12th grade and in all subject-matter areas (not just reading and mathematics) – in a single intact system. The district-wide implementation is being undertaken following a careful planning process of the type utilized or contemplated by other districts and schools nationwide which do not have the ability or resources to evaluate the implementation of their programs and which can benefit indirectly from this study. Moreover, the project will be able to test the utility of a theoretically-based model and a dedicated measurement instrument for evaluating the implementation of PLCs that other researchers can use as they engage in similar tasks. Given the scarcity of methodologically rigorous research on PLCs and their effects, this project will be breaking new ground by employing parsimonious data collection activities while making efficient use of the data collected.



  2. Procedures for the Collection of Information

  2.1 Recruitment of Research Participants

Our study proposes to conduct face-to-face interviews with all WCASD’s principals post-implementation and to survey the entire population of teachers online post-implementation. Time will be allotted during the regular school day for teachers to respond to the surveys and for principals to participate in interviews.

Principals from all 16 schools in WCASD will be invited to participate in the post-implementation face-to-face interviews. As noted above, they have expressed their support for the initiative and for the study. Expected to last up to two hours, each interview will be scheduled at a time that is mutually convenient for the principals and study team. If needed, interviews will be rescheduled to accommodate scheduling emergencies or conflicts.

All teachers in all WCASD schools will be recruited by the superintendent to participate in the post-implementation survey. There is a population of 930 possible teacher respondents in a total of 16 schools. The district has made all teachers aware of the development of the PLC initiative as planning and pilot activities have taken place over the course of the school year just ended (2012-13). District leadership highly values the initiative and the proposed study, and has enlisted principals in encouraging and facilitating teachers’ participation. One method principals have agreed to use to boost teachers’ participation is allocating time during scheduled faculty meetings for teachers to respond to the surveys.

The recruitment package prepared and disseminated by the superintendent will include the following three documents:

  • Post-Implementation Teacher/Principal Recruitment. The email from the district superintendent conveys the priority WCASD places on the PLC initiative and the study. It states the district’s commitment to obtaining input from teachers and principals and to using data from the study to guide improvements in teacher learning and student achievement. Finally, it asks staff to cooperate fully in providing the information requested (see Appendices A and B).

  • Notification email. The email notification from the superintendent follows up in providing instructions for teachers to access the online survey and for principals to expect a telephone call from the study team. This email briefly reiterates the importance of studying the implementation of the PLC initiative, provides an overview of the study design, and summarizes the benefits of participating (Appendix C).

  • Survey reminder. A short email notice from the superintendent will remind non-responding teachers to complete and submit the survey. It will be generated automatically each week during the survey window.



  2.2 Data Collection Plan and Instrumentation

The study data collection consists of one round of interviews with principals and one online survey of teachers, both post-implementation. Overall, the combination of quantitative and qualitative data sources is intended to provide a rich and balanced description of the process of PLC implementation in WCASD as reported by teachers and administrators.



Principal Interviews. REL MA researchers will conduct post-implementation interviews with all of WCASD’s school principals (May–June, 2014) at the end of the first full school year of district-wide PLC implementation. Each interview will last up to two hours and will be conducted in a private room during regular work hours to minimize any inconvenience to participants.



    a. Sampling strategy: All 16 principals in the district are expected to participate.



    b. Purpose of data: The purpose of the interviews is to gauge principals’ degree of involvement with implementation of the program; learn about specific actions (including allocation of resources) they initiated and/or related initiatives they supported; obtain their reflections on the challenges they encountered from their administrative and/or leadership perspective; and capture their overall assessment of the first year of implementation. The information they provide will augment and contextualize teachers’ perspectives and reports on the PLC initiative and its implementation. It will also allow a comparison to administrators’ perspective prior to the district-wide implementation of the PLC initiative, as captured in the pre-implementation interviews with the sample of principals. A copy of the interview protocol is included in Appendix D. The interview protocol and instrument have been approved by Rutgers University’s IRB (Protocol #: E13-752) and comply with Rutgers University’s policy on the protection of human subjects.



Teacher Survey. REL MA researchers will collect data from a post-implementation online survey of teachers (May–June, 2014) at the end of the first full year of implementation. The data collected will be used to describe how teachers implemented the program (Research Question 1), what artifacts they produced working in PLCs (Research Question 3), and the progress each team made on implementing the core components of the tasks identified in WCASD’s PLC protocol (Research Question 2).



    a. Sampling strategy: All teachers in WCASD will receive the post-implementation survey. There are a total of 930 possible respondents.



    b. Purpose of data: The survey will include measures that assess different aspects of teachers’ PLC experiences as defined by the logic model, including: (a) the extent to which PLC team members developed, individually and collectively, a sense of a shared mission; (b) reports on group and communication dynamics and the PLC’s leadership/facilitation model (as a way of measuring the extent of each PLC’s collaborative culture); (c) the degree of identification with the PLC team; (d) the perceived personal and team commitment to collective inquiry and continuous improvement; (e) the degree to which PLC team activities were action- and results-oriented; (f) the degree to which the PLC team was able to incorporate PLC activities into teachers’ work flow and work routines (e.g., regular meeting time and space); (g) personal satisfaction with PLC work and motivation to continue this work; (h) description of the tasks accomplished and products developed by the PLC team; (i) personal assessment of the level and quality of support that the school and district leadership provided to the PLC teams; and (j) personal assessment of the impact of PLCs on one’s own and others’ professional development and instructional practices, and on school culture. Appendix E includes a copy of the survey with an index of the specific constructs and scales that will be included in this online survey.



    c. Instrument description: Most survey items (75 of the 91 items, or 82%) are drawn from five sources: (a) survey questions and assessment rubrics of PLC implementation developed and published by Solution Tree (see DuFour et al., 2010); (b) a set of surveys published by the National Staff Development Council (see Killion, 2006); (c) the Professional Learning Communities Assessment—Revised (PLCA-R) survey instrument (Olivier & Hipp, 2010); (d) the Professional Learning Community Annual Survey from the Data Coach Project of the Delaware Department of Education, 2010; and (e) the Professional Learning Community Survey Instrument II from the Wake County Public School System (Baenen & Jackl, 2010). Thirteen additional items (14%) are based on the district’s expectations for completion of tasks as delineated in the PLC protocol. The three remaining items (3%) were drawn from the Group Development Questionnaire (Wheelan & Hochberger, 1996) and the Index of Group Dimensions (Hemphill, 1956).



    d. Open-ended survey items: In addition, the online survey (see Appendix E) will include open-ended questions that ask teachers to reflect on their best and worst PLC-related experiences and invite their suggestions or recommendations for improvement. These questions are intended to augment and contextualize teachers’ reports on their PLC experience in the online survey. They will also be used to compare teachers’ reflections with those that emerge from the post-implementation interviews with principals. The survey instrument and data collection protocol have been approved by Rutgers University’s IRB.

  2.3 Data Cleaning and Analysis

REL MA’s approach to answering the research questions is based on triangulation of different types of data, quantitative and qualitative. This is done not only to obtain deeper insights into the dynamic nature of the implementation but also to guard against potential bias in our representation of the process and the interpretation of findings. Accordingly, once the teacher survey dataset is ready for analysis, we will explore the distribution of individual responses on all variables of interest for irregular patterns (outliers, missing values, and deviation from normality for continuous variables) and make adjustments as needed. If we detect a random pattern of missing values (i.e., missing completely at random or missing at random), we will exclude missing cases from the analysis (listwise deletion). If missingness is not at random, then the missing data mechanism must be modeled as part of the estimation process in order to produce unbiased parameter estimates. We will employ a maximum likelihood approach to missing values, i.e., choosing as parameter estimates those values which, if true, would maximize the probability of observing the values of variables observed in the data (see Allison, 2001). Next, we will compare responses across subgroups of teachers who share a distinct individual characteristic (such as grade level and content area), as well as teachers from different departments and schools, to identify potential sources of bias in our data (i.e., violation of the homogeneity of variance assumption).
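As an illustration of the initial screening step described above, the following sketch profiles missing values and applies listwise deletion. The item names and records are hypothetical, and the actual analysis will be carried out with the study team’s statistical software rather than this minimal Python:

```python
from collections import Counter

def missingness_profile(records, variables):
    """Count missing (None) values per variable across survey records."""
    counts = Counter()
    for rec in records:
        for var in variables:
            if rec.get(var) is None:
                counts[var] += 1
    return dict(counts)

def listwise_delete(records, variables):
    """Keep only records with complete data on all variables of interest."""
    return [r for r in records if all(r.get(v) is not None for v in variables)]

# Hypothetical survey records with two illustrative items
records = [
    {"shared_mission": 4, "collab_culture": 5},
    {"shared_mission": None, "collab_culture": 3},
    {"shared_mission": 2, "collab_culture": None},
    {"shared_mission": 5, "collab_culture": 4},
]
items = ["shared_mission", "collab_culture"]
print(missingness_profile(records, items))  # {'shared_mission': 1, 'collab_culture': 1}
print(len(listwise_delete(records, items)))  # 2
```

In practice, the pattern of missingness (not just its volume) would then be inspected before choosing between deletion and model-based approaches.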


Audio recordings of principal interviews will be transcribed and then analyzed using Dedoose. Special attention will be given to extracting insights into barriers and facilitators to implementation. The content of teachers’ responses to the open-ended questions in the post-implementation online survey will be downloaded and then content analyzed using the same procedure to extract the most positive and most negative PLC experiences reported by teachers and to synthesize their recommendations for improving the implementation of PLCs in the District.


Research Question 1 (how do PLC teams implement the study district’s charge?) will be answered primarily through analyzing the post-implementation survey data collected from teachers. The survey will include several items (see Appendix E) that ask teachers to report retrospectively on aspects of the work routine of the PLC in which they are a member (e.g., degree to which the team meets regularly, level of active participation in the group, group dynamics, and leadership style within the group).

  • Responses (on a Likert scale) provided by teachers who are members of a specific PLC team (as ascertained by the team’s unique study identifier) will be summed and averaged to produce team-level measures of these variables.

  • These variables will be analyzed next using cluster analysis to develop taxonomies of work routines as they emerge from this exploratory procedure. The study team will determine the specific cluster analysis algorithm to be used once there is an opportunity to assess the specific properties of the data collected from teachers.

  • Next, the study team will use findings from content analyzing teachers’ responses to the open-ended questions on the post-implementation online survey (where they describe the most positive and the most negative PLC experience they had and recommend possible improvements) to delineate common challenges/barriers that PLC teams encounter as members collaborate to establish work routines.
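The aggregation step in the first bullet above can be sketched as follows. The team identifiers, item names, and scores are hypothetical, and, as noted, the clustering algorithm applied to the resulting team-level measures will be chosen only after the data are inspected:

```python
from collections import defaultdict
from statistics import mean

def team_means(responses):
    """Average individual Likert responses into team-level measures.

    responses: list of (team_id, item, score) tuples from the survey.
    Returns {team_id: {item: mean score}}.
    """
    by_team = defaultdict(lambda: defaultdict(list))
    for team_id, item, score in responses:
        by_team[team_id][item].append(score)
    return {t: {i: mean(s) for i, s in items.items()}
            for t, items in by_team.items()}

# Hypothetical responses: (PLC team id, survey item, 1-5 Likert score)
responses = [
    ("T01", "meets_regularly", 5), ("T01", "meets_regularly", 4),
    ("T01", "active_participation", 3),
    ("T02", "meets_regularly", 2), ("T02", "meets_regularly", 3),
]
print(team_means(responses)["T01"]["meets_regularly"])  # 4.5
```

These team-level means would then serve as the input variables for the exploratory cluster analysis.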


The data analysis approach to answering Research Question 2 (how do WCASD teachers and principals evaluate their experience with the program compared to their initial expectations) involves variables included in the online survey that measure expectations, attitudes, evaluations, and self-reported behaviors.

  • The study team will utilize paired-sample t-tests (for continuous variables) and chi-square tests of differences (for categorical variables) to test and assess the magnitude of discrepancies between teachers’ expectations and their impressions of the program following the first full year of implementation.

  • Findings from teachers’ responses to the open-ended reflection questions on the post-implementation survey will also be used to answer this question, either by corroborating and/or clarifying themes that emerge from the teacher survey or adding new insights that were not captured through the survey. To answer this research question regarding the principals, the study team will analyze qualitative findings from the post-implementation interviews with principals.

  • Audio recordings of the interviews will be transcribed verbatim and checked against the original audio recording for accuracy.

  • Dedoose, a web-based application for managing, integrating, and analyzing qualitative and mixed methods data, will be used for data analysis. A grounded theory approach will be employed to explore a priori and emergent themes regarding principals’ evaluation of the program. The a priori themes will be defined according to the explicit themes explored in the interview (e.g., role of principal in implementation and allocation and use of school resources to support PLC work). It is anticipated that emergent themes will be focused on facilitators and barriers to implementation as well as on the impact, if any, PLCs had on teachers, students, and/or the school in the course of the first year of the program.

  • An extensive list of codes and their definitions that relate to each theme will be derived from the interview data. Two independent coders will use the coding scheme to analyze transcripts of four randomly selected interviews and agreement among coders (intercoder reliability) will be assessed using Cohen’s Kappa. The coding scheme will be adjusted to eliminate disagreements among coders and improve overall reliability (Cohen’s k > .85).

  • All codes will be entered into Dedoose and tagged to their associated segments of text for all interviews. Text segments will then be sorted by codes and reviewed by the coding team to identify emergent themes and to identify recurring patterns of responses to assess prevalence of themes.

  • In the final step of the analysis, the study team will examine relationships among themes by looking at associations between differentially coded content.
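The intercoder reliability check described in the bullets above can be illustrated with a minimal computation of Cohen’s Kappa. The code categories and segment assignments below are hypothetical:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' categorical codes on the same segments."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    categories = set(coder_a) | set(coder_b)
    # Chance agreement: product of each coder's marginal proportions per category
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned by two coders to ten interview segments
a = ["barrier", "facilitator", "barrier", "impact", "barrier",
     "facilitator", "impact", "barrier", "facilitator", "barrier"]
b = ["barrier", "facilitator", "barrier", "impact", "barrier",
     "facilitator", "impact", "barrier", "barrier", "barrier"]
print(round(cohens_kappa(a, b), 3))  # 0.833 - below the .85 target, so the
                                     # coding scheme would be refined further
```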


Lastly, the study team will compare key findings about teachers’ perspective on the PLC initiative and its implementation to key findings that emerge from the analysis of the interview data obtained from principals. The study team will note similarities and differences in perspectives as a way of assessing possible changes to school culture before and after implementation.


To answer Research Question 3 (specific products produced by PLC teams), the study team will analyze the survey data for teachers’ reports about artifacts produced by their team.

  • Specifically, teachers will be asked to indicate the number of instructional units for which the team designed (1) essential learning targets, (2) core formative assessments, and (3) assessment rubrics. They will also be asked to indicate if they shared these artifacts with stakeholders (students, parents, and principal) and whether these artifacts were actually used with students in the course of the first year of implementation. The study team will also reference principals’ assessment of artifacts produced by PLC teams in their school based on analyzing the interview data collected from all principals in the district at the end of the first year of full implementation of the PLC initiative.

  2.4 Data Storage and Management

Data collected from teacher surveys and principal interviews will be linked using the unique study identifiers, such that each teacher is also identified as a member of a particular PLC team and each teacher and principal are also linked to a particular school in the District. All data files will be fully documented, and at the end of the study, study identification numbers will be stripped.


Given the need to link study participants across multiple datasets, participants who consent to participate in the study will be asked to provide personally identifiable information (first and last name, role, and name of school). REL MA will keep this information confidential and use it for the sole purpose of issuing each participant a unique study identifier to link individuals across datasets used in the analysis. All products from this study will report data in aggregate form only. Schools will be given the option of remaining anonymous in all public reports. REL MA staff responsible for all data analyses will follow National Center for Education Statistics protocol for restricted datasets; only a limited number of analysts will have access to the data; and data will be stored on a non-networked computer in a locked room accessible only by the analysts.


REL MA (specifically, Rutgers University’s Center for Effective School Practices, a subcontractor to ICF International) will protect the confidentiality of information for the study and will use it for research purposes only. The project director, Dr. Blitz, will ensure that individually identifiable information about study members remains confidential. When reporting the results, data will be presented in aggregate form only so that individuals and schools will not be identified. The following safeguards, which are routinely employed by Rutgers University’s Center for Effective School Practices to carry out confidentiality assurances, will be applied consistently during the study:


  • All employees sign a confidentiality pledge, which describes the importance of confidentiality and the employee’s obligation to maintain it. All study team members who will have access to study participants or data will sign the pledge and will obtain appropriate clearance. Procedures will be followed to revoke clearance in a timely fashion from members who leave the study team.

  • Respondents’ personally identifiable information is maintained on separate forms and files, which are linked by sample identification number only.

  • Access to hard copy documents is strictly limited. Documents are stored in locked files and cabinets, and discarded materials are shredded.

  • Access to computer data files is protected by secure user names and passwords, which are available to specific users only.

3. Methods to Maximize Response Rates and Deal with Nonresponse

REL MA has developed multiple strategies to maximize response rates while minimizing burden on respondents. It has found that the following techniques contribute significantly to a high response rate: establishing positive relationships with district staff and respondents; sending advance emails; establishing efficient and flexible scheduling; and sending repeated reminders about participation in data collection activities. In this study, we expect that response rates will benefit from the unequivocal support and enthusiasm of the district’s superintendent and principals for the PLC initiative and for the study.


REL MA expects a 100% response rate from principals for the post-implementation interviews. If work-related emergencies at school preclude one or more principals from scheduling or keeping their appointments for the interview, then the study team will seek first to reschedule for a face-to-face interview during regular school hours. The study team will use the backup strategy of scheduling telephone interviews or computer-assisted telephone interviews at a mutually convenient time that might fall outside of the regular school hours. The target of a 100% response rate is reasonable in light of principals’ involvement in and endorsement of planning and pilot activities for implementation of professional learning communities in the study district.


REL MA is striving for an 85% or greater response rate from teachers for the post-implementation survey. Teachers will have a window of 30 days to access the online survey. We believe that a response rate better than 85% is attainable because principals have agreed to structure time for teachers to complete the survey during faculty meetings or other periods already dedicated to school-wide or district-wide issues. REL has had success in partnering with schools to achieve response rates exceeding 80% (Drummond et al., 2011; Heppen et al., 2012; Martin, Brasiel, Turner, & Wise, 2012; Wijekumar, Hitchcock, Turner, Lei, & Peck, 2009).


To help alleviate respondents’ concerns about data privacy and subsequently boost participation, all information request documents will include a statement on the study’s adherence to confidentiality and data collection requirements (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Appropriate certifications will appear on all study materials for teachers and principals. Recruiters who schedule interviews with principals will also inform them that the study meets all federal research guidelines and was reviewed by both the Office of Management and Budget (OMB) and an independent institutional review board (IRB). Senior study team members will monitor recruiting issues daily to quickly resolve any obstacles to participation that might arise.


In addition to the sending of advance emails (the invitation from the superintendent, and the notification email) and reminders to participants by the superintendent, the study team will follow up with principals by telephone or email within two days to answer any additional questions and begin scheduling post-implementation interviews. If needed, the study team will schedule a conference call with principals to address any common issues that are identified.


If survey response rates are lower than expected, the study team will conduct a sensitivity analysis to investigate the pattern of missing values in order to determine whether nonresponse is associated with potential selection bias. If this is the case, the study team will take care to analyze the data and interpret findings accordingly, for example by means of propensity score analysis.
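A minimal sketch of one such missing-pattern check follows, comparing per-school response rates against the district-wide rate to flag possible differential nonresponse. The school names, counts, and tolerance are hypothetical, and the actual sensitivity analysis would go well beyond this screen:

```python
def flag_differential_nonresponse(roster_sizes, respondent_counts, tolerance=0.10):
    """Flag schools whose response rate deviates from the overall rate.

    roster_sizes: {school: number of teachers on roster}
    respondent_counts: {school: number of completed surveys}
    Returns (overall_rate, [schools deviating by more than `tolerance`]).
    """
    total = sum(roster_sizes.values())
    completed = sum(respondent_counts.get(s, 0) for s in roster_sizes)
    overall = completed / total
    flagged = [s for s, n in roster_sizes.items()
               if abs(respondent_counts.get(s, 0) / n - overall) > tolerance]
    return overall, flagged

# Hypothetical counts for three schools
rosters = {"East HS": 100, "North MS": 60, "Elm ES": 40}
completes = {"East HS": 90, "North MS": 51, "Elm ES": 20}
overall, flagged = flag_differential_nonresponse(rosters, completes)
print(overall, flagged)
```

Schools flagged by a screen like this would prompt closer examination of whether nonresponse is related to school characteristics, informing any subsequent adjustment such as propensity score analysis.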


4. Tests of Procedures or Methods to Be Undertaken

The study team piloted the post-implementation interview protocol with three principals outside the study district whose experience with PLCs is appropriate to the instrument. The study team also pretested the post-implementation survey with nine teachers outside the study district whose experience with professional learning communities is appropriate to the instrument. These pilots assessed the content, clarity, and specific wording of individual interview questions and survey items, as well as respondent burden time for the interviews and surveys. The pilots also assessed the use of probes for the interviews. All changes resulting from the pilot and pretest have been incorporated in the documents submitted for OMB approval. No further changes are planned.


5. Individuals Consulted on Statistical Aspects of the Design and Collecting and/or Analyzing Data

The following individuals were consulted on the statistical aspects of the study design and on the data collection and analysis:

Name                 Title                                                    Telephone Number
Robert Boruch        Professor, University of Pennsylvania                    215-898-0409
Laura Hamilton       Senior Behavioral Scientist, RAND                        412-683-2300, x4403
Chris Hulleman       Research Associate Professor, James Madison University   540-568-2516
Andrew Porter        Professor, University of Pennsylvania                    215-898-7014
Christopher Rhoads   Assistant Professor, University of Connecticut           860-486-3321



These experts on research methods, data analysis, implementation, and teaching and learning have provided input on the study’s design. Based on their input, the original emphasis on conducting process and outcomes evaluation of WCASD’s PLC initiative has been replaced with a focus on the process of implementation and the degree to which core elements of the program are implemented. This approach has informed the logic model of the evaluation as well as the overall research methodology which combines quantitative and qualitative data collection.





References

Allison, P. D. (2001). Missing Data. Sage University Papers Series on Quantitative Applications in the Social Sciences. Thousand Oaks, CA: Sage.

Baenen, N., & Jackl, A. (2010, June). Evaluation of central services professional learning teams as of spring 2010. Eye on Evaluation. E&R Research Report No. 10.06. Raleigh, NC: Wake County Public School System.

Delaware Department of Education. (2012). DDOE/Data coach project. Retrieved from www.doc.k12.de.us/tleu_files/PLCSurveyReport_2012.pdf.

Drummond, K., Chinen, M., Duncan, T.G., Miller, H.R., Fryer, L., Zmach, C., & Culp, K. (2011). Impact of the Thinking Reader® software program on grade 6 reading vocabulary, comprehension, strategies, and motivation (NCEE 2010-4035). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

DuFour, R., Eaker, R., & DuFour, R. (2005). Recurring themes of professional learning communities and the assumptions they challenge. In R. DuFour, R. Eaker, & R. DuFour (Eds.), On common ground: The power of professional learning communities (pp. 7–30). Bloomington, IN: National Educational Service.

DuFour, R., DuFour, R., Eaker, R., & Many, T. (2010). Learning by doing: A handbook for professional learning communities at work (2nd ed.). Bloomington, IN: Solution Tree.

Grossman, P., Wineburg, S., & Woolworth, S. (2000). What makes teacher community different from a gathering of teachers? Seattle, WA: Center for the Study of Teaching and Policy.

Hall, G., & Loucks, S. (1979). Implementing innovations in schools: A concerns-based approach. Austin, TX: Research and Development Center for Teacher Education.

Hemphill, J. (1956). Group dimensions: A manual for their measurement. Research Monograph No. 87. Columbus, OH: Ohio State University, Bureau of Business Research.

Heppen, J. B., Walters, K., Clements, M., Faria, A., Tobey, C., Sorensen, N., & Culp, K. (2012). Access to Algebra I: The effects of online mathematics for grade 8 students (NCEE 2012-4021). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Hord, S. M. (1997). Professional learning communities: Communities of continuous inquiry and improvement. Austin, TX: Southwest Educational Development Laboratory.

Killion, J. (2006). Collaborative professional learning in school and beyond: A tool kit for New Jersey educators. New Jersey Department of Education Office of Academic and Professional Standards and the New Jersey Professional Teaching Standards Board in cooperation with the National Staff Development Council.

Lomos, C., Hofman, R. H., & Bosker, R. J. (2011). Professional communities and student achievement: A meta-analysis. School Effectiveness and School Improvement, 22(2), 121–148.

Martin, T., Brasiel, S. J., Turner, H., & Wise, J.C. (2012). Effects of the Connected Mathematics Project 2 (CMP2) on the Mathematics Achievement of Grade 6 Students in the Mid-Atlantic Region (NCEE 2012-4017). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

National Staff Development Council. (2001). Standards for staff development. Oxford, OH: Author. Retrieved from http://www.nsdc.org/standards/learningcommunities.cfm

Nelson, M. C., Cordray, D. S., Hulleman, C. S., Darrow, C. L., & Sommer, E. C. (2012). A procedure for assessing intervention fidelity in experiments testing educational and behavioral interventions. Journal of Behavioral Health Services & Research, 39(4), 374–396.

Olivier, D. F., & Hipp, K. K. (2010). Demystifying professional learning communities: School leadership at its best. Lanham, MD: Rowman & Littlefield Education.

Stoll, L., Bolam, R., McMahon, A., Wallace, M., & Thomas, S. (2006). Professional learning communities: A review of the literature. Journal of Educational Change, 7, 221–258.

Vescio, V., Ross, D., & Adams, A. (2008). A review of research on the impact of professional learning communities on teaching practice and student learning. Teaching and Teacher Education, 24(1), 80–91.

Wijekumar, K., Hitchcock, J., Turner, H., Lei, P. W., & Peck, K. (2009). A multisite cluster randomized trial of the effects of CompassLearning Odyssey® Math on the math achievement of selected grade 4 students in the Mid-Atlantic Region (NCEE 2009-4068). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Wood, D. R. (2007). Professional learning communities: Teachers, knowledge, and knowing. Theory Into Practice, 46(4), 281–290.

Author: Cindy Blitz