OMB Control Number: 1875-0276



Task Order 18: Study on Sustaining the Positive Effects of Preschool

Draft 2: OMB Clearance Request


Prepared for

U.S. Department of Education

Office of Planning, Evaluation and Policy Development

Policy and Program Studies Service

June 2015









Contents

Introduction
Supporting Statement for Paperwork Reduction Act Submission
Justification (Part A)
A1. Circumstances Making Collection of Information Necessary
A2. Use of Information
A3. Use of Improved Technology to Reduce Burden
A4. Efforts to Avoid Duplication of Effort
A5. Efforts to Minimize Burden on Small Businesses and Other Small Entities
A6. Consequences of Not Collecting the Data
A7. Special Circumstances Causing Particular Anomalies in Data Collection
A8. Federal Register Announcement and Consultation
A9. Payment or Gift to Respondents
A10. Assurance of Confidentiality
A11. Sensitive Questions
A12. Estimated Response Burden
A13. Estimate of Annualized Cost for Data Collection Activities
A14. Estimate of Annualized Cost to Federal Government
A15. Reasons for Changes in Estimated Burden
A16. Plans for Tabulation and Publication
Case Study Analytic Approach
A17. Display of Expiration Date for OMB Approval
A18. Exceptions to Certification for Paperwork Reduction Act Submissions
References



Introduction

The Policy and Program Studies Service (PPSS), within the U.S. Department of Education’s Office of Planning, Evaluation and Policy Development, requests Office of Management and Budget (OMB) clearance for the case study component of the Study on Sustaining the Positive Effects of Preschool.

This study aims to accomplish two main goals. The first is to produce a literature review summarizing what is known about policies, programs, and practices that can help students in kindergarten through Grade 3 (K–3) build on the positive effects of preschool or make cognitive, social-emotional, and academic gains. The second is to provide detailed case study descriptions of five programs that use policies, programs, and practices related to two key topic areas, (1) preschool and K–3 alignment and (2) differentiated instruction, to help disadvantaged students in K–3 build on the positive effects of preschool or achieve positive cognitive, social-emotional, and academic outcomes.

Supporting Statement for Paperwork Reduction Act Submission

Justification (Part A)

A1. Circumstances Making Collection of Information Necessary

Study Overview

Preschool can improve academic, behavioral, social-emotional, and cognitive outcomes for students of varying backgrounds, including disadvantaged students, at least in the short term. Research shows that participation in high-quality preschool can improve young children’s readiness skills for elementary school (e.g., Andrews, Jargowsky, & Kuhne, 2012). However, as these children continue through the elementary grades, preschool participation alone, without additional and continuous supports, cannot overcome the multiple challenges that disadvantaged children face. Identifying ways to sustain early cognitive, social-emotional, and academic outcomes in the years immediately following preschool is therefore important for giving all students opportunities to thrive academically and to sustain gains made with the benefit of preschool.

The study involves two key components: (1) a literature review and (2) case studies. The goals of each component are as follows:

  1. From the extant literature, summarize what is known about policies, programs, and practices that have the potential to aid practitioners and policymakers in sustaining the positive effects of preschool. The review will focus on two specific areas:

  • Preschool and K–3 alignment

  • Differentiated instruction

  2. Provide detailed case study descriptions of five programs that help disadvantaged students in K–3 achieve positive cognitive, social-emotional, and/or academic outcomes and may build on the positive effects of preschool by using policies, programs, and practices from the two topic areas above.

Building on promising practices and potential sites uncovered during the literature review, the case study component of this study will examine the design and implementation of policies, programs, or strategies related to the study’s two topics of interest at five sites that implement programs aimed at cognitive, social-emotional, or academic outcomes for disadvantaged students in Grades K–3. Of particular interest are economically disadvantaged children, children who are learning English as their second language, and children who come from homeless, neglected, or migrant populations. Data collection activities for the case studies will include interviews with key staff, observations of program activities or classrooms, and a review of key program documents. Analyses of the data collected within and across the sites will provide practitioners and policymakers with much-needed information about the implementation of preschool to Grade 3 alignment and differentiated instruction.

Conceptual Framework

For preschool-age children at risk of falling behind in school, attending a high-quality early learning and care program, including Head Start, has been found to help improve their readiness for school and school success through higher test scores, better attendance, reduced placement in special education, and reduced grade-level retention (Andrews, Jargowsky, & Kuhne, 2012; Barnett, 2008; Karoly & Bigelow, 2005; Reynolds, 1994; Reynolds et al., 2007). Other lasting benefits can include higher rates of high school completion, a greater likelihood of attending college, and increased lifetime earnings (Karoly, Kilburn, & Cannon, 2005; Gormley & Phillips, 2005; Reynolds & Ou, 2011).

Accordingly, the federal government has supported preschool education through programs such as the U.S. Department of Health and Human Services’ (HHS’s) Head Start program and the U.S. Department of Education’s (ED’s) preschool services for children with disabilities (Individuals with Disabilities Education Act, Part B) as strategies for improving the education and life trajectories of at-risk children. Because of the promise of preschool to improve outcomes for disadvantaged students, states are increasingly implementing universal prekindergarten programs. Furthermore, in his 2014 State of the Union address, President Barack Obama called for high-quality preschool for all children and requested additional federal investments in the 2015 fiscal year budget for Child Care and Development Block Grants, Head Start and Early Head Start, and Preschool Development Grants to states. The president also has proposed expanded Early Head Start-child care partnerships, expanded home visiting programs, and new partnerships with states to provide high-quality preschool to four-year-olds in families earning less than 200 percent of the federal poverty line.

Research shows, however, that not all students who experience preschool achieve positive, long-term outcomes (Barnett, 2008; Lee & Loeb, 1995). Some preschool program evaluations document that strong initial benefits fade (the “preschool fade-out effect”), sometimes as early as the first years of elementary school (Manship, Madsen, Mezzanotte, & Fain, 2013; Ramey et al., 2000; U.S. Department of Health and Human Services, 2010). Preschool effects may fade for many reasons, including poor elementary school quality, lack of parental supports, lack of continuous follow-up with participating students, or insufficient intensity or duration of the program (e.g., Brooks-Gunn, 2003; Lee & Loeb, 1995). Therefore, the early elementary context can help or hinder children’s continued academic and social progress. As the country prepares to expand preschool access for all four-year-old children and to align preschool programs with kindergarten-through-12th-grade elementary and secondary education systems, the need for quality information about how to sustain the benefits of preschool has grown.

The study is guided by a conceptual framework that considers factors associated with preschool (e.g., dosage, quality) and postpreschool education (e.g., preschool and K–3 alignment, differentiated instruction) and context (e.g., school environment) as contributors to students’ later outcomes.

Achieving desired outcomes from preschool programs is not a mere matter of children’s participation. We know that the quality of the preschool program matters (e.g., Peisner-Feinberg et al., 2001), and that the context in which children live influences their outcomes. As Lee and Loeb (1995) have noted, the effects of preschool often fade in early elementary school if the quality of schools children attend after preschool is poor. Two postpreschool program approaches, described further below, may be ways to sustain preschool’s positive effects.

Preschool and K–3 Alignment. Preschool and K–3 alignment, also referred to as PK–3¹ alignment, reflects coordination among standards, curricula, teacher instructional practices, student assessment, and teacher professional development between the preschool years and the early elementary school years. The effects of preschool may be more sustainable if curricula and instructional strategies from preschool through Grade 3 are well aligned (Brooks-Gunn, 2003). When implemented as intended, PK–3 alignment policies or practices should provide a coherent educational experience for a student starting in preschool and may be a way to capitalize on and sustain the investment in early education (Bogard & Takanishi, 2005).

Differentiated Instruction. The premise of differentiated instruction is that teaching practices and curricula should vary to meet the diverse needs and skills of the individual student and to optimize students’ learning experiences (Tomlinson, 2000; 2001). One explanation for the preschool fade-out effect, described previously, is that children who make early gains in preschool may not have the opportunity to maintain their growth rate or learning trajectory because early elementary instruction is not differentiated to meet their current skill level.

Case Study Research Questions

The case studies that are the focus of this Office of Management and Budget (OMB) clearance request will provide practitioners and policymakers with in-depth information about the design and implementation of programs that may sustain and build on the effects of preschool as children progress through the early elementary grades, addressing the very real problem of preschool fade-out effects. Specifically, the case studies will highlight characteristics (e.g., resources, personnel, staff, training, setting, population served) of PK–3 or differentiated instruction programs that aim to increase cognitive, social-emotional, or academic student outcomes. They also will explore implementation challenges and supports within urban and rural settings, which will, in turn, provide information on implementation and sustainability issues that may be unique to each setting.

The case studies aim to answer the following research questions:

  1. What are the characteristics (e.g., resources, personnel, staff characteristics, training, setting, population served) of PK–3 or differentiated instruction programs that aim to increase cognitive, social-emotional, or academic outcomes of students?

  2. On what research, theory, and/or experiences did the designers of these programs base the program structure and content?

  3. What are the challenges of implementing these programs and how have staff and leaders tried to overcome these challenges?

  4. How does the organization implementing the program ensure its sustainability?

A2. Use of Information

The case studies will be of immediate interest and significance for practitioners and policymakers because they will offer detailed information about supports and interventions that school systems can use to sustain the benefits of preschool. Building on the study’s literature review, the case studies will provide an in-depth look at how practices related to preschool and K–3 alignment and differentiated instruction can be successfully developed and implemented in real-world contexts.

The study team will use the interview, observation, and document review data collected during program site visits to develop individual case summaries for each of the five sites in the case study sample. In addition, the study team will use the data to produce a final publicly available report of cross-case findings, which will include detailed descriptions of each site and the site’s program or policy. In designing this report, the study team will focus on developing a useful product addressing multiple audiences, including program directors, principals, teachers, federal and state policymakers planning for early childhood investments and supports, and researchers conducting studies on this topic. The information will be useful for policymakers as they look at adopting and implementing potential new initiatives. District administrators and principals will be able to take the specific information included in the case studies to implement similar interventions in their own districts and schools.

A3. Use of Improved Technology to Reduce Burden

The recruitment and data collection plans for the case study component of this study reflect sensitivity to issues of efficiency and respondent burden. Beginning with site selection, the study team will use Internet searches related to each topic and will enter relevant information into a database that will track each program/site, its characteristics, and its progression through the site-selection process. Once potential sites are identified, the study team will use available online materials to determine the extent to which each program has undergone internal or external evaluation and the corresponding evidence of effectiveness. It will conduct screening interviews by email and telephone to reduce respondent burden and facilitate convenience for participants. During the data collection process, the study team will continue to use technology to reduce burden whenever possible. For example:

  1. Interviews and focus groups will be audiotaped and transcribed at a later date to reduce the amount of time participants spend in interview activities.

  2. A phone number and e-mail address will be provided to study participants, allowing them to contact research staff directly with any questions they may have.

A4. Efforts to Avoid Duplication of Effort

The study team will avoid duplication of effort by using preexisting data (e.g., program information available on school or district websites, published program evaluations) whenever possible to guide site selection and data collection. For example, the study team will determine whether any of the proposed data collection elements for the case studies can be addressed through preexisting policy or evaluation documents. This step will reduce the number of questions asked in the case study interviews and focus groups, thus limiting respondent burden and minimizing duplication of previous data collection efforts and information. In addition, the study team will avoid selecting sites for which case studies have already been conducted, except in cases where the extant case study does not go into the detail needed to address the study’s research questions or was completed so long ago that it would be useful to learn updated information about the site’s work.

A5. Efforts to Minimize Burden on Small Businesses and Other Small Entities

Some school districts likely to be involved in this study have fewer than 50,000 students, and are thus considered small entities. Because we have minimized the burden on these (and all) districts and are offering a small stipend for participating, we do not believe study activities will have a significant economic impact on these small entities.

A6. Consequences of Not Collecting the Data

The data to be collected through the case studies are needed to inform state and local efforts to develop and implement programs that can successfully sustain the benefits of preschool as children advance through the elementary grades. Failure to collect the data proposed through these case studies will limit the information available to the Department to guide federal policy development and technical assistance related to programs to sustain the effects of preschool. In addition, it would prevent the distribution of in-depth information to policymakers and practitioners across the nation about the use of such programs. The absence of this case study report could therefore hinder state, district, and school stakeholders’ ability to make careful and informed decisions about promising policies, programs, and practices to sustain the benefits of preschool.

A7. Special Circumstances Causing Particular Anomalies in Data Collection

None of the special circumstances listed applies to this data collection.

A8. Federal Register Announcement and Consultation

  1. Federal Register Announcement. A 60-day notice to solicit public comments was published in the Federal Register on March 30, 2015 (80 FR 16648). No public comments were received during the 60-day comment period.

  2. Consultations Outside the Agency. A technical working group (TWG) was consulted as part of this study and will continue to provide conceptual and methodological guidance for the collection, analysis, and reporting of the case study data. The TWG members provided comments on the study design and nominated sites for inclusion in the case study sample in 2014. The TWG will review the protocols and proposed case study sites in late spring 2015. The study’s TWG members, listed in Exhibit 4, bring together expertise in research on the effects of preschool among disadvantaged children and the sustainability of those effects, the needs of special student populations, approaches for aligning prekindergarten through third grade, and methods for conducting rigorous research reviews and case studies.

Exhibit 4. Technical Working Group Members

Name | Affiliation
Lindy Buch | Retired director of Early Childhood Education and Family Services in the Office of Great Start at the Michigan Department of Education
Margaret Burchinal | Senior scientist at the Frank Porter Graham Child Development Institute at the University of North Carolina, Chapel Hill, and adjunct professor in the Department of Education at the University of California, Irvine
Linda Espinosa | Retired professor of Early Childhood Education, University of Missouri; served as codirector of the National Institute for Early Education Research
Kristie Kauerz | Research assistant professor of P–3 Policy and Leadership at the University of Washington
Ellen Kisker | Principal investigator, What Works Clearinghouse Early Childhood Education for Children With Disabilities Review Team

A9. Payment or Gift to Respondents

Each participating school will receive an honorarium (a $100 gift card for purchasing classroom materials or school supplies) in recognition of the time and effort required to participate in the study. At most case study sites, two schools will be included in the study, for a total of $200 in honoraria per site. The interviews and focus groups are the primary source of data for the study, which increases the importance of achieving a high participation rate. The honoraria are valuable in securing staff participation and serve to acknowledge the value of their time. The $100 amount is intended to recognize schools for the time it takes the principal, teachers, and staff to provide us with information, schedule and coordinate visits, and participate in interviews and focus groups. Refreshments will be provided to participants on the day of the case study visits. No other payments or gifts are planned for this study.

Other recent federal data collections have paid participants to ensure successful recruitment and data collection efforts. Our review of several of these data collections suggests that our honorarium amount is relatively minimal. For example, the Longitudinal Assessment of Comprehensive School Reform Implementation and Outcomes (LACIO) paid the schools participating in case studies $200 (ED-01-CO-0129). The Identifying Potentially Successful Approaches to Turning Around Chronically Low Performing Schools study paid case study schools $250 (ED-04-CO-0025/0020).

A10. Assurance of Confidentiality

As researchers, the study team is vitally concerned with maintaining the anonymity and security of its records. The contractors’ project staff has extensive experience collecting information and maintaining the confidentiality, security, and integrity of interview, focus group, and observation data. All members of the study team have obtained certification on the use of human subjects in research as well as federal security clearances. This training addresses the importance of the confidentiality assurances given to respondents and the sensitive nature of handling data. The team also has worked with the Institutional Review Board (IRB) at American Institutes for Research (AIR) to seek and receive approval of this study, thereby ensuring that the data collection complies with professional standards and government regulations designed to safeguard research participants.

The following data protection procedures will be in place:

The study team will protect the identity of individuals from whom we collect data for the study to the extent possible (given the small number and size of some districts included in the study) and will use the data for research purposes only. Respondents’ names will be used for data collection purposes only and will be disassociated from the data prior to analysis. As information is gathered from respondents or sites, each respondent will be assigned a unique identification number, which will be used in analysis files as well as in printout listings on which data are displayed. Respondents’ unique identification numbers also will be used for data linkage across sources (e.g., interview and observation data). Any identifiable information will be kept in secured locations, and identifiers will be destroyed as soon as they are no longer required. In addition, the study team will shred all interview protocols, observation rubrics, forms, and other hard-copy documents containing identifiable data as soon as the need for the hard copies no longer exists.
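As an illustration of this de-identification and linkage approach, the minimal Python sketch below replaces names with study-assigned identification numbers and then links interview and observation records using only that identifier. The record structures, field names, and ID format are hypothetical and do not represent the contractors’ actual systems.

import uuid

# Hypothetical raw records as collected in the field (illustration only).
interview_records = [{"name": "Respondent A", "site": "Site 1", "notes": "Interview notes"}]
observation_records = [{"name": "Respondent A", "site": "Site 1", "rubric_score": 3}]

# Crosswalk of names to study IDs, kept separately and destroyed when no longer required.
id_crosswalk = {}

def assign_id(name):
    # Return the respondent's study ID, creating one the first time the name is seen.
    if name not in id_crosswalk:
        id_crosswalk[name] = "R-" + uuid.uuid4().hex[:8]
    return id_crosswalk[name]

def deidentify(records):
    # Replace the name field with the study ID so analysis files carry no direct identifiers.
    return [
        {**{k: v for k, v in record.items() if k != "name"},
         "respondent_id": assign_id(record["name"])}
        for record in records
    ]

interviews = deidentify(interview_records)
observations = deidentify(observation_records)

# Link interview and observation data on the study ID only.
linked = {}
for record in interviews:
    linked.setdefault(record["respondent_id"], {})["interview"] = record
for record in observations:
    linked.setdefault(record["respondent_id"], {})["observation"] = record

print(linked)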

Prior to beginning interviews, focus groups, or observations, a member of the research team will explain to participants what will be discussed, how the data will be used and stored, and how anonymity will be maintained. Participants will be instructed that they can stop participating at any time. The study’s goals, data collection activities, participation risks and benefits, and uses for the data will be detailed in a consent form that all participants will read and sign prior to beginning any data collection activities. Participants will be informed that sites will be named in case study reports but that individuals will not be named specifically. Signed consent forms will be collected by site visitors and stored in secure file cabinets at the contractors’ offices.

All electronic data will be protected using several methods. The contractors’ internal networks are protected from unauthorized access by defense-in-depth best practices, which incorporate firewalls and intrusion detection and prevention systems. Access to computer systems is password protected, and network passwords must be changed on a regular basis and conform to the contractors’ strong password policies. The networks also are configured so that each user has a tailored set of rights, granted by the network administrator, to files approved for access and stored on the local area network (LAN). Access to all electronic data files and workbooks associated with this study will be limited to researchers on the case study data collection and analysis team. Any files that are saved outside these secure folders (e.g., to transmit data files between study team members at AIR and case study sites) will be encrypted and require a strong password to access.

A11. Sensitive Questions

This study will not include the collection of sensitive information. The only data to be collected directly from case study participants will focus on district and school policies and practices rather than on individual people. District and school policies and practices are within the public domain (e.g., schools communicate their policies and programs to students and parents in a variety of ways). In this sense, the data are not sensitive in nature.

A12. Estimated Response Burden

It is estimated that the total hour burden for the case study data collection is 85.5 hours. This translates to an estimated cost of $3,058.37 based on the average hourly wage of participants. Exhibit 5 summarizes the estimates of respondent burden for the various study activities across all five case study sites.

The estimated burden associated with the data collection at each individual case study site is 12 hours. This estimate assumes the following data collection activities at each of the five sites:

  • One-and-one-half-hour interviews with one or two district program leaders

  • One-hour interviews with principals at one or two schools within each district

  • One-hour interviews or focus groups with five program teachers, preschool teachers, and/or other program staff at each of two schools within each district

  • One-hour interview with the program funder (e.g., from a foundation or private funding agency), if applicable

  • One-hour interview with the program evaluator, if applicable

  • Thirty minutes each of the principal’s and district contact’s time to account for conversations to recruit the site, obtain background information, and schedule site visits and interviews.

The data collection for each site also will include two observations of program activities, but these observations have been excluded from estimates of response burden because the observed program activities will be part of respondents’ customary and usual business practices.

Exhibit 5. Estimated Total Hour and Monetary Cost Burden of Case Study Data Collection

Task | Total Sample Size | Estimated Response Rate | Number of Respondents | Time Estimate (in Hours) | Total Hour Burden | Hourly Rate² | Estimated Monetary Cost of Burden
District Superintendent/Program Staff Interview | 10 | 100% | 10 | 1.5 | 15 | $43.88 | $658.20
District Superintendent/Program Staff Pre-Survey | 10 | 100% | 10 | 0.5 | 5 | $43.88 | $219.40
District Superintendent/Program Staff Recruitment/Scheduling | 5 | 100% | 5 | 0.5 | 2.5 | $43.88 | $109.70
Principal Interview | 10 | 100% | 10 | 1 | 10 | $42.19 | $421.90
Principal Pre-Survey | 10 | 100% | 10 | 0.5 | 5 | $42.19 | $210.95
Principal Recruitment/Scheduling | 10 | 100% | 10 | 0.5 | 5 | $42.19 | $210.95
Preschool Teacher Interview/Focus Group | 10 | 100% | 10 | 1 | 10 | $28.21 | $282.10
Elementary Teacher or Other Program Staff Interview/Focus Group | 25 | 100% | 25 | 1 | 25 | $28.21 | $705.25
Program Funder Interview | 4 | 100% | 4 | 1 | 4 | $29.99 | $119.96
Evaluator Interview | 4 | 100% | 4 | 1 | 4 | $29.99 | $119.96
Total for Case Study Data Collection | 98 | – | 98 | – | 85.5 | – | $3,058.37
Annualized Burden | 32.7 | – | 32.7 | – | 28.5 | – | $1,019.46
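The totals in Exhibit 5 follow directly from multiplying the number of respondents by the per-response time estimate (for hours) and then by the hourly rate (for cost); the annualized figures are consistent with dividing the totals over a standard three-year clearance period. The short Python sketch below is an illustrative cross-check of that arithmetic only; it is not part of the study’s instruments or analysis tools.

# Illustrative cross-check of the Exhibit 5 burden arithmetic.
# Each row: (task, number of respondents, hours per response, hourly wage).
rows = [
    ("District superintendent/program staff interview", 10, 1.5, 43.88),
    ("District superintendent/program staff pre-survey", 10, 0.5, 43.88),
    ("District superintendent/program staff recruitment/scheduling", 5, 0.5, 43.88),
    ("Principal interview", 10, 1.0, 42.19),
    ("Principal pre-survey", 10, 0.5, 42.19),
    ("Principal recruitment/scheduling", 10, 0.5, 42.19),
    ("Preschool teacher interview/focus group", 10, 1.0, 28.21),
    ("Elementary teacher or other program staff interview/focus group", 25, 1.0, 28.21),
    ("Program funder interview", 4, 1.0, 29.99),
    ("Evaluator interview", 4, 1.0, 29.99),
]

total_hours = sum(n * hrs for _, n, hrs, _ in rows)
total_cost = sum(n * hrs * wage for _, n, hrs, wage in rows)

print(f"Total hour burden: {total_hours}")                        # 85.5
print(f"Total monetary cost: ${total_cost:,.2f}")                 # $3,058.37
print(f"Annualized hours (3-year clearance): {total_hours / 3}")  # 28.5
print(f"Annualized cost: ${total_cost / 3:,.2f}")                 # $1,019.46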

A13. Estimate of Annualized Cost for Data Collection Activities

There are no additional annualized costs for data collection activities associated with this data collection beyond the total hour burden estimated in item A12.

A14. Estimate of Annualized Cost to Federal Government

The estimated cost to the federal government for the Task Order 18 case studies, including development of the case study research plan and data collection instruments as well as data collection, data analysis, and report preparation, is $321,888 for the two years of the study, or approximately $160,944 per year.

A15. Reasons for Changes in Estimated Burden

This is a new data collection.

A16. Plans for Tabulation and Publication

Case Study Analytic Approach

The study team will establish and adhere to a set of qualitative analytic procedures and standards to limit bias and ensure reliable findings. For the case studies to be of value, they must be grounded in rigorous methods of qualitative research. Experts in qualitative research methods make clear that well-planned, systematic, and transparent qualitative data collection and analysis techniques yield reliable, transferable findings (Anfara, Brown, & Mangione, 2002; Creswell, 1998). Qualitative site visit data will be analyzed through a carefully structured five-step analytic process guided by the study’s research questions and conceptual framework. The process is designed to build reliability and validity into the case studies by both creating a chain of evidence and using triangulation to identify themes (Yin, 2003). The analysis process will incorporate standards of evidence, triangulation of data, and procedures for measuring and ensuring interrater agreement.

Exhibit 6. Overview of Qualitative Analysis Process


To analyze site visit data, the research team will rely on a set of codes based on constructs underlying the interview and focus group protocols, classroom observation protocols, and documents collected from sites. Coding interview and observation data is central to qualitative analysis. Through a systematic coding process, analysts will identify information associated with specific constructs that will anchor within-case and cross-case analyses.

  • First, a preliminary code list will be drafted, and the codes will be piloted with a subset of data (for example, selected portions of interview transcripts from several sites) to determine whether the set of codes covers the topics reflected in the interviews, whether the codes are appropriately specific, and whether the definitions in the codebook are clear.

  • Once the codebook is finalized, AIR will conduct training to ensure that analysts agree on the application of each code.

  • Analysts will use a qualitative software program (e.g., NVivo or Dedoose) to facilitate the coding process. Such a program allows analysts to assign relevant codes to data and then compare coded data across sources.

  • Three analysts will code the case study data; this team will meet weekly to discuss any questions about how to apply the codes and to resolve any ambiguous data sources. Any disagreements in coding will be resolved through discussion and consensus among coders.

  • To measure interrater reliability, more than one analyst will code 20 percent of the data to ensure at least 80 percent agreement; a brief illustrative sketch of this agreement check appears below. If 80 percent agreement is not reached after coding the first interview or document together, coding staff will be retrained and another interview will be coded together. This process will be repeated until 80 percent agreement is reached consistently on three interviews coded together. For any text on which 80 percent agreement is not reached, the senior analyst’s codes will serve as the master data.

Codes for analyzing case study data will be structured so that analysts can apply more than one code to the same interview passage or observation note, as applicable.
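As a minimal illustration of the percent-agreement check described above, the following Python sketch compares two analysts’ code assignments for the same set of passages. The passage identifiers and code labels are hypothetical and do not reflect the study’s actual codebook or software.

# Minimal sketch of a percent-agreement check between two coders (hypothetical data).
coder_a = {"passage_01": "alignment", "passage_02": "differentiation",
           "passage_03": "challenges", "passage_04": "sustainability",
           "passage_05": "alignment"}
coder_b = {"passage_01": "alignment", "passage_02": "differentiation",
           "passage_03": "supports", "passage_04": "sustainability",
           "passage_05": "alignment"}

shared = coder_a.keys() & coder_b.keys()
matches = sum(coder_a[p] == coder_b[p] for p in shared)
agreement = matches / len(shared)

print(f"Percent agreement: {agreement:.0%}")  # 80% on this hypothetical sample
# Per the plan above, agreement below 80 percent would trigger retraining and joint recoding.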

Case Summaries

After all raw data have been coded, analysts will identify the patterns, themes, and categories that are most relevant to the study’s research questions. This process of data reduction involves noting the prevalence of each response, group differences, and associations among data sources. Specifically, analysts will use the software program to query coded data in order to summarize findings for each case, producing both individual site and cross-site case summaries. Creating case summaries will be a systematic process that relies on coded data rather than any one researcher’s or respondent’s perspective on the program or policy. To this end, analysts will review the frequency of coded responses to characterize which themes are common and which are outliers. Analysts also will examine differences in responses among respondent types, such as by role (e.g., principal, elementary school teacher, preschool teacher, funder).
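A simple way to picture this frequency review is sketched below in Python. The roles and code labels are hypothetical placeholders; in practice the coded excerpts would be exported from the qualitative software package rather than entered by hand.

from collections import Counter, defaultdict

# Hypothetical coded excerpts: (respondent role, code applied). Illustration only.
coded_data = [
    ("principal", "alignment"), ("principal", "challenges"),
    ("elementary teacher", "differentiation"), ("elementary teacher", "differentiation"),
    ("preschool teacher", "alignment"), ("funder", "sustainability"),
    ("district leader", "alignment"), ("district leader", "sustainability"),
]

# Overall prevalence of each code, used to flag common themes versus outliers.
overall = Counter(code for _, code in coded_data)

# Frequency of each code broken out by respondent role.
by_role = defaultdict(Counter)
for role, code in coded_data:
    by_role[role][code] += 1

print("Most common themes:", overall.most_common())
for role, counts in by_role.items():
    print(role, dict(counts))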

AIR will establish clear standards of evidence in order to draw conclusions. For most topics, analysts will seek convergence of perspectives to draw conclusions (i.e., at least two people agree, with no contradictory evidence). However, given the limited number of respondents and cases in this study, there might be topics for which there will be only one knowledgeable respondent (e.g., a district leader may be the only person knowledgeable about the development of the program). AIR will adjust standards of evidence to account for this. Before the question set is completed, project leadership and PPSS will establish final decision rules regarding the specific number of respondents required to count as evidence.

Cross-Case Analysis

In the final stage of analysis, the study team will use the case summaries for each site to compare themes across sites. Through this review process, themes, policies, or practices common to more than one site can be identified and included in a final case study report. AIR will conduct cross-case analysis to the degree possible because it allows more general conclusions to be drawn about how effective policies, programs, or practices can be sustained. If the programs or policies examined through the case studies are very different, cross-case analysis may be less appropriate, and AIR will place more emphasis on examining themes within each case.

Reporting

In summary, data collected for each case/site will be analyzed and included in five internal case summary reports as well as a publicly released case study report. The case study report will begin with an introductory section that (1) features an audience-appropriate overview of the study and (2) outlines common themes that emerged from the data analysis. The report will feature cross-case findings as well as detailed descriptions of each site and the site’s program or policy. In designing this report, AIR will focus on developing a useful product addressing multiple audiences, including program directors, principals, teachers, federal and state policymakers planning for early childhood investments, and researchers conducting studies on this topic.

The proposed timeline for data collection and reporting activities is described in detail below and shown in Exhibit 7.

Exhibit 7. Timeline for Data Collection Activities and Reporting

Activity | Time Frame
Work with PPSS/OEL/HHS to identify potential case study sites | December 2014–March 2015
Draft interview, focus group, observation, and document review protocols | December 2014–January 2015
Revise site visit protocols | March 2015
TWG meeting to review potential case study sites and protocols | April 2015
Submit OMB package and revisions | March–August 2015
Conduct site visitor training | September 2015
Conduct site visits | September–October 2015
Code site visit data | November 2015–January 2016
Submit draft site-level reports (case summaries) | March 2016
Conduct cross-case analysis | March–April 2016
Submit final site-level reports | May 2016
Submit draft case study report | April 2016
Submit final case study report | October 2016

Note: PPSS=Policy and Program Studies Service; OEL=Office of Early Learning; HHS=U.S. Department of Health and Human Services; TWG=technical working group; OMB=Office of Management and Budget

A17. Display of Expiration Date for OMB Approval

All data collection instruments will display the OMB approval expiration date.

A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I are requested.

References

Andrews, R. J., Jargowsky, P., & Kuhne, K. (2012). The effects of Texas’s pre-kindergarten program on academic performance (CALDER Working Paper No. 84). Washington, DC: National Center for Analysis of Longitudinal Data in Education Research.

Anfara, V. A., Brown, K. M., & Mangione, T. L. (2002). Qualitative analysis on stage: Making the research process more public. Educational Researcher, 31(7), 28–38.

Barnett, W. S. (2008). Preschool education and its lasting effects: Research and policy implications. Boulder, CO, & Tempe, AZ: Education and the Public Interest Center & Education Policy Research Unit. Retrieved from http://nepc.colorado.edu/files/PB-Barnett-EARLY-ED_FINAL.pdf

Bogard, K., & Takanishi, R. (2005). PK–3: An aligned and coordinated approach to education for children 3 to 8 years old. Social Policy Report, 19(3), 3–22.

Brooks-Gunn, J. (2003). Do you believe in magic? What we can expect from early childhood intervention programs. Social Policy Report, 17(1), 1; 3–14.

Creswell, J. W. (1998). Qualitative inquiry and research design: Choosing among five designs. Thousand Oaks, CA: Sage.

Gormley, W. T., & Phillips, D. (2005). The effects of universal pre-K in Oklahoma: Research highlights and policy implications. Washington, DC: Georgetown University. Retrieved from http://explore.georgetown.edu/publications/index.cfm?Action=View&DocumentID=14387

Karoly, L. A., & Bigelow, J. H. (2005). The economics of investing in universal preschool education in California. Santa Monica, CA: RAND Corporation.

Karoly, L. A., Kilburn, M. R., & Cannon, J. S. (2005). Early childhood interventions: Proven results, future promise. Santa Monica, CA: RAND Corporation. Retrieved from http://www.rand.org/pubs/monographs/MG341

Lee, V. E., & Loeb, S. (1995). Where do Head Start attendees end up? One reason why preschool effects fade out. Educational Evaluation and Policy Analysis, 17(1), 62–82.

Manship, K., Madsen, S., Mezzanotte, J., & Fain, G. (2013). Evaluation of the Stretch to Kindergarten program: 2012 findings. San Mateo, CA: American Institutes for Research.

Peisner-Feinberg, E. S., Burchinal, M. R., Clifford, R. M., Culkin, M. L., Howes, C., Kagan, S. L., & Yazejian, N. (2001). The relation of preschool child-care quality to children's cognitive and social developmental trajectories through second grade. Child Development, 72(5), 1534–1553.

Ramey, S. L., Ramey, C. T., Phillips, M. M., Lanzi, R. G., Brezausek, C., Katholi, C. R., & Snyder, S. (2000). Head Start children’s entry into public school: A report on the National Head Start/Public School Early Childhood Transition Demonstration Study. Birmingham, AL: Civitan International Research Center, University of Alabama at Birmingham.

Reynolds, A. J. (1994). Effects of a preschool plus follow-on intervention for children at risk. Developmental Psychology, 30(6), 787–804.

Reynolds, A. J., & Ou, S. (2011). Paths of effects from preschool to adult well-being: A confirmatory analysis of the Child-Parent Center Program. Child Development, 82(2), 555–582.

Reynolds, A. J., Temple, J. A., Ou, S., Robertson, D. L., Mersky, J. P., Topitzes, J. W., & Niles, M. D. (2007). Effects of a school-based early childhood intervention on adult health and well-being: A 19-year follow-up of low-income families. Archives of Pediatrics & Adolescent Medicine, 161(8), 730–739.

Tomlinson, C. A. (2000). Differentiation of instruction in the elementary grades. ERIC Digest. Syracuse, NY: ERIC Clearinghouse on Elementary and Early Childhood Education.

Tomlinson, C. A. (2001). How to differentiate instruction in mixed-ability classrooms (2nd ed.). Alexandria, VA: ASCD.

U.S. Department of Health and Human Services. (2010). Head Start Impact Study: Final report. Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families, Office of Planning, Research, and Evaluation.

Yin, R. K. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.

1 PK refers to prekindergarten.

2 Wage information was collected from the Bureau of Labor Statistics (BLS). Funder/evaluator wages were based on the BLS “Community Services Manager” salary. Salaries were converted to hourly wages using http://www.convertunits.com/salary/.


American Institutes for Research
1000 Thomas Jefferson Street NW
Washington, DC 20007-3835
202.403.5000 | TTY 877.334.3499

www.air.org


Copyright © 2015 American Institutes for Research. All rights reserved.

1695_03/15
