Supporting Justification for Revising OMB Clearance for “The Evaluation of Ramp-Up to Readiness Under the Regional Educational Laboratory Program” (OMB 1850-0907)
Section A
May 2014
Submitted to
Joelle Lastica, Ed.D.
Contracting Officer’s Representative
Institute of Education Sciences
U.S. Department of Education
Submitted by
Dean Gerdeman, Ph.D., Director
1120 East Diehl Road, Suite 200
Naperville, IL 60563-1486
866-730-6735
www.relmidwest.org
This publication was prepared for the Institute of Education Sciences (IES) under contract ED-IES-12-C-0004 by Regional Educational Laboratory Midwest, administered by American Institutes for Research. The content of the publication does not necessarily reflect the views or policies of IES or the U.S. Department of Education, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. government. The publication is in the public domain. Authorization to reproduce in whole or in part for educational purposes is granted.
Contents
Page
1. Circumstances Necessitating Collection of Information 6
2. How, by Whom, and for What Purpose Information Is to Be Used 9
3. Use of Automated, Electronic, Mechanical, or Other Technological Collection Techniques 17
4. Efforts to Avoid Duplication of Effort 18
5. Sensitivity to Burden on Small Entities 18
8. Federal Register Announcement and Consultation 19
9. Payment or Gift to Respondents 20
11. Additional Justification for Sensitive Questions 21
12. Estimates of Hour Burden 22
13. Estimate of Total Annual Cost Burden to Respondents or Record Keepers 22
14. Estimates of Annualized Cost to the Federal Government 22
15. Reasons for Program Changes or Adjustments 22
16. Plan for Tabulation and Publication and Schedule for Project 26
17. Approval Not to Display the Expiration Date for OMB Approval 31
18. Exception to the Certification Statement 31
Attachment A-1. Administrative Data Request 36
Attachment A-2. Student Fall Survey 38
Attachment A-3. Student Spring Survey 46
Attachment A-4. Consent Documents 54
Parent Information Letter and Consent Withholding Form 54
Attachment A-5. Instructional Log for Ramp-Up Workshop 58
Attachment A-6. Fall Staff Survey 64
Attachment A-7. Spring Staff Survey 73
Attachment A-8. Education Sciences Reform Act (ESRA) 92
Attachment A-9. Federal Register Notices 93
Attachment A-10. Confidentiality Form and Affidavits 94
Attachment A-11. Assumptions and Results for the Power Analyses 97
Attachment A-12. Components of Implementation Fidelity and Their Indicators 99
Attachment A-13. Rubric for Assessing Students’ Exposure to Ramp-Up 111
Figure
Tables
Table 1. Revised Data Collection Timeline 11
Table 3. Revised Estimates of Respondent Burden for Phases 1 and 2 of the Ramp-Up Evaluation Project 23
Table 4. Revised Estimates of Annualized Costs for Respondents Involved 25
Table 5. Description of Statistical Models for Confirmatory Analyses 27
Table 6. Description of Models for Exploratory Analyses. 28
Table 7. Schedule of Activities for Phase 1 and Phase 2 of Ramp-Up Evaluation 32
Table A-11.1 Minimum Detectable Effect Sizes Based on Numbers of Schools, Teachers, and Students 98
Table A-12.1 Components and Indicators Associated With Ramp-Up Implementation 99
Table A-13.1 Rubric for Assessing Students’ Exposure to Ramp-Up 111
Table A-14.1 Rubric for Comparing College-Readiness Supports in Treatment and Control Schools 114
The U.S. Department of Education (ED) requests clearance for a revision of a currently approved data collection under the Office of Management and Budget (OMB) clearance agreement (OMB 1850-0907) for activities related to the Regional Educational Laboratory (REL) program. For the previous clearance, ED, in consultation with American Institutes for Research (AIR), obtained approval to study the implementation of the Ramp-Up to Readiness Program (“Ramp-Up”) in Minnesota public schools. This revised application for clearance includes a second phase of the existing project involving examination of Ramp-Up’s impact. For this second phase, data will be collected from 54 additional schools in Minnesota and Wisconsin, resulting in data collection in a total of 76 schools.
Ramp-Up, developed by the College Readiness Consortium at the University of Minnesota, is a schoolwide guidance program that aims to increase students’ likelihood of college enrollment and completion by promoting multiple dimensions of college readiness (academic, admissions, financial, career, and personal and social). It is an intensive and comprehensive approach to college preparation (compared with many other college-access programs) in which all students within a school meet repeatedly with an advisor in groups over multiple years and receive detailed instruction and assistance related to dimensions of college readiness. Phase 1 of the project—the phase for which clearance has already been obtained—involves the gathering of data for an in-depth examination of the degree to which schools are able to implement Ramp-Up with fidelity and the contrast between Ramp-Up and other schools’ approaches to college readiness. The research questions being addressed by Phase 1 are:
RQ1. What are the characteristics of the student populations, geographical settings, and historical performance for the schools implementing Ramp-Up to Readiness?
RQ2. Among students enrolled in schools implementing Ramp-Up, how do students’ academic achievement, college enrollment actions, and college enrollment differ for students eligible versus not eligible to receive free or reduced price lunch and for students enrolled in rural versus nonrural high schools?
RQ3. To what extent do (a) schools implement the core components of Ramp-Up (i.e., structural supports, curriculum and tools, and professional development) as intended by the program developer, and (b) students in Ramp-Up schools receive the program exposure that the College Readiness Consortium believes is necessary to produce impacts?
RQ4. How does Ramp-Up differ from college-related supports (i.e., programs, services, activities, and resources) in schools not implementing Ramp-Up?
RQ5. What do school staff members (e.g., teachers, counselors, administrators) who are involved in implementing the Ramp-Up program perceive as the strengths and weaknesses of its curriculum, tools, and professional development? According to school staff, which aspects of Ramp-Up were more difficult to implement and why?
RQ6. To what extent are measures of personal readiness on ACT’s Engage survey (i.e., the Commitment to College and Goal Striving scales) valid? That is, to what extent do the Engage scales indicate concurrent and predictive validity within a high school sample?
Phase 2 of the project will involve gathering the same types of data from an expanded set of schools during another academic year (2014–15). Three data collection activities will not be continued in Phase 2 in order to lessen the burden on respondents and reduce the cost of the project.
Phase 2 of the study will contribute strong experimental evidence about the efficacy of Ramp-Up as a college-readiness intervention. Schools randomly assigned to implement Ramp-Up will be compared to high schools that offer other college-readiness activities, services, and supports.
The expanded project has the potential to inform policymakers who focus on K-12 education, policymakers who focus on postsecondary education, researchers, and practitioners more broadly. Despite significant state and federal interest in increasing students’ college readiness (e.g., Council of Chief State School Officers, 2010; U.S. Department of Education, 2010), there is little rigorous evidence on the effectiveness of college-readiness interventions (Tierney, Bailey, Constantine, Finkelstein, & Hurd, 2009). And although empirical support exists on the individual dimensions of Ramp-Up, the program as a whole has not been evaluated. Phase 2 of this project will contribute strong experimental evidence about the efficacy of the Ramp-Up program on key college-readiness outcomes. Moreover, the program involves a specific curriculum and tools that could be adopted more widely if the program is found to be effective. Ramp-Up’s program design and practices also may be of particular interest to practitioners and researchers because Ramp-Up aims to serve all students in a school rather than a select subgroup; it is a data-driven approach to assess and track students’ college preparation; and its group advisory approach may be more cost effective than a similarly intense one-on-one counseling approach. Findings from Phase 2 will be helpful for informing educators and policymakers about the impact of Ramp-Up on college readiness outcomes and informing the continued development and implementation of Ramp-Up and other college-readiness programs. This expanded project will build evidence on early outcomes of the intervention in the domains of college enrollment actions, personal readiness, and advanced coursework after one year of program exposure. It will not examine the program’s complete theory of action (specified in the Justification section, Figure 1, following). The confirmatory research questions (CRQs) are as follows:
CRQ1. What is the effect of Ramp-Up on the likelihood of Grade 12 students completing the FAFSA?
CRQ2. What is the effect of Ramp-Up on students’ personal college readiness for students in Grades 10, 11, and 12?
Phase 2 also will address two exploratory research questions (ERQs) to better understand the relationship between attending a Ramp-Up school and college readiness outcomes.1 First, according to the developer, the short duration of the Ramp-Up intervention in this study should impact more immediate outcomes (those listed in CRQ1 and CRQ2) but also may impact longer-term outcomes for which data will be available. The first exploratory question examines whether there is evidence that Ramp-Up produces those longer-term impacts.
ERQ1. What is the effect of Ramp-Up on three additional (longer-term) outcomes of interest: (a) enrollment in advanced coursework,2 after accounting for the number of advanced courses offered; (b) the likelihood of a student in Grade 11 taking the ACT or SAT exam;3 and (c) the likelihood of a student in Grade 12 submitting at least one college application?4
ERQ2. What is the effect of Ramp-Up on the two confirmatory and three exploratory outcomes for two subgroups of interest: (1) students who scored in the middle or upper third on Grade 8 standardized tests, and (2) students who are eligible for free or reduced-price lunch?5
These impact questions address the short-term effects of the program (after one year). CRQ1 and CRQ2 examine outcomes considered key to the success of Ramp-Up and that the program developers believe can be brought about within one year of implementation. The developers believe that these outcomes can be impacted after one year because prior research indicates that other interventions have impacted these outcomes in one year or less and Ramp-Up is a relatively intense approach to improving college readiness. ERQ1 and ERQ2 will provide additional exploratory information about the relationship between Ramp-Up and college readiness.
As is standard for ED-sponsored research projects, data from Phase 2 also will examine the implementation of Ramp-Up in the expanded set of schools. The implementation-related research questions (IRQs) are:
IRQ1. To what extent do Phase 2 schools implement the core components of Ramp-Up (i.e., structural supports, curriculum and tools, and professional development) as intended by the program developer?
IRQ2. Do students in Ramp-Up schools receive the amount of program exposure that the College Readiness Consortium believes is necessary to produce impacts?
IRQ3. How does Ramp-Up differ from college-related supports (i.e., programs, services, activities, and resources) in other schools?
IRQ4. What do school staff members (e.g., teachers, counselors, administrators) who are involved in implementing the Ramp-Up program perceive as the strengths and weaknesses of its curriculum, tools, and professional development? According to school staff, which aspects of Ramp-Up were more difficult to implement and why?
IRQ5. Is the degree of fidelity of implementation among schools in the early Ramp-Up group similar to that of schools that implemented Ramp-Up to Readiness during Phase 1 of this project?
Fifty-four high schools have submitted applications to the Consortium to implement the program during the next two academic years, with half of the high schools implementing Ramp-Up in the 2014–15 school year (early implementing schools) and the remaining schools delaying implementation by one year (later implementing schools). The Consortium has agreed to partner with REL Midwest and will allow REL Midwest to use a systematic random assignment process to determine which schools implement early and which schools implement later. The random assignment that occurs as part of Phase 2 will allow for estimation of program impacts on the early outcomes of the intervention.
Phase 2 will require the following data collections:
Existing student-level and school-level data gathered through requests for extant data from participating schools, school districts, the Minnesota Department of Education (MDE), and the Wisconsin Department of Public Instruction (WDPI)
An assessment of students’ personal college readiness (i.e., ACT’s ENGAGE® assessment for Grades 10–12) to be administered in fall 2014 and spring 2015
A student survey to collect information about students’ experiences with school college-readiness supports, which will be administered once in the fall of 2014 and again in spring 2015
Extant documents from the program developers, including schools’ participation in professional development and use of resources, to be collected in the fall of 2014 and spring 2015
Informed consent documents from (1) parents or guardians who want to withhold permission for their child(ren)’s data to be gathered for the project, and (2) school staff from whom survey data and instructional log data are to be collected
Instructional logs on which teachers implementing Ramp-Up during the 2014–15 school year record the activities conducted during advisory sessions and five workshops provided to students
A survey of school staff to collect information about college readiness activities in fall 2014
A survey of school staff in early implementing Ramp-Up schools to be administered in spring 2015
Nearly all of the data to be collected during Phase 2 of the project also are being collected during Phase 1. The exceptions are the baseline data collections that will occur in the fall of 2014 (i.e., administration of ACT’s ENGAGE to students, administration of the fall survey to students, administration of the fall survey to school staff, and the collection of administrative extant data from schools). The research team also will be collecting extant documents from the Consortium, which will be working with schools to implement the program.
Through this revision, ED is requesting expansion of the present clearance to include collection of these data from the 54 high schools that will implement Ramp-Up during the 2014–15 academic year or during the 2015–16 academic year. Phase 2 will not examine implementation during 2015–16, when the later implementing schools begin to implement Ramp-Up. ED believes that the data collections for which clearance is being requested represent the bare minimum necessary to assess the efficacy of Ramp-Up on short-term student outcomes.
Education stakeholders, including state and federal policymakers, have made college and career readiness one of the major goals of education reform (Council of Chief State School Officers, 2010; ED, 2010). Although many interventions have been developed to help students continue their education to the postsecondary level (e.g., Career Beginnings, Talent Search, Upward Bound), rigorous evidence on these programs’ effectiveness is limited and shows mixed impacts.6
Ramp-Up to Readiness attempts to amalgamate strategies for improving college readiness recommended in previous research (nonexperimental evidence) into 28 half-hour activities, five workshops, and professional development. Currently, 56 high schools in Minnesota are implementing Ramp-Up, and the developers intend to make the intervention available to a much larger set of Minnesota schools. Few impact studies have rigorously examined factors or interventions that influence improved college enrollment, persistence, or completion. This evaluation will contribute strong experimental evidence about the efficacy of a college-readiness intervention. This project will attempt to gather such evidence through the least burdensome means.
Phase 2 will examine 54 schools for one year. REL Midwest will randomly assign 27 of the schools to implement the program beginning in 2014–15. The other 27 schools will continue their present college-readiness activities and supports and will delay implementation of Ramp-Up until 2015–16.
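To make the assignment step concrete, the sketch below shows one way the 27/27 split could be generated. It is illustrative only and assumes simple unrestricted random assignment with a fixed seed for reproducibility; the actual procedure may block or pair schools on characteristics such as state or size, and the school identifiers, function name, and seed shown here are placeholders rather than the study team’s actual program.

import random

def assign_schools(school_ids, seed=2014):
    """Randomly split the applicant schools into early (2014-15) and later (2015-16) implementers."""
    rng = random.Random(seed)      # fixed seed so the assignment can be reproduced and audited
    shuffled = list(school_ids)    # copy so the original roster is left unchanged
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"early": sorted(shuffled[:half]),   # 27 schools implement Ramp-Up in 2014-15
            "later": sorted(shuffled[half:])}   # 27 schools delay implementation to 2015-16

# Placeholder identifiers for the 54 applicant schools
groups = assign_schools([f"school_{i:02d}" for i in range(1, 55)])
print(len(groups["early"]), len(groups["later"]))  # 27 27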
ED seeks to use impact findings from Phase 2 to inform educators and policymakers about the impact of Ramp-Up on students’ college readiness and to inform the continued development and implementation of Ramp-Up and other college-readiness programs more generally, as well as the impact of similar programs on early college success outcomes. This proposed evaluation will: (1) produce rigorous estimates of the program’s impact on completing the FAFSA and personal college readiness after one year of program implementation; (2) explore the relationship between attending a Ramp-Up high school and other short-term college-readiness outcomes, as well as personal college-readiness outcomes for a subgroup of students who qualify for free or reduced-price lunch; and (3) understand schools’ experiences with program implementation, including the degree to which schools implement the intervention with fidelity, and how the Ramp-Up intervention compares with college-readiness supports in other high schools.
ED requests clearance for the collection of data under the OMB clearance agreement (OMB number [IES to complete]) for activities related to the REL program.
Almost all graduating high school seniors (97 percent) plan to enroll in college (Berkner & Chavez, 1997), but they encounter obstacles on the path to college completion. A variety of interventions have emerged to support students as they make themselves ready for college, enroll in college, finance their college education, and complete college. However, evidence on the effectiveness of these programs is scarce (Tierney et al., 2009).
Ramp-Up is an intervention that aims to increase students’ college readiness and college success. The intervention consists of a guidance curriculum differentiated by grade level, a set of tools to help students set postsecondary goals and track progress toward their achievement, and professional development to support the curriculum and tools. It is a standardized program that was developed by the College Readiness Consortium at the University of Minnesota (“the Consortium”) over six years (2006–12) through extensive review of scholarly research, intensive engagement in Minnesota secondary schools, and formative program evaluation in partnership with University of Minnesota’s Center for Applied Research on Educational Improvement. To date, 56 Minnesota schools have implemented Ramp-Up or will be implementing Ramp-Up in 2014–15.7 The 54 schools participating in the impact portion of this evaluation will represent two additional waves of schools to implement Ramp-Up. One wave of 27 schools (the intervention schools) will implement Ramp-Up in 2014–15, and the other 27 schools (i.e., comparison schools) will delay implementation of Ramp-Up until the 2015–16 school year.
The logic model shown in Figure 1 illustrates the theoretical links between the key Ramp-Up program resources and activities (i.e., inputs), the five dimensions of college readiness (i.e., outputs), and high school and college outcomes (i.e., outcomes). According to the logic model that underlies the intervention, Ramp-Up is expected to increase students’ academic achievement, the likelihood that students enroll in advanced courses and complete key enrollment actions, and students’ personal readiness for college. The model indicates that improving those student outcomes in high school will produce better outcomes at the college level, such as an increased likelihood of going to college, decreased likelihood of remediation, and higher rates of college persistence.
Through its contractor for REL Midwest, ED is responding to requests of education stakeholders who have come together around the desired goal of improving college and career readiness of students (formally, the Midwest College and Career Success Research Alliance). These stakeholders believe that Ramp-Up is a promising intervention. The program incorporates research-based strategies within a single group of lessons and resources and has empirical support, based on correlational studies, for each of the Ramp-Up dimensions. Few impact studies have rigorously examined factors or interventions that lead to college readiness. Phase 2 of this evaluation will contribute strong experimental evidence about the efficacy of the program. ED and its contractor are authorized to conduct studies of this nature, with the expectation that any information to be collected from groups of nine or more people be justified as necessary for the overall program goals.
Figure 1. Ramp-Up to Readiness logic model: program inputs, outputs (the five dimensions of college readiness), and outcomes, including long-term college outcomes (greater likelihood of college enrollment, lesser likelihood of remediation, and greater likelihood of persisting in college)
Table 1 shows the revised timeline for all data collection activity in Phase 1 and Phase 2. It includes three data collections that do not require OMB clearance but which are included in the table to provide context for the study. Collecting administrative data from MDE and the WDPI does not require OMB clearance because providing data to researchers is part of staff’s regular practice at these organizations. In addition, the collection of extant documents from the program developers (data collection 5) does not require OMB clearance because it is the collection of existing documents without any modifications from the program developers. These documents are provided by Ramp-Up schools to the program developers as part of the Ramp-Up intervention.
The data collections that will be conducted by ED’s contractor for REL Midwest for Phase 1 and Phase 2 of the project are directly linked to the research questions of interest to members of the Midwest College and Career Success Alliance and to policymakers, practitioners, and education researchers. The alignment between data collection activities and the research questions for both project phases is summarized in Tables 1 and 2.
ED’s contractor’s proposed analytic models and procedures for the data collected during both phases of the project have been preapproved by the Institute of Education Sciences (IES). The contractor will summarize project findings in two technical reports (one for each phase) and two research briefs (one for each phase). The research briefs will condense each phase’s main findings into a practitioner-friendly document of 2-5 pages. These four reports will undergo review for quality and relevance by an external review contractor for the National Center for Education Evaluation and Regional Assistance (NCEE). After the reports have undergone IES review, findings will be disseminated to the relevant audiences. The stakeholder groups for whom the information is most important are:
State education agencies seeking strategies and programs to endorse as a potential means to improve students’ college readiness and college enrollment
Local education agencies that are considering adopting programs to improve students’ college readiness and enrollment rates and are considering Ramp-Up to Readiness as one possible option
The developer of this intervention (the Consortium) and developers of other college-readiness interventions that continually seek to improve their programs by using information from such studies because this study will reveal obstacles to implementation and provide information on the usefulness of a personal readiness assessment
Without the data to be collected in this study, local education agencies and schools will be unable to determine whether Ramp-Up produces impacts on students’ college readiness. The data also will enable educators to better understand the factors that facilitate or impede implementation of this or similar whole-school approaches to improving college readiness and the degree to which impact estimates are related to the schools’ implementation scores.
For policymakers, the project findings will help inform decisions on whether to adopt Ramp-Up and whether to fund the implementation of Ramp-Up in schools. Finally, the results of the proposed data collection activities will provide the College Readiness Consortium with diagnostic information on which components of Ramp-Up are being implemented well and which components require additional work.
The purposes of the data collection are described for each data instrument for which OMB approval is being sought:
Extant administrative school and student data from schools and districts
To answer the primary questions related to implementation and impact, REL Midwest will obtain extant student-level and school-level administrative data from schools and districts.8 OMB already has approved clearance for the collection of administrative data during the Spring of 2014. These data will address implementation-related questions posed for Phase 1 of the project.
Additional extant data will be requested from schools participating in Phase 2 of the project. In October 2014 and June 2015, the following student-level data will be requested by student’s state identification number (for matching with the state longitudinal data system):9 grade level; cumulative GPA; ACT and SAT scores; dates on which the student took the ACT and SAT; and student’s enrollment in advanced courses by term. These data will be collected directly from schools and districts because schools either do not report these student-level variables to the state or may have more reliable information.10 Also in October 2014, REL Midwest will request previous-year school-level data on average PLAN, ACT, and SAT scores; the percentage of students submitting a college application based on transcript requests for college applications; and the percentage of students enrolling in advanced coursework by grade and the number of advanced courses offered by term. The data collected in October 2014 will provide baseline information to be used as covariates in statistical models of impact, and the data collected in June 2015 will provide data for the outcomes examined in ERQ1–ERQ2.
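As an illustration of how the school- and district-provided records might be linked to state records by state student identification number, the sketch below performs a simple left join. The file names and column names (e.g., state_student_id) are placeholder assumptions rather than the actual data specification, and the real extracts would arrive through the secure file transfer process described in the next paragraph rather than as local CSV files.

import pandas as pd

# Placeholder file and column names; actual extracts are delivered via secure file transfer
school_data = pd.read_csv("district_extract.csv", dtype={"state_student_id": str})
state_data = pd.read_csv("state_extract.csv", dtype={"state_student_id": str})

# Left join keeps every student in the school extract and adds state variables where IDs match;
# validate= guards against duplicate identification numbers in either file
merged = school_data.merge(state_data, on="state_student_id", how="left",
                           indicator=True, validate="one_to_one")
print((merged["_merge"] == "left_only").sum(), "students with no matching state record")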
This request for expanded OMB clearance will cover collection of data elements presented in Attachment A-1. Administrative data will be acquired through secure file transfer protocols. These and all other data collected for this evaluation will be safeguarded through protocols approved by the contractor’s federally approved institutional review board, including adherence to Family Educational Rights and Privacy Act regulations.
Table 1. Revised Data Collection Timeline
Data Collection | Purpose | Schools Involved in Data Collection | Phase 1 (22 Schools): Spring 2014 | Phase 2 (54 Schools): Fall 2014 | Phase 2 (54 Schools): Spring 2015
1. Extant administrative school and student data from schools or districts | (Phase 2: CRQ1, CRQ2, ERQ1, ERQ2) | All Phase 1 and Phase 2 schools | X | X | X
2. Extant administrative student data from MDE, MOHE, and WDPI | (Phase 2: CRQ1, CRQ2, ERQ1, ERQ2) | All Phase 1 and Phase 2 schools | X | X | X
3. Student survey | (Phase 1: RQ3, RQ4; Phase 2: IRQ2, IRQ3) | All Phase 1 and Phase 2 schools | X | X | X
4. Student personal readiness assessment (i.e., ENGAGE) | | All Phase 1 and Phase 2 schools | X | X | X
5. Extant documents from schools | (Phase 1, RQ4) | Phase 1: All schools | X | |
6. Extant documents from program developers | (Phase 1, RQ3; Phase 2, IRQ1) | None | X | | X
7. Interviews of school staff | | Phase 1: All schools | X | |
8. Focus groups | (Phase 1, RQ3) | Phase 1: All schools | X | |
9. Teacher instructional logs from teachers in Ramp-Up schools for each of five workshops | (Phase 1: RQ3, RQ5; Phase 2: IRQ1, IRQ2, IRQ3) | Phase 1: Early implementing schools; Phase 2: Early implementing schools | X | X | X
10. Fall staff survey | | Phase 2: All schools | | X |
11. Spring staff survey | (Phase 1, RQ3; Phase 2: IRQ1, IRQ2, IRQ3, IRQ4, IRQ5) | Phase 1: Early implementing schools; Phase 2: Early implementing schools | X | | X
Table 2. Alignment Between Data Collection Activities and the Research Questions Underlying Both Phases of this Project.
Data Collection Activities | Phase 1: Implementation of Ramp-Up | Phase 2: Impact and Implementation of Ramp-Up
1. Extant administrative school and student data from schools or districts | RQ1, RQ2, RQ3, RQ6 | CRQ1, CRQ2, ERQ1, ERQ2, IRQ5
2. Extant administrative student data from MDE, MOHE, and WDPI | RQ1, RQ2 | CRQ1, CRQ2, ERQ1, ERQ2
3. Student survey | RQ3, RQ4 | CRQ1, ERQ2, IRQ2, IRQ3, IRQ5
4. Student personal readiness assessment (i.e., ENGAGE) | RQ6 | CRQ2, IRQ2
5. Extant documents from schools | RQ4 | None
6. Extant documents from program developers | RQ3 | IRQ1, IRQ5
7. Interviews of school staff | RQ4 | None
8. Focus groups | RQ3, RQ4, RQ5 | None
9. Teacher instructional logs from teachers in Ramp-Up schools for each of five workshops | RQ3, RQ5 | IRQ1, IRQ2, IRQ3, IRQ5
10. Fall staff survey | None | IRQ3
11. Spring staff survey | RQ3, RQ5 | IRQ1, IRQ2, IRQ3, IRQ4, IRQ5
Extant administrative school and student data from MDE and the WDPI
Collection of extant data from state education agencies does not require OMB approval because staff members at those agencies are expected to provide such data to researchers as part of their regular practices. However, these data collection activities are described here to indicate how such data will be used. During Phase 1 of the project, these state data will help REL Midwest describe the schools and students within the sample of 22 schools.
In October 2014, REL Midwest also will collect the following extant administrative data from MDE and WDPI: student-level demographic characteristics (e.g., race or ethnicity, gender, free or reduced-price lunch status, individualized education program [IEP] status, and English-language learner [ELL] status), student-level state standardized test scores, and school-level data from the previous year on high school graduation rates, average state standardized test scores, and the demographic composition of schools (e.g., the percentages of free or reduced-price lunch, African-American, and Latino students). School-level FAFSA completion rates from the previous year will be obtained from ED. These data will be used to describe the schools participating in the study and may be included as control variables in impact analyses. Most of the data to be collected from state education agencies for Phase 2 will serve as covariates in statistical models of impact or as variables for identifying student subgroups (i.e., they will help address CRQ1, CRQ2, ERQ1, and ERQ2).
A student survey on college readiness
OMB granted clearance to REL Midwest to collect survey data from students in grades 10-12 for the purposes of addressing questions during Phase 1 of the project. Students’ perceptions will help address questions on their exposure to Ramp-Up (RQ3) and the contrast between college readiness activities in Ramp-Up schools and non-Ramp-Up schools (RQ4).
For Phase 2, REL Midwest will gather information from students in early implementing schools (i.e., those implementing in 2014–15) and later implementing schools (i.e., those implementing in 2015–16) using a student survey. Thirty randomly selected students in grade 10, 30 randomly selected students in grade 11, and all students in Grade 12 from each school will take the survey in the fall. These same students will be asked to participate in the spring survey. Data from the 10- to 15-minute student surveys will provide the source of information for two outcomes: completion of the FAFSA and submission of a college application. As with the data collected from students in Phase 1 schools, the survey data will help the research team to better understand schools’ fidelity of implementation and the contrast between Ramp-Up and college-readiness supports offered in other schools. The fall survey will ask fewer questions than the spring survey, since the latter also records information about college readiness activities during the current school year. See Attachment A-2 and Attachment A-3 for the questionnaires and Attachment A-4 for the parental information letter and consent form.11
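A minimal sketch of the within-school sampling rule described above (30 randomly selected students in each of Grades 10 and 11, plus all Grade 12 students) is shown below. The roster structure, function name, and seed are illustrative assumptions, not the actual sampling program used by the research team.

import random

def select_survey_sample(rosters, n_per_grade=30, seed=2014):
    """rosters: dict mapping grade (10, 11, 12) to the list of student IDs for one school."""
    rng = random.Random(seed)
    sample = []
    for grade in (10, 11):
        students = rosters[grade]
        k = min(n_per_grade, len(students))   # take everyone if a grade has 30 or fewer students
        sample.extend(rng.sample(students, k))
    sample.extend(rosters[12])                # all Grade 12 students are surveyed
    return sample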
A student personal readiness assessment (ENGAGE)
OMB has provided clearance to gather personal readiness data from students (ACT’s ENGAGE assessment) during Phase 1. These data will help determine whether the ENGAGE scales have sufficient validity and reliability to be used as a Phase 2 outcome.
Clearance is requested to administer ACT’s ENGAGE assessment for Phase 2 during the fall and spring of the 2014–15 school year. The 30 students in each of Grades 10 and 11 within each school who are selected to complete the student survey also will complete the ENGAGE. The research team will randomly select 30 Grade 12 students from each school to complete the ENGAGE. ENGAGE measures student factors associated with academic success, such as motivation and skills, social engagement, and self-regulation. ENGAGE’s Grades 10–12 version has 108 items and 10 scales. Analysis for the impact study will focus on two of the 10 scales, Commitment to College and Goal Striving, which the program developers consider measures of personal college readiness.
The Commitment to College scale has 10 items that measure a student’s commitment to enrolling in and completing college, and ACT reports that the scale has good internal reliability (alpha = 0.89; ACT, 2012) when used with college student samples.12 The Goal Striving scale also consists of 10 items that measure the “strength of [a student’s] efforts to achieve [his or her] objectives and end goals” (ACT, 2012, p. 2).13 This scale, too, has good internal reliability (alpha = 0.87) with college student samples. Students’ scale scores on the two measures show a correlation of r = 0.60 (ACT, 2012). Because the ENGAGE assessment is proprietary to ACT, the specific questions cannot be provided in this OMB package.14
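For reference, the internal reliability coefficient reported above is presumably Cronbach’s alpha, which for a k-item scale (k = 10 for each of these scales) is

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_{i}^{2}}{\sigma_{X}^{2}}\right),

where \sigma_{i}^{2} is the variance of item i and \sigma_{X}^{2} is the variance of the total scale score.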
There is some validity information available on these two scales as well. The two scales have a moderate correlation with high school GPA (0.3 and 0.4 for goal striving and commitment to college, respectively [ACT, 2012]), and with college GPA (0.3 for both goal striving and commitment to college [Peterson, Casillas, & Robbins, 2006]). In addition, commitment to college predicts college retention at two- and four-year colleges controlling for institutional characteristics, student demographics, and prior academic achievement (Robbins, Allen, Casillas, Peterson, & Le, 2006).15 Farrington et al. (2012) concluded that interventions can impact academic perseverance, which relates to both scales.
Collection of Extant Documents from Schools
OMB provided clearance for ED’s contractor to collect extant documents or artifacts that document scheduled college readiness activities within Phase 1 schools. The collection of artifacts will not be continued for Phase 2 of the project.
As in Phase 1, ED’s contractor will continue to collect extant documents that indicate schools’ participation in Ramp-Up training from the program developer. This data collection activity does not require OMB clearance.
Interviews
For Phase 1, OMB provided clearance to conduct interviews with individuals from the 22 schools who are most knowledgeable about the schools’ current college readiness activities. ED’s contractor will not be collecting interview data as part of Phase 2.
Focus Groups
For Phase 1, OMB provided clearance to conduct focus groups in each of the 22 high schools with school leaders and counselors who understand the initiatives going on within the school to make students college-ready. Focus groups will not be necessary to address research questions for Phase 2 of the project.
Instructional logs
For Phase 1, OMB has provided clearance to administer brief, 10-minute instructional logs to teachers in the 22 schools following their final two Ramp-Up workshops.
For Phase 2, instructional logs from Ramp-Up advisors will be collected after each of the five workshops that advisors deliver to students. To better understand the extent to which students in the treatment schools receive the program as intended, Ramp-Up advisors will complete the logs that ask about the content, time, and quality of workshops (Attachment A-5). These instructional logs also ask teachers about the number and topics of weekly advisories that they have taught, the time devoted to the advisories, and the number of students who attend the advisories. Instructional logs will be collected from all teachers or school staff members (e.g., counselors) who deliver the workshops or a weekly advisory.16
A fall staff survey
Phase 1 of the Ramp-Up evaluation did not include a fall staff survey. Data from staff were not necessary for the Phase 1 study of Ramp-Up implementation.
For Phase 2, however, school staff in the 27 early implementing and 27 later implementing schools will be asked to complete a survey during October 2014. The survey will be conducted online and will take 20 to 30 minutes to complete. Survey items focus on the college-readiness supports offered in Ramp-Up schools and those in control schools (IRQ3). Teachers for Grades 10–12 and counselors will be asked about the formal or informal programs (e.g., Upward Bound), services (e.g., college counseling), activities (e.g., college tours), and resources (e.g., college software) available to students through their schools and designed to support college readiness. The survey also will include questions about teachers’ expectations for students’ postsecondary pathways. See Attachment A-6 for the survey.
A spring staff survey
OMB approved clearance for a 20-30 minute staff survey to be administered to school staff in the 11 Phase 1 schools that were implementing Ramp-Up in 2013-14. That survey was to be administered in the spring of 2014.
ED requests clearance to administer a similar 20-30 minute survey to staff in Phase 2 Ramp-Up schools during the spring of 2015. This survey will ask staff about their perceptions of the strengths and weaknesses of the Ramp-Up program’s curriculum, tools, and professional development. It also will gather information about whether school staff implemented the intervention as intended. Surveys will be administered to members of schools’ Ramp-Up leadership team, the Ramp-Up coordinator, and teachers for Grades 10–12 (who presumably deliver Ramp-Up advisories). The surveys will include questions with scaled responses, as well as two open-ended questions asking about the strengths and the weaknesses of Ramp-Up (see Attachment A-7). Results from administration of the survey to Phase 1 schools will provide useful information about whether any revisions should be made to the survey for Phase 2.
After data are analyzed and summarized, ED’s contractor will sanitize the data files of any information that can be linked to individual students, teachers, schools, or districts. Data files will then be submitted to IES and made available to other researchers as restricted-use files.
The data collection plan for Phase 1 and Phase 2 reflects sensitivity to issues of efficiency, accuracy, and respondent burden. To address the study’s research questions, the contractor will collect data using electronic data collection tools when possible. The electronic tools include the following:
A secure electronic file transfer protocol site that allows MDE, WDPI, schools, and districts to transfer administrative records to ED’s contractor in an efficient manner
Online data collection tools (e.g., Vovici) that allow for the secure collection of instructional logs from teachers in early Ramp-Up schools. The tool gives respondents the opportunity to complete the logs during noninstructional hours and eliminates the need for third-party data entry
An online data collection tool (e.g., Vovici) that allows for the secure collection of survey data from staff in early implementing and later implementing Ramp-Up schools. The tool gives respondents the opportunity to complete the survey during noninstructional hours and eliminates the need for data entry.
E-mail systems maintained by schools or districts and the contractor that allow for transfer of electronic documents (.docx, .xlsx, or .pdf files) rather than printed copies of documents.
An electronic data collection system used by ACT to administer the personal readiness assessment (in conjunction with the student survey). The use of ACT’s system also eliminates the need for data entry.
To the extent possible, the two phases of this project will rely on extant administrative data that are available on students, teachers, schools, or programs, rather than asking individuals to provide the data for study purposes. While other studies have examined college-readiness programs, Ramp-Up takes a relatively uncommon approach to improving college readiness by involving all teachers within a middle or high school in the presentation of program content to all students. Phase 2 of the project (the investigation of program impact) will build on Phase 1 (the investigation into implementation of Ramp-Up). Phase 2, for which OMB clearance is requested, will yield unique data necessary to estimate the impact of the Ramp-Up program and to interpret the impact study findings. No other systematic effort has been made or is currently under way to collect such information, and there is no alternative source of this information.
It is likely that one or more of the 54 schools that participate in Phase 2 will be small (possibly serving Grades 7–12 in one building, with 30 or fewer students per high school grade). The contractor has developed its data collection plan with this assumption and intentionally has capped the collection of instructional logs at no more than five per teacher and the length of time to complete each log at 10 minutes. Further, five of the seven data collections requiring OMB clearance (i.e., extant administrative school and student data, instructional logs from teachers, the online survey of school staff, and the student survey and personal readiness assessment) will be electronic to reduce the length of time it takes respondents to comply. These data collection activities represent the absolute minimum amount of information required to meet the study objectives.
The Education Sciences Reform Act of 2002 states that the central mission and primary function of the RELs include supporting applied research and providing technical assistance to state and local education agencies within their region (ESRA, Part D, section 174[f]; see Attachment A-9 for the text). Failure to approve the data collections related to Phase 2 (the investigation of Ramp-Up’s impact) will jeopardize this attempt to study the intervention and thereby prevent the REL Midwest contractor from fulfilling its mission.
This project also has the potential to inform researchers, practitioners, and policymakers more broadly. Ramp-Up’s program design and practices may be of particular interest for several reasons: (1) Ramp-Up aims to serve all students in a school rather than a select subgroup; (2) it is a data-driven approach to assess and track students’ college preparation; (3) it incorporates the practices recommended by ED’s What Works Clearinghouse for college preparation programs (see Tierney et al., 2009 for the recommended practices); and (4) its group advisory approach may be more cost effective than a similarly intense one-on-one counseling approach. Findings from Phase 2 will inform practitioners about the efficacy of Ramp-Up as a whole-school reform. Without this study, practitioners and policymakers will have less information on which to base decisions about adopting whole-school college-readiness interventions.
This request for OMB clearance does not include any of the stipulated special circumstances and thereby fully complies with regulations.
Federal Register Announcement
A 60-day notice will be published in the Federal Register, providing an opportunity for public comment. A 30-day notice will be published to further solicit comments. No public comments have been received to date.
Consultations Outside the Agency
ED and its REL Midwest contractor have consulted with the following groups on the availability of data, the soundness of the evaluation design for addressing the evaluation questions, and the clarity of measures:
A technical working group (TWG) made up of experts in research methodology and REL Midwest’s core areas of emphasis, which was assembled by the REL Midwest contractor: The TWG met on October 23, 2012, to discuss the Ramp-Up to Readiness program, the evaluation methodology, and measures. The contractor was required to submit to ED the TWG comments and the contractor’s plan for addressing those comments.
Former educators and staff within the REL Midwest contractor (i.e., AIR) with content and technical expertise in online surveys and instructional logs: These reviewers examined the instruments, interview questions, and focus group protocols for clarity of wording, for loadedness of questions (i.e., whether questions are written to elicit only one type of response), and for appropriateness of response options.
An external review contractor to examine the reasonableness of the logic model underlying the intervention (whether it is reasonable to expect that the intervention is capable of producing impacts), the analytic approach for determining fidelity of implementation, and the degree to which findings address the research questions and conclusions are supported by the data. The external review contractor has recommended the project plan for approval, and the REL Contracting Officer’s Representative gave approval in May of 2014.
As with Phase 1, Phase 2 will require a $1,500 incentive to be given to each school that participates in the project but delays implementation until the 2015–16 school year. Schools in this group may feel discouraged by the results of the random assignment, and the promise of delayed implementation (after the study period has ended) may not be sufficient incentive to continue participation in this study, which requires data collection prior to implementing the program (for the later implementing schools). To prevent attrition among the later implementing schools (which would jeopardize the validity of the study), the REL Midwest contractor will offer these schools a single payment of $1,500 at the end of the study year. This amount was determined by consulting NCEE’s “Proposed Incentives and Payments,” which suggests annual payments of $2,500 to control schools. Because the data burden on schools, teachers, and students in this study is lower than in other types of studies, the amount was reduced in what was deemed a commensurate manner.
As is being done for Phase 1, the REL Midwest contractor also will offer teachers and other school staff participating in Phase 2 (i.e., administrators and counselors) in both early and later implementing schools a $25 Amazon.com gift card for each data collection activity that they perform. For some teachers in early implementing Ramp-Up schools, the total might amount to $75 in gift cards ($25 for completing all instructional logs, $25 for the fall staff survey, and $25 for the spring staff survey). School staff members in the later implementing group will not be asked to complete instructional logs or the spring staff survey and will receive a $25 gift card for participating in the fall survey (fall 2014). The monetary amount of the gift cards was determined by the average salary of Minnesota teachers. When the average salary is converted into an average hourly rate, the result is approximately $25 per hour. Teachers in the early implementing Ramp-Up schools can anticipate spending no more than four hours completing the fall and spring online surveys and instructional logs. Other school staff members, such as administrators and counselors, participating in the fall and spring can anticipate spending no more than one hour on data collection activities.
ED’s contractor for REL Midwest will follow the policies and procedures required by ESRA of 2002, Title I, Part E, Section 183. This requires “All collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act” (20 U.S.C. 1232g, 1232h). These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.
In addition, for student information, ESRA states:
The Director shall ensure that all individually identifiable information about students, their academic achievements, their families, and information with respect to individual schools, shall remain confidential in accordance with section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act.
Subsection (c) of section 183 requires the director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.”
Subsection (d) of section 183 prohibits disclosure of individually identifiable information as well as making the publishing or communicating of individually identifiable information by employees or staff members a felony.
Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific school, district, or individual. Any willful disclosure of such information for nonstatistical purposes, except as required by law, is a class E felony.
The contractor for REL Midwest will protect the confidentiality of all information collected during both project phases and will use the information for research purposes only. To protect confidential data, only the contractor’s data management staff, investigators, and research staff will have access to the data files on a “need-to-know” basis. Any identifiable variables, raw data, or derived variables will be stored in encrypted files on a secure data management site. Access to this site will be limited to staff assigned to the project. Any data obtained for this study will be used only for statistical and descriptive analyses. All identifiers will be destroyed as soon as they are no longer required. Study reports will not identify the name of any specific analysis unit (e.g., students, school staff members, or schools). In no case will information be reported when the total number for a quantity represents fewer than four cases. Moreover, any data that permit identity disclosure, when used in combination with other known data, will not be published or made available in restricted-use files.
All members of the study team have obtained their certification on the protection of human subjects in research, and REL Midwest staff members also have obtained federal security clearances. The REL project team will submit to the NCEE security officer a list of the names of all people who will have access to respondents and data. All staff members working on the project who have access to the data or to respondents will be required to sign a confidentiality pledge and affidavits of nondisclosure (see copies of the forms in Attachment A-11; ED will obtain the appropriate signatures). The project team will track new staff and staff who have left the project and ensure that additional signatures will be obtained or clearances will be revoked.
Respondents to the instructional logs and surveys will be informed of the voluntary nature of the data collection and the confidentiality provision.
No questions of a highly sensitive nature appear in any instrument, including the protocols for focus groups, the instructional logs, and the surveys. In addition, participants will be informed that their responses are voluntary, and they may decline to answer any question.
OMB clearance for Phase 1 was granted for 6,086 responses and 1,212 burden hours annually. The addition of 54 schools for the examination of program impact (Phase 2) increases the totals to 21,573 responses and 6,059 burden hours annually (see Table 3). For each data collection, the burden was estimated from the contractor’s performance of similar collections and from the time needed to complete data collections in Phase 1. To be conservative, the burden estimates assume response rates of 100 percent.
The total cost to respondents is estimated to be $128,199 (Table 4). The annualized cost across three years is $42,733. On average, the burden for completing the various pieces of data collection will be 0.29 hours per respondent (i.e., the total hours divided by the total number of respondents).
The total cost to the federal government for the contractor’s activities for Phase 1 and Phase 2 is $1,639,113. The annualized cost is $546,371. The costs cited in the approved clearance for Phase 1 were $862,544 total and $287,515 for each year.
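As a check on the figures above, the annualized amounts are the project totals divided by the three years of the clearance period, and the average per-respondent burden is total burden hours divided by total respondents (using the totals in Table 3):

$128,199 / 3 years = $42,733 per year
$1,639,113 / 3 years = $546,371 per year
18,178 total burden hours / 62,711 total respondents ≈ 0.29 hours per respondent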
ED and its contractor want the evaluation of Ramp-Up to Readiness to examine not just implementation of the program but its impact as well. Expansion of the current evaluation to examine questions of impact (i.e., Phase 2) requires random assignment and the collection of data in 54 additional schools in order to detect impacts similar to those found in other studies. The data collections for Phase 2 are essentially the same as those already approved for Phase 1 (OMB clearance 1850-0907, expiration date 4/30/2017), with a few exceptions: focus groups will not be held and interview data will not be collected for Phase 2 as they were for Phase 1, and a new fall staff survey will be added for Phase 2. The data collected from the 54 new schools being added in Phase 2 increase the number of responses annually by 15,487 (a program change) over the currently approved responses from Phase 1. Phase 2 generates a burden hour increase (a program change) of 4,839. There is also a reduction of 8 burden hours annually (considered an adjustment) because the administrative data request from Phase 1 was found to take school administrators 1.5 hours rather than the 2 hours originally estimated for each respondent.
Table 3. Revised Estimates of Respondent Burden for Phases 1 and 2 of the Ramp-Up Evaluation Project
Instrument | Person Incurring Burden | Number of Respondents | Responses per Respondent | Hours per Response | Total Burden (Hours)
1. Staff consent forms | School staff | 1,701 | 1 | 0.08 | 136
Explanation: Adding 54 schools to the sample being studied will require consent from 1,215 more staff members (present clearance for Phase 1 is for 486 staff members). The number of additional respondents is based on:
2. Parent information letter and consent withholding form | Parent | 33,744 | 1 | 0.08 | 2,700
Explanation: Adding 54 schools requires informing 23,976 additional parents or guardians of students in Grades 10–12 of the study and allowing them to withhold consent for their child to be in the study. The extra parental information and consent forms represent 148 students per grade, totaling 444 parents per school. Present clearance for Phase 1 is for 9,768 parents/guardians to receive the letter and form.
3. Student personal readiness assessment (ENGAGE) | Student | 6,840 | 1 (Phase 1); 2 (Phase 2) | 0.50 | 5,850
Explanation: With 54 more schools for Phase 2, 90 students per additional school will complete the personal readiness assessment twice (in the fall and spring of the 2014–15 school year). Thus, 4,860 more students will respond to the ENGAGE than planned for the Phase 1 investigation alone (present clearance is for 1,980 students, each completing the assessment once).
4. Student survey | Student | 15,808 | 1 (Phase 1); 2 (Phase 2) | 0.25 | 6,760
Explanation: The addition of 208 students for each of the 54 schools (30 in Grade 10, 30 in Grade 11, and an estimated 148 in Grade 12) results in a total of 11,232 additional students, and each of these students will take the survey twice. As a result, Phase 2 increases the number of respondents to 15,808 and the time burden to 6,760 hours. Previously, the number of students was 4,576 and the time burden was 1,144 hours.
5. Administrative data request | School administrator | 76 | 2 | 1.5 | 228
Explanation: An additional 54 school staff members will need to gather extant student-level and school-level data (one per school). The total number of respondents then is 76 (22 for Phase 1, 54 for Phase 2). The time burden associated with each data extraction is approximately 90 minutes per staff member: 30 minutes for a phone conversation with project staff about the data request and one hour to extract the data. The previous burden estimate of 88 hours for 22 schools was based on the assumption that fulfillment of the data request would require 2 hours per request. Information obtained from school administrators who are presently fulfilling this request for Phase 1 indicates that each request requires only 1.5 hours to fulfill.
6. March interviews | School staff | 22 | 1 | 1.0 | 22
Explanation: The time burden estimate for interviews is for Phase 1 only. Interviews will not be conducted as part of Phase 2.
7. Extant document collection request | School administrator | 22 | 2 | 0.5 | 22
Explanation: The time burden estimate for the extant document request from schools is for Phase 1 only. Extant documents will not be requested from schools as part of Phase 2.
8. Instructional logs | Teachers | 776 | 2 (Phase 1); 5 (Phase 2) | 0.17 | 457
Explanation: 378 additional teachers (14 randomly selected teachers from each of the 27 early implementing schools) will complete an instructional log for each of the five workshops conducted during Phase 2. Teachers will be randomly selected to complete the instructional logs as a means of minimizing burden while maintaining the project team’s ability to detect statistical relationships. The previous time burden estimate of 135.2 hours was based on the completion of just 2 logs by 398 teachers from the 11 early implementing Phase 1 schools.
9. May focus group | School staff | 132 | 1 | 1.5 | 198
Explanation: The time burden estimate for focus groups is for Phase 1 only. Focus groups will not be conducted as part of Phase 2.
10. Fall staff survey | School staff | 2,106 | 1 | 0.50 | 1,053
Explanation: 1,053 staff members from early implementing schools (972 teachers and 81 nonteaching staff) and 1,053 staff members from later implementing schools will complete the fall survey. This data collection activity was not proposed as part of ED’s application for OMB clearance for Phase 1.
11. Spring staff survey | School staff | 1,484 | 1 | 0.50 | 742
Explanation: Adding 54 schools for Phase 2 results in 1,053 additional respondents from the 27 new early implementing schools (972 teachers and 81 nonteaching staff) to complete the spring survey. This is in addition to the 431 school staff members in Phase 1 schools for whom clearance has already been obtained.
TOTALS | | 62,711 | | | 18,178
1There is an estimated average of 36 teachers per school (based on the average within Minnesota public high schools from the Common Core of Data).
2This is the average number of 12th graders among schools that had expressed interest in Ramp-Up by January 2012.
3The total number of respondents in this table is the sum of the number of respondents for each data collection activity. Because some individuals will participate in more than one data collection activity, the total number of respondents listed here exceeds the total number of individuals from whom data will be collected.
Note. The hours per response was rounded to the second decimal place for display only. Therefore, the total burden may not equal the product of the displayed hours per response, number of respondents, and number of respondents.
Table 4. Revised Estimates of Annualized Costs for Respondents Involved

Tasks | Type of Respondent | Total Burden Hours (Phase 1) | Total Burden Hours (Phases 1 & 2) | Hourly Wage Rate¹ | Total Monetary Burden Costs | Annualized Burden Costs
Staff consent form | School staff | 38.9 | 136 | $24 | $3,264 | $1,088
Parent information letter with consent form | Parent or guardian | 781.4 | 2,700 | $22 | $59,389 | $19,796
Student personal readiness assessment | Students | 990 | 5,850 | $0 | $0 | $0
Student surveys | Students | 1,144 | 6,760 | $0 | $0 | $0
Extant administrative student and school data collection | School administrator | 88 | 228 | $24 | $5,472 | $1,824
Interviews | School staff | 22 | 22 | $24 | $528 | $176
Extant document collection | School administrator | 22 | 22 | $24 | $528 | $176
Instructional logs | Teachers | 135.2 | 457 | $24 | $10,968 | $3,656
Focus groups | School staff | 198 | 198 | $24 | $4,752 | $1,584
Fall staff survey | School staff | -- | 1,053 | $24 | $25,272 | $8,424
Spring staff survey | School staff | 215.5 | 742 | $24 | $17,808 | $5,936
TOTAL | | 3,635 | 18,178 | | $128,199 | $42,733
¹ The hourly wage rates for parents and school staff are based on mean wage rates in Minnesota reported by the Bureau of Labor Statistics (2013). For parents, the overall mean wage rate in Minnesota is used ($22.42), and for school staff, the mean wage for education, training, and library occupations is used ($24.37). Because students will take the survey and assessment during school hours, it is assumed that no costs will result from students participating in the data collection.
Note. The total burden hours and wage rates were rounded for display only. Therefore, the total monetary cost may not equal the product of the displayed burden hours and the wage rate.
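As an illustration of how Table 4 is built, the total monetary burden cost for each task is the burden hours multiplied by the hourly wage rate, and the annualized cost spreads that amount over the three years of the clearance. The sketch below is illustrative only; it reproduces three rows whose displayed figures are unaffected by the rounding described in the table note.

```python
# Illustrative recomputation of three Table 4 rows: total monetary burden cost
# equals burden hours times the hourly wage rate, and the annualized cost
# divides that total across the three years of the clearance. Figures are
# copied from Table 4.
line_items = [
    # (task, burden hours for Phases 1 & 2, hourly wage rate in dollars)
    ("Staff consent form", 136, 24),
    ("Instructional logs", 457, 24),
    ("Fall staff survey", 1_053, 24),
]
for task, hours, wage in line_items:
    total_cost = hours * wage
    print(f"{task}: total cost ${total_cost:,}, annualized ${round(total_cost / 3):,}")
```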
All results for REL studies are made available to the public through peer-reviewed reports that are published by IES. The data sets from these studies will be turned over to REL’s IES project officer. These data may become IES restricted-use data sets requiring a user’s license that is applied for through the same process as National Center for Education Statistics restricted-use data sets (see http://nces.ed.gov/pubs96/96860rev.pdf for procedures related to obtaining and using restricted-use data sets). The REL contractor also would be required to obtain a restricted-use license to conduct any work with the data beyond the original report.
The evaluation team will conduct confirmatory and exploratory analyses beginning in summer 2015. The confirmatory analyses are adequately powered to detect differences of 5 percentage points in FAFSA completion and of 0.17 standard deviations in the measures of personal readiness. The exploratory analyses have not been powered to detect effects. See Attachment A-11 for the power analyses.
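The assumptions underlying these figures are documented in Attachment A-11. For orientation only, the sketch below applies a standard minimum detectable effect size (MDES) approximation for a two-level cluster-randomized design; the parameter values (intraclass correlation, covariate R-squared values, and school and student counts) are hypothetical stand-ins, not the values used in the attachment.

```python
import math

# Hypothetical parameter values for illustration only; the study's actual
# power assumptions are documented in Attachment A-11.
J = 54            # schools randomized (assumes only the Phase 2 schools)
n = 90            # students per school contributing to a continuous outcome
P = 0.5           # proportion of schools assigned to Ramp-Up
icc = 0.10        # assumed intraclass correlation
R2_school = 0.60  # assumed outcome variance explained by school-level covariates
R2_student = 0.30 # assumed outcome variance explained by student-level covariates
M = 2.8           # multiplier for 80 percent power, two-tailed alpha = .05

mdes = M * math.sqrt(
    icc * (1 - R2_school) / (P * (1 - P) * J)
    + (1 - icc) * (1 - R2_student) / (P * (1 - P) * J * n)
)
# With these hypothetical inputs, the MDES works out to roughly 0.17 standard deviations.
print(f"Illustrative MDES: {mdes:.2f} standard deviations")
```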
Confirmatory analyses will assess the effects of the Ramp-Up program on measures in two outcome domains: enrollment actions and personal readiness. Corresponding to CRQ1 and CRQ2, three intent-to-treat impact models will be estimated. Table 5 shows the variables to be included in each hierarchical linear model.
Each confirmatory model will be a two-level nested model (students nested within schools). The models will assume a constant treatment effect across blocks and include a dummy variable for each block at Level 2; the block variable indicates the stratum within which schools were randomly assigned. The treatment indicator will be included at the school level to indicate whether a student attended a Ramp-Up school. At Level 2, the models also will include a prior school-level measure of the dependent variable (e.g., the prior high school FAFSA completion rate in the model predicting FAFSA completion). For the binary outcome (FAFSA completion), a logit link function will be used to model the log-odds of achieving the outcome.
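In schematic form (the notation below is illustrative and not drawn verbatim from the study’s analysis plan), the CRQ1 model can be written as:

Level 1 (student $i$ in school $j$): $\operatorname{logit}\Pr(Y_{ij} = 1) = \beta_{0j} + \sum_{k}\beta_{k}X_{kij}$

Level 2 (school $j$): $\beta_{0j} = \gamma_{00} + \gamma_{01}T_{j} + \gamma_{02}\bar{Y}^{\text{prior}}_{j} + \sum_{b}\delta_{b}B_{bj} + u_{0j}, \qquad u_{0j} \sim N(0, \tau^{2})$

Here $T_{j}$ indicates assignment to Ramp-Up, $B_{bj}$ are the block (stratum) dummies, $\bar{Y}^{\text{prior}}_{j}$ is the prior school-level measure of the outcome, and $X_{kij}$ are the student-level covariates listed in Table 5. For the continuous outcomes in CRQ2a and CRQ2b, the identity link replaces the logit.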
Table 5. Description of Statistical Models for Confirmatory Analyses

Model | Dependent Variable | Student Grade Level in Fall | Level-1 (Student Level) Covariates | Level-2 (School Level) Covariates
CRQ1 | Submitted the FAFSA (binary) | 12 | Indicators of race or ethnicity, gender, free or reduced-price lunch status, IEP status, ELL status, state standardized mathematics and reading test score composite, EXPLORE or PLAN scores, GPA | Indicator of treatment status, indicator of block membership, percentage of students completing the FAFSA in 2013–14
CRQ2a | Commitment to College (continuous) | 10, 11, and 12 | Indicators of race/ethnicity, gender, FRPL status, IEP status, EL status, state standardized math and reading test score composite, EXPLORE or PLAN scores, GPA, commitment to college score from fall 2014–15, grade level | Indicator of treatment status, indicator of block membership, average commitment to college score for 10th–12th graders from fall 2014–15
CRQ2b | Goal striving (continuous) | 10, 11, and 12 | Indicators of race/ethnicity, gender, FRPL status, IEP status, EL status, state standardized math and reading test score composite, EXPLORE or PLAN scores, GPA, goal striving score from fall 2014–15, grade level | Indicator of treatment status, indicator of block membership, average goal striving score for 10th–12th graders from fall 2014–15
Phase 2 also will include exploratory analyses to better understand impacts on longer-term outcomes and impacts for specific subgroups (ERQ1 and ERQ2). Analyses for ERQ1 will examine whether Ramp-Up has an impact on three longer-term outcomes: (a) enrollment in advanced coursework, (b) the likelihood of taking the ACT or SAT, and (c) the likelihood of submitting at least one college application. ERQ2 examines whether impacts differ for (a) students in the upper and middle tertiles of achievement on Grade 8 tests and (b) students with different FRPL statuses. The analytic models used to address the exploratory research questions are similar to those for the confirmatory research questions, differing only in the outcome examined and the specific subgroups on which the models are estimated (see Table 6).
The outcomes in ERQ1 are classified as exploratory rather than confirmatory because some schools may not offer students the opportunity to take advanced courses, some schools require all students to complete a college application (whether the participating schools do so is not yet known), and some schools or states (i.e., Wisconsin) require all students to take the ACT. Because these analyses may include only a subgroup of the randomly assigned schools, they can only be exploratory. Unlike the confirmatory analyses, these exploratory analyses are not designed to evaluate the intervention per se; rather, they attempt to show how the program may have affected outcomes not considered central to its success and how impacts differ for student subgroups of interest. Table 6 describes the models for the exploratory analyses, and the text following the table describes them in more detail.
Table 6. Description of Models for Exploratory Analyses

Model | Dependent Variable | Student Subgroup | Level-1 Covariates (Student Level) | Level-2 Covariates (School Level)
ERQ1a | Enrolled in at least one advanced course (binary) | Students in Grades 10, 11, and 12 | Students’ race/ethnicity, gender, FRPL status, IEP status, EL status, average academic achievement test score(s)ᵃ, EXPLORE and PLAN scoresᵇ, GPA | Number of advanced courses offered in 2013–14, block membership, percentage of 10th–12th graders completing an advanced course in 2013–14
ERQ1b | Took the ACT or SAT (binary) | Students in Grades 10, 11, or 12 | Students’ race/ethnicity, gender, FRPL status, IEP status, EL status, average academic achievement test score(s)ᵃ, EXPLORE and PLAN scoresᵇ, GPA | Block membership, percentage of students taking the ACT or SAT in 2013–14
ERQ1c | Submitted a college application (binary) | Students in Grade 12 | Students’ race/ethnicity, gender, FRPL status, IEP status, EL status, average academic achievement test score(s)ᵃ, EXPLORE and PLAN scoresᵇ, GPA | Block membership, percentage of 11th graders submitting a college application in 2013–14
ERQ2 | Outcomes listed in CRQ1, CRQ2, and ERQ1 | Students in (a) the middle third on Grade 8 tests, (b) the upper third on Grade 8 tests, and (c) students eligible for FRPL | Students’ race/ethnicity, gender, FRPL status, IEP status, EL status, average academic achievement test score(s)ᵃ, EXPLORE and PLAN scoresᵇ, GPA | Block membership; school percentages of (1) students taking advanced courses in 2013–14, (2) students who completed the ACT or SAT in 2013–14, and (3) students who submitted a college application in 2013–14
Notes: FRPL is eligibility for free or reduced-price lunch (a proxy for family poverty); IEP is individualized education plan (an indicator of whether a student receives special education services); EL status is English learner status (an indicator of whether a student is an English learner, i.e., not a native speaker of English); GPA is grade point average.
ᵃ To combine state standardized test scores, each score will be standardized (based on the statewide mean and standard deviation), the two standardized scores will be summed, and the sum will be standardized.
ᵇ For models including Grades 10–12, PLAN scores (for 11th and 12th graders) will be combined with EXPLORE scores (for 10th graders) by first standardizing the scores separately, then summing them, and finally standardizing the sum.
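A minimal sketch of the composite construction described in footnotes a and b follows, assuming the raw scores and the relevant means and standard deviations are available; the function and variable names are illustrative, not the study’s actual code.

```python
import statistics

# Minimal sketch of the achievement composite: standardize each test score
# against its statewide mean and SD, sum the standardized scores, then
# standardize the sum (here, against the analysis sample of summed scores).
def standardize(scores, mean, sd):
    return [(s - mean) / sd for s in scores]

def achievement_composite(math_scores, reading_scores,
                          math_mean, math_sd, reading_mean, reading_sd):
    z_math = standardize(math_scores, math_mean, math_sd)
    z_reading = standardize(reading_scores, reading_mean, reading_sd)
    sums = [m + r for m, r in zip(z_math, z_reading)]
    return standardize(sums, statistics.mean(sums), statistics.pstdev(sums))
```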
Structural supports. For Ramp-Up to increase the likelihood of students enrolling and succeeding in college, school leaders need to establish the necessary structural supports for the program. These supports include: establishing a Ramp-Up leadership team; appointing a Ramp-Up coordinator; obtaining the active participation of faculty (including having them lead advisories); establishing advanced courses; providing the opportunity and time for professional development, coordination, and preparation related to Ramp-Up; and implementing a technology platform for students, staff, and parents to access or store college-related information (e.g., the Postsecondary Plan and Readiness Rubric).
Professional development. Ramp-Up involves professional development for the leadership team and the coordinator, who receive eight hours and four hours of training, respectively, from the College Readiness Consortium before the school year begins. The leadership team and coordinator then train school staff during a four-hour session at the beginning of the school year and in 20-minute sessions each month.
Curriculum delivery. The College Readiness Consortium requires that students receive 28 weekly lessons lasting 30 minutes each and participate in five workshops lasting one hour. For teachers to lead advisories and workshops effectively, they need access to curriculum materials and sufficient information about the college enrollment process to deliver the content.
Curriculum content. Ramp-Up addresses five dimensions of college readiness: academic readiness, admissions readiness, career readiness, financial readiness, and personal and social readiness.
Postsecondary planning tools. Teachers use the Postsecondary Plan and the Readiness Rubric to assist students in developing realistic postsecondary plans for achieving students’ educational and career aspirations. Teachers share information from these tools with parents in two-way communication.
Multiple indicators of each implementation component are embedded in the student and staff surveys, extant documents and data, and instructional logs. Across the implementation components, REL Midwest has identified 80 indicators (see Attachment A-12). After completing the data-coding procedures (see section C.1), a two-step process will be followed to create an implementation index for each early-implementing school. First, for every school, a score will be calculated for each component of implementation. To calculate a component score, a school-level score for each indicator will be created,¹⁷ and the school-level indicator scores will be averaged within the component. To illustrate variation in implementation across the components, the report will present the range, average, and standard deviation of the component scores across schools. Second, the school-level component scores will be averaged to create the fidelity index.¹⁸ The College Readiness Consortium will help REL Midwest establish a cut-point on the fidelity index that signifies implementation adequate for improving the college readiness of students. This cut-point will be established prior to the collection of data. The report will indicate the distribution of the fidelity index across early-implementing schools (see Table 9).
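A minimal sketch of this two-step calculation follows, assuming the indicator-level data have already been coded to school-level scores; the data layout and names are hypothetical.

```python
# Minimal sketch of the two-step fidelity index for a single school, assuming
# indicator-level data have already been coded to school-level scores.
def fidelity_index(indicator_scores_by_component):
    """indicator_scores_by_component maps component name -> list of school-level indicator scores."""
    component_scores = {
        component: sum(scores) / len(scores)  # Step 1: average indicators within each component
        for component, scores in indicator_scores_by_component.items()
    }
    # Step 2: average the component scores to form the school's fidelity index.
    index = sum(component_scores.values()) / len(component_scores)
    return component_scores, index
```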
Analysis for IRQ2 will examine the extent to which students receive the program. A measure of students’ exposure to Ramp-Up within a school is a function of three factors: (1) the number of students who participate in a Ramp-Up activity (i.e., advisories or workshops); (2) the frequency with which the activity occurs; and (3) the duration of the activity. Measures for participation, frequency, and duration will be based on the average teacher response to questions on the instructional logs and on extant documents from the College Readiness Consortium (see Attachment A-13 for the items corresponding to these factors and how they will be transformed for use in this analysis). The College Readiness Consortium will help REL Midwest establish a cut-point on the exposure index that signifies exposure adequate for improving the college readiness of students. This cut-point will be established prior to the collection of data.
The three exposure factors will be calculated by grade (for Grades 10 through 12) and by school. Students’ Ramp-Up exposure by grade level and for the school overall will be the product of participation, frequency, and duration.
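A minimal sketch of this calculation follows, assuming the participation, frequency, and duration values have already been derived from the instructional logs and extant documents per Attachment A-13; the names and data layout are hypothetical.

```python
# Minimal sketch of the exposure measure: the product of participation,
# frequency, and duration, computed by grade. Any scaling of the three
# factors is assumed to follow the rubric in Attachment A-13.
def ramp_up_exposure(participation, frequency, duration):
    """Each argument maps grade (10, 11, 12) -> a value derived from the logs."""
    return {grade: participation[grade] * frequency[grade] * duration[grade]
            for grade in participation}
```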
The findings presented in the Phase 1 and Phase 2 reports will summarize the gradewide and schoolwide participation rate, frequency, duration, and student exposure across implementing schools. The summary will include the average and standard deviation of each of these measures.
Timeline for Project
Data collection for Phase 1 is currently underway and will continue through June 2014. Data collection for Phase 2 will begin in October 2014 and end in the schools in May 2015. Additional extant documents will be requested from the program developers (a collection that does not require OMB clearance) in June 2015 (see the timeline in Table 7).
Approval not to display the expiration date for OMB approval is not requested.
No exceptions to the certification statement are being sought.
Table 7. Schedule of Activities for Phase 1 and Phase 2 of Ramp-Up Evaluation

Activity | Project Phase | Expected Date
Draft Office of Management and Budget (OMB) package | 1 | September 2013
Final proposal approved by ED | 1 | October 2013
Documentation of institutional review board approval | 1 | October 2013
Submit 60-day Federal Register Notice | 1 | November 2013
Submit 30-day Federal Register Notice | 1 | January 2014
Expected OMB clearance date | 1 | May 2014
Collect extant administrative school and student data from schools and districts | 1 | April 2014; June 2014
Collect extant administrative school and student data from MDE and MOHE | 1 | June 2014
Conduct interviews with Ramp-Up and later-implementing schools | 1 | April 2014
Collect extant documents from Ramp-Up and later-implementing schools | 1 | May 2014
Collect extant documents from the program developers | 1 | June 2014
Administer instructional logs to Ramp-Up teachers after the last two workshops | 1 | April–May 2014
Conduct spring focus groups in Ramp-Up and later-implementing schools | 1 | May 2014
Administer survey to Ramp-Up school staff | 1 | May 2014
Administer student personal readiness assessment in Ramp-Up and later-implementing schools | 1 | May 2014
Administer student survey | 1 | May 2014
Collect administrative data from SLEDS | 1 | July–August 2014
Conduct data analysis | 1 | August–September 2014
Submit technical report draft on Ramp-Up implementation for IES review | 1 | October 2014
Submit research brief on Ramp-Up implementation | 1 | December 2014
Final proposal for Phase 2 approved by ED | 2 | April 2014
Revision to OMB package submitted | 2 | May 2014
Obtain approval from institutional review board for Phase 2 | 2 | June 2014
Submit 60-day Federal Register Notice | 2 | May 2014
Submit 30-day Federal Register Notice | 2 | July 2014
Expected OMB clearance date | 2 | September 2014
Collect extant administrative school and student data from schools and districts | 2 | October 2014; June 2015
Collect extant documents from the program developers | 2 | June 2015
Administer instructional logs to Ramp-Up teachers after each of five workshops | 2 | October 2014–May 2015
Administer student personal readiness assessment in Ramp-Up and later-implementing schools | 2 | October 2014–May 2015
Administer student survey | 2 | October 2014–May 2015
Administer fall staff survey (early- and later-implementing schools) | 2 | October 2014
Administer spring staff survey (early-implementing schools) | 2 | May 2015
Submit first draft of technical report on Ramp-Up impact for IES review | 2 | August 2015
Submit first draft of research brief on Ramp-Up impact for IES review | 2 |
ACT, Inc. (2012). ENGAGE™ grades 10–12 user’s guide. Iowa City, IA: Author. Retrieved from http://www.act.org/engage/pdf/10-12_user_guide.pdf
Berkner, L., & Chavez, L. (1997). Access to postsecondary education for the 1992 high school graduates (NCES 98-105). Washington, DC: U.S. Department of Education, National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubs98/98105.pdf
Bureau of Labor Statistics. (2013). May 2012 state occupational employment and wage estimates: Minnesota. Washington, DC: Author. Retrieved from http://www.bls.gov/oes/current/oes_mn.htm#00-0000
Council of Chief State School Officers. (2010). ESEA reauthorization principles and recommendations. Washington, DC: Author. Retrieved from http://www.ccsso.org/Documents/2009/ESEA_Task_Force_Policy_Statement_2010.pdf
Durlak, J. A., Weissberg, R. P., Dymnicki, A. B., Taylor, R. D., & Schellinger, K. B. (2011). The impact of enhancing students' social and emotional learning: A meta-analysis of school-based universal interventions. Child Development, 82(1), 405–432.
Farrington, C. A., Roderick, M., Allensworth, E., Nagaoka, J., Keyes, T. S., Johnson, D. W., & Beechum, N. O. (2012). Teaching adolescents to become learners: The role of noncognitive factors in shaping school performance. Chicago, IL: University of Chicago Consortium on Chicago School Research.
Hill, D. H. (2008). School strategies and the “college-linking” process: Reconsidering the effects of high schools on college enrollment. Sociology of Education, 81(1), 53–76.
McDonough, P. M. (1997). Choosing colleges: How social class and schools structure opportunity. Albany, NY: State University of New York Press.
Peterson, C. H., Casillas, A., & Robbins, S. B. (2006). The student readiness inventory and the Big Five: Examining social desirability and college academic performance. Personality and Individual Differences, 41, 663–673.
Robbins, S. B., Allen, J., Casillas, A., Peterson, C. H., & Le, H. (2006). Unraveling the differential effects of motivational and skills, social and self-management measures from traditional predictors of college outcomes. Journal of Educational Psychology, 98, 598–616.
Schochet, P. Z. (2013). Statistical power for school-based RCTs with binary outcomes. Journal of Research on Educational Effectiveness, 6(3), 263–294.
Tierney, W. G., Bailey, T., Constantine, J., Finkelstein, N., & Hurd, N. F. (2009). Helping students navigate the path to college: What high schools can do (NCEE #2009-4066). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. Retrieved from http://ies.ed.gov/ncee/wwc/practiceguide.aspx?sid=11
U.S. Department of Education. (2010). A blueprint for reform: The reauthorization of the Elementary and Secondary Education Act. Washington, DC: Author. Retrieved from http://www2.ed.gov/policy/elsec/leg/blueprint/blueprint.pdf
August xx, 2014
Dear [SCHOOL/DISTRICT STAFF MEMBER]:
Regional Educational Laboratory (REL) Midwest at American Institutes for Research (AIR) is the evaluator of the Ramp-Up to Readiness program at [SCHOOL]. REL Midwest is one of 10 regional educational laboratories funded by the Institute of Education Sciences at the U.S. Department of Education and tasked with providing technical assistance and research support to topic-focused groups of education-related stakeholders, with the ultimate aim of learning what works for improving student academic outcomes.
In [MONTH/YEAR], our research team received approval from [SCHOOL/DISTRICT] to conduct this study. REL Midwest is beginning the data collection process. As a first step, we are sharing with you a list of student- and school-level data that we will need to collect for the study. The data that you share with our project team, in combination with data from the Minnesota Department of Education, will allow us to understand schools’ experience with implementing Ramp-Up.
A data request for this project is on the second page of this letter. It provides a general description of the data elements needed for this part of the project. You will see that the data request is organized according to these levels of data (i.e., student data and school data). We are requesting student-level data only for students enrolled in 10th, 11th, or 12th grade in fall 2014.
We have found that an initial phone discussion about the data elements being requested can help prevent misunderstandings about data availability and quality. With that in mind, we would like to find a 30-minute window next week, [DATES], when we can discuss the data request. Please let me know your availability (or the availability of someone else who is familiar with the data), and we will arrange a phone conference.
We appreciate your support for this study. Please let me know when you are available next week. In the meantime, if you have any other questions, feel free to contact me at the phone number below my name.
Best regards,
Jim Lindsay, Ph.D.
Principal Investigator,
REL Midwest
630-649-6591
REL Midwest Study of Ramp-Up to Readiness: Request for Student-Level Data
Students’ Grade Level in Fall 2014–15
Student-Level Variables | 10th | 11th | 12th
Student MARSS ID in fall 2014–15 | | |
Grade level in fall 2014–15 | | |
Student leave code (e.g., to indicate transfer, dropout) | | |
Cumulative unweighted and weighted GPA in spring 2014 | | |
EXPLORE score and date of administration | | |
PLAN score and date of administration | | |
Indicator of whether student took the ACT or SAT in 2013–14 | | |
ACT composite scores with dates of administration | | |
SAT reading and mathematics scores with dates of administration | | |
Number of E-level courses enrolled in 2013–14 and fall 2014–15 | | |
Number of D-level courses enrolled in 2013–14 and fall 2014–15 | | |
Number of A-level courses enrolled in 2013–14 and fall 2014–15 | | |
Number of C-level courses enrolled in 2013–14 and fall 2014–15 | | |
Number of transcripts requested in fall 2014–15 | | |
School-Level Variables |
School ID |
Number of 10th, 11th, and 12th graders (separately) in 2013–14 |
Percentages of 10th, 11th, and 12th graders (separately) who took an E-level course in 2013–14 |
Percentages of 10th, 11th, and 12th graders (separately) who took a D-level course in 2013–14 |
Percentages of 10th, 11th, and 12th graders (separately) who took an A-level course in 2013–14 |
Percentages of 10th, 11th, and 12th graders (separately) who took a C-level course in 2013–14 |
School average PLAN composite score in 2013–14 |
Percentages of 11th and 12th graders (separately) who took the ACT during 2013–14 |
Percentages of 11th and 12th graders (separately) who took the SAT during 2013–14 |
School average ACT composite score in 2013–14 |
School average SAT critical reading and mathematics scores in 2013–14 |
Percentage of 12th graders who submitted a college application in 2013–14 |
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is XXXX.XXXX, OMB expiration date is XXXX, XX, 2014. The time required to complete this information collection is estimated to average 90 minutes per respondent, including the time to review instructions, gather the data needed, and complete and review the information collected. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: U.S. Department of Education, Institute of Education Sciences, 555 New Jersey Avenue, NW, Washington, DC 20208.
Purpose. We want to learn about your experiences at school with planning and preparing for life after high school. The questions on this survey ask about preparing for college and a career. The information you provide will help schools provide better information and assistance to students so that they can prepare for the future. This study is being conducted through the Regional Educational Laboratory (REL) Midwest.
Your answers will be kept confidential. All data collected will be kept confidential. We will not provide information that identifies you or your school to anyone outside the study team, except as required by law. Your answers will be combined with the answers of other students to describe what students think about the ways schools prepare them for life after high school.
Risks. There are no known risks related to participating in this survey.
Your answers are voluntary. You have the right to stop participating in this survey at any time without consequences. We hope you will answer all the questions, but if there is a question you do not wish to answer, simply skip it. Also, there are no right or wrong answers to these questions—we really just want to learn about your experiences at your school.
Procedure. This survey will take about 10 to 15 minutes.
Contact Information. If you have questions or concerns about this study, please contact Jim Lindsay at [email protected] or 630-649-6591. If you have concerns or questions about your rights as a participant, contact the chair of AIR’s Institutional Review Board (which is responsible for the protection of study participants) using the following contact information:
E-Mail: [email protected]
Phone: 1-800-634-0797 (toll free)
Mail: IRB Chair
c/o AIR
1000 Thomas Jefferson Street NW
Washington, DC 20007
If you want to take the survey, please continue. If you prefer not to participate, please check the “do not” box below and inform your survey administrator. Thank you for your help!
I want to continue with the survey
I do not want to complete the survey
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is XXXX.XXXX, OMB expiration date is XXXX, XX, 2014. The time required to complete this information collection is estimated to average 15 minutes per respondent, including the time to review instructions, gather the data needed, and complete and review the information collected. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: U.S. Department of Education, Institute of Education Sciences, 555 New Jersey Avenue, NW, Washington, DC 20208.
Background Information
What grade are you currently in?
9th
10th
11th
12th
Postsecondary Plans
The following set of questions asks about your plans after high school. When this survey says “college,” it means any kind of college, including two-year colleges, four-year colleges, universities, community colleges, and career or technical colleges (such as a culinary school or a cosmetology school).
At this time, what is your plan for next year? (Check all that apply.)
Attend a four-year college or university
Attend a community college
Attend a career or technical college
Get a job
Enter the military
Take a year off
Other
I don’t know.
So far this school year, how often have you talked to a counselor, teacher, or other adult at school about planning for college?
Never
One or two times
More than two times but less than once a week
Once a week
More than once a week
This school year, have you developed a written plan for achieving your educational or career goals after high school?
Yes
No
I’m not sure.
Is your plan stored electronically (for example, in the Minnesota Career Information System, the Wisconsin Career Information System, or Naviance)?
Yes
No
I’m not sure.
I have not developed a written postsecondary plan with a counselor, teacher, or other adult at my school.
At the last registration time, did school staff help you in choosing classes that you need to reach your goals for after high school?*
Yes
No
So far this school year, how many times have you discussed your progress toward attaining the goals on your plan with a counselor, teacher, or other adult in your school?
Never
Once
Twice
Three times
More than three times
I’m not sure.
I have not developed a written postsecondary plan with a counselor, teacher, or other adult at my school.
To what extent do you disagree or agree with the following statements?
At my school, all students are expected to go to some type of college.*
Strongly disagree
Disagree
Agree
Strongly agree
I know the skills that I need to work on if I am going to graduate from high school ready for success in college.*²⁰
Strongly disagree
Disagree
Agree
Strongly agree
I don’t plan to attend college.
Students’ Academic Readiness
The following set of questions asks about your academic preparation for college.
So far this school year, has an adult at your school encouraged you to take an honors course or a course for college credit, such as an Advanced Placement (AP), International Baccalaureate (IB), Postsecondary Enrollment Options (PSEO), or College in the Schools course?*
Yes
No
So far this school year, how often has an adult at your high school discussed with you your likelihood to succeed academically in college-level classes?**
Never
Once
Twice
Three to five times
More than five times
Students’ Admissions Readiness
The following questions ask about developing college plans.
To what extent do you disagree or agree with the following statement?
I know which type of college (for example a four-year college, a community college, a career or technical college) would help me reach my goals after high school.
Strongly disagree
Disagree
Agree
Strongly agree
I don’t plan to attend college.
So far this school year, how often has an adult at your high school discussed with you the steps that you need to take to apply to the type of college that you want to attend?*
Never
One or two times
Three to five times
More than five times
I don’t plan to attend college.
So far this school year, how often has an adult at your high school discussed with you your likelihood of being accepted at different types of colleges?**
Never
One or two times
Three to five times
More than five times
I don’t plan to attend college.
Students’ Career Readiness
The following questions ask about developing career plans.
To what extent do you disagree or agree with the following statements?
I know the kinds of careers that would best fit my strengths and skills.*
Strongly disagree
Disagree
Agree
Strongly agree
I know the level of education required for the career I am most interested in.*
Strongly disagree
Disagree
Agree
Strongly agree
So far this school year, how helpful has your high school been to you in assessing your career interests and abilities? **
Not at all helpful
Somewhat helpful
Helpful
Very helpful
How helpful has your high school been to you in developing a career plan?**
Not at all helpful
Somewhat helpful
Helpful
Very helpful
I do not have a career plan.
Students’ Financial Readiness
The following questions ask about paying for college.
So far this school year, how often has an adult at your school talked to you about how to pay for tuition or other college expenses?**
Never
Once
Twice
Three to five times
More than five times
Do you have a plan for paying for college?*
Yes
No
I don’t plan to attend college.
Students’ College Actions
The following questions ask about some college-related actions you may have taken or plan to take.
Have you ever taken the ACT or SAT test?
Yes
No, but I plan to take the ACT or SAT.
No, I do not plan to take the ACT or SAT.
How many college applications, if any, have you submitted so far this school year?
None
One
Two or three
Four or five
More than five
So far this school year, how much have your teachers, counselors, or other school staff helped you with a college application essay or personal statement?**
Not at all
A little
Some
A lot
I do not plan to graduate from high school this school year.
So far this school year, how much have your teachers, counselors, or other school staff helped you find scholarships to apply for?**
Not at all
A little
Some
A lot
I do not plan to graduate from high school this school year.
Wrap-Up
These last questions ask for some general information.
So far this school year, who has helped you most to prepare for college? (Check only one.)
Counselors
Teachers
Dean
Other adults in my school
Parents or guardians
Other family members
Other adults aside from my school or family
Friends
No one
Do you have at least one parent or guardian who has completed a college degree?*
Yes
No
I’m not sure.
Thank you for participating in this survey!
Assent Form
Purpose. We want to learn about your experiences at school with planning and preparing for life after high school. The questions on this survey ask about preparing for college and a career. The information you provide will help schools provide better information and assistance to students so that they can prepare for the future. This study is being conducted through the Regional Educational Laboratory (REL) Midwest.
Your answers will be kept confidential. All data collected will be kept confidential. We will not provide information that identifies you or your school to anyone outside the study team, except as required by law. Your answers will be combined with the answers of other students to describe what students think about the ways schools prepare them for life after high school.
Risks. There are no known risks related to participating in this survey.
Your answers are voluntary. You have the right to stop participating in this survey at any time without consequences. We hope you will answer all the questions, but if there is a question you do not wish to answer, simply skip it. Also, there are no right or wrong answers to these questions—we really just want to learn about your experiences at your school.
Procedure. This survey will take about 10 to 15 minutes.
Contact Information. If you have questions or concerns about this study, please contact Jim Lindsay at [email protected] or 630-649-6591. If you have concerns or questions about your rights as a participant, contact the chair of AIR’s Institutional Review Board (which is responsible for the protection of study participants) using the following contact information:
E-Mail: [email protected]
Phone: 1-800-634-0797 (toll free)
Mail: IRB Chair
c/o AIR
1000 Thomas Jefferson Street NW
Washington, DC 20007
If you want to take the survey, please continue. If you prefer not to participate, please check the “do not” box below and inform your survey administrator. Thank you for your help!
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is XXXX.XXXX, OMB expiration date is XXXX, XX, 2014. The time required to complete this information collection is estimated to average 15 minutes per respondent, including the time to review instructions, gather the data needed, and complete and review the information collected. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: U.S. Department of Education, Institute of Education Sciences, 555 New Jersey Avenue, NW, Washington, DC 20208.
Background Information
What grade are you currently in?
9th
10th
11th
12th
Postsecondary Plans
The following set of questions asks about your plans after high school. When this survey says “college,” it means any kind of college, including two-year colleges, four-year colleges, universities, community colleges, and career or technical colleges (such as a culinary school or a cosmetology school).
At this time, what is your plan for next year? (Check all that apply)
Attend a four-year college or university
Attend a community college
Attend a career or technical college
Get a job
Enter the military
Take a year off
Other
I don’t know.
So far this school year, how often have you talked to a counselor, teacher, or other adult at school about planning for college?
Never
One or two times
More than two times but less than once a week
Once a week
More than once a week
This school year, have you developed a written plan for achieving your educational or career goals after high school?
Yes
No
I’m not sure.
Is your plan stored electronically (for example, in the Minnesota Career Information System, the Wisconsin Career Information System, or Naviance)?
Yes
No
I’m not sure.
I have not developed a written postsecondary plan with a counselor, teacher, or other adult at my school.
At the last registration time, did school staff help you in choosing classes that you need to reach your goals for after high school?*
Yes
No
So far this school year, how many times have you discussed your progress toward attaining the goals on your plan with a counselor, teacher, or other adult in your school?
Never
Once
Twice
Three times
More than three times
I’m not sure.
I have not developed a written postsecondary plan with a counselor, teacher, or other adult at my school.
To what extent do you disagree or agree with the following statements?
At my school, all students are expected to go to some type of college.*
Strongly disagree
Disagree
Agree
Strongly agree
I know the skills that I need to work on if I am going to graduate from high school ready for success in college.*²²
Strongly disagree
Disagree
Agree
Strongly agree
I don’t plan to attend college.
Students’ Academic Readiness
The following questions ask about your academic preparation for college.
So far this school year, has an adult at your school encouraged you to take an honors course or a course for college credit, such as an Advanced Placement (AP), International Baccalaureate (IB), Postsecondary Enrollment Options (PSEO), or College in the Schools course?*
Yes
No
So far this school year, how often has an adult at your high school discussed with you your likelihood to succeed academically in college-level classes?**
Never
Once
Twice
Three to five times
More than five times
Students’ Admissions Readiness
The following questions ask about developing college plans.
To what extent do you disagree or agree with the following statement?
I know which type of college (for example a four-year college, a community college, a career or technical college) would help me reach my goals after high school.
Strongly disagree
Disagree
Agree
Strongly agree
I don’t plan to attend college.
So far this school year, how often has an adult at your high school discussed with you the steps that you need to take to apply to the type of college that you want to attend?*
Never
One or two times
Three to five times
More than five times
I don’t plan to attend college.
So far this school year, how often has an adult at your high school discussed with you your likelihood of being accepted at different types of colleges?**
Never
One or two times
Three to five times
More than five times
I don’t plan to attend college.
Students’ Career Readiness
The following questions ask about developing career plans.
To what extent do you disagree or agree with the following statements?
I know the kinds of careers that would best fit my strengths and skills.*
Strongly disagree
Disagree
Agree
Strongly agree
I know the level of education required for the career I am most interested in.*
Strongly disagree
Disagree
Agree
Strongly agree
So far this school year, how helpful has your high school been to you in assessing your career interests and abilities? **
Not at all helpful
Somewhat helpful
Helpful
Very helpful
How helpful has your high school been to you in developing a career plan?**
Not at all helpful
Somewhat helpful
Helpful
Very helpful
I do not have a career plan.
Students’ Financial Readiness
The following questions ask about paying for college.
So far this school year, how often has an adult at your school talked to you about how to pay for tuition or other college expenses?**
Never
Once
Twice
Three to five times
More than five times
Do you have a plan for paying for college?*
Yes
No
I don’t plan to attend college.
Students’ College Actions
The following questions ask about some college-related actions you may have taken or plan to take.
Have you ever taken the ACT or SAT test?
Yes
No, but I plan to take the ACT or SAT.
No, I do not plan to take the ACT or SAT.
Have you submitted the Free Application for Federal Student Aid (FAFSA) so far this school year?
Yes
No, but I plan to submit the FAFSA by the end of the summer.
No, I do not plan to submit the FAFSA.
I don’t know.
So far this school year, how much have your teachers, counselors, or other school staff helped you fill out the FAFSA?**
Not at all
A little
Some
A lot
How many college applications, if any, have you submitted so far this school year?
None
One
Two or three
Four or five
More than five
So far this school year, how much have your teachers, counselors, or other school staff helped you with a college application essay or personal statement?**
Not at all
A little
Some
A lot
I do not plan to graduate from high school this school year.
So far this school year, how much have your teachers, counselors, or other school staff helped you find scholarships to apply for?**
Not at all
A little
Some
A lot
I do not plan to graduate from high school this school year.
Wrap-Up
These last questions ask for some general information.
So far this school year, who has helped you most to prepare for college? (Check only one.)
Counselors
Teachers
Dean
Other adults in my school
Parents or guardians
Other family members
Other adults aside from my school or family
Friends
No one
Do you have at least one parent or guardian who has completed a college degree?*
Yes
No
I’m not sure.
Thank you for participating in this survey!
[HIGH SCHOOL LETTERHEAD]
August XX, 2014
Dear Parent or Guardian:
[Insert high school] is committed to helping our students graduate college and career ready. As part of our commitment, we are working with the Regional Educational Laboratory (REL) Midwest to study a schoolwide guidance program called Ramp-Up to Readiness (“Ramp-Up”). Scholars at the College Readiness Consortium at the University of Minnesota developed Ramp-Up based on existing research.
The purpose of this letter is to let you know that we plan to release some information about the students in [high school] to REL Midwest and that your son or daughter may be asked to complete a survey and college-readiness assessment. The information provided to REL Midwest will not include student names or any other personally identifiable information about you or your child. In other words, the data will be anonymous to the researchers.
Parents and students should understand the following:
This anonymous information will help the research team better understand the challenges that schools have in carrying out Ramp-Up and the experiences students have when participating in Ramp-Up activities.
The anonymous information will include students’ grades, test scores, course enrollments, college enrollment activities, and the student identification number used by the Minnesota Department of Education. The researchers will be able to use your son’s or daughter’s identification number to obtain other information from state databases, such as test scores, but they will be unable to link that number with your child’s identity.
As part of the study, your son or daughter may be asked to do the following:
Take a survey this spring that asks about his or her experiences at school with planning and preparing for life after high school.
Take an online college-readiness assessment (ACT’s ENGAGE assessment) in the spring. This assessment measures students’ motivation and skills, social engagement, and self-regulation.
No student has to answer questions on the survey or assessment that he or she does not want to answer.
All information about your child will be anonymous. The information collected will be used only for this research project, and the researchers will average the data for all students and all participating schools. They will report these averages in government reports and research articles, but readers will be unable to link those findings with individual students, teachers, or schools.
Risks: This study presents minimal risk to your child. That is, students do not experience any risks beyond what they experience every day at school.
Benefits: Study participation helps build knowledge about how to better support students to be college or career ready.
Participation in the study is voluntary. Students do not have to participate if they do not want to, and they will experience no repercussions at school if they decide not to participate. Our school’s participation in this research study helps educators learn more about how schools can help students become college and career ready.
If you do not wish us to release anonymous information for your child or have your child complete the surveys and assessments, please fill in the form below and have your son or daughter return this letter to [return location] by [deadline].
If you have questions about this research project or about your child’s rights as a participant, please contact Jim Lindsay of REL Midwest at 630-649-6591.
Sincerely,
[insert district signatory]
By signing this form, you are indicating that you do not wish your child to participate in the study or for us to share your child’s information with the REL Midwest research team.
I do NOT want my child, __________________________________________,
Full Student Name
(Student ID # _____________________) to participate in the Ramp-Up evaluation being conducted by REL Midwest.
Your name: ______________________________________________________
Your signature: ___________________________________________________
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is XXXX.XXXX, OMB expiration date is XXXX, XX, 2014. The time required to complete this information collection is estimated to average 5 minutes per respondent, including the time to review instructions, gather the data needed, and complete and review the information collected. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: U.S. Department of Education, Institute of Education Sciences, 555 New Jersey Avenue, NW, Washington, DC 20208.
Purpose
Our school has partnered with the Regional Educational Laboratory (REL) Midwest to study the implementation of a schoolwide guidance program called Ramp-Up to Readiness (“Ramp-Up”) developed by the College Readiness Consortium at the University of Minnesota. REL Midwest, operated by American Institutes for Research, is sponsored by the Institute of Education Sciences (IES) at the U.S. Department of Education. The evaluation will examine how school staff members implement Ramp-Up and how the intervention compares with college-readiness supports in other high schools. The study has been submitted to IES for research approval. An application also will be submitted to the Office of Management and Budget for review.
REL Midwest invites you and other school staff to participate in the study, which will begin this spring. The study involves the following data collection activities, in which you may be asked to participate:
Two focus groups, one in February and one in May
Short instructional logs following each of five Ramp-Up college workshops
An online survey in spring 2014
Voluntary Participation
Participation in the data collection activities is voluntary. You also can withdraw from the study at any time. Individuals who decline to participate or later withdraw from the study will face no personal or professional repercussions.
Risks
There are few anticipated or known risks in participating in this study. Data collected and maintained by, or under the auspices of, IES under a pledge of confidentiality shall be treated in a manner that will ensure that individually identifiable data will be used only for statistical purposes and will be accessible only to authorized persons.
Benefits
Your participation in the evaluation will contribute to an understanding of a schoolwide college readiness program that seeks to improve the college readiness outcomes of all students. You will also receive a $25 gift card from Amazon.com for participating in the data collection activities.
Confidentiality
Responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with any specific individual. The researchers will not provide information that identifies you or your school to anyone outside the study team, except as required by law.
More Information
If you would like more information about this study, you may contact Jim Lindsay of REL Midwest at 630-649-6591.
Informed Consent
By signing this form, you are indicating that you have read and understood the information provided to you about your participation.
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is XXXX.XXXX, OMB expiration date is XXXX, XX, 2014. The time required to complete this information collection is estimated to average 5 minutes per respondent, including the time to review instructions, gather the data needed, and complete and review the information collected. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: U.S. Department of Education, Institute of Education Sciences, 555 New Jersey Avenue, NW, Washington, DC 20208.
Purpose. We would like your feedback on the workshop you taught today and some information about the advisories that you have taught so far this year. Your opinions are useful for improving the quality of the Ramp-Up program.
Confidentiality. Regional Educational Laboratory Midwest will keep all collected data confidential. Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific individual. We will not provide information that identifies you to anyone outside the study team, except as required by law. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.
Risks. There are no known risks related to participating in this survey.
Voluntary Participation. You have the right to discontinue your participation in this survey at any time without consequences. We hope you will answer all the questions, but if there is a question you do not wish to answer, simply skip it.
Procedure. Completion of each log entry will take no longer than 10 minutes. If you complete all five logs, you will receive a $25 gift card for your participation.
Contact Information. If you have questions or concerns about this study, please contact Jim Lindsay at [email protected] or 630-649-6591. If you have concerns or questions about your rights as a participant, contact the chair of AIR’s Institutional Review Board (which is responsible for the protection of study participants) using the following contact information:
E-Mail: [email protected]
Phone: 1-800-634-0797 (toll free)
Mail: IRB Chair
c/o AIR
1000 Thomas Jefferson Street NW
Washington, DC 20007
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is XXXX.XXXX, and the OMB expiration date is XXXX XX, 2014. The time required to complete this information collection is estimated to average 10 minutes per respondent, including the time to review instructions, gather the data needed, and complete and review the information collected. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: U.S. Department of Education, Institute of Education Sciences, 555 New Jersey Avenue, NW, Washington, DC 20208.
Background Information
What is your current position at this school? Check all that apply.
Teacher
Counselor
Dean
Principal
Assistant principal
Other school staff
What is the name of your school? [text box]
Today’s Ramp-Up Workshop
The following questions ask about the Ramp-Up workshop that you taught today.
Did you teach or assist in teaching a Ramp-Up workshop today?
Yes [continue to q4]
No [skip to q19]
What topic(s) did you cover in today’s workshop? [text box]
How many students actually attended the workshop? [drop down box]
What percentage of your students attended the workshop? [drop down box]
In what grades are the students who attended the workshop that you taught today? Check all that apply.
9th grade
10th grade
11th grade
12th grade
How long did today’s workshop last?
Less than 30 minutes
30–45 minutes
46–60 minutes
More than 60 minutes
Which college-readiness pillars were discussed during today’s workshop? Check all that apply.
Academic Readiness
Admissions Readiness
Career Readiness
Financial Readiness
Personal and Social Readiness
Did you adapt the instructional materials to suit the needs of your workshop?
Yes [continue to q11]
No [skip to q12]
Please describe how you adapted the instructional materials to suit the needs of your workshop. [text box]
What percentage of students who attended today’s workshop were actively engaged in the workshop’s activities?
Less than 25 percent
25 percent to 50 percent
51 percent to 75 percent
More than 75 percent
I do not know.
I had enough time to prepare lesson content prior to teaching today’s workshop.
Strongly disagree
Disagree
Agree
Strongly agree
I had enough information about the college selection and enrollment process to teach today’s workshop.
Strongly disagree
Disagree
Agree
Strongly agree
I had enough information about the knowledge and skills needed to succeed in college to teach today’s workshop.
Strongly disagree
Disagree
Agree
Strongly agree
What worked well in today’s workshop? [text box]
What could be improved about today’s workshop? [text box]
If you have any additional comments about today’s workshop, please enter them here: [text box]
Weekly Ramp-Up Advisories
The following questions ask about the weekly Ramp-Up advisories taught so far this year.
Have you taught at least one Ramp-Up advisory this school year?
Yes [continue to q20]
No [end survey]
Are you assigned to teach a ninth-grade Ramp-Up advisory?
Yes [continue to q21]
No [skip to q22]
Which lessons have you taught so far this year to ninth graders? Check all that apply. [insert names of each lesson taught to ninth graders]
Are you assigned to teach a 10th-grade Ramp-Up advisory?
Yes [continue to q23]
No [skip to q24]
Which lessons have you taught so far this year to 10th graders? Check all that apply. [insert names of each lesson taught to 10th graders]
Are you assigned to teach an 11th-grade Ramp-Up advisory?
Yes [continue to q25]
No [skip to q26]
Which lessons have you taught so far this year to 11th graders? Check all that apply. [insert names of each lesson taught to 11th graders]
Are you assigned to teach a 12th-grade Ramp-Up advisory?
Yes [continue to q27]
No [skip to q28]
Which lessons have you taught so far this year to 12th graders? Check all that apply. [insert names of each lesson taught to 12th graders]
On average, what percentage of students scheduled to attend your weekly advisory have attended every session so far?
Less than 25 percent
25–50 percent
51–75 percent
More than 75 percent
I do not know.
On average, how long have the weekly advisory sessions you’ve taught so far this year lasted?
Less than 20 minutes
20–29 minutes
30 minutes
I do not know.
So far this year, how often have you adapted the instructional materials to suit the needs of your advisory?
Never
Rarely
Sometimes
Often
Always
On average, what percentage of students in your advisory actively engage in the advisory’s activities?
Less than 25 percent
25–50 percent
51–75 percent
More than 75 percent
I do not know.
If you have any additional comments about the Ramp-Up advisories, please enter them here: [text box]
Thank you for completing this log!
(To Be Administered Over Internet)
Purpose. The Regional Educational Laboratory (REL) Midwest at American Institutes for Research is conducting this survey as part of its evaluation of Ramp-Up to Readiness (“Ramp-Up”). We want to learn about your experiences with Ramp-Up overall and also how those experiences relate to the program’s curriculum, tools, and professional development. The information you provide will be used to improve Ramp-Up and other college-readiness programs.
Confidentiality. REL Midwest will keep all collected data confidential. Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. We will not provide information that identifies you or your district to anyone outside the study team, except as required by law. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.
Risks. There are no known risks related to participating in this survey.
Voluntary Participation. You have the right to discontinue your participation in this survey at any time without consequences. We hope you will answer all the questions, but if there is a question you do not wish to answer, simply skip it.
Procedure. This survey will take about 20 to 30 minutes, and you will receive a $25 gift card for your participation.
Contact Information. If you have questions or concerns about this study, please contact Jim Lindsay at [email protected] or 630-649-6591. If you have concerns or questions about your rights as a participant, contact the chair of AIR’s Institutional Review Board (which is responsible for the protection of study participants) using the following contact information:
E-Mail: [email protected]
Phone: 1-800-634-0797 (toll free)
Mail: IRB Chair
c/o AIR
1000 Thomas Jefferson Street NW
Washington, DC 20007
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is XXXX.XXXX, and the OMB expiration date is XXXX XX, 2014. The time required to complete this information collection is estimated to average 30 minutes per respondent, including the time to review instructions, gather the data needed, and complete and review the information collected. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: U.S. Department of Education, Institute of Education Sciences, 555 New Jersey Avenue, NW, Washington, DC 20208.
Background Information
What is the name of your school? [text box]
What is your current position at this school? (Check all that apply.)
Teacher
Counselor
Dean
Principal
Assistant principal
Other school staff
In what grade level(s) are the majority of the students you work with? (Check all that apply.)
Grade 8 or lower
Grade 9
Grade 10
Grade 11
Grade 12
I teach students in multiple grades.
Expectations and Beliefs
The following set of questions asks about your expectations and beliefs related to college preparation. In this survey, “college” refers to all postsecondary educational opportunities, including two-year colleges, four-year colleges or universities, community colleges, and career or technical colleges. Please indicate to what extent you disagree or agree with the following statements:
I believe that our school should prepare all students to go on to college.*
Strongly disagree
Disagree
Agree
Strongly agree
School personnel share a common goal to prepare all students for college.*
Strongly disagree
Disagree
Agree
Strongly agree
All teachers should be able to advise students on college options.*
Strongly disagree
Disagree
Agree
Strongly agree
College counseling is the job of school counselors, not teachers.*
Strongly disagree
Disagree
Agree
Strongly agree
College Knowledge
The following questions ask about your knowledge of college and career readiness. Please rate your own level of knowledge in the following areas:
The range of postsecondary options available to students*
None
Limited
Basic
Moderate
Proficient
Advanced
The level of academic skill (for example, reading, writing, mathematics) necessary for college work*
None
Limited
Basic
Moderate
Proficient
Advanced
Tests that students need for admission to college*
None
Limited
Basic
Moderate
Proficient
Advanced
The college application process*
None
Limited
Basic
Moderate
Proficient
Advanced
Financing a college education*
None
Limited
Basic
Moderate
Proficient
Advanced
The types of personal and social skills that students need to succeed in college
None
Limited
Basic
Moderate
Proficient
Advanced
So far this school year, have you received any professional development related to preparing students for college?
Yes
No
College-Readiness Supports
The following questions ask about any services, activities, and resources that your school offers to help students prepare to succeed in college.
What percentage of students who graduate from your high school are prepared academically to succeed in nonremedial college classes?
Less than 25 percent
25–50 percent
51–75 percent
More than 75 percent
I do not know.
Do all students at your school develop a written plan for achieving their educational or career goals after high school?
Yes
No
Unsure
What percentage of students at your high school use a technology platform (e.g., Naviance, Minnesota Career Information System [MCIS], Wisconsin Career Information System [WCIS]) to support the development of their postsecondary plans?
Less than 25 percent
25–50 percent
51–75 percent
More than 75 percent
I do not know.
Do staff members at your school provide feedback to students about whether they are on track academically for college?
Yes [continue to Q18]
No [skip to Q20]
Unsure [skip to Q20]
How often do all students in Grades 10 through 12 receive feedback?
All students in Grades 10 through 12 receive feedback at least once per school year.
All students in Grades 10 through 12 receive feedback more than once per school year.
Not all students receive feedback every year.
How do students receive feedback? (Check all that apply.)
In mandatory discussions with a counselor or teacher (for example during course scheduling)
In informal discussions with a counselor
In informal discussions with a teacher or other school staff member
In writing without discussion with a school staff member
Other
What practices does your school offer to assist students with the transition to college? (Check all that apply.)***
Holding or participating in college fairs
Consulting with college representatives about requirements
Encouraging students to visit colleges
Offering college visits organized by your school
Offering programs that help students plan or prepare for college (such as Upward Bound, AVID, College Possible, etc.)
Holding large assemblies or information sessions where students receive information about searching for, and applying to, college
Holding large assemblies or information sessions where students receive information about paying for college
What percentage of seniors who plan to attend college take the necessary steps to enroll in college?
Less than 25 percent
25–50 percent
51–75 percent
More than 75 percent
I do not know.
What kinds of assistance does your school offer to help students complete college enrollment actions? (Check all that apply.)
Assistance identifying colleges that match a student’s interests, goals, and level of preparation
Assistance with college applications, individually or in small groups
Assistance with completing the Free Application for Federal Student Aid (FAFSA), individually or in small groups
Assistance with identifying scholarship opportunities, individually or in small groups
Assistance with completing scholarship applications, individually or in small groups
Classes or workshops to prepare students to take college admissions exams
Does your school collect timely information about which students complete the following college enrollment actions?
| | Yes | No |
| College applications | | |
| FAFSA application | | |
| Scholarship applications | | |
| Completion of a college admissions exam | | |
What portion of your seniors receive school help with the following?***
| | Less than 25 percent | 25–50 percent | 51–75 percent | More than 75 percent | I don’t know |
| Completing college applications | | | | | |
| Planning how to pay for college | | | | | |
| Filling out financial aid forms | | | | | |
| Identifying scholarship opportunities | | | | | |
| Completing scholarship applications | | | | | |
What percentage of your students understand the requirements of different careers?
Less than 25 percent
25–50 percent
51–75 percent
More than 75 percent
I do not know.
What percentage of your students understand which careers will match their personal goals and abilities?
Less than 25 percent
25–50 percent
51–75 percent
More than 75 percent
I do not know.
What kinds of career exploration activities, if any, does your school offer to students? (Check all that apply.)
Assistance writing a resume
Speakers who discuss careers
A career interest inventory
Job shadowing
Assistance in finding internships
Information about educational and skill requirements of different careers
Information about the earnings payoffs of different careers
Who at your school is responsible for delivering college-related programs, services, activities, and resources to students? (Check all that apply.)
Counselors
Teachers
Administrators
For how many of your students do you communicate with parents or guardians about the student’s readiness for college?
None
A few students
Most students
All students
How often do you communicate with parents or guardians about a child’s readiness for college?
More than once per school year for all students
At least once per school year for all students
At least once per school year for some students
Other
Thank you for participating in this survey!
Purpose. The Regional Educational Laboratory (REL) Midwest at American Institutes for Research is conducting this survey as part of its evaluation of Ramp-Up to Readiness. We want to learn about your experiences with Ramp-Up overall and also how your experiences relate to the program’s curriculum, tools, and professional development. The information you provide will be used to improve Ramp-Up and other college-readiness programs.
Confidentiality. REL Midwest will keep all collected data confidential. Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. We will not provide information that identifies you or your district to anyone outside the study team, except as required by law. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.
Risks. There are no known risks related to participating in this survey.
Voluntary Participation. You have the right to discontinue your participation in this survey at any time without consequences. We hope you will answer all the questions, but if there is a question you do not wish to answer, simply skip it.
Procedure. This survey will take about 20 to 30 minutes, and you will receive a $25 gift card for your participation.
Contact Information. If you have questions or concerns about this study, please contact Jim Lindsay at [email protected] or 630-649-6591. If you have concerns or questions about your rights as a participant, contact the chair of AIR’s Institutional Review Board (which is responsible for the protection of study participants) using the following contact information:
E-Mail: [email protected]
Phone: 1-800-634-0797 (toll free)
Mail: IRB Chair
c/o AIR
1000 Thomas Jefferson Street NW
Washington, DC 20007
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is XXXX.XXXX, and the OMB expiration date is XXXX XX, 2014. The time required to complete this information collection is estimated to average 30 minutes per respondent, including the time to review instructions, gather the data needed, and complete and review the information collected. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: U.S. Department of Education, Institute of Education Sciences, 555 New Jersey Avenue, NW, Washington, DC 20208.
What is your current position at this school? (Check all that apply.)
Teacher
Counselor
Dean
Principal
Assistant principal
Other school staff
What is your role in delivering the Ramp-Up to Readiness program? (Check all that apply.)
I do not play any role in delivering the Ramp-Up to Readiness program. [end survey and display “Thank you for participating in this survey”]
Ramp-Up coordinator
Member of the Ramp-Up leadership team
Ramp-Up advisor (a teacher who facilitates Ramp-Up advisories)
Other. Please indicate your role: [text box]
[If Q2=Ramp-Up advisor, ask Q3; else, skip to Q4]
What is the grade level of the students in your Ramp-Up advisory?
Grade 9
Grade 10
Grade 11
Grade 12
What is the name of your school? [text box]
The following set of questions asks about your understanding of the goals of the Ramp-Up to Readiness program (“Ramp-Up”) and your role in it. Please indicate to what extent you disagree or agree with the following statements:
I understand the goals of Ramp-Up.
Strongly disagree
Disagree
Agree
Strongly agree
I understand Ramp-Up’s five pillars of readiness (academic, admissions, career, financial, and personal and social readiness).
Strongly disagree
Disagree
Agree
Strongly agree
I understand my role in delivering Ramp-Up.
Strongly disagree
Disagree
Agree
Strongly agree
Do you know who the Ramp-Up coordinator at your school is?
Yes
No
I’m not sure.
[If Q2=Ramp-Up advisor, ask Q9; else, skip to 10]
My school and district give me enough time to implement the Ramp-Up program.
Strongly disagree
Disagree
Agree
Strongly agree
[If Q2=Ramp-Up coordinator or Member of the Ramp-Up leadership team, ask 10; else, skip to 11]
My school and district give me enough time to coordinate the Ramp-Up program.
Strongly disagree
Disagree
Agree
Strongly agree
The following questions ask about your expectations and beliefs related to college preparation. In this survey, “college” refers to all postsecondary educational opportunities, including two-year colleges, four-year colleges or universities, community colleges, and career or technical colleges. Please indicate to what extent you disagree or agree with the following statements:
I believe that our school should prepare all students to go on to college.*
Strongly disagree
Disagree
Agree
Strongly agree
School personnel share a common goal to prepare all students for college.*
Strongly disagree
Disagree
Agree
Strongly agree
All teachers should be able to advise students on college options.*
Strongly disagree
Disagree
Agree
Strongly agree
College counseling is the job of school counselors, not teachers.*
Strongly disagree
Disagree
Agree
Strongly agree
The range of postsecondary options available to students*
None
Limited
Basic
Moderate
Proficient
Advanced
The level of academic skill (for example, reading, writing, mathematics) necessary for college work*
None
Limited
Basic
Moderate
Proficient
Advanced
Tests that students need for admission to college*
None
Limited
Basic
Moderate
Proficient
Advanced
The college application process*
None
Limited
Basic
Moderate
Proficient
Advanced
Financing a college education*
None
Limited
Basic
Moderate
Proficient
Advanced
The types of personal and social skills that students need to succeed in college
None
Limited
Basic
Moderate
Proficient
Advanced
The following questions ask about your perceptions of the Ramp-Up curriculum taught in advisories and workshops.
How familiar are you with the Ramp-Up curriculum?
Not at all familiar [skip to q33]
Slightly familiar [continue to q22]
Moderately familiar [continue to q22]
Very familiar [continue to q22]
Please indicate the extent to which you disagree or agree with the following statements:
The Ramp-Up curriculum helps students develop postsecondary plans.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up curriculum enables students to make informed decisions about preparing for college.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up curriculum enables students to make informed decisions about preparing for a career.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up curriculum helps students develop the belief that they can turn their postsecondary plans into reality.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up curriculum helps students understand whether they are on or off track to reach college readiness by the end of high school.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up curriculum provides students with clear information about what steps must be taken to enroll in college.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up curriculum provides students with clear information about when key steps in the enrollment process must occur.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up curriculum covers appropriate topics on preparing for college.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up curriculum spends the appropriate amount of time on each topic.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up curriculum consists of a coherent sequence of concepts and ideas.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up curriculum is engaging to students.
Strongly disagree
Disagree
Agree
Strongly agree
Have you taught at least one Ramp-Up advisory this school year?
Yes [continue to q34]
No [skip to q41]
How many of the Ramp-Up lessons did you teach this school year in your advisory?
Fewer than five lessons [continue to q35]
5–10 lessons [continue to q35]
11–15 lessons [continue to q35]
16–20 lessons [continue to q35]
21–25 lessons [continue to q35]
All 28 lessons [skip to q36]
I don’t remember. [skip to q36]
Why did you teach fewer than 28 lessons? [text box]
Did you receive an Advisor Guide at the beginning of the school year that describes the lesson plan and activities for each weekly advisory?
Yes
No
How often did you receive information from the Ramp-Up coordinator about a lesson prior to teaching it?
Never
Rarely
Sometimes
Often
Always
I am not sure who the Ramp-Up coordinator in my school is.
Did you teach the Ramp-Up lessons as they were designed or did you modify them?
I taught them without any modifications.
I modified some of the lessons.
I modified most of the lessons.
I modified all of the lessons.
How often did you provide the Ramp-Up instructional materials and resources to students at the time assigned for the advisory?
Never
Rarely
Sometimes
Often
Always
Did you have enough time to prepare lesson content prior to teaching it?
Never
Rarely
Sometimes
Often
Always
The Ramp-Up program includes two tools to assist students with their plans after high school. These are the Postsecondary Plan and the Readiness Rubric.
The following questions ask about the Postsecondary Plan.
How familiar are you with the Postsecondary Plan?
Not at all familiar [skip to q46]
Slightly familiar [continue to q42]
Moderately familiar [continue to q42]
Very familiar [continue to q42]
Thinking about the Postsecondary Plan and how students, parents, and school staff use it, please indicate the extent to which you disagree or agree with the following statements.
The Postsecondary Plan helps students to develop a plan for their life after high school.
Strongly disagree
Disagree
Agree
Strongly agree
I use the Postsecondary Plan when helping students develop plans for their life after high school.
Strongly disagree
Disagree
Agree
Strongly agree
How many students in your Ramp-Up advisory completed the Postsecondary Plan at least once this year?
None
A few students
Most students
All students
I do not teach a Ramp-Up advisory.
For how many students in your Ramp-Up advisory have you discussed the Postsecondary Plan with the student’s parents?
None
A few students
Most students
All students
I do not teach a Ramp-Up advisory.
The following questions ask about the Readiness Rubric.
How familiar are you with the Readiness Rubric?
Not at all familiar [skip to q51]
Slightly familiar [continue to q47]
Moderately familiar [continue to q47]
Very familiar [continue to q47]
Thinking about the Readiness Rubric and how students, parents, and school staff use it, please indicate the extent to which you disagree or agree with the following statements.
The Readiness Rubric helps students to monitor their progress toward their postsecondary goals.
Strongly disagree
Disagree
Agree
Strongly agree
I use the Readiness Rubric to monitor students’ progress toward their postsecondary goals.
Strongly disagree
Disagree
Agree
Strongly agree
How many students in your Ramp-Up advisory completed the Readiness Rubric at least twice this year?
None
A few students
Most students
All students
I do not teach a Ramp-Up advisory.
For how many students in your Ramp-Up advisory have you discussed the Readiness Rubric with the student’s parents?
None
A few students
Most students
All students
I do not teach a Ramp-Up advisory.
[If Q2=Ramp-Up advisor and Q3=Grade 10 ask Q51 – Q55; else, skip to Q56]
The following questions ask about the Personal Readiness Evaluation for Postsecondary (PREP) survey.
How familiar are you with the PREP survey?
Not at all familiar [skip to q56]
Slightly familiar [continue to q52]
Moderately familiar [continue to q52]
Very familiar [continue to q52]
Thinking about the PREP survey and how students, parents, and school staff use it, please indicate the extent to which you disagree or agree with the following statements.
The PREP survey helps students to understand their personal readiness for college.
Strongly disagree
Disagree
Agree
Strongly agree
I use the PREP survey to understand students’ personal readiness for college.
Strongly disagree
Disagree
Agree
Strongly agree
How many students in your Ramp-Up advisory completed the PREP survey at least once this year?
None
A few students
Most students
All students
I do not teach a Ramp-Up advisory.
For how many students in your Ramp-Up advisory have you discussed the PREP survey results with the student’s parents?
None
A few students
Most students
All students
I do not teach a Ramp-Up advisory.
The following questions ask about professional development related to Ramp-Up.
Are you a Ramp-Up coordinator or member of the Ramp-Up leadership team?
Yes [continue to q57]
No [skip to q62]
Uncertain [skip to q62]
Have you received any training by the University of Minnesota’s College Readiness Consortium?
Yes [continue to q58]
No [skip to q62]
Uncertain [skip to q62]
Please indicate the extent to which you disagree or agree with the following statements:
The training I received provided useful information to me about how to gain staff support for implementing a schoolwide college-readiness program.
Strongly disagree
Disagree
Agree
Strongly agree
The training I received provided useful information to me about my role and responsibilities in delivering Ramp-Up.
Strongly disagree
Disagree
Agree
Strongly agree
Staff members at the College Readiness Consortium have responded effectively to questions I have asked about the Ramp-Up program.
Strongly disagree
Disagree
Agree
Strongly agree
I have not asked the College Readiness Consortium any questions about Ramp-Up.
Were the travel costs of your training at the University of Minnesota paid by your school or district?
Yes
No
Have you received any training about Ramp-Up at your school?
Yes [continue to q63]
No [skip to q72]
Uncertain [skip to q72]
How many times this school year did you attend training on Ramp-Up?
Not at all
One time
A couple of times
Every month
More than once a month
Other: [text box to specify]
Please indicate the extent to which you disagree or agree with the following statements:
The training I received helped me to understand why my school has adopted a college-readiness program.
Strongly disagree
Disagree
Agree
Strongly agree
The training I received helped me understand the Ramp-Up curriculum.
Strongly disagree
Disagree
Agree
Strongly agree
The training I received helped me understand the Ramp-Up tools (specifically, the Postsecondary Plan and the Readiness Rubric).
Strongly disagree
Disagree
Agree
Strongly agree
The training I received provided useful information to me about my role and responsibilities in delivering Ramp-Up.
Strongly disagree
Disagree
Agree
Strongly agree
I have enough information about the college selection and enrollment process to teach the Ramp-Up curriculum.
Strongly disagree [continue to q69]
Disagree [continue to q69]
Agree [skip to q70]
Strongly agree [skip to q70]
I do not teach the Ramp-Up curriculum.
What additional information would be useful? [text box]
I have enough information about the knowledge and skills needed to succeed in college to teach the Ramp-Up curriculum.
Strongly disagree [continue to q71]
Disagree [continue to q71]
Agree [skip to q72]
Strongly agree [skip to q72]
I do not teach the Ramp-Up curriculum.
What additional information would be useful? [text box]
The following questions ask about your perceptions of Ramp-Up’s effects. Please indicate the extent to which you disagree or agree with the following statements.
The Ramp-Up program increases students’ ability to set educational goals.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up program increases students’ ability to make and monitor progress toward educational goals.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up program increases students’ ability to create relationships to support their educational goals.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up program increases students’ ability to meet admissions requirements at a range of colleges.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up program increases students’ likelihood of succeeding academically at college.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up program increases students’ ability to find a career that matches their goals and abilities.
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up program increases students’ understanding of ways to pay for college (for example, through savings, loans, financial aid).
Strongly disagree
Disagree
Agree
Strongly agree
The Ramp-Up program has increased my ability to help students prepare and plan for college.
Strongly disagree
Disagree
Agree
Strongly agree
I have more productive conversations with students about how to prepare for life after high school because of Ramp-Up.
Strongly disagree
Disagree
Agree
Strongly agree
I have not had conversations with students about how to prepare for life after high school.
I have more productive conversations with families about how to prepare their children for life after high school because of Ramp-Up.
Strongly disagree
Disagree
Agree
Strongly agree
I have not had conversations with families about how to prepare their children for life after high school.
I have more productive conversations with colleagues about how to prepare students for life after high school because of Ramp-Up.
Strongly disagree
Disagree
Agree
Strongly agree
I have not had conversations with colleagues about how to prepare students for life after high school.
Which students, if any, can benefit from Ramp-Up? (Check all that apply.)
Students performing in the lower third of their class academically
Students performing in the middle third of their class academically
Students performing in the upper third of their class academically
Students who would be the first in their families to attend college
Students whose parents attended college
All types of students
No students
Uncertain
Finally, thinking about the Ramp-Up program overall…
What are the strengths of Ramp-Up? [text box]
What are the weaknesses of Ramp-Up? [text box]
What factors, if any, have made implementing Ramp-Up challenging at your school? [text box]
What factors, if any, have facilitated the implementation of Ramp-Up at your school? [text box]
Thank you for participating in this survey!
This evaluation is authorized through provisions in the Education Sciences Reform Act (ESRA) of 2002. Specifically, ESRA Part D, Section 174(f)(4) describes the role, mission, and function of the regional educational laboratories. One aspect of that role is
(4) in the event such quality applied research does not exist as determined by the regional educational laboratory or the Department, carrying out applied research projects that are designed to serve the particular educational needs (in prekindergarten through grade 16) of the region in which the regional educational laboratory is located, that reflect findings from scientifically valid research, and that result in user-friendly, replicable school-based classroom applications geared toward promoting increased student achievement, including using applied research to assist in solving site-specific problems and assisting in development activities (including high-quality and on-going professional development and effective parental involvement strategies) (ESRA, Part D, Section 174, f.4).
CONFIDENTIALITY AGREEMENT
Ramp-Up to Readiness Implementation Study
(American Institutes for Research under Contract No. ED-IES-12-C-0004)
Safeguards for Individuals Against Invasion of Privacy: In accordance with the Privacy Act of 1974 (5 United States Code 552a), the Education Sciences Reform Act of 2002 (Public Law 107-279), the Federal Statistical Confidentiality Order of 1997, the E-Government Act of 2002 (Public Law 107-347), and the Computer Security Act of 1987, American Institutes for Research (AIR) and all its subcontractors are required to comply with the applicable provisions of the legislation, regulations, and guidelines and to undertake all necessary safeguards for individuals against invasions of privacy.
To provide this assurance and these safeguards in performance of work on this project, all staff, consultants, and agents of AIR, and its subcontractors who have any access to study data, shall be bound by the following assurance.
Assurance of Confidentiality
In accordance with all applicable legislation, regulations, and guidelines, AIR assures all respondents that their responses may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (ESRA 2002), 20 U.S. Code, § 9573].
The following safeguards will be implemented to assure that confidentiality is protected as allowable by law (20 U.S.C. § 9573) by all employees, consultants, agents, and representatives of AIR and all subcontractors and that physical security of the records is provided:
All staff with access to data will take an oath of nondisclosure and sign an affidavit to that effect.
At each site where these items are processed or maintained, all confidential records that will permit identification of individuals shall be kept in a safe, locked room when not in use or personally attended by project staff.
When confidential records are not locked, admittance to the room or area in which they reside shall be restricted to staff sworn to confidentiality on this project.
All electronic data shall be maintained in secure and protected data files, and personally identifying information shall be maintained on separate files from statistical data collected under this contract.
All data files on network or multi-user systems shall be under strict control of a database manager with access restricted to project staff sworn to confidentiality, and then only on a need-to-know basis.
All data files on single-user computers shall be password protected and all such machines will be locked and maintained in a locked room when not attended by project staff sworn to confidentiality.
External electronically stored data files (e.g., tapes or diskettes) shall be maintained in a locked storage device in a locked room when not attended by project staff sworn to confidentiality.
Any data released to the general public shall be appropriately masked such that linkages to individually identifying information are protected to avoid individual identification in disclosed data.
Data or copies of data may not leave the authorized site for any reason.
Staff, consultants, and agents of AIR and all its subcontractors will take all necessary steps to ensure that the letter and intent of all applicable legislation, regulations, and guidelines are enforced at all times through appropriate qualifications standards for all personnel working on this project and through adequate training and periodic follow-up procedures.
By my signature affixed below, I hereby swear and affirm that I have carefully read this statement, fully understand it as well as the legislative and regulatory assurances that pertain to the confidential nature of all records to be handled in regard to this project, and will adhere to all safeguards that have been developed to provide such confidentiality. As an employee, consultant, agent, or representative of AIR or one of its subcontractors, I understand that I am prohibited by law from disclosing any such confidential information to anyone other than staff, consultants, agents, or representatives of AIR, its subcontractors, and the Institute of Education Sciences. I understand that any willful and knowing individual disclosure or allowance of disclosure in violation of the applicable legislation, regulations, and guidelines is punishable by law and would subject the violator to possible fine or imprisonment.
(Signature) (Date)
AFFIDAVIT OF NONDISCLOSURE
Ramp-Up to Readiness Implementation Study
(American Institutes for Research under Contract No. ED-IES-12-C-0004)
[insert name]
[insert position]
Date of Assignment to Ramp-Up to Readiness Implementation Study: January 2014
American Institutes for Research
1000 Thomas Jefferson Street, NW
Washington, DC 20007-3835
I, [insert name], do solemnly swear (or affirm) that when given access to any Ramp-Up to Readiness Implementation Study databases or files containing individually identifiable information, I will not:
use or reveal any individually identifiable information furnished, acquired, retrieved or assembled by me or others, under the provisions of Section 183 of the Education Sciences Reform Act of 2002 (PL 107-279) and Title V, subtitle A of the E-Government Act of 2002 (PL 107-347) for any purpose other than statistical purposes specified in the NCES survey, project or contract;
make any disclosure or publication whereby a sample unit or survey respondent could be identified or the data furnished by or related to any particular person under this section could be identified; or
permit anyone other than the individuals authorized by the Commissioner of the National Center for Education Statistics to examine the individual reports.
(Signature)
(The penalty for unlawful disclosure is a fine of not more than $250,000 [under 18 U.S.C. 3571] or imprisonment for not more than five years [under 18 U.S.C. 3559], or both. The word “swear” should be stricken out wherever it appears when a person elects to affirm the affidavit rather than to swear to it.)
State of _____________________________
County of _______________________________
Subscribed and sworn/affirmed before me, ______________________, a Notary Public in and for ________________County, State of ________________________, on this date, ______________________.
___________________________________________
Notary Public
My commission expires: _____________________________.
The impact study proposes a cluster randomized controlled trial (RCT) with blocking to investigate the effect of Ramp-Up on three confirmatory outcomes: completion of the FAFSA and two measures in the domain of personal readiness, Commitment to College and Goal Striving. For these outcomes, power is estimated using a constant-effects blocked cluster random assignment design with the treatment occurring at Level 2 and block dummies as intercepts (no interaction with the treatment variable).
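To make the analytic specification concrete, the sketch below (not the study’s analysis code) fits a model of the kind described above for a continuous outcome: a school-level treatment indicator, block fixed effects, a school-level baseline covariate, and a random school intercept. The variable names, the simulated data, and the use of eight blocks are illustrative assumptions; the binary FAFSA outcome would instead call for a multilevel logistic specification.

```python
# Illustrative sketch of the confirmatory model described above for a
# continuous outcome; variable names and simulated data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, students_per_school = 54, 90

schools = pd.DataFrame({
    "school_id": np.arange(n_schools),
    "block": np.repeat(np.arange(8), [7, 7, 7, 7, 7, 7, 6, 6]),  # 8 blocks (assumption)
    "treatment": np.tile([0, 1], n_schools // 2),
    "baseline_school_mean": rng.normal(0.0, 1.0, n_schools),
    "u_school": rng.normal(0.0, 0.3, n_schools),                 # school random effect
})

students = schools.loc[schools.index.repeat(students_per_school)].reset_index(drop=True)
students["outcome"] = (
    0.17 * students["treatment"]                # treatment effect in SD units
    + 0.50 * students["baseline_school_mean"]   # school-level covariate
    + students["u_school"]
    + rng.normal(0.0, 1.0, len(students))       # student-level residual
)

# Level-2 treatment, block dummies as fixed intercept shifts, a school-level
# covariate, and a random intercept for school (no treatment-by-block interaction).
model = smf.mixedlm(
    "outcome ~ treatment + C(block) + baseline_school_mean",
    data=students,
    groups="school_id",
)
print(model.fit().summary())
```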
The following power analyses use CRT-Power software to estimate the number of schools needed to have adequate statistical power (0.80) for detecting differences between students in the treatment and control schools for the confirmatory outcomes. The power analyses use a correction for multiple statistical tests to estimate power for the two outcomes within the domain of personal readiness.
General Assumptions
REL Midwest researchers will randomly assign schools to conditions within blocks. Blocks will be created within each state based on quartiles of the standardized sum of two standardized school-level variables: the percentage of students eligible for free or reduced-price lunch and the average MCA mathematics score. Half of the schools in each of the four quartile blocks will be randomly assigned to receive the treatment. Statistical models examining program impacts will include a school-level covariate, which will be the baseline year’s school-level dependent variable (e.g., the percentage of students completing the FAFSA) when it is available. Unless otherwise stated, all analyses are based on the following assumptions: (1) 127 students per grade, which was the average number of students in Grade 12 in the high schools that implemented Ramp-Up in 2012–13 and planned to implement it in 2013–14 (MOHE, 2012d); (2) the inclusion of a school-level covariate explaining 60 percent of the variance in the mean outcome for binary outcomes and 75 percent for continuous outcomes; and (3) an alpha of 0.05 for FAFSA completion and 0.025 for the two personal readiness measures. Assumptions about the intraclass correlation coefficient (ICC) for continuous outcomes may not translate well to analyses with binary outcomes. For each outcome, we therefore make assumptions about the ICC based on (1) the likely range of school-level outcomes in the study schools, drawing on data for the 2012–13 cohort when available or on discussions with the program developer and alliance members, and (2) a review of ICCs for binary outcomes in seven RCTs, which found that the median ICC for binary outcomes across studies was 0.05 (Schochet, 2013). Table A-11.1 shows the power associated with different numbers of schools given the stated assumptions. Detailed discussion of each power analysis follows the table.
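As a concrete illustration of the block construction and within-block random assignment described above, the following sketch standardizes the two school-level variables, forms quartile blocks of their sum within each state, and assigns half of the schools in each block to treatment. The column names (state, pct_frl, avg_math_score) and the choice to standardize across the full sample are assumptions; the study’s actual procedure may differ in detail.

```python
# Illustrative sketch of the blocking and random-assignment procedure
# described above; column names are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(20140901)

def block_and_assign(schools: pd.DataFrame) -> pd.DataFrame:
    df = schools.copy()
    # Standardize each school-level variable and take the standardized sum.
    for col in ["pct_frl", "avg_math_score"]:
        df[col + "_z"] = (df[col] - df[col].mean()) / df[col].std(ddof=0)
    df["composite"] = df["pct_frl_z"] + df["avg_math_score_z"]
    # Quartiles of the composite, computed within each state, define the blocks.
    df["block"] = df.groupby("state")["composite"].transform(
        lambda s: pd.qcut(s, 4, labels=False)
    )
    # Randomly assign half of the schools in each state-by-quartile block.
    df["treatment"] = 0
    for _, rows in df.groupby(["state", "block"]).groups.items():
        chosen = rng.choice(rows, size=len(rows) // 2, replace=False)
        df.loc[chosen, "treatment"] = 1
    return df
```

In practice, a data frame with one row per candidate school would be passed in, and the resulting treatment indicator would feed the confirmatory impact models described above.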
Table A-11.1 Minimum Detectable Effect Sizes Based on Numbers of Schools, Teachers, and Students

| Domain | Measure | Effect to Detect | ICC | % Variation Explained by Level-2 Covariate | Number of Students | Estimated Power With 48 Schools | Estimated Power With 54 Schools |
| College actions | Completing the FAFSA | 5 percentage points | 0.02 | 0.60 | 127 | 0.77 | 0.83 |
| Personal readiness | Commitment to college | 0.17 standard deviations | 0.10 | 0.75 | 90 | 0.79 | 0.86 |
| Personal readiness | Goal striving | 0.17 standard deviations | 0.10 | 0.75 | 90 | 0.79 | 0.86 |
Power analysis for college enrollment actions. The study will assess the impact of Ramp-Up on FAFSA completion for students in Grade 12. In four studies providing seven estimates of the impact of a college-readiness intervention on FAFSA completion, treatment effects range from not statistically significant to 24 percentage points. Among the 2013–14 Ramp-Up schools, the average FAFSA completion rate was 54 percent (calculations are based on the school-level FAFSA completions by July 2013 reported by the U.S. Department of Education and the total Grade 12 enrollment for 2012–13 reported by MDE). The treatment effect the study is powered to detect is 5 percentage points. The power analysis assumes that 95 percent of schools in the study cohort will have FAFSA completion rates between 39.9 percent and 68.1 percent, which is consistent with the actual range among 2013–14 Ramp-Up schools (95 percent of those schools had a FAFSA completion rate between 42 percent and 67 percent). This range translates into an ICC of 0.02, which is similar to the ICCs for college expectations (0.03) and for always completing homework (0.04) reported in Schochet (2013).
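The text does not spell out how the assumed range of school completion rates is converted to an ICC. One plausible reconstruction, treating the 95 percent range as spanning plus or minus 1.96 between-school standard deviations and approximating the student-level variance of the binary outcome by p(1 − p), reproduces the stated value; this is an assumption, not the study’s documented formula.

```python
# Back-of-the-envelope reconstruction (an assumption, not the documented
# formula) of the ICC implied by a 95% range of school FAFSA completion rates.
low, high, mean_rate = 0.399, 0.681, 0.54

between_sd = (high - low) / (2 * 1.96)      # 95% range as +/- 1.96 SD
between_var = between_sd ** 2               # about 0.005
within_var = mean_rate * (1 - mean_rate)    # binary outcome, about 0.248

icc = between_var / (between_var + within_var)
print(round(icc, 2))                        # 0.02
```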
Power analysis for personal readiness. The study will assess the impact of Ramp-Up on two measures of personal readiness, Commitment to College and Goal Striving, which are scales from ACT’s ENGAGE assessment. The study team has not found any evaluations using personal college readiness as an outcome, but other evaluations have examined the impact of interventions on related constructs. In a meta-analysis of 68 social-emotional learning interventions aimed at improving social-emotional skills (which include goal setting) among K–12 students, the mean effect size was 0.57 standard deviations (Durlak, Weissberg, Dymnicki, Taylor, & Schellinger, 2011). The power analysis assumes an ICC of 0.10, an effect size of 0.17 standard deviations, and 90 students per school taking the assessment. Because there are two measures in the domain of personal readiness, the power analyses assume an alpha of 0.025 to correct for multiple statistical tests.
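As a rough cross-check on the values in Table A-11.1, the sketch below computes approximate two-sided power for a continuous outcome using the standard two-level cluster-randomized design formula rather than the CRT-Power software itself. The degrees-of-freedom convention, the assumption of eight blocks in total, and the assumption that no student-level covariate is used are simplifications, so the results may differ somewhat from the table.

```python
# Approximate two-sided power for a continuous outcome in a blocked cluster
# RCT; a simplified stand-in for the CRT-Power calculations, not a replica.
import math
from scipy import stats

def crt_power(effect_size, n_schools, n_per_school, icc, r2_level2,
              alpha=0.025, prop_treated=0.5, n_blocks=8, n_covariates=1):
    p = prop_treated
    # Variance of the estimated impact in effect-size units (the level-2
    # covariate explains r2_level2 of the between-school variance; no
    # level-1 covariate is assumed).
    var = (
        (icc * (1 - r2_level2)) / (p * (1 - p) * n_schools)
        + (1 - icc) / (p * (1 - p) * n_schools * n_per_school)
    )
    ncp = effect_size / math.sqrt(var)             # noncentrality parameter
    df = n_schools - n_blocks - n_covariates - 1   # one d.f. convention among several
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    return (1 - stats.nct.cdf(t_crit, df, ncp)) + stats.nct.cdf(-t_crit, df, ncp)

# Commitment to College / Goal Striving assumptions from the text above.
print(round(crt_power(0.17, 54, 90, 0.10, 0.75), 2))
print(round(crt_power(0.17, 48, 90, 0.10, 0.75), 2))
```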
Table A-12.1 Components and Indicators Associated With Ramp-Up Implementation
Implementation Component |
Subcomponent |
Data Source |
Indicator |
Definition of Adequate |
Structural supports |
Ramp-Up leadership team |
Extant data from Consortium |
When was the leadership team established? |
1.0 = Before August |
Structural supports |
Ramp-Up leadership team |
Extant data from CRC |
Does the leadership team identified in the annual plan consist of a principal, a counselor, and a teacher? |
1.0 = The team consists of a principal, a counselor, and a teacher 0.5 = The team consists of three people with two of the following: the principal, a counselor, and a teacher 0.0 = Else |
Structural supports |
Ramp-Up leadership team |
Extant data from CRC |
When did the Ramp-Up leadership team complete a CRC-approved Annual Plan? |
1.0 = Before August |
Structural supports |
Ramp-Up leadership team |
Extant data from CRC |
When did the Ramp-Up leadership team complete a CRC-approved Implementation Calendar? |
1.0 = Before September |
Structural supports |
Ramp-Up leadership team |
May focus group (2) |
How often did the Ramp-Up leadership team meet this school year to guide and monitor Ramp-Up implementation in your school? |
1.0 = The leadership team met at least four times. |
Structural supports |
Ramp-Up coordinator |
Spring staff survey (2) |
Someone at the school identifies himself or herself as the Ramp-Up coordinator. |
1.0 = Yes |
Structural supports |
Ramp-Up coordinator |
Extant data from CRC |
When was a Ramp-Up coordinator chosen? |
1.0 = Before August |
Structural supports |
Ramp-Up coordinator |
May focus group (1) |
Have there been any changes over the school year in who serves as the Ramp-Up coordinator? |
1.0 = No |
Structural supports |
Ramp-Up coordinator |
Spring staff survey (8) |
Do you know who the Ramp-Up coordinator at your school is? |
1.0 = Yes |
Structural supports |
Ramp-Up advisors |
Fall staff survey (28) |
Who at your school is responsible for delivering college-related programs, services, activities, and resources to students? |
1.0 = Includes all teachers; |
Structural supports |
Ramp-Up advisors |
Spring staff survey (1) and (2) |
The percentage of teachers who teach advisories based on the number of staff who identify as a teacher in (1) and who report being a Ramp-Up advisor in (2) |
1.0 = At least 90 percent of teachers 0.5 = 75 percent to less than 90 percent of teachers 0.0 = Less than 75 percent |
Structural supports |
Advanced course offerings |
Extant administrative data |
Do students have the opportunity to take college-level courses (e.g., dual-credit, AP, IB, or College in the Schools)? |
1.0 = Students have the opportunity to take dual-credit, AP, IB, and College in the Schools classes. 0.5 = Students have the opportunity to take some but not all of the specified college-credit coursework. 0.0 = Students do not have the opportunity to take any of these courses. |
Structural supports |
Advanced course offerings |
Extant administrative data |
What kinds of students are eligible to participate in college-level courses? |
1.0 = Sophomores, juniors, and seniors |
Structural supports |
Advanced course offerings |
Extant administrative data |
Do students have the opportunity to take honors courses? |
1.0 = Yes |
Structural supports |
Time |
Spring staff survey (9) |
My school and district gives me enough time to implement the Ramp-Up program. |
1.0 = Agree or agree strongly |
Structural supports |
Time |
Spring staff survey (2) and (10) |
My school and district gives me enough time to coordinate the Ramp-Up program. [applied to leadership team member] |
1.0 = Agree or agree strongly |
Structural supports |
Time |
Spring staff survey (2) and (10) |
My school or district gives me enough time to coordinate the Ramp-Up program. [applied to Ramp-Up coordinator] |
1.0 = Agree or agree strongly |
Structural supports |
Time |
Instructional log (13) |
I had enough time to prepare lesson content prior to teaching today’s workshop. |
1.0 = Agree or agree strongly 0.0 = Disagree or disagree strongly |
Structural supports |
Time |
Spring staff survey (40) |
Did you have enough time to prepare lesson content prior to teaching it? |
1.0 = Always |
Structural supports |
Technology platform |
Fall staff survey (16) |
What percentage of students at your high school use a technology platform (e.g., Naviance, Minnesota Career Information System, Wisconsin Career Information System) to support the development of their postsecondary plans?
|
1.0 = More than 75 percent |
Structural supports |
Technology platform |
Fall student survey (5) |
Is your postsecondary plan stored electronically? |
1.0 = Yes |
Structural supports |
Technology platform |
Spring student survey (5) |
Is your postsecondary plan stored electronically? |
1.0 = Yes |
|
|
|
|
|
Professional development |
For Ramp-Up leadership team or coordinator |
Spring staff survey (2) and (57) |
Have you received any training by the University of Minnesota’s College Readiness Consortium? [applied to Leadership Team members and Coordinator] |
1.0 = Yes |
Professional development |
For Ramp-Up leadership team or coordinator |
Extant data from CRC |
The percentage of ramp-up leadership team members present at the planning session |
1.0 = 100 percent |
Professional development |
For Ramp-Up leadership team or coordinator |
Extant data from CRC |
Did the Ramp-Up Coordinator attend the training? |
1.0 = Yes |
Professional development |
For Ramp-Up leadership team or coordinator |
Spring staff survey (2) and (58) |
The training I received provided useful information to me about how to gain staff support for implementing a schoolwide college-readiness program. [applies to Leadership Team and Coordinator] |
1.0 = Agree or agree strongly |
Professional development |
For Ramp-Up leadership team or coordinator |
Spring staff survey (59) |
The training I received provided useful information to me about my role and responsibilities in delivering Ramp-Up. |
1.0 = Agree or agree strongly |
Professional development |
For Ramp-Up leadership team or coordinator |
Spring staff survey (60) |
Staff members at the College Readiness Consortium have responded effectively to questions I have asked about the Ramp-Up program. |
1.0 = Agree or agree strongly |
Professional development |
For teachers |
May focus group (10) |
At the beginning of the school year, were four hours spent introducing staff to the Ramp-Up program? |
1.0 = Yes |
Professional development |
For teachers |
Spring staff survey (63) |
How many times this school year did you attend training on Ramp-Up? |
1.0 = Every month or more than once a month 0.5 = A couple of times 0.0 = One time or not at all |
Professional development |
For teachers |
Spring staff survey (64) |
The training I received helped me to understand why my school has adopted a college-readiness program. |
1.0 = Agree or agree strongly |
Professional development |
For teachers |
Spring staff survey (65) |
The training I received helped me understand the Ramp-Up curriculum. |
1.0 = Agree or agree strongly |
Professional development |
For teachers |
Spring staff survey (66) |
The training I received helped me understand the Ramp-Up tools (specifically, the Postsecondary Plan and the Readiness Rubric). |
1.0 = Agree or agree strongly |
Professional development |
For teachers |
Spring staff survey (67) |
The training I received provided useful information to me about my role and responsibilities in delivering Ramp-Up. |
1.0 = Agree or agree strongly |
|
|
|
|
|
Curriculum delivery |
Materials |
Spring staff survey (36) |
Did you receive an Advisor Guide at the beginning of the school year that describes the lesson plan and activities for each weekly advisory? |
1.0 = Yes |
Curriculum delivery |
Materials |
Spring staff survey (37) |
How often did you receive information from the Ramp-Up coordinator about a lesson prior to teaching it? |
1.0 = Always |
Curriculum delivery |
Materials |
Spring staff survey (39) |
How often did you provide the Ramp-Up instructional materials and resources to students at the time assigned for the advisory? |
1.0 = Always |
Curriculum delivery |
Sufficient information |
Spring staff survey (68) |
I have enough information about the college selection and enrollment process to teach the Ramp-Up curriculum. |
1.0 = Agree or agree strongly |
Curriculum delivery |
Sufficient information |
Spring staff survey (70) |
I have enough information about the knowledge and skills needed to succeed in college to teach the Ramp-Up curriculum. |
1.0 = Agree or agree strongly |
Curriculum delivery |
Sufficient information |
Instructional log (14) |
I had enough information about the college selection and enrollment process to teach today’s workshop. |
1.0 = Agree or agree strongly |
Curriculum delivery |
Sufficient information |
Instructional log (15) |
I had enough information about the knowledge and skills needed to succeed in college to teach today’s workshop. |
1.0 = Agree or agree strongly 0.0 = Disagree or disagree strongly |
Curriculum delivery |
Sufficient information |
Spring staff survey (1) and (21) |
How familiar are you with the Ramp-Up curriculum? [applied to Ramp-Up advisors] |
1.0 = Very familiar |
Curriculum delivery |
Advisories |
Spring staff survey (34) |
How many of the Ramp-Up lessons did you teach this school year in your advisory? |
1.0 = All 28 lessons |
Curriculum delivery |
Advisories |
Spring staff survey (2) and (33) |
Percentage of staff who self-identify as a Ramp-Up advisor and who indicated they taught at least one Ramp-Up advisory this school year |
1.0 = 100 percent |
Curriculum delivery |
Workshops |
May focus group (6a) |
How many workshops were held over the course of the year? |
1.0 = Five workshops |
Curriculum delivery |
Workshops |
May focus group (6d) |
On average, how long did these workshops last? |
60.0 = More than 60 minutes |
Curriculum content |
Academic readiness |
Fall student survey (10) |
So far this school year, has an adult at your school encouraged you to take an honors course or a course for college credit? |
1.0 = Yes |
Curriculum content |
Academic readiness |
Spring student survey (10) |
So far this school year, has an adult at your school encouraged you to take an honors course or a course for college credit? |
1.0 = Yes |
Curriculum content |
Academic readiness |
Fall student survey (11) |
So far this school year, how often has an adult at your high school discussed with you your academic readiness for college-level classes? |
1.0 = Three or more times |
Curriculum content |
Academic readiness |
Spring student survey (11) |
So far this school year, how often has an adult at your high school discussed with you your academic readiness for college-level classes? |
1.0 = Three or more times |
Curriculum content |
Admissions readiness |
Fall student survey (12) and (1) |
I know which type of college would help me reach my goals after high school. [applied to Grade 11 and Grade 12 students] |
1.0 = Agree or agree strongly |
Curriculum content |
Admissions readiness |
Spring student survey (12) and (1) |
I know which type of college would help me reach my goals after high school. [applied to Grade 11 and Grade 12 students] |
1.0 = Agree or agree strongly |
Curriculum content |
Admissions readiness |
Fall student survey (13) and (1) |
So far this school year, how often has an adult at your high school discussed with you the steps that you need to take to apply to the type of college that you want to attend? [for Grade 11 and Grade 12 students] |
1.0 = Three or more times |
Curriculum content |
Admissions readiness |
Spring student survey (13) and (1) |
So far this school year, how often has an adult at your high school discussed with you the steps that you need to take to apply to the type of college that you want to attend? [for Grade 11 and Grade 12 students] |
1.0 = Three or more times |
Curriculum content |
Admissions readiness |
Fall student survey (14) and (1) |
So far this school year, how often has an adult at your high school discussed with you your likelihood of being accepted at different types of colleges? [applied to Grade 11 and Grade 12 students] |
1.0 = Three or more times 0.5 = Once or twice 0.0 = Never |
Curriculum content |
Admissions readiness |
Spring student survey (14) and (1) |
So far this school year, how often has an adult at your high school discussed with you your likelihood of being accepted at different types of colleges? [applied to Grade 11 and Grade 12 students] |
1.0 = Three or more times 0.5 = Once or twice 0.0 = Never |
Curriculum content |
Admissions readiness |
Fall student survey (23) and (1) |
So far this school year, how much have your teachers, counselors, or other school staff helped you with a college application essay or personal statement? [applied to Grade 12 students] |
1.0 = A lot or some |
Curriculum content |
Admissions readiness |
Spring student survey (24) and (1) |
So far this school year, how much have your teachers, counselors, or other school staff helped you with a college application essay or personal statement? [applied to Grade 12 students] |
1.0 = A lot or some |
Curriculum content |
Career readiness |
Fall student survey (17) |
So far this school year, how helpful has your high school been in helping you assess your career interests and abilities? |
1.0 = Very helpful or helpful |
Curriculum content |
Career readiness |
Spring student survey (17) |
So far this school year, how helpful has your high school been in helping you assess your career interests and abilities? |
1.0 = Very helpful or helpful |
Curriculum content |
Career readiness |
Fall student survey (18) |
How helpful has your high school been in helping you to develop a career plan? |
1.0 = Very helpful or helpful |
Curriculum content |
Career readiness |
Spring student survey (18) |
How helpful has your high school been in helping you to develop a career plan? |
1.0 = Very helpful or helpful |
Curriculum content |
Financial readiness |
Fall student survey (19) |
So far this school year, how often has an adult at your school talked to you about how to pay for tuition or other college expenses? |
1.0 = Three or more times |
Curriculum content |
Financial readiness |
Spring student survey (19) |
So far this school year, how often has an adult at your school talked to you about how to pay for tuition or other college expenses? |
1.0 = Three or more times |
Curriculum content |
Financial readiness |
Spring student survey (26) and (1) |
So far this school year, how much have your teachers, counselors, or other school staff helped you fill out the FAFSA? [applied to Grade 12 students] |
1.0 = A lot or some |
Curriculum content |
Personal and social readiness |
Fall student survey (9) |
I know the skills that I need to work on if I am going to graduate from high school ready for success in college. |
1.0 = Agree or agree strongly |
Curriculum content |
Personal and social readiness |
Spring student survey (9) |
I know the skills that I need to work on if I am going to graduate from high school ready for success in college. |
1.0 = Agree or agree strongly |
Postsecondary planning tools |
Use of Postsecondary Plan |
Spring staff survey (41) |
How familiar are you with the Postsecondary Plan? |
1.0 = Very familiar |
Postsecondary planning tools |
Use of Postsecondary Plan |
Spring staff survey (42) |
The Postsecondary Plan helps students to develop a plan for their life after high school. |
1.0 = Agree or agree strongly |
Postsecondary planning tools |
Use of Postsecondary Plan |
Spring staff survey (43) |
I use the Postsecondary Plan when helping students develop plans for their life after high school. |
1.0 = Agree or agree strongly |
Postsecondary planning tools |
Use of Postsecondary Plan |
Spring staff survey (44) |
How many students in your Ramp-Up advisory completed the Postsecondary Plan at least once this year? |
1.0 = All students |
Postsecondary planning tools |
Use of Postsecondary Plan |
Spring student survey (6) |
At last registration time, did school staff help you in choosing classes that you need to reach your goals for after high school? |
1.0 = Yes |
Postsecondary planning tools |
Use of Readiness Rubric |
Spring staff survey (46) |
How familiar are you with the Readiness Rubric? |
1.0 = Very familiar 0.5 = Moderately familiar 0.0 = Slightly familiar or not at all familiar |
Postsecondary planning tools |
Use of Readiness Rubric |
Spring staff survey (47) |
The Readiness Rubric helps students to monitor their progress toward their postsecondary goals. |
1.0 = Agree or agree strongly |
Postsecondary planning tools |
Use of Readiness Rubric |
Spring staff survey (48) |
I use the Readiness Rubric to monitor students’ progress toward their postsecondary goals. |
1.0 = Agree or agree strongly |
Postsecondary planning tools |
Use of Readiness Rubric |
Spring staff survey (49) |
How many students in your Ramp-Up advisory completed the Readiness Rubric at least twice this year? |
1.0 = All students |
Postsecondary planning tools |
Use of Readiness Rubric |
Fall student survey (7) |
So far this school year, how many times have you discussed your progress towards attaining your postsecondary plan with a counselor, teacher, or other adult in your school? |
1.0 = Three times or more |
Postsecondary planning tools |
Use of Readiness Rubric |
Spring student survey (7) |
So far this school year, how many times have you discussed your progress toward attaining your postsecondary plan with a counselor, teacher, or other adult in your school? |
1.0 = Three times or more |
Postsecondary planning tools |
Communication with parents |
Spring staff survey (45) |
For how many students in your Ramp-Up advisory have you discussed a student’s Postsecondary Plan with his or her parents? |
1.0 = All students |
Postsecondary planning tools |
Communication with parents |
Spring staff survey (50) |
For how many students in your Ramp-Up advisory have you discussed a student’s Readiness Rubric with his or her parents? |
1.0 = All students |
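The indicators above are converted to numeric fidelity scores using the recoded values shown in the right-hand column. As an illustration only, the following minimal sketch (written in Python, with hypothetical responses and a hypothetical function name) shows how a single survey-based indicator could be recoded to the 0.0/0.5/1.0 scale and then summarized for a school; the authoritative scoring rules are the recoded values listed in the rubric itself.

# Minimal sketch (hypothetical data): recode spring staff survey item (63),
# "How many times this school year did you attend training on Ramp-Up?",
# into the 0.0/0.5/1.0 scale listed in the rubric above.
def recode_training_frequency(response):
    if response in ("Every month", "More than once a month"):
        return 1.0
    if response == "A couple of times":
        return 0.5
    return 0.0  # "One time" or "Not at all"

# Hypothetical responses from three advisors in one school:
responses = ["Every month", "A couple of times", "Not at all"]
scores = [recode_training_frequency(r) for r in responses]
print(sum(scores) / len(scores))  # simple school-level average: 0.5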
Table A-13.1 Rubric for Assessing Students’ Exposure to Ramp-Up
Activity |
Exposure Factor |
Data Source |
Indicator |
Recoded Indicator for Analysis |
Further Transformations |
Ramp-Up advisories |
Participation of Grade 10 students |
Final instructional logs (22) and (28) |
For teachers reporting that they are assigned to a Grade 10 advisory, their response to the question: on average, what percentage of students scheduled to attend your weekly advisory have attended every session so far? |
0.875 = More than 75 percent |
None |
Ramp-Up advisories |
Participation of Grade 11 students |
Final instructional logs (24) and (28) |
For teachers reporting that they are assigned to a Grade 11 advisory, their response to the question: on average, what percentage of students scheduled to attend your weekly advisory have attended every session so far? |
0.875 = More than 75 percent |
None |
Ramp-Up advisories |
Participation of Grade 12 students |
Final instructional logs (26) and (28) |
For teachers reporting that they are assigned to a Grade 12 advisory, their response to the question: on average, what percentage of students scheduled to attend your weekly advisory have attended every session so far? |
0.875 = More than 75 percent |
None |
Ramp-Up advisories |
Frequency for Grade 10 students |
Final instructional logs (22) and (23) |
Total number of lessons taught to Grade 10 students |
Number ranging from 0 to 28 |
None |
Ramp-Up advisories |
Frequency for Grade 11 students |
Final instructional logs (24) and (25) |
Total number of lessons taught to Grade 11 students |
Number ranging from 0 to 28 |
None |
Ramp-Up advisories |
Frequency for Grade 12 students |
Final instructional logs (26) and (27) |
Total number of lessons taught to Grade 12 students |
Number ranging from 0 to 28 |
None |
Ramp-Up advisories |
Duration for Grade 10 students |
Final instructional log (22) and (29) |
For teachers reporting that they are assigned to a Grade 10 advisory, their response to the question: on average, how long have the weekly advisory sessions you’ve taught so far this year lasted? |
30.0 = 30 minutes |
None |
Ramp-Up advisories |
Duration for Grade 11 students |
Final instructional log (24) and (29) |
For teachers reporting that they are assigned to a Grade 11 advisory, their response to the question: on average, how long have the weekly advisory sessions you’ve taught so far this year lasted? |
30.0 = 30 minutes |
None |
Ramp-Up advisories |
Duration for Grade 12 students |
Final instructional log (26) and (29) |
For teachers reporting that they are assigned to a Grade 12 advisory, their response to the question: on average, how long have the weekly advisory sessions you’ve taught so far this year lasted? |
30.0 = 30 minutes |
None |
Ramp-Up workshops |
Participation of Grade 10 students |
All instructional logs (6) and (7) |
What percentage of your students attended the workshop? (based on responses from teachers who taught a workshop for Grade 10 students) |
Number ranging from 0 to 1 |
Averaged across all instructional logs |
Ramp-Up workshops |
Participation of Grade 11 students |
All instructional logs (5) and (6) |
What percentage of your students attended the workshop? (based on responses from teachers who taught a workshop for Grade 11 students) |
Number ranging from 0 to 1 |
Averaged across all instructional logs |
Ramp-Up workshops |
Participation of Grade 12 students |
All instructional logs (5) and (6) |
What percentage of your students attended the workshop? (for teachers who taught a workshop for Grade 12 students) |
Number ranging from 0 to 1 |
Averaged across all instructional logs |
Ramp-Up workshops |
Frequency for Grade 10 students |
May focus group (6a) |
How many workshops were held over the course of the year? (for Grade 10 students) |
Number ranging from 0 to 5 |
None |
Ramp-Up workshops |
Frequency for Grade 11 students |
May focus group (6a) |
How many workshops were held over the course of the year? (for Grade 11 students) |
Number ranging from 0 to 5 |
None |
Ramp-Up workshops |
Frequency for Grade 12 students |
May focus group (6a) |
How many workshops were held over the course of the year? (for Grade 12 students) |
Number ranging from 0 to 5 |
None |
Ramp-Up workshops |
Duration for Grade 10 students |
All instructional logs (7) and (8) |
How long did today’s workshop last? (based on responses from teachers who taught a workshop for Grade 10 students) |
60.0 = More than 60 minutes |
Averaged across all instructional logs |
Ramp-Up workshops |
Duration for Grade 11 students |
All instructional logs (7) and (8) |
How long did today’s workshop last? (based on responses from teachers who taught a workshop for Grade 11 students) |
60.0 = More than 60 minutes |
Averaged across all instructional logs |
Ramp-Up workshops |
Duration for Grade 12 students |
All instructional logs (7) and (8) |
How long did today’s workshop last? (based on responses from teachers who taught a workshop for Grade 12 students) |
60.0 = More than 60 minutes |
Averaged across all instructional logs |
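Several workshop exposure factors above specify "Averaged across all instructional logs" as a further transformation. As an illustration only, the following minimal sketch (Python, with hypothetical values) shows that transformation: the participation proportion reported on each instructional log is averaged to produce one exposure value for the grade within a school.

# Minimal sketch (hypothetical values): average the workshop participation
# proportions reported on each instructional log for Grade 10 within a school.
grade10_workshop_logs = [0.80, 0.95, 0.70, 0.85, 0.90]  # one proportion per log
participation_exposure = sum(grade10_workshop_logs) / len(grade10_workshop_logs)
print(round(participation_exposure, 2))  # 0.84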
Table A-14.1 Rubric for Comparing College-Readiness Supports in Treatment and Control Schools
Component for Making Contrasts |
Subcomponent |
Data Source |
Indicator |
Recoded Indicator for Analysis |
Structural Supports |
Advanced Course Offerings |
Extant administrative data |
Do students have the opportunity to take college-level courses (e.g., dual-credit, AP, IB, or College in the Schools)? |
1.0 = Students have the opportunity to take dual-credit, AP, IB, and College in the Schools classes; |
Structural Supports |
Advanced Course Offerings |
Extant administrative data |
What kinds of students are eligible to participate in college-level courses? |
1.0 = Sophomores, juniors, and seniors |
Structural Supports |
Advanced Course Offerings |
Extant administrative data |
Do students have the opportunity to take honors courses? |
1.0 = Yes |
Structural Supports |
Technology Platform |
Fall Staff Survey (16) |
What percentage of students at your high school use a technology platform (e.g., Naviance, Minnesota Career Information System, Wisconsin Career Information System) to support the development of their postsecondary plans? |
1.0 = More than 75 percent |
Structural Supports |
Technology Platform |
Fall Student Survey (5) |
Is your postsecondary plan stored electronically? |
1.0 = Yes |
Structural Supports |
Technology Platform |
Spring Student Survey (5) |
Is your postsecondary plan stored electronically? |
1.0 = Yes |
Professional Development |
For Staff |
Fall Staff Survey (13) |
So far this school year, have you received any professional development related to preparing students for college? |
1.0 = Yes |
Professional Development |
For Staff |
Fall Staff Survey (2) and (13) |
Calculated based on the number of teachers in a school, identified in (2), who responded that they had received professional development related to preparing students for college (13) |
1.0 = All teachers |
Curriculum Content |
Academic Readiness |
Fall Student Survey (10) |
So far this school year, has an adult at your school encouraged you to take an honors course or a course for college credit? |
1.0 = Yes |
Curriculum Content |
Academic Readiness |
Spring Student Survey (10) |
So far this school year, has an adult at your school encouraged you to take an honors course or a course for college credit? |
1.0 = Yes |
Curriculum Content |
Academic Readiness |
Fall Student Survey (11) |
So far this school year, how often has an adult at your high school discussed with you your academic readiness for college-level classes? |
1.0 = 3 or more times |
Curriculum Content |
Academic Readiness |
Spring Student Survey (11) |
So far this school year, how often has an adult at your high school discussed with you your academic readiness for college-level classes? |
1.0 = 3 or more times |
Curriculum Content |
Admissions Readiness |
Fall Student Survey (12) and (1) |
I know which type of college would help me reach my goals after high school. [applied to Grade 11 and Grade 12 students] |
1.0 = Agree or agree strongly |
Curriculum Content |
Admissions Readiness |
Spring Student Survey (12) and (1) |
I know which type of college would help me reach my goals after high school. [applied to Grade 11 and Grade 12 students] |
1.0 = Agree or agree strongly |
Curriculum Content |
Admissions Readiness |
Fall Student Survey (13) and (1) |
So far this school year, how often has an adult at your high school discussed with you the steps that you need to take to apply to the type of college that you want to attend? (for Grade 11 and Grade 12 students) |
1.0 = 3 or more times |
Curriculum Content |
Admissions Readiness |
Spring Student Survey (13) and (1) |
So far this school year, how often has an adult at your high school discussed with you the steps that you need to take to apply to the type of college that you want to attend? (for Grade 11 and Grade 12 students) |
1.0 = 3 or more times |
Curriculum Content |
Admissions Readiness |
Fall Student Survey (14) and (1) |
So far this school year, how often has an adult at your high school discussed with you your likelihood of being accepted at different types of colleges? [applied to Grade 11 and Grade 12 students] |
1.0 = 3 or more times 0.5 = Once or twice 0.0 = Never |
Curriculum Content |
Admissions Readiness |
Spring Student Survey (14) and (1) |
So far this school year, how often has an adult at your high school discussed with you your likelihood of being accepted at different types of colleges? [applied to Grade 11 and Grade 12 students] |
1.0 = 3 or more times 0.5 = Once or twice 0.0 = Never |
Curriculum Content |
Admissions Readiness |
Fall Student Survey (23) and (1) |
So far this school year, how much have your teachers, counselors, or other school staff helped you with a college application essay or personal statement? [applied to Grade 12 students] |
1.0 = A lot or some |
Curriculum Content |
Admissions Readiness |
Spring Student Survey (24) and (1) |
So far this school year, how much have your teachers, counselors, or other school staff helped you with a college application essay or personal statement? [applied to Grade 12 students] |
1.0 = A lot or some |
Curriculum Content |
Career Readiness |
Fall Student Survey (17) |
So far this school year, how helpful has your high school been in helping you assess your career interests and abilities? |
1.0 = Very helpful or helpful |
Curriculum Content |
Career Readiness |
Spring Student Survey (17) |
So far this school year, how helpful has your high school been in helping you assess your career interests and abilities? |
1.0 = Very helpful or helpful |
Curriculum Content |
Career Readiness |
Fall Student Survey (18) |
How helpful has your high school been in helping you to develop a career plan? |
1.0 = Very helpful or helpful |
Curriculum Content |
Career Readiness |
Spring Student Survey (18) |
How helpful has your high school been in helping you to develop a career plan? |
1.0 = Very helpful or helpful |
Curriculum Content |
Financial Readiness |
Fall Student Survey (19) |
So far this school year, how often has an adult at your school talked to you about how to pay for tuition or other college expenses? |
1.0 = 3 or more times |
Curriculum Content |
Financial Readiness |
Spring Student Survey (19) |
So far this school year, how often has an adult at your school talked to you about how to pay for tuition or other college expenses? |
1.0 = 3 or more times |
Curriculum Content |
Financial Readiness |
Spring Student Survey (26) and (1) |
So far this school year, how much have your teachers, counselors, or other school staff helped you fill out the FAFSA? [applied to Grade 12 students] |
1.0 = A lot or some |
Curriculum Content |
Personal/Social Readiness |
Fall Student Survey (9) |
I know the skills that I need to work on if I am going to graduate from high school ready for success in college. |
1.0 = Agree or agree strongly |
Curriculum Content |
Personal/Social Readiness |
Spring Student Survey (9) |
I know the skills that I need to work on if I am going to graduate from high school ready for success in college. |
1.0 = Agree or agree strongly |
Postsecondary Planning Tools |
Plans and Monitoring |
Fall Student Survey (4) |
This school year, have you developed a written postsecondary plan with a counselor, teacher, or other adult in your school that describes your educational or career plans for after high school? |
1.0 = Yes |
Postsecondary Planning Tools |
Plans and Monitoring |
Spring Student Survey (4) |
This school year, have you developed a written postsecondary plan with a counselor, teacher, or other adult in your school that describes your educational or career plans for after high school? |
1.0 = Yes |
Postsecondary Planning Tools |
Use of Readiness Rubric |
Fall Student Survey (7) |
So far this school year, how many times have you discussed your progress towards attaining your postsecondary plan with a counselor, teacher, or other adult in your school? |
1.0 = Three times or more |
Postsecondary Planning Tools |
Use of Readiness Rubric |
Spring Student Survey (7) |
So far this school year, how many times have you discussed your progress towards attaining your postsecondary plan with a counselor, teacher, or other adult in your school? |
1.0 = Three times or more |
Postsecondary Planning Tools |
Plans and Monitoring |
Fall Student Survey (6) |
At last registration time, did school staff help you in choosing classes that you need to reach your goals for after high school? |
1.0 = Yes |
Postsecondary Planning Tools |
Plans and Monitoring |
Spring Student Survey (6) |
At last registration time, did school staff help you in choosing classes that you need to reach your goals for after high school? |
1.0 = Yes |
Postsecondary Planning Tools |
Plans and Monitoring |
Fall Staff Survey (17) and (18) |
For staff who indicate that staff at their school provide feedback to students about whether they are on track academically for college: How often do all students receive feedback? |
1.0 = All students in grades 10 through 12 receive feedback multiple times per year |
Postsecondary Planning Tools |
Plans and Monitoring |
Fall Staff Survey (17) and (19) |
How do students receive feedback? |
1.0 = Feedback is discussed with the student |
Postsecondary Planning Tools |
Plans and Monitoring |
Fall Staff Survey (15) |
Do all students at your school develop a written plan for achieving their educational or career goals after high school? |
1.0 = Yes |
Postsecondary Planning Tools |
Plans and Monitoring |
Fall Staff Survey (23a) |
Does your school collect timely information about which students complete the following college enrollment actions: College applications |
1.0 = Yes |
Postsecondary Planning Tools |
Plans and Monitoring |
Fall Staff Survey (23b) |
Does your school collect timely information about which students complete the following college enrollment actions: FAFSA application |
1.0 = Yes |
Postsecondary Planning Tools |
Plans and Monitoring |
Fall Staff Survey (23a) |
Does your school collect timely information about which students complete the following college enrollment actions: Scholarship applications |
1.0 = Yes |
Postsecondary Planning Tools |
Plans and Monitoring |
Fall Staff Survey (23a) |
Does your school collect timely information about which students complete the following college enrollment actions: Completion of a college admissions exam |
1.0 = Yes |
Postsecondary Planning Tools |
Communication with Parents |
Fall Staff Survey (29) |
For which of your students do you communicate with parents/guardians about their children’s readiness for college? |
1.0 = All 0.5 = Most |
Postsecondary Planning Tools |
Communication with Parents |
Fall Staff Survey (29) |
How often do you communicate with parents/guardians about their children’s readiness for college? |
0.5 = At least once per school year for all students 0.0 = Else |
1 These analyses seek to explore—rather than confirm—relationships, and so no power estimates were calculated for these analyses. Thus, these impact estimates may be underpowered. Per IES/NCEE guidance, exploratory analyses should be limited in number, require no additional data collection, and help with the interpretation of findings for the confirmatory analyses. The data for these two questions come from the same data draws as the covariates to be analyzed to address CRQ1 and 2, and so no new data are being collected. Findings will help determine whether Ramp-Up affects students in different groups equally and whether the impacts are seen with longer-term outcomes.
2 Minnesota schools are required to classify all local course offerings using the Minnesota Common Course Catalogue. This statewide classification system indicates whether a class is an enriched, honors, or advanced class; a dual/concurrent enrollment class; a class with an articulated curriculum agreement (such classes align high school and college curricula); or a class leading to an industry/occupation certification (Minnesota Department of Education [MDE], 2012). These classes are generally perceived to be relatively rigorous and, for purposes of this evaluation, will all be considered “advanced.” See Appendix D for further information on the classification system. Wisconsin does not have a common course catalogue, and the two schools from Wisconsin will not be included in this analysis.
3 If Minnesota requires all students to take the ACT, then this part of the ERQ will be dropped.
4 Enrollment in advanced coursework and personal readiness will be examined for students in Grades 10, 11, and 12. Taking the ACT or SAT exam will be examined for students in Grade 11, and the remaining college actions (submitting at least one college application and completing the FAFSA) will be examined for students in Grade 12.
5 Prior research suggests that low-income students or those whose parents did not attend college have greater needs for assistance in the college enrollment process and less access to academic opportunities to prepare them for college. However, Ramp-Up aims to improve college readiness among all students, not just those born into financially secure families. The second subgroup analysis will provide evidence of whether Ramp-Up does improve college readiness for all students, regardless of family income.
6 In a What Works Clearinghouse (WWC) review of college access programs, Tierney et al. (2009) found that only 16 of more than 500 studies met the WWC standards for evidence.
7 Of these 56 schools, 22 are participating in another study being conducted by the REL program. Half of the 22 schools (11 schools) are implementing Ramp-Up in 2013–2014, and the other half will implement Ramp-Up in 2014–2015. These 22 schools are not part of the sample to be studied in the impact study described in this OMB application.
8 Individual students and schools will not be described. Student- and school-level data will be aggregated to describe the early and later implementing groups of schools or student subgroups.
9 In Minnesota, the student unique identification number is the student Minnesota Automated Reporting Student System (MARSS) number. In Wisconsin, the identification number is the Wisconsin Student Number (WSN).
10 Although MDE collects some ACT and SAT information at the student level, staff at MDE recommended that schools or districts provide these data because schools or districts may be more reliable sources of this information.
11 The fall questionnaire will not ask about completion of the FAFSA because students will not have had the opportunity to complete this action yet.
12 Sample items in this scale include: “A college education will help me achieve my goals” and “I am committed to attend and finish college regardless of obstacles” (ACT, 2012, p. 33).
13 Sample items in this scale include: “Once I set a goal, I do my best to achieve it” and “I bounce back after facing disappointment or failure” (ACT, 2012, p. 34).
14 ACT has provided some sample items. Sample items in the Commitment to College scale include: “A college education will help me achieve my goals” and “I am committed to attend and finish college regardless of obstacles.” Sample items in the Goal Striving scale are “Once I set a goal, I do my best to achieve it” and “I bounce back after facing disappointment or failure” (ACT, 2012, pp. 33–34).
15 These studies examine the predictive validity of an earlier version of the ENGAGE assessment known as the Student Readiness Inventory among two- and four-year college students.
16 The study team considered whether it would be possible to collect this type of data in observations of workshops and advisories. Collecting reliable information would require multiple observations, and the cost of this type of data collection exceeds the project resources. The study team plans informal observations of some professional development training and potentially some workshops or advisories. However, because these data will be collected informally, they will not factor into the measures of implementation, and their use in any reporting will be limited to providing anecdotal evidence to support a point (the report will note that the information is anecdotal).
17 In some cases (e.g., for staff surveys), creating a school-level indicator score will require averaging responses for all staff within a school.
18 REL Midwest will consult with the program developers to determine whether this will be a straight average or whether some components should receive a higher weight. This decision will be made prior to data collection.
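As an illustration of the calculation described in footnotes 17 and 18, the following minimal sketch (Python, with hypothetical component scores and hypothetical weights) shows both a straight average and a weighted average of school-level component scores; which approach and which weights are used will follow the consultation with the program developers described above.

# Minimal sketch (hypothetical scores and weights): combine component-level
# implementation scores into one school-level score, either as a straight
# average or as a weighted average.
components = {
    "Professional development": 0.90,
    "Curriculum delivery": 0.50,
    "Curriculum content": 0.80,
    "Postsecondary planning tools": 0.60,
}
straight_average = sum(components.values()) / len(components)

weights = {  # hypothetical weights; actual weights would be set with the developers
    "Professional development": 1.0,
    "Curriculum delivery": 2.0,
    "Curriculum content": 2.0,
    "Postsecondary planning tools": 1.0,
}
weighted_average = (sum(components[c] * weights[c] for c in components)
                    / sum(weights.values()))
print(round(straight_average, 3))   # 0.7
print(round(weighted_average, 3))   # approximately 0.683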
19 For most students, this survey will be administered in conjunction with ACT’s ENGAGE assessment. Because ACT’s technology does not allow for skip patterns when questions are added to their assessment, this questionnaire does not use skips. Instead, response categories indicating that a question does not apply to a student have been included when needed.
20 An * indicates that this question is based on a question included on a survey from the Center for Applied Research and Educational Improvement (CAREI) at the University of Minnesota. ** indicates that this question is based on a question included on the Consortium on Chicago School Research 2009 12th-grade student survey (http://ccsr.uchicago.edu/downloads/23532009_my_voice_senior_student_codebook.pdf). Some questions have been reworded slightly to accommodate this study.
21 For most students, this survey will be administered in conjunction with ACT’s ENGAGE assessment. Because ACT’s technology does not allow for skip patterns when questions are added to their assessment, this questionnaire does not use skips. Instead, response categories indicating that a question does not apply to a student have been included when needed.
22 An * indicates that this question is based on a question included on a survey from the Center for Applied Research and Educational Improvement (CAREI) at the University of Minnesota. ** indicates that this question is based on a question included on the Consortium on Chicago School Research 2009 12th-grade student survey (http://ccsr.uchicago.edu/downloads/23532009_my_voice_senior_student_codebook.pdf). Some questions have been reworded slightly to accommodate this study.
23 An * indicates that this question is based on a question included on a survey from the Center for Applied Research and Educational Improvement (CAREI) at the University of Minnesota. ** indicates that this question is based on a question included on the Consortium on Chicago School Research 2009 12th-grade student survey (http://ccsr.uchicago.edu/downloads/23532009_my_voice_senior_student_codebook.pdf). *** indicates that this question is based on a question included on a survey of counselors conducted by Northwestern University’s High School to College Transition Study (James E. Rosenbaum, principal investigator). Some questions have been reworded slightly to accommodate this study.
24 An * indicates that this question is based on a question included on a survey from the Center for Applied Research and Educational Improvement (CAREI) at the University of Minnesota.
25 Urbanicity will be considered as an additional blocking factor if the diversity of locale is greater in the final set of schools than in previous samples of schools that have volunteered for Ramp-Up to Readiness.
26 Bloom, Richburg-Hayes, and Black (2007) estimate that the average proportion of school-level variance reduced by a pretest school-level covariate ranges from 0.91 to 0.97 for 10th graders in an analysis of reading and mathematics achievement. Similarly, Hedges and Hedberg (2007) estimate a reduction ranging from 0.87 to 0.98 in the school-level variance of reading or mathematics achievement with the use of a school-level pretest for high school students. Schochet (2013), however, concludes that the amount of variation explained by a binary school-level covariate is less than that from a continuous school-level covariate. We have not found estimates for the specific outcomes used in this analysis but have assumed a lower estimate of variance explained for all outcomes and even lower for binary outcomes.
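To illustrate how the variance reduction described in footnote 26 affects statistical power, the following minimal sketch (Python, with purely illustrative inputs rather than the study’s actual assumptions, which appear in Attachment A-11) applies the standard minimum detectable effect size approximation for a two-level cluster-randomized design.

import math

def mdes(J, n, P, icc, r2_school, r2_student, multiplier=2.8):
    # Standard two-level approximation; a multiplier of about 2.8 corresponds to
    # 80 percent power with a two-tailed test at alpha = .05.
    school_term = icc * (1 - r2_school) / (P * (1 - P) * J)
    student_term = (1 - icc) * (1 - r2_student) / (P * (1 - P) * J * n)
    return multiplier * math.sqrt(school_term + student_term)

# Illustrative inputs: 54 schools, 100 students per school, half assigned to
# treatment, intraclass correlation of 0.15.
print(round(mdes(54, 100, 0.5, 0.15, r2_school=0.0, r2_student=0.0), 3))    # about 0.30
print(round(mdes(54, 100, 0.5, 0.15, r2_school=0.90, r2_student=0.50), 3))  # about 0.11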