
Roads to Success in North Dakota: A Randomized Study of a College and Career Preparation Curriculum




Supporting Statement

Part A







Request for OMB Review
OMB #1830-NEW







Prepared by RTI International and FHI 360 for:


Office of Career, Technical, and Adult Education

U.S. Department of Education


Table of Contents







Attachments

Appendix A Respondent Communication Materials

Appendix B Facsimile Student, Instructor, and Principal Surveys

Appendix C Site Visit Procedures and Protocols


A. Justification

A.1 Circumstances Necessitating Collection of Information

A.1.a Purpose of this Submission

This package requests approval from the Office of Management and Budget (OMB) for data collection activities for a rigorous evaluation of a college and career preparation curriculum instituted in North Dakota (RTS-ND). This study is conducted by the Office of Career, Technical, and Adult Education (OCTAE), part of the U.S. Department of Education, under the umbrella of OCTAE’s National Center for Innovation in Career and Technical Education (NCICTE). The contractor for this study is RTI International (also the prime contractor for NCICTE), and the sole subcontractor is FHI 360.

This request is to conduct an experimental evaluation of the effectiveness of a college and career preparation curriculum—provided as part of the Roads to Success program—in enhancing 11th- and 12th-graders’ college and career aspirations, planning for postsecondary transitions and life, and attitudes toward education and careers in North Dakota. Roads to Success (RTS) offers after-school programs, workforce and leadership development programs, and a college and career planning curriculum designed to enhance the academic and career readiness of students from 7th grade through the end of high school. This study would specifically examine the implementation in North Dakota of the college access and career development curriculum provided to 11th and 12th graders as part of RTS, which includes elements of college and career advising, career research, job searching basics, interviewing preparation, college essay practice, and financial aid navigation.

As part of this submission, OCTAE is publishing a notice in the Federal Register allowing first a 60- and then a 30-day public comment period.

A.1.b Statement of Need for a Rigorous Evaluation of Roads to Success in North Dakota

College and career planning programs vary significantly in their scope and reach. Nationally, the most prominent are programs such as GEAR UP, Upward Bound, and Talent Search. These programs primarily emphasize college planning and preparation and are delivered by local grant recipients—schools, districts, colleges, or a combination of them. Because of the diversity of grant recipients and the broad mandates of these programs, services are not consistent across grantees, with different mixes of services offered to different groups of students at different ages and grades.

In contrast, RTS was developed in part to provide a long-term program that serves all students in a school, not just disadvantaged students, and is systematic and consistent in its approach (Bowers and Hatch 2005). RTS addresses two critical needs. First, as a long-term program beginning before high school, it provides the necessary resources and guidance to orient students toward postsecondary planning. Without such assistance, students, particularly those whose parents have limited education or who come from disadvantaged backgrounds, may fail to establish a long-term horizon for their efforts in school. Orienting students toward postsecondary life can help maintain school engagement and reduce behaviors that threaten good academic progress (Rosenbaum 2001). Second, and more importantly for the current study, RTS specifically addresses informational deficits that may hinder students in taking concrete steps toward college or careers. RTS helps students to understand the courses that will be required to prepare for college and the skills necessary to secure stable employment; it walks students through the steps of applying to college, securing a job, and managing finances independently. Informational deficits such as these have been shown to be a large part of why students fail to apply to or attend college (Grodsky and Riegle-Crumb 2010), especially academically excellent students from poor backgrounds (Hoxby and Avery 2012; Radford 2013).

From 2007 to 2009, Mathematica Policy Research conducted a two-year evaluation of the portion of RTS delivered to 7th and 8th graders (Duncan, Bleeker, and Booker 2010). Conducted in rural, predominantly white schools in New York and West Virginia, the study randomly assigned schools within sampling blocks (defined by free and reduced-price lunch eligibility rates and average test scores) to either a treatment (13 schools) or control (12 schools) condition. Using baseline and end-of-intervention student surveys combined with attendance and mobility data, the study found that “RTS does not have clearly positive short-term impacts” on school and class engagement measures (motivation in school; learning and study habits; and attendance and negative behaviors) (p. 43). RTS was found to positively impact some individual measures of attendance and behavior (specifically school absences and detention), but to also negatively impact classroom attendance. The study did find impacts for several career outcomes related to the type of informational deficits cited above: career exploration with school staff, knowing how to find jobs that fit their needs, and knowing what is required for different careers. Effect sizes for these measures ranged from 0.15 to 0.50.

Duncan, Bleeker, and Booker’s study is the only known evaluation of RTS. Though it did not find statistically significant differences for its main outcomes of interest (engagement measures), it is promising that specific career exploration activities were found to be positively affected by RTS. The design of the current study, focusing on the opposite end of RTS’s grade range and on concrete activities undertaken in those grades, provides an opportunity to confirm and extend the earlier findings and rigorously assess the effects that a well-designed, intensive, and integrated curriculum can have on college and career planning and preparedness.

North Dakota is an excellent site in which to implement such a rigorous evaluation. North Dakota has passed legislation requiring students to develop college and career plans and, with funding from the Hess Foundation, has implemented the RTS program in a small number of schools. Therefore, state and district staff are poised to implement RTS more broadly as part of a randomized evaluation. As part of this study, high school juniors and seniors will be provided the 11th- and 12th-grade curriculum in the 2015-2016 academic year and be administered a baseline and post-intervention survey to measure the growth of key indicators of college and career planning and development. The survey draws on existing student surveys available in the RTS Program Manual (and used by Mathematica Policy Research in a previous study of RTS). These surveys measure immediate postsecondary plans, educational and occupational aspirations, career goals and attitudes, student demographics, learning and study habits, and school and community involvement. Two surveys of instructors in treatment schools will track topics covered to assess the fidelity of implementation, and principal surveys will provide context for the school’s college and career preparation agenda. In addition, the study will conduct site visits to control and treatment schools to develop a thorough understanding of how the curriculum is being implemented and the successes and challenges faced by instructors and students in learning from the material.

A.1.c Research Questions

As described in section A.1.d, RTS is a specific set of instructional modules tailored to different grades. The intervention planned as part of this study will last one academic year and will examine the end-of-year outcomes of 11th- and 12th-graders. Given this time horizon, the research questions for this study focus on concrete activities or specific student interests that are the target of each grade’s intervention. These outcomes are what the RTS Logic Model refers to as “second-stage intermediate” outcomes in the context of providing RTS instruction over many grades. Here, they are immediate, short-term outcomes directly addressed in the RTS modules that will be implemented.

At the 11th grade, does one year of RTS instruction increase the percentage of students:

  • With specific senior-year coursetaking plans?

  • Having visited a college or university? Does RTS increase the number of visits?

  • Having taken or registered for workforce development assessments such as the Test of Adult Basic Education (TABE), the Armed Services Vocational Aptitude Battery (ASVAB), and the ACT WorkKeys Foundational and Personal Assessments?

  • Reporting preparation of a resume?

  • Reporting career exploration activities such as job shadowing, attendance at career fairs, internships, or apprenticeships?

  • With any interest in one or more specific careers?

  • With high interest in a specific career?

At the 12th grade, does one year of RTS instruction increase the percentage of students:

  • Applying to college, overall and by level (technical or trade school, two-year community college, or four-year college)? Does RTS increase the number of college applications, overall and by level?

  • Completing the Free Application for Federal Student Aid (FAFSA)?

  • Accepted to or registered at college for the fall semester, overall and by level? Does RTS increase the number of acceptances?

  • Reporting preparation of a resume?

  • Applying for a job?

  • With a job offer?

  • With any interest in one or more specific careers?

  • With high interest in a specific career?

  • Planning to major in an area related to career interests?

A.1.d Description of Intervention

The RTS curriculum is designed to influence college admissions behaviors and postsecondary work plans beginning in 7th grade and continuing through 12th grade. RTS is organized as a set of instructional modules for each grade, with topics repeating across grades and work in prior grades occasionally providing the foundation for later work—for example, grade 11 modules for preparing a resume are followed by grade 12 modules for revising the same resume. The program is designed to be delivered throughout the school year, for 45 minutes each week. The program can be delivered by teachers, counselors, or designated RTS facilitators. In North Dakota, instructors are trained by state or regional career and technical education (CTE) staff associated with the state CTE office.



Figure 1. Roads to Success Logic Model. Source: Roads to Success Affiliate Program Manual.



As a long-term, in-depth intervention, RTS is designed to affect a wide array of short-, medium-, and longer-term outcomes that range from pre-secondary student beliefs about the importance of college and career preparation to concrete goals of taking college admissions tests such as the ACT or SAT, applying to and enrolling in college, and preparing resumes and cover letters. The logic model for the full intervention (drawn directly from the RTS-ND Program Manual) is presented in figure 1. The intervention is seen as having both content elements (i.e., the focus of the instruction) as well as structural elements that support the delivery of the content (e.g., regularly scheduled sessions, project-based learning, RTS facilitators). Within the logic model, outcomes are divided into short-term outcomes about the importance of school for postsecondary life; intermediate outcomes involving both diffuse concepts of motivation and college and career awareness, and specific goals of college preparatory activities and high school completion; and long-term outcomes including college enrollment, persistence, and career attainment. The model incorporates the understanding that contexts of pre-program student skills and beliefs, school climate and peers, and parent and community support can affect the impact that the program has on the desired outcomes.

Although the intervention and its expected outcomes are designed to be multi-year, instructional modules and attendant goals are specified for each grade. At the 11th grade, the intervention consists of seven topic areas and 30 different modules. At the 12th grade, there are six topic areas with 30 distinct instructional modules. Table 1 outlines the topic areas and instructional modules under each at the 11th and 12th grades. In general, college preparation in grade 11 consists of college admissions test preparation (the modules are either SAT or ACT, not a combination) and college research (“Education after High School”); college preparation in grade 12 consists of applying to college, financial aid, and first-year preparation (under “Money Matters” and “Next Steps”). Career preparation in grade 11 consists of career research, resumes, and interviews; career preparation in grade 12 consists of company research, networking, resumes, cover letters, and interviews. The remaining instructional modules cover management of finances more generally (e.g., car shopping and health insurance), and are not the focus of the current study.

Working with ND CTE staff and RTS coordinators, this study will implement each of these modules in 11th and 12th grade. However, because RTS is designed as a multi-year intervention that builds off of each year’s prior work with students, some tailoring of the instructional modules is necessary in 11th and 12th grades to ensure instructional coherence. Specifically, the following modules will be incorporated from earlier grades: from grade 8, an introductory module explaining different types of postsecondary programs (e.g., two-year, four-year); from grade 9, a work and values module that helps students inventory what they value about work, as well as a module covering how to complete job applications; and from grade 10, a module that helps students plan for and understand college visits, and a module preparing students for a job interview.




Table 1. Roads to Success Topic Areas and Instructional Modules for Grades 11 and 12

Grade 11

Introduction: Introduction

Test Prep: Introduction to the ACT/SAT; ACT/SAT Registration; ACT/SAT Practice Questions; Workforce and Adult Ed Tests

Careers: Interest Inventory; Career Choice; My Career Research

Job Shadow: Introduction to Job Shadow; Informational Interview I and II; Creating Resumes I and II; Reflection and Thank-You Note; Workplace Behavior

Education after High School: Choosing Courses for Senior Year; Choosing a College; Choosing a College Major; College Research I through III; Other Education Paths; Letters of Recommendation; Evaluating Postsecondary Options

Money Matters: Understanding Credit; Paying for a Car; Credit Cards; Renting Your First Place; Let’s Go Car Shopping; Signing a Lease

Portfolio Review: Year in Review

Grade 12

Introduction: Introduction I and II

Applying to College: Completing a College Application; Essay Writing I through V

Finding a Job: Who Gets Jobs?; Cover Letters I and II; Your Network; The Interview; Company Research I and II; Follow-Up; Your Resume

Financial Aid: Financial Aid Options; FAFSA I and II; Scholarship I and II; Comparing Aid Packages

Money Matters: “Freshman Year” Budget; Health Insurance; Big Ticket Expenses; Budgeting Details

Next Steps: Freshman Year Survival Guide; Getting Ahead at Work; Advice from the Experts

Source: Roads to Success Affiliate Program Manual.


The modules and lessons described above are a framework for instruction as well as an organizational tool for RTS materials and lesson plans. RTS instructors are encouraged to adapt lesson plans to fit class schedules, resource availability, teaching styles, and needs of their students—for example, introducing new materials, altering delivery methods, or splitting or doubling up lessons as necessary. Since this flexibility is an integral part of the program, this study will not attempt to enforce unreasonable uniformity by insisting on lockstep application of the curriculum. RTS also encourages instructors to collect and assess student work, but not in a typical graded manner. Students will not receive a letter grade for their participation in RTS classes, but they will receive a Pass/No-Pass designation, which will be collected as part of this study. Students will also receive feedback on collected work using a simple check, check-plus, and check-minus system. Beyond the final Pass/No-Pass designation, information on individual students’ work and grades will not be tracked or gathered for this study, both because such detail is not needed to evaluate the program and because of the logistical difficulties and high costs of doing so.

A.1.e. Study Design

To answer the research questions, this study will employ an experimental design in which target schools (defined as public high schools and Bureau of Indian Education [BIE] schools with both an 11th and 12th grade) are randomly assigned to the treatment or control condition, and all 11th- and 12th-grade students in a treatment school are provided with RTS instruction. This design is necessitated by the relatively small size of schools in North Dakota, but it also provides the advantage that cross-condition contamination within a school will not be possible. In a random assignment design, the simple difference between the outcomes of treatment and control students serves as an unbiased estimate of the impact of RTS on college and career preparation activities.
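To illustrate the estimation approach implied by this design, the following is a minimal sketch (not the study’s specified analysis model) of computing the treatment-control difference with standard errors clustered at the school level of random assignment. The file and column names (student_outcomes.csv, school_id, treatment, applied_college) are hypothetical.

# Minimal sketch of a school-clustered impact estimate (illustrative only).
# Assumes a student-level analysis file with hypothetical columns school_id,
# treatment (0/1), and a binary outcome such as applied_college (0/1).
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("student_outcomes.csv")  # hypothetical analysis file

# Simple treatment-control difference in means (the unadjusted impact estimate).
impact = (students.loc[students["treatment"] == 1, "applied_college"].mean()
          - students.loc[students["treatment"] == 0, "applied_college"].mean())
print(f"Unadjusted impact estimate: {impact:.3f}")

# Equivalent regression estimate, with standard errors clustered by school
# because random assignment occurs at the school level.
model = smf.ols("applied_college ~ treatment", data=students).fit(
    cov_type="cluster", cov_kwds={"groups": students["school_id"]})
print(model.params["treatment"], model.bse["treatment"])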

To achieve adequate statistical power, the minimum number of schools for the study would be between 64 and 80 participating schools, depending on various assumptions. However, for maximum statistical power and to better account for the possibility of some schools refusing to participate or dropping out, we will select 95 schools, which includes all estimated eligible schools (those that have not implemented RTS at any grade level) minus 30 schools with very small 11th- and 12th-grade enrollments. After recruitment into the study, half of the schools will be randomly placed into the treatment group and half into the control group (if the number of eligible schools is odd, we will randomly decide whether the odd case is in the treatment or control group). This allows us to achieve 80% power with an alpha of 0.05. Further information on the sample design can be found in Part B, section 1 of this OMB request.
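As a rough illustration of the statistical power reasoning above, the sketch below computes the minimum detectable effect size for a two-level design in which schools are randomized and students are nested within schools. The intraclass correlation and students-per-school values shown are illustrative assumptions, not the study’s planning parameters (which are detailed in Part B).

# Rough minimum detectable effect size (MDES) for a cluster-randomized design,
# using a normal-approximation version of the standard two-level formula.
# The ICC and cluster-size values below are illustrative assumptions only.
from math import sqrt
from scipy.stats import norm

def mdes(n_schools, students_per_school, icc, alpha=0.05, power=0.80, p_treat=0.5):
    """MDES in standard deviation units, no covariate adjustment."""
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)  # about 2.8 here
    j, n, p = n_schools, students_per_school, p_treat
    var_term = icc / (p * (1 - p) * j) + (1 - icc) / (p * (1 - p) * j * n)
    return multiplier * sqrt(var_term)

# Example: 95 schools, roughly 19 students per grade per school (the state median),
# and an assumed ICC of 0.10 (hypothetical) yield an MDES of about 0.22 SD.
print(round(mdes(n_schools=95, students_per_school=19, icc=0.10), 3))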

Most schools in the study sample will be the only high school in their district; i.e., assigning two or more schools within a district to different treatment and control conditions will not be possible. ND contains a high number of school districts (148) relative to its number of schools with a 12th grade (167), meaning that most school districts contain only one school with an 11th or 12th grade. In addition, because of the typically small size of ND high schools (the median 11th- or 12th-grade class size is 19), all 11th- and 12th-graders in a selected treatment school will receive the RTS curriculum. In most schools, this will mean that one teacher will be able to deliver the curriculum to one of their classes. However, instruction may be delivered by teachers, guidance counselors, or hired career advisors. If teachers are providing the instruction, English or social studies teachers are the most likely to do so; at the 11th and 12th grades, they are the only teachers likely to have all of the students in the grade regularly.

Instructors will receive training from REA coordinators or one of the four career resource coordinators (CRCs) in the state. Training will be delivered in a two-day session at different regional sites prior to the start of the 2015-2016 school year; those being trained may be able to apply the training toward professional development requirements or college credit, increasing the motivation to attend training. Training will be organized and managed by ND Department of CTE staff. In addition, principals will receive the program manual and guidelines on how to support teachers in their RTS delivery. ND Department of CTE staff, along with REA coordinators, will also provide support for the instruction throughout the year, such as answering questions about the RTS curriculum and providing or pointing to additional RTS resources. This support will be on an as-requested basis.

Control schools will not receive RTS training or support. Because assignment is by school, and because most treatment schools will be the only one in their district receiving RTS, spillover effects should be largely nonexistent. Substitution may occur if some schools decide to implement RTS on their own; however, given the resources required, control schools are very unlikely to independently adopt RTS at these grade levels.

A.1.f Data Collection Needs

As part of the evaluation, the study includes multiple data collection efforts, summarized in Table 2 and described below.

Table 2. Data Collection Needs

Baseline student survey
  Data needs: Baseline college and career preparation; background characteristics
  Respondents: All treatment and control 11th and 12th graders
  Mode: Web or paper/pencil
  Schedule: First two weeks of the 2015-16 school year

End-of-year student surveys
  Data needs: College and career preparation outcomes
  Respondents: All treatment and control 11th and 12th graders
  Mode: Web or paper/pencil
  Schedule: Last two weeks of the 2015-16 school year

First instructor survey
  Data needs: Fidelity of implementation; instructor background; training efficacy
  Respondents: RTS instructor (treatment schools)
  Mode: Web
  Schedule: January 2016

Second instructor survey
  Data needs: Fidelity of implementation; instructor background; support for RTS
  Respondents: RTS instructor (treatment schools)
  Mode: Web
  Schedule: June 2016

Principal survey
  Data needs: RTS support; other college and career support
  Respondents: Principal or designee in treatment or control schools
  Mode: Web
  Schedule: May 2016

Site visits
  Data needs: Perceptions and attitudes toward RTS; obstacles and supports for RTS
  Respondents: Students, instructors, counselors, and principals in 20 selected treatment or control schools
  Mode: In-person
  Schedule: April - May 2016

Administrative data
  Data needs: Academic background; attendance/engagement; school-level FAFSA applications
  Respondents: All treatment and control 11th and 12th graders
  Mode: Electronic records
  Schedule: July 2016



Student surveys. The primary data source will be baseline and outcome student surveys tailored for each grade (11 and 12). The surveys will be based on a 12th-grade survey previously developed for RTS that asks about college and career activities and plans (the original survey can be seen on pp. 142-148 of the RTS Program Manual here: http://www.roadstosuccess.org/ourcurriculum/category/18-program-manual). The original survey asks about behaviors related to college application (including two-year and technical/trade schools), college enrollment if accepted (including name of school, living arrangements, and costs), financial aid, and immediate plans for work (including jobs applied for, job offers), in addition to long-term career and educational goals. This survey has been adapted to produce separate 11th- and 12th-grade surveys that ask about the specific RTS objectives of each grade’s curriculum, in addition to modifying question wording, response options, and question sequence to conform to best survey practices (e.g., simplified language, placing more sensitive questions toward the end).

The baseline surveys, to be administered within the first month of the school year, will ask the same questions as the outcome survey, with modifications reflecting the student’s place in the college and career planning process. The baseline survey will also ask for consent to link survey and participation data to administrative data.

Instructor surveys. We will ask RTS instructors in intervention schools to complete a survey asking about RTS topics covered and activities completed at two time points: at the end of the first half of the school year (early January) and at the end of the second half of the school year (early June). These surveys will monitor the fidelity of implementation of RTS and provide measures of exposure to RTS concepts and activities for the impact analysis. Administering two surveys allows for better recall of RTS topics covered and time spent on RTS instruction. The instructor surveys will include a list of major RTS modules and sub-components, with corresponding check boxes for items covered. In addition, the survey will gather the average amount of time spent on RTS each week of the relevant semester. The first instructor survey will include additional items about instructor background and prior experiences teaching college and career preparation lessons. The second instructor survey will add items asking instructors for general feedback about their experiences teaching the RTS modules, and ask them to record the final pass/no-pass (or satisfactory/unsatisfactory) designation for each student (RTS only provides pass/no-pass grades for participation in RTS classes).
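As an illustration of how the module checklist items from these instructor surveys could be summarized into a fidelity-of-implementation measure, the sketch below computes the proportion of modules each instructor reports covering. The file name and column naming convention (columns prefixed with "mod_") are hypothetical.

# Illustrative fidelity summary from instructor survey checklist data.
# Assumes one row per instructor, 0/1 indicators for each module covered
# (hypothetical "mod_" column prefix), and average weekly minutes of RTS time.
import pandas as pd

instructors = pd.read_csv("instructor_survey_semester1.csv")  # hypothetical file

module_cols = [c for c in instructors.columns if c.startswith("mod_")]
instructors["prop_modules_covered"] = instructors[module_cols].mean(axis=1)

print(instructors[["instructor_id", "prop_modules_covered", "avg_weekly_minutes"]]
      .sort_values("prop_modules_covered", ascending=False))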

Principal survey. Principals, or staff they designate such as assistant principals or guidance counselors, will be surveyed at all treatment and control schools to measure, at a high level, the college and career planning activities offered to students in these schools. The treatment school questionnaire will be designed to elicit information about college and career planning offered beyond or in addition to the adapted RTS curriculum; in control schools, the questionnaire will be broader and investigate all college and career planning activities in and outside of the classroom.

All surveys will be delivered as online modules using existing school computer equipment to record and secure student, teacher, and principal responses. Facsimile versions of the baseline and end-of-grade student surveys, the first and second semester teacher surveys, and the principal survey can be found in appendix B of this OMB request.

Site visits. Fourteen treatment schools and six control schools will be visited one time, at the end of the 2015-16 school year, to assess perceptions of the RTS curriculum and instruction; attitudes toward college and career preparation generally and RTS specifically; and obstacles to and supports for learning RTS material. Treatment school site visits will provide rich background and context for survey findings and opportunities to gauge the fidelity of implementation of the intervention. Site visit data will be collected from schools in both the treatment and control groups to allow the study team to examine how RTS in context might differ from other college and career planning services provided in schools in the state. The site visits are designed to help researchers understand how and why RTS may or may not have impacted 11th- and 12th-graders’ perceptions of their college and career preparedness and their plans after high school. Because of the greater level of explicit college and career planning and instruction to be conducted in treatment schools as part of RTS, the study will visit more treatment schools than control schools; the number of treatment site visits (14) is designed to enable examination of variation within districts or REAs with multiple treatment schools. The number of control schools (6) is designed to provide insight into other modes of college and career preparation but not systematic knowledge of variations in preparedness activities.

The site visits will draw on data including informal, open-ended interviews, focus groups, direct observations, and classroom and school documentation. The bulk of information from the site visits will be collected via interviews and focus groups with individuals participating in RTS. These interviews and focus groups will allow researchers to develop a holistic understanding of the curriculum and its implementation from multiple points of view. Researchers will interview stakeholders including: academic teachers; teachers and career counselors involved in the implementation of RTS; school administrators; and other school staff as appropriate. Researchers will also conduct focus groups with small groups of 11th- and 12th-grade students who participate in RTS. The number of informants in each case will be determined by the number of cases being investigated and available resources. The majority of interviews and all focus groups will be conducted in-person during site visits, supplemented by phone interviews as needed. With permission from subjects, audio will be recorded and transcribed to ensure accuracy in data analysis and recording.

Site visits will also be used to conduct direct observations of the RTS curriculum being implemented. Observations allow researchers to see programs “in action,” and provide data to cross-check and validate information collected during interviews and focus groups. Observations will provide direct evidence of behavior and processes to enhance data on perceptions, attitudes, and beliefs collected through interviews. Documentation regarding the implementation of RTS will be essential to understanding individual cases. The study team will collect program descriptions, lesson plans, planning documents and additional relevant information associated with RTS and the students participating in the curriculum.

Site visit protocols for each of these modes of data collection are described in detail in appendix C of this OMB submission.

Administrative data. The study will incorporate data from North Dakota’s statewide longitudinal data system (SLDS) to provide additional information on the background of students, their academic achievement, their CTE status, and their attendance habits. Administrative data agreements for these data will be negotiated with ND state SLDS staff. The study will also include school-level FAFSA completion data reported biweekly from studentaid.gov: https://studentaid.ed.gov/about/data-center/student/application-volume/fafsa-completion-high-school. These data provide important contextual information about overall levels of student planning for higher education.

A.1.h Legislative Authorization

This study is authorized by section 114(d)(4)(A) of the Carl D. Perkins Career and Technical Education Act (20 U.S.C. § 2324(d)(4)(A)).

A.1.i Prior and Related Studies

As noted previously, from 2007 to 2009, Mathematica Policy Research conducted a two-year evaluation of the portion of RTS delivered to 7th and 8th graders (Duncan, Bleeker, and Booker 2010). Conducted in rural, predominantly white schools in New York and West Virginia, the study randomly assigned schools within sampling blocks (defined by free and reduced-price lunch eligibility rates and average test scores) to either a treatment (13 schools) or control (12 schools) condition. Using baseline and end-of-intervention student surveys combined with attendance and mobility data, the study found that “RTS does not have clearly positive short-term impacts” on school and class engagement measures (motivation in school; learning and study habits; and attendance and negative behaviors) (p. 43). RTS was found to positively impact some individual measures of attendance and behavior (specifically school absences and detention), but to also negatively impact classroom attendance. The study did find impacts for several career outcomes related to the type of informational deficits cited above: career exploration with school staff, knowing how to find jobs that fit their needs, and knowing what is required for different careers. Effect sizes for these measures ranged from 0.15 to 0.50. Duncan, Bleeker, and Booker’s study is the only known evaluation of RTS.

A.2 Purpose and Use of Information Collection

The data for the evaluation of the 11th- and 12th-grade RTS curriculum will be obtained from student participants and non-participants, RTS instructors in treatment schools, principals, and other school staff.

Baseline student survey. Data from the baseline student survey will be used in conjunction with the end-of-year student survey to obtain estimates of the increase in frequency of concrete college and career planning activities outlined in the research questions (such as completing FAFSA or applying for a job after high school). Data from the baseline student survey will also be used to compare differences in treatment effects across student subgroups such as gender or race/ethnicity.

End-of-year student survey. Data from the end-of-year student survey will be used in conjunction with the baseline student survey to obtain estimates of the increase in frequency of concrete college and career planning activities outlined in the research questions (such as completing FAFSA or applying for a job after high school). Data from the end-of-year student survey will also be used to assess fidelity of implementation for RTS instruction by analyzing variance in concrete activities RTS instructors were expected to require of students (e.g., creating a resume).

First semester instructor survey. The first semester instructor survey will be used to assess fidelity of implementation in the first half of the school year by asking questions about concrete instructional activities taken during the time period covered and overall time spent on RTS instruction. This survey also gathers background information on instructors to enable analysis of impact by instructor type (teacher, counselor, or other RTS facilitator).

Second semester instructor survey. The second semester instructor survey will be used to assess fidelity of implementation in the second half of the school year by asking the same questions as in the first instructor survey about concrete instructional activities taken during the time period covered and overall time spent on RTS instruction. Conducting two surveys covering shorter blocks of time will enable better recall of activities undertaken by the instructor than a comprehensive year-end survey. This survey will also ask instructors to indicate the given or expected pass/no-pass designation each student in the class earned, which will be used in analyses relating student performance in RTS to key outcomes.

Principal survey. The principal survey will be used to systematically document variations in school support for RTS instruction and obstacles to instructors’ implementation of RTS, as well as the extent to which control schools provide alternative college and career readiness training. Data from this survey therefore will be used in analysis of contextual effects on RTS impact and multivariate impact models controlling for school support and control school activities that may substitute for and thereby blunt the impact of RTS.

Site visits. Data from site visits will be used to construct a cross-site analysis report documenting how RTS has been received in staff and students’ own words. The cross-site report will develop a thematic understanding of the challenges and opportunities for students and schools as a whole, in order to draw example lessons about the constraints and considerations schools and sites face in implementing a college and career planning curriculum.

Administrative data. Data from administrative records held by ND will be used to identify, where possible, students involved in a CTE program (of special interest to OCTAE), as well as the academic background of all students, for use in models accounting for student characteristics in estimating overall impact.

A.2.a Content Justifications

This section contains justifications for RTS-ND survey and site visit instruments, reflecting the main research objectives of the study overall. The survey instruments are included in appendix B, and site visit protocols in appendix C.

Student surveys. As noted previously, many questions in the student questionnaire derive from a 12th-grade survey found in the RTS Program Manual that asks about program involvement, student plans, and attitudes toward RTS. The first section of both the baseline and end-of-year survey solicits information on student background, including confirmation of their school of attendance (to ensure that the web questionnaire is functioning properly), a question to identify any students that have experienced RTS previously, and a question about general grades that help classify the student academically for purposes of subsequent analysis. It also incorporates two scales to focus on career readiness through two career readiness concepts: career readiness concern and career adaptability self-efficacy.

Career readiness concern derives from the Career Maturity Inventory—Form C (CMI-C), which is designed to assess four factors relating to career readiness: concern, curiosity, confidence, and consultation (Savickas and Porfeli, 2011). “Concern” refers to the extent to which respondents give consideration to finding a career, and maps to the short-term RTS outcome of understanding yourself and the intermediate outcome of developing career awareness. A hierarchical factor analysis (Savickas and Porfeli, 2011) showed that the concern subscale of the CMI-C has a loading of about .51 onto the higher-order factor of career readiness. Career adaptability self-efficacy derives from the Career Adapt-Abilities Scale (CAAS), which is designed to assess factors relating to career adaptability: concern, control, curiosity, and confidence. Although the sub-factors of the CMI-C and the CAAS are similarly named, the two scales and their measurement items are distinct. Adaptability is distinct from readiness, as it refers to an “individual’s resources for coping with current and anticipated tasks, transitions, traumas in their occupational roles that, to some degree large or small, alter their social integration” (Savickas and Porfeli, 2012, p. 662). This measure maps to the RTS short-term outcomes of work skills and solving real-world problems.

The second section of the surveys asks about postsecondary planning, and includes items adopted from external sources such as the National Center for Education Statistics’ (NCES) High School Longitudinal Study of 2009 (e.g., question 12). The postsecondary planning questions are tailored to the grade and time of year during which the questions will be posed in order to elicit information about baseline and outcomes on key measures of college preparedness. These questions all relate to the pipeline of college and career readiness that RTS is designed to influence, including high school coursetaking, college or university visits, college or university application, financial aid applications and receipt, career preparation tests, resume preparation, and immediate postsecondary plans.

The third section of the student surveys asks further detail about plans for higher education, including measures that relate to critical informational deficits that students may have (college costs) and concrete aspirations for postsecondary study (majors the students may be interested in). For students with plans to work after high school, this section is followed by questions about work plans, including current job in high school, planned jobs for after high school, jobs applied to, and work intensity. Detailed information on majors or jobs of interest or planned will be gathered through a web-based coder that allows students to search for and choose majors or job titles that correspond to existing major and occupational code frames. Collectively, data on immediate postsecondary plans permit evaluation not just of the activities the RTS curriculum is designed to teach (such as creating a resume or applying to college), but of the immediate effects such instruction may have on students’ actual behavior.

The fourth section of the student surveys (and the last section of the end-of-year versions) concerns distal educational and occupational plans, again drawing on the existing survey and similar items employed in NCES’s high school longitudinal studies. Interest in specific careers, expectations for educational and occupational attainment, and a more recent item developed and employed in HSLS:09 that asks about the relationship between expected occupation and required education, gauge the student’s overall academic and occupational orientation and long-term expectations. These measures will permit another layer of analysis relating RTS exposure to student outcomes that are less proximate to the instructional activities of RTS but conceptually linked to RTS’s goals (as, for example, illustrated in the logic model in figure 1).

Consistent with best survey practice of placing more sensitive questions at the end of questionnaires, the last section of the baseline versions of the student questionnaires asks more specific information about student background, specifically demographic and socioeconomic information about birthdate, gender, race/ethnicity, parents’ education, and language background. Each of these questions is consistent with NCES question and item response wording, and provides key measures enabling disaggregation of results by well-recognized background characteristics. A final set of measures asks about student economic background using an approach seen on international assessments (specifically, the Program for International Student Assessment [PISA] 2006 administration): given that students may not be the best source of information about parental income or occupation, and that there is no corresponding parent survey planned as part of this study, the questionnaire uses household estimates as a foundation for a scale of wealth or income. Exploratory and confirmatory factor analyses have shown these and similar scales to be reliable and valid, with items loading on a single wealth factor in representative nations (Traynor and Raykov, 2013).

Instructor surveys. The first and second semester instructor surveys derive their items from NCES studies and from the organization of the RTS grade 11 and grade 12 instructional modules themselves. The first part of the first semester survey asks standard background questions of instructors, including gender, race/ethnicity, and educational attainment, in order to enable comparisons during analysis of RTS impacts by instructor characteristics. The second section of both surveys primarily consists of a list of RTS instructional modules that instructors will be asked to indicate whether they have taught; this is a primary measure of fidelity of implementation for the study, and, for modules taught, the survey will solicit information about challenges encountered during instruction. This section also asks about time on RTS tasks in an average week, the regularity with which instructors have taught the material, grading practices, and specific college and career preparation activities taught. Collectively, these questions will provide insight into the extent to which teachers have regularly and thoroughly followed the RTS curriculum. The list of specific activities can be correlated with student responses to help validate each respondent’s claims.

The third and final section of the first semester survey asks more general questions about whether the instructor has prior experience with a college and career readiness curriculum and how they perceive RTS training, materials, and support from other school staff such as principals and counselors. These questions provide a gauge of instructor preparation to teach RTS. The second and final section of the second semester survey asks similarly general questions, but about students’ reactions to RTS and the likelihood of using RTS in the future, replacing the first survey’s questions about prior teaching experience and reactions to training. The questions in these sections of the respective surveys provide summary measures of attitudes toward RTS, which can be unpacked via more in-depth exploration through the site visits.

Principal survey. The principal survey in both control and treatment schools provides a brief, high-level examination of college and career planning activities and supports either around RTS or in its absence, depending on treatment status. Measures for the principal survey draw from NCES’s HSLS:09 base-year school administrator survey and NCES’s Education Longitudinal Study of 2002 (ELS:2002) base-year school administrator survey. The first section asks general information about the school, including school type, attendance, typical (estimated) outcomes of 12th graders, and school climate. These measures provide key information on the context of college and career readiness for the school’s students. The second section poses questions about key college and career readiness activities, the approximate percentage of 11th and 12th graders who participate in them, and the reasons for not offering such programs or activities; these align with activities covered in the teacher and student surveys. The third and final section is for principals in treatment schools, and asks broad questions about their perception of students’ reactions to RTS and the usefulness and quality of RTS materials and training. The content of the principal survey is designed to complement the teacher and student surveys as well as the planned interviews at site visit schools, in which open-ended exploratory items can provide further information about the college and career preparation offered by the school.

Site visits. The site visits are designed to help researchers understand how and why RTS may have impacted 11th and 12th graders’ college and career preparedness and their plans after high school. In order to do so, researchers will conduct interviews with RTS teachers, academic teachers, counselors, and principals, as well as focus groups with 11th and 12th grade students and observations of RTS lessons. These strategies will allow study team members to collect information on various perspectives on RTS and triangulate across these data sources to develop a rich, detailed picture of RTS implementation in schools and its effectiveness. Site visit protocols, including question prompts, are included in appendix C. They cover interviews, focus groups, and document collection that will be used to address several sets of questions.

First, the site visits will seek to understand the college and career planning assistance received prior to the 2015-16 academic year. Understanding what was available prior to the intervention will be crucial to understanding how RTS has changed the content available in treatment schools and how that content is delivered and received from the perspectives of various groups impacted by these changes: administrators, counselors, teachers, and students. Questions about prior college and career planning will give study participants an opportunity to reflect on change over time in how, if at all, college and career planning activities are delivered in schools, and how these changes impact student outcomes.

Second, the site visits will investigate challenges and successes instructors and students have experienced in implementing the program. This information is important to the goals of the research because it will help researchers understand how implementation of college and career planning curricula fits into broader school contexts. Questions on challenges and successes will also address participants’ perceptions of and attitudes toward RTS, and how these attitudes might have affected implementation. These questions will help researchers identify factors that both facilitated and impeded successful RTS implementation and reflect on how factors that hindered implementation might be addressed.

Third, site visits will help to understand how and why RTS is or is not implemented with fidelity to its model. This will allow researchers to understand how RTS implementation actually unfolds in schools, rather than the ideal picture perhaps presented in program documentation. Understanding fidelity of implementation, or lack thereof, will help give context to study findings from other data sources such as student and instructor surveys. Deviations from the model might influence program outcomes; understanding what these deviations are, and how and why they might have occurred, will help researchers explore how to promote greater fidelity of implementation and, ideally, improved outcomes.

Fourth, and finally, the site visits will provide insight into how college and career preparation activities differ across the control and treatment schools, beyond merely the presence or absence of RTS. Questions pertaining to this issue will allow the study team to explore how the intervention differs from the control context in practice. Moreover, collecting these data will help researchers understand how school contexts and practical factors can influence the rollout of college and career planning curricula across a state. This information, in turn, will provide context for any potential differences in program outcomes. While other data sources will help researchers document outcomes, site visit data will allow the study team to reflect on how and why those differences might occur, with practical lessons for future RTS and college/career preparation curricula implementation.

Administrative data. Administrative data will be key in providing systematic data on the prior academic achievements and levels of school engagement of students in the study. This will enable parsing of RTS impacts by CTE program status, academic success, and measures of student behavior such as attendance and disciplinary incidents. The data elements requested will include coursetaking records (courses taken, credits earned, and grades earned); CTE, academic/gifted, or other program status or membership; absences; and suspensions.

A.3 Use of Information Technology

The data collection plan is designed to obtain information in an efficient and reliable fashion while minimizing burden to respondents. All surveys will be delivered as online modules using existing school computer equipment to record and secure student, teacher, and principal responses. Skip patterns and complex routing can be employed as necessary, but a simple questionnaire is planned to reduce costs. RTI’s Hatteras instrument development and deployment system will be used to create and distribute (i.e., make available online) the surveys, allowing responses to be immediately uploaded into a secure database. Reliance on a web-based survey will reduce the risk that paper copies of questionnaires are lost, copied, or stolen and respondent answers thereby disclosed. In addition, some items, particularly those relating to colleges, college majors, and occupations, will be designed so that students can self-code into existing codeframes such as the National Center for Education Statistics’ Integrated Postsecondary Education Data System (IPEDS) institution IDs, the Classification of Instructional Programs (CIP), and O*NET occupational categories. Although the study team has confirmed with ND state staff that all high schools should be capable of hosting the online survey with existing computer equipment, we will have paper copies available for any schools that require them.
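To illustrate the self-coding approach described above, the sketch below performs a simple keyword search against a small codeframe. The entries shown are a tiny, hypothetical subset; the actual instrument will search the full CIP and O*NET code frames within the survey system.

# Minimal illustration of student self-coding against a codeframe.
# The entries below are a small, hypothetical subset of CIP major codes;
# the production survey would search the complete CIP and O*NET frames.
CIP_SAMPLE = {
    "14.0801": "Civil Engineering, General",
    "26.0101": "Biology/Biological Sciences, General",
    "51.3801": "Registered Nursing/Registered Nurse",
    "52.0201": "Business Administration and Management, General",
}

def search_codeframe(query: str, frame: dict) -> list:
    """Return (code, title) pairs whose title contains the search text."""
    q = query.strip().lower()
    return [(code, title) for code, title in frame.items() if q in title.lower()]

# A student typing "nursing" is shown matching titles to choose from,
# and the selected entry stores the corresponding code with the response.
print(search_codeframe("nursing", CIP_SAMPLE))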

Administrative data records will be delivered electronically as well through a secure FTP server to which only authorized ND staff and study leadership at RTI have access. These records will be immediately downloaded and stored on secure RTI servers, and removed from the FTP server. Electronic exchange of administrative data lowers the risk of disclosure compared to secure faxes or hard-copy document delivery, in addition to reducing burden on state staff compared to hard-copy preparation, faxing, and/or mailing.

A.4 Efforts to Identify Duplication and Use of Similar Information

As part of planning for this study, researchers reviewed existing college and career curriculum studies and evaluations, particularly those related to RTS, and found only one prior data collection, which covered the lower grades and was conducted in northeastern states. There are no known datasets or ongoing collections gathering information about the RTS curriculum in the 11th and 12th grades. North Dakota has not implemented any data collection or evaluation of RTS in the small number of schools that have so far adopted it.

A.5 Impact on Small Businesses or Other Small Entities

Target respondents for RTS-ND surveys and site visits are individuals, and data collection activities for this study will not burden small businesses or entities. The study will minimize burden for all respondents by including only the questions and items necessary to address the research questions and how impacts differ by school, instructor, and student characteristics.

A.6 Consequences of Not Collecting Data

The implementation of RTS in ND represents a rare opportunity to study a statewide curriculum initiative with concrete, clearly delineated components and goals in a rigorous manner. Large-scale randomized studies are difficult to implement because of the necessity to gain buy-in from a number of stakeholders and the challenges of operating in many varied sites. In ND, the existing state commitment to RTS means a willing partner, a base of familiarity with RTS content and its implementation, and a somewhat uniform but large-scale state environment in which to assess the impact of college and career readiness instruction. Not collecting the data would deprive Federal program staff at OCTAE and in other offices of the U.S. Department of Education of the chance to reliably and rigorously test a significant, coherent college and career preparation curriculum that can have important implications for policy design, grant programs, and other efforts to identify and promote effective postsecondary readiness training mechanisms.

Specifically, this study will provide direct, rigorous evidence of the utility of an intensive, coherent college-and-career readiness curriculum at the upper grades of high school. In contrast to other prominent readiness curricula, RTS serves both disadvantaged and non-disadvantaged students, making documentation about its effectiveness important for a wide array of students, educators, and policy-makers. RTS also concretely targets information deficits and practical stepping stones that lead to college and career preparedness—needs which are acute at the end of high school where postsecondary options are diverse and pathways to and through them are often unclear for students. Determining the strengths and weaknesses of RTS will help improve RTS directly and can provide guidance to other programs with similar goals. Furthermore, the North Dakota setting allows us to examine the implementation and impact of such a program in small-school, rural settings that are often not rigorously and widely studied, in addition to city and suburban environments within the state.



A.7 Special Circumstances

No special circumstances of data collection are anticipated.

A.8 Federal Register Announcements and Consultations

A.8.a Federal Register Announcement

The Department published both a 60-day and a 30-day FRN and received no public comments during the 60-day comment period.

A.8.b Consultations Outside the Agency

Officials from the North Dakota Department of Career and Technical Education were consulted extensively in planning the design of this study and its associated information collections.

A.9 Provision of Payment or Gift to Respondents

Incentives will be given to RTS instructors, school coordinators, and principals to motivate participation in data collection activities. For teachers, counselors, or career advisors who will be delivering RTS, we will offer $50 or $100 for participation, depending on the number of classes they are instructing. The specific level of incentive for RTS instructors will depend on their level of involvement in the study, as described in table 3. This size of incentive is commensurate with the burden placed on instructors for completing two surveys for each class ($25 per survey). Principals at treatment schools will receive an incentive ($100) for their assistance with selecting the RTS instructors, scheduling class periods for RTS instruction, promoting RTS within the school community, and for participation in the principal survey. Principals at control schools will also receive a small incentive ($25) for their participation in the principal survey. The School Coordinator serves a critical role in data collection, functioning as the central school contact and facilitating arrangements for various data collections. School coordinators in both treatment and control schools will receive a $75 incentive for submitting class lists, managing consent forms, and scheduling the student surveys. School coordinators in schools selected for site visits will receive an additional $75 incentive to coordinate with researchers to plan site visit logistics. The incentive plan is summarized in table 3.

Table 2. School and Instructor Incentives

RTS instructors (teacher, counselor, or career adviser)
  One class: $50
  Two or more classes: $100
School coordinators
  Schools with no site visits: $75
  Schools with site visits: $150
Principals
  Treatment school principals: $100
  Control school principals: $25

A.10 Assurance of Confidentiality Provided to Respondents

OCTAE assures participating individuals that all identifiable information collected as part of the RTS-ND study may be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district, school, or individual. We will not provide information that identifies districts, schools, or individuals to anyone outside the study team, except as required by law. RTS-ND data security and confidentiality protection procedures are in place to ensure that RTI and its subcontractors comply with all privacy requirements, including:

  • The approved Revised Proposal submitted by RTI and on file with OCTAE;

  • The Privacy Act of 1974, 5 U.S.C. § 552a;

  • The U.S. Department of Education Incident Handling Procedures (February 2009);

  • The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  • The U.S. Department of Education ACS Directive OM:5-101, Contractor Employee Personnel Security Screenings;

  • The Family Educational Rights and Privacy Act of 1974 (FERPA), 20 U.S.C. § 1232g; and

  • All new legislation that affects the data collected through this contract.

Additionally, RTI will comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as the IT security requirements in the Federal Information Security Management Act (FISMA), OMB Circulars, and National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the NCES Statistical Standards, described at http://nces.ed.gov/statprog/standards.asp.

The RTS-ND procedures for maintaining confidentiality include notarized nondisclosure affidavits obtained from all personnel who will have access to individual identifiers; personnel training regarding the meaning of confidentiality; controlled and protected access to computer files; built-in safeguards concerning status monitoring and receipt control systems; and a secure, staffed, in-house computing facility. RTS-ND follows detailed guidelines for securing sensitive project data, including, but not limited to: physical/environment protections, building access controls, system access controls, system login restrictions, user identification and authorization procedures, encryption, and project file storage/archiving/destruction.

Furthermore, the Department has established a policy regarding the personnel security screening requirements for all contractor employees and their subcontractors. The contractor must comply with these personnel security screening requirements throughout the life of the contract. The Department directive that contractors must comply with is OM:5-101, which was last updated on 7/16/2010. There are several requirements that the contractor must meet for each employee working on the contract for 30 days or more. Among these requirements are that each person working on the contract must be assigned a position risk level. The risk levels are high, moderate, and low based upon the level of harm that a person in the position can cause to the Department’s interests. Each person working on the contract must complete the requirements for a “Contractor Security Screening.” Depending on the risk level assigned to each person’s position, a follow-up background investigation by the Department will occur.

Study notification materials sent to sample members will describe the voluntary nature of RTS-ND and convey the extent to which respondent identifiers and all responses will be kept confidential. Contacting materials are presented in appendix A, including confidentiality language provided in a study brochure.

A.11 Justification for Sensitive Questions

There are no sensitive questions to be asked as part of this data collection.

A.12 Estimates of Hours Burden

Estimates of response burden for the RTS-ND study are shown in table 3. Estimates of survey response burden are based on experience with similar items and respondents in NCES surveys of high school students (ELS:2002 and HSLS:09). The numbers of respondents (targeted, and expected given the response rates) overlap across some of the data collection efforts; that is, some of the students taking the surveys will also take part in the focus groups during the site visits. Therefore, the totals for those columns are less than the sums of the column values.

The total estimated cost to school and state (administrative records) staff respondents is $4,224, which represents their combined 164 burden hours multiplied by hourly rates that vary by staff title. Hourly rates for North Dakota principals ($35) and teachers ($21) in public schools were obtained from published tables of the 2011-12 Schools and Staffing Survey (hourly rates were calculated as estimated annual salaries divided by 2,080 hours). Counselors were assigned a slightly higher hourly rate ($25) than teachers, as was the administrative records staff person ($35).
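As a check on this arithmetic, the sketch below reproduces the $4,224 figure from the staff burden hours in table 3 and the hourly rates above. It is an illustration only; in particular, valuing instructor survey and interview hours at the teacher rate is an assumption made for this sketch, not a detail stated in this submission.

# Illustrative reconstruction of the staff cost estimate in section A.12
# (not the study's budgeting code). Assumes instructor hours are valued at
# the teacher rate and administrative records hours at $35 per hour.
hourly_rate = {"teacher": 21, "counselor": 25, "principal": 35, "records": 35}

# Staff burden hours by activity, taken from table 3 (student hours excluded).
staff_hours = [
    ("teacher", 54),    # instructor surveys
    ("teacher", 40),    # site visit: instructor interviews
    ("principal", 22),  # principal survey
    ("principal", 20),  # site visit: principal interviews
    ("counselor", 20),  # site visit: counselor interviews
    ("records", 8),     # administrative records match
]

total_hours = sum(hours for _, hours in staff_hours)
total_cost = sum(hourly_rate[role] * hours for role, hours in staff_hours)

print(f"{total_hours} hours, ${total_cost:,}")  # prints: 164 hours, $4,224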


Table 3. Estimated Burden for RTS-ND Data Collection

Data collection activity | Number of Targeted Respondents | Expected Response Rate (%) | Expected Number of Respondents | Unit Response Time (Hours) | Total Response Time (Hours/Year) | Total Burden Time (Hours)
11th-grade student surveys | 4,220 | 95 | 4,009 | 0.58 | 2,325 | 2,325
12th-grade student surveys | 4,188 | 95 | 3,979 | 0.50 | 1,990 | 1,990
Instructor surveys | 120 | 90 | 108 | 0.50 | 54 | 54
Principal survey | 95 | 90 | 86 | 0.25 | 22 | 22
Site visit: focus groups | 320 | 100 | 320 | 1.00 | 320 | 320
Site visit: instructor interviews | 40 | 100 | 40 | 1.00 | 40 | 40
Site visit: principal interview | 20 | 100 | 20 | 1.00 | 20 | 20
Site visit: counselor interview | 20 | 100 | 20 | 1.00 | 20 | 20
Administrative records match | 1 | 100 | 1 | 8.00 | 8 | 8
Total |  |  | 8,584 |  | 4,791 | 4,791

Note: Because of the overlap of respondents for the student surveys and focus groups, the instructor surveys and interviews, and the principal surveys and interviews, the totals for targeted and expected respondents are lower than the sums of the column values.



A.13 Estimates of Cost Burden to Respondents

There are no capital, startup, or operating costs to respondents for participation in the project. No equipment, printing, or postage charges will be incurred.

A.14 Annualized Cost to the Federal Government

The estimated annual cost to the federal government for RTS-ND is $643,841, which represents the total OCTAE staff costs for managing NCICTE work ($62,660) and the total contract costs associated with this specific study ($1,225,022), a combined $1,287,682 annualized over the two-year study period. The estimate includes costs for study recruitment, materials and instruments preparation, administration of data collection, processing and analyzing data, and producing reports. The contract costs were generated by budgeting specific contractor staff for specified numbers of hours plus the direct costs associated with conducting the study (e.g., mailouts, incentives).
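For transparency, the arithmetic behind the annual figure is shown below; annualizing over a two-year period is an assumption consistent with the totals reported above and the project schedule in table 6.

# Illustrative annualization of the federal cost estimate (assumes two years).
octae_staff_cost = 62_660
contract_cost = 1_225_022
years = 2
print((octae_staff_cost + contract_cost) / years)  # 643841.0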

A.15 Reasons for Program Changes

This is an original submission for a new collection; therefore, the entire estimated burden of 4,791 hours is reported as a program change increase.

A.16 Publication Plans and Project Schedule

Under the approved Revised Proposal submitted to OCTAE as part of the NCICTE contract, this study will produce two products for public use: a summary report detailing the research background, methods, data collection results, descriptive findings, and impact estimates; and a restricted-use data file that can be licensed to researchers for additional analyses.

The analysis approach will be to compare changes in outcomes for students in treatment schools with changes for students in control schools. Outcomes will include changes in percentages (such as the percentage of students who have applied for a job) as well as changes in averages for specific measures (e.g., the number of postsecondary institutions visited). Tables 4 and 5 link key outcomes (by grade) to the indicators measured through data collection, the source of the data, and the type of outcome analysis to be conducted. At each grade level, one year of the RTS curriculum is expected to be associated with a statistically significant increase in each outcome.
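To make these descriptive comparisons concrete, the following is a minimal sketch (illustrative only, not the study's analysis code), assuming a hypothetical student-level analysis file with placeholder variable names treatment, num_applications, and fafsa_completed.

# Minimal sketch of the descriptive treatment-control comparisons described
# above; the file name and variable names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("rts_nd_students.csv")  # hypothetical analysis file

# Mean number of postsecondary applications, by treatment status (0 = control, 1 = treatment)
print(df.groupby("treatment")["num_applications"].mean())

# Proportion of students reporting FAFSA completion, by treatment status
print(df.groupby("treatment")["fafsa_completed"].mean())

# Simple treatment-control differences; regression-adjusted impact estimates
# are described later in this section.
group_means = df.groupby("treatment")[["num_applications", "fafsa_completed"]].mean()
print(group_means.loc[1] - group_means.loc[0])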

Table 4. Outcomes, measures, and analysis for 11th-grade RTS intervention

Outcome: Specific senior-year coursetaking plans
  Measure: Number of specific courses listed by student
  Data source: Student survey item
  Analysis (treatment versus control): Compare mean number of courses listed

Outcome: College or university visits
  Measure: Number of specific sites listed by student
  Data source: Student survey item
  Analysis (treatment versus control): Compare mean number of sites listed

Outcome: Having taken or registered for workforce development assessments (i.e., the Test of Adult Basic Education (TABE), the Armed Services Vocational Aptitude Battery (ASVAB), the ACT WorkKeys Foundational and Personal Assessments)
  Measure: Number of specific assessments identified by student as registered or completed
  Data source: Student survey checklist items
  Analysis (treatment versus control): Compare (for registration, completion, and combined) mean number of valid assessments listed

Outcome: Resume preparation
  Measure: Self-rating of resume creation status (e.g., “I am planning to create it,” “I have an early draft,” “I have a finished version,” “I haven’t given it much thought”)
  Data source: Student survey Likert-scale item
  Analysis (treatment versus control): Compare mean rating of resume status; compare proportion of student resumes in each category

Outcome: Career exploration activities
  Measure: Number of relevant activities (e.g., job shadowing, attendance at career fairs, internships, apprenticeships) students report
  Data source: Student survey items
  Analysis (treatment versus control): Compare mean number of exploration activities

Outcome: Interest in specific careers
  Measure: Level of interest in careers
  Data source: Student survey Likert-scale items
  Analysis (treatment versus control): Compare mean rating of interest




Table 5. Outcomes, measures, and analysis for 12th-grade RTS intervention

Outcome: Applying to post-secondary education institutions (i.e., technical or trade school, two-year community college, or four-year college)
  Measure: Type and number of institutions to which students report having submitted applications
  Data source: Student survey items
  Analysis (treatment versus control): Compare mean number of applications; compare number of applications in each category

Outcome: Completing the Free Application for Federal Student Aid (FAFSA)
  Measure: Whether the student reports FAFSA completion
  Data source: Student survey item
  Analysis (treatment versus control): Compare proportion of students completing the FAFSA

Outcome: Acceptance to post-secondary institution(s) for fall semester
  Measure: Number of institutions from which the student reports receiving notice of acceptance
  Data source: Student survey item
  Analysis (treatment versus control): Compare mean number of accepting institutions; compare proportion accepted to at least one institution

Outcome: Registration at a post-secondary institution
  Measure: Whether student reports being registered
  Data source: Student survey item
  Analysis (treatment versus control): Compare proportion of students registered

Outcome: Resume preparation
  Measure: Self-rating of resume creation status (e.g., “I am planning to create it,” “I have an early draft,” “I have a finished version,” “I haven’t given it much thought”)
  Data source: Student survey Likert-scale item
  Analysis (treatment versus control): Compare mean rating of resume status; compare proportion of student resumes in each category

Outcome: Applying for a job*
  Measure: Type and number of employers to which students report having submitted applications
  Data source: Student survey items
  Analysis (treatment versus control): Compare mean number of applications; compare number of applications by job type

Outcome: Receiving a job offer*
  Measure: Number of offers student reports receiving
  Data source: Student survey item
  Analysis (treatment versus control): Compare mean number of offers; compare proportion receiving at least one offer

Outcome: Interest in specific careers
  Measure: Level of interest in careers
  Data source: Student survey Likert-scale items
  Analysis (treatment versus control): Compare mean rating of interest

Outcome: Planning to major in an area related to specific career interests
  Measure: Student self-report
  Data source: Student survey items
  Analysis (treatment versus control): Compare proportion of students with career-related major

* Including full-time employment or service in the armed forces



Analyses will be conducted both descriptively and with linear regression models, which improve the precision of impact estimates by controlling for additional sources of variation in outcomes. The models may include multilevel models (also known as hierarchical linear models, or HLM) that predict each outcome from a set of covariates including treatment status; baseline survey measures; administrative data such as pre-intervention test scores and attendance; student demographic and background characteristics; instructor characteristics; school-level contextual variables; and measures of the fidelity of implementation.
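As an illustration of this modeling approach (and not the study's actual analysis code), a multilevel impact model could be specified as follows, with students nested within schools; the file name and all variable names are hypothetical placeholders, and the actual covariate set would mirror the list above.

# Illustrative multilevel (random-intercept) impact model using statsmodels;
# all names are placeholders chosen for this sketch.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("rts_nd_students.csv")  # hypothetical analysis file

# Outcome (e.g., number of college visits) modeled on treatment status and
# baseline covariates, with a random intercept for each school.
model = smf.mixedlm(
    "college_visits ~ treatment + baseline_visits + gpa + attendance_rate",
    data=df,
    groups=df["school_id"],
)
result = model.fit()
print(result.summary())

# The coefficient on "treatment" is the covariate-adjusted estimate of the
# average impact of assignment to the RTS curriculum.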

The schedule for the RTS-ND study is shown in table 6.



Table 6. Schedule for RTS-ND Study

Task No. | Deliverables | Due date
1.3 | IRB approvals | 3/1/2015
1.4 | State SLDS research agreement/MOU | 7/17/2015
2.1 | Draft sample of schools | 1/16/2015
2.2 | Final sample of schools | 5/22/2015
3.1 | Pre-recruitment documentation | 3/18/2015
3.2 | District research agreements | 6/23/2015
3.3 | School recruitment summary | 8/12/2015
4.1 | Draft online (student, instructor, principal) instruments | 2/6/2015
4.2 | Final online (student, instructor, principal) instruments | 6/5/2015
4.3 | Baseline student survey results memo | 10/2/2015
4.4 | Instructor survey results memo | 5/18/2016
4.5 | End-of-year student survey results memo | 6/3/2016
5.1 | Principal interview results memo | 6/3/2016
5.2 | Case study report | 7/8/2016
6.1 | Draft restricted-use data file | 7/15/2016
6.2 | Final restricted-use data file | 8/5/2016
6.3 | Draft summary report | 7/22/2016
6.4 | Final summary report | 8/19/2016




A.17 Reason(s) Display of OMB Expiration Date Is Inappropriate

The expiration date for OMB approval of the information collection will be displayed on data collection instruments and materials. No special exception to this requirement is requested.

A.18 Exceptions to Certification for Paperwork Reduction Act Statement

There are no exceptions to the certification statement identified in the Certification for Paperwork Reduction Act Submissions of OMB Form 83-I.




References

Bowers, Judy, and Hatch, Trish. 2005. The ASCA National Model: A Framework for School Counseling Programs. Alexandria, VA: American School Counselor Association.

Chaplin, Duncan; Bleeker, Mary; and Booker, Kevin. 2010. Roads to Success: Estimated Impacts of an Education and Career Planning Program During Middle School: Final Report. Princeton, NJ: Mathematica Policy Research.

Grodsky, Eric, and Riegle-Crumb, Catherine. 2010. “Those Who Choose and Those Who Don’t: Social Background and College Orientation.” The Annals of the American Academy of Political and Social Science, 627(2): 14-35.

Hoxby, Caroline, and Avery, Christopher. 2012. “The Missing ‘One-Offs’: The Hidden Supply of High-Achieving, Low-Income Students.” NBER Working Paper No. 18586.

Radford, Alexandria W. 2013. Top Student, Top School?: How Social Class Shapes Where Valedictorians Go to College. Chicago, IL: University of Chicago Press.

Rosenbaum, James E. 2001. Beyond College for All: Career Paths for the Forgotten Half. American Sociological Association’s Rose Series in Sociology. New York: Russell Sage Foundation.

Savickas, M. L., and Porfeli, E. J. 2011. “Revision of the Career Maturity Inventory: The Adaptability Form.” Journal of Career Assessment, 19(4): 355-374.

Savickas, M. L., and Porfeli, E. J. 2012. “Career Adapt-Abilities Scale: Construction, Reliability, and Measurement Equivalence Across 13 Countries.” Journal of Vocational Behavior, 80(3): 661-673.

Traynor, A., and Raykov, T. 2013. “Household Possessions Indices as Wealth Measures: A Validity Evaluation.” Comparative Education Review, 57(4): 662-688.


