Evaluation of Transition Supports for Youth with Disabilities

OMB: 1850-0979


Evaluation of Transition Supports for Youth with Disabilities: Second Phase of Data Collection Activities

Supporting Statement, Part B


JULY 2024




Part B. Collection of Information Employing Statistical Methods

Introduction


The Institute of Education Sciences (IES) within the U.S. Department of Education (ED) requests clearance from the Office of Management and Budget (OMB) to conduct new data collection activities for the Evaluation of Transition Supports for Youth with Disabilities study. The evaluation will provide rigorous findings about the effectiveness, implementation, and costs of two new strategies for supporting youth with disabilities (YWD) and their families to prepare for a successful transition from high school to adult life. (Please refer to Sections A.1 and A.2 in Part A for more information about these strategies and the study’s evaluation research questions.)

This is a revision to the original information collection request and requests clearance to measure outcomes and assess the implementation and cost-effectiveness of each strategy, specifically: (a) collection of participating students’ individualized education programs (IEPs), (b) student surveys, (c) school staff surveys, and (d) district cost interviews and staffing records. The original request approved in May 2023 was primarily related to site recruitment (see ICR summary here and supporting statement here).

B.1. Respondent Universe and Sampling Methods


ED’s IES has contracted with the American Institutes for Research (AIR) and its partners, the University of Kansas, University of North Carolina Charlotte (UNCC), and Social Policy Research Associates—collectively referred to as the study team—to carry out the study’s evaluation activities. To evaluate the two new transition support strategies, this study will use a student-level random assignment design within a purposively selected sample of districts and schools. The study team will conduct the study in districts/schools that meet specific eligibility criteria, are interested in implementing the two strategies, and are willing to support the study’s implementation and evaluation requirements. The study team will also work with these districts to select study instructors to deliver the strategies with support from a provider team. The study team will assign YWD whose parents/guardians have provided consent to one of two treatment conditions (SDLMI-Transition [Strategy 1] or SDLMI-Transition with Mentoring [Strategy 2]) or a business-as-usual (BAU) control group. All eligible students whose parents provide affirmative consent and who have been randomized to one of the three conditions will be defined as participating students.

The study team will use the methods below to select and randomly assign the sample.

  • Recruitment and selection of districts and schools. The study team will recruit districts with at least 125 students meeting the eligibility criteria defined below (under recruitment of students), across schools that each serve at least 18 students with IEPs who are approximately two years from completing high school. Recruitment of districts/schools will be completed by Fall 2024 using the following procedures.

    • The study team will reach out via email and phone to potentially eligible districts to initiate conversations about the study and districts’ eligibility. To ensure a strong study design, the study team will exclude districts that already implement self-determination programs closely aligned with Strategy 1 and Strategy 2 or intensive coaching or mentoring programs that are explicitly focused on transition planning. Finally, the study team and IES will select districts that will facilitate outreach to families about study participation, coordinate logistics, and supply the student records needed for the evaluation.

    • After identifying districts that meet the criteria noted above, the study team will recruit high schools in these districts that serve at least 18 eligible students. The study team will reach out to these schools to verify that they meet the study’s eligibility criteria and establish whether they are willing and interested in participating in the study. The study will include only schools that are able to support outreach to families, accommodate the implementation requirements of Strategy 1 and Strategy 2, and supply the student IEP information needed for the evaluation.

  • Recruitment, selection, and random assignment of students. In each participating school, the study team will work with staff to identify eligible students and conduct outreach to their families to encourage participation in the study.

    • Students will be eligible for the study if they have an IEP and are approximately two years from completing high school. The study may also limit eligibility to students on transition “pathways” for which (based on associated coursework and services) participation in the two new strategies would represent a clear difference from BAU.

    • Staff will request informed consent during study enrollment periods in Spring 2024 and Fall 2024; for planning purposes, the study team assumed that 67 percent of families will provide affirmative consent for participation in the study. Once enrollment ends, the study team will collect baseline data on participating students from district/school records and randomly assign equal proportions of these students to the three study groups (Strategy 1, Strategy 2, and BAU). (A minimal sketch of this type of assignment procedure appears after this list.)

  • Selection of the provider team and study instructors. ED and the study team will work with a provider team to support Strategy 1 and Strategy 2 through training, monitoring, and technical assistance. The study team will work with districts in Fall 2024 to select instructors who have relevant education and experience for delivering the two strategies.
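To illustrate the equal-proportion random assignment described above, below is a minimal sketch of one way such a within-school procedure could be implemented. The roster format, group labels, and fixed seed are assumptions for illustration only; this is not the study’s actual assignment mechanism.

```python
# Illustrative sketch only: randomly assign consented students to three
# equally sized study groups within each school.
import random
from collections import defaultdict

def assign_within_school(students, seed=2024):
    """students: iterable of (student_id, school_id) pairs for consented students."""
    rng = random.Random(seed)  # fixed seed shown for reproducibility
    by_school = defaultdict(list)
    for student_id, school_id in students:
        by_school[school_id].append(student_id)

    groups = ["Strategy 1", "Strategy 2", "BAU"]
    assignment = {}
    for school_id, roster in by_school.items():
        rng.shuffle(roster)
        offset = rng.randrange(3)  # rotate labels so remainders vary by school
        for i, student_id in enumerate(roster):
            assignment[student_id] = groups[(i + offset) % 3]
    return assignment
```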

We will use data on all districts, schools, and participating students when evaluating Strategy 1 and Strategy 2. Exhibit B.1 describes the respondent universe and sampling approach for each data collection activity covered by the current request. Exhibit A.3 in Part A contains more information about the purposes and uses of these data, which provide essential information on the provider team’s and study instructors’ implementation of the two new strategies and on the costs of implementation. Additionally, Exhibit A.4 in Part A describes data collection activities that were included in the original information collection request or that do not require OMB clearance; these activities are critical for establishing and characterizing the study sample.

Exhibit B.1. Respondent Universe and Sampling Methods

Data source | Respondent | Respondent universe (estimated) | Sampling approach
Students’ IEPs | District data staff | 16 | Census of participating students (district staff will provide IEPs for all applicable students)
Student surveys | Students | 3,000 | Census of participating students
School staff surveys | School staff | 93 | Census of participating schools
District cost interviews and staffing records | District administrators | 16 | Census of participating districts

B.2. Information Collection Procedures


B.2.1. Statistical Methods for Sample Selection

The study will not use statistical methods to select the samples described in Section B.1. The study team will purposively recruit and select districts and schools based on their interest in and fit with the study. The study team will then conduct outreach to eligible students from these districts and schools and to their parents. Eligible students whose parents provide affirmative consent to participate in the study will become participating students. All participating districts, schools, staff, and students will be part of each applicable data collection. This approach will yield reliable estimates of the effectiveness of Strategy 1 and Strategy 2 compared to one another and to the BAU control group within the study sample. As feasible, we will explore whether the results may be used for statistical inference, with a known degree of precision, about a larger population.

B.2.2. Estimation Procedures

The data collection activities described in this submission will allow the study team to measure key intermediate outcomes for the study sample used in the evaluation and assess the implementation and cost-effectiveness of each strategy. The study team plans to use estimation procedures described in the following subsections to conduct the evaluation and answer the study’s research questions listed in Exhibit B.2.

Exhibit B.2. Primary Research Questions and Applicable Estimation Procedures

RQ# | Research Question | Procedures to Assess Effectiveness | Procedures to Assess Implementation | Procedures to Assess Costs and Cost-Effectiveness
1 | Is instruction in self-determination skills and how to apply them to transition planning (Strategy 1) effective in improving the intermediate and post-school outcomes of students with disabilities? | X | X |
2 | Is offering individual mentoring along with self-determination skill instruction (Strategy 2) effective? | X | X |
3 | What is the added benefit and cost of providing individual mentoring support? | X | | X

Estimation Procedures to Assess Effectiveness. The study team will estimate the effectiveness of Strategy 1 and Strategy 2 by comparing the outcomes for students in each treatment group to outcomes for students in the BAU control group, some of which will be obtained through district/school records for students, IEPs, student surveys, and school staff surveys. The study team will estimate overall, intent-to-treat effects for each outcome based on the following hierarchical linear regression model that accounts for student- and school-level factors:1

$$Y_{ij} = \alpha_j + X_{ij}'\beta + \gamma_1 T^{(1)}_{ij} + \gamma_2 T^{(2)}_{ij} + \varepsilon_{ij} \qquad (1)$$

where $Y_{ij}$ is the outcome for student $i$ in school $j$; $\alpha_j$ represents a set of school fixed effects; $X_{ij}$ is a vector of baseline covariates for student $i$; $T^{(1)}_{ij}$ and $T^{(2)}_{ij}$ are indicators for whether student $i$ was assigned to Strategy 1 or Strategy 2, respectively; and $\varepsilon_{ij}$ is the student-level error term. The model includes baseline covariates, drawing on information from district/school records for students, to improve precision and guard against any imbalances that arise due to chance or attrition. In this model, $\gamma_1$ and $\gamma_2$ represent the effects of Strategy 1 and Strategy 2 relative to BAU.

In addition to estimating the effects of Strategy 1 and Strategy 2 compared to the BAU control condition, the study team will also test whether effects differ between Strategy 1 and Strategy 2. These tests will be based on the difference between $\gamma_1$ and $\gamma_2$ in Equation 1.
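To make the estimation concrete, the following is a minimal sketch of fitting Equation 1 and testing the Strategy 1 versus Strategy 2 contrast using statsmodels; it is not the study’s actual analysis code. The data frame layout and column names (outcome, school, strategy1, strategy2, and baseline covariates x1 and x2) are assumptions for illustration.

```python
# Sketch of Equation 1: OLS with school fixed effects (school indicators),
# baseline covariates, and two treatment indicators. Column names are
# hypothetical; the study's covariates will come from district/school records.
import pandas as pd
import statsmodels.formula.api as smf

def estimate_equation_1(df: pd.DataFrame):
    model = smf.ols(
        "outcome ~ C(school) + x1 + x2 + strategy1 + strategy2",
        data=df,
    ).fit(cov_type="HC2")  # heteroskedasticity-robust standard errors
    effects = model.params[["strategy1", "strategy2"]]  # gamma_1, gamma_2 vs. BAU
    contrast = model.t_test("strategy1 - strategy2 = 0")  # Strategy 1 vs. Strategy 2
    return effects, contrast
```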

The study team will also estimate the effectiveness of Strategy 1 and Strategy 2 for key subgroups of students as part of an exploratory analysis. For these subgroups, the study team will estimate effects using a variation of Equation 1 in which the treatment indicators are interacted with indicators for each pair of subgroups defined by a binary baseline characteristic (such as lower or higher self-determination skills). The study team will use the resulting estimates to compare effects across subgroups.
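Continuing the sketch above (with a hypothetical binary subgroup column added to the same illustrative data frame), the exploratory subgroup analysis could interact each treatment indicator with the subgroup indicator:

```python
# Sketch of the exploratory subgroup analysis: interact each treatment
# indicator with a binary baseline characteristic (e.g., lower vs. higher
# initial self-determination skills). Column names remain hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

def estimate_subgroup_effects(df: pd.DataFrame):
    model = smf.ols(
        "outcome ~ C(school) + x1 + x2 + strategy1 + strategy2 + subgroup"
        " + strategy1:subgroup + strategy2:subgroup",
        data=df,
    ).fit(cov_type="HC2")
    # Each interaction coefficient estimates how a strategy's effect differs
    # between the two subgroups.
    return model.params[["strategy1:subgroup", "strategy2:subgroup"]]
```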

Estimation Procedures to Assess Implementation. The study team will describe the students participating in the study using measures of their characteristics, achievement, and school participation and progress, drawing on information in district/school records for students. The study team will calculate and report means for continuous measures and percentages for binary and categorical measures. The study team will then put these results in context by comparing them to summary statistics calculated for the full set of eligible students in the study’s districts who were included in outreach efforts. In addition, the study team will tabulate the most common reasons why families decline to be part of the study to provide information on barriers to participation, including the extent to which the requirement to provide participants’ social security numbers deters study participation. This information may be useful for the design and implementation of future related studies.

The study team will also describe and compare the transition services received by treatment and BAU students, using information from the district contextual information forms and district/school records for students. One analysis will measure the service contrast between study groups based on the relative prevalence of transition services that Strategy 1, Strategy 2, and BAU students receive, focusing on services that are particularly relevant to this study, such as self-determination instruction. Service contrast estimates will be based on a variation of Equation 1 with outcomes set to measures of participation in these services. Another analysis will describe implementation fidelity and challenges. The study team will use the provider’s records to construct fidelity measures and summarize how they vary across student characteristics, instructor characteristics, school characteristics, and by district. The study team will also measure the prevalence of challenges and solutions to delivering important intervention features and the degree of student or family uptake of these features.

Estimation Procedures to Assess Costs and Cost-Effectiveness. The study team will use a resource cost model (RCM) to measure and analyze the costs of Strategies 1 and 2 based on the “ingredients” approach to cost analysis (Levin, 1983; Levin & McEwan, 2001). The study team will develop an RCM using the CostOut tool (Hollands et al., 2015), and will use the RCM to calculate the per-student costs of Strategy 1 and Strategy 2 and analyze variation in costs across schools and districts. The study team will also produce cost-effectiveness estimates by dividing the per-student cost by the impact estimate for the given outcome.
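As a toy illustration of the last two computations, the sketch below prices hypothetical ingredients, derives a per-student cost, and divides it by an impact estimate. All names and numbers are invented for illustration; they are not actual study costs or CostOut output.

```python
# Toy sketch of the "ingredients" costing step and the cost-effectiveness ratio.
def per_student_cost(ingredients, n_students):
    """ingredients: list of (quantity, unit_price_usd) tuples for one site."""
    total = sum(quantity * price for quantity, price in ingredients)
    return total / n_students

def cost_effectiveness_ratio(cost_per_student, impact_estimate):
    """Dollars per one-unit gain on the outcome (e.g., per effect-size unit)."""
    return cost_per_student / impact_estimate

# Hypothetical site: 400 instructor hours at $120/hour plus 80 mentor hours
# at $50/hour, spread over 80 students -> $650 per student.
cost = per_student_cost([(400, 120.0), (80, 50.0)], n_students=80)
ratio = cost_effectiveness_ratio(cost, impact_estimate=0.30)  # ~$2,167 per SD gained
```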

B.2.3. Degree of Accuracy Needed

The study is intended to reliably answer the research questions about effectiveness shown previously in Exhibit B.2. Given the random assignment design, the study’s sample sizes were chosen to yield sufficient statistical power to detect impacts of a size that Strategy 1 and Strategy 2 might plausibly achieve and that the experts and stakeholders advising the study consider meaningful and policy relevant.

To assess statistical power, the study team used findings from past research to determine the potential impacts of the strategies that would need to be detected, focusing on the following key intermediate and post-school outcomes:

  • A key intermediate outcome is the student self-determination score from an assessment in the student survey to be conducted at the end of the implementation period for the strategies (Spring 2026).

    • Past research suggests that Strategy 1 could increase self-determination scores by 0.30 standard deviations, given the range of findings from evaluations of self-determination interventions for YWD in high school (Wehmeyer et al., 2011, 2013; Zhang, 2001).2

    • An evaluation of a program similar to Strategy 2 for YWD nearing age 18 found self-determination impacts of approximately 1.10 standard deviations (Powers et al., 2012).

  • A key outcome to measure post-school success is the rate of engagement in employment or postsecondary education in the second year after the implementation period ends (i.e., July 2027 to June 2028)—which corresponds to the second year after expected graduation from high school for students in the study.

    • For Strategy 1, the study team identified correlational research suggesting that a potential self-determination impact of 0.30 standard deviations could translate into an impact of 0.14 standard deviations for employment and enrollment rates after high school.3 (No rigorous evaluations of self-determination interventions have produced impact estimates for post-school outcomes.) This corresponds to a potential Strategy 1 impact of 6.8 percentage points for engagement in employment or postsecondary education after high school, given the prevalence of these outcomes.4

    • Additionally, the evaluation of a program similar to Strategy 2 for YWD nearing age 18 found impacts of 17 percentage points for employment and 16 percentage points for post-secondary enrollment in the year after implementation of that program ended (Powers et al., 2012).

As indicated in Exhibit B.3, a sample size of 3,000 students will be sufficient to detect impacts of this size (or smaller), as well as comparably sized differences in impacts between Strategy 1 and Strategy 2. The exhibit shows minimum detectable effect sizes (MDESs) at an 80 percent power level for (a) self-determination scores measured for survey respondents, assuming 20 percent attrition, (b) engagement in employment or postsecondary education measured for survey respondents, assuming 20 percent attrition, and (c) engagement in employment or postsecondary education measured using administrative data, assuming no attrition. Considering the full sample, the MDES for self-determination scores is below 0.30 standard deviations, and MDESs for engagement in employment or postsecondary education are below 6.8 percentage points.

Exhibit B.3 also shows MDESs for a subgroup comprising 50 percent of the overall study sample (such as students with lower or higher initial self-determination skills). Given the size of the potential impacts described above, the MDESs indicate that exploratory analyses for such a subgroup:

  • Will have sufficient power to detect the potential self-determination impacts of Strategy 1 and Strategy 2, as well as potential differences in these impacts between the strategies;

  • Will also have sufficient power to detect the potential post-school impacts of Strategy 2, as well as potential differences in post-school impacts between strategies; and

  • Will have more limited power to detect the potential post-school impacts of Strategy 1. As shown in the exhibit, with an 80 percent power level, MDES values are approximately 8 to 9 percentage points. While these values are slightly larger than the 6.8 percentage point potential Strategy 1 impact noted above, effects of this size would still be policy relevant and considered meaningful by the experts advising the study.

Exhibit B.3. MDES Values for Pairwise Comparisons (Strategy 1 vs. BAU, Strategy 2 vs. BAU, and Strategy 1 vs. Strategy 2)

Sample | Self-determination scores (standard deviations): 20% attrition | Engagement in employment or postsecondary education (standard deviations / percentage points): 20% attrition | Engagement in employment or postsecondary education (standard deviations / percentage points): no attrition
Full sample (3,000 students) | 0.113 | 0.129 / 6.30 pp | 0.116 / 5.64 pp
50% subgroup | 0.160 | 0.183 / 8.92 pp | 0.164 / 7.98 pp

Note. MDESs were calculated using PowerUp! (Dong & Maynard, 2013). Common assumptions for all entries: (a) students will be equally divided across Strategy 1, Strategy 2, and BAU; (b) impacts will be estimated using Equation 1; (c) the study seeks an 80 percent power level and will use two-tailed statistical tests with a 0.05 significance level. The calculations also assume that the R2 from covariates and school fixed effects is 0.15 for engagement in employment or postsecondary education and 0.35 for self-determination scores, based on Hedges and Hedberg (2013) and the study team’s experience analyzing data on employment and education outcomes for YWD. In the first two columns, the conversion from standard deviations to percentage points (pp) assumes an underlying rate of engagement in employment or postsecondary education of 61 percent.
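For reference, the entries in Exhibit B.3 can be closely approximated with the standard MDES formula for a two-group, individually randomized comparison, MDES ≈ M × sqrt((1 − R²) / (P(1 − P)n)). The sketch below applies the note’s assumptions with the large-sample multiplier M ≈ 2.80 for a two-tailed test at α = .05 and 80 percent power; PowerUp! applies small degrees-of-freedom adjustments, so some values differ slightly in the final decimal place.

```python
# Approximate the MDES values in Exhibit B.3. Each pairwise comparison uses
# two of the three equal groups (P = 0.5), so the full sample contributes
# 2,000 of the 3,000 students to any one comparison.
import math

M = 2.80  # approx. 1.96 (alpha = .05, two-tailed) + 0.84 (80 percent power)

def mdes(n_analyzed, r_squared, p=0.5):
    return M * math.sqrt((1 - r_squared) / (p * (1 - p) * n_analyzed))

def to_percentage_points(effect_size, base_rate=0.61):
    return effect_size * math.sqrt(base_rate * (1 - base_rate)) * 100

n_pair = 2000  # full sample, one pairwise comparison
print(round(mdes(n_pair * 0.8, 0.35), 3))  # 0.113: self-determination, 20% attrition
print(round(mdes(n_pair * 0.8, 0.15), 3))  # 0.129: engagement, 20% attrition
print(round(mdes(n_pair, 0.15), 3))        # ~0.116: engagement, no attrition
print(round(to_percentage_points(mdes(n_pair * 0.8, 0.15)), 2))  # ~6.30 pp
```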

B.2.4. Unusual Problems Requiring Specialized Sampling Procedures

There are no unusual problems requiring specialized sampling procedures.

B.2.5. Use of Periodic (Less than Annual) Data Collection to Reduce Burden

IES and the study team have carefully considered the frequency of each data collection covered under this request, and plan to collect data at the minimum possible frequencies to minimize burden while meeting the requirements of the planned analyses.

  • The study will collect IEPs from district staff and administer school staff surveys at baseline (Fall 2024) and near the end of the implementation period (Spring 2026).

  • The study will collect information from district cost interviews and staffing records only once per academic year (Spring 2025 and Spring 2026) from each applicable respondent. Each data collection asks respondents to recall activities over the past year; extending the recall window beyond a year would reduce response quality.

  • The study will administer student surveys more frequently than annually: at baseline (Fall 2024), at one interim time point (Fall 2025), and near the end of the implementation period (Spring 2026). Administering the interim survey in addition to the baseline and final surveys is necessary to examine students’ interim progress toward key outcomes and increase the quality of responses to questions asking students to recall their experiences in transition planning meetings.

B.3. Methods to Maximize Response Rates and Address Nonresponse


B.3.1. Methods to Maximize Response Rates

To maximize response rates, we will work closely with participating districts and use strategies that the study team has used successfully in past studies that collected information from district staff, school staff, and students facing barriers (including the Impact Evaluation of Training in Multi-Tiered Systems of Support for Reading in Early Elementary School, the Impact Evaluation of Training in Multi-Tiered Systems of Support for Behavior, and the Impact Evaluation of Parent Messaging Strategies on Student Attendance).

In general, we will partner with respondents to establish procedures at the start of the study, emphasize the importance of following these procedures, and provide advance notifications and follow-ups to remind respondents of the study’s data collection expectations. Further, as discussed in Section B.4, we will pretest or pilot the student survey form to ensure that it is concise and clear.

Below are additional features of our strategies for maximizing response rates for the data collections covered by this study’s current clearance request.

  • Students’ IEPs (100 percent expected response rate). The study team will establish clear expectations for district staff to submit records in an initial memorandum of understanding (MOU) with each district. The study team will also adhere to any district data requirements, such as preparing research applications. To make sharing of students’ IEP documents as easy as possible for respondents, the study team will provide detailed information on the documents requested. The study team will also appoint a data liaison for each district. This liaison will notify district staff in advance of the need to share the student IEPs, remind district staff of upcoming dates for sharing IEPs, follow up by email and telephone as needed to answer questions and encourage submissions, and accept responses in electronic or hard copy format.

  • Student surveys (80 percent expected response rate). The student survey is designed to be brief, to minimize the time required of students (see Appendix A). The student surveys will be administered electronically during the school day as part of students’ time spent on the project, and supports for students will be provided as needed to help them complete the survey. If needed, the study team will also offer a paper version of the survey. The study team will make surveys available in both English and Spanish, and will consider other translations if a participating district indicates that other languages are spoken by students.

  • School staff surveys (80 percent expected response rate). The school staff survey is designed to be brief, to minimize the time required of staff (see Appendix B). The study team’s data liaison for the district will ensure that school staff are reminded in advance and understand the importance of these data collections, follow up by email and telephone as needed to answer questions and encourage submissions, and accept responses in electronic or hard copy format.

  • District cost interviews and staffing records (100 percent expected response rate). The cost interview protocol is designed to be brief, to minimize the time required of staff (see Appendix C). The study team will prepopulate information wherever possible, such as dates, length, and staff attendance for meetings held with study or provider team staff related to implementing Strategy 1 and Strategy 2. Study team members will be responsible for maintaining contact with the districts, ensuring that the district leaders are reminded of the interview and staffing records requests in advance and understand the importance of these data collections. Interviews will be scheduled a month in advance, with reminders sent a week before.

B.3.2. Methods to Deal with Issues of Nonresponse

The only data collections covered by the current clearance request for which the study team expects any unit nonresponse are the student survey and the school staff survey. We expect response rates to be above 80 percent; however, should response rates fall below 80 percent, the study team will conduct nonresponse analyses. First, the study team will compare administrative data on the characteristics of students who completed the surveys to the characteristics of those who did not. Second, the study team will use these baseline characteristics in a statistical model to predict the probability that a student responded to the survey. If these analyses point to the possibility of nonresponse bias, the study team will create sampling weights based on the observable baseline characteristics and use the weights in analyses.
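A minimal sketch of this kind of adjustment appears below: model response propensity from baseline characteristics with a logistic regression, then weight respondents by the inverse of their predicted response probability. The data frame and column names (responded, baseline_score, disability_category, school) are assumptions for illustration.

```python
# Sketch of an inverse-propensity nonresponse adjustment. Column names are
# hypothetical placeholders for the study's baseline characteristics.
import pandas as pd
import statsmodels.formula.api as smf

def nonresponse_weights(df: pd.DataFrame) -> pd.Series:
    """df: one row per randomized student; `responded` coded 0/1."""
    propensity_model = smf.logit(
        "responded ~ baseline_score + C(disability_category) + C(school)",
        data=df,
    ).fit(disp=False)
    p_respond = propensity_model.predict(df)
    # Respondents receive weight 1/p; nonrespondents are left unweighted (NaN).
    return (1.0 / p_respond).where(df["responded"] == 1)
```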

B.4. Tests of Procedures


During the 60-day public comment period, the study team pilot tested the student survey instrument with nine individuals and the staff survey instrument with eight individuals. Each pilot test included representatives of each respondent population. Based on feedback received during testing, the study team made changes to the student survey instrument and the school staff survey instrument to ensure that respondents can understand and complete the surveys accurately and as intended.

In the test of the student survey, the study team asked respondents to complete the survey and share feedback about their experience with support from a school staff member who administered the survey. The study team focused on students’ perceptions of the organization and format of the survey, words or phrases that were unclear and instructions that were not straightforward, and the overall presentation and interpretation of the survey. The study team also observed respondents completing the survey and noted questions and challenges that arose during survey administration. The study team used feedback from the student survey testing to revise and improve the wording and ordering of instructions, questions, and response options.

In the test of the school staff survey, the study team asked respondents to review the survey questions and provide feedback on the feasibility of collecting the requested information for each participating student. Respondents also shared their perceptions of the organization and format of the survey, words or phrases that were unclear, instructions that were not straightforward, and the overall presentation and interpretation of the survey. The study team used this feedback to revise and improve the wording of instructions and questions and to revise the estimated time required to complete the survey. Respondents reported that they would need to gather information from colleagues to complete some survey questions, so the revised time estimate accounts for the time required for respondents to request, and their colleagues to share, that information.

The study team did not pilot test the request for students’ IEPs or the district cost interview protocol and staffing records request. Both requests are for extant records in their current form and are based closely on similar requests used in prior studies. The IEP request asks districts to provide copies of participating students’ IEPs in whatever form they currently exist. The district cost interview protocol and staffing records request asks districts to provide summaries and extant documents related to the processes and procedures used to implement Strategy 1 and Strategy 2. Members of the study team have also communicated extensively with district staff about similar information in the past when providing technical assistance, so these requests will be familiar to district staff.

B.5. Individuals Consulted on Statistical Aspects of the Design and Leading Data Collection/Analysis


The study team members listed in Exhibit B.5 provided primary consultation for ED about the design of the study and data collection plan and/or will lead the data collection and analysis.

Exhibit B.5. Key Consultants on Statistical Design / Leads for Data Collection and Analysis

Name | Role(s) | Title and Affiliation | Telephone Number
Tamara Linkow | Consultation on statistical design, lead for data collection/analysis | Senior Director, AIR | (202) 403-6822
Jessica Heppen | Consultation on statistical design, lead for data collection/analysis | President and CEO, AIR | (202) 403-5488
Michael Garet | Consultation on statistical design, lead for data collection/analysis | Vice President and Institute Fellow, AIR | (202) 403-5345
Valerie Mazzotti | Consultation on statistical design, lead for data collection/analysis | Professor, University of Kansas | (704) 687-8179
Seth Brown | Consultation on statistical design, lead for data collection/analysis | Principal Researcher, AIR | (781) 373-7034
Garima Siwach | Consultation on statistical design, lead for data collection/analysis | Senior Researcher, AIR | (202) 403-5686
Megan Austin | Consultation on statistical design, lead for data collection/analysis | Principal Researcher, AIR | (202) 403-5301

References


Dong, N., & Maynard, R. A. (2013). PowerUp! A tool for calculating minimum detectable effect sizes and sample size requirements for experimental and quasi-experimental designs. Journal of Research on Educational Effectiveness, 6(1), 24–67.

Hedges, L. V., & Hedberg, E. C. (2013). Intraclass correlations and covariate outcome correlations for planning two- and three-level cluster-randomized experiments in education. Evaluation Review, 37(6), 445–489.

Hollands, F. M., Hanisch-Cerda, B., Levin, H. M., Belfield, C. R., Menon, A., Shand, R., Pan, Y., Bakir, I., & Cheng, H. (2015). CostOut®. Teachers College, Columbia University. Available online at https://www.cbcse.org/costout.

Levin, H. (1983). Cost-effectiveness: A primer. Sage.

Levin, H., & McEwan, P. (2001). Cost-effectiveness analysis: Methods and applications. Sage.

Ruggles, S., Flood, S., Foster, S., Goeken, R., Pacas, J., Schouweiler, M., & Sobek, M. (2021). IPUMS USA: Version 11.0 [data set]. IPUMS. https://doi.org/10.18128/D010.V11.0

Shogren, K. A., Lee, J., & Panko, P. (2017). An examination of the relationship between postschool outcomes and autonomy, psychological empowerment, and self-realization. Journal of Special Education, 51(2), 115–124.

Wehmeyer, M. L., Palmer, S. B., Lee, Y., Williams-Diehm, K., & Shogren, K. (2011). A randomized-trial evaluation of the effect of Whose Future Is It Anyway? on self-determination. Career Development for Exceptional Individuals, 34(1), 45–56.

Wehmeyer, M. L., Palmer, S. B., Shogren, K., Williams-Diehm, K., & Soukup, J. H. (2013). Establishing a causal relationship between intervention to promote self-determination and enhanced student self-determination. The Journal of Special Education, 46(4), 195–210.

What Works Clearinghouse. (2020). Standards handbook, version 4.1. https://ies.ed.gov/ncee/wwc/Docs/referenceresources/WWC-Standards-Handbook-v4-1-508.pdf

Wooldridge, J. M. (2010). Econometric analysis of cross section and panel data. MIT Press.

Zhang, D. (2001). The effect of Next STEP instruction on the self-determination skills of high school students with learning disabilities. Career Development for Exceptional Individuals, 24(2), 121–132.






1 The main analysis will use linear models for binary outcome measures. Estimates from linear models tend to be similar to marginal effects derived from nonlinear models such as logits, and linear results are more directly interpretable (Wooldridge, 2010). As a sensitivity check, the study will report marginal effects from logit models for binary outcomes.

2 While some reported estimated impacts were smaller than 0.30 standard deviations, the past research analyzed interventions that were generally delivered at a lower intensity than what is planned for Strategy 1.

3 This statement is based on an analysis of YWD that measured associations between (a) self-determination scores at ages 16 to 18 and (b) subsequent college enrollment, employment status, and independent living outcomes (Shogren et al., 2017). The authors’ results for positive and statistically significant associations suggest correlations of at least 0.45. Hence, a 0.30 standard deviation change in intermediate self-determination scores could lead to a 0.30 × 0.45 ≈ 0.14 standard deviation change in post-school outcomes.

4 According to American Community Survey data from 2015 to 2019 on YWD ages 19–21 who had completed 10th grade, this engagement rate was approximately 61 percent. These data were obtained from the IPUMS-USA database (Ruggles et al., 2021). With this prevalence, an impact of 0.14 standard deviations is equivalent to 0.14 × sqrt(0.61 × (1 − 0.61)) ≈ 0.068, or 6.8 percentage points.
