Transitional Living Program Evaluation



OMB Information Collection Request

0970 - 0383




Supporting Statement

Part B

Updated May 2018


Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Caryn Blitz, Office of Planning, Research and Evaluation

Angie Webley, Family Youth Services Bureau

Part B Table of Contents

B1. Respondent Universe and Sampling Methods

B2. Procedures for Collection of Information

B3. Methods to Maximize Response Rates and Deal with Nonresponse

B4. Tests of Procedures or Methods to be Undertaken

B5. Individual(s) Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data



B1. Respondent Universe and Sampling Methods


The sampling frame includes all TLPs funded in September 2017; 20 TLPs will be selected to participate in the study. The selection process is purposive. A pool of candidate TLPs will be identified from among FYSB’s TLP grantees for recruitment into the study based on the number of expected entries, an estimate of each TLP’s service volume over a 12-month period. Maternity Group Homes will be excluded from consideration because these programs serve a special subpopulation and offer a unique set of services to address that subpopulation’s needs. New TLPs with little prior program experience will also be excluded.


ACF provided the contractor with a complete list of TLP grantees funded in September 2017, as well as information about their service volume. To identify an initial set of grantees, TLPs will be rank-ordered according to their projected service volume, and those with high service volume will form the candidate pool. The contractor is currently reviewing the candidate TLPs to identify those to include in the pre-post study. This review evaluates grantees against three primary criteria: (1) size of the agency, (2) the TLP’s track record (i.e., exclusion of programs with little prior experience), and (3) whether the program serves only pregnant or parenting youth (i.e., exclusion of maternity group homes).
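
As an illustration of this screening logic only, the sketch below rank-orders a hypothetical grantee list by projected service volume and applies the two exclusions. The file name, column names, experience threshold, and pool size are placeholders for illustration, not the contractor’s actual selection rules.

```python
# Illustrative screening of candidate TLP grantees; the column names,
# experience threshold, and pool size below are placeholders.
import pandas as pd

grantees = pd.read_csv("tlp_grantees_sept2017.csv")  # hypothetical grantee list

candidates = (
    grantees[~grantees["maternity_group_home"]]            # exclude Maternity Group Homes
    .loc[lambda d: d["years_operating"] >= 2]               # exclude new, inexperienced TLPs
    .sort_values("projected_annual_entries", ascending=False)  # rank by projected service volume
)

# The top of the ranked list forms the pool from which the 20 study sites are recruited.
recruitment_pool = candidates.head(40)
print(recruitment_pool[["grantee_name", "projected_annual_entries"]])
```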


Achieving the requisite sample size requires that the study target relatively large TLPs. Service volume is important because large TLPs have high youth entry/exit rates and thus contribute substantially to our ability to recruit enough youth into the study to produce reliable outcome estimates. We estimate needing a sample of at least 600 youth participants to adequately power the outcome analysis, which means that, on average, each of the 20 TLPs participating in the study would need to enroll 30 youth. Most TLPs serve a relatively small number of youth; among the September 2017 grantees, the average number of youth served annually was about 10, ranging from 1 to 54.

Across the 20 grantees ultimately included in the study, the intent is to achieve a total sample size of 600 youth. Thus, the average agency will enroll 30 youth in the study over an estimated 24-month period. This sample will allow us to detect effects of TLP participation on binary outcomes (e.g., stable housing) of between 5 and 10 percentage points. In Exhibit B.1.1, we present calculated Minimum Detectable Effects (MDEs) for this design at 6 months and at 12 months following program enrollment. At 6 months, the exhibit shows MDEs of 0.167 standard deviations for continuous outcomes (e.g., delinquency score after 6 months) and 5 to 8 percentage points for binary outcomes. At 12 months, the exhibit shows MDEs of 0.205 standard deviations for continuous outcomes and 6 to 10 percentage points for binary outcomes. The estimated MDEs assume a response rate of 75 percent for the 6-month follow-up survey and 50 percent for the 12-month follow-up survey. Both sets of estimates assume a pre-post correlation of 0.2; this low correlation was selected to provide conservative MDE estimates, and outcome models with a higher R-squared would yield smaller MDEs.







Exhibit B.1.1. Minimum Detectable Effects (MDEs)

                                              6 months after    12 months after
                                              enrollment        enrollment
Number of youth in analytic sample            N = 450           N = 300
Continuous outcomes (standard deviations)     0.167             0.205
Binary outcomes (percentage points)
   Control mean of 10% (or 90%)               5.0               6.2
   Control mean of 30% (or 70%)               7.7               9.4
   Control mean of 50%                        8.4               10.3
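
The MDEs in Exhibit B.1.1 can be approximated with the standard formula for a pre-post comparison of means on the same sample, MDE = (z for the two-tailed test plus z for the power level) times the square root of 2(1 - rho)/n. The sketch below is illustrative only: the 80 percent power level is our assumption (it is not stated above), while the analytic sample sizes and the pre-post correlation of 0.2 come from the text.

```python
# Illustrative MDE calculation for the pre-post design in Exhibit B.1.1.
# Assumes a two-tailed 5 percent test and 80 percent power (the power level
# is an assumption; it is not stated in the text above).
from scipy.stats import norm

def mde_sd(n, rho=0.2, alpha=0.05, power=0.80):
    """MDE in standard deviation units for comparing pre and post means
    measured on the same n youth with pre-post correlation rho."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return z * (2 * (1 - rho) / n) ** 0.5

def mde_pp(n, p, rho=0.2):
    """MDE in percentage points for a binary outcome with baseline mean p."""
    return 100 * mde_sd(n, rho) * (p * (1 - p)) ** 0.5

for n, label in [(450, "6 months"), (300, "12 months")]:
    print(f"{label}: continuous MDE = {mde_sd(n):.3f} standard deviations")
    for p in (0.10, 0.30, 0.50):
        print(f"  control mean of {p:.0%}: {mde_pp(n, p):.1f} percentage points")
```

Under these assumptions the sketch closely reproduces the exhibit’s values (e.g., 0.167 standard deviations and 8.4 percentage points at 6 months).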


Analytic Methods

[NOTE: The following description of the analytic methods replaces what was previously presented in SSA under the subheading “Plans for Tabulation.”]

The goal of the outcome analysis is to estimate the effects that TLPs participating in the study have on youth development within the domains of stable housing, positive social connections, social and emotional well-being, and education or employment. The effect of participating in a TLP will be estimated at 6 months after enrollment (E6mo) and at 12 months after enrollment (E12mo). The effect at 6 months will be estimated by comparing the average (mean) score on a given outcome for youth 6 months after enrollment (Y̅6mo) to the mean pretest score for that outcome measured at baseline, prior to TLP participation (Y̅b). The effect at 12 months will be estimated using the same method, comparing the mean score on that outcome for youth 12 months after enrollment (Y̅12mo) to the same baseline pretest mean (Y̅b).


E6mo = Y̅6mo - Y̅b

E12mo = Y̅12mo - Y̅b


Following standard practice, we will use regression adjustment to increase the precision of the estimated effect (Orr, 1999). The statistical model for regression adjustment to estimate the effect of TLP participation 6 or 12 months after program enrollment on an outcome Y (e.g., risky behavior) is presented in equation (1):


      (1)   Yi = β0 + β1Prei + δXi + ϵi

where:

Yi is the posttest outcome of interest for youth i, measured at 6 months or 12 months;

β0 is a constant;

Prei is the pretest measure of the outcome for youth i, measured at baseline;

β1 is the coefficient on the pretest covariate, which provides an estimate of the effect of TLP participation;

Xi is a vector of dummy variables for study sites and baseline characteristics or control variables for youth participating in the study (e.g., race, gender, and age at baseline);

δ is a vector of regression coefficients corresponding to those covariates; and

ϵi is a random error term with mean 0 and variance σ².



To determine if TLP participation had a statistically significant effect on youth, we will conduct a t-test for each outcome measure. If the estimate of β1 is statistically significant at the 5-percent level using a two-tailed test, we will conclude that we have found convincing scientific evidence that the intervention affected the outcome measure; otherwise, we will conclude that there is no convincing scientific evidence of an effect on this outcome.


For continuous or categorical outcomes, we plan to estimate the model above using ordinary least squares (OLS), which assumes that the error terms have a normal distribution (i.e., form a bell-shaped curve) with homoscedasticity (i.e., a common variance). For binary (dichotomous) outcomes, models will be estimated using logistic regression, and we will report the marginal effect on the probability of observing the binary outcome.


We have no a priori reason to expect homoscedasticity, because some TLPs could have higher variability in youth outcomes than other TLPs (Angrist & Pischke, 2008). To address potential heteroscedasticity and account for variation in continuous and categorical youth outcomes across TLPs, we will include site-level indicator variables (“fixed effects”) in our linear models, and we will compute robust standard errors (i.e., Huber-Eicker-White robust standard errors; Huber, 1967; Greene, 2003; White, 1980, 1984).
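
To make the estimation approach concrete, the sketch below fits equation (1) for one continuous outcome with site fixed effects and Huber-Eicker-White (HC1) robust standard errors, and fits a logistic regression with average marginal effects for one binary outcome, using statsmodels. The file name and variable names (outcome_6mo, pretest, site, stable_housing_6mo, and the covariates) are placeholders for illustration, not the study’s actual data layout.

```python
# Illustrative estimation of equation (1); file and variable names are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tlp_analysis_file.csv")  # hypothetical analytic file

# Continuous outcome: OLS with site fixed effects and HC1 robust standard errors.
ols_results = smf.ols(
    "outcome_6mo ~ pretest + C(site) + age + C(gender) + C(race)", data=df
).fit(cov_type="HC1")
print(ols_results.summary())  # coefficient t-tests at the 5-percent level

# Binary outcome (e.g., stable housing): logistic regression, reported as
# average marginal effects on the probability of the outcome.
logit_results = smf.logit(
    "stable_housing_6mo ~ pretest_housing + C(site) + age + C(gender) + C(race)",
    data=df,
).fit()
print(logit_results.get_margeff(at="overall").summary())
```

The same specification would be re-estimated with the 12-month outcome measures.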


Ultimately, the analysis will produce estimates indicative of TLPs’ efficacy—that is, the effects on youth of relatively large and relatively well-designed TLPs based on the study selection criteria—rather than the effect of the average TLP.



B2. Procedures for Collection of Information

The evaluation will collect information on youth baseline characteristics and behaviors from approximately 600 youth across 20 grantees. The research approach uses a series of web-based surveys to collect data from youth. A secure, encrypted, passcode-protected website will serve as the portal for data collection and will allow research staff to monitor survey completion rates. The website will permit youth survey respondents to log in using a unique username and password and complete their respective surveys online.


Before data collection begins, trained TLP staff will obtain youth consent. Then they will administer the baseline survey at the TLP, which will involve seating each youth respondent at a computer (in the designated private space) and assisting them in registering and logging into the web portal in order to complete the survey. English and Spanish versions of the survey will be available so that respondents can choose their preferred language. The respondent will be left to complete the survey in private. The final screen of the survey will inform youth they have completed the survey and ask them to confirm the method by which they would like to receive their incentive (an electronic gift card or gift card code, sent by email or text, or mail if neither of those options is possible). The youth will then exit the survey, real-time verification of survey completion will be automatically recorded in the survey database, and the incentive will be sent to the youth.


The follow-up surveys will be self-administered. The contracted research team will invite all youth enrolled in the study to complete the 6- and 12-month follow-up surveys using the communication strategy previously indicated by the youth (email, texting, etc.). Youth respondents will also receive support from program staff, when possible, to remind them about the follow-up survey, provide instructions on how to access the web survey, and assist with resetting their password if needed. Repeated reminders will be sent electronically until the survey has been accessed and completed, with telephone and in-person outreach from TLP staff and outreach to secondary or tertiary contacts if permitted by the youth.



B3. Methods to Maximize Response Rates and Deal with Nonresponse

Collecting data from homeless youth will be the greatest challenge of this study, because many are expected to be transitory and lack fixed addresses. To obtain adequate response rates, we will implement a robust data monitoring and tracking process. The study team will employ several outreach tactics to obtain the highest response rates possible from study participants for each of the surveys. The primary mode of outreach will be email or cell phone text message. (Note that upon enrollment into the study, youth will have an opportunity to refuse text messaging from the study team if this approach would cause them to incur additional costs; some cell phone plans include unlimited text messaging, while others charge for it.)


In addition, the study team will send email or text message invitations for the 6- and 12-month surveys. The invitations will include a link to the study’s web portal, where youth will log in using their unique username and password (set up during enrollment). Up to two email/text reminders will be sent to nonresponders before moving them to outreach by TLP staff. For those who do not complete the surveys within 48 hours of the second email/text invitation, specially trained TLP staff will reach out to the youth by telephone or in person to complete the survey using a study-provided laptop with an internet connection. Alternatively, TLP staff may offer to help troubleshoot any issues the youth may have connecting to the survey (e.g., a password reset), logging into the study web portal, or completing the survey. Staff will use secondary and tertiary contact information obtained from the most recently completed survey to contact individuals who may know the youth’s whereabouts or have updated contact information for them.


The study team will actively monitor data collection and produce bi-weekly reports on the status (e.g., response rates) of each participating TLP agency.
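
A minimal sketch of such a status report is shown below, assuming a hypothetical tracking extract with one row per invited youth; the file and column names are placeholders, not the study’s actual tracking schema.

```python
# Illustrative biweekly response-rate report by TLP agency; file and column
# names are placeholders for the study's tracking data.
import pandas as pd

tracking = pd.read_csv("survey_tracking.csv")  # hypothetical: one row per invited youth

status = (
    tracking.groupby("tlp_agency")["survey_6mo_complete"]  # 1 = completed, 0 = not yet
    .agg(invited="size", completed="sum")
    .assign(response_rate=lambda d: (d["completed"] / d["invited"]).round(3))
    .sort_values("response_rate")
)
print(status)
```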


The study team will distribute youth incentives as soon as possible after surveys are completed.


Finally, TLP program staff will also assist in locating youth for the follow-up surveys to boost response rates.



B4. Tests of Procedures or Methods to be Undertaken

Early versions of the youth surveys were pretested with fewer than 10 individuals during visits to three agencies, where youth in those TLPs were asked to take and review the surveys. Since that time, the surveys have been significantly modified to include additional outcome measures and measurement of service dosage. The modified surveys were pretested to time survey administration under differing response scenarios.


The baseline survey for the pre-post outcome study was previously fielded as part of the random assignment pilot study, which yielded 164 survey respondents.


The surveys rely heavily on questions that have been validated and used in many other national studies, especially the questions associated with the study’s key outcome domains, such as homelessness, psychosocial well-being, and employment and education. As such, the study team and ACF are confident that sufficient cognitive pretesting of the survey questions has been conducted and that further cognitive pretesting is not needed. Nevertheless, ACF and the study team will monitor survey completion rates throughout the study to assess whether study participants are completing the survey. If survey completion rates are low, the study team will engage grantees to understand whether study participants are having difficulties with the survey questions.



B5. Individual(s) Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Consultations on the statistical methods used in this study have been undertaken to ensure the technical soundness of the research. Administration of the data collection will be overseen by Abt Associates (the statistical and research contractor), which will also analyze the data. Members of this research team include:


Dr. Alvaro Cortes

Abt Associates

6130 Executive Boulevard

Rockville, MD 20852

(301) 634-1857


Dr. Jill Khadduri

Abt Associates

6130 Executive Boulevard

Rockville, MD 20852

(301) 634-1745


Dr. Daniel Gubits

Abt Associates

6130 Executive Boulevard

Rockville, MD 20852

(301) 634-1854


Dr. Jessica Thornton Walker

Abt Associates

6130 Executive Boulevard

Rockville, MD 20852

(301) 347-5622


