
Evaluation of the Transitional Living Program

OMB: 0970-0383



Application for Transitional Living Program Evaluation

OMB# 0970-0383



Supporting Statement













Office of Data, Analysis, Research and Evaluation

Administration on Children, Youth and Families

Administration for Children and Families

U.S. Department of Health and Human Services



September 23, 2014

Updated May 15, 2015

Final July 20, 2015






Table of Contents



A. Justification

  1. Circumstances Making the Collection of Information Necessary

  2. Purpose and Use of the Information Collection

  3. Use of Improved Information Technology and Burden Reduction

  4. Efforts to Identify Duplication and Use of Similar Information

  5. Impact on Small Businesses or Other Small Entities

  6. Consequences of Collecting the Information Less Frequently

  7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

  8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

  9. Explanation of Any Payment or Gift to Respondents

  10. Assurance of Confidentiality Provided to Respondents

  11. Justification for Sensitive Questions

  12. Estimates of Annualized Burden Hours and Costs

  13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

  14. Annualized Cost to the Federal Government

  15. Explanation for Program Changes or Adjustments

  16. Plans for Tabulation and Publication and Project Time Schedule

  17. Reason(s) Display of OMB Expiration Date is Inappropriate

  18. Exceptions to Certification for Paperwork Reduction Act Submissions

B. Statistical Methods (used for collection of information employing statistical methods)

  1. Respondent Universe and Sampling Methods

  2. Procedures for the Collection of Information

  3. Methods to Maximize Response Rates and Deal with Nonresponse

  4. Test of Procedures or Methods to be Undertaken

  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data



EXHIBIT 1: Federal Register Notice

EXHIBIT 2: Data Measures Eliminated


Information Collection Supporting Statement


A. Justification


  1. Circumstances Making the Collection of Information Necessary


The Runaway and Homeless Youth Act (RHYA), as amended by Public Law 106-71 (42 U.S.C. 5701 et seq.), provides for the Transitional Living Program (TLP), a residential program designed to prepare older homeless youth ages 16-21 for a healthy and self-sufficient adulthood. The following amendment was included in the 2003 Runaway, Homeless, and Missing Children Protection Act (P.L. 108-96), which reauthorized the Runaway and Homeless Youth Act. The RHYA was reauthorized again in 2008.


SEC. 119. STUDY OF HOUSING SERVICES AND STRATEGIES

The Secretary of Health and Human Services shall conduct a study of programs funded under part B of the Runaway and Homeless Youth Act (42 U.S.C. 5714-1 et seq.) to report on long-term housing outcomes for youth after exiting the program. The study of any such program should provide information on housing services available to youth upon exiting the program, including assistance in locating and retaining permanent housing and referrals to other residential programs. In addition, the study should identify housing models and placement strategies that prevent future episodes of homelessness.


  2. Purpose and Use of the Information Collection


The proposed collection is being carried out in three steps:

  1. A set of surveys to be administered to runaway and homeless youth to measure their short-term and longer-term outcomes and the types and amounts of services received;

  2. Interviews with TLP grantees about program structure, implementation, and approaches to service delivery; and

  3. A pilot with a small group of TLP grantees to test the proposed design.


Collection of data from youth

If this request is approved, the study will collect data from youth on their demographic characteristics, receipt of TLP or “TLP-like” services, housing, employment, education, social connections (e.g., social relationships, civic engagement), psychosocial well-being (e.g., depressive symptoms, traumatic stress, risky behavior, history of abuse), and other measures related to self-sufficiency and well-being (exposure to violence, financial competence). The study will also collect identifying information, including contact information, for the purposes of tracking youth throughout the evaluation and locating them for follow-up. As shown in Exhibit A.2.1 below, data will be collected from youth at seven points in time.









Exhibit A.2.1 Overview of Data Collection from Youth

  1. Young Adult Baseline Survey
     Timing: Prior to random assignment (RA)
     Purpose: Measure background characteristics, initial status on outcomes, gather contact information
     Estimated length: 0.62 hours*

  2. Young Adult 3-Month Follow-up Survey
     Timing: 3 months after RA
     Purpose: Measure service receipt and a limited set of outcomes, verify contact information
     Estimated length: 0.48 hours*

  3. Young Adult 6-Month Tracking Survey (Contact Update)
     Timing: 6 months after RA
     Purpose: Maintain contact, verify contact information
     Estimated length: 0.17 hours

  4. Young Adult 9-Month Tracking Survey (Contact Update)
     Timing: 9 months after RA
     Purpose: Maintain contact, verify contact information
     Estimated length: 0.17 hours

  5. Young Adult 12-Month Follow-up Survey
     Timing: 12 months after RA
     Purpose: Measure full set of outcomes
     Estimated length: 0.61 hours*

  6. Young Adult 15-Month Tracking Survey (Contact Update)
     Timing: 15 months after RA
     Purpose: Maintain contact, verify contact information
     Estimated length: 0.17 hours

  7. Young Adult 18-Month Follow-up Survey
     Timing: 18 months after RA
     Purpose: Measure full set of outcomes
     Estimated length: 0.61 hours*

The first collection will occur at baseline, prior to random assignment (and delivery of the intervention). A main purpose of the baseline survey is to measure participants’ initial status with respect to the outcome variables. As such, the survey queries participants’ experiences across varied aspects of their lives, including housing and homelessness; social connections (e.g., supportive relationships with adults, supportive relationships with peers, peer delinquency, civic engagement, civic attitudes); psychosocial well-being (e.g., self-esteem, self-efficacy, depressive symptoms, traumatic stress, delinquency, criminality, substance use, risky sexual behavior); education or training; employment; and other relevant measures (e.g., money management, health care needs and use, history of abuse and neglect, personal goals and progress toward them). In order to obtain information necessary to describe the sample, it asks about such demographic and background characteristics as race, ethnicity, language, citizenship, and gender/sexual identity. To better understand service receipt, the baseline survey also measures current or prior receipt of TLP or TLP-like services.


These baseline variables are important in several ways for the analysis. They will be used to establish the baseline equivalence of the treatment and control groups on measurable variables and thus to confirm the integrity of the random assignment process. Baseline variables may also be used to define subgroups for which impacts will be estimated, and to adjust impact estimates for the baseline characteristics of non-respondents to the follow-up survey. Many baseline variables are outcomes to be measured again at follow-up; their baseline values can be used to improve the precision of impact estimates through their inclusion as covariates in the impact models.


At 12 and 18 months after random assignment, youth will complete a similarly comprehensive follow-up survey. The 12- and 18-month follow-up surveys are nearly identical to the baseline survey, using the same measures to obtain data on participants’ experiences with housing and homelessness, social connections, psychosocial well-being, education or training, employment, and other relevant measures. There are only minor adjustments appropriate to a follow-up survey (e.g., changing the reference period to query experiences since random assignment, removing such demographic items as race and ethnicity that do not change over time). The data collected in the 12- and 18-month follow-up surveys are important primarily for assessing the program’s impact on expected outcomes.


Between the baseline, 12-month, and 18-month follow-up surveys, youth will be asked to take four surveys planned to occur at about three-month intervals. The 3-month follow-up survey constitutes the second collection, and it will occur 3 months after random assignment. Its primary purpose is to collect information about the receipt of TLP or TLP-like services, namely the nature and amount of housing, employment, education, alcohol or drug treatment, mental health treatment and counseling, mentoring and coaching, physical health care, life and interpersonal skill building, legal, family reunification, case management, and auxiliary services (e.g., assistance with food, transportation, etc.). By collecting this information from study participants in both the treatment and control groups, the study will be able to estimate treatment-control contrasts for these 12 service areas. These data may help explain any observed program impacts or lack thereof. In addition, information will be collected on a limited set of outcomes in order to measure stable housing, positive social connections, psychosocial well-being, employment, education, and exposure to violence. Data on these outcomes will be used to estimate the impacts of the TLP on youth well-being at a time when youth assigned to the treatment group are likely to be receiving TLP services. Given that the average length of stay in a TLP is approximately 6 months, the 3-month data collection is timed to identify early effects of the TLP, and the outcomes were selected for their alignment with program services and the immediacy with which program effects might be observed.


At three time points, 6, 9, and 15 months after random assignment (i.e., the third, fourth, and sixth collections), the sole purpose of the data collection is to maintain contact and support the study’s overall tracking effort to ensure high response rates to later surveys. In the 6-, 9-, and 15-month tracking surveys, participants are asked to verify and update their contact information and that of their alternate contacts. Due to the transient and impermanent circumstances of runaway and homeless youth, tracking the study sample over a period of 18 months from random assignment to the final follow-up survey will likely be challenging. Regular and relatively frequent contact with members of the treatment and control groups, alongside regular verification of contact information, is one means of reducing sample attrition.



A crosswalk showing the measures collected at baseline and each of the subsequent follow-up data collections (3-month, 12-month, and 18-month surveys) is presented in Attachment A. A listing of the items included in each instrument and their sources is presented in Attachment B. The data collection instruments themselves are presented in the following attachments:

  • Attachment C is the baseline survey;

  • Attachment D is the 3-month follow-up survey;

  • Attachment E is the tracking survey used for the 6-month, 9-month, and 15-month data collections;

  • Attachment F is the 12-month follow-up survey; and

  • Attachment G is the 18-month follow-up survey.


Collection of data from TLP staff

If this request is approved, the study will collect data from TLP staff related to service methods, program structure, and program implementation. The information gathered will be used to document and understand the way the TLP model is implemented by each participating grantee, including each TLP’s fidelity to the Positive Youth Development framework; the activities that occur during implementation of the program; common or innovative approaches to serving homeless youth; and challenges to serving this population and approaches to overcoming them. Respondents will be asked about the types of youth served by the program, program eligibility and admission processes, staffing approach, program structure and rules, and program services and service delivery model. These program components contribute to each TLP’s environment and, in turn, shape the experiences and outcomes of youth who are enrolled in the program. Findings will be used to help contextualize and interpret the results of the impact analysis. The data may also be used to examine (non-experimentally) the extent to which program implementation features may be associated with program effects.


Data will be collected during in-person interviews using three instruments: (a) the Program Overview Survey Executive Director (POS-ED) Interview Guide, (b) the Program Overview Survey Program Staff (POS-PS) Interview Guide, and (c) the Youth Development Survey Interview Guide. These are presented in Attachment H, Attachment I, and Attachment J, respectively, and shown in Exhibit A.2.2 below.



Exhibit A.2.2 Overview of Data Collection from TLP Grantees

  1. Program Overview Survey: Executive Director Interview Guide
     Timing: During site visit
     Purpose: Document program implementation from management perspective
     Estimated length: 1.00 hour

  2. Program Overview Survey: Program Staff Interview Guide
     Timing: During site visit
     Purpose: Document program implementation from perspective of frontline staff
     Estimated length: 2.00 hours

  3. Youth Development Survey Interview Guide
     Timing: During site visit
     Purpose: Document fidelity to PYD framework
     Estimated length: 0.50 hours

The POS-ED Interview Guide will be used to interview one high-level staff person per grantee, such as the agency’s executive director or program director. The POS-PS Interview Guide will be used to interview four frontline staff persons per grantee in such positions as program coordinator, intake specialist, or case manager. Both the POS-ED and POS-PS interviews focus on the “nuts and bolts” of the TLP and will be used to collect information about the program from the perspective of a high-level director overseeing program implementation, in the case of the POS-ED, and frontline staff involved in daily program operations, in the case of the POS-PS.


Topics covered in the two POS interviews include:

  • Agency and TLP overview

  • Partnerships

  • Staffing

  • Program eligibility and admission

  • TLP services and service approaches

  • Trauma-informed services

  • Program structure, policies, and rules

  • Program outcomes


The POS-ED and POS-PS interview guides differ in the level of detail and are designed to be appropriate to the roles of management-level versus frontline staff.


In addition to the POS interviews, the executive director and one program staff person per agency will be interviewed using the Youth Development Survey Interview Guide. The purpose of this interview is to collect information about each agency’s approach to serving youth, focusing on the 14 dimensions of ACF’s Positive Youth Development (PYD) Framework. Not all programs pursue all of the dimensions, and programs vary significantly in the amount of emphasis they place on any given dimension and the strategies used to pursue it. For example, programs may use specific program design features (such as program rules, youth development plans, and the provision of privileges), or they may provide particular types of services to promote the program’s key dimensions. For each dimension, the survey asks two questions: how important is the dimension to your program, and what strategies do you use to pursue it? The former is measured on a 10-point scale, and the latter is an open-ended question. This information will help the evaluation team understand the specific dimensions that each program prioritizes and how the programs emphasize and operationalize each of the dimensions of the PYD framework.


Data collection from TLP staff will occur at one point in time, during site visits to each grantee. The data collected will be used in a qualitative cross-site analysis in order to identify and describe common practices and approaches to serving youth in TLPs, focusing on those that are believed to reinforce positive youth outcomes. The data will also be used to describe how TLPs are structured and how program services are implemented, and to explore, non-experimentally, what features of programs may be associated with program effects.


Pilot Testing the Study Design

A sample of TLP grantees will be selected to pilot test the current study design as well as a proposed alternative crossover design, which may be incorporated to preserve the fidelity of the study. The pilot will implement the full recruitment and random assignment process, as well as the deployment of the baseline and 3-month follow-up surveys. The pilot will test the critical aspects of the study design, including: rates of enrollment into the study; level of demand for TLP services; completion of the consent process; performance of the data collection web portal (through which youth surveys are administered); random assignment; participant tracking methods; and response rates to the baseline and 3-month follow-up surveys. The pilot may take up to 6 months to complete, providing sufficient time to recruit a reasonably sized sample of youth into the study, track youth for three months, and obtain response rates for the 3-month follow-up survey. The results of the pilot test may lead to modifications to the study procedures.



  3. Use of Improved Information Technology and Burden Reduction


The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Where feasible, information will be gathered from existing data sources; the information being requested through surveys is limited to that for which the youth are the best or only information sources. For all youth surveys (baseline, 3-month follow-up, 6-month tracking, 9-month tracking, 12-month follow-up, 15-month tracking, and 18-month follow-up), secure web-based technology will be used to reduce burden, improve accuracy of responses, and ensure data security. The survey will be hosted on a secure, encrypted, passcode-protected digital platform, which will capture and store data in real time. Each response to a question (as it is entered) is sent immediately to a central and secure database, and no information is stored on local computers. Research has demonstrated that surveys administered online are characterized by higher levels of self-disclosure, an increased willingness to answer sensitive questions, and a reduction in socially desirable responses. Once approved, the survey instrument will be translated into Spanish so that respondents can choose the language in which they take it. There are no plans to back-translate the survey from Spanish into English. A reputable professional translator who has produced accurate and reliable translations for other large-scale national studies will be used.


  4. Efforts to Identify Duplication and Use of Similar Information


This study is a unique, one-time event studying a population that has not previously been surveyed nationally or as intensively and systematically as this research envisions. FYSB collects in-service information on non-identified youth through the Runaway and Homeless Youth Management Information System (RHYMIS). Information from RHYMIS will provide background and context but not in sufficient detail or individual specificity to produce quality research. Moreover, RHYMIS reports on youth services and issues while they are in the TLP. The important living status of youth after they leave the program is not available from RHYMIS, only their immediate destination at exit.


5. Impact on Small Businesses or Other Small Entities


Most TLP grantees are small entities, operated by community-based organizations. The information being requested has been held to the minimum required for the intended use. The information collection from grantee staff will be implemented on a one-time basis.


The plan for collecting data from youth is designed to minimize burden on such sites by utilizing a web-based survey platform and providing training and technical support from the evaluation contractor team that will enable staff to help youth register and log onto the survey portal for the first time. For the follow-up surveys, the contractor, Abt Associates (and its subcontractor for data collection, Abt SRBI), will provide support and technical assistance to youth respondents who would like assistance or have difficulty accessing or completing the survey on their own. The contractor staff will help youth troubleshoot connectivity issues, provide passwords for web completion, and, as needed, conduct phone-based survey completions. Grantees may be asked to help contact youth who have not been heard from when their expected follow-up surveys are due, but the actual follow-up effort will be implemented by the contractor. Findings from the survey will in no way impact the funding or management of the grantees, although program design improvements may result over the long term as the effectiveness of various models is determined.


6. Consequences of Collecting the Information Less Frequently


The data being collected are essential to conducting a rigorous evaluation of TLPs as the reauthorization language requires. The data are necessary for determining whether the interventions had short term or longer-term impacts on program participants relative to youth in the control group. Furthermore, without additional study, funding decisions about TLPs will continue to be based on insufficient information on program effectiveness.


7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5:


  • requiring respondents to report information to the agency more often than quarterly;


Not required.


  • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;


Not required.


  • requiring respondents to submit more than an original and two copies of any document;

Not required.


  • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;


Not required.


  • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;


Not applicable. The study design is intended to produce statistically reliable and valid results that generalize to the universe of study.


  • requiring the use of a statistical data classification that has not been reviewed and approved by OMB;


Not applicable.


  • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or


Youth will be promised confidentiality before providing informed consent. Youth will sign a consent form that has been reviewed and approved by an Institutional Review Board. The consent form will be signed at the time of enrollment, prior to taking the baseline survey and being randomized. In addition, we are seeking a Certificate of Confidentiality from the National Institute of Child Health and Human Development within the National Institutes of Health, which will further protect the confidentiality of youths’ data. The Runaway and Homeless Youth Act at Sec 322, (a)(13) states that grantees must pledge “not to disclose records maintained on individual homeless [TLP] youth without the informed consent of the individual youth to anyone other than an agency compiling statistical records.”


  • requiring respondents to submit proprietary trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


Not applicable.


8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency


We have not received any public comments in response to the first Federal Register notice (FR1) for this RCT TLP Evaluation, but we will make every effort to respond to any comments that may be forthcoming.


ODARE and FYSB have worked to partner with other agencies in the review of the research design as well as the collection tools. Staff and leadership from OPRE have weighed in. Additionally, FYSB project officers and technical assistance staff have provided feedback regarding the project and have performed programmatic assessments of the potential grantee organizations to ensure that they have the capacity to support a study of this nature.


9. Explanation of Any Payment or Gift to Respondents


The population targeted for the evaluation presents a challenge for the study, one heightened by the aim of measuring long-term impacts of the program rather than only the end-of-program measures typical in this research field. By design, the program in this evaluation targets youth who are disconnected, highly mobile, and hard to reach. To achieve a 70 percent response rate for the 12- and 18-month follow-up surveys, it is important to take steps to attach participants firmly to the study at the outset and to maintain that attachment over time. These steps are also essential to prevent differential attrition, which would lead to response bias, since members of the control group are not receiving program services and are not in contact with program staff. To this end, we propose to provide modest tokens of appreciation to each participant at each survey point.


We propose that each youth will receive an electronic gift card upon submission of each survey to compensate them for their time. The schedule of compensation is presented below in Exhibit A.9.1. For the baseline and follow-up surveys, tokens of appreciation will be in the form of electronic gift cards to Amazon.com. For the tracking surveys, tokens of appreciation will be in the form of electronic gift cards to a coffee shop or fast food restaurant, depending on the youth’s location.


The research team considered a variety of different incentive approaches, including prepaid VISA gift cards. The research team selected the Amazon gift card as the best option because these cards do not expire, can be canceled without a fee, can be replaced with a new code that is sent electronically to the participant, and are controlled and monitored by the study team.


By contrast, the study team opted against the use of VISA gift cards for several reasons:


  • VISA gift cards come with an assortment of additional fees (activation fees, shipping fees, cancellation fees if cards are lost or stolen, and inactivity fees if cards remain idle for too long), which add costs to the overall study and decrease the value of the card to participants. Please see Exhibit 1 for examples of fees and penalties associated with VISA gift cards.

  • Unlike Amazon gift cards, VISA gift cards have expiration dates that can vary depending on which bank issues the gift card.

  • VISA gift cards must be physically handed to each youth upon completion of the baseline survey, which would require the study team to mail hundreds of prepaid gift cards to the 14 grantees that will participate in the study. Grantee staff would be responsible for distributing the gift cards to youth. This approach is highly undesirable because: (a) the study team loses control over the distribution of the gift cards, which increases opportunities for misuse and reduces the team’s ability to monitor the incentive process; and (b) lost or stolen gift cards must be replaced with new cards, which requires a physical mailing address to receive them.




Exhibit A.9.1 Schedule of Compensation

  Survey                                     Incentive
  Young Adult Baseline Survey                $30
  Young Adult 3-Month Follow-up Survey       $30
  Young Adult 6-Month Tracking Survey        $10
  Young Adult 9-Month Tracking Survey        $10
  Young Adult 12-Month Follow-up Survey      $40
  Young Adult 15-Month Tracking Survey       $10
  Young Adult 18-Month Follow-up Survey      $40


The incentive amount is $30 each for the baseline and 3-month surveys. The 12- and 18-month survey tokens of appreciation are $40, slightly more, to emphasize those surveys’ importance to the study. The compensation for each tracking survey is $10. Homeless youth are an especially transient population, making them particularly difficult to survey over many months. Thus, we believe that these gift card amounts are necessary to encourage participation in the study and completion of the follow-up surveys.
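As a simple arithmetic illustration (a sketch for checking the schedule, not part of the study protocol; the survey labels below are shorthand), a participant who completes all seven surveys would receive $170 in gift cards in total:

```python
# Gift card amount per survey, from Exhibit A.9.1 (shorthand labels).
compensation = {
    "baseline": 30,
    "3-month follow-up": 30,
    "6-month tracking": 10,
    "9-month tracking": 10,
    "12-month follow-up": 40,
    "15-month tracking": 10,
    "18-month follow-up": 40,
}

# Maximum total a single participant can receive over the full study period.
total = sum(compensation.values())
print(f"${total}")  # $170
```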


The electronic gift cards will be emailed or text messaged to study participants, to an email address or cell phone number they designate. If a youth does not have access to email or a cell phone, the study team will mail the gift card codes to the address of their choice. Youth will receive their gift card after they have submitted their survey. For a suspended or incomplete survey, youth will be sent their gift card after their individual window for completing that survey has closed.


This compensation model is designed to increase the survey response rates, particularly from youth dispersing widely into the general population as they gain the independence that is the purpose of the TLP. We anticipate the need for both passive and active tracking approaches. We will rely on multiple tracking mechanisms to ensure the highest response rate possible, and the type of mechanism utilized will vary depending on the status of the youth.


10. Assurance of Confidentiality Provided to Respondents


Protections for privacy are embedded in the study design. Data collection will only occur if informed consent is provided by youth themselves. In order to assure the privacy and safety of the runaway and homeless youth participating in the study, the contractor will obtain a waiver of parental permission from its Institutional Review Board (IRB). Federal regulations permit the IRB to approve research without parent permission “if the IRB determines that a research protocol is designed for conditions or for a subject population for which permission is not a reasonable requirement to protect the subjects, provided an appropriate mechanism for protecting the children who will participate as subjects in the research is substituted and provided further that the waiver is not inconsistent with federal, state or local law”.


All identifying information that will be collected with each youth’s informed consent and all records will be protected with security systems to guarantee privacy and confidentiality. All data will be securely protected under IRB-certified ethical research methods. Each survey will contain an assurance of this, which will also convey that answers will be kept private, that youths’ participation is voluntary, and that they may refuse to participate at any time.


11. Justification for Sensitive Questions


Some questions in the youth surveys ask for information of a sensitive nature—for example, substance use, sexual behavior, delinquent activity, involvement with law enforcement, health and mental health symptoms, participation in social services. These are very personal but also essential to understanding a youth’s overall situation, and they are collected because the program under evaluation (the TLP) is designed specifically to address and reduce risk and improve well-being related to these very factors.


The less sensitive measures, such as housing circumstances and attachment to education and employment, are valued outcomes of TLP, but risk reduction, healthy choices, and pro-social adaptation are important as well. Youth will be assured of privacy and told they do not have to answer a particular question if they find it objectionable.


12. Estimates of Annualized Burden Hours and Costs


The annualized burden estimates are based on pretests of the survey instruments, as well as experience with similar data collection efforts for other studies.



Exhibit A.12.1 Reporting Burden Hours on Study Participants

| Instrument | Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours |
| --- | --- | --- | --- | --- |
| Site Visit Interviews |  |  |  |  |
| Program Overview Survey: Executive Director Interview Guide (1 Executive Director respondent per grantee) | 14 | 1 | 1.00 | 14.00 |
| Program Overview Survey: Program Staff Interview Guide (4 Program Staff respondents per grantee) | 56 | 1 | 2.00 | 112.00 |
| Youth Development Survey Interview Guide (1 Executive Director and 1 Program Staff respondent per grantee) | 28 | 1 | 0.50 | 14.00 |
| Young Adult Surveys |  |  |  |  |
| Young Adult Baseline Survey | 1250 | 1 | 0.62 | 775.00 |
| Young Adult 3-Month Follow Up Survey | 1000 | 1 | 0.48 | 480.00 |
| Young Adult 6-Month Tracking Survey | 1000 | 1 | 0.17 | 170.00 |
| Young Adult 9-Month Tracking Survey | 1000 | 1 | 0.17 | 170.00 |
| Young Adult 12-Month Follow Up Survey | 1000 | 1 | 0.61 | 610.00 |
| Young Adult 15-Month Tracking Survey | 1000 | 1 | 0.17 | 170.00 |
| Young Adult 18-Month Follow Up Survey | 1000 | 1 | 0.61 | 610.00 |
| Estimated Total Burden Hours |  |  |  | 3125 |
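The totals in Exhibit A.12.1 follow from respondents × responses per respondent × hours per response, summed across rows. A quick illustrative check (figures copied from the exhibit):

```python
# Rows of Exhibit A.12.1: (instrument, respondents, responses each, hours per response)
rows = [
    ("Executive Director Interview", 14, 1, 1.00),
    ("Program Staff Interview", 56, 1, 2.00),
    ("Youth Development Survey Interview", 28, 1, 0.50),
    ("Young Adult Baseline Survey", 1250, 1, 0.62),
    ("3-Month Follow Up Survey", 1000, 1, 0.48),
    ("6-Month Tracking Survey", 1000, 1, 0.17),
    ("9-Month Tracking Survey", 1000, 1, 0.17),
    ("12-Month Follow Up Survey", 1000, 1, 0.61),
    ("15-Month Tracking Survey", 1000, 1, 0.17),
    ("18-Month Follow Up Survey", 1000, 1, 0.61),
]
# Total burden = respondents x responses x hours per response, summed over rows.
total = sum(n * k * h for _, n, k, h in rows)
print(round(total))  # 3125
```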




Exhibit A.12.2 Reporting Burden Costs on Study Participants


NOTE: The youth surveys will be completed by youth on an unpaid basis; respondents will receive an incentive valued at between $10 and $40 for each survey, as described in Section A.9 of this supporting statement. Because the incentives are covered under the contract with the research contractor, incentive costs do not impose a cost on respondents.


13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers


There are no annualized capital/startup or ongoing operation and maintenance costs associated with collecting this information. There are no direct respondent costs associated with this data collection other than opportunity costs of respondents’ time required to complete the surveys. The evaluation does not place any capital equipment, start-up, or record maintenance requirements on respondents.



14. Annualized Cost to the Federal Government


The overall one-time cost of the research, including survey design, information collection, analysis, and reporting, but excluding the 15- and 18-month surveys, will be $2,643,465. This includes developing the study foundation ($833,333); data collection ($1,507,489); analysis ($155,227); and reports/briefings ($147,416). FYSB is currently negotiating the costs of the 15- and 18-month surveys with the contractor. Because the contract takes place over six years, the estimated annualized cost is $440,577.
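The annualized figure follows directly from the component costs; a quick check of the arithmetic (figures copied from the paragraph above):

```python
# Component costs of the evaluation contract, from Section A.14.
components = {
    "study foundation": 833_333,
    "data collection": 1_507_489,
    "analysis": 155_227,
    "reports/briefing": 147_416,
}
total_cost = sum(components.values())
print(total_cost)        # 2643465
# Annualized over the six-year contract (truncated to the dollar, as reported).
print(total_cost // 6)   # 440577
```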


Annual costs of Federal employee time are estimated to be $12,700 for Executive Management Oversight (10% FTE) and $39,962 for the Project COR and Task Lead (40% FTE).


15. Explanation for Program Changes or Adjustments


The original TLP information collection expired in October 2013 (the study was not implemented). In July 2014, an amended TLP information collection request package was submitted to the Office of Management and Budget (OMB) for approval. The amendments included a change in study methodology, moving from a pre- and post-test design to a more rigorous multicomponent randomized controlled trial. This newly designed impact and implementation evaluation resulted in an increase in sample size from 760 youth to 1,250 youth. The collection instruments were also updated to reflect the new collection intervals: baseline, 3, 6, 9, 12, 15, and 18 months.


The newly proposed study design was submitted to both the Office of Management and Budget (OMB) and an independent Institutional Review Board (IRB). To date, we have received feedback only from the IRB, which expressed concerns about the sensitive nature of the survey questions and the potential that they might cause distress among study participants. Although it is customary for sensitive questions to be asked of youth and young adults in an effort to assess potential risk factors and to better understand the experiences of vulnerable populations, the research team took the IRB's concerns under advisement. Because OMB had yet to render a final approval of the revised collection instruments, the evaluation team, along with ODARE and FYSB leadership, took the opportunity to identify alternative data measures for several of the highly sensitive constructs and to consider additional ways to reduce the overall length of the surveys. The strategies proposed below, which have been incorporated into the revised instruments, have satisfied the concerns expressed by the IRB.


There are ten (10) collection instruments associated with the TLP Evaluation; the changes listed below have been made only to the following four (4) collection instruments:


1) Young Adult Baseline Survey: burden hours reduced from 0.75 to 0.62; 36 questions eliminated

2) Young Adult 3-Month Survey: burden hours reduced from 0.54 to 0.48; 22 questions eliminated

3) Young Adult 12-Month Survey: burden hours reduced from 0.75 to 0.61; 41 questions eliminated

4) Young Adult 18-Month Survey: burden hours reduced from 0.75 to 0.61; 41 questions eliminated


The reduction of burden was achieved and issues surrounding sensitivity were addressed by:


*Reducing the number of questions that are found to be:

- Redundant or asked in other measures

- Sensitive and have the highest potential to trigger an adverse response

- Descriptive in nature and not integral to the overall goal of the study


*Replacing construct scales for the following measures:

- Depressive Symptoms: reduced from 20 questions to 11 questions

- Traumatic Stress Symptoms: reduced from 17 questions to 7 questions

- Substance Use: reduced from 13 questions to 8 questions

The remaining collection instruments have changed only in formatting and contain no substantive content changes:

- Young Adult 6-Month Tracking Survey

- Young Adult 9-Month Tracking Survey

- Young Adult 15-Month Tracking Survey

- Program Overview Survey: Executive Director

- Program Overview Survey: Program Staff

- Youth Development Survey


See Supporting Statement Exhibit 2 for tables that identify the specific elements that have been eliminated and/or revised.


16. Plans for Tabulation and Publication and Project Time Schedule


When the study is completed, findings will be reported to Congress and then made available to the general public and research community. Detailed data tabulations will be part of the record and available for study. No personal information will be included.


Plans for Tabulation

The goal of the impact analysis is to estimate the effects participating TLPs have on youth development within the domains of stable housing, social connections, psychosocial well-being, and education or employment. The effect, β, of being offered the opportunity to participate in a TLP will be estimated by comparing the average (mean) score on a given outcome for youth in the treatment group, designated ȳ_T, to the mean score on that outcome for youth in the control group, designated ȳ_C.


The estimated effect of TLPs is free from selection bias because of random assignment.


Following standard practice, we will use regression adjustment to increase the precision of the estimated effect (Orr, 1999). The statistical model for regression adjustment to estimate the effect of TLP participation on an outcome Y (e.g., risky behavior) is presented in equation (1):

Y_i = α + βT_i + γX_i + ε_i      (1)

where:

Y_i is the outcome of interest for youth i;

α is a constant;

T_i is a dummy variable equal to 1 if youth i was assigned to the treatment group and 0 if youth i was assigned to the control group;

β is the effect of being offered the opportunity to participate in a TLP;

X_i is a vector of baseline characteristics or control variables for youth i, such as race, gender, and age at baseline;

γ is a vector of regression coefficients corresponding to the covariates; and

ε_i is a random error term with mean 0 and variance σ².


To determine if the opportunity to participate in one of the participating TLPs had a positive effect on youth assigned to the treatment group, we will conduct a t-test for each outcome measure. If the estimate of β is statistically significant at the 5-percent level using a two-tailed test, we will conclude that we have found convincing scientific evidence that the intervention affected the outcome measure; otherwise, we will conclude that there is no convincing scientific evidence of an effect on this outcome.

For continuous or categorical outcomes, we plan to estimate the model above using ordinary least squares (OLS), which assumes that the outcome data have a normal distribution (i.e., form a bell-shaped curve) with homoscedasticity (i.e., a common variance). For binary (dichotomous) outcomes, models will be estimated using logistic regression and we will report the marginal effect on the probability of observing the binary outcome.


We have no reason a priori to expect homoscedasticity, because some TLPs could have higher variability in youth outcomes than other TLPs (Angrist & Pischke, 2008). To address the potential for heteroscedasticity and account for variation in continuous and categorical youth outcomes across TLPs, we will include site-level indicator variables (“fixed effects”) in our linear models, and we will compute robust standard errors (i.e., Huber-Eicker-White robust standard errors; Huber, 1967; Greene, 2003; White 1980, 1984).
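The regression adjustment with site fixed effects and Huber-Eicker-White robust standard errors can be sketched in a few lines of numpy. Everything below, including the simulated data, variable names, and coefficient values, is illustrative and not drawn from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_sites = 400, 4

# Simulated data (all names assumed): T = treatment dummy, x = a baseline
# covariate, site = TLP site indicator. True treatment effect set to 0.5.
site = rng.integers(0, n_sites, n)
T = rng.integers(0, 2, n)
x = rng.normal(size=n)
y = 1.0 + 0.5 * T + 0.3 * x + 0.2 * site + rng.normal(size=n)

# Design matrix: constant, treatment dummy, covariate, and site fixed
# effects (one site dropped as the reference category).
X = np.column_stack([np.ones(n), T, x]
                    + [(site == s).astype(float) for s in range(1, n_sites)])
beta = np.linalg.lstsq(X, y, rcond=None)[0]

# Huber-Eicker-White (HC0) robust standard errors: (X'X)^-1 X'diag(e^2)X (X'X)^-1.
resid = y - X @ beta
XtX_inv = np.linalg.inv(X.T @ X)
meat = X.T @ (X * (resid ** 2)[:, None])
robust_cov = XtX_inv @ meat @ XtX_inv
se = np.sqrt(np.diag(robust_cov))

# t-statistic on the treatment coefficient, compared to the 5% two-tailed cutoff.
t_stat = beta[1] / se[1]
print(f"estimated effect = {beta[1]:.2f}, robust SE = {se[1]:.2f}, t = {t_stat:.2f}")
```

With real study data the design matrix would hold the actual baseline covariates (race, gender, age at baseline) in place of the simulated `x`.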


Missing data are of concern in any study, particularly one with a longitudinal design such as this. The presence of non-response to the follow-up survey concerns us in two ways. First, non-response to the follow-up survey presents a challenge to the internal validity of the study if the treatment and control groups have different patterns of non-response. Second, follow-up survey non-response can threaten the generalizability of results to the entire enrolled sample if survey non-respondents differ from respondents, even if the rate of non-response is symmetrical across the treatment and control groups. To address these issues, we will prepare a set of weights that adjust for survey non-response.1 The weights will be used in all impact regressions, and they will be constructed as follows. First, we will regress a dummy variable for survey response on a large number of baseline characteristics and use the results to generate a propensity to respond for each youth. Second, we will divide each intervention group into quintiles based on this propensity. Third, within each intervention group-quintile, the weight for the respondents will be the total number of sample youth in the quintile divided by the number of respondent youth in the quintile. This last step raises the representation of respondents to the level of the full sample in the weighted data, thereby restoring the composition of the analysis data to that of the full sample on the factors used to estimate propensities to respond. However, the weighted outcome data may still depart from the full sample on other unadjusted background factors related to subsequent outcome levels; if so, some amount of non-response bias will remain in the impact estimates—but less than without the reweighting.
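The three-step weighting procedure can be sketched as follows. This is a minimal illustration on simulated data: a linear probability model stands in for the study's response regression, and all variable names and values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Illustrative baseline data (names assumed): age, female, intervention arm,
# and a response indicator whose probability varies with age.
age = rng.uniform(16, 22, n)
female = rng.integers(0, 2, n)
arm = rng.integers(0, 2, n)  # 0 = control, 1 = treatment
responded = (rng.random(n) < 0.6 + 0.02 * (age - 19)).astype(float)

# Step 1: regress the response dummy on baseline characteristics to obtain
# each youth's propensity to respond (linear probability model for brevity).
X = np.column_stack([np.ones(n), age, female])
coefs = np.linalg.lstsq(X, responded, rcond=None)[0]
propensity = X @ coefs

weights = np.zeros(n)
for g in (0, 1):  # within each intervention group...
    grp = arm == g
    # Step 2: split the group into quintiles of the response propensity.
    edges = np.quantile(propensity[grp], [0.2, 0.4, 0.6, 0.8])
    quintile = np.digitize(propensity, edges)
    for q in range(5):
        cell = grp & (quintile == q)
        resp = cell & (responded == 1)
        if resp.sum() > 0:
            # Step 3: weight = sample youth in the cell / respondents in the cell.
            weights[resp] = cell.sum() / resp.sum()

# Weighted respondents now stand in for the full enrolled sample in each cell.
print(weights[responded == 1].mean())
```

The weights sum to the full sample size across respondents, which is what "raises the representation of respondents to the level of the full sample" in the weighted data.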


Ultimately, the analysis will produce estimates indicative of TLPs’ efficacy—that is, the effects on youth of relatively large, relatively over-subscribed, and relatively well-designed TLPs based on the study selection criteria—rather than the effect of the average TLP. The analysis will yield estimates of the impact of these TLPs known as “intention to treat” (ITT) estimates. ITT compares individuals who were assigned to, or offered, the treatment (i.e., a service slot in a TLP) to individuals who were assigned to the control group. In other words, it is a comparison of the entire treatment group to the entire control group. We plan to focus on ITT estimates because our primary goal is to gauge the effects of TLPs participating in the study relative to the counterfactual of not having the opportunity to participate in one of them. We also plan to present a second type of impact estimate—the impact of actually receiving the treatment, known as the “treatment on the treated” (TOT) impact or the Complier Average Causal Effect (CACE). These estimation techniques produce estimates of the effect TLPs have on outcomes for the subset of youth assigned to the TLP who actually enrolled and received services due to assignment to the treatment group. The TOT or CACE estimates will differ substantially from the ITT estimates if we find a high rate of no-shows among treatment group members (i.e., treatment group members receiving none of the TLP services that they were offered) or a high crossover rate for control group members (i.e., control group members receiving TLP services).
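The relationship between ITT and TOT/CACE estimates can be illustrated with the standard no-show/crossover adjustment, under the usual assumption that assignment affects outcomes only through service receipt. The numbers below are purely illustrative, not study results:

```python
# Bloom-style adjustment relating ITT to TOT/CACE (illustrative numbers only).
itt_effect = 0.10          # ITT: mean outcome difference, treatment vs. control group
takeup_treatment = 0.85    # share of treatment group actually receiving TLP services
crossover_control = 0.05   # share of control group receiving TLP services anyway

# CACE/TOT = ITT / (take-up rate minus crossover rate): the ITT effect is
# diluted by no-shows and crossovers, so scaling it back up recovers the
# effect on youth whose service receipt was changed by random assignment.
cace = itt_effect / (takeup_treatment - crossover_control)
print(round(cace, 3))  # 0.125
```

With full take-up and no crossover the denominator is 1 and the two estimates coincide; the larger the no-show and crossover rates, the more TOT/CACE exceeds ITT.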


Project Time Schedule

The TLP evaluation will be conducted over a four-year period that begins in October 2014. Major project activities are presented in Exhibit A.16.1 below. Pending OMB approval, participant enrollment into the study and collection of the baseline survey data will occur over an 18-month period beginning in March 2015 and ending in September 2016. Follow-up data collections are projected to occur between June 2015 and December 2017. Data collection from grantees will occur during site visits beginning in December 2014 and ending in March 2015. Analysis and reporting will occur between August 2016 and August 2018, with the submission of a memorandum focused on the preliminary results of the three-month survey expected in November 2017 and submission of the final impact report anticipated in August 2018.


Exhibit A.16.1

| Activity | Timeframe |
| --- | --- |
| Finalize evaluation design, data collection and management plan, and survey instruments | October 2014 – December 2014 |
| Site selection and recruitment | October 2014 – December 2014 |
| OMB and IRB approval of instruments | March 2014 – December 2014 |
| Training of survey administrators, site visits and grantee interviews | January 2015 – March 2015 |
| Enrollment of participating youth and baseline survey | March 2015 – September 2016 |
| 3-month follow up survey | June 2015 – December 2016 |
| 6-month tracking survey | September 2015 – March 2017 |
| 9-month tracking survey | December 2015 – June 2017 |
| 12-month follow up survey | March 2016 – September 2017 |
| 15-month tracking survey | June 2016 – December 2017 |
| 18-month follow up survey | September 2016 – March 2018 |
| Analysis and reporting | January 2016 – September 2018 |



17. Reason(s) Display of OMB Expiration Date is Inappropriate


Not applicable. All instruments will display the OMB number and the expiration date.


18. Exceptions to Certification for Paperwork Reduction Act Submissions


Not applicable. No exceptions are necessary for this information collection.


1 The construction of weights to address survey nonresponse is discussed in Little (1986, pp. 139–157).
