
Contract No.: SS00-05-60084

MPR Reference No.: 6209-134





Supporting Statement for Paperwork Reduction Act Submission


April 11, 2008

















Anne Ciemnecki

Karen CyBulski

John Martinez







Submitted to:


Social Security Administration

Office of Program Development and Research

Suite 700, 400 Virginia Avenue, SW

Washington, DC 20024

Telephone: (202) 358-6448

Facsimile: (202) 358-6505


Project Officer:

Jamie Kendall


Submitted by:


Mathematica Policy Research, Inc.

P.O. Box 2393

Princeton, NJ 08543-2393

Telephone: (609) 799-3535

Facsimile: (609) 799-0005



Project Director:

Thomas Fraker





CONTENTS


A. JUSTIFICATION

1. Circumstances that Make the Data Collection Necessary: Legal or Administrative Requirements
2. How, by Whom, and for What Purpose the Information Will Be Used
3. Use of Improved Information Technology
4. Efforts to Identify Duplication
5. Involvement of Small Entities
6. Consequences if Information Is Not Collected or Collected Less Frequently
7. Special Circumstances
8. Adherence to Guidelines in 5 CFR 1320.5(d)(2) and Consultation Outside the Agency
9. Remuneration of Respondents
10. Assurance of Confidentiality
11. Questions of a Sensitive Nature
12. Estimates of Annualized Hour Burden
13. Estimates of Annualized Capital Burden
14. Estimates of Annualized Cost to the Government
15. Explanation for Program Changes or Adjustments
16. Plans for Tabulation and Publication and Project Time Schedule
17. Expiration Date for OMB Approval
18. Exceptions to the Certification Statement


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Respondent Universe and Sampling Methods
2. Procedures for the Collection of Information
3. Methods to Maximize Response Rates
4. Tests of Procedures
5. Statistical Consultants and Persons Collecting and Analyzing the Data


APPENDIX A: SECTIONS 234 AND 1110 OF THE SOCIAL SECURITY ACT (not attached)

APPENDIX B: BASELINE QUESTIONNAIRE

APPENDIX C: 12-MONTH FOLLOW-UP QUESTIONNAIRE

APPENDIX D: GUIDES FOR INTERVIEWS AND/OR ROUNDTABLE DISCUSSIONS; IN-DEPTH INTERVIEW GUIDE

APPENDIX E: FEDERAL REGISTER NOTICE (not attached)

APPENDIX F: BASELINE PRE-NOTIFICATION LETTER

APPENDIX G: BASELINE CONSENT FORMS

APPENDIX H: 12-MONTH PRE-NOTIFICATION LETTER


TABLES

A.1 ANNUALIZED BURDEN

A.2 ANNUAL COSTS TO THE FEDERAL GOVERNMENT

B.1 CHARACTERISTICS OF EXISTING YTD PROJECTS PARTICIPATING IN THE RANDOM ASSIGNMENT EVALUATION

B.2 CHARACTERISTICS OF FIVE YTD PILOT PROJECTS

B.3 MINIMUM DETECTABLE IMPACTS FOR THE YTD EVALUATION ASSUMING INDIVIDUALIZED RANDOM ASSIGNMENT




Supporting Statement For
YOUTH TRANSITION PROCESS DEMONSTRATION EVALUATION
OMB CONTROL No. 0960-0687

The Social Security Administration (SSA) is requesting clearance for the collection of data needed to implement and evaluate the Youth Transition Demonstration (YTD) projects. YTD projects are intended to help young people with disabilities make the transition from school to work. By waiving certain disability program rules and offering services to youth who are either receiving disability benefits or at risk of receiving them, these projects are expected to encourage youth to work and/or continue their education. YTD projects will be fully implemented in 10 sites across the country. The evaluation will produce empirical evidence on the impacts of the waivers and project services not only on educational attainment, employment, earnings, and receipt of benefits by youth with disabilities but also on the Social Security Trust Fund and federal income tax revenues.


Given the importance of estimating YTD impacts as accurately as possible, the evaluation will use rigorous analytic methods based on the random assignment of youth to a treatment or control group. Several data collection efforts are planned. These include (1) baseline interviews with youth and their parents or guardians prior to random assignment; (2) follow-up interviews at 12 and 36 months after random assignment; (3) interviews and/or roundtable discussions with local program administrators, program supervisors, and service delivery staff; (4) focus groups of youth, their parents, and service providers; and (5) in-depth interviews with youth and/or their parents or guardians within three months of completing the 12 month follow-up interview. Note that the in-depth interviews are intended to supplement structured questions on service use in the 12-month follow-up survey.  We have evidence from pre-tests that the semi-structured approach can capture services or service durations that are missed in the structured interview.


OMB has granted clearance for the baseline questionnaire and related baseline data collection activities under OMB No. 0960-0687, which expires November 30, 2010. In this package, SSA requests clearance for the remaining baseline interviewing, 12-month follow-up interviewing, focus groups, and discussions with program staff and service providers. We will request clearance for the 36-month interview in January 2009.

A. JUSTIFICATION

1. Circumstances that Make the Data Collection Necessary: Legal or Administrative Requirements

a. Circumstances

The transition to adulthood for youth with disabilities can be difficult. SSA is sponsoring the YTD projects, and the related evaluation of those projects, to examine the effectiveness of providing services to youth with disabilities during their transition to adulthood. In addition to the host of issues facing all transition-age youth, those with disabilities have special issues related to health, social isolation, multiple service needs, and lack of access to supports. This set of challenges complicates their planning for future education and work and often leads to poor educational and employment outcomes and a high risk of dependence. SSA is investing considerable resources in developing and evaluating strategies to maximize the economic self-sufficiency of youth with disabilities, focusing on youth ages 14 to 25 as they transition from school to work. Hallmark features of the YTD evaluation include carefully designed and targeted demonstration projects that are policy-relevant and can operate at the scale required by the evaluation, as well as a rigorous random assignment evaluation design.

b. Legal or Administrative Requirements

Congress has, since 1980, required the SSA to conduct demonstration projects to test the effectiveness of possible program changes that could encourage individuals to work and decrease their dependence on disability benefits. In fostering work, these demonstrations and the program changes they test are intended to produce savings in the trust funds or improve program administration.


To achieve these objectives, SSA’s demonstration authority contains several key features that provide SSA with a potentially valuable tool for assessing the effectiveness of policy alternatives. One of these features is SSA’s authority to waive certain disability insurance and Medicare program rules. For example, when conducting demonstrations, SSA is permitted to exempt certain beneficiaries from requirements that workers with disabilities earn below a certain amount to remain eligible for benefits. Another key aspect of SSA’s demonstration authority is the requirement that demonstration projects be of sufficient scope and conducted on a wide enough scale to ensure a thorough evaluation and results that are applicable to the program as a whole.


In addition, the legislation authorizes SSA to use trust fund monies to pay for the demonstrations and requires SSA to periodically report to Congress on its demonstration activities, providing, when appropriate, recommendations for legislative or administrative changes.


Sections 234 and 1110 of the Social Security Act (Appendix A) direct the Commissioner of SSA to carry out experiments and demonstration projects to determine the relative advantages and disadvantages of the following:

  • Various alternative methods of treating the work activity of individuals receiving benefits, including such methods as a reduction in benefits based on earnings designed to encourage these beneficiaries to return to work

  • Altering other limitations and conditions, such as lengthening the trial work period or altering the 24 month waiting period for Medicare

  • Implementing a sliding scale benefit offset

The Act requires that these demonstration projects be designed to show that savings will accrue to the trust funds, or will otherwise promote or facilitate the administration of the program. Section 234 also provides that these projects must be conducted in a manner that will allow SSA to evaluate the appropriateness of implementing such a program on a national scale.


To overcome the barriers to employment for beneficiaries, YTD provides person-centered job development, training, benefits counseling, service coordination, and family support. Enrollees in the demonstration will be randomly assigned to either the treatment or the control group. Enrollees in the control group will have access to the traditional services and existing work incentives available, while the treatment group will receive the demonstration’s enhanced services as well as waivers of certain disability insurance rules to strengthen work incentives. The evaluation will assess the impact of these services and waivers on educational attainment, employment, earnings, and reduced use of disability benefits. The demonstration and planned evaluation meet SSA’s legislative and congressional mandates.

2. How, by Whom, and for What Purpose the Information Will Be Used

Information collected will answer three key questions central to assessing the effectiveness of YTD projects:

  • How Are the YTD Projects Implemented and Operated? What are the important issues and challenges in designing, implementing, and operating YTD projects, and what lessons can be drawn from the experience? What approaches are taken to providing services to promote self-sufficiency among youth with disabilities? What are the characteristics of the interventions and the context of their provision? Who participates in the YTD projects, for how long, and what services do they get? Who provides these services? How do those services differ from those received by members of the control group? To what extent do the youth use the SSA waivers? How does participation in YTD differ for population subgroups?

  • What Are the Short Term and Longer Term Impacts of the Projects? How effective are the projects in increasing employment and earnings and reducing dependence on disability benefits? Do the projects affect educational attainment or other intermediate outcomes, such as work attitudes or work experience? Do they improve social-psychological well being? Do they increase the likelihood that disabled youth will be able to live independently as adults? Do these impacts differ across subgroups of the population of youth with disabilities?

  • What Are the Costs of Operating the Projects, and Do the Benefits Outweigh the Costs? What are the projects’ operating costs? What other costs are incurred as a result of the YTD projects? To what extent do the projects lead to net changes in disability benefit receipt? Are there any induced entry effects? How do the projects affect income and payroll tax receipts, benefit outlays, and the status of the Social Security Trust Funds? From the perspectives of key stakeholders, do the benefits of the projects exceed their costs?

To address these three sets of questions, the evaluation will include process, impact, and benefit-cost analyses. This supporting statement requests clearance only for the data collections that appear in bold-face type in the paragraphs below.


Process Analysis. The process analysis will document how the intervention services were delivered, including information provided to participating youth on SSA waivers and the extent to which waivers were utilized. It will identify implementation successes, issues, and challenges and will examine program costs. It also will provide details on the nature of each YTD intervention and how the projects have achieved the observed results. Data for this analysis will come primarily from site visits, project records and documents, and the projects’ management information systems. Site visits will include discussions with staff of the YTD projects and partner organizations, SSA field office staff, and other youth service providers; focus group discussions with participating youth and their families; case reviews; and program observations. Baseline survey data will be used to describe the youth enrolled in the study. MIS data will be used to describe and analyze service receipt and utilization among treatment group members. Data from the first follow-up survey will be used to examine participant experiences and satisfaction with YTD services.


Impact Analysis. A rigorous random assignment design is being used to determine the differences these YTD projects make in educational attainment, employment, earnings, and reduced use of disability benefits as well as such outcomes as living arrangements, quality of life, and other measures of well being among the transition-age youth enrolled in the study. Under this design, youth eligible for YTD services will be randomly assigned to a treatment group (offered YTD waivers and services) or to a control group (not offered YTD waivers or services but may use existing SSA work incentives and services available in the community). Outcomes for the two groups will be compared using data collected in follow-up interviews, conducted 12 and 36 months after youth enter the demonstrations, as well as data obtained from SSA program files, administrative files of state and local agencies, and possibly SSA summary earnings records (SER). On the basis of these comparisons, we will assess the net effects of the YTD intervention approaches for the youth enrolled in the study and the differential effectiveness of YTD services for members of certain subgroups. We will use administrative data to address impacts on SSA disability benefits receipt and the use of SSA waivers, earnings, and other public assistance. We will use the more comprehensive data from the follow-up surveys to examine employment and other outcomes such as education, income, health, and measures of life quality and well being.


Benefit-Cost Analysis. The evaluation will conduct a comprehensive benefit-cost analysis of the YTD projects. We will start with a comprehensive cost analysis of each project; the goal is to construct an estimate of overall project costs as well as estimates of average unit costs, such as the cost per participant and cost per program component. Drawing on data reports from project records and on information from program staff interviews, we will build up an estimate of the cost of each project. In addition, information from in-depth interviews with youth or their guardians about service utilization will provide information needed for the service cost analysis. In particular, detailed information on the nature, frequency, and dosage of services utilized by control group members, as well as non-YTD services accessed by treatment group youth will supplement information gathered in follow-up interviews. Statistical methods will not be used to analyze information gathered in in-depth interviews. In addition to its usefulness as an adjunct to the process study’s description of program operations, the cost analysis will provide important input for the benefit-cost analysis. For purposes of this analysis, key costs include operating and administrative costs. Benefits include, but are not limited to, net increased earnings and tax payments, net reduced disability benefits, and net reductions in the receipt of public assistance. The benefit-cost analysis also will examine net changes in services used as a result of the YTD projects. The benefit-cost analysis will examine the extent to which the projects lead to net increases or reductions in SSI benefit receipt (and, hence, the cost or savings to SSA) as well as assess the extent to which there are any induced entry effects as a result of the waivers and services offered by the YTD projects. The analysis will examine the costs and benefits of the projects from the perspectives of a variety of stakeholders—including SSA, other government agencies, the YTD participants, and society as a whole—and will be produced in a format consistent with the requirements of SSA’s actuaries.


The baseline questionnaire is in Appendix B; the 12-month follow-up questionnaire is in Appendix C; and the topic guides for interviews with local program administrators, program supervisors, and service delivery staff, as well as the topic guide for the in-depth interviews, are in Appendix D. We have also included sample focus group topic guides for youth and parents. These will be customized as programs are selected for the evaluation.

3. Use of Improved Information Technology

For the YTD evaluation, MPR and its partner, Social Solutions, will implement a management information system (MIS), the Efforts-to-Outcomes (ETO) database, to facilitate the real-time exchange of data between MPR’s survey division and the YTD projects. The ETO database draws information from several data sources, including SSA administrative data, respondent survey data, claims and utilization data, and data entered by YTD project staff. Because data from all of these sources are linkable, the ETO database fully supports drawing extracts and generating reports and summaries to facilitate administering, monitoring, and evaluating the study.


Computer-assisted interviewing will be used to collect data for the baseline and follow-up surveys. Both surveys are expected to be administered as computer-assisted telephone interviews (CATI) and as face-to-face interviews, and the questionnaires used in both modes will have the same core content. Both modes will incorporate standard checkpoints to assess each respondent’s level of fatigue and to provide the respondent with an opportunity to take a break, if necessary. The baseline and 12-month follow-up interviews use CATI software: a questionnaire is programmed into the software application, which customizes the flow of the questionnaire based on the answers provided as well as on information already known about the sample member, such as gender, treatment or control status, or state of residence. Interviewers read questions that appear on their computer screens and enter the respondents’ answers. In this sense, it is similar to SSA’s MCS/MSSICS systems.


Telephones equipped with amplifiers will be available as needed to accommodate sample members who are hearing impaired. TTY and relay technologies will also be used to facilitate participation in the telephone survey. A TTY is a special device that lets people who are deaf, hard of hearing, or speech-impaired use the telephone to communicate by typing messages back and forth instead of talking and listening. A TTY is required at both ends of the conversation, and MPR’s telephone operations center is equipped with TTY technology. The Telecommunications Relay Service (TRS) will be used for sample members who are deaf, hard of hearing, or speech-impaired but who do not have a TTY. With TRS, a special operator types whatever the interviewer says so that the person being called can read the interviewer’s words on his or her telephone display; he or she then types back a response, which the TRS operator reads aloud for the interviewer to hear over the phone. Both methods, TTY and TRS, increase survey administration times but enable us to conduct interviews with sample members who, without the help of these technologies, would not be able to participate. Forms are not available electronically because they are not self-administered.

4. Efforts to Identify Duplication

The surveys will only ask respondents about information that is not available in SSA’s administrative records. We have reviewed administrative records in detail to limit repetition. Some information about treatment group members that is collected through the surveys may be redundant with data that could also be available from the ETO database. However, this duplication is necessary to collect comparable data from sample members in the control group.


In-depth interviews will collect information on service utilization. Though general information on service utilization is available from the follow-up surveys, there was concern expressed by our Technical Working Group (TWG) that the information would not be sufficiently detailed or complete to fully inform the cost analysis. Our pre-test of the in-depth interview did demonstrate that important additional information was elicited during the course of the interview that was not reported on the follow-up instrument. This will provide for more precise cost estimates.

5. Involvement of Small Entities

Some of the service providers that will be interviewed for the process analysis may be small entities. Our protocol will impose minimal burden on all organizations involved and discussions will be kept under one hour. The information being requested has been held to the absolute minimum required for the intended use.

6. Consequences if Information Is Not Collected or Collected Less Frequently

The baseline survey is a one-time collection and is necessary to conduct a credible evaluation. The baseline survey is needed to identify and select sample members into the study groups, assure that the treatment and control groups are comparable, and obtain important covariates for subsequent analyses. The data collected during the baseline interview are not available from other sources.


The first follow-up survey will be conducted 12 months after random assignment and will collect information on short-term outcomes regarding education, earnings, employment, living arrangements, health, and quality of life. The questionnaire will focus on the 12 months since random assignment. A second follow-up survey, for which we will request clearance in January 2009, is planned for 36 months after random assignment. It will collect information about longer-term outcomes in the same domains as the 12-month follow-up survey. The questions will focus on the past 12 or 24 months so as not to overlap with the 12-month survey recall period. Respondents’ inability to accurately recall necessary information over long periods of time precludes administering the survey just once.


These follow-up surveys will collect a richer set of information than can be gathered from administrative records. For example, administrative records might have data on earnings from jobs but would not have details about those jobs, such as rates of pay, hours worked, or whether the job was competitive or supported employment.


Interviews and/or roundtable discussions with local program administrators, program supervisors, and service delivery staff (to support the process analysis), as well as focus groups of youth, their parents, and service providers, will take place twice. The first visit will be within the first two years of demonstration startup, and the second visit will be one year later. Two visits are necessary to develop an understanding of the intervention and of the steps taken to implement project services. The first visit will focus on start-up activities; the second will assess the projects’ outstanding features, key challenges, and lessons learned about service delivery. During the second visit, cost data covering the duration of project operations will also be collected. Less frequent, longer visits would place more burden on staff during each visit and would not provide the data needed for the site-specific interim reports due 18 months after random assignment ends.


In-depth interviews will be conducted within 3 months after completion of the 12 month follow-up survey. This will allow interviewers to have a sense of what service utilization was reported on the survey and what probes to employ to elicit additional information. Not collecting this information would result in less precise cost estimates.

7. Special Circumstances

There are no special circumstances related to the collection of information required to carry out the evaluation of YTD.

8. Adherence to Guidelines in 5 CFR 1320.5(d)(2) and Consultation Outside the Agency

a. Federal Register Notice

The 60-day advance Federal Register Notice was published on May 2, 2008 at 73 FR 24340, and SSA has received no public comments. The second Notice was published on August 22, 2008, at 73 FR 49730.


b. Consultation with Outside Agencies

An interdisciplinary project team of economists, disability policy researchers, survey researchers, and information systems professionals is needed to carry out the design and implementation of the evaluation. MPR is the prime contractor with overall responsibility for implementing and evaluating the demonstration. However, staff members from four other organizations are integral members of the study team. The participating organizations include the following:



Mathematica Policy Research, Inc.

600 Maryland Ave., SW

Suite 550

Washington, DC 20024-2512

(202) 484-4698

600 Alexander Park

Princeton, NJ 08540

(609) 799-3535


MDRC

19th Floor

16 East 34 Street

New York, NY 10016-4326

(212) 532-3200

475 14th Street

Suite 750

Oakland, CA 94612-1900

(510) 663-6372

Cornell University Institute for Policy Research

1342 22nd St., NW

Washington, DC 20037-3010

(202) 223-7670


Social Solutions, Inc.

2400 Boston St.

Suite 360

Baltimore, MD 21224

(410) 732-3560


TransCen, Inc.

451 Hungerford Drive

Suite 700

Rockville, MD 20850

(301) 424-2002



Key staff from these organizations, along with their roles and contact information, are listed below:


Tom Fraker

MPR-DC

Project Director

[email protected]

(202) 484-4698


David Butler

MDRC-NY

Leader of task force on program development

[email protected]

(212) 340-8621


Karen CyBulski

MPR-NJ

Leader of task force on instrument development

[email protected]

(609) 936-2797


Anu Rangarajan

MPR-NJ

Leader of task force on evaluation

[email protected]

(609) 936-2765


John Martinez

MDRC-NY

Leader of task force on pilot sites

[email protected]

(212) 340-8690



Anne Ciemnecki

MPR-NJ

Survey director

[email protected]

(609) 275-2323


Richard Luecking

TransCen

Programmatic TA

[email protected]

(301) 424-2002 x230


Matt Schubert

Social Solutions

ETO

[email protected]

(410) 732-3560


George Tilson

TransCen

Director, programmatic TA

[email protected]

(301) 424-2002 x226


In addition, SSA has convened a technical working group with these members:


Michael Callahan

Marc Gold & Associates/Employment for All

[email protected]

(228) 497-6999

(228) 497-6966


Nancye Campbell

DHHS/ACF

[email protected]

(202) 401-5760


Elizabeth McGuire

DHHS/HRSA

[email protected]

(301) 443-9290


Alexandra Kielty

USDOL/ETA

[email protected]

(202) 693-3730

(202) 693-3818






K. Charlie Lakin

University of Minnesota

[email protected]

(612) 624-5005

(612) 625-6619


Rebecca Maynard

University of Pennsylvania

[email protected]

(215) 898-3558


Betsy Valnes

Black Hills Special Services

[email protected]

(605) 224-5336


Mary Wagner

SRI International

[email protected]

(650) 859-2867


Gary Walker

Public/Private Ventures

[email protected]

(215) 557-4400

c. Consultation with Beneficiaries

Beneficiaries and their parents have participated in the pretests of the baseline and follow-up surveys and the in-depth interviews.

9. Remuneration of Respondents

At baseline, beneficiaries will be offered a $10 gift for returning a completed consent form. The form of the payment varies by location. In New York City, for example, it is a $10 MetroCard. In other places, it is a $10 Target or Wal-Mart gift card. Another $10 gift will be offered to beneficiaries completing the 12 month follow-up interview and the in-depth interview. The incentive will increase interview response rates and reduce sample attrition between the baseline and 12 month interviews and between the 12 month and 36 month interviews.


All focus group participants are offered $40 to cover their time, transportation, or other costs of participating.


Program staff members are not offered remuneration for completing interviews because they will do this as part of their job responsibilities.

10. Assurance of Confidentiality

The information provided for this project is protected and held confidential in accordance with 42 U.S.C. 1306, 20 CFR 401 and 402, 5 U.S.C. 552 (Freedom of Information Act), 5 U.S.C. 552a (Privacy Act of 1974), and OMB Circular No. A-130. Data will be treated in a confidential manner unless otherwise compelled by law.


The study team takes seriously the ethical and legal obligations associated with the collection of confidential data. Ensuring the secure handling of confidential data is accomplished via several mechanisms, including obtaining suitability determinations for designated staff, training staff to recognize and handle sensitive data, protecting computer systems from access by staff without favorable suitability determinations, limiting access to secure data on a “need to know” basis and only for staff with favorable suitability determinations, and creating data extract files from which identifying information has been removed.


We will take several steps to assure sample members that the information they provide will be treated confidentially and used for research purposes only. The assurances and limits of confidentiality will be made clear in all advance materials sent to recruit potential participants and restated at the beginning of each interview session. The Paperwork Reduction and Privacy Act statements appear on the advance letter and on informed consent paperwork.


A detailed informed consent process will be employed to enroll potential sample members in the demonstration. After beneficiaries have been confirmed as eligible, interviewers will administer an informed consent protocol to enroll them into the study. The consent script addresses several issues in a balanced manner, including the program benefits, random assignment process, commitment to complete follow-up surveys, and the voluntary nature of participation. Any potential risks of participation and the use of personal information are also disclosed. After the sample member provides verbal consent, the interviewer continues on to the baseline survey. When the baseline interview is complete, MPR immediately mails a written consent form to the sample member and his or her legal guardian, if appropriate. A youth is not randomized into a treatment or control group until after written consent is received.


Subcontractors, consultants, and vendors will be required to establish confidential information safeguards that meet prime contract security requirements. The project director or task leader will take action to ensure that any confidential information provided to or generated by a subcontractor, consultant, or vendor is properly disposed of at the completion of the agreement between the parties.

11. Questions of a Sensitive Nature

The purpose of the study is to test the effects of waivers of SSA program rules and an innovative array of enhanced employment and educational services for youth with disabilities. Therefore, obtaining information about potentially sensitive topics, such as the health status and the disabling condition of sample members, is central to the intervention. Information on race and ethnicity is required for certain subgroup analyses. The surveys will not collect data that can be obtained directly from other sources (for example, information about receipt of disability benefits is best obtained directly from SSA administrative records).

The survey will include questions about the following topics that can be considered potentially sensitive:


  • Health status, including disability information and severity of disabling condition

  • Assistance needed with Activities of Daily Living (ADLs) and Instrumental Activities of Daily Living (IADLs) (for example, help or supervision needed with bathing, dressing, eating, or using the toilet)

  • Mental health status

  • Race and ethnicity

Many of the questions were adopted without modification from other national surveys of similar populations, such as the National Longitudinal Transition Study (NLTS) and the National Beneficiary Survey (NBS). The instrument also contains items from the Short Form 12 (SF-12).

12. Estimates of Annualized Hour Burden

Table A.1 shows the annualized number of expected participants in each data collection, the number of responses per respondent, the average hours per response, and the total associated response burden. Burden was determined through the actual administration time of the baseline interview and through pretest-based estimates for the follow-up questionnaire and in-depth interviews. Focus groups with youth and discussions with project staff will be limited to 1.5 hours and 1 hour, respectively. The baseline, 12-month follow-up, in-depth interview, and focus group burden listed in Table A.1 is for individuals. Respondents in the program staff and service provider category may be affiliated with businesses, not-for-profit organizations, or other service providers.
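To make the burden arithmetic in Table A.1 concrete, the short sketch below (in Python, provided purely as an illustration and not as part of the information collection) recomputes the 2008 entries: each total is the number of respondents multiplied by the responses per respondent and the average hours per response, rounded as in the table.

```python
# Illustrative sketch of the burden arithmetic behind Table A.1, using the 2008 figures.
# (collection, respondents, responses per respondent, average hours per response)
rows_2008 = [
    ("Baseline",                        2531, 1, 0.55),
    ("Informed Consent",                2531, 1, 0.083),
    ("12-month follow-up",              1502, 1, 0.83),
    ("In-depth interviews",              120, 1, 0.42),
    ("Focus group",                       60, 1, 1.5),
    ("Program staff/service provider",    32, 1, 1.0),
]

total = 0.0
for name, respondents, responses, hours in rows_2008:
    burden = respondents * responses * hours
    total += burden
    print(f"{name}: {burden:,.0f} hours")

print(f"Total 2008 burden: {total:,.0f} hours")  # approximately 3,021 hours, as in Table A.1
```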

13. Estimates of Annualized Capital Burden

There are no direct costs to respondents other than their time to participate in the study, as described above. Beneficiaries will not be asked to maintain any new records. The evaluation contractor will collect and maintain all survey data. Costs for data collection, storage, processing, and other functions related to these data will also be borne solely by the contractor. These costs are summarized below and are considered costs to the federal government, paid through SSA contracts. For reporting purposes, we use the data for 2008 in ROCIS.


TABLE A.1

ANNUALIZED BURDEN

Year    Collection                          Number of     Responses Per   Average Burden Per   Total Response
                                            Respondents   Respondent      Response (Hours)     Burden (Hours)

2007    Baseline                            962           1               0.55                 529
        Informed Consent                    962           1               0.083                80
        12-month follow-up (a)              437           1               0.83                 363
        Focus group                         140           1               1.5                  210
        Program staff/service provider      32            1               1                    32
        Total 2007                                                                             1,214

2008    Baseline                            2,531         1               0.55                 1,392
        Informed Consent                    2,531         1               0.083                210
        12-month follow-up                  1,502         1               0.83                 1,247
        In-depth interviews                 120           1               0.42                 50
        Focus group                         60            1               1.5                  90
        Program staff/service provider      32            1               1                    32
        Total 2008                                                                             3,021

2009    Baseline                            1,895         1               0.55                 1,042
        Informed Consent                    1,895         1               0.083                157
        12-month follow-up                  1,518         1               0.83                 1,260
        In-depth interviews                 120           1               0.42                 50
        Focus group                         150           1               1.5                  225
        Program staff/service provider      80            1               1                    80
        Total 2009                                                                             2,714

2010    Baseline                            263           1               0.55                 145
        Informed Consent                    263           1               0.083                22
        12-month follow-up                  1,137         1               0.83                 944
        Focus group                         90            1               1.5                  135
        Program staff/service provider      48            1               1                    48
        Total 2010                                                                             1,294

2011    12-month follow-up                  158           1               0.83                 131
        Total 2011                                                                             131

Grand   Baseline                            5,651         1               0.55                 3,108
Total   Informed Consent                    5,651         1               0.083                469
        12-month follow-up                  4,752         1               0.83                 3,944
        In-depth interviews                 240           1               0.42                 101
        Focus group                         440           1               1.5                  660
        Program staff/service provider      192           1               1                    192
        Grand Total                         11,105                                             8,474

(a) We conduct follow-up interviews only for those baseline respondents who sign consent forms.



14. Estimates of Annualized Cost to the Government

The total cost to SSA of conducting the YTD evaluation is $36,765,420. The costs by year are shown in Table A.2.


Labor costs are budgeted by estimating the number of staff hours required at the various wage levels, multiplying those hours by the applicable wage rates, and multiplying the resulting subtotals by factors to cover fringe benefits and burden expense. The basis for estimating other direct costs varies with the type of cost being estimated. For example, the estimates of survey telephone expense and computer expense for CATI are based on the estimated hours of interviewer time, while reproduction expense is based on the number of pages of material to be reproduced.


Finally, labor costs and other direct costs are summed, the total is multiplied by a factor to cover general and administrative expenses, and the fee is added.
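As a purely hypothetical illustration of this build-up (none of the rates, factors, or amounts below are the actual budget figures for the YTD evaluation), the arithmetic works as follows:

```python
# Hypothetical illustration of the cost build-up described above; all figures are
# invented for the example and are not the YTD evaluation's actual budget inputs.
labor_hours = {"senior researcher": 500, "survey interviewer": 2000}   # hours by wage level
wage_rates  = {"senior researcher": 60.0, "survey interviewer": 18.0}  # dollars per hour

fringe_and_burden_factor = 1.45   # covers fringe benefits and burden expense
labor_cost = sum(labor_hours[k] * wage_rates[k] for k in labor_hours) * fringe_and_burden_factor

other_direct_costs = 25_000.0     # e.g., telephone, CATI computer time, reproduction
ga_factor = 1.10                  # general and administrative expense
fee = 15_000.0

total_cost = (labor_cost + other_direct_costs) * ga_factor + fee
print(f"${total_cost:,.0f}")      # prints $147,770 for these hypothetical inputs
```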



TABLE A.2

ANNUAL COSTS TO THE FEDERAL GOVERNMENT

Year      Cost
2006      $3,827,618
2007      $6,910,051
2008      $5,470,765
2009      $5,152,297
2010      $5,724,316
2011      $5,329,964
2012      $1,517,609
2013      $1,440,502
2014      $1,392,299
Total     $36,765,420



15. Explanation for Program Changes or Adjustments

This is the first submission for the in-depth interviews. The overall burden has increased due to the addition of the in-depth interviews and the increase in the number of respondents for the baseline questionnaire, informed consent, and the 12-month follow-up. This increase in the number of respondents is part of the expansion of the program, as indicated in the previous ICR approved in 2007. There is also a slight decrease in the burden for the focus groups. As shown in the table in section 12 above, the number of focus group respondents fluctuates from year to year; therefore, the focus group burden will increase again next year. Also, rather than using the annualized burdens for all of the years, we are reporting only the burden for the current year of the newly revised program.

16. Plans for Tabulation and Publication and Project Time Schedule

Baseline data collection began in July 2006 and will continue through 2010. The 12 month follow-up data collection, for which we are requesting clearance, will begin in August 2007 and will continue through 2011. Likewise, the 36 month data collection (for which we will request clearance in January 2009) will begin in August 2009 and continue through 2014. The first interviews with program staff and focus groups at each program site will take place between 12 and 15 months after the demonstration programs enroll their first youth, beginning in November 2007. The second will take place between 12 and 18 months after the first. The exact timing will depend on the length of the programs’ enrollment periods and duration of services.


A series of reports is planned throughout the life of the demonstration. Program-specific early assessment reports are scheduled to be produced 8 months after the demonstrations enroll their first youth, beginning in the spring of 2007. Process and implementation and early impact reports are due 18 months after the programs enroll their last youth, beginning in January 2010. The final report and public use data files will be produced by October 2014. Up to three reports on special topics may be produced over the life of the demonstration by October 2014.


The process and implementation reports will document and describe how the demonstration was planned and implemented, explain program processes, document beneficiary experiences with the demonstration and describe outcomes or results. The following distinct components of program implementation will be addressed: (a) outreach, recruitment, and participation; (b) the intervention, including whether each component was implemented as planned, differences in implementation across subgroups, existing service systems, and the use of services; (c) organizational arrangements, communication, and coordination; (d) coordination with SSA field offices; and (e) experiences and satisfaction of beneficiaries and other stakeholders. We will explicitly document implementation issues encountered and how they were addressed. We will also document how major features of the program change over the course of the evaluation, the reasons for the changes, and the implications for program outcomes being measured in the evaluation.


The impact reports will investigate the demonstration’s effects on a wide array of education, earnings, and self-determination outcomes; the amount of benefits the beneficiary receives from SSA; and the beneficiary’s quality of life, both overall and for meaningful subgroups. Our proposed methodological approach combines a random assignment design with regression adjustment to improve the precision of our estimates. Because individuals are randomly assigned to the treatment and control groups, the impact analysis will focus on differences in outcomes between these two groups, using a regression framework to control for other explanatory variables. Regression-adjusted comparisons of the randomly assigned treatment and control groups will be used to estimate the impact of the intervention on beneficiaries’ education, labor market, and other outcomes, both for the full sample and for subgroups defined by pre-randomization values of age, race, gender, and type of disability.


The exact statistical technique used to estimate regression-adjusted impacts will depend on the nature of the dependent variable and the type of issues being addressed. For example, if the dependent variable is continuous, then ordinary least squares regression produces estimates of impacts that are unbiased. For binary outcome variables (such as whether or not the beneficiary is employed), logistic regression models generate estimates that are consistent and efficient if the parametric assumptions underlying those models are correct. If the dependent variable is a count variable then an ordered logit model will be used. If the dependent variable is ordinal, we will first reduce the measure to binary outcomes and then estimate a logit model. To account for the fact that sample members will be observed for different lengths of time, we will also consider using event-history or hazard models for binary outcome measures. These models provide unbiased estimates of program effects on binary outcomes when participants’ data are truncated.
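To illustrate, the minimal sketch below (using Python and the statsmodels package) shows how a regression-adjusted treatment-control comparison of the kind described above is typically computed. The data file, outcome names, and covariates are hypothetical placeholders, not the evaluation's actual analysis file or estimation code.

```python
# Minimal sketch of regression-adjusted impact estimation; the variable names and
# input file are hypothetical placeholders for the evaluation's analysis extract.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("ytd_analysis_file.csv")   # hypothetical analysis file

# Treatment indicator plus baseline (pre-randomization) covariates and a constant
X = sm.add_constant(df[["treatment", "age", "female", "ssi_at_baseline"]])

# Continuous outcome (e.g., annual earnings): ordinary least squares
ols_fit = sm.OLS(df["earnings"], X).fit()
print(ols_fit.params["treatment"])    # regression-adjusted impact on earnings

# Binary outcome (e.g., employed during the follow-up year): logistic regression
logit_fit = sm.Logit(df["employed"], X).fit()
print(logit_fit.params["treatment"])  # impact on the log-odds of employment
```

Because treatment status is randomly assigned, the coefficient on the treatment indicator estimates the program impact; the baseline covariates serve only to improve precision.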


The purpose of the benefit-cost analysis is to determine whether the program impacts of the YTD demonstration are sufficiently large to justify the costs of providing program services. The results of this analysis will play an integral part in the decision to expand the demonstration to the larger population. The analysis will be based on an accounting framework that summarizes the intervention’s effects and resource use from the perspective of SSA and other key stakeholder groups, including society as a whole.


To ensure that the benefit-cost findings are as helpful as possible to SSA, we plan to present the information in a way that has proven useful for communicating this type of information to the SSA Office of the Actuary and to OMB. First, we will summarize all of the information that is based directly on data collected during the demonstration period. The second set of estimates will present the size of future effects (if any) that would be required for the program to generate benefits that exceed costs along with an analysis of how likely it seems that future effects of that size will occur. In this way, SSA actuaries will be able to see the net value generated during the observation period and then use the more speculative analysis of possible future benefits and costs to draw conclusions about whether the YTD projects would ultimately pay for themselves. In addition to using this general presentation format, we will work with the actuaries during the evaluation to ensure that the other assumptions used in the analysis—the discount rate, correction for inflation, and projections about potential productivity growth—are consistent with the ones they are using to assess other potential SSA initiatives. This consistency will go a long way in ensuring that comparisons of the various options are accurate and useful.

17. Expiration Date for OMB Approval

The OMB expiration date will be displayed on all survey materials sent to respondents, including the advance letter and consent forms. It will be accessible in the computer-assisted instruments when a respondent requests the information.

18. Exceptions to the Certification Statement

We are not requesting any exceptions. The data collection will conform to all provisions of the Paperwork Reduction Act.



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Respondent Universe and Sampling Methods

YTD projects are intended to improve long-term employment outcomes for youth with disabilities ages 14 to 25. The respondent universe for YTD services comprises both current SSI beneficiaries and children who are at risk of receiving benefits as adults. In April 2005, approximately 776,000 youth ages 14 to 25 received SSI benefits. In addition, 320,000 youth were at risk of receiving benefits as adults, even though they did not qualify to receive benefits as children.1

YTD projects deliver services to youth with disabilities in their jurisdictions. On September 30, 2003, SSA awarded five-year cooperative agreements to seven state agencies and universities to implement YTD projects. Three of these projects were selected for the national random assignment evaluation. In addition, five new projects (out of 13 that applied) were selected for a limited pilot phase. Three of the pilot projects will be selected in the fall of 2007 to join the national random assignment evaluation, for a total of six random assignment projects. The respondent universe for this evaluation is youth who are willing and eligible to participate in the YTD services of the six random assignment projects.2 We will conduct baseline and follow-up interviews among youth in these six sites. Below we describe (1) the selection of projects for the evaluation and (2) the selection of youth in the projects’ service areas.

a. Selection of Projects

The YTD projects vary widely in the services offered, the geographic areas in which they operate, and the types of youth to whom services are offered. Projects were selected based on the following criteria:

  • Willingness to participate in an individual random assignment study

  • The sharpness of the distinction between the services offered to treatment group members and those available to control group members (the counterfactual)

  • Services that focus on work-based experiences, counseling on SSA benefits, and marketing of the special waivers of SSA rules that are available to YTD participants

  • Evidence that the project can meet sample size targets of 880 youth participating in the evaluation

  • A lead organization with demonstrated capacity and experience to manage a complex demonstration with multiple partners and to implement the intervention within the required time frame

In addition, we sought a diverse set of projects. We decided that at least one project should serve an at-risk population and that the evaluation should include a mix of projects that

  • Serve both in- and out-of-school youth

  • Serve all of the SSI impairment groups as well as focus on specific impairment groups

  • Focus on youth between the ages of 16 and 21, the key transition ages

  • Include a diverse group of lead agencies with varied organizational affiliations, expertise, and experience

  • Represent geographical, ethnic, and racial communities, including a mix of urban, suburban, and rural sites

b. Selection of Youth

Each of the random assignment projects will be expected to serve 400 treatment group youth. To allow for attrition, we will generate a treatment group of 480 youth who may be served and a control group of 400 youth. This will result in a total of 5,280 youth in the study (880 youth in each of 6 projects). We will obtain baseline information and written consent to participate in the evaluation for all of these youth. We expect that 90 percent (approximately 4,752) will complete the 12-month follow-up interview.

Tables B.1 and B.2 provide descriptions of the populations and our best estimates of the numbers of youth who meet the eligibility criteria for each project.


It is necessary to obtain large numbers of respondents to the baseline survey in order to generate 880 youth per project who consent to be in the study. Since July 2006, MPR has conducted 925 baseline interviews with youth in the existing YTD programs under OMB No. 0960-0687. Of those who have completed baseline interviews, 69 percent have provided written consent to participate in the study. We have made only limited attempts to convert refusals and increase the consent rate through incentives and field follow-up because we have found that baseline respondents who do not readily provide written consent are less likely to participate in program services if assigned to the treatment group.

Large samples are needed to generate the required number of completed baseline interviews. About 25 percent of the youth on the SSA beneficiary lists are not readily locatable, making them poor candidates for program services. Another 20 to 25 percent do not complete the baseline interview because they either are not interested in participating or are not available at times when program services are offered.
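The rough sketch below illustrates the resulting recruitment funnel using the approximate rates just cited (about 25 percent of listed youth not readily locatable, roughly 20 to 25 percent of the remainder not completing the baseline, and the 69 percent written-consent rate observed to date). The starting list size is hypothetical and is shown only to make the arithmetic concrete.

```python
# Rough sketch of the recruitment funnel described above; the rates are the
# approximate figures cited in the text, and the starting list size is hypothetical.
list_size = 3000                      # hypothetical number of youth on an SSA list
locatable = list_size * (1 - 0.25)    # about 25 percent cannot be readily located
completes = locatable * (1 - 0.225)   # roughly 20-25 percent of those located do not complete
consented = completes * 0.69          # about 69 percent of completers provide written consent

print(round(locatable), round(completes), round(consented))   # 2250 1744 1203
```

Under these assumed rates, a list of roughly 2,200 youth would be needed to yield the 880 consenting youth required for each project.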

TABLE B.1

CHARACTERISTICS OF EXISTING YTD PROJECTS PARTICIPATING IN THE RANDOM ASSIGNMENT EVALUATION


Colorado’s Youth Work Incentive Network of Support (WINS)
  Location: Boulder, El Paso, Larimer, and Pueblo counties, CO
  Description of youth served: 14- to 25-year-old current SSI beneficiaries
  Population estimate: 2,750 (a)

New York’s Transition WORKS
  Location: Erie County, NY
  Description of youth served: 14- to 25-year-old current SSI beneficiaries in Erie County
  Population estimate: 3,300

New York’s CUNY Youth Transition Demonstration
  Location: Bronx, NY
  Description of youth served: 17- to 18-year-old current SSI beneficiaries with disabilities other than blindness, deafness, schizophrenia, and conduct disorders
  Population estimate: 3,450

(a) Colorado Youth WINS has indicated a willingness to expand to adjacent counties as necessary.



TABLE B.2

CHARACTERISTICS OF FIVE YTD PILOT PROJECTS


Abilities, Inc.
  Location: Miami-Dade County, FL
  Description of youth served: Current SSI beneficiaries who are in their last two years of high school when enrolled
  Population estimate: 4,700 (a)

Community-Minded Enterprises (CME)
  Location: Spokane County, WA
  Description of youth served: Current SSI beneficiaries and at-risk youth who have been denied benefits
  Population estimate: 1,325 current beneficiaries (b) plus an estimated 543 at-risk youth

Career Transition Program (CTP)
  Location: Montgomery County, MD
  Description of youth served: Youth with severe emotional disabilities; 75 percent will be at risk and 25 percent will be current SSI beneficiaries
  Population estimate: 313 (c) plus an estimated 128 at-risk youth

Vermont Division of Vocational Rehabilitation (DVR)
  Location: VT (statewide)
  Description of youth served: Current SSI beneficiaries ages 15-25
  Population estimate: 1,748

Human Resources Development Foundation (HRDF)
  Location: WV (statewide)
  Description of youth served: Current SSI beneficiaries who are in their last two years of high school when enrolled
  Population estimate: 3,400 (d)

Note: Three of the five YTD pilot projects will participate in the random assignment evaluation.

(a) Based on 10,834 youth ages 14-25 in Miami-Dade County.

(b) CME will serve at-risk youth as well as current SSI beneficiaries and is willing to expand outside Spokane County.

(c) Currently, about 978 SSI beneficiaries reside in Montgomery County. We estimate that 32 percent have severe emotional disorders. CTP is willing to expand into Frederick and Prince Georges counties.

(d) Based on 5,968 16- to 22-year-olds statewide.

All youth who are randomly assigned to the study are eligible for 12-month follow-up interviews. As stated above, we expect a 90 percent response rate to the 12-month interview.

2. Procedures for the Collection of Information

a. Recruiting Study Participants at Baseline

Different recruitment strategies are necessary depending on whether a project serves only youth who appear in SSA records or whether it also (or only) serves youth who are identified by other means, such as referrals. We first discuss recruiting procedures for youth who can be identified in SSA records. For projects serving current beneficiaries or at-risk youth who can be identified through SSA records (for example, youth whose applications were denied), MPR conducts baseline interviews, gathers written informed consent, and randomly assigns consenting youth to the treatment or control group. After random assignment, only the names of treatment group members are shared with the YTD projects for enrollment and services. The specific steps in the recruitment process are listed below; a simplified sketch of the batching and random assignment steps follows the list:

1. Obtain a list of beneficiaries (or denied applicants) from SSA for the relevant catchment areas


2. Check the list to exclude ineligible youth based on age, place of residence, or disabling condition(s)


3. Randomly sort the list into batches of youth (also called replicates)


4. Send letters to a batch of youth informing them about YTD program services to recruit them into the study (Appendix F)


5. Place telephone calls to determine interest in YTD services


6. Gather baseline and re-contact data by telephone, and obtain written informed consent from the youth and/or parent by mail or in person. Appendix G contains the consent forms for the three existing sites. Consent forms for newly selected projects will be similar.


7. Randomly assign youth to the treatment or control group


8. Provide YTD project staff with information on treatment group youth so that staff can contact these youth and begin providing program services


9. Continue to release cases in batches until the desired enrollment for the project has been reached
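The sketch below summarizes the batching (replicate) and random assignment steps in simplified form. The batch size and function names are hypothetical; the 480:400 treatment-to-control allocation reflects the sample design described in section B.1.b, and the actual assignment procedures are those described above.

```python
# Simplified, hypothetical sketch of steps 3 and 7: randomly sort the eligible list
# into batches (replicates), then assign each consenting youth to the treatment or
# control group. The batch size is illustrative; the 480:400 allocation follows
# the sample design in section B.1.b.
import random

def make_replicates(youth_ids, batch_size=100, seed=1234):
    """Randomly sort the eligible list and split it into batches (replicates)."""
    rng = random.Random(seed)
    ids = list(youth_ids)
    rng.shuffle(ids)
    return [ids[i:i + batch_size] for i in range(0, len(ids), batch_size)]

def assign(rng, p_treatment=480 / 880):
    """Randomly assign one consenting youth to the treatment or control group."""
    return "treatment" if rng.random() < p_treatment else "control"

rng = random.Random(5678)
replicates = make_replicates(range(1, 2501))                    # hypothetical eligible list
first_batch_assignments = {yid: assign(rng) for yid in replicates[0]}
```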



At-risk youth who have not applied for SSI benefits will be identified through referrals from local organizations, including schools and other agencies that work with youth with disabilities. We have not yet recruited at-risk youth into the study; however, our design for the recruitment procedures is as follows:

  1. The YTD project determines whether a youth who has been referred to it meets the project’s eligibility criteria.

  2. If a youth meets the criteria, the project obtains a completed application form, collects baseline and re-contact data, and obtains written informed consent.

  3. The project transmits this information to MPR.

  4. MPR conducts random assignment and immediately provides the YTD project with information on the treatment/control status of the case.

  5. YTD project staff informs the youth of his or her random assignment status and commences services to youth in the treatment group.

  6. This process continues until the desired enrollment target has been reached.


Hybrid procedures will be designed for projects that serve both youth who can be identified in SSA records as well as youth who must be identified through other sources.


For either recruitment method, MPR (or the YTD project) contacts parents or legal guardians of youth under age 18 and gains consent to speak with youth. For youth over age 18 with legal guardians, MPR (or the YTD project) gains permission from the legal guardians before approaching the youth. Both the baseline and 12-month follow-up interviews contain a parent module consisting of questions that youth may not be able to answer reliably.


Most YTD projects have limited samples in their intended catchment areas and are planning expansions to reach enough youth to generate the 880 sample members needed for statistically meaningful comparisons between treatment and control group members. For example, the New York City YTD project is now considering expanding from the Bronx into areas of Manhattan to target its services to 17- and 18-year-olds. Likewise, the Montgomery County, Maryland, project is considering an expansion into Prince Georges County and has expressed a willingness to expand to Frederick County if necessary so that it can target services to youth with severe emotional disorders.

b. Study Procedures for 12-Month Follow-Up Interviewing

Sample members will be mailed an advance letter advising them of the upcoming survey about one week prior to their 12-month anniversary (Appendix H). The letter will contain a toll-free number that the youth or his or her parent may call if they have questions or wish to set an interview appointment. Next, MPR will telephone the last known number for the youth and/or his or her parent or guardian. If the number is disconnected, MPR will attempt to locate a current address or telephone number. MPR will use CATI as the primary mode of data collection for the follow-up survey. Sample members who do not respond by telephone, or whose disabilities prevent them from being able to complete the interview via telephone, will be interviewed in person. However, before conducting an in-person interview, we will attempt to use TTY, computers, and other technologies that might enable an interview without field follow-up, similar to the procedures used for the baseline data collection. It is important that follow-up interviews be conducted at the appropriate interval following random assignment, which is 12 months or shortly thereafter for the first follow-up survey and 36 months or shortly thereafter for the second. Given that sample intake will extend over a long period for most projects, the number of in-person interviews required per month at a site may be too few to justify the cost of computer-assisted personal interviewing (CAPI) data collection. Thus, MPR expects to use more cost-effective in-person data collection methods, such as having the field interviewers use hardcopy instruments to complete surveys or providing the youth with cell phones they can use to call in to MPR. For youth who decline to participate, MPR will identify why they are reluctant to participate and will send a letter that addresses their concerns and encourages participation.

c. Study Procedures for Process Visits, Focus Groups, and In-Depth Interviews

A major source of information for the process analysis will be two comprehensive visits to each random assignment project. The exact timing of the process visits to a specific project will depend on how long youth will be enrolled in the project as well as the duration of intervention services. However, we expect that the first visit to most projects will be within the first two years of demonstration startup (that is, the start of random assignment), and the second visit will be approximately a year later.

Staff Interviews. During the site visits, the evaluation team will conduct individual and group interviews with management and staff of various stakeholders in the local YTD project, such as the following:

  • Project directors and site managers will offer insights into the history of each project’s sponsoring organization and its experience in serving youth with disabilities; an overview of the conception, development, and implementation of the program model and the organizational and management structure for the project, including the project budget and key project partners; and the roles and qualifications of staff members, their caseloads, and the supervisory structure of the primary service providers. Interviews at this level will be designed to highlight some of the major challenges that service providers have encountered.

  • Project line staff, who are in direct contact with the youth being served, will provide insight into how the youth are identified and recruited, the methods used to assess a youth’s needs and the project’s approach to serving them, the way appropriate services are selected and delivered, and the extent to which youths’ families are involved with project services. These staff will also provide insight into how much structure or flexibility staff members have in performing their jobs, the extent to which clients’ experiences diverge from the program model, and the reasons behind such variation.

  • Staffs of partner organizations can provide information on linkages between the project and other service providers as well as on the successes or challenges of the collaborations. They will provide perspectives on the nature of the agreements, how effectively they function, and the ways in which project services complement or are integrated with the services of partner organizations. These interviews might be conducted with the staff of direct service partner organizations as well as with the staffs of schools, vocational rehabilitation agencies, mental retardation and developmental disabilities agencies, and other agencies that serve persons with disabilities.

  • Staffs of local SSA field offices can provide insights into the broad context of services available in the area and the local implementation of SSA’s waivers for YTD participants.

Separate protocols will be developed to provide structure for each of the types of data to be collected during the site visits. We will create a master protocol that will include the items to be covered during the visits and will identify their relationships to the objectives or key questions for the process analysis. Items from the master protocol will be selected, tailored, and used with appropriate follow-up probing and elaboration depending on the specific project and the person being interviewed. Similarly, we will create focus group guides, as well as structured protocols to record data from case reviews and observations of project activities.


Focus Groups with YTD Participants. To capture critical qualitative information about the experiences of YTD participants (and, where relevant, their families), we will conduct two focus groups in each project with participating youth and their families. These focus groups will gather information on participants’ experiences in the project and their awareness and use of services. The discussions will cover the perceived quality of project services, perceived gaps in activities or services, and how the SSA waivers were explained and offered to participants. Each group will include 8 to 12 youth or parents. The focus groups will complement the information collected in the follow-up surveys by providing a more in-depth, qualitative understanding of participants’ experiences. They will help the evaluation team assess whether and how the projects did or did not meet participants’ expectations. We will also try to conduct a focus group in each project with treatment group members who did not participate in services, to understand their reasons for nonparticipation. Project staff members will recruit youth and parents to participate in the focus groups. The discussions will be held at project facilities that are well known in the community and are accessible to persons with disabilities.


In-Depth Interviews with YTD Participants and/or Their Guardians. To capture critical information on service utilization that will supplement the follow-up interviews, we will conduct in-depth interviews with YTD participants and/or their guardians. Twenty treatment and 20 control youth (40 total) from each random assignment site will be selected to participate based on the level of service utilization reported on the 12-month follow-up survey: low, moderate, or high service use. The telephone interviews will be free-form, with a general topic guide to help the interviewer steer the conversation. Probes will be used to solicit the detailed service utilization information needed to inform the cost analysis. When possible, we will ask that both the youth and his or her parent or guardian be available to contribute to the discussion. Our fallback will be to administer the in-depth interview to the person who answered the service utilization questions on the 12-month follow-up survey.
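
To illustrate this selection step, the sketch below draws up to 20 treatment and 20 control youth per site across the reported low, moderate, and high service-use levels. It is hypothetical Python; the data fields and the roughly even split across strata are assumptions for illustration, not the evaluation's documented procedure.

  import random
  from collections import defaultdict

  def select_indepth_sample(respondents, per_group=20, seed=0):
      """Select youth for in-depth interviews, stratified by reported service use.

      respondents is a list of dicts with assumed keys: 'id', 'group'
      ('treatment' or 'control'), and 'service_use' ('low', 'moderate', or 'high').
      """
      rng = random.Random(seed)
      strata = defaultdict(list)
      for r in respondents:
          strata[(r["group"], r["service_use"])].append(r)

      selected = []
      for group in ("treatment", "control"):
          # Spread the per-group target roughly evenly across the three levels.
          targets = [per_group // 3 + (1 if i < per_group % 3 else 0) for i in range(3)]
          for level, target in zip(("low", "moderate", "high"), targets):
              pool = strata[(group, level)]
              selected.extend(rng.sample(pool, min(target, len(pool))))
      return selected

  # Example usage: selected = select_indepth_sample(site_respondents) for one site's
  # 12-month follow-up respondents (site_respondents is a hypothetical list of dicts).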

d. Statistical Power/Precision Estimates

For this evaluation to be useful to policymakers, its sample must be large enough to detect policy-relevant impacts. The design of the YTD evaluation calls for the random assignment of 880 youth with disabilities to either a treatment or a control group in each of six projects. Table B.3 presents estimates of the minimum treatment-control differences that could be detected for three types of outcomes that the evaluation will examine. First, for outcomes that can be expressed in binary terms, such as the likelihood of becoming employed or of leaving the SSI rolls, we present estimates for outcomes centered on 50 percent (the most conservative assumption) as well as on 30 or 70 percent. Second, we examine annual earnings based on SER data. Third, we consider monthly SSA benefit amounts. The earnings and benefit outcomes will be critical in determining the cost-effectiveness of YTD services. The minimum detectable treatment-control differences are presented under the assumption of a two-tailed test at the 90 percent confidence level. The table shows minimum detectable differences at 80 percent power (that is, the ability to detect true differences 80 percent of the time). A reduction in variance of 10 percent owing to the use of regression models is also assumed.
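
For reference, the minimum detectable impacts reported in Table B.3 follow from a standard formula for the smallest difference in means that a two-group comparison can reliably detect. A sketch of that calculation, under the assumptions just stated (90 percent confidence for a two-tailed test, 80 percent power, and a 10 percent reduction in variance from regression adjustment), is:

  \[
  \mathrm{MDI} = \left( z_{\alpha/2} + z_{\mathrm{power}} \right) \sigma \sqrt{1 - R^2} \sqrt{\frac{1}{n_T} + \frac{1}{n_C}}
  \approx (1.645 + 0.84)\, \sigma \sqrt{0.90} \sqrt{\frac{1}{480} + \frac{1}{400}},
  \]

where \(\sigma\) is the standard deviation of the outcome (approximately \(\sqrt{p(1-p)}\) for a binary outcome with mean \(p\)) and \(n_T\) and \(n_C\) are the treatment and control sample sizes. Applied to the standard deviations reported in the note to Table B.3, this expression approximately reproduces the tabled values; for annual earnings, for example, it yields roughly $490.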


The numbers in the table indicate that, with sample sizes of 480 treatment group members and 400 control group members, we could detect impacts on employment and benefit receipt of 7 to 8 percentage points, impacts on earnings of $489 annually, and impacts on SSI benefits of $42 per month. For example, if the likelihood of being employed one year after random assignment were 30 percent in the absence of YTD services, and if YTD services raised this to 38 percent, then we would have an 80 percent chance of detecting this impact with our sample.


The adequacy of samples of 480 treatment and 400 control group members is supported by several studies of people with disabilities. For example, the evaluation of the Transition Employment Training Demonstration was based on samples of about 375 recipients each in the treatment and control groups. That study estimated that transitional employment services for SSI recipients with mental retardation increased earnings during the second year after random assignment by $835 and increased the probability of being employed at the end of that year by 12 percent. Similarly, the evaluation of the Structured Training and Employment Transitional Services demonstration, which targeted youth with mental illness, found an increase of more than 9 percentage points in employment for treatment group youth 15 months after random assignment.



TABLE B.3

MINIMUM DETECTABLE IMPACTS FOR THE YTD EVALUATION,
ASSUMING INDIVIDUALIZED RANDOM ASSIGNMENT

                                 Employment Rate or
                                  SSI Receipt Rate
                               ----------------------
Sample Size                       50        30 or 70     SER Annual Earnings    Monthly SSI Benefits
(Treatment/Control)            Percent       Percent     (Mean = $1,213)        (Mean = $588)

Full Sample
  480/400                         8.0            7.3     $489                   $42

Subgroup Sample
  240/200                        11.7           10.3     $690                   $60

Note: The calculations assume (1) a 90 percent level of confidence for a two-tailed test and an 80 percent level of power, (2) a standard deviation of $267 for the monthly SSI benefits amount and $3,069 for annual earnings, and (3) a reduction in variance of 10 percent owing to the use of regression models. The standard deviations are derived from Mathematica’s Ticket to Work Evaluation Summary Earnings Records data and SSI benefits data for youth ages 18 to 25 in 2001.
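
As an illustrative cross-check (not part of the evaluation's actual computations), the short Python sketch below recomputes the minimum detectable impacts from the assumptions stated in the note; small discrepancies from the table reflect rounding and the exact critical values used.

  from math import sqrt
  from statistics import NormalDist

  def mdi(sd, n_t, n_c, alpha=0.10, power=0.80, r_squared=0.10):
      """Minimum detectable impact for a two-tailed test with regression adjustment."""
      z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # about 1.645 for alpha = 0.10
      z_power = NormalDist().inv_cdf(power)          # about 0.84 for 80 percent power
      return (z_alpha + z_power) * sd * sqrt(1 - r_squared) * sqrt(1 / n_t + 1 / n_c)

  # Full sample of 480 treatment and 400 control cases; standard deviations from the note.
  print(round(100 * mdi(0.50, 480, 400), 1))             # outcome centered on 50 percent: ~8.0 points
  print(round(100 * mdi(sqrt(0.3 * 0.7), 480, 400), 1))  # centered on 30 or 70 percent: ~7.3 points
  print(round(mdi(3069, 480, 400)))                      # annual earnings: ~$490
  print(round(mdi(267, 480, 400)))                       # monthly SSI benefits: ~$43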



3. Methods to Maximize Response Rates

We will use the following procedures to maximize the response and participation rates of the baseline and follow-up interviews:

  • Effective and targeted advance materials

  • Collection and updating of contact data for the sample member and for someone who would know how to reach the sample member at the time of the next interview

  • Multiple methods for tracking and locating beneficiaries, including the use of extracts from SSA administrative data to capture address updates during the course of the survey, the use of an independent vendor providing commercially available contact information, and MPR’s internal respondent tracking efforts

  • At the 12-month follow-up interview, the use of a combination of telephone and in-person interviewing to maximize our ability to contact sample members

  • Interviewer training that includes instruction on motivational interviewing, that stresses the importance of respondent cooperation, and that develops interviewer skills for averting and converting refusals

  • Interviewer training on when and how to select an appropriate proxy to conduct an interview

  • A bilingual module to help bilingual interviewers assess whether to conduct an interview in Spanish or English and to cover differences in dialects

  • Protocols for breaking off and then resuming interviews to accommodate beneficiaries who may become fatigued during the interview


The focus of all respondent materials (letters, brochures, and consent forms) will be to secure cooperation through clarity, simplicity, and thoroughness; the materials will be written at a sixth-grade reading level. The expected response rate to the 12-month follow-up survey is 90 percent. Locating participants will be the first challenge to obtaining this response rate. While SSA has contact information for all current beneficiaries, that information is not always accurate, and at follow-up some sample members will no longer be receiving benefits from SSA. Telephone numbers can be particularly problematic because there is no administrative reason to keep them updated in SSA records. Addresses are more reliable because they are sometimes used for mailing checks. These addresses might, however, be post office boxes or the addresses of guardians, financial institutions, or other individuals or organizations, and thus of only limited use in locating a beneficiary. Further, since many beneficiaries now receive their payments via direct deposit, SSA address information is less accurate than it once was.

To improve the contact information, we will mail an advance letter to each sampled person prior to each survey, using the most recent address of record. The letter will describe the survey, provide a toll-free number to contact Mathematica, and indicate that the beneficiary will be contacted regarding it. The letter will be sent “address service requested,” which results in (1) the mail being forwarded to recipients who have a forwarding address and (2) a notice of the new address being sent to the sender. If the forwarding authorization has expired, the letter is returned to the sender with the new address attached.

When an address is available but a phone number is not, we will conduct a directory search to obtain a number. For cases in which neither SSA records nor the directory search yields a telephone number, MPR will use alternative locating strategies, including online nationwide databases to verify or update addresses and other information. During the baseline interview, we request the name, address, and telephone number of two people who are likely to know how to contact the sample member in the future. If we lose contact with the sample member, we will contact these individuals to obtain the sample member’s most recent information. At follow-up, if locating contacts are exhausted and no current phone number is available, we will conduct a field search, starting with any available information. This will usually involve contacting the addressee for the beneficiary’s monthly check, who may be the beneficiary or his or her representative payee. If the addressee is not the beneficiary, we would expect that individual to have the contact information we are seeking. Some sources might be reluctant to provide that information; in such an instance, we would ask the source to pass on a written request asking the beneficiary to send us the information on a postage-paid card, to call a toll-free number, or to contact us by email.

When a phone number is available or has been obtained, we will attempt to contact the beneficiary by telephone to conduct the interview, making attempts on different days and at different times. If contact is made and the beneficiary consents to be interviewed, we will conduct the interview using CATI technology. As indicated above, we will make multiple accommodations to increase response and encourage sample members’ participation in the interview. For respondents who are deaf or hard of hearing, we will use amplified telephones, TTY, and Relay technologies. For respondents who speak Spanish, advance materials will be available in Spanish, and a Spanish-language version of the survey instruments will be developed and administered by Spanish-speaking interviewers. Interpretation services will be used for other non-English speakers. For respondents who fatigue easily, we will use structured checkpoints so that interviewers can assess whether a respondent is becoming too fatigued to continue and, if so, schedule a convenient time to complete the interview. A ten-dollar post-paid incentive at baseline and after each follow-up interview will keep sample members engaged over time.

4. Tests of Procedures

The baseline survey instrument has been pretested both in person and over the telephone. To determine whether youth with disabilities could self-respond reliably to this survey, we administered the entire baseline instrument, including both the parent and youth modules, to each selected youth and his or her parent or legal guardian. We then compared the youth’s responses to factual questions with those given by the youth’s parent or guardian. In assessing the youth’s responses, we treated the parental responses as “correct.” Overall, the responses given by youth and their parents or guardians to factual questions had an agreement rate of 72 percent. The agreement rate increased to 80 percent when items in the parent module (which focus on family information, such as household income or parents’ education) were excluded from the comparison. These results indicated that youth could reliably answer the items in the youth module; in fact, for many items, the youth were able to provide more information than their parents. The pretest revealed that some questions in the draft baseline instrument were confusing for both youth and their parents, and we dropped those items from the final version of the instrument. The in-person pretest interviews, in which we administered the same instrument to both youth and parents, took on average 25 minutes for the youth and 33 minutes for the parents or guardians. Based on those results, we modified the baseline instrument and conducted telephone pretests. We have been administering this baseline questionnaire since July 2006.
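
The agreement-rate comparison described above amounts to item-by-item matching of paired responses. The following Python sketch shows one way such a rate could be computed; the data structure, field names, and example values are purely illustrative assumptions, not the actual pretest data.

  def agreement_rate(youth_answers, parent_answers, exclude_items=()):
      """Share of factual items on which the youth's answer matches the parent's.

      Both arguments are dicts mapping item names to responses; parent answers
      are treated as the reference ("correct") values, as in the pretest.
      """
      items = [k for k in parent_answers if k in youth_answers and k not in exclude_items]
      if not items:
          return None
      matches = sum(youth_answers[k] == parent_answers[k] for k in items)
      return matches / len(items)

  # Hypothetical example: excluding a parent-module item (household income) raises
  # the agreement rate, mirroring the pretest pattern (72 percent vs. 80 percent overall).
  youth = {"school_enrolled": "yes", "works_for_pay": "no", "household_income": "unknown"}
  parent = {"school_enrolled": "yes", "works_for_pay": "no", "household_income": "30000"}
  print(round(agreement_rate(youth, parent), 2))                                       # 0.67
  print(round(agreement_rate(youth, parent, exclude_items=("household_income",)), 2))  # 1.0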

To test the 12-month follow-up questionnaire, we conducted nine telephone pretests among youth with disabilities who are currently being served by the Bridges to Work Program, a YTD program that is not part of the random assignment study. Pretests revealed that the youth could not easily distinguish between educational services and transition services delivered by other providers. These questions were modified. The pretest also showed that the original follow-up instrument was too long. We deleted questions of lower priority until we reached an interview that took, on average, 50 minutes to administer.

Most questions in both the baseline and follow-up questionnaires have been used in other studies of youth or persons with disabilities, including the National Longitudinal Transition Survey (NLTS), the National Beneficiary Survey (NBS), the Short Form 12 (SF-12), and the Canadian Youth in Transition Survey (YITS).


To test the in-depth interviews, we conducted seven telephone pretests. Three were with youth with disabilities participating in HRDF’s pilot of YTD; the remaining four were with youth with disabilities participating in CO’s YTD program who had responded to the 12-month follow-up survey. The pretest showed that detailed information on service utilization could be collected via the free-form interviews. More importantly, it showed that the in-depth interview collected additional service utilization data that were not reported on the 12-month follow-up survey. This additional information will lead to more precise cost estimates for the benefit-cost analysis.

5. Statistical Consultants and Persons Collecting and Analyzing the Data

Mathematica Policy Research, Inc. (MPR) is conducting this study, including collecting and analyzing the survey data, under contract to SSA (Contract No. SS00-05-60084). MDRC is a subcontractor to MPR on this study. Thomas Fraker of MPR (202-484-4698) is the project director and has overall responsibility for the project. Anu Rangaragan (609-936-2765) and John Martinez of MDRC (212-340-8690) are the principal investigators. Karen CyBulski (609-936-2797) and Anne Ciemnecki (609-275-2323) direct the data collection effort. Jamie Kendall of SSA (202-358-6448) is the technical Project Officer.



1 Our definition of at-risk youth includes denied child SSI applicants ages 16 to 25 and youth with serious emotional disturbances ages 14 to 17.

2 All seven of the existing projects and the three new ones selected for the random assignment study will be included in a process study of the implementation of YTD. That study will include discussions with project staff and service providers.

