Promoting Readiness of Minors in SSI (PROMISE) Evaluation - Interviews with Program Staff, and Focus Group Discussions

OMB: 0960-0799



Supporting Statement for the PROMISE Evaluation

The Promoting Readiness of Minors in Supplemental Security Income (PROMISE) demonstration pursues positive outcomes for youth with disabilities who receive Supplemental Security Income (SSI) and their families, including reduced dependency on SSI. The Department of Education (ED) awarded six cooperative agreements to states to improve the provision and coordination of services and support for youth with disabilities who receive SSI and their families, with the goal of improving their education and employment outcomes. ED awarded PROMISE funds to five single-state programs and one six-state consortium.1 With support from ED, the Department of Labor (DOL), and the Department of Health and Human Services (HHS), the Social Security Administration (SSA) is evaluating the six PROMISE programs. SSA contracted with Mathematica Policy Research to conduct the evaluation.

Under PROMISE, targeted outcomes for youth include: (1) an enhanced sense of self‑determination; (2) achievement of secondary and post-secondary educational credentials, and attainment of early work experiences culminating in competitive employment in an integrated setting; and (3) long-term reduction in reliance on SSI. Outcomes of interest for families include: (1) heightened expectations for, and support of, the long-term self-sufficiency of their youth; (2) parent or guardian attainment of education and training credentials; and (3) increases in earnings and total income. To achieve these outcomes, the PROMISE projects make better use of existing resources by improving service coordination among multiple state and local agencies and programs.

SSA is requesting clearance for the collection of data needed to evaluate PROMISE. The evaluation provides empirical evidence on the impact of the intervention for youth and their families in several critical areas, including: (1) improved educational attainment; (2) increased employment skills, experience, and earnings; and (3) long-term reduction in use of public benefits. The PROMISE evaluation is based on a rigorous design that entailed the random assignment of approximately 2,000 youth in each of the six programs to treatment or control groups (12,000 youth total). The PROMISE programs provide enhanced services to youth in the treatment groups (and their families), whereas youth and families in the control groups are eligible only for those services already available in their communities independent of the interventions.

The evaluation assesses the effect of PROMISE services on educational attainment, employment, earnings, and reduced receipt of disability payments. The three components of this evaluation include:

  1. The process analysis, which documents program models; assesses the relationships among the partner organizations; documents whether the grantees implement the programs as planned; identifies features of the programs that may account for their impacts on youth and families; and identifies lessons for future programs with similar objectives.

  2. The impact analysis, which determines whether youth and families in the treatment groups receive more services than their counterparts in the control groups. It also determines whether treatment group members have better results than control group members with respect to the targeted outcomes noted above.

  3. The benefit-cost analysis, which assesses whether the benefits of PROMISE, including increases in employment and reductions in benefit receipt, are large enough to justify its costs. The evaluator will conduct this assessment from a range of perspectives, including those of the participants, state and federal governments, SSA, and society as a whole.

Table 1 lists the research questions addressed by each of the components of this evaluation.

Table 1. Evaluation Research Questions and Analysis Components

  • How were the programs designed, implemented, and operated, and what factors contributed to the implementation experience? (Process Analysis)

  • Do PROMISE participants receive more and better transition and supportive services than others? (Process Analysis; Impact Analysis)

  • Are the PROMISE programs successful at achieving intended outcomes? (Impact Analysis)

  • Are the PROMISE programs more effective for some youth and families than others? (Impact Analysis)

  • Which program features are associated with achievement of the goals of the PROMISE initiative? (Process Analysis; Impact Analysis)

  • Are the benefits of PROMISE, including increased employment and earnings and reduced benefit receipt, large enough to justify its costs? (Impact Analysis; Benefit-Cost Analysis)

  • How might programs such as PROMISE be strengthened in the future? (Process Analysis)

SSA has already implemented several data collection efforts for the evaluation. These include: (1) follow-up survey interviews with youth and their parents or guardians 18 months after enrollment; (2) telephone and in-person interviews with local program administrators, program supervisors, and service delivery staff; (3) two rounds of focus groups with participating youth in the treatment group and their parents or guardians; and (4) collection of administrative data.

SSA is now planning follow-up survey interviews with youth and their parents or guardians 60 months after enrollment. At this time, SSA requests clearance for the 60-month survey interviews. The data collected from parents and youth in these surveys will inform the impact analysis and the benefit-cost analysis. This information collection was always part of the plan for the national evaluation; however, it was not included in the evaluation’s prior Office of Management and Budget (OMB) submissions because the approval would have expired before it was needed.



Part A. Justification for the Study

1. Authorizing Laws/Circumstances Making the Collection of Information Necessary

Since 1980, Congress has required SSA to conduct demonstration and research projects to test the effectiveness of possible program changes that could encourage individuals to work and decrease their dependence on disability benefits. In fostering work efforts, SSA intends for this research and the program changes evaluated to produce federal program savings and improve program administration. Section 1110 of the Social Security Act authorizes SSA to conduct research and evaluation projects.

Youth who receive SSI face substantial barriers in making the transition to adult life. In addition to the issues facing all transition-age youth, SSI recipients and their families must consider issues related to the youth's impairment and eligibility for continuing supports, especially cash assistance and medical insurance, as youth move into adulthood. SSI recipients who work and earn income above a certain threshold generally lose $1 of benefits for every $2 of earnings. Upon reaching age 18, child SSI recipients must undergo a redetermination of eligibility based on the adult definition of disability to continue receiving cash assistance. Uncertainty surrounding the outcome of that process may influence youths' decisions to seek education, training, and work skills prior to age 18, as well as their families' support for such investments in human capital (Loprest and Wittenburg 2007). The poor outcomes of child SSI recipients prior to and after age 18 are indicative of the challenges they face moving into adulthood. Nearly one-third of them drop out of high school prior to age 18, and 43 percent have experienced problems in school that resulted in their suspension or expulsion (Hemmeter et al. 2009). Relative to other young adults, former child SSI recipients are more likely to be inactive in employment, school, and service programs; have higher rates of arrest; and have higher school dropout rates after age 18 (Wittenburg 2011; Hemmeter et al. 2009; Loprest and Wittenburg 2007). Approximately two-thirds continue to receive SSI as adults, and only 22 percent work between the ages of 19 and 23 (Loprest and Wittenburg 2007). These poor outcomes may reflect the unique characteristics of these youth, particularly their severe impairments; however, they may also reflect factors associated with their families, such as low incomes, and other characteristics of the service environment.

A growing body of research suggests the importance of families in the employment outcomes of transition-age youth with disabilities. Studies have demonstrated positive associations between the employment outcomes of these youth and the resources of their families, such as income, education, and family structure (Chiang et al. 2012; Emerson 2007; Loprest and Wittenburg 2007; Shattuck et al. 2012). Further evidence suggests that youth with disabilities rely primarily on family networks to find jobs (Hasazi et al. 1985) and that they report family involvement as more important to their success than other transition factors (Powers et al. 2007). Family expectations about employment may be a particularly important determinant of the employment outcomes of transition-age youth with disabilities (Blacher et al. 2010; Carter et al. 2012; Lee and Carter 2012; Lindstrom et al. 2011; Lindstrom et al. 2007; Simonsen and Neubert 2013), potentially more important than income (Carter et al. 2012) or family structure (Lindstrom et al. 2007). Carter et al. (2012) suggest that family expectations are associated with youths' paid employment experiences during school, and so may improve youths' post-school employment outcomes.

The importance of families in youth transitions may be amplified by the weakness of the transition service environment. High school students with disabilities may experience significant gaps in services and lack linkages to adult services. Many do not get information from their schools on how to access needed services. The U.S. Government Accountability Office (GAO) (2006) reports that youth with disabilities and their families often have difficulties identifying and learning how to ask for the accommodations they need to succeed in school and the workplace. Outside systems do not consistently provide these youth with the supports they need to achieve positive adult outcomes, especially in the critical areas of continuing education and employment. For example, only about one-quarter of secondary special education students, ages 17 or 18, have vocational rehabilitation (VR) counselors involved in their transition planning (Cameto et al. 2004). The problem of accessing supports is compounded by a lack of coordination between school- and adult‑based services as youth leave secondary school (Luecking and Certo 2003; U.S. GAO 2006; Wittenburg et al. 2002).

The PROMISE programs are intended to address key limitations in the existing service system for youth with disabilities. By intervening early in the lives of these young people, at ages 14–16, the programs engage the youth and their families well before critical decisions regarding the age-18 redetermination are upon them. We expect the required partnerships among the various state and federal agencies that serve youth with disabilities to result in improved integration of services and fewer dropped handoffs as youth move from one agency to another. And, by requiring the programs to engage and serve families and provide youth with paid work experiences, the initiative is mandating the adoption of critical best practices in promoting the independence of youth with disabilities.

OMB proposed PROMISE as an interagency project between HHS, DOL, ED, and SSA. OMB requested that SSA conduct a rigorous evaluation of the PROMISE programs, focusing on key outcomes of interest, including reductions in SSI payments. The federal partners will use the information the evaluation contractor collects to assess the effectiveness of the interventions that the individual PROMISE programs implement.

2. Purposes and Uses of the Information

The 60-month survey will focus on outcomes that the PROMISE programs might affect, and will collect information that cannot be readily obtained from administrative data files and other sources. The survey will cover long-term outcomes, such as educational attainment; employment; earnings; and non-SSA program benefit receipt. The data the 60-month survey gathers will be critical input to several of the evaluation’s analytic components. Along with data from SSA’s administrative files, the evaluation will use data from the 60-month survey as the basis for the long-term impact analysis. In addition, the evaluator will incorporate the impact estimates into the evaluation’s benefit-cost analysis.

Given their substantial investment in PROMISE and the pressing needs of transition-age SSI youth and their families, the federal sponsors of this initiative are keenly interested in whether and how the PROMISE programs achieve their goals, and whether the benefits of the programs outweigh their costs. To respond to the needs of the program sponsors, we designed the PROMISE evaluation to address the following overarching research questions:

  • How were the programs designed, implemented, and operated, and what factors contributed to the implementation experience?

  • Do PROMISE participants receive more and better transition and supportive services than others?

  • Are the PROMISE programs successful at:

  • Increasing educational attainment?

  • Increasing employment credentials?

  • Improving employment outcomes?

  • Reducing SSI payments?

  • Reducing the use of other public benefits?

  • Increasing total household income?

  • Are the PROMISE programs more effective for some youth and families than others?

  • Which program features are associated with achievement of the goals of PROMISE?

  • Are the benefits of PROMISE, including increased employment and earnings and reduced benefit receipt, large enough to justify its costs?

  • How might programs such as PROMISE be strengthened in the future?

The sections below describe the information we will collect in the parent and youth interviews, as well as its purposes and its uses.

SSA contracted with Mathematica Policy Research to conduct the evaluation of PROMISE and oversee all aspects of the survey administration. Enrollment and random assignment began in April 2014, and continued through April 2016. The evaluation conducted an initial follow-up survey of the approximately 12,000 PROMISE evaluation enrollees (2,000 at each of the six PROMISE programs) and their parents or guardians 18 months after each enrollee’s random assignment date. Mathematica fielded the 18-month survey from November 2015 to March 2018, and obtained an 81 percent response rate. Mathematica will interview the enrollee and parent or guardian again on the 60-month anniversary of their random assignment date, starting in May 2019. Mathematica will conduct the interviews primarily via computer-assisted telephone interviewing (CATI), with field locating and computer-assisted in-person interviewing (CAPI) as necessary (the instruments for telephone and in-person interviewing are shown in Appendices E and F). Mathematica will also mail a self-administered questionnaire to all non-responders shortly before each cohort’s field period closes (the self-administered questionnaires are shown in Appendices C and D).

The 60-month survey will yield information on critical outcomes that either is not available in administrative data at all or is not available for members of the control group. Examples include employment; education; household income sources and amounts; youth self-determination and goals for the future; and youth's knowledge of key SSA program provisions. Although earnings from formal jobs will be available from SSA administrative files, the surveys will collect more current and detailed information about earnings, including wage rates and hours worked in both formal and informal employment. Findings from the Youth Transition Demonstration (YTD) evaluation suggest that information on informal employment may be particularly important for an intervention targeting youth with disabilities. At one YTD site, the program showed a positive and statistically significant impact on any employment (formal or informal) based on survey data, but no significant impact on formal employment based on administrative data (Fraker et al. 2014).

The survey data also eliminates the need to collect the Social Security numbers (SSNs) of all household members for the purpose of identifying these individuals in administrative files. Individuals are often reluctant to provide their SSNs because of security concerns, or may have difficulty providing them for all members of their households. Therefore, a requirement to collect SSNs could have made it more challenging for the PROMISE programs to reach their enrollment targets. The survey data also reduces the number of administrative data sources needed for the evaluation, access to which can be difficult. Identifying and arranging to collect all of the relevant administrative data from the eleven states participating in PROMISE, including data from federal and local programs, would be logistically difficult and potentially result in inconsistent measures across the states. The survey allows us to focus on the key variables of interest and collect them in a consistent manner across the eleven states. Table A1 lists the intended uses of information from the PROMISE parents and youth interviews that will be conducted 60 months after random assignment.

Table A1. Youth and Parent/Guardian Instruments for 60-Month Survey: Modules, Domains, and Measures

Parent instrument

Parent educational credentials and employment experience

  • Education and training: Whether parent/guardian and spouse had any postsecondary degree, certificate, or license; type of highest degree, certificate, or license (bachelor's, associate's, certificate, or license) achieved by parent/guardian and spouse

  • Employment and earnings: For parent/guardian and spouse (if applicable) separately: employment, hours of work, earnings, and access to fringe benefits through paid jobs in past year; current employment; barriers to employment (if not currently employed)

Parent and family well-being

  • Income and program participation: Household income in past year (total and by source); household's current participation in other public-assistance programs

  • Health insurance: Any current health insurance coverage; any current private health insurance coverage; any current public health insurance coverage; and any current coverage through the health insurance exchanges for parent/guardian and spouse (if applicable)

Parent expectations for youth

  • Expectations: Parent's expectations about youth's future education, and employment, residential, and financial independence at age 25

Youth instrument

Youth education and training

  • Secondary and postsecondary education: Current school enrollment status; type of school currently attending; whether currently receiving education accommodations; highest grade completed; high school completion; type of high school credential received; age at high school completion; postsecondary educational attainment, by type of institution or degree; barriers to pursuing further education

  • Training: Currently attending a training program; type of training program currently attending; whether currently receiving training accommodations; receipt of training diploma, certificate, or license in past year

Youth employment-related service receipt and employment experience

  • Employment-related service receipt: Receipt of employment-related transition services (services to prepare for, get, and keep a job; services to continue education beyond high school; services to get accommodations for school, work, or living independently)

  • Employment: Employment in paid and unpaid jobs in the past year; self-employment; how youth found the job(s); employment, hours of work, and earnings in paid jobs in the past year; current employment; types of jobs; employment in integrated setting(s); current receipt of job supports; for unemployed youth, barriers to employment and job-seeking activities

Youth self-determination and expectations for the future

  • Self-determination: Index of self-determination; indices of autonomy, psychological empowerment, self-realization, and agentic action

  • Expectations: Youth's expectations about highest level of schooling and employment, residential and financial independence at age 25

Youth contact with the justice system

  • Arrested or charged: Ever arrested or charged with delinquency or criminal complaint; number of times arrested; whether arrested in past year

  • Conviction and incarceration: Ever convicted of or pled guilty to a charge; ever incarcerated (in jail, prison, or detention home); duration of incarceration

Youth health

  • Health status: Self-assessment of health status

  • Health insurance: Any current health insurance coverage; any current private health insurance coverage; any current public health insurance coverage; any current coverage through the health insurance exchanges

  • Parenthood: Whether ever became a biological parent; age at parenthood

Youth well-being

  • Living arrangement: Currently lives alone or with friends, with family, or in a group home or other institution; currently married or cohabiting; number of people in (independent) youth's household

  • Income and program participation: All youth: knowledge of SSA benefits, work incentives, and wage reporting policies. Independent youth only: income in past year (total and by source); household income in past year; household's current receipt of SSA disability benefits and household's current participation in other public-assistance programs

3. Use of Technology to Reduce Burden

The study will use a combination of mechanical and electronic technology to collect data. The technology selected will provide reliable information while minimizing respondent burden. Examples include the following:

We will use technology to streamline outreach and locating efforts. A sophisticated sample management system will combine updated contact information from multiple sources, including the programs' management information systems, SSA's administrative records, and the results of locating efforts. This streamlined approach ensures that resources are targeted at contacting sample members using the most up-to-date, legally permissible contact information.

Mathematica will conduct interviews in a computer-assisted interviewing (CAI) format, using technology to minimize the burden of navigating complex skip patterns and survey logic. This system also enables streamlined conversion to alternate strategies for interview completion, such as administration in Spanish or interviews completed by proxy respondents. Further, the CAI system will enable interviewers to engage respondents in a dynamic, customized interview, in which relevant follow-up questions are triggered by pre‑loaded sample information as well as responses to items in the interview.

Staff will complete the interview on a tablet device in the field, using the same CAI software program as the telephone interviewers. They will also utilize a secure, web‑based field case management system that Mathematica created to record their contact attempts and transmit production data in real-time. Based on the 18-month survey experience, we anticipate that approximately 23 percent of cases will participate in an in-person field interview.

The study will offer a toll-free telephone number hosted by Mathematica for sample members to call in and complete interviews. SSA will host an informational web page about the PROMISE evaluation on its public website, which sample members can use to learn more about the evaluation and verify the legitimacy of the survey.

4. Efforts to Avoid Duplication

The 60-month survey will provide information that cannot be obtained through SSA’s administrative records and enable us to standardize data collection across all of the PROMISE programs. SSA does not use another collection instrument to obtain similar data. Although some of the PROMISE programs conducted their own surveys at earlier points in time, none is conducting surveys with PROMISE treatment and control group members 60 months after enrollment. Moreover, the PROMISE programs will end shortly after the PROMISE evaluation’s 60-month survey begins. Therefore, the nature of the information collected and the manner in which Mathematica will collect it preclude duplication.

5. Methods to Minimize Burden on Small Entities

There are no small entities involved in the 60-month PROMISE survey.

6. Consequences of Not Collecting Data

The 60-month survey is a one-time collection and is necessary to conduct a credible evaluation. The data we will collect is not available from other sources, and the survey will collect a richer set of information than can be gathered from administrative records. For example, administrative records might include data on earnings from jobs but do not offer details such as rates of pay, hours worked, or whether the job was competitive or supported employment. Because Mathematica will conduct the 60-month survey once, it cannot be conducted less frequently.

7. Special Circumstances

There are no special circumstances that would cause this information collection to be conducted in a manner inconsistent with 5 CFR 1320.5.

8. Federal Register Announcement and Consultation

a. Federal Register Notices

SSA published the 60-day advance Federal Register Notice on July 26, 2018 at 83 FR 35526, and SSA received no public comments. SSA published the second Notice on October 15, 2018 at 83 FR 52042. If SSA receives comments in response to the 30-day Notice, we will forward them to OMB.

b. Consultation with Outside Agencies

As a first step in the PROMISE evaluation, SSA convened a technical advisory panel. The panel provided input on the evaluation criteria and research design. It consisted of researchers and advocates who are experts in youth transition, disability, and evaluation design, including:

  • Burt Barnow, PhD, George Washington University

  • Hugh Berry, U.S. Department of Education

  • Mark Donovan, Marriott Foundation for People with Disabilities

  • David Johnson, PhD, University of Minnesota

  • Jamie Kendall, U.S. Department of Health and Human Services

  • Jeffrey Liebman, PhD, Harvard University

  • Pamela Loprest, PhD, The Urban Institute

An interdisciplinary team of economists, disability policy researchers, survey researchers, and information systems professionals on the staff of the evaluation contractor (Mathematica and its subcontractor, BCT Partners) contributed to the design of the overall evaluation. These individuals include:

  • Karen CyBulski, Mathematica

  • Thomas Fraker, PhD, Mathematica

  • Jacqueline Kauff, Mathematica

  • Gina Livermore, PhD, Mathematica

  • Arif Mamun, PhD, Mathematica

  • Holly Matulewicz, Mathematica

  • Tonya Woodland, BCT Partners

c. Consultation with PROMISE Enrollees

The survey’s target audience comprises youth who enrolled in the PROMISE evaluation and their parents or guardians. They provided direct feedback on the draft instrument through their participation in the pretest in May 2018 (see more information regarding the pretest in Part B). The pretest respondents were a convenience sample of nine youth and nine parents who enrolled in the PROMISE evaluation but are not eligible for the 60-month survey. The questionnaires included in this submission (Appendices E and F) reflect the integration of their feedback.

9. Payments or Gifts

We will offer each survey respondent a base incentive of $30 for completing an interview. We will offer a bonus incentive to sample members who call in to complete an interview within twelve days of their survey cohort’s launch. The bonus incentive will be $10 for sample members with a high propensity to respond, and $20 for sample members with a low propensity to respond. This differential bonus offsets follow-up costs associated with more difficult-to-reach cases by generating completes from early responders who call in to complete an interview and by providing greater motivation for the hardest-to-reach cases to respond. By deploying a differential bonus, we can target resources to sample cases that otherwise are likely to require intensive efforts to locate, contact, or gain cooperation for interviews. Mathematica used a similar bonus incentive for the PROMISE 18-month survey, offering a base incentive of $30 and a $10 bonus to sample members who called in to complete an interview within ten days of their survey cohort’s launch. Based on the experience with the 18-month survey, Mathematica anticipates that 15 to 20 percent of 60‑month survey respondents will receive the bonus incentive.
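For illustration only, the sketch below shows one way a propensity-based bonus rule of this kind could be implemented. The propensity model, the 0.5 cutoff, and the function and field names are hypothetical assumptions for the sketch, not the evaluation's actual procedures.

```python
from datetime import date, timedelta
from typing import Optional

BASE_INCENTIVE = 30          # dollars, offered for every completed interview
BONUS_HIGH_PROPENSITY = 10   # early call-in bonus for likely responders
BONUS_LOW_PROPENSITY = 20    # early call-in bonus for hard-to-reach cases
CALL_IN_WINDOW_DAYS = 12     # bonus window after the survey cohort's launch

def incentive_amount(propensity_score: float,
                     cohort_launch: date,
                     call_in_date: Optional[date],
                     low_propensity_cutoff: float = 0.5) -> int:
    """Return the total incentive for one sample member.

    `propensity_score` is a predicted probability of response (for example,
    from a model fit to 18-month survey paradata); the cutoff is hypothetical.
    """
    total = BASE_INCENTIVE
    called_in_early = (
        call_in_date is not None
        and call_in_date <= cohort_launch + timedelta(days=CALL_IN_WINDOW_DAYS)
    )
    if called_in_early:
        if propensity_score < low_propensity_cutoff:
            total += BONUS_LOW_PROPENSITY
        else:
            total += BONUS_HIGH_PROPENSITY
    return total

# A low-propensity case calling in on day 5 would receive $30 + $20 = $50.
print(incentive_amount(0.35, date(2019, 5, 1), date(2019, 5, 6)))
```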

We will distribute incentives through gift cards, and survey respondents will be offered a choice of a Visa, Target, or Walmart gift card. Gift cards will be mailed to respondents who complete interviews by telephone or on paper, and distributed in-person to respondents who complete interviews with a field interviewer.

10. Assurances of Confidentiality

The information collected during the 60-month survey is protected and held in confidence in accordance with 42 U.S.C. 1306, 20 CFR 401 and 422, 5 U.S.C. 552 (Freedom of Information Act), 5 U.S.C. 552a (Privacy Act of 1974), and OMB Circular No. A-130. The data will be treated in a confidential manner unless otherwise required by law. The data is covered by an SSA Privacy Act System of Records Notice.

The study team takes seriously the ethical and legal obligations associated with the collection of confidential data. The team ensures the secure handling of confidential data via several mechanisms, including obtaining suitability determinations for designated staff; training staff to recognize and handle sensitive data; protecting computer systems from access by staff without favorable suitability determinations; limiting the use of personally identifiable information in data; limiting access to secure data to staff with favorable suitability determinations, on a need-to-know basis; and creating data extract files from which the identifying information is removed.

Mathematica will make clear the assurances and limits of confidentiality in the advance letter mailed to parents and youth for the 60-month survey interviews. The advance letter will also include the Paperwork Reduction and Privacy Act statements (Appendix B).

Interviewers will reiterate the assurance that the gathered information is for research purposes only during the introduction to the youth and parent interviews (Appendices E and F). In addition, Mathematica will not attribute the information survey respondents provide to specific individuals within any public documents.

Mathematica requires subcontractors, consultants, and vendors to establish confidential information safeguards that meet prime contract security requirements. The evaluation project director ensures that Mathematica properly disposes of any confidential information provided to, or generated by, a subcontractor, consultant, or vendor at the completion of the agreement between the parties.

Mathematica will destroy the 60-month survey interviews in a secure manner at the completion of the evaluation.

11. Justification for Sensitive Questions

The purpose of the study is to test the effects of the PROMISE demonstration and an innovative array of enhanced employment and educational services for youth and their families. Therefore, obtaining information about potentially sensitive topics, such as youth’s contact with the criminal justice system, is central to the intervention. Information about contact with the criminal justice system is critical for the impact analysis; a key hypothesis is that PROMISE programs may influence the risk that youth have such contact. The survey will also solicit information about household income from public benefits. Although public benefit receipt may be sensitive for some respondents, these data are needed to inform research questions about the PROMISE programs’ ability to decrease households’ public benefit receipt by increasing paid employment. The surveys will not collect data that can be obtained directly from other sources (for example, information about receipt of disability benefits is collected from SSA administrative records).

12. Estimates of Hours Burden

Table A2 shows the expected number of participants in parent and youth surveys, number of interviews, hours per response, and total response burden overall and by year.

In total, there are 11,324 parents and 11,416 youth eligible for the 60-month survey across the six programs. Assuming a response rate of 80 percent for the 60-month survey interviews, we will conduct a total of 9,059 parent and 9,133 youth interviews.

  • The response burden for each parent interview conducted on the telephone or in the field is estimated to be 32 minutes total, which includes time allocated for reviewing the advance mailing and potentially calling in to book an interview appointment, as well as the time anticipated for completing the interview with a professionally trained interviewer. The self-administered version of the parent interview is expected to take approximately 18 minutes to complete, which includes time spent reading instructions, recording answers to questions, placing the completed form in the pre-paid reply envelope, and placing the envelope in a postal box.

  • The response burden for each youth interview conducted on the telephone or in the field is estimated to be 38 minutes total, which includes time allocated for reviewing the advance mailing and potentially calling in to book an interview appointment, as well as the time anticipated for completing the interview with a professionally trained interviewer. The self-administered version of the youth interview is expected to take approximately 18 minutes to complete, which includes time spent reading instructions, recording answers to questions, placing the completed form in the pre-paid reply envelope, and placing the envelope in a postal box.

The number of expected interviews and the expected response burden per interview yield a total expected burden of 10,515 hours for both parents and youth.
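As a cross-check on these figures, the short sketch below reproduces the burden-hour totals in Table A2 from the interview counts and per-response minutes. The only assumption is the rounding convention: each cell is rounded to whole hours (halves rounding up) before summing.

```python
import math

def round_half_up(x: float) -> int:
    """Round to the nearest whole hour, with halves rounding up (assumed convention)."""
    return math.floor(x + 0.5)

# (respondents, minutes per response) for each mode, by year, from Table A2.
burden_by_year = {
    2019: [(1_095, 32), (1_110, 38), (22, 18), (23, 18)],
    2020: [(5_127, 32), (5_169, 38), (105, 18), (105, 18)],
    2021: [(2_656, 32), (2_671, 38), (54, 18), (55, 18)],
}

grand_total = 0
for year, cells in burden_by_year.items():
    year_hours = sum(round_half_up(n * minutes / 60) for n, minutes in cells)
    grand_total += year_hours
    print(f"{year}: {year_hours:,} burden hours")   # 1,301 / 6,072 / 3,142

print(f"Total: {grand_total:,} burden hours")        # 10,515, matching Table A2
```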

Because the sample will be released on a rolling basis, the total burden will vary by year, in accordance with the number of youth enrolled in PROMISE during the corresponding month in the enrollment period. These assumptions are shown in Table A2.

Table A2. Estimated Total Annual Burden by Respondent Type

Respondent                        Number of     Responses Annually   Total Annual   Minutes per   Total
                                  Respondents   per Respondent       Responses      Response      Hours

2019
Parent Interviews (CATI, CAPI)         1,095            1                 1,095           32         584
Youth Interviews (CATI, CAPI)          1,110            1                 1,110           38         703
Parent Interviews (SAQ)                   22            1                    22           18           7
Youth Interviews (SAQ)                    23            1                    23           18           7
Total, all modes                       2,250                              2,250                    1,301

2020
Parent Interviews (CATI, CAPI)         5,127            1                 5,127           32       2,734
Youth Interviews (CATI, CAPI)          5,169            1                 5,169           38       3,274
Parent Interviews (SAQ)                  105            1                   105           18          32
Youth Interviews (SAQ)                   105            1                   105           18          32
Total, all modes                      10,506                             10,506                    6,072

2021
Parent Interviews (CATI, CAPI)         2,656            1                 2,656           32       1,417
Youth Interviews (CATI, CAPI)          2,671            1                 2,671           38       1,692
Parent Interviews (SAQ)                   54            1                    54           18          16
Youth Interviews (SAQ)                    55            1                    55           18          17
Total, all modes                       5,436                              5,436                    3,142

Total (all years)
Parent Interviews (CATI, CAPI)         8,878            1                 8,878           32       4,735
Youth Interviews (CATI, CAPI)          8,950            1                 8,950           38       5,669
Parent Interviews (SAQ)                  181            1                   181           18          55
Youth Interviews (SAQ)                   183            1                   183           18          56
Grand total, all modes                18,192                             18,192                   10,515


13. Estimates of Cost Burden to Respondents

There are no direct costs to respondents for the 60-month survey interviews, other than their time to participate in the study, as described in A12 above. Respondents will not be asked to maintain any new records. The evaluation contractor will collect and maintain all survey data, and the contractor is responsible for all costs associated with data collection, storage, processing, and other functions related to these data. These costs are summarized below (see section A14) and are considered costs to the Federal government, paid through an SSA contract.

14. Annualized Cost to the Federal Government

The cost to the Federal government (SSA) for conducting the PROMISE 60-month follow‑up survey with parents and youth is $5,974,519. Table A3 below shows the costs by year.

Labor costs were budgeted by estimating the number of hours required of staff at the various wage levels, multiplying those hours by the applicable wage rates, and multiplying the resulting subtotals by factors to cover fringe benefits and burden expense. The basis for estimating other direct costs varies with the type of cost estimated. Labor costs and other direct costs are then summed, multiplied by a factor to cover general and administrative expenses, and combined with the fee. We also included the cost of the contract with Mathematica.
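As a purely illustrative sketch of that buildup, the snippet below applies the same sequence of steps (direct labor, loading factors, other direct costs, general and administrative expense, and fee). Every rate, factor, and dollar amount in it is a hypothetical placeholder, not an actual contract figure.

```python
# Illustrative cost buildup following the structure described above.
# All numeric values below are hypothetical placeholders.
labor_hours = {"survey_director": 800, "interviewer": 12_000, "programmer": 1_500}
wage_rates = {"survey_director": 75.0, "interviewer": 22.0, "programmer": 55.0}

FRINGE_AND_BURDEN = 1.45     # loading factor on direct labor (hypothetical)
GA_FACTOR = 1.10             # general and administrative expense (hypothetical)
FEE_RATE = 0.05              # fee applied to the subtotal (hypothetical)

direct_labor = sum(labor_hours[s] * wage_rates[s] for s in labor_hours)
loaded_labor = direct_labor * FRINGE_AND_BURDEN
other_direct_costs = 250_000.0   # e.g., incentives, postage, travel (hypothetical)

subtotal = (loaded_labor + other_direct_costs) * GA_FACTOR
total_cost = subtotal * (1 + FEE_RATE)
print(f"Estimated total cost: ${total_cost:,.0f}")
```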

Table A3. Annual Costs to the Federal Government

Fiscal Year    Survey Cost
2018           $273,954
2019           $720,086
2020           $2,428,151
2021           $2,552,328
Total          $5,974,519


15. Reasons for Program Changes or Adjustments

The changes in burden stem from removing the surveys we have already completed and adding the new 60-month survey, which we will complete over the next three years. Since the 60‑month survey will be the last survey we administer as part of this project, we do not anticipate any further changes to the burden over the next three years.

16. Plans for Tabulation and Publication of Results

The 60-month survey, for which SSA is requesting clearance in this submission, will begin in May 2019 and continue through September 2021. With the PROMISE evaluation findings, SSA and ED will be able to advise federal policymakers and state administrators on the supports, services, policy, and program changes that could encourage individuals to work and decrease their dependence on disability and other public benefits. In fostering work efforts, the goal is to implement program changes that produce savings to the federal government and improve program administration.

Based on the plan outlined in the PROMISE evaluation design report (Fraker et al. 2014), the evaluator will analyze the information collected in the 60-month survey to prepare a long-term evaluation report that will present findings from the impact and benefit-cost analyses. The impact analysis will investigate the PROMISE demonstration's effects on a wide array of education, earnings, and self-determination outcomes; the amount of payments recipients receive from SSA; and quality of life, both overall and for meaningful subgroups. The proposed methodological approach combines a random assignment design with regression adjustment to improve the precision of the estimates. Because individuals were randomly assigned to the treatment and control groups, the impact analysis will focus on differences in outcomes between these two groups, using a regression framework to control for other explanatory variables. We will use regression-adjusted comparisons of the randomly assigned treatment and control groups to estimate the impact of the intervention on enrollees' education, labor market, and other outcomes, both for the full sample and for subgroups defined by pre-randomization values of age, race, gender, and type of disability.

The exact statistical technique used to estimate regression-adjusted impacts will depend on the nature of the dependent variable. For example, if the dependent variable is continuous, then ordinary least-squares regression produces unbiased estimates of impacts. For binary outcome variables (such as whether the youth is employed), logistic regression models generate consistent and efficient estimates if the parametric assumptions underlying those models are correct. If the dependent variable is a count variable, then we will use an ordered logit model. If the dependent variable is ordinal, the evaluator will first reduce the measure to binary outcomes and then estimate a logit model. To account for the fact that sample members are observed for different lengths of time, the evaluator will also consider using event-history or hazard models for binary outcome measures. These models provide unbiased estimates of program effects on binary outcomes when participants' data is censored.
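For illustration only, the sketch below shows how a regression-adjusted treatment-control comparison of this general kind could be estimated for one continuous and one binary outcome using Python's statsmodels package. The data file, variable names, and covariates are hypothetical assumptions; the evaluation's actual specifications will follow the design report.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed analysis file: one row per evaluation enrollee with, for example,
#   treatment       : 1 if assigned to the PROMISE treatment group, 0 if control
#   earnings        : annual earnings from the 60-month survey (continuous)
#   employed        : 1 if employed in the past year, 0 otherwise (binary)
#   age, female, disability_type : baseline covariates (hypothetical names)
df = pd.read_csv("promise_60_month_analysis_file.csv")  # hypothetical file

# Continuous outcome: OLS with covariate adjustment; the coefficient on
# `treatment` is the regression-adjusted impact estimate.
ols_fit = smf.ols(
    "earnings ~ treatment + age + female + C(disability_type)", data=df
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors
print(ols_fit.params["treatment"], ols_fit.bse["treatment"])

# Binary outcome: logistic regression; report the impact as an average
# marginal effect so it is on the probability scale.
logit_fit = smf.logit(
    "employed ~ treatment + age + female + C(disability_type)", data=df
).fit(disp=False)
print(logit_fit.get_margeff().summary())
```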

The purpose of the benefit-cost analysis is to determine whether the program impacts of the PROMISE demonstration are sufficiently large to justify the costs of providing program services. The results of this analysis will play an integral part in the decision to expand the demonstration to the larger population. The analysis will be based on an accounting framework that summarizes the intervention’s effects and resource use from the perspective of SSA and other key stakeholder groups, including society as a whole.

To ensure the benefit-cost findings are as helpful as possible to SSA, the evaluator plans to present the information in a way that has proven useful for communicating this type of information to the SSA Office of the Actuary and to OMB. First, the evaluator will summarize all of the information based directly on data collected during the demonstration period. The second set of estimates will present the size of future effects (if any) that the program would require to generate benefits that exceed costs, along with an analysis of the likelihood that future effects of that size will occur. In this way, SSA actuaries will be able to see the net value generated during the observation period and then use the more speculative analysis of possible future benefits and costs to draw conclusions about whether the PROMISE programs would ultimately pay for themselves. In addition to using this general presentation format, the evaluator will work with the SSA project officer, who will coordinate with other SSA staff, including actuaries, during the evaluation to ensure that the other assumptions used in the analysis—the discount rate, correction for inflation, and projections about potential productivity growth—are consistent with the ones they are using to assess other potential SSA initiatives. This consistency will go a long way in ensuring that comparisons of the various options are accurate and useful.
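For illustration only, the minimal sketch below shows the kind of net-present-value arithmetic that underlies such an accounting framework. The benefit and cost streams, the stakeholder perspective, and the 3 percent discount rate are hypothetical placeholders, not figures from the evaluation.

```python
# Net-present-value sketch for one stakeholder perspective: annual benefit and
# cost streams (in constant dollars) discounted to the start of the demonstration.
def net_present_value(benefits, costs, discount_rate):
    """Discounted benefits minus discounted costs; year 0 is undiscounted."""
    return sum(
        (b - c) / (1 + discount_rate) ** year
        for year, (b, c) in enumerate(zip(benefits, costs))
    )

# Example: five observed years from the SSA perspective (hypothetical values).
annual_benefits = [0, 50, 120, 200, 260]   # e.g., reduced SSI payments per enrollee
annual_costs = [900, 300, 100, 0, 0]       # e.g., program and evaluation costs

npv = net_present_value(annual_benefits, annual_costs, discount_rate=0.03)
print(f"Net present value per enrollee over the observation period: ${npv:,.2f}")
```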

Table A4 presents the planned timeline for the data collection and the long-term evaluation report, along with the production of restricted access and public use data files.

Table A4. Data Collection and Reporting Schedule

Activity/Report                                 Approximate Dates

Data Collection
  60-month survey                               May 2019 through September 2021

Reports
  Long-Term Evaluation Report                   Summer 2022

Data Files
  Restricted access file for 60-month survey    Summer 2022
  Public use file for 60-month survey           Summer 2022


17. Approval Not to Display Expiration Date for OMB Approval

SSA is not seeking an exemption with this submission. Mathematica will display the OMB expiration date on all survey materials, as shown in the appendices.

18. Explanation of Exceptions

SSA is not requesting an exemption to certification requirements.


1 The six-state consortium goes by the name Achieving Success by Promoting Readiness for Education and Employment (ASPIRE) rather than by PROMISE.


