Supporting Statement - Part A


Promoting Readiness of Minors in SSI (PROMISE) Evaluation - Interviews with Program Staff and Focus Group Discussions

OMB: 0960-0799



Contents

Supporting Statement for the PROMISE Evaluation

Part A. Justification for the Study

A1. Circumstances Making the Collection of Information Necessary

A2. Purposes and Uses of the Information

A3. Use of Technology to Reduce Burden

A4. Efforts to Avoid Duplication

A5. Methods to Minimize Burden on Small Entities

A6. Consequences of Not Collecting Data

A7. Special Circumstances

A8. Federal Register Announcement and Consultation

A9. Payments or Gifts

A10. Assurances of Confidentiality

A11. Justification of Sensitive Information

A12. Estimates of Hours Burden

A13. Estimates of Cost Burden to Respondents

A14. Annualized Cost to the Federal Government

A15. Reasons for Program Changes or Adjustments

A16. Plans for Tabulation and Publication of Results

A17. Approval Not to Display the Expiration Date for OMB Approval

A18. Explanation of Exceptions

Part B. Collection of Information Employing Statistical Methods

B1. Respondent Universe and Sampling Methods

B2. Procedures for the Collection of Information

B3. Methods to Maximize Response Rates and Deal with Nonresponse

B4. Tests of Procedures or Methods to Be Undertaken

B5. Individuals Consulted on Statistical Aspects of the Design and on Collecting and/or Analyzing Data

References


Attachment A: Section 1110 [42 U.S.C. 1310] Legislation

Attachment B: Staff Interview Topics

Attachment C: Social Network Surveys

Attachment D: Focus Group Protocols

Attachment E: Focus Group Invitation Letters

Attachment F: Focus Group Phone Recruitment Script

Attachment G: Focus Group Reminder Letter

Attachment H: 18-Month Youth Survey Instrument

Attachment I: 18-Month Parent Survey Instrument

Attachment J: Survey Invitation Letters

Attachment K: Staff Activity Log

Supporting Statement for the PROMISE Evaluation

OMB No. 0960-0799

A. Justification

Introduction

The Promoting Readiness of Minors in SSI (PROMISE) demonstration pursues positive outcomes for children with disabilities who receive Supplemental Security Income (SSI) and their families, including reduced dependency on SSI. The Department of Education (ED) awarded six cooperative agreements to states to improve the provision and coordination of services and support for children with disabilities who receive SSI and their families, in order to achieve improved education and employment outcomes. ED awarded PROMISE funds to five single-state projects and one six-state consortium.1 With support from ED, the Department of Labor (DOL), and the Department of Health and Human Services (HHS), the Social Security Administration (SSA) is evaluating the six PROMISE projects. SSA contracted with Mathematica Policy Research to conduct the evaluation.


Under PROMISE, targeted outcomes for youth include: (1) an enhanced sense of self-determination; (2) achievement of secondary and postsecondary educational credentials, and attainment of early work experiences culminating in competitive employment in an integrated setting; and (3) long-term reduction in reliance on SSI. Outcomes of interest for families include: (1) heightened expectations for, and support of, the long-term self-sufficiency of their youth; (2) parent or guardian attainment of education and training credentials; and (3) increases in earnings and total income. To achieve these outcomes, we expect the PROMISE projects to make better use of existing resources by improving service coordination among multiple state and local agencies and programs.


SSA is requesting clearance for the collection of data needed to implement and evaluate PROMISE. The evaluation provides empirical evidence on the impact of the intervention for youth and their families in several critical areas, including: (1) improved educational attainment; (2) increased employment skills, experience, and earnings; and (3) long-term reduction in use of public benefits. We base the PROMISE evaluation on a rigorous design that entails the random assignment of approximately 2,000 youth in each of the six projects to treatment or control groups (12,000 youth total). The PROMISE projects provide enhanced services for youth in the treatment groups, whereas youth in the control groups are eligible only for those services already available in their communities independent of the interventions.
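
To make the design concrete, below is a minimal sketch, in Python, of how a 50/50 random assignment at a single site might be carried out. The function name and fixed seed are illustrative assumptions chosen for reproducibility, not the evaluation's actual assignment procedure, which may, for example, assign youth on a rolling basis as they enroll.

    import random

    def assign_enrollees(enrollee_ids, seed=2014):
        """Randomly split one site's enrollees into treatment and control.

        A minimal 50/50 illustration; the actual PROMISE procedure may
        differ (for example, rolling assignment as youth enroll).
        """
        rng = random.Random(seed)  # fixed seed makes the split reproducible
        ids = list(enrollee_ids)
        rng.shuffle(ids)
        half = len(ids) // 2
        return {"treatment": ids[:half], "control": ids[half:]}

    # Roughly 2,000 enrollees at one site, as described above
    groups = assign_enrollees(range(1, 2001))
    print(len(groups["treatment"]), len(groups["control"]))  # 1000 1000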


The evaluation assesses the effect of PROMISE services on educational attainment, employment, earnings, and receipt of disability payments. The evaluation has three components:

  • The process analysis, which documents program models; assesses the relationships among the partner organizations; documents whether the grantees implement the programs as planned; identifies features of the programs that may account for their impacts on youth and families; and identifies lessons for future programs with similar objectives.

  • The impact analysis, which determines whether youth and families in the treatment groups receive more services than their counterparts in the control groups, and whether treatment group members have better results than control group members with respect to the targeted outcomes noted above (a minimal numerical sketch of this comparison follows this list).

  • The benefit-cost analysis, which assesses whether the benefits of PROMISE, including increases in employment and reductions in benefit receipt, are large enough to justify its costs. We conduct this assessment from a range of perspectives, including those of the participants, state and federal governments, SSA, and society as a whole.
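
As a minimal numerical sketch of the impact analysis logic under random assignment, the following Python fragment computes a treatment-control difference in means with a large-sample standard error. The outcome data shown are hypothetical, and the evaluation's actual estimators may add regression adjustment for baseline characteristics.

    from math import sqrt
    from statistics import mean, stdev

    def impact_estimate(treatment, control):
        """Treatment-control difference in mean outcomes.

        Under random assignment this is an unbiased estimate of the
        average effect of access to PROMISE services; returns the
        estimate and a large-sample standard error.
        """
        diff = mean(treatment) - mean(control)
        se = sqrt(stdev(treatment) ** 2 / len(treatment)
                  + stdev(control) ** 2 / len(control))
        return diff, se

    # Hypothetical indicator of employment at follow-up (1 = employed)
    treat = [1, 0, 1, 1, 0, 1, 0, 1]
    ctrl = [0, 0, 1, 0, 1, 0, 0, 1]
    est, se = impact_estimate(treat, ctrl)
    print(f"impact = {est:.3f} (SE {se:.3f})")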


SSA is planning and implementing several data collection efforts for the evaluation. These include: (1) follow-up interviews with youth and their parents or guardians 18 months and five years after enrollment; (2) phone and in-person interviews with local program administrators, program supervisors, and service delivery staff at two points in time over the course of the demonstration; (3) two rounds of focus groups with participating youth in the treatment group; (4) two rounds of focus groups with parents or guardians of participating youth; and (5) collection of administrative data.


Research Questions and the Evaluation Components That Address Them

  • How were the programs designed, implemented, and operated, and what factors contributed to the implementation experience? (Process analysis)

  • Do PROMISE participants receive more and better transition and supportive services than others? (Process and impact analyses)

  • Are the PROMISE programs successful at achieving intended outcomes? (Impact analysis)

  • Are the PROMISE programs more effective for some youth and families than others? (Impact analysis)

  • Which program features are associated with achievement of the goals of the PROMISE initiative? (Process and impact analyses)

  • Are the benefits of PROMISE, including increased employment and earnings and reduced benefit receipt, large enough to justify its costs? (Impact and benefit-cost analyses)

  • How might programs such as PROMISE be strengthened in the future? (Process analysis)

At this time, SSA requests clearance for the collection of staff activity logs from program staff. SSA will request clearance for the five-year survey interviews in a future submission. SSA received clearance to conduct interviews with program staff and focus group discussions with youth and parents/guardians on July 14, 2014, and to conduct 18‑month surveys with youth and parents/guardians on July 30, 2015.


The staff activity logs we collect from program staff will inform the benefit-cost analysis.


A1. Authorizing Laws/Circumstances Making the Collection of Information Necessary

Since 1980, Congress has required SSA to conduct demonstration and research projects to test the effectiveness of possible program changes that could encourage individuals to work and decrease their dependence on disability benefits. SSA intends for this research, and the program changes it evaluates, to foster work efforts, produce federal program savings, and improve program administration. Section 1110 of the Social Security Act (Attachment A) authorizes SSA to conduct research and evaluation projects.


Youth who receive SSI face substantial barriers in making the transition to adult life. In addition to the issues facing all transition-age youth, SSI recipients and their families must consider issues related to their impairment and eligibility for continuing supports, especially cash assistance and medical insurance, as they move into young adulthood. SSI recipients who work and earn income above a certain threshold generally lose $1 of benefits for every $2 of earnings. Upon reaching age 18, child SSI recipients must undergo a redetermination of eligibility based on the adult definition of disability to continue receiving cash assistance. Uncertainty surrounding the outcome of that process may influence the decisions by youth to seek education, training, and work skills prior to age 18, as well as the support of families for their investment in human capital (Loprest and Wittenburg 2007). The poor outcomes of child SSI recipients prior to and after age 18 are indicative of the challenges they face moving into adulthood. Nearly one-third of them drop out of high school prior to age 18, and 43 percent have had problems in school that resulted in their suspension or expulsion (Hemmeter et al. 2009). Relative to other young adults, former child SSI recipients are more likely after age 18 to be inactive in employment, school, and service programs, and they have higher rates of arrest and school dropout (Wittenburg 2011; Hemmeter et al. 2009; Loprest and Wittenburg 2007). Approximately two-thirds continue to receive SSI as adults, and only 22 percent work between the ages of 19 and 23 (Loprest and Wittenburg 2007). These poor outcomes may reflect the unique characteristics of these youth, particularly their severe impairments; however, they may also reflect factors associated with their families, such as low incomes, and other characteristics of the service environment.


A growing body of research suggests the importance of families in the employment outcomes of transition-age youth with disabilities. Previous studies demonstrated positive associations between the employment outcomes of these youth and the resources of their families, such as income, education, and family structure (Chiang et al. 2012; Emerson 2007; Loprest and Wittenburg 2007; Shattuck et al. 2012). Further evidence suggests that youth with disabilities rely primarily on family networks to find jobs (Hasazi et al. 1985), and they report family involvement as more important to their success than other transition factors (Powers et al. 2007). Family expectations about employment may be a particularly important determinant of the employment outcomes of transition-age youth with disabilities (Blacher et al. 2010; Carter et al. 2012; Lee and Carter 2012; Lindstrom et al. 2011; Lindstrom et al. 2007; Simonsen and Neubert 2013), potentially more important than income (Carter et al. 2012) or family structure (Lindstrom et al. 2007). Carter et al. (2012) suggest that family expectations are associated with youths' paid employment experiences during school, and so may improve youths' post-school employment outcomes.

The importance of families in youth transitions may be amplified by the weakness of the transition service environment. High school students with disabilities may experience significant gaps in services and lack linkages to adult services. Many do not get information from their schools on how to access needed services. The U.S. Government Accountability Office (GAO) (2006) reports that youth with disabilities and their families often have difficulty identifying and learning how to ask for the accommodations they need to succeed in school and the workplace. Outside systems do not consistently provide these youth with the supports they need to achieve positive adult outcomes, especially in the critical areas of continuing education and employment. For example, only about one-quarter of secondary special education students ages 17 or 18 have vocational rehabilitation (VR) counselors involved in their transition planning (Cameto et al. 2004). The problem of accessing supports is compounded by a lack of coordination between school- and adult-based services as youth leave secondary school (Luecking and Certo 2003; U.S. GAO 2006; Wittenburg et al. 2002).


We intend the PROMISE projects to address key limitations in the existing service system for youth with disabilities. By intervening early in the lives of these young people, at ages 14–16, the projects will engage the youth and their families well before critical decisions regarding the age 18 redetermination are upon them. We expect the required partnerships among the various state and federal agencies that serve youth with disabilities to result in improved integration of services and fewer dropped handoffs as youth move from one agency to another. And, by requiring the programs to engage and serve families and provide youth with paid work experiences, the initiative is mandating the adoption of critical best practices in promoting the independence of youth with disabilities.


The Office of Management and Budget (OMB) proposed PROMISE as an interagency project between HHS, DOL, ED, and SSA. OMB requested that SSA conduct a rigorous evaluation of the PROMISE projects, focusing on key outcomes of interest, including reductions in SSI payments. We use the information the evaluation contractor collects to assess the effectiveness of the interventions that the individual PROMISE projects implement.


A2. Purposes and Uses of the Information


1. Project Staff Interviews and Focus Groups with Youth Participants and Parents/Guardians

For each of the PROMISE projects, the evaluator conducts interviews with project staff and separate focus groups with youth participating in PROMISE and with their parents or guardians. SSA uses information from these activities to conduct a process analysis of PROMISE implementation, addressing the following questions:


  • What were the PROMISE interventions like in practice, from the perspective of the PROMISE grantees, partner and project staff, and participants?

  • What factors contributed to the PROMISE project design and the implementation experience observed across the PROMISE projects? What did it take to implement the program?

  • Given what we learned about program impacts, what are the implications of the program implementation experience? What can we do better? What successes should we enhance, and what problems can we avoid?


The sections below describe the specific nature, purposes, and uses of the staff interviews and participant focus groups:


a. Staff Interviews: SSA uses information from the PROMISE project staff interviews to: (1) document the projects, the environments in which they are implemented, and the nature of existing services to youth and their families; (2) describe the interagency and other partnerships developed to implement PROMISE; and (3) assess the extent to which the projects adhered to their intended service delivery models. Specific issues we address under each of these topics include:

  • Documenting the program. What is the basic structure and logic model for each program? What is the service environment for program operations? How did grantees conduct participant outreach and enrollment? What are each program’s staffing structure and services? How were services implemented? How do grantees track participation? What do treatment families, program staff, and partners think of program services? What are the grantees’ plans and objectives for their own evaluations?

  • Partnership development, maintenance, and roles. How were potential partners identified and approached to participate in PROMISE? Who are the major and secondary partners? What are their roles? What is the nature of the relationships among the partner organizations? How do the partners communicate and collaborate, and how has this changed over time? What are the contractual or other forms of agreement between the grantee and its partners, and between the partners and service providers? To what extent do agreements and other arrangements encourage the partners to work toward demonstration goals? To what extent do they discourage them from doing so?

  • Fidelity of activities to program model. How closely do the programs adhere to their plans and logic models? In what ways do they use their logic models to guide services and track and manage inputs, outputs, and outcomes? How consistently are the models implemented at local sites? How do programs collect operations and service information and use it for management and evaluation purposes?


The evaluator will conduct two rounds of site visits to interview PROMISE project and partner agency staff in person. We originally planned to complete the first round of visits in 2014; however, due to the PROMISE projects’ delayed enrollment, that round began in the fall of 2014 and will continue through the summer of 2015. The second round will occur during the spring, summer, and fall of 2016. The evaluator conducts the interviews with directors and administrators of the PROMISE project, and of the state agencies and other community partners participating in PROMISE; and with PROMISE project staff responsible for arranging and delivering PROMISE services to participants. For the five single-state projects, the evaluator interviews an average of 10 program directors and managers, and 20 service provider staff members at each project during the site visits. For the six-state consortium project, the evaluator interviews a total of 25 administrators and 45 service provider staff members during the 2014, 2015, and 2016 data collections. Some of these interviews occur during site visits and some by phone. Examples of program directors and managers include the PROMISE project director and principal investigators, administrators of state government agencies that participate in PROMISE, and executive directors of non-governmental or community-based organizations that provide services to PROMISE participants and other youth or adults with disabilities. Examples of PROMISE project staff include recruiters, case managers, employment specialists, benefits counselors, vocational rehabilitation counselors, and educational instructors and coordinators. We show the topics the evaluator addresses during these semi-structured interviews in Attachment B.


During the staff interviews, the evaluator asks interviewees to complete a brief social network questionnaire. The evaluator administers separate versions of the questionnaire to program managers or directors and to project staff, tailored to their specific perspectives (as shown in Attachment C). We designed this brief questionnaire to assess the strength and capacity of organizational collaborations associated with PROMISE. We pre-fill the questionnaire with the names of organizations collaborating with each PROMISE project that are known to the evaluation team prior to the site visit. The respondents may add organizations to the form as needed. We use the results to conduct a network analysis to examine whether and to what extent stakeholders interacted with one another before the implementation of PROMISE, and whether and how their interactions change with the implementation of PROMISE. The analysis also provides a means of examining which stakeholders are relatively more active participants in the PROMISE collaborative. We may also use the data to create independent variables for use in multivariate analyses to investigate the extent to which communication and collaboration between PROMISE stakeholders is associated with program effects.
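
To illustrate the kind of network analysis described above, the sketch below builds a graph from hypothetical questionnaire responses and ranks organizations by degree centrality, one common measure of how active a stakeholder is in a collaborative. The organization names and the use of the open-source networkx library are assumptions for illustration only; the study design does not specify particular software or measures.

    import networkx as nx

    # Hypothetical responses: (respondent's organization, partner named
    # on the questionnaire, reported interaction frequency)
    reported_ties = [
        ("PROMISE Project", "State VR Agency", 3),
        ("PROMISE Project", "School District", 2),
        ("State VR Agency", "School District", 1),
        ("PROMISE Project", "Community Provider", 2),
    ]

    G = nx.Graph()
    for org_a, org_b, frequency in reported_ties:
        G.add_edge(org_a, org_b, weight=frequency)

    # Degree centrality flags the stakeholders that are relatively more
    # active participants in the PROMISE collaborative.
    for org, score in sorted(nx.degree_centrality(G).items(),
                             key=lambda item: -item[1]):
        print(f"{org}: {score:.2f}")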


b. Participant Focus Groups: The evaluator uses the focus group data to describe the experiences of parents or guardians and youth enrolled in PROMISE and to supplement other data collected and used in the process analysis. Information collected in the focus groups supports analyses related to two key questions:


  • How are the PROMISE projects being implemented and operated?

  • What are the short-term impacts of the projects on youth and their parents or guardians?


To address these questions, evaluators convene focus group discussions that concentrate on key areas of interest for each group (youth and parents or guardians). For youth, these topics include: (1) program enrollment; (2) case management; (3) education services; (4) employment preparation and support; and (5) other program services. For the parents or guardians, key topic areas include: (1) program enrollment; (2) services for youth; (3) services to promote parent or guardian involvement; (4) staff and peer support for parents or guardians; and (5) services for parents or guardians. We use the findings to assess client satisfaction with the demonstration; identify which aspects of the demonstration may be more or less associated with participation outcomes; identify potential improvements to the demonstration approaches; and provide feedback to the PROMISE projects.


During the site visits, the evaluator conducts a pair of focus groups for each of the five single-state projects. This includes one group with youth enrolled in PROMISE and one with their parents or guardians. For the six-state consortium project, we conduct three pairs of focus groups during each round. We conduct the youth and parent or guardian focus groups separately, but concurrently. This ensures we represent the perspectives of both groups, which are the focus of PROMISE services. The evaluator conducts the groups in English only, but makes accommodations to facilitate the full inclusion of people with disabilities. We anticipate the evaluator will follow the same procedures for the site visits during 2016.


The evaluator conducted the recruitment efforts and moderated the group discussions using standard protocols (Attachment D) to structure the 90-minute discussions and encourage participation. The evaluator worked with local PROMISE project staff to identify 50 youth (and their parents or guardians) at each project who were interested in taking part in the groups. We contacted up to 50 treatment group families to recruit approximately 12 individuals for each 90-minute focus group (expecting that approximately 10 would participate on the day of the group). Two to four weeks before the focus group, we sent an invitation letter (or email) to the parent or guardian of each household identified by the project staff (Attachment E). We anticipated the need for telephone follow-up to secure the target number of participants. Trained evaluator staff followed up with these family members; explained the purpose of the session; answered any questions and responded to any concerns; and invited them to participate in the focus group discussions. Solicitation calls for each focus group continued until 12 youth and 12 parents or guardians agreed to participate. Staff used a recruitment script (Attachment F) to describe the purpose of the focus group and ask the parent or guardian and the youth to participate. We assured all of the parents or guardians and youth contacted that participation in these focus groups is voluntary and will not affect their eligibility for SSI or any other benefits they receive, either now or in the future. One week before the focus group, evaluators sent a reminder to each individual who agreed to participate (Attachment G), along with directions to and a map of the focus group location. The day before the focus group, evaluators called participants to remind them of the focus group date, time, and location.


2. 18-Month and 5-Year Survey Interviews

The follow-up surveys will focus on outcomes that the PROMISE programs might affect and that we cannot readily obtain from administrative data files and other sources. The 18‑month survey will cover short-term outcomes, such as: (1) the receipt of services, (2) parental expectations, (3) self-determination, (4) educational progress, and (5) work‑based experiences. The five-year survey will cover long-term outcomes, such as high school graduation, employment, and economic well-being. For each survey, we will develop two instruments: one for the youth enrollees (Attachment H) and another for their parents or guardians (Attachment I).2 We will prepare English and Spanish translations of these instruments. When other languages are necessary, qualified bilingual interviewers will interpret questions.

The surveys will capture both intermediate outcomes, such as the receipt of services, and longer-term outcomes, such as educational attainment, employment, earnings, and benefit receipt. Rather than being self-administered, the instruments will be administered to respondents by interviewers.

The data the follow-up surveys gather will be critical input to several of the evaluation’s analytic components. We will use data from the 18-month survey on treatment group members’ satisfaction with PROMISE services to supplement earlier findings from the process analysis of program implementation. We will use data from the 18-month survey for both treatment and control group members as the primary basis for the analysis of program impacts on the receipt of services and other short-term outcomes. Along with data from SSA’s administrative files, we will use data from the five-year survey as the basis for the long-term impact analysis. In addition, we will incorporate the impact estimates into the evaluation’s benefit-cost analysis.

Given their substantial investment in PROMISE and the pressing needs of transition-age SSI youth and their families, the federal sponsors of this initiative are keenly interested in whether and how the PROMISE programs achieve their goals, and whether the benefits of the programs outweigh their costs. To respond to the needs of the program sponsors, we designed the PROMISE evaluation with the following overarching research questions in mind:

  • How were the programs designed, implemented, and operated, and what factors contributed to the implementation experience?

  • Do PROMISE participants receive more and better transition and supportive services than others?

  • Are the PROMISE programs successful at:

        • Increasing educational attainment?

        • Increasing employment credentials?

        • Improving employment outcomes?

        • Reducing SSI payments?

        • Reducing the use of other public benefits?

        • Increasing total household income?

  • Are the PROMISE programs more effective for some youth and families than others?

  • Which program features are associated with achievement of the goals of PROMISE?

  • Are the benefits of PROMISE, including increased employment and earnings and reduced benefit receipt, large enough to justify its costs?

  • How might programs such as PROMISE be strengthened in the future?

The sections below describe the information we will collect in the parent and youth surveys, as well as its purposes and its uses.

SSA contracted with Mathematica Policy Research to conduct the evaluation and oversee all aspects of the survey administration. The evaluation team will survey the approximately 12,000 PROMISE demonstration enrollees (2,000 at each of the six study sites) and their parents or guardians at two points in time following their enrollment and random assignment in PROMISE. We anticipate that enrollment and random assignment, which began in April 2014, will continue through April 2016. The first survey will take place 18 months after an individual’s random assignment date. We will survey the enrollee and parent or guardian again on the five-year anniversary of their random assignment. We will conduct these interviews primarily via computer-assisted telephone interviewing (CATI), with field locating and computer-assisted in-person interviewing (CAPI) as necessary. Based on pretest results, we anticipate the parent interview will take 35 minutes to complete, on average, and the youth interview 25 minutes, on average.

The surveys will yield information on critical outcomes that administrative data do not capture at all, or do not capture for members of the control group. Examples include measures of job quality, parental expectations, household income sources and amounts, youth self-determination, receipt of services, and participant satisfaction with PROMISE services. Although earnings from formal jobs will be available from SSA administrative files, the surveys will collect more current and detailed information about earnings, including wage rates and hours worked in both formal and informal employment. Findings from the Youth Transition Demonstration (YTD) evaluation suggest that information on informal employment may be particularly important for an intervention targeting youth with disabilities. At one YTD site, the program showed a positive and statistically significant impact on any employment (formal or informal) based on survey data, but no significant impact on formal employment based on administrative data (Fraker et al. 2014).

The survey data also eliminate the need to collect the Social Security numbers (SSNs) of all household members for the purpose of identifying these individuals in administrative files. Individuals are often reluctant to provide their SSNs because of security concerns, or may have difficulty providing them for all members of their households. Therefore, a requirement to collect SSNs could make it more challenging for the PROMISE programs to reach their enrollment targets. The surveys also reduce the number of administrative data sources needed for the evaluation, access to which can be difficult to arrange. Identifying and arranging to collect all of the relevant administrative data from the eleven states participating in PROMISE, including data from federal and local programs, would be logistically difficult and could result in inconsistent measures across the states. The survey allows us to focus on the key variables of interest and collect them in a consistent manner across the eleven states.

Table A1 lists the intended uses of information from the PROMISE surveys of parents and youth conducted 18 months after random assignment.


Table A1. Youth and Parent/Guardian Instruments for 18-Month Survey: Domains and Measures of Interest

Education

  • Youth secondary education: school enrollment status; type of school attended; intensity of educational activity; 504/Individualized Educational Plan (IEP) status; grade completion; high school completion; type of diploma; receipt of a General Educational Development (GED) credential

  • Youth postsecondary education: postsecondary school enrollment type (degree or certificate program) and completion, by type of institution

  • Parent’s or guardian’s and spouse’s education: secondary school completion (diploma, GED); any postsecondary education; any postsecondary degree, certificate, or license; type of highest degree, certificate, or license achieved

Youth Employment Credentials

  • Youth’s work-based experience: job shadowing; apprenticeship/internship; participation in skills training, by type (basic skills training, computer classes, problem-solving training, and social skills training); overall work-based experience

Employment

  • Youth’s employment experience: employment in paid and unpaid jobs; hours of work; earnings; employment status at the time of the survey

  • Parent’s or guardian’s and spouse’s employment and earnings: each parent or guardian’s employment and tenure in paid jobs; hours of work; earnings; employment in jobs with benefits

Service Receipt

  • Youth transition services: receipt of transition services, by type (education, employment, benefits counseling, financial literacy, other non-employment, case management) and overall; extent of services used; unmet service needs; types of service providers used

  • Parent’s or guardian’s and spouse’s training and information: receipt of family support services, by type (outreach, training, employment, information) and overall; extent of services used; unmet service needs; types of service providers used

Youth Health

  • Health status: self-assessment of health status; functional limitations

  • Health insurance: any private or public health insurance coverage; number of other household members not covered by insurance and their relationship to the youth; source of private insurance; use of tax credits to defray costs of private insurance

Self-Determination and Expectations

  • Self-determination: index of self-determination; sub-indices of autonomy, psychological empowerment, and self-realization

  • Expectations: youth’s expectations about future education and employment; parent’s or guardian’s expectations about youth’s performance of household chores and about youth’s future education, employment, and independence; youth’s perceived barriers to work

Youth Risky Behavior

  • Substance use: use of tobacco, alcohol, marijuana, and illegal drugs

  • School discipline: suspensions and expulsions since the random assignment date

Individual and Family Well-Being

  • Income: parent’s or guardian’s and spouse’s income; youth income; household income

  • Program participation: participation and benefits in SSA disability programs; participation in other public assistance programs; connection to adult services

  • Living arrangement: lives alone or with friends, with family, or in a group home or other institution; married or cohabiting


3. Staff Activity Logs

The staff activity logs (Attachment K) provide data on aspects of service delivery that we cannot readily obtain from administrative data files and other sources. The logs will capture staff’s daily time spent on activities that are core components of the PROMISE model: case management, career and work-based learning experiences, education- and school-related services, benefits counseling and financial literacy training, youth empowerment, and parent training and information. The logs will also include categories for program administration, as well as work leave and other program activities outside the above categories. This information will be useful for the benefit-cost analysis, enabling us to allocate program costs across the various components. Such information will be helpful for understanding the level of resources PROMISE programs allocate to each component, which could inform those interested in replicating a specific program and aid in interpreting program impacts.
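
To illustrate how the logs could feed the benefit-cost analysis, here is a minimal sketch under assumed data: hypothetical minutes logged by one staff member and a hypothetical labor cost for the logged period, allocated across PROMISE components in proportion to time shares. The evaluation's actual cost-allocation method is not specified here.

    # Hypothetical minutes one staff member logged across PROMISE components
    logged_minutes = {
        "case management": 900,
        "career and work-based learning experiences": 600,
        "education- and school-related services": 450,
        "benefits counseling and financial literacy": 300,
        "youth empowerment": 150,
        "parent training and information": 300,
        "program administration": 300,
    }

    staff_cost = 4000.00  # hypothetical labor cost for the logged period
    total_minutes = sum(logged_minutes.values())

    # Allocate the staff member's cost to each component in proportion
    # to the share of logged time it accounts for.
    for component, minutes in logged_minutes.items():
        share = minutes / total_minutes
        print(f"{component}: {share:.1%} of time, ${staff_cost * share:,.2f}")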


Data from the staff activity logs will answer the following research questions:

  • How does a program allocate resources across PROMISE components?

  • How does actual program allocation align with the program’s model of service delivery?

  • What level of effort does a program allocate to program management versus program services?

  • How do specific types of staff differ in how they spend their time on program management and service delivery?

To answer these questions, we will collect staff activity logs from selected staff for two one-week periods around the time of the second evaluation site visit (during summer and fall 2016). The one-week periods will represent typical work weeks for staff, avoiding weeks with atypical training or conferences, though each collection period may coincide with a different programmatic focus (such as work experiences in the summer and school-related services in the fall). We expect to ask 25 to 35 staff from each program to complete the logs, depending on the number of staff and the different staff categories involved in delivering substantive services. Individuals selected to complete the logs will include both administrative and direct service staff, and might include some subcontractors whose primary roles with their organizations involve PROMISE service delivery.


A3. Use of Technology to Reduce Burden


1. Interviews and Focus Groups

To the extent possible, we send invitations to and reminders about the focus groups via email. We record staff interviews and focus group sessions digitally. Because the social network survey consists of six or fewer questions, and because the respondents are the same individuals who participate in the in-person interviews during site visits, it is most practical and least burdensome to collect the data on hard copy during the in-person interview. We designed the questionnaire to be self-administered, and members of the evaluation team are present to answer any questions respondents might have about it. Transmission of the social network questionnaire to and from phone interviewees occurs via fax or email.


2. 18-Month Survey Interviews

The study will use a combination of technologies to collect data, selected to provide reliable information while minimizing respondent burden. Examples include the following:

We will use technology to streamline outreach and locating efforts. Using a sample management system, we will combine updated contact information from multiple sources, including the programs’ management information systems, SSA’s administrative records, and the results of locating efforts. This streamlined effort ensures we target our resources for contacting sample members using the most up-to-date, legally permissible contact information.


We will conduct interviews in a computer-assisted interviewing (CAI) format, using technology to minimize the burden of navigating complex skip patterns and survey logic. This system also enables streamlined conversion to alternate strategies for interview completion, such as administration in Spanish or interviews completed by proxy respondents. Further, the CAI system will enable interviewers to engage respondents in a dynamic, customized interview, in which follow-up questions are driven by pre-loaded sample information as well as responses to items in the interview.


We anticipate approximately 20 percent of cases will participate in an in-person field interview. Staff will complete the interview on a tablet device, using the same CAI software as the telephone interviewers. They will also use a secure, web-based field case management system (SMARTFIELD), which Mathematica created, to record their contact attempts and transmit production data in real time.


To support inclusion of youth and parents with disabilities who would not be able to participate by telephone, we will use secure instant messenger software to exchange interview questions and responses with those respondents. The interviewer will copy and paste the applicable items into the messenger system and record the replies in the CAI system.


The study will offer an informational website, a toll-free telephone number, and an email address, all of which Mathematica will host.


3. Staff Activity Logs

Mathematica will send the staff activity logs to program staff via email. We designed the activity logs so staff can complete them in Microsoft Excel, although program staff can print them to complete on paper if they prefer. Program staff will return completed activity logs to Mathematica via email or fax.



A4. Efforts to Avoid Duplication


1. Interviews and Focus Groups

The staff interviews, social network survey, and focus group discussions provide information we cannot obtain through SSA’s administrative records. The discussions are about the PROMISE-related experiences of staff and participants, the nature of partnerships and coordination with other agencies and programs, and the ways in which participation in PROMISE affects participant educational and employment goals and experiences.




2. 18-Month Survey Interviews

The parent and youth surveys will also provide information we cannot obtain through SSA’s administrative records, and will provide an opportunity to standardize data collection across all of the PROMISE programs. Some of the programs plan to conduct their own surveys, but Mathematica will work closely with them to avoid duplication of effort and to minimize potential burden on enrollees during any periods of overlap between the federal evaluation and the programs’ survey efforts. Although some programs plan to track some services provided to control group members, this tracking will not be comprehensive, and other programs will track service provision for the treatment group only. As such, no complete data source exists on the provision of services to all control group members.


Therefore, the nature of the information we collect and the manner in which we collect it preclude duplication. SSA does not use another collection instrument to obtain similar data.


3. Staff Activity Logs

The staff activity logs will provide information that is not available through SSA’s administrative records, the programs’ management information systems, or the programs’ administrative cost data. The amount of time staff spend on services such as employment and education and on program administration will help us understand how the programs operate and the services that they emphasize.


A5. Methods to Minimize Burden on Small Entities

Some of the service providers we interview may be staff of small entities. Our protocol imposes minimal burden on all organizations involved and we keep discussions to one hour or less. We hold the information we request to the absolute minimum required for the intended use. We schedule interviews at times that are convenient to the respondents. In this way, we minimize the effect on small businesses and other small entities.


There are no small entities involved in the 18-month PROMISE surveys.


Some of the program staff who will complete the staff activity logs may be staff of small entities. We designed our collection of the staff activity logs to impose a modest burden on all organizations involved, and we will keep the completion of the logs to five minutes per day for two periods of seven days each. We are holding the information we are requesting to the absolute minimum required for the intended use.


A6. Consequences of Not Collecting Data


1. Interviews and Focus Groups

To support the process analysis, we scheduled two rounds of interviews and group discussion sessions with local program administrators, program supervisors, and service delivery staff. The first visit occurred after demonstration startup (beginning fall of 2014, and continuing through summer 2015, depending on the project enrollment start date), and the second visit will occur after the programs have matured and gained experience providing services to participants (2016). We determined these two visits are necessary to develop an understanding of the intervention and the steps taken to implement project services. The first visit focuses on start-up activities, the projects’ notable features, and key challenges. The second visit will assess how the projects evolved over time in response to their early experiences and the lessons learned about service delivery to SSI youth and their families. Fewer visits would not allow SSA to assess how the projects evolve over time to address significant challenges and leverage successes.


Similarly, we determined we require two sets of focus group sessions with parents or guardians and youth. The first set of sessions began during the first round of site visits in fall 2014 and will continue, along with those visits, through summer 2015. We will conduct the second set of sessions during our site visits scheduled for 2016. We recruit independent groups of participants at each round to minimize burden and provide an opportunity to gather information from more PROMISE enrollees. Two rounds of focus groups are necessary to develop an understanding of the intervention and the steps taken to implement project services as the projects evolve over time, and to capture changes in the experiences and outcomes of participants as the projects serve them for a longer period.


Without the information from the discussions with project staff and the focus groups with program participants and their parents or guardians, we would lose qualitative data that provide greater insight into the impact findings generated from the quantitative survey and administrative data. Conducting the groups at two points in time allows more time to elapse between the groups; more time for staff to provide services; and more time for PROMISE to affect the lives of participants. Further, by revisiting the same sites at two points in time, evaluation staff may be able to follow up on challenges observed early in the implementation period that implementation staff may resolve or improve between the visits. Finally, by speaking with youth and parents or guardians, as well as program implementation staff at each site, the evaluation obtains a more balanced understanding of the implementation efforts than we could gain from interviewing implementation staff alone. Therefore, we cannot collect the information less frequently, or with fewer respondents.


2. 18-Month Survey Interviews

The 18-month survey is a one-time collection and is necessary for conducting a credible evaluation. The data we will collect are not available from other sources, and the survey will collect a richer set of information than we can gather from administrative records. For example, administrative records might include data on earnings from jobs but do not offer details such as rates of pay, hours worked, or whether the job was competitive or supported employment. Because we will conduct the 18-month survey only once, we cannot conduct it less frequently.


3. Staff Activity Logs

Mathematica will collect the staff activity logs in two one-week periods around the time of the second round of site visits to each program. Two periods are necessary to obtain a representative sample of staff time use and to capture potential seasonal differences in program activities. The data collected are necessary to conduct a credible evaluation and are not available from other sources. Failure to collect the data would reduce the precision of the benefit-cost analysis.


A7. Special Circumstances


There are no special circumstances that would cause this information collection to be conducted in a manner inconsistent with 5 CFR 1320.5.


A8. Solicitation of Public Comment and Other Consultation with the Public


1. Federal Register Notices

SSA published the 60-day advance Federal Register Notice on October 15, 2015, at 80 FR 62148, and we received no public comments. SSA published the second Notice on December 29, 2015, at 80 FR 81409. If we receive comments in response to the 30-day Notice, we will forward them to OMB.


2. Consultation with Outside Agencies

As a first step in the PROMISE evaluation, SSA convened a technical advisory panel. The panel provided input on the evaluation criteria and research design. It consisted of researchers and advocates who reflected expertise in youth transition, disability, and evaluation design. The external experts were:


  • Burt Barnow, PhD, George Washington University

  • Hugh Berry, U.S. Department of Education

  • Mark Donovan, Marriott Foundation for People with Disabilities

  • David Johnson, PhD, University of Minnesota

  • Jamie Kendall, U.S. Department of Health and Human Services

  • Jeffrey Liebman, PhD, Harvard University

  • Pamela Loprest, PhD, The Urban Institute


An interdisciplinary team of economists, disability policy researchers, survey researchers, and information systems professionals on the staff of the evaluation contractor (Mathematica Policy Research and its subcontractor, BCT Partners) contributed to the design of the overall evaluation. These individuals include:


  • Karen CyBulski, Mathematica

  • Thomas Fraker, PhD, Mathematica

  • Jacqueline Kauff, Mathematica

  • Gina Livermore, PhD, Mathematica

  • Holly Matulewicz, Mathematica

  • Tonya Woodland, BCT Partners


3. Consultation with SSI Recipients and Program Staff


a. Focus Groups: Youth receiving SSI, and their parents or guardians, are the target audience for the participant focus groups. Through their involvement in these sessions, they provide first-hand feedback on their experiences with PROMISE. We use findings from early groups, where applicable, to refine the procedures and discussion topics for subsequent groups held at other sites, for the later round of focus groups, and for the 18- and 60-month questionnaires.

b. 18-Month Survey Interviews: The survey’s target audience comprises youth receiving SSI and their parents or guardians. They provided direct feedback on the draft instrument through their participation in the pretest in November of 2014. The pretest included a convenience sample of nine youth who receive SSI payments but were not enrolled in PROMISE and had recently aged out of eligibility. Refinements to the questionnaires included in this submission (Attachments H and I) reflect the integration of their feedback. (See more information regarding the pretest in Part B.)


c. Staff Activity Logs: Program staff are the target audience for the staff activity logs. Mathematica provided each program with the opportunity to review an early draft of the activity log and incorporated their feedback into the final version (Attachment K). We will also ask program staff to provide examples of their specific activities, and any additional instructions, for inclusion in the log.


A9. Payments or Gifts

We do not offer remuneration to program administrators, directors, or PROMISE service provider staff for completing interviews or staff activity logs.


Each PROMISE focus group youth and parent or guardian participant receives a $30 incentive in the form of a gift card, to express the study’s appreciation for their time. In addition to the gift card, evaluators provide light refreshments and snacks during the focus group sessions. Such additional incentives are likely to increase the appeal of participation because they offset the burden for those who may attend the focus groups soon after their work or school day ends.


For the 18-month and 5-year surveys, we will offer each survey respondent $30 for taking part in the interview, provided as a gift card mailed to the address given during the interview. We will provide survey respondents with a choice of two retailers for their gift card (Target, Walmart). If neither of these stores is located in close proximity to their residence, the cards can be used for online purchases as well. Those who call in to complete the survey prior to outbound calls or intensive locating and nonresponse follow-up will receive an additional $10. OMB approved this type of differential incentive when it was used successfully in other studies conducted by Mathematica. For example, DOL’s YouthBuild evaluation conducted an experiment to test the results of offering an additional $15 beyond the $25 base incentive to those who responded within the first four weeks of the field period. The “Impact of the ARRA Subsidy on COBRA Take-Up” study, sponsored by DOL, offered a $10 differential beyond the $30 base incentive to promote self-initiated response within the first four weeks of the field period. In addition, OMB recently approved an experiment, embedded in the SSA-sponsored National Beneficiary Survey (NBS), that compares offering a $10 bonus beyond the $20 base incentive to respondents who call in to complete the interview within the first two to four weeks of the field period with a control group that receives the standard post-paid $20 incentive. The NBS experiment began in February 2015; as such, results are not yet available.


A10. Assurances of Confidentiality

We protect and hold in confidence the information provided during the staff interviews and focus groups, in accordance with 42 U.S.C. 1306, 20 CFR 401 and 422, 5 U.S.C. 552 (Freedom of Information Act), 5 U.S.C. 552a (Privacy Act of 1974), and OMB Circular No. A-130. We treat the data in a confidential manner unless otherwise compelled by law. In addition, we will collect the 18-month survey interview data under SSA System of Records Number 60-0203.


The study team takes seriously the ethical and legal obligations associated with the collection of confidential data. We ensure the secure handling of confidential data through several mechanisms, including obtaining suitability determinations for designated staff; training staff to recognize and handle sensitive data; protecting computer systems from access by staff without suitability determinations; limiting the use of personally identifiable information in data; limiting access to secure data on a “need to know” basis and only for staff with suitability determinations; and creating data extract files from which we remove identifying information. We make the assurances and limits of confidentiality clear in all advance materials sent to recruit potential participants and restate them at the beginning of each focus group session. The Paperwork Reduction and Privacy Act statements appear on the advance letter for focus group participants. For the 18-month survey interviews, we will make clear the assurances and limits of confidentiality in the advance letter mailed to parents and youth. The Paperwork Reduction and Privacy Act statements will appear on the advance letter for the survey (Attachment J).


The PROMISE enrollment database contains contact information the evaluator uses to invite participants to the focus groups. The evaluator will also use this database to invite parents and youth to take part in the surveys. Advance letters for the focus groups and surveys provide assurance that we gather the information for research purposes only. We reiterate the same message at the start of each focus group session, and we ask participants to keep the focus group conversations confidential. We will not disclose the identity of the group participants to anyone outside of the evaluation team, and the information the participants provide, which we will present in public documents, will not be attributable to specific individuals. The focus group facilitator digitally records each discussion, beginning after all introductions have been made. We inform participants about the recording and tell them that they may ask us to suspend the recording at any time. If there are any objections to the recording, the facilitator does not record the session. We do not ask for any identifying information during the focus group, and moderators refer to group participants by their first names only; thus, we include no identifying information in the digital recording.


For the 18-month survey interviews, interviewers will reiterate the assurance that we gather the information for research purposes only during the introduction to the youth and parent interviews (Attachments H and I). In addition, we will not attribute the information survey respondents provide to specific individuals within any public documents.


We require subcontractors, consultants, and vendors to establish confidential information safeguards that meet prime contract security requirements. The evaluation project director ensures we properly dispose of any confidential information provided to, or generated by, a subcontractor, consultant, or vendor at the completion of the agreement between the parties.


We will destroy all data collected from the interviews and focus groups, the 18-month survey interviews, and the staff activity logs in a secure manner at the completion of the evaluation.


A11. Justification of Sensitive Information

The purpose of the study is to test the effects of the PROMISE demonstration and its innovative array of enhanced employment and educational services for youth and their families. Therefore, obtaining information about potentially sensitive topics, such as risky behaviors, including use of tobacco, alcohol, marijuana, or illegal drugs, is central to the evaluation. Information about engagement in risky behaviors is critical for the impact analysis, as we hypothesize that PROMISE programs may influence these behaviors (if the programs address them) and, as such, youth outcomes may reflect this influence. Race and ethnicity are required for certain subgroup analyses. The surveys will not collect data that we can obtain directly from other sources (for example, we obtain information about receipt of disability benefits directly from SSA administrative records).

A12. Estimates of Hours Burden

Table A.2 shows, for the qualitative data collections and for the parent and youth surveys, the expected number of participants, the number of interviews, the hours per response, and the total response burden overall and by year.


Over the course of the evaluation, we will conduct a total of 440 staff interviews. The burden estimate per staff member includes time for setting up the interview appointment by phone or email (6 minutes) and for participating in the interview itself (60 minutes).


For the focus groups, the estimated time per response ranges from 5 minutes (reviewing the advance letter and completing the telephone screening, for those who do not go on to participate) to 100 minutes for focus group participants (5 minutes for screening, 5 minutes for phone and mail reminders, and 90 minutes for the group discussion). Respondents spend the bulk of the burden time in the focus groups themselves, which last approximately 90 minutes. The estimated total burden for the focus groups is 320 hours per round, which covers contacting and screening up to 50 enrollees (to obtain 10 focus group participants and 40 nonparticipants) per round at each of the five single-state projects, and 150 enrollees (to obtain 30 participants and 120 nonparticipants) per round at the six-state consortium project.
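As a cross-check, the 320-hour per-round figure follows directly from the recruitment counts above. The short sketch below is purely illustrative (the constants mirror the per-response minutes described in this section; the function name is ours):

```python
# Cross-check of the 320-hour per-round focus group burden (illustrative).
PARTICIPANT_MIN = 100      # screening, reminders, and 90-minute discussion
NONPARTICIPANT_MIN = 5     # advance letter review and telephone screening

def round_minutes(participants: int, nonparticipants: int) -> int:
    """Total burden in minutes for one round at one project."""
    return participants * PARTICIPANT_MIN + nonparticipants * NONPARTICIPANT_MIN

single_states = 5 * round_minutes(10, 40)   # five single-state projects
consortium = round_minutes(30, 120)         # six-state consortium
hours_per_group_type = (single_states + consortium) / 60   # 160 hours
print(2 * hours_per_group_type)   # youth plus parent groups: 320 hours per round
```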


As noted earlier, the sample will include 12,000 youth and their parents enrolled in PROMISE across the six programs. Assuming a response rate of 85 percent for the 18-month survey interviews, we will conduct a total of 10,200 parent and 10,200 youth interviews for the 18-month follow-up survey. We anticipate a response burden of 41 minutes (roughly 0.7 hours) for each parent interview, which includes time for reviewing the advance mailing and potentially calling to make an interview appointment (0.1 hours), as well as the time anticipated for completing the interview with a professionally trained interviewer (0.6 hours). We expect the youth interview to take 0.4 hours to complete, with an additional 0.1 hours assumed for reviewing the invitational mailing or responding to a voicemail message (0.5 hours total). These estimates reflect a total expected burden of 12,070 hours for parents and youth combined, consistent with the totals in Table A.2.
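The survey burden totals follow the same arithmetic. The sketch below is illustrative only (the variable names are ours); it reproduces the figures reported in Table A.2:

```python
# Reproduces the 18-month survey burden totals in Table A.2 (illustrative).
SAMPLE = 12_000          # youth-parent pairs enrolled across the six programs
RESPONSE_RATE = 0.85     # assumed response rate for the 18-month survey

completes = round(SAMPLE * RESPONSE_RATE)   # 10,200 parent and 10,200 youth interviews

PARENT_MINUTES = 41      # review/scheduling time plus the interview itself
YOUTH_MINUTES = 30       # 0.1 hours of review plus a 0.4-hour interview

parent_hours = completes * PARENT_MINUTES / 60   # 6,970 hours
youth_hours = completes * YOUTH_MINUTES / 60     # 5,100 hours
print(parent_hours + youth_hours)                # 12,070 combined hours
```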


Because we will release the sample on a rolling basis, the total burden will vary by year, in accordance with the number of youth enrolled in PROMISE during the corresponding months in the enrollment period (19 months prior). These assumptions are shown in Table A.2, below. Table A.2 also contains the burden for the previously cleared data collection instruments.


Table A.2. Estimated Total Annual Burden by Respondent Type

2014 Interviews and Focus Group Discussions

| Modality of Completion/Respondent | Number of Responses | Frequency of Response | Average Burden Per Response (minutes) | Estimated Total Annual Burden (hours) |
|---|---|---|---|---|
| Staff Interviews with Administrators or Directors | 24 | 1 | 66 | 26 |
| Staff Interviews with PROMISE Project Staff | 48 | 1 | 66 | 53 |
| Youth Focus Groups – Non-participants | 100 | 1 | 5 | 8 |
| Youth Focus Groups – Participants | 20 | 1 | 100 | 33 |
| Parent or Guardian Focus Groups – Non-participants | 100 | 1 | 5 | 8 |
| Parent or Guardian Focus Groups – Participants | 20 | 1 | 100 | 33 |
| Totals | 312 | | | 161 |


2015 Interviews, Focus Group Discussions, and 18-Month Survey Interviews

| Modality of Completion/Respondent | Number of Responses | Frequency of Response | Average Burden Per Response (minutes) | Estimated Total Annual Burden (hours) |
|---|---|---|---|---|
| Staff Interviews with Administrators or Directors | 51 | 1 | 66 | 56 |
| Staff Interviews with PROMISE Project Staff | 97 | 1 | 66 | 107 |
| Youth Focus Groups – Non-participants | 220 | 1 | 5 | 18 |
| Youth Focus Groups – Participants | 60 | 1 | 100 | 100 |
| Parent or Guardian Focus Groups – Non-participants | 220 | 1 | 5 | 18 |
| Parent or Guardian Focus Groups – Participants | 60 | 1 | 100 | 100 |
| 18-Month Survey Interviews – Parent | 850 | 1 | 41 | 581 |
| 18-Month Survey Interviews – Youth | 850 | 1 | 30 | 425 |
| Totals | 2,408 | | | 1,405 |


2016 Interviews and Focus Group Discussions, Staff Activity Logs, and 18-Month Survey Interviews

| Modality of Completion/Respondent | Number of Responses | Frequency of Response | Average Burden Per Response (minutes) | Estimated Total Annual Burden (hours) |
|---|---|---|---|---|
| Staff Interviews with Administrators or Directors | 75 | 1 | 66 | 83 |
| Staff Interviews with PROMISE Project Staff | 145 | 1 | 66 | 160 |
| Activity Logs for Administrators or Directors | 45 | 14 | 5 | 52.5 |
| Activity Logs for PROMISE Project Staff | 160 | 14 | 5 | 187 |
| Youth Focus Groups – Non-participants | 320 | 1 | 5 | 27 |
| Youth Focus Groups – Participants | 80 | 1 | 100 | 133 |
| Parent or Guardian Focus Groups – Non-participants | 320 | 1 | 5 | 27 |
| Parent or Guardian Focus Groups – Participants | 80 | 1 | 100 | 133 |
| 18-Month Survey Interviews – Parent | 5,100 | 1 | 41 | 3,485 |
| 18-Month Survey Interviews – Youth | 5,100 | 1 | 30 | 2,550 |
| Totals | 11,425 | | | 6,838 |



2017 18-Month Survey Interviews

| Modality of Completion/Respondent | Number of Responses | Frequency of Response | Average Burden Per Response (minutes) | Estimated Total Annual Burden (hours) |
|---|---|---|---|---|
| 18-Month Survey Interviews – Parent | 4,250 | 1 | 41 | 2,904 |
| 18-Month Survey Interviews – Youth | 4,250 | 1 | 30 | 2,125 |
| Totals | 8,500 | | | 5,029 |


Grand Total

| Data Collection | Number of Responses | Estimated Total Burden (hours) |
|---|---|---|
| Focus Groups and Staff Interviews | 2,040 | 1,123 |
| Staff Activity Logs | 205 | 240 |
| 18-Month Survey – Parent Interviews | 10,200 | 6,970 |
| 18-Month Survey – Youth Interviews | 10,200 | 5,100 |
| Grand Total | 22,645 | 13,433 |


The total burden for this ICR is 13,433 hours. We also calculated a separate cost burden for respondents; see A13 below for details.



A13. Estimates of Cost Burden to Respondents

1. Focus Groups and Staff Interviews

There is no cost to PROMISE administrators or to service providers because they participate in the interviews as part of their paid work. There is no cost to youth; they are still engaged in the pursuit of secondary education and we assume they are not wage earners. For parents or guardians, we estimated the cost burden using the average 2013 minimum wage rate across the states included in the evaluation (obtained from the U.S. Department of Labor website on state-by-state minimum wage data).3 Table A.3 shows the total cost to parents or guardians for their time in this collection. The evaluation contractor solely bears the costs for data collection, storage, processing, and other functions related to these data.
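The entries in Table A.3 are simply the product of respondent counts, per-response minutes, and the hourly wage. The sketch below is illustrative (the helper function is ours); it shows the 2016 calculation:

```python
# Respondent cost arithmetic underlying Table A.3 (illustrative).
WAGE = 7.38  # average 2013 state minimum wage across the evaluation states

def cost(respondents: int, minutes: int) -> float:
    """Total respondent cost in dollars for one group in one year."""
    return respondents * minutes / 60 * WAGE

# 2016 round: 320 screened non-participants (5 minutes each) and
# 80 focus group participants (100 minutes each)
print(round(cost(320, 5)))    # 197 dollars
print(round(cost(80, 100)))   # 984 dollars
```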


Table A.3. Annual Cost to Respondents

2014 Annual Cost to Respondents

| Respondent Type | Number of Respondents | Frequency of Response | Average Burden Per Response (minutes) | Average Hourly Wage Rate (dollars) | Total Respondent Cost (dollars) |
|---|---|---|---|---|---|
| Parent or Guardian Focus Group – Non-Participants | 100 | 1 | 5 | $7.38 | $61.00 |
| Parent or Guardian Focus Group – Participants | 20 | 1 | 100 | $7.38 | $246.00 |
| Total | 120 | | | | $307.00 |


2015 Annual Cost to Respondents

| Respondent Type | Number of Respondents | Frequency of Response | Average Burden Per Response (minutes) | Average Hourly Wage Rate (dollars) | Total Respondent Cost (dollars) |
|---|---|---|---|---|---|
| Parent or Guardian Focus Group – Non-Participants | 220 | 1 | 5 | $7.38 | $135.00 |
| Parent or Guardian Focus Group – Participants | 60 | 1 | 100 | $7.38 | $738.00 |
| Total | 280 | | | | $873.00 |




2016 Annual Cost to Respondents

| Respondent Type | Number of Respondents | Frequency of Response | Average Burden Per Response (minutes) | Average Hourly Wage Rate (dollars) | Total Respondent Cost (dollars) |
|---|---|---|---|---|---|
| Parent or Guardian Focus Group – Non-Participants | 320 | 1 | 5 | $7.38 | $197.00 |
| Parent or Guardian Focus Group – Participants | 80 | 1 | 100 | $7.38 | $984.00 |
| Total | 400 | | | | $1,181.00 |


Grand Total

| Respondent Type | Number of Respondents | Total Respondent Cost (dollars) |
|---|---|---|
| Grand Total (all years) | 800 | $2,361.00 |


2. 18-Month Survey Interviews

There are no direct costs to respondents for the 18-month survey interviews other than the time needed to participate in the study, as described in A12 above. We will not ask respondents to maintain any new records. The evaluation contractor will collect and maintain all survey data and is responsible for all costs associated with data collection, storage, processing, and other functions related to these data. These costs, summarized in A14 below, are costs to the federal government, paid through an SSA contract.


3. Staff Activity Logs

There is no cost to PROMISE staff because they will complete the staff activity logs as part of their paid work. The evaluation contractor solely bears the costs for data collection, storage, processing, and other functions related to these data.


A14. Annualized Cost to the Federal Government

The cost to SSA for conducting the PROMISE staff interviews and participant focus groups is $2,957,116. The cost to SSA for conducting the PROMISE 18-month follow-up surveys with parents and youth is $6,223,770. The cost to SSA for the staff activity logs is $41,766. Table A.4 below shows the costs by year.

We budgeted labor costs by estimating the number of hours required of staff at various wage levels, multiplying those hours by the applicable wage rates, and multiplying the resulting subtotals by factors to cover fringe benefits and overhead (burden) expense. The basis for estimating other direct costs varies with the type of cost estimated. We summed labor costs and other direct costs, multiplied the total by a factor to cover general and administrative expenses, and added the fee.
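In schematic form, the build-up works as follows. The sketch below is illustrative only; the function name and all rates, factors, and hours are placeholders, not the contract's actual values:

```python
# Schematic cost build-up: labor plus other direct costs, loaded for fringe,
# G&A, and fee. All values here are placeholders, not actual contract rates.
def budgeted_cost(hours_by_wage, fringe_factor, other_direct, ga_factor, fee):
    labor = sum(hours * rate for hours, rate in hours_by_wage) * fringe_factor
    return (labor + other_direct) * ga_factor + fee

# Example with made-up inputs: 1,000 hours at $40 and 500 hours at $70
print(budgeted_cost([(1_000, 40.0), (500, 70.0)],
                    fringe_factor=1.45, other_direct=20_000.0,
                    ga_factor=1.10, fee=15_000.0))
```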

Table A.4. Annual Costs to the Federal Government

| Fiscal Year | Focus Group/Staff Interview Cost | 18-Month Survey Interview Cost | Staff Activity Log Cost | Total |
|---|---|---|---|---|
| 2014 | $33,953 | $175,536 | | $209,489 |
| 2015 | $1,279,610 | $510,090 | | $1,789,700 |
| 2016 | $895,518 | $2,151,100 | $26,694 | $3,073,312 |
| 2017 | $748,035 | $2,391,255 | $15,072 | $3,154,362 |
| 2018 | | $835,395 | | $835,395 |
| 2019 | | $77,664 | | $77,664 |
| 2020 | | $82,730 | | $82,730 |
| Total | $2,957,116 | $6,223,770 | $41,766 | $9,222,652 |


A15. Reasons for Program Changes or Adjustments

The increase in burden stems from the addition of the staff activity log data collection activity. We developed the staff activity log instrument for the PROMISE evaluation and request clearance for this instrument in addition to the previously cleared instruments for the evaluation’s other data collection activities (see Addendum for details).


A16. Plans for Tabulation and Publication of Results

With the PROMISE evaluation findings, SSA and ED will be able to advise federal policymakers and state administrators on the supports, services, and policy and program changes that could encourage individuals to work and decrease their dependence on disability and other public benefits. In fostering work efforts, the goal is to implement program changes that produce savings to the federal government and improve program administration.


The evaluator will analyze the information collected in the staff interviews and focus groups to prepare reports that contain the findings and their program and policy implications. We will not use complex quantitative analytical techniques with data from these collections. Four major reports will present the findings from the staff interviews and participant focus groups, as well as other information collected for the evaluation. The reports will include a stand-alone summary of the purpose, methodology, key findings, and policy implications, as well as a short executive summary. Products resulting from information obtained in this data collection will provide SSA with information about the experiences of PROMISE staff, staff of partner agencies, and participants. The evaluator will integrate the information obtained from the staff interviews and participant focus groups with information collected from the other components of the evaluation to draw comparisons across and within sites and describe factors that might explain any observed variation.


The evaluator will analyze the information collected in the parent and youth 18-month follow-up surveys to prepare reports that contain the findings and their program and policy implications.


The impact reports will investigate the demonstration's effects on a wide array of education, earnings, and self-determination outcomes; the amount of payments recipients receive from SSA; and quality of life, both overall and for meaningful subgroups. Our methodological approach combines a random assignment design with regression adjustment to improve the precision of our estimates. Because we randomly assign individuals to the treatment and control groups, the impact analysis will focus on differences in enrollee outcomes between these two groups, using a regression framework to control for other explanatory variables. We will use a regression-adjusted comparison of the randomly assigned treatment and control groups, first for the full sample, to estimate the impact of the intervention on enrollees' education, labor market, and other outcomes, and then for subgroups defined by pre-randomization values of age, race, gender, and type of disability.
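To illustrate the regression framework, the sketch below estimates a treatment effect on a simulated continuous outcome with one pre-randomization covariate. It is a sketch on made-up data, not the evaluation's actual specification or variable set:

```python
# Regression-adjusted impact estimate under random assignment (simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1_000
treat = rng.integers(0, 2, n)                      # random assignment indicator
age = rng.uniform(14, 17, n)                       # pre-randomization covariate
y = 0.5 * treat + 0.1 * age + rng.normal(size=n)   # simulated outcome

X = sm.add_constant(np.column_stack([treat, age]))
fit = sm.OLS(y, X).fit()
print(fit.params[1])   # adjusted treatment-control difference, near 0.5
```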


The exact statistical technique we use to estimate regression-adjusted impacts will depend on the nature of the dependent variable and the type of issues we address. For example, if the dependent variable is continuous, ordinary least squares regression produces unbiased estimates of impacts. For binary outcome variables (such as whether the beneficiary is employed), logistic regression models generate consistent and efficient estimates, provided the parametric assumptions underlying those models are correct. If the dependent variable is a count, we will use an ordered logit model. If the dependent variable is ordinal, we will first reduce the measure to binary outcomes and then estimate a logit model. To account for the fact that we will observe sample members for different lengths of time, we will also consider event-history or hazard models for binary outcome measures; these models provide unbiased estimates of program effects on binary outcomes when participants' data are truncated.
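For the binary-outcome case, a logistic specification of the kind described might look like the following sketch (simulated data; variable names are hypothetical):

```python
# Logistic regression for a binary outcome such as employment (simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1_000
treat = rng.integers(0, 2, n).astype(float)         # random assignment indicator
p = 1.0 / (1.0 + np.exp(-(-0.5 + 0.4 * treat)))     # true employment probability
employed = rng.binomial(1, p)                       # simulated employment indicator

X = sm.add_constant(treat)
fit = sm.Logit(employed, X).fit(disp=False)
print(fit.params[1])   # log-odds impact of treatment, roughly 0.4
```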


The purpose of the benefit-cost analysis is to determine whether the impacts of the PROMISE demonstration are sufficiently large to justify the costs of providing program services. The results of this analysis will play an integral part in the decision to expand the demonstration to the larger population. We will base the analysis on an accounting framework that summarizes the intervention's effects and resource use from the perspective of SSA and other key stakeholder groups, including society as a whole.

To ensure the benefit-cost findings are as helpful as possible to SSA, Mathematica plans to present the information in a way suited to communicating with the SSA Office of the Actuary and with OMB. First, Mathematica will summarize all of the information based directly on data collected during the demonstration period. A second set of estimates will present the size of the future effects (if any) that the program would require to generate benefits exceeding costs, along with an analysis of the likelihood that effects of that size will occur. In this way, SSA actuaries will be able to see the net value generated during the observation period and then use the more speculative analysis of possible future benefits and costs to draw conclusions about whether the PROMISE programs would ultimately pay for themselves. In addition to using this general presentation format, Mathematica will work with the SSA project officer, who will coordinate with other SSA staff, including actuaries, during the evaluation to ensure that the other assumptions used in the analysis (the discount rate, correction for inflation, and projections about potential productivity growth) are consistent with those used to assess other potential SSA initiatives. This consistency will go a long way toward ensuring that comparisons of the various options are accurate and useful.
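At its core, the accounting framework discounts streams of benefits and costs to present value. A minimal sketch, assuming a 3 percent discount rate and made-up annual flows:

```python
# Net present value of a stream of annual net benefits (illustrative values).
def npv(net_benefits, discount_rate=0.03):
    """Discount annual net benefits (benefits minus costs) back to year 0."""
    return sum(f / (1.0 + discount_rate) ** t for t, f in enumerate(net_benefits))

# Year 0 program cost of $1,000 followed by three years of benefits
print(npv([-1_000.0, 300.0, 400.0, 500.0]))   # positive => benefits exceed costs
```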


Table A.5 presents the planned timeline for the data collection, and the completion dates for the public reports.


Table A.5. Data Collection and Reporting Schedule

| Activity/Report | Approximate Dates |
|---|---|
| Data Collection | |
| Staff Interviews | Beginning Fall 2014 and Spring 2016 |
| Youth and Parent/Guardian Focus Groups | Beginning Fall 2014 and Spring 2016 |
| Survey of Parents | November 2015 through October 2017 |
| Survey of Youth | November 2015 through October 2017 |
| Staff Activity Logs | May 2016 through December 2016 |
| Reports | |
| Early Assessment Reports | Spring 2015 – Fall 2015 |
| Process Analysis Reports | Fall 2016 – Spring 2017 |
| Interim Services and Impact Report | Fall 2018 |
| Long-Term Evaluation Report | Spring 2022 |
| Data Files | |
| Restricted access file for 18-month survey | Winter 2018 |
| Public use file for 18-month survey | Fall 2018 |
| Restricted access file for five-year survey | Summer 2021 |
| Public use file for five-year survey | Spring 2022 |


A17. Approval Not to Display of Expiration Date for OMB Approval

SSA is not seeking an exemption with this submission. We will display the OMB expiration date on all focus group materials and surveys.


A18. Explanation of Exceptions

SSA is not requesting an exemption to certification requirements.

1 The six-state consortium project goes by the name Achieving Success by Promoting Readiness for Education and Employment (ASPIRE) rather than by PROMISE.

2 A youth may be living independently of his or her parents at the time of the five-year survey. In these cases, we will still attempt to interview both the youth and the parent or guardian.

3 Data accessed from website on November 6, 2013. [http://www.dol.gov/whd/minwage/america.htm]


