Supporting Statement - RETAIN (Part A)

Retaining Employment and Talent After Injury/Illness Network (RETAIN) demonstration

OMB: 0960-0821


SUPPORTING STATEMENT FOR THE RETAIN EVALUATION

The Social Security Administration and the Department of Labor are undertaking the Retaining Employment and Talent After Injury/Illness Network (RETAIN) demonstration, which will test promising early intervention approaches to improve the labor force participation and retention of individuals with recently acquired injuries and disabilities and to reduce their future need for Social Security disability benefits. The Department of Labor (DOL) is funding the intervention approaches and the American Institutes for Research, which provides programmatic technical assistance for the demonstration. The Social Security Administration (SSA) is funding Mathematica to provide evaluation support for the demonstration, including evaluation-related technical assistance and a comprehensive evaluation.

The RETAIN demonstration consists of two phases. The first involves cooperative awards to eight states to conduct planning and start-up activities, including the launch of a small pilot. During Phase 1, SSA will provide evaluation-related technical assistance and planning, and conduct evaluability assessments to determine which states’ projects would allow for a rigorous evaluation if continued beyond the pilot phase. DOL selected five states to continue to Phase 2, full implementation. During Phase 2, DOL will fund the operations and programmatic technical assistance activities for the selected states, and SSA will fund a full set of evaluation activities.


SSA is requesting clearance for the collection of data needed to evaluate RETAIN. The specific data collection efforts for which SSA is seeking OMB approval in this package include: (1) semi-structured interviews with program administrators and service providers conducted during two rounds of site visits. The interviews will focus on program implementation, perceptions of what worked well in each state’s program, and implementation challenges; (2) interviews with RETAIN service users to understand their experiences as they engage in program services; (3) staff activity logs to understand the costs of RETAIN services and inform the benefit-cost analysis; (4) two rounds of RETAIN enrollee surveys; and (5) a survey of RETAIN medical providers. The RETAIN enrollee surveys, planned for 2 and 12 months after enrollment, will focus on individual-level outcomes and perceptions of enrollees. The surveys will include up to 3,000 enrollees from each of the five states DOL selected for Phase 2. The provider survey, which will include a sample of up to 100 RETAIN providers from each state, will be conducted 15 months after the launch of Phase 2. It will collect information to help explain how providers delivered services, and will highlight any systems changes that might have occurred as a result of RETAIN.



PART A. JUSTIFICATION FOR THE STUDY

1. Authorizing Laws/Circumstances Making the Collection of Information Necessary

The RETAIN Demonstration Projects are a collaborative effort led by the Department of Labor (DOL), in partnership with DOL’s Employment and Training Administration (ETA) and the Social Security Administration (SSA). RETAIN projects will test the impact of early intervention strategies intended to improve stay-at-work/return-to-work (SAW/RTW) outcomes of individuals who experience work disability while employed. “Work disability” is defined as an injury, illness, or medical condition that has the potential to inhibit or prevent continued employment or labor force participation.

SAW/RTW programs succeed by returning injured or ill workers to productive work as soon as medically possible during their recovery process and by providing interim part-time or light duty work and accommodations, as necessary. The RETAIN Demonstration Projects are loosely modeled after promising programs operating in Washington State, including the Centers of Occupational Health and Education (COHE), the Early Return to Work (ERTW), and the Stay at Work programs. While these programs operate within the state’s workers’ compensation system and are available only to people experiencing work-related injuries or illnesses, the RETAIN Demonstration Projects provide opportunities to improve SAW/RTW outcomes for both occupational and non-occupational injuries and illnesses of people who are employed, or at a minimum in the labor force, when their injury or illness occurs.

Central to these projects is the early coordination of health care and employment-related supports and services to help injured or ill workers remain in the workforce. These supports and services include:

  • Training in occupational health best practices for participating health care providers;

  • Active involvement of a Return-to-Work Coordinator throughout the medical recovery period to facilitate continued employment;

  • Enhanced communication among workers, employers, and health care providers;

  • Accommodations and job modifications; and

  • Retraining and rehabilitation services.

To accomplish this, projects will provide services through an integrated network of partners that includes close collaboration among state and/or local workforce development entities, health care systems and/or health care provider networks, and other partners as appropriate.

The primary goals of the RETAIN Demonstration Projects are:

  1. To increase employment retention and labor force participation of individuals who acquire, and/or are at risk of developing, work disabilities; and

  2. To reduce long-term work disability among RETAIN service users, including the need for Social Security Disability Insurance and Supplemental Security Income.

The ultimate purpose of the demonstration is to validate and expand implementation of evidence-based strategies to accomplish these goals. DOL is funding the intervention approaches and programmatic technical assistance for the demonstration. SSA is funding evaluation support, including technical assistance and the full evaluation for the demonstration.

To inform the development of evidence-based strategies, the evaluation of Phase 2 implementation will include the following four components:

  • Participation analysis: Using RETAIN service user interviews and surveys, this analysis will provide insights into which eligible workers choose to participate in the program, in what ways they participate, and how services received vary with participant characteristics. It will also assess the characteristics of non-participants and, if possible, their reasons for not enrolling.

  • Process analysis: Using staff interviews and logs, this analysis will produce information about operational features that affect service provision; perceptions of the intervention design by service users, providers, administrators, and other stakeholders; the relationships among the partner organizations; each program’s fidelity to the research design; and lessons for future programs with similar objectives.

  • Impact analysis: This analysis will produce estimates of the effects of the interventions on primary outcomes, including employment and Social Security disability applications, and secondary outcomes, such as health and service usage.

  • Benefit-cost analysis: This analysis will assess whether the benefits of RETAIN justify its costs. We will conduct this assessment from a range of perspectives, including those of the participants, state and Federal governments, SSA, and society as a whole.

2. Purposes and uses of the information

The purpose and proposed use of this information collection is to gather the qualitative and quantitative data needed to conduct the analysis described in Section 1. These activities, described in the text that follows, include (1) site visits conducted in Phase 2; (2) interviews with RETAIN service users; (3) staff activity logs; (4) surveys of RETAIN enrollees; and (5) surveys of RETAIN medical providers.

Mathematica will conduct site visits, including in-person interviews with state administrators and program staff, and telephone interviews with RETAIN service users, in all of the Phase 2 RETAIN programs.

Information from these site visits will be used to examine research questions for the process evaluation and to address three key objectives:

  1. Document recruitment and enrollment activities. The process evaluation will document strategies RETAIN programs used to identify workers at risk of SSI/SSDI entry and recruit them into program services. It will also highlight the challenges RETAIN programs faced in recruiting and enrolling service providers and workers, and how they addressed the challenges.

  2. Document the program’s model for service delivery. The process evaluation will document each RETAIN program’s logic model and sequence of services and assess how well states implemented services with fidelity to their program model. It will describe how intervention services differ from the usual services already available to ill or injured workers, and the relationships and partners that were necessary to deliver services effectively.

  3. Document barriers and facilitators to program implementation. Most of the staff interviews will take place during the site visits, though some will be done by telephone if necessary. Examples of state program administrators we will interview include the RETAIN program director of state-level agencies participating in RETAIN, along with executive directors of health care systems and nongovernmental or community-based organizations that provide services to RETAIN service users. Examples of RETAIN program staff include return-to-work coordinators, health care providers, and other staff working on the front lines of organizations providing demonstration services.

Phase 2 site visits

a. First site visit

The first site visit will occur five months after the beginning of Phase 2 enrollment, in April 2022. The evaluation team will use information collected during the first round of site visits to describe states’ early experiences with the demonstration. This information will support analyses related to seven key research questions for the process analysis:

  • What organizational partnerships have formed under the RETAIN programs to support service delivery?

  • To what extent are states recruiting and enrolling enrollees with fidelity to the planned model, delivering services in accordance with the service model, and maintaining fidelity to the evaluation design?

  • What strategies are RETAIN programs using to identify workers who have recently experienced the onset of a work-threatening injury, illness, or disability? How did the states design the programs for workers? How are RETAIN programs recruiting eligible individuals into the demonstration? What challenges are RETAIN programs facing in doing so, and how are they overcoming these challenges?

  • What are states doing to deliver RETAIN services? How are these services different from the services states were providing to injured or ill workers before RETAIN?

  • What have been important programmatic and environmental facilitators in implementing RETAIN services to date? What are the challenges states have faced during the implementation of RETAIN services to date? How have these challenges influenced implementation? How have states overcome those challenges?

  • How did states use programmatic and evaluation technical assistance to implement programs? Did they need to make any modifications to meet implementation goals (e.g., recruitment targets)?

  • How do states’ data collection procedures work? How are states using management information systems to support data collection?

To address these questions, the evaluation team will conduct semi-structured interviews with administrators and project staff. These interviews will last up to 60 minutes each, and we will conduct them one on one or, if requested, in small groups of two to three staff per session. We will use the findings from this data collection to assess program implementation and to provide feedback to states about potential program improvements and areas where they may need additional technical assistance or support for RETAIN. Attachment B lists the topics we will address during the semi-structured interviews.

If it is feasible and appropriate given the RETAIN program model, Mathematica will visit two service delivery providers that differ on some key factor (for example, different providers or health care systems, or different areas of the state) to get the perspectives of a range of stakeholders. The evaluation team will reach out to the RETAIN program director in each state to begin planning the site visits in the month before each visit. The researcher visiting the state will schedule an initial telephone call to discuss the purpose of the site visits, identify the two areas of interest, and get names and contact information for the staff interviews.

b. Second site visit

The second site visit will occur approximately 18 months after the launch of Phase 2 enrollment, in May of 2023. RETAIN evaluation staff will use the information collected during the second round of site visits to describe states’ experiences with the fully implemented programs. We will structure and organize the second round of visits in the same manner as the first site visits described above. Staff interviews will last up to 60 minutes each, and we will conduct them one on one or, if requested, in small groups of two to three staff per session. Mathematica will visit the same service delivery providers to understand changes in key aspects of the demonstration. Information collected during the second round of site visits will support analyses related to seven key research questions:

  • How have organizational partnerships under the RETAIN program changed to support service delivery?

  • How have states changed the delivery of RETAIN services in the prior year (since the last visit)?

  • What challenges and facilitators have influenced the implementation of RETAIN services, and how have states overcome those challenges?

  • To what extent are states delivering services with fidelity to the planned program model and evaluation design?

  • What are states’ plans for sustaining RETAIN services after the demonstration? What changes do states anticipate making to sustain RETAIN services after the demonstration?

  • What changes have been made to the counterfactual services (i.e., the environment without RETAIN services) available to control group members? What are the implications for the ability of the evaluation to detect and interpret impacts?

  • What are the key program cost components?

To answer these questions, Mathematica will conduct in-depth interviews with administrators and RETAIN program staff. We will use the information gathered to assess process findings related to program implementation. The findings will also provide contextual information for other aspects of the participation, impact, and benefit-cost analyses. For example, the findings could inform the evaluation of any changes to the counterfactual service environment that may have implications for the evaluation’s ability to detect impacts and how to interpret those impacts. Finally, the evaluation team will use information collected during the second site visit to develop a template for collecting cost information from each state. Attachment B lists the topics to be addressed during these interviews.

Interviews with RETAIN service users

Mathematica will conduct interviews with 15 RETAIN service users in each state in September of 2022, 10 months after the launch of Phase 2 enrollment. These interviews will occur outside of the site visits described above. We will select enrollees for these interviews based on their use of services. Mathematica will use the information collected during these interviews to describe the experiences of service users involved in the demonstration, and to supplement other data collected and used in conducting the process evaluation.

Information collected during the interviews will support analyses related to six key research questions for the process analysis, including:

  • What motivated enrollees to enroll in RETAIN?

  • What are enrollees’ employment goals and their attitudes about staying at work or returning to work?

  • What do enrollees like and dislike about the RETAIN services?

  • What other services are enrollees either aware of or receiving already?

  • How satisfied were enrollees with services and did satisfaction vary by state or service use level?

  • What factors were related to RETAIN service use?

Each interview will last up to 30 minutes. We included the topics to be addressed during these telephone interviews in Attachment C. We will use the findings to assess enrollees’ engagement in, and satisfaction with, the RETAIN services; identify which aspects of the services may be more or less associated with participation outcomes; and give RETAIN programs feedback about potential improvements.

Recruitment for the RETAIN service user interviews will occur in August of 2022. We will use data from the states’ enrollment systems and service use data to recruit a purposefully selected sample of service users with different levels of service usage, including enrollees who did not receive any RETAIN services beyond enrollment. We will ensure that the selected sample includes a variety of medical conditions and some representation of women and racial and ethnic minorities. The evaluation team will send recruitment letters to potential respondents and give them a toll-free number to call to schedule an interview.

Staff activity logs

The staff activity logs (Attachment I) provide data on aspects of service delivery that we cannot readily obtain from administrative data files and other sources. The logs will include staff’s daily time spent on various activities that are core components of the RETAIN model: recruitment and enrollment, case management, return-to-work services, care coordination, and communication with and training for health care providers and employers. The logs will also include categories related to program administration (evaluation, training, and other management), as well as travel, work leave, and other program activities outside the above categories. This information will be useful for the benefit-cost analysis, enabling us to allocate program costs across the various components. Such information will be helpful for understanding the level of resources RETAIN programs allocate to each component, which could inform those interested in replicating a specific state’s program and in interpreting program impacts (see the illustrative sketch at the end of this subsection).

Data from the staff activity logs will answer the following research questions:

  • How does a program allocate resources across RETAIN components?

  • How does actual program allocation align with the program’s model of service delivery?

  • What level of effort does a program allocate to program management versus program services?

  • How do specific types of staff differ in how they spend their time on program management and service delivery?

To answer these questions, we will collect staff activity logs from selected staff for two one‑week periods around the time of the second evaluation site visit (during spring 2023). The one-week periods will represent typical work weeks for staff, avoiding weeks with atypical training or conferences. We expect to ask approximately 13 staff members from each program to complete the logs, depending on the number of staff and the different staff categories involved in delivering substantive services. Individuals selected to complete the logs will include both administrative and direct service staff.
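
As a simple illustration of how log data of this kind could feed the cost-allocation step of the benefit-cost analysis, the following sketch aggregates hypothetical log entries into shares of staff time by activity category. The entries, category labels, and identifiers shown are illustrative assumptions only and do not reproduce the Attachment I instrument.

```python
# Minimal sketch (not the actual Attachment I log): aggregating hypothetical staff
# activity log entries into the time shares used to allocate program costs across
# RETAIN components in the benefit-cost analysis.

from collections import defaultdict

# Hypothetical daily log entries: (staff_id, activity_category, minutes)
log_entries = [
    ("rtw_coord_01", "case management", 180),
    ("rtw_coord_01", "care coordination", 120),
    ("rtw_coord_01", "recruitment and enrollment", 60),
    ("admin_01", "program administration", 300),
    ("admin_01", "travel", 45),
]

def allocation_shares(entries):
    """Sum minutes by activity category and convert them to shares of total logged time."""
    totals = defaultdict(int)
    for _, category, minutes in entries:
        totals[category] += minutes
    grand_total = sum(totals.values())
    return {category: minutes / grand_total for category, minutes in totals.items()}

if __name__ == "__main__":
    for category, share in sorted(allocation_shares(log_entries).items()):
        print(f"{category}: {share:.1%} of logged staff time")
```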

Surveys of RETAIN enrollees

The survey of RETAIN enrollees will collect information on a variety of issues that is not easily accessible or available through administrative data. SSA plans to administer two rounds of the enrollee survey to measure changes in enrollees’ health, service use, and employment over time. Mathematica will conduct surveys with about 15,000 enrollees across the five RETAIN programs at 2 and 12 months after enrollment. The first survey will collect information about the disability onset event, service receipt, and immediate-term outcomes related to the return-to-work process. The second survey will obtain information on interim outcomes that could inform the evaluation, such as changes in employment status, earnings, benefit receipt, and enrollee health and well-being. The first round of the enrollee survey will begin in December of 2021 and end in May of 2024. The second round will begin in October of 2022 and end in April of 2025.

The surveys will use a sequential, mixed-mode design. Each round will be web-based, with mail and telephone follow-up, and will be administered in English and Spanish. We estimate that the interview will take about 12 minutes for the Round 1 (R1) survey and 18 minutes for Round 2 (R2). We will release sample cases on a rolling basis that mirrors the months of study enrollment. The surveys will have a 12-week field period, with the full data collection period spanning 25 months in each round.

Table A1-1 lists enrollee survey domains and measures, roughly in the order in which the items will be collected during the interviews. The enrollee survey instruments are provided in Attachment E.

Table A1-1. Enrollee surveys, by domain or topic, by round

Domain/topic (a) | Round 1 | Round 2
Current Employment
Illness or injury that limits work | X | X
Employment status and duration of employment with main employer | X | X
Wage, hours, and benefits | X | X
Employer accommodations | X | X
Reasons for medical leave | X | X
Reasons for not working now | X | X
Job search | X | X
Return-to-work expectations | X | X
Participation in the gig economy | X | X
Benefits
Receipt of workers’ compensation and disability insurance | | X
Income
Household income | | X
Receipt of public assistance (e.g., SNAP, TANF, other) | | X
Training and receipt of employment services
Use of employment services | X | X
Participation in training | X | X
Use RTW coordinator and satisfaction with services | X | X
Health and functioning
Physical and mental health status | X | X
Health insurance | X | X
Work limitations and pain | X | X
Prescribed opioid pain relievers | X | X
Contextual factor
Marital status | X | X

(a) A number of measures are not included in Table A1-1; they are collected during enrollment and captured on part one of the DOL enrollment form.

RTW = return to work; SNAP = Supplemental Nutrition Assistance Program; TANF = Temporary Assistance for Needy Families.


We will document findings from each round of the enrollee survey in the final impact report. SSA might also include findings from the first survey round in a special topic report on interim impacts (October 2024).

Surveys of RETAIN medical providers

The purpose of the survey with RETAIN medical providers is to collect information on program operations, service delivery, and RETAIN-induced practice changes. The survey will capture information not available from other sources about provider practices and the experiences of RETAIN medical providers. Mathematica will conduct surveys with up to 100 providers from each of the five RETAIN programs 15 months after the launch of Phase 2. The survey will begin in February 2023 and have a 14-week field period.

Like the enrollee survey, the provider survey will use a sequential mixed-mode design, with respondents having the option to participate by web, paper, or telephone. We will administer it in English, with a Spanish translation provided upon request. The survey will take about 15 minutes to complete. Table A1-2 lists provider survey domains and measures, roughly in the order in which we will collect the items during the interviews. The survey instruments are provided in Attachment G.

Table A1-2. Provider survey topics, by round

Domain/topic | Round 1 | Round 2
Provision of health care services
Primary role | X |
Years in practice | X |
Percentage of patients using workers’ compensation benefits | X | X
Use of return-to-work best practices | X | X
Experience working with a service coordinator | X | X
Barriers to providing optimal patient care | X | X
Provider experience in RETAIN
Awareness of practice participation in RETAIN | X | X
Share of patients enrolled in RETAIN | X | X
Burden of RETAIN administrative requirements | X | X
Receipt of formal training for RETAIN | X | X
RETAIN training topics | X | X
Satisfaction with training and impact on interaction with all patients | X | X
Barriers for RETAIN success | X | X
Factors discouraging practice participation | X | X
Recommendation for RETAIN adoption by other practices | X | X
Placeholder for state-specific items | X | X
Provision of patient care at practice before RETAIN | X |


We will document the findings from the provider survey data in the final impact report. SSA might also include findings from the provider surveys in a process analysis report (January 2024) and in a special topic report (January 2025) that focuses on early RETAIN impacts.

3. Use of technology to reduce burden

We will not make extensive use of technology for the qualitative components of this data collection, such as site visits and interviews with service users. Trained and experienced professional researchers will conduct the interviews using semi-structured protocols. We will digitally record and transcribe interviews (with service user permission) to allow the interviewer to focus on the conversation. To the extent possible, we will send interview invitations and reminders based on interviewees’ preferences.

Mathematica will send the staff activity logs to program staff via email. We designed the logs to be completed in Microsoft Excel, although program staff can print a PDF version to complete the logs on paper if they prefer. Program staff will return the completed logs to Mathematica via email or fax.

We will use technology in the surveys of enrollees and providers to reduce respondent burden; standardize data collection; and store the evaluation data in a secure, consistent manner. The surveys will use the following:

  • Web-based questionnaires. Mathematica will field the enrollee and provider surveys by web, offering a low-burden way for respondents to self-report whenever it is most convenient for them. Mathematica will deploy the web survey using Confirmit® software.1 This multimode platform allows respondents to complete the interviews using a tablet, computer, or mobile device connecting to the web-based instrument, and to complete the interview by telephone with staff at Mathematica’s Survey Operations Centers. The software offers all the advantages of computer-based administration, including range and logic checks, preprogrammed skips based on item responses or preloaded variables, and dynamic text fills.

To launch the web survey, we will send sample members the survey link and a unique password in the advance letter (see Attachment F). We will send this information to providers in the advance letter, as well as in their invitation and reminder emails (see Attachment H).2 Provider email invitations will feature personalized hyperlinks that allow respondents to begin answering questions without having to input their login information, further minimizing burden.

  • Computer-assisted telephone interviewing (CATI). Mathematica will field the CATI versions of the enrollee and provider instruments using Confirmit® software. Mathematica’s professionally trained interviewers will use the software to manage nonresponse follow-up by telephone, ensuring that nonresponding sample members receive contact attempts across different days of the week and times of day, as well as ensuring that interviewers contact them during appropriate calling hours for their time zone. The system will enable interviewers to record notes after each contact attempt, minimizing sample member burden associated with repeating information to several different interviewers.

Both the web and CATI instruments will allow for breakoffs, should respondents need to pause the interview and resume at a later time, without having to re-populate responses they have already provided.

  • Computer-based sample management system. The sample management system will minimize respondent burden by ensuring that nonresponse follow-up efforts are directed only to applicable cases in each survey. Furthermore, it will ensure that we deliver survey mailings and telephone follow-up efforts in respondents’ preferred language. We will update the system in real time, as respondents complete interviews in any mode. This database will allow Mathematica to update respondent contact information over time, using information provided by the program states and other sources, and direct subsequent mailings to the most current location. Finally, Mathematica will use the sample management system to document why cases may have become ineligible for the surveys (for example, documenting deceased enrollees or providers’ departure from their practice organization).

  • Software for coding of open-ended responses. Some of the questions in the enrollee survey instrument contain an open-ended response format. Mathematica’s trained data coding team will review these responses and group them according to themes, applying codes that facilitate statistical analysis. Mathematica’s coding software, ASCRIBE®, will facilitate high-quality coding by offering proposed codes based on prior decisions, coding all instances of a given statement uniformly, and providing quality assurance checks for supervisors to test for intercoder reliability.

  • Toll-free telephone number, survey website, and email address. All survey mailings will include a toll-free number that sample members can use to contact the study team with questions or concerns. Professionally trained interviewers will respond to these calls throughout the field periods. In addition, SSA will host an information website, which sample members can visit to obtain information and relay concerns about the legitimacy of the surveys. Provider survey sample members will also have access to a survey email address, which they can use to contact Mathematica staff about the survey.

4. Efforts to avoid duplication

The evaluation of RETAIN will not require collection of information that is available through alternate sources.

The site visits and service user interviews will provide information that cannot be obtained through SSA’s administrative records, other readily available sources, or other planned survey efforts for the demonstration. We will use these data to describe how the RETAIN programs designed and delivered RETAIN services. For example, the first round of interviews with state administrators and program staff will include discussion of organizational partnerships, recruitment and enrollment, provider and service users’ participation, service take-up, fidelity to the service model, and data collection procedures. The second round of interviews with state staff will focus on the fully implemented programs’ service delivery experiences, changes to the model or stakeholder partnerships since implementation, and the feasibility of and plans for sustaining the model after the demonstration. These later interviews will also yield information about changes in the counterfactual service environment that have implications for the treatment contrast or the evaluation’s ability to detect impacts.

The staff activity logs will provide information that is not available through SSA’s administrative records, the programs’ management information systems, or the programs’ administrative cost data. The amount of time staff spend on services such as coordination with medical providers and on program administration will help us understand how the programs operate and the services that they emphasize.

The enrollee and provider surveys will provide additional information that is unavailable in SSA’s program records. For example, the enrollee surveys will collect information on the experiences and well-being of RETAIN enrollees, including their employment status, job skills development, health, health insurance coverage, employer accommodations, satisfaction with RETAIN services, expectations for the future, and household income and benefit receipt. These data are not available from any other source. The survey will not collect information that is available in SSA administrative data, including SSA disability applications and payments and calendar year earnings.

Similarly, the RETAIN service provider surveys will collect information that is not available through any other source, including data on provider awareness of participation in the demonstration, engagement in RETAIN training, and approach to delivering services.

5. Methods to minimize burden on small entities

Some of the service providers that Mathematica will interview for the process analysis may be staff of small entities. Understanding this, we will minimize burden on those and all organizations as best we can while still obtaining the necessary information. In particular, Mathematica will keep discussions to one hour or less, and whenever possible, we will obtain information from other sources (such as administrative data) to limit how much we ask of staff. Mathematica has kept the number of interviews to a minimum and will schedule them at times that are convenient to the respondents.

The survey of RETAIN providers will pose minimal burden to small entities that may be participating in the demonstration. These health care organizations have agreed to participate in the demonstration and recognize that provider participation in the surveys is a part of that effort. Mathematica will minimize the burden by directing nonresponse follow-up efforts to the providers themselves, placing minimal burden on staff at the front desk of the organization, and by making the survey available for completion outside business hours, on a day, at a time, and in a format that is most convenient and least burdensome for the provider.

6. Consequences of not collecting data

Site visits and in-person interviews with program staff

These site visits and interviews are valuable for observing program operations firsthand and understanding which aspects of the programs work well and why. Moreover, if we make fewer visits, SSA and DOL will not be able to assess how the programs evolve over time to address challenges and leverage successes. Conducting interviews in person will allow the evaluation team to capture as complete a picture as possible of what program implementation looked like in practice and enhance Mathematica’s ability to develop a narrative about service delivery that will give DOL and SSA a rich source of information on ways to improve programs.

Interviews with service users

These telephone interviews are necessary to help DOL and SSA assess whether service users have a favorable impression of the services; how their impressions translate into service use; and how participation in RETAIN affected enrollees’ employment decisions and quality of life. Not collecting this information would lead to missed opportunities for improving programs and assessing how well the quantitative analysis findings apply in different settings. Finally, speaking with both RETAIN enrollees and project staff will support a more balanced approach to understanding program implementation than we could gain from interviewing project staff alone.

Staff activity logs

Mathematica will collect the staff activity logs in two one-week periods around the time of the second round of site visits to each program. Two periods are necessary to provide a representative sample of staff’s time use and to account for potential seasonal differences in program activities. The data collected are necessary to conduct a credible evaluation and are not available from other sources. Failure to collect the data would result in a less-precise benefit-cost analysis.

Surveys with RETAIN enrollees

The enrollee surveys collect critical data to help measure program outcomes for which data are not available from other sources. Without the survey data enrollees provide, the evaluation will not be able to assess the impact of RETAIN on these outcomes, including employment status, income and benefit receipt, workplace accommodations, health status, and satisfaction with services RETAIN provides. Furthermore, we cannot collect these data less frequently, as the two surveys provide a critical measure of the short- and longer-term impacts of receipt of RETAIN services.

Surveys with RETAIN service providers

The survey of RETAIN service providers will collect critical data to help measure program outcomes for which data are not available from other sources. Without these data, the evaluation will not be able to assess the impact of RETAIN on outcomes such as greater utilization of return-to-work best practices in care delivery or providers’ perceptions of the barriers they face in providing optimal care to this population. These data play a critical role in evaluating whether the field as a whole should seek to emulate the practices developed by RETAIN.

7. Special circumstances

There are no special circumstances that would cause this information collection to be conducted in a manner inconsistent with 5 CFR 1320.5.

8. Federal Register announcement and consultation

a. Federal Register Notice

SSA published the 60-day advance Federal Register Notice on January 6, 2021, at 86 FR 667, and we received no public comments. SSA published the 30-day FRN on March 12, 2021, at 86 FR 14170. If we receive any comments in response to this Notice, we will forward them to OMB.

b. Consultation with outside agencies

As a first step in the RETAIN evaluation, SSA undertook collaboration with its partner agency, DOL, on key issues relating to Phase 1 implementation and recruitment efforts across the state programs. SSA has also organized a technical working group (TWG) to provide input on key research questions, evaluability considerations, feasible experimental and nonexperimental methods, survey designs, analysis strategies, and interpretation and presentation of results. The TWG consists of researchers and clinicians with expertise in the areas of disability, early intervention, and evaluation design. The TWG is scheduled to meet on a regular basis, with three meetings planned during Phase 1 of the demonstration (in February, May, and August 2019) and three meetings planned during Phase 2, timed around key evaluation reports (early assessment, process and early impacts, final impacts). These external experts are:

  • Thomas Wickizer, Ph.D., Ohio State University College of Public Health

  • Glenn Pransky, former director of the Center for Disability Research at the Liberty Mutual Research Institute

  • Carolyn Heinrich, Ph.D., Vanderbilt University

  • Jack Smalligan, M.A., Urban Institute

  • Frank Neuhauser, Ph.D., University of California at Berkeley’s Institute for the Study of Societal Issues

  • Douglas Martin, M.D., Medical Director, UnityPoint Health – St. Luke’s Occupational Medicine

  • Marianne Cloeren, M.D., M.P.H., University of Maryland School of Medicine

  • Benjamin Doornink, M.B.A., Kootenai Health

An interdisciplinary team of economists, disability policy researchers, and survey researchers on staff at the evaluation contractor (Mathematica and its subcontractor, Tree House Economics, LLC) is contributing to the design of the overall evaluation. These individuals include:

  • Jillian Berk, Ph.D., Mathematica

  • Rosalind Keith, Ph.D., Mathematica

  • Gina Livermore, Ph.D., Mathematica

  • Holly Matulewicz, M.A., Mathematica

  • David Wittenburg, Ph.D., Mathematica

  • David Stapleton, Ph.D., Tree House Economics, LLC

  • Kenneth Fortson, Ph.D., Mathematica

c. Consultation with RETAIN enrollees

Interviews with RETAIN service users will provide firsthand feedback on experiences with RETAIN. Where applicable, we will use findings from the interviews we hold early on to refine procedures and discussion topics for interviews we conduct later. Because of the timing of the RETAIN service user interviews (September 2022), we will not use these findings to inform the design of the survey instruments for the 2- and 12-month questionnaires used in the RETAIN enrollee surveys.

9. Payments or gifts

Mathematica will not offer remuneration to program administrators or directors or to RETAIN program staff members for participating in the qualitative interviews or completing staff activity logs. Mathematica will give RETAIN service user interviewees a $30 gift card to express the study team’s appreciation for their time.

The enrollee and provider surveys will offer incentives for participation. SSA plans the following respondent payments:

  • Each round of the enrollee survey will feature a total incentive of $30. Mathematica will include a $5 prepaid cash incentive in the survey advance letter. The $5 prepayment is designed to encourage participation and offset costs associated with nonresponse follow-up. Respondents who complete the survey, by any mode, will receive a $25 gift card. Although the demographic characteristics of RETAIN enrollees are not yet known, Mathematica anticipates that gift cards will maximize the use and value of the incentive amount among survey respondents, especially for those who lack access to banks and might incur check-cashing fees.

Research shows that incentives increase response rates without compromising data quality (Singer and Kulka 2000), and they help increase response rates among people with relatively low educational levels (Berlin et al. 1992), among low-income and non‑white populations (James and Bolstein 1990), and among unemployed workers (Jäckle and Lynn 2007). There is also evidence that incentives bolster participation among those with lower interest in the survey topic (Jäckle and Lynn 2007; Kay 2001; Schwartz, Goble, and English 2006), resulting in data that are more complete.

  • The provider survey will feature a total incentive of $50. Mathematica will include a $5 prepaid cash incentive in the survey advance letter. As with the enrollee survey, this prepayment is designed to encourage participation and offset costs associated with nonresponse follow-up. Respondents will receive a $45 check. The provider incentive design is drawn from industry-wide practices for motivating responses from health care professionals (Cho, Johnson, and VanGeest 2013; McLeod et al. 2013).

10. Assurances of confidentiality

The information provided during the staff and RETAIN enrollee interviews is protected and held confidential in accordance with 42 U.S.C. 1306; 20 CFR 401 and 422; 5 U.S.C. 552 (Freedom of Information Act); 5 U.S.C. 552a (Privacy Act of 1974); and OMB Circular No. A-130. The data will be treated in a confidential manner unless otherwise compelled by law.

The study team takes seriously the ethical and legal obligations associated with the collection of confidential data. Secure handling of confidential data is accomplished via several mechanisms, including obtaining suitability determinations for designated staff, training staff to recognize and handle sensitive data, protecting computer systems from access by staff without favorable suitability determinations, limiting the use of personally identifiable information in data, limiting access to secure data to staff with favorable suitability determinations on a need-to-know basis, and creating data extract files from which identifying information has been removed.

We will make clear the assurances and limits of confidentiality in all advance materials sent to recruit RETAIN service users, and we will restate these assurances at the beginning of each interview. Although Mathematica staff members may work with the state liaisons to schedule and coordinate the interviews with RETAIN staff, they will not give those staff direct feedback on findings from the interviews. Mathematica will aggregate all relevant findings from the staff interviews in the evaluation reports. For the RETAIN service user interviews, Mathematica staff will have access to the states’ enrollment data, which will contain contact information for each potential participant for the interviews. However, Mathematica will not release this information to anyone outside the evaluation team. Moreover, Mathematica will not reveal to the states, RETAIN programs, or any other entity the names of the service users who participated in these interviews.

The Paperwork Reduction and Privacy Act statements appear on the enrollee and provider survey advance letters (Attachments F and H). Mathematica will include text reiterating assurances about the purposes of the survey and how we will use the data provided in the advance notification letter and in the survey introductions across all modes (Attachments E, F, G, and H). After we collect and analyze the survey data, Mathematica will not attribute the information that survey respondents provide to specific individuals in any public documents. Finally, Mathematica will destroy all data collected during the interviews and surveys in a secure manner at the completion of the evaluation.

11. Justification of sensitive information

We will not ask RETAIN staff any questions that are sensitive in nature. We do not expect the interviews with RETAIN service users to touch on any sensitive topics related to their involvement with RETAIN and services they have received. However, the general process of discussing their return-to-work experiences might be sensitive for some individuals, depending on their lived experiences and perspectives on their medical condition(s). We anticipate that these individuals will decline the interview solicitation.

Some enrollee survey respondents might have similar sensitivities with respect to discussion of their health. Additional items might also be sensitive topics for respondents, including household income, participation in public benefit programs, and whether the respondent was prescribed opioid medications. Some enrollees might consider questions about household income and benefit receipt to be sensitive because they believe financial matters are private. Public concern about opioid addiction and abuse might cause some respondents to feel embarrassed or ashamed to report opioid use even when that use is appropriate. All modes of survey administration will permit respondents to refuse to answer questions they do not wish to answer or that make them feel uncomfortable.

The provider survey does not collect any information that could be considered sensitive. Nonetheless, these respondents will have the same opportunity to decline responding to any questions they do not wish to answer.

12. Estimates of the hours of burden

Staff interviews. Over the course of the evaluation, we will conduct interviews with a total of 95 staff, including interviews with RETAIN administrators. Burden estimates per staff member for these interviews are 1.25 hours in total for each round, which includes time for setting up interview appointments by telephone or email (0.25 hours) and participating in the interview (1.0 hours). We have allocated an additional 0.5 hours for the RETAIN state administrators or directors (1.75 hours in total for each round) to participate in provider selection and overall planning for the first site visit and in gathering information that will inform the benefit-cost analysis in the second visit. The estimated total burden time for all respondents and nonrespondents is 243 hours.

Interviews with RETAIN service users. The estimated time per response for these interviews varies from 0.1 hours (to review the invitation letter) for nonrespondents to 0.6 hours for interviewees (to review the invitation letter, call in to schedule an appointment, and complete the telephone interview). The bulk of annual burden time is spent in the interview itself, which will last up to 30 minutes. The estimated total burden time for all respondents and nonrespondents is 113 hours. This includes time spent fielding inquiries and scheduling interviews with up to 75 enrollees across the five RETAIN programs (15 per state).

Staff activity logs. The estimated time to complete the staff activity log is five minutes per day, and we are asking staff to complete the log each day for two one-week periods. This estimate includes time spent reviewing task instructions, recording their information, and returning the completed form. We anticipate that the data collection will include 1 RETAIN administrator (5 total) and 12 agency line staff per state (60 total). The total burden for this effort is 76 hours.

Enrollee surveys. The sample will include 14,040 individuals enrolled in RETAIN across the five programs (3,000 each for Kansas, Kentucky, Minnesota, and Ohio; and 2,040 for Vermont). Assuming a response rate of 80 percent at each round, we will conduct 22,464 interviews (11,232 each round). We anticipate the response burden for the R1 enrollee survey to be 15 minutes (0.25 hours). This includes time allocated for reviewing the advance mailing and potentially calling in to book an interview appointment (0.05 hours), as well as the time anticipated for completing the interview (0.2 hours). Across the 11,232 enrollee interviews in R1, the total burden is 2,808 hours for survey respondents. We expect the second round of the survey to have a slightly larger burden, as the R2 survey interview duration is longer. The R2 survey interview includes questions that we will not include in R1. We estimate the R2 response burden to be 21 minutes (0.35 hours), which includes time allocated for reviewing the advance mailing and potentially calling in to book an interview appointment (0.05 hours), as well as the time anticipated for completing the interview via any mode (0.30 hours). Across the 11,232 enrollee interviews in R2, the total burden is 3,932 hours. These estimates reflect a total expected burden of 7,020 hours for respondents and nonrespondents in the enrollee survey.

Provider surveys. The sample will include 500 providers delivering RETAIN services across the five programs (100 per program). Assuming a response rate of 80 percent, we will conduct a total of 400 interviews. We estimate the response burden to be 17 minutes (0.28 hours), which includes time allocated for reviewing the advance mailing and potentially calling in to book an interview appointment (0.05 hours), as well as the time anticipated for completing the interview (0.23 hours). Across the 400 provider surveys, the total burden is 113 hours. These estimates reflect a total expected burden estimate of 118 hours for provider survey respondents and nonrespondents.
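
The arithmetic behind the enrollee and provider survey estimates above can be made explicit with a short calculation. The following sketch, written in Python, reproduces those totals from the figures cited in this section; the function and variable names are illustrative only.

```python
# Minimal sketch reproducing the enrollee and provider survey burden arithmetic
# described in Section 12; all input figures come from the text above.

SAMPLE_ENROLLEES = 14_040          # 3,000 each in four states plus 2,040 in Vermont
SAMPLE_PROVIDERS = 500             # 100 per program
RESPONSE_RATE = 0.80

R1_RESPONDENT_HOURS = 0.25         # 15 minutes (advance mailing review plus interview)
R2_RESPONDENT_HOURS = 0.35         # 21 minutes
PROVIDER_RESPONDENT_HOURS = 0.28   # 17 minutes
NONRESPONDENT_HOURS = 0.05         # reviewing the advance mailing only

def survey_burden(sample, rate, respondent_hours, nonrespondent_hours):
    """Total burden hours for one survey round, respondents plus nonrespondents."""
    respondents = round(sample * rate)
    nonrespondents = sample - respondents
    return respondents * respondent_hours + nonrespondents * nonrespondent_hours

enrollee_total = (
    survey_burden(SAMPLE_ENROLLEES, RESPONSE_RATE, R1_RESPONDENT_HOURS, NONRESPONDENT_HOURS)
    + survey_burden(SAMPLE_ENROLLEES, RESPONSE_RATE, R2_RESPONDENT_HOURS, NONRESPONDENT_HOURS)
)
provider_total = survey_burden(SAMPLE_PROVIDERS, RESPONSE_RATE,
                               PROVIDER_RESPONDENT_HOURS, NONRESPONDENT_HOURS)

print(f"Enrollee survey burden: {enrollee_total:,.0f} hours")   # 7,020 hours
print(f"Provider survey burden: {provider_total:,.0f} hours")   # 117 hours; the text
# reports 118 because 17 minutes is slightly more than 0.28 hours
```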

Please see the burden charts below:

RETAIN 2021 Burden Figures:

Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Average Wait Time in state RETAIN facilities (minutes)** | Total Annual Opportunity Cost (dollars)***
Enrollee Survey Round 1 (Respondents) | 374 | 1 | 15 | 94 | $25.72* | 24** | $6,276***
Enrollee Survey Round 1 (Nonrespondents) | 94 | 1 | 3 | 5 | $25.72* | 24** | $1,106***
Totals | 468 | | | 99 | | | $7,382***


RETAIN 2022 Burden Figures:

Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Average Wait Time in state RETAIN facilities (minutes)** | Total Annual Opportunity Cost (dollars)***
Staff Interviews (state administrators/directors) | 5 | 1 | 105 | 9 | $45.23* | 24** | $498***
Staff Interviews (program line staff) | 90 | 1 | 75 | 113 | $32.58* | 24** | $4,854***
Service User Interviews (Respondents) | 75 | 1 | 36 | 45 | $25.72* | 24** | $1,929***
Service User Interviews (Nonrespondents) | 675 | 1 | 6 | 68 | $25.72* | 24** | $8,693***
Staff Activity Logs (state administrators/directors) | 5 | 1 | 70 | 6 | $45.23* | 24** | $362***
Staff Activity Logs (program line staff) | 60 | 1 | 70 | 70 | $32.58* | 24** | $3,063***
Enrollee Survey Round 1 (Respondents) | 4,493 | 1 | 15 | 1,123 | $25.72* | 24** | $75,102***
Enrollee Survey Round 1 (Nonrespondents) | 1,123 | 1 | 3 | 56 | $25.72* | 24** | $12,989***
Enrollee Survey Round 2 (Respondents) | 1,123 | 1 | 21 | 393 | $25.72* | 24** | $21,656***
Enrollee Survey Round 2 (Nonrespondents) | 281 | 1 | 3 | 14 | $25.72* | 24** | $3,241***
Totals | 7,930 | | | 1,897 | | | $132,387***


RETAIN 2023 Burden Figures:

Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Average Wait Time in state RETAIN facilities (minutes)** | Total Annual Opportunity Cost (dollars)***
Staff Interviews (state administrators/directors) | 5 | 1 | 105 | 9 | $45.23* | 24** | $498***
Staff Interviews (program line staff) | 90 | 1 | 75 | 113 | $32.58* | 24** | $4,854***
Enrollee Survey Round 1 (Respondents) | 4,493 | 1 | 15 | 1,123 | $25.72* | 24** | $75,102***
Enrollee Survey Round 1 (Nonrespondents) | 1,123 | 1 | 3 | 56 | $25.72* | 24** | $12,989***
Enrollee Survey Round 2 (Respondents) | 4,493 | 1 | 21 | 1,573 | $25.72* | 24** | $86,676***
Enrollee Survey Round 2 (Nonrespondents) | 1,123 | 1 | 3 | 56 | $25.72* | 24** | $12,989***
Provider Survey (Respondents) | 400 | 1 | 17 | 113 | $32.58* | 24** | $8,894***
Provider Survey (Nonrespondents) | 100 | 1 | 3 | 5 | $32.58* | 24** | $1,466***
Totals | 11,827 | | | 3,048 | | | $203,468***


RETAIN 2024 Burden Figures:

Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Average Wait Time in state RETAIN facilities (minutes)** | Total Annual Opportunity Cost (dollars)***
Enrollee Survey Round 1 (Respondents) | 1,872 | 1 | 15 | 468 | $25.72* | 24** | $31,301***
Enrollee Survey Round 1 (Nonrespondents) | 468 | 1 | 3 | 23 | $25.72* | 24** | $5,401***
Enrollee Survey Round 2 (Respondents) | 4,493 | 1 | 21 | 1,573 | $25.72* | 24** | $86,676***
Enrollee Survey Round 2 (Nonrespondents) | 1,123 | 1 | 3 | 56 | $25.72* | 24** | $12,989***
Totals | 7,956 | | | 2,120 | | | $136,367***


RETAIN 2025 Burden Figures:

Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Average Wait Time in state RETAIN facilities (minutes)** | Total Annual Opportunity Cost (dollars)***
Enrollee Survey Round 2 (Respondents) | 1,123 | 1 | 21 | 393 | $25.72* | 24** | $21,656***
Enrollee Survey Round 2 (Nonrespondents) | 281 | 1 | 3 | 14 | $25.72* | 24** | $3,241***
Totals | 1,404 | | | 407 | | | $24,897***


RETAIN Grand Total Burden Figures:

Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Average Wait Time in state RETAIN facilities (minutes)** | Total Annual Opportunity Cost (dollars)***
Totals | 29,585 | | | 7,571 | | | $504,501***

* We based these figures on average U.S. citizen’s hourly salary, as reported by Bureau of Labor Statistics data (https://www.bls.gov/oes/current/oes_nat.htm), and average local Government Management and staff hourly wages, as reported by Bureau of Labor Statistics data (https://www.bls.gov/oes/current/oes110000.htm) & (https://www.bls.gov/oes/current/oes131071.htm).

** We based this figure on the average FY 2020 wait times for field offices, based on SSA’s current management information data.

*** This figure does not represent actual costs that SSA is imposing on respondents to complete this information collection; rather, these are theoretical opportunity costs for the additional time respondents will spend on the collection. There is no actual charge to respondents.

13. Estimates of cost burden to respondents

There is no cost burden to respondents other than the value of their time to participate in the study. Costs for data collection, storage, processing, and other functions related to these data are borne solely by the evaluation contractor. The total cost to study participants for their time in this collection is shown in Table A3; an illustrative calculation of a single table row follows the table.

Table A3. Annual Cost to Respondents

Respondent | Average burden per response (hours) | Number of respondents | Frequency of response | Median hourly wage rate | Respondent cost

2021
Enrollee survey R1
  Respondents | 0.25 | 374 | 1 | $25.72 | $2,418
  Nonrespondents | 0.05 | 94 | 1 | $25.72 | $129
Total | 0.3 | 468 | ** | ** | $2,546

2022
Staff interviews
  RETAIN administrators/directors | 1.75 | 5 | 1 | $45.23 | $407
  RETAIN program line staff | 1.25 | 90 | 1 | $32.58 | $3,682
Services user interviews
  Respondents | 0.6 | 75 | 1 | $25.72 | $1,157
  Nonrespondents | 0.1 | 675 | 1 | $25.72 | $1,749
RETAIN staff activity logs
  RETAIN state administrators/directors | 1.16 | 5 | 1 | $45.23 | $271
  RETAIN program line staff | 1.16 | 60 | 1 | $32.58 | $2,281
Enrollee survey R1
  Respondents | 0.25 | 4,493 | 1 | $25.72 | $28,884
  Nonrespondents | 0.05 | 1,123 | 1 | $25.72 | $1,440
Enrollee survey R2
  Respondents | 0.35 | 1,123 | 1 | $25.72 | $10,108
  Nonrespondents | 0.05 | 281 | 1 | $25.72 | $360
Total | 7.05 | 7,930 | ** | ** | $50,339

2023
Staff interviews
  RETAIN administrators/directors | 1.75 | 5 | 1 | $45.23 | $407
  RETAIN program line staff | 1.25 | 90 | 1 | $32.58 | $3,682
Enrollee survey R1
  Respondents | 0.25 | 4,493 | 1 | $25.72 | $28,884
  Nonrespondents | 0.05 | 1,123 | 1 | $25.72 | $1,440
Enrollee survey R2
  Respondents | 0.35 | 4,493 | 1 | $25.72 | $40,458
  Nonrespondents | 0.05 | 1,123 | 1 | $25.72 | $1,440
Provider survey
  Respondents | 0.28 | 400 | 1 | $32.58 | $3,682
  Nonrespondents | 0.05 | 100 | 1 | $32.58 | $163
Total | 1.03 | 11,827 | ** | ** | $80,155

2024
Enrollee survey R1
  Respondents | 0.25 | 1,872 | 1 | $25.72 | $12,037
  Nonrespondents | 0.05 | 468 | 1 | $25.72 | $592
Enrollee survey R2
  Respondents | 0.35 | 4,493 | 1 | $25.72 | $40,458
  Nonrespondents | 0.05 | 1,123 | 1 | $25.72 | $1,440
Total | 0.7 | 7,956 | ** | ** | $54,526

2025
Enrollee survey R2
  Respondents | 0.35 | 1,123 | 1 | $25.72 | $10,108
  Nonrespondents | 0.05 | 281 | 1 | $25.72 | $360
Total | 0.4 | 1,404 | ** | ** | $10,468

Grand total
Staff interviews
  Administrators/directors | 3.5 | 5 | 1 | $45.23 | $814
  RETAIN program line staff | 2.5 | 90 | 1 | $32.58 | $7,363
RETAIN staff activity logs
  RETAIN state administrators/directors | 1.16 | 5 | 1 | $45.23 | $271
  RETAIN program line staff | 1.16 | 60 | 1 | $32.58 | $2,281
Service user interviews
  Respondents | 0.6 | 75 | 1 | $25.72 | $1,157
  Nonrespondents | 0.1 | 675 | 1 | $25.72 | $1,749
Enrollee surveys R1, R2
  Respondents | 0.6 | 11,232 | 1 | $25.72 | $173,353
  Nonrespondents | 0.1 | 2,808 | 1 | $25.72 | $7,202
Provider survey
  Respondents | 0.28 | 400 | 1 | $32.58 | $3,682
  Nonrespondents | 0.05 | 100 | 1 | $32.58 | $163
Total | ** | 15,450 | ** | ** | $198,034
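
The respondent cost entries in Table A3 can be approximated in the same way from the burden, respondent count, and wage columns. A minimal sketch in Python, assuming total burden hours are rounded to whole hours before the median wage is applied (again an inference from the published figures):

    def respondent_cost(respondents, burden_hours_each, hourly_wage):
        """Approximate one Table A3 respondent cost entry.

        Assumes the total burden (respondents x hours per response) is
        rounded to whole hours before applying the median hourly wage.
        """
        total_hours = round(respondents * burden_hours_each)
        return round(total_hours * hourly_wage)

    # 2022 enrollee survey R1 respondents: 4,493 people x 0.25 hours at $25.72/hour
    print(respondent_cost(4493, 0.25, 25.72))  # 28884, matching the $28,884 entry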



14. Annualized cost to the federal government

The total cost to SSA of conducting the RETAIN evaluation is $20,806,467.00. The cost by year is shown in Table A4. We budgeted labor costs by estimating the number of staff hours required at the various wage levels, multiplying by the applicable wage rates, and multiplying the resulting subtotals by factors to cover fringe benefits and burden expense. The basis for estimating other direct costs varies with the type of cost being estimated. We summed labor costs and other direct costs, multiplied the total by a factor to cover general and administrative expenses, and then added the fee.
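
For illustration only, that cost build-up can be sketched in Python as follows; the fringe, general and administrative (G&A), and fee rates shown are hypothetical placeholders, not the rates used in the actual budget:

    def estimated_cost(labor_hours_by_wage, other_direct_costs,
                       fringe_rate=0.30, ga_rate=0.10, fee_rate=0.05):
        """Illustrative cost build-up; all rates are hypothetical placeholders.

        labor_hours_by_wage: list of (hours, hourly_wage) pairs.
        """
        # Labor: hours x wage, marked up by a factor covering fringe benefits and burden expense
        labor = sum(hours * wage for hours, wage in labor_hours_by_wage) * (1 + fringe_rate)
        # Add other direct costs, apply the G&A factor, then add the fee
        subtotal = (labor + other_direct_costs) * (1 + ga_rate)
        return subtotal * (1 + fee_rate)

    # Hypothetical example: 10,000 hours at $60/hour plus $500,000 in other direct costs
    print(round(estimated_cost([(10_000, 60.0)], 500_000)))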

Table A4. Annual Costs to the Federal Government

Fiscal year | Cost
2021 | $3,200,000
2022 | $4,200,000
2023 | $4,200,000
2024 | $2,944,519
2025 | $1,786,501
Total | $16,331,020

15. Reasons for program changes or adjustments

This is a new information collection that will increase the public reporting burden.

16. Plans for tabulation and publication of results

With the findings of the RETAIN evaluation, SSA and DOL will be able to advise federal policymakers and state administrators on supports, services, and policy and program changes that could improve labor force participation and retention of individuals experiencing the onset of an injury, illness, or condition that could threaten their ability to remain employed.

Mathematica will analyze the information collected in the interviews to prepare reports that present the findings and their program and policy implications. We will not use complex quantitative analytical techniques with these data.

Two major reports will present the findings from the site visits and interviews. The reports will include a stand-alone summary of the purpose, methods, key findings, and policy implications, as well as a short executive summary. Products resulting from information obtained in this data collection will provide DOL and SSA with information about the experiences of RETAIN program administrators, project staff, and enrollees. Mathematica will integrate the qualitative information collected for the process evaluation with information collected from the other components of the evaluation and use it to draw comparisons between states.

We will include the enrollee and provider survey findings in the impact report. The impact report will include a stand-alone summary of the purpose of the demonstration and the evaluation, methods, key findings, and policy implications, as well as a short executive summary. Products resulting from information obtained in this data collection will provide DOL and SSA with information on the short- and intermediate-term impacts of participation in RETAIN for enrollees and providers. Mathematica will integrate the information obtained from the surveys with information collected from the other components of the evaluation. We will use these data to draw summary conclusions about RETAIN overall, and to identify and provide potential explanations for any differences in outcomes observed across the participating states. Table A5 shows the planned timeline for the data collection along with the completion dates for the public reports that will include the interview findings.

Table A5. Data Collection and Reporting Schedule

Activity/report | Approximate dates

Data collection
RETAIN enrollee survey, round 1 | January 2022–June 2024
RETAIN enrollee survey, round 2 | December 2022–May 2025
RETAIN provider survey | March 2023
RETAIN program administrator, staff interviews | April 2022 and May 2023
RETAIN service user interviews | September 2022
Staff activity logs | April–June 2022

Reports
Early assessment report | October 2022
Process analysis report | January 2024
Early impacts report | January 2025
Final impacts report | February 2026

17. Approval not to display the expiration date for OMB approval

SSA is not seeking an exemption with this submission. We will display the OMB expiration date on all interview materials.

18. Explanation of exceptions

SSA is not requesting an exemption to certification requirements.

1 Confirmit® is the computer-assisted interviewing system and survey-processing tool Mathematica uses for survey data collection. The software was developed by Confirmit® for the Windows® operating system and web browsers.

2 SSA security requirements do not permit correspondence with enrollee survey sample members through email. However, provider survey sample members will receive electronic communications at the email address provided by their practice organizations.


