

Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes

Next Generation of Enhanced Employment Strategies Project



OMB Information Collection Request

0970-0545





Supporting Statement

Part A

March 2020

Revised May 2022


Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Hilary Bruck

Marie Lawrence

Gabrielle Newell







Part A




Executive Summary


  • Type of Request: This Information Collection Request is for changes to the new collection request approved in April 2020 under OMB #0970-0545.



  • Description of Request: The Office of Planning, Research, and Evaluation (OPRE) within the Administration for Children and Families (ACF) is conducting data collection activities for the Next Generation of Enhanced Employment Strategies Project (NextGen Project). The project includes experimental impact, descriptive, and cost studies of about 10 programs. As described in the initial request, we are using a two-phased approach for our information collection requests. The first phase included instruments that are uniform across programs selected for evaluation. The second phase includes materials that could be tailored to individual programs and therefore are finalized after recruitment of specific programs. This request seeks approval for minor changes to the first-phase baseline survey; for use of a subset of second-phase instruments, with changes to those instruments, with programs selected for inclusion in the NextGen Project; and for changes to the tokens of appreciation for the follow-up surveys. We do not intend for this information to be used as the principal basis for public policy decisions.



  • Time Sensitivity: We are planning to begin these data collections in some selected programs in May 2022.






The Office of Planning, Research, and Evaluation (OPRE) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) seeks approval for data collection activities conducted for the Next Generation of Enhanced Employment Strategies Project (NextGen Project). OPRE contracted with Mathematica to conduct the NextGen Project.


A1. Necessity for Collection

OPRE has spent decades studying strategies to help low-income people find and keep jobs. Findings from these studies have been mixed, revealing variation in what works for whom and in the duration and magnitude of impacts. Some studies have also demonstrated that certain programs are less accessible to individuals with complex challenges, such as low educational attainment or involvement with the criminal justice system, because of the programs’ eligibility requirements.


The NextGen Project is intended to build on the findings and lessons learned from these past and ongoing evaluations by identifying and rigorously evaluating the “next generation” of employment strategies for highly vulnerable populations with complex barriers to obtaining and retaining employment. These strategies may be enhancements or adaptations of previously evaluated strategies, or innovative approaches that show promise in the field and are ready to be tested. Additionally, the project has a particular interest in the role of market-oriented, employment-focused programs, such as social enterprises and public/private partnerships, in helping highly vulnerable populations obtain and retain employment. The current data collection request is necessary to continue these rigorous evaluations.


A2. Purpose

Purpose and Use

The information collected through the instruments included in this Information Collection Request (ICR) will be used to evaluate innovative programs serving low-income individuals facing complex challenges to employment and economic independence to expand the evidence base in this area.


The NextGen Project is actively coordinating with another current project sponsored by OPRE, the Building Evidence on Employment Strategies for Low-Income Families (BEES) study (OMB #0970-0537). BEES may include impact and/or implementation studies of up to 21 employment-focused programs; these will not overlap with programs selected for the NextGen Project. The NextGen Project and BEES have a common goal to foster stronger understanding of the types of programs that can improve labor market outcomes for low-income individuals; however, the projects also maintain separate domains of focus. In addition, both projects are involved in a joint effort with the Social Security Administration (SSA). SSA has provided demonstration program funds to ACF to support the addition of a disability focus in both projects; specifically, to identify and evaluate employment-related programs for potential SSI applicants. This is intended to assist SSA in better understanding the types of early interventions that effectively connect or reconnect potential SSI applicants to work before they apply for SSI. See Section A4 for information about coordination and efforts to not duplicate activities.


Data collection instruments for the NextGen Project impact studies will provide baseline and outcome data about study participants, which the project team will use to estimate the effectiveness of each program. The project team will use data collection instruments for the descriptive studies to describe each program’s design, staffing, service provision, partnerships, and other details necessary to understand the nature of and context for the programs, and for other organizations to replicate them. The instruments will also help inform the interpretation of impact findings. Finally, the project team will use data collection for the cost studies to estimate the costs of implementing each evaluated program and to estimate the cost-effectiveness of the programs. The results will provide policymakers and practitioners with high-quality information on the effects, design and implementation, and the cost of the programs. Having this information will help strengthen policy and practice to better serve individuals facing complex challenges to employment and economic independence. Study findings may also inform future studies in this area.


The information collected is meant to contribute to the body of knowledge on ACF programs. It is not intended to be used as the principal basis for a decision by a federal decision-maker and is not expected to meet the threshold of influential or highly influential scientific information.


Research Questions or Tests

The questions this evaluation will answer are in Table A.1.


Table A.1. Research questions for the NextGen Project

Impact studies

Did the program affect the amounts and types of services participants receive?

Did the program improve participants’ employment outcomes (employment, earnings, job retention and advancement, and quality of job) and economic independence (income, public assistance receipt)?

Did the program improve outcomes relevant to the challenges faced by the target population, for example, by reducing substance abuse, reducing criminal justice involvement, or increasing education, credentialing, and training?

Did the program improve participants’ physical health, mental health, and well-being?

Was the program more effective for some groups of participants than others? If so, which groups?

Did the impacts of the program change over time? If so, how?

How did the program’s costs compare to the benefit of the impacts it generated? What were the net benefits for participants and society as a whole?

Descriptive studies

How was the program designed and implemented?

What contextual, organizational, and other factors impeded or facilitated implementation?

What were the challenges faced, solutions, and lessons learned?

What were the characteristics of study participants?

What services were participants offered, and what were the participation and outcome patterns?

What role did employers play in the program? How did local labor market conditions affect the program design, implementation, and employers’ and participants’ involvement?

Which program services or implementation features appear to be related to program impacts? Which components or services do participants and staff perceive to be helpful?

What were the backgrounds and experience of program staff and program leaders?

How did staff spend their time, and how many participants did they work with?

How did program leaders spend their time?

How did participants perceive the program? What were the most helpful elements? How did the program affect their lives?

Cost studies

How was the program funded? What were its costs? Was the program sustainable?


Study Design

The NextGen Project will include experimental impact, descriptive, and cost studies of about 10 programs. It will study programs that include a wide range of supports designed to serve individuals with multiple challenges to employment and that might be delivered by public–private partnerships, interagency collaborations, government initiatives, nonprofit agencies, or social enterprises. In addition to these studies, the project may include case studies of employers and social enterprises using novel strategies to serve the target population of interest. If pursued, these case studies will not include programs or employers that participate in the impact, descriptive, or cost studies for the broader evaluation.


The impact studies are intended to produce internally valid estimates of the program’s causal impact, not to promote statistical generalization to other sites or service populations. The descriptive and cost studies are intended to present internally valid descriptions of the service population, implementation, and cost of the programs in the chosen sites, not to promote statistical generalization to other sites or service populations. See Section B.1 of this ICR for further information about the appropriateness of the design and its limitations.


As of May 2022, five programs have been identified for inclusion in the NextGen Project. The activities to identify and assess these programs were approved under the generic clearance for Formative Data Collections for ACF Research (OMB #0970-0356). The programs were assessed to determine if they meet three general criteria: (1) the program addresses the research priorities of this project; (2) the program is well implemented, or could be after some technical assistance; and (3) a rigorous evaluation of the program is feasible using an experimental design, or could be after the program receives some technical assistance. Additionally, included programs have some evidence that they might be effective, so an evaluation of each program builds on existing evidence and is valuable to the field. Programs were also selected to address SSA’s research interests. The programs studied are not national programs, and the study is not designed to be nationally representative, nor will the project team attempt to generalize the evaluation results beyond the programs and target populations under study. The NextGen Project is not actively recruiting additional programs. However, the project could add programs later if circumstances warrant.


As of May 2022, the programs identified for inclusion in the NextGen Project include:


  • Bridges from School to Work (Bridges) is a nationwide, employer-driven program that provides job-readiness instruction, placement, and post-placement support for young adults with disabilities. Bridges partners with schools and school districts to recruit participants and actively partners with employers to find jobs for program participants.

  • Individual Placement and Support for Adults with Justice-Involvement (IPS JI) offers the individual placement and support (IPS) employment model to people who are reentering the community after incarceration or have been recently sentenced by a mental health or drug court. IPS includes rapid job placement, long-term support, and integrated employment services and mental health treatment. The NextGen Project is testing IPS JI offered by five mental health centers in four states.

  • Families Achieving Success Today (FAST) serves a subset of participants in Minnesota’s Temporary Assistance for Needy Families (TANF) program in Ramsey County; participants either have a disability or care for someone with a disability. Participants receive employment services and coordinated care with mental health providers for themselves or their dependents, and they can also receive IPS services if they choose.

  • Western Mass MOMS PartnershipSM (Western Mass MOMS) is a program in Springfield and Holyoke, Massachusetts, designed for parents and other primary caregivers who identify as women and who have low income and depressive symptoms. Participants build community and social support, and receive cognitive behavioral therapy through a cohort-based Stress Management Course. Participants can also meet one-on-one with an employment specialist to develop a career plan, connect to education or training programs, find a job, and address work stresses.

  • Philadelphia Workforce Inclusion Network services (Philly WINs) is a program designed for low-income adults in Philadelphia who are at risk for SSI and seeking to attain economic independence through employment. Services include assessing clients’ interests and capabilities, matching them with employment opportunities, providing accommodations and other services to support participants’ job search efforts, supporting their integration into the workforce, and providing ongoing follow-up services as needed at the job site. In addition, the program helps a network of employers create inclusive workplaces that allow workers with mental or physical disabilities to be productive and feel welcomed.


Phased Approach to Data Collection Approval

As noted in the Executive Summary, the NextGen Project is using a two-phased approach for OMB approval of this ICR.


Phase 1

In Phase 1, the project team is formally recruiting the programs being identified and assessed through the approved generic IC (discussed above). In April 2020, OMB granted approval for the project team to administer the baseline survey (Instrument 1) and to collect identifying and contact information for study participants (Instrument 2). These baseline data collections are uniform across programs selected for evaluation except for the program-based skip logic in the instruments.

Phase 2

In the first ICR submission we indicated that, under Phase 2, we would request approval of the remaining instruments. We anticipated that some of the Phase 2 instruments would require revisions to tailor them to each program selected for the evaluation. The initial ICR submission included drafts of these instruments and burden estimates for initial review and informational purposes (Appendices F and H – O), but did not seek approval at that time. Phase 2 instruments were also included in the Federal Register Notices, allowing for public comment on the initial versions. We indicated that once programs were selected for the evaluation, we would submit updated materials and burden estimates as either a non-substantive change request or a revision with abbreviated public comment time, depending on the level of changes and on guidance provided by OMB’s Office of Information and Regulatory Affairs.


In a non-substantive change request approved in December 2020, we requested official approval to use a subset of the Phase 2 instruments across all selected NextGen programs, with non-substantive changes to all but one of the instruments as well as changes to capture how programs responded to COVID-19 and the resulting recession. As explained in that request, rather than tailoring instruments to each selected program, as initially proposed in the first ICR, we intend to use those same Phase 2 instruments across all programs, with skip patterns and/or instructions to interviewers indicating whether certain items only apply to certain types of respondents or programs. The following Phase 2 instruments were part of that request:


  • Instrument 6. Staff characteristics survey – revised

  • Instrument 7. Program leadership survey – revised

  • Instrument 8. Semi-structured program discussion guide – revised

  • Instrument 10. In-depth participant interview guide – revised

  • Instrument 11. Cost workbook


In a change request approved in March 2021, we requested approval for changes to the previously approved Phase 1 instruments; updates to the previously approved consent form and clearance for a parent/guardian consent form and a youth assent form for use in the evaluation of one selected program that serves youth (Bridges); and approval to use a subset of Phase 2 instruments with programs selected for inclusion in the project with some changes made to those instruments. Specifically, the following instruments were part of that request:


Phase 1:

  • Appendix A. Informed consent form – revised

  • Appendix A.1. Bridges informed consent form

  • Instrument 1. Baseline survey – revised

  • Instrument 2. Identifying and contact information – revised


Phase 2:

  • Instrument 5. Service receipt tracking – revised

  • Instrument 9. Semi-structured employer discussion guide – revised


In this change request, we are seeking clearance for: (1) minor changes to the Phase 1 baseline survey; (2) the remaining Phase 2 instruments (two follow-up surveys), with some revisions; and (3) changes to the tokens of appreciation for the follow-up surveys. Specifically, the following instruments are part of this request:


Phase 1:

  • Instrument 1. Baseline survey – revised


Phase 2:

  • Instrument 3. First follow-up survey – revised

  • Instrument 4. Second follow-up survey – revised



The original ICR submission included burden estimates for each instrument. The burden for completing the data collection for the instruments included in this request falls within those original estimates; the proposed changes do not change the burden estimates.


In addition, as part of this ICR submission, we are submitting revisions to the follow-up survey reminders and notifications (Appendix G. Follow-up survey reminders and notifications – revised). The revisions include changes to the original notifications submitted to OMB in March 2020, as well as the addition of new notifications. We are also submitting to OMB copies of new recruitment and marketing materials that will be used for administering the NextGen Project. These include copies of recruitment materials; an 18th birthday mailer to be used at one selected program that serves youth (Bridges); and an end of study notification. Copies of these materials are included in Appendix G.1. NextGen Project recruitment materials.


Impact studies. The experimental impact studies will provide rigorous evidence on whether each program is effective, for whom, and under what circumstances. Participants eligible for the programs will be asked to consent to participate in the study (Appendix A) and, if they provide consent, will be randomly assigned to one of two groups: a treatment group offered the program or a control group not offered the program. Members of all study groups will continue to have access to other services offered in the community. Individuals who do not consent to participate in the study will not be randomly assigned, will not participate in the data collection efforts, and will not be eligible to receive the intervention (until after the second follow-up survey has been fielded).


The project team will collect information from study participants for the impact studies at three points: (1) at program entry before random assignment occurs (baseline); (2) at about 6 to 12 months after random assignment via the first follow-up survey; and (3) at about 18 to 24 months after random assignment via a second follow-up survey. (Note that the timing of the follow-up surveys might vary depending on when each program’s theory of change suggests impacts might be expected.) Table A.2 presents the data collection activities for the impact studies.

As noted above, this change request seeks approval for revisions to Instrument 3. First follow-up survey and Instrument 4. Second follow-up survey. Changes to these instruments are detailed in Appendix Q.1. Summary of requested changes. These instruments also require program-specific skip patterns, which are noted in the instruments themselves.

Table A.2. Data collection activities for the impact studies

Data Collection Activity and Associated Instrument

Respondent, Content, Purpose of Collection

Mode and Duration

Phase 1 Instruments

Baseline data collection


Instrument 1: Baseline survey – revised


Instrument 2: Identifying and contact information – revised

Respondents: All consenting study participants.


Content: Baseline survey includes information on demographics, receipt of Social Security Administration benefits, employment history, social trust, COVID-19-related challenges, and challenges to maintaining employment. Identifying information includes name, Social Security number, and date of birth. Contact information includes physical and electronic addresses and social media information for participants and up to three friends or relatives.

Instrument 2 also includes the Center for Epidemiologic Studies Depression Scale Revised (CESD-R), which one program included in the evaluation uses as a program eligibility screening tool.


Purpose: Baseline survey data will be used to describe the study sample and check that the characteristics of the study participants are similar on average across groups. The data will also be used to define subgroups, as covariates in regression models, and for weighting for nonresponse. A question-by-question justification for the items included in the baseline survey is presented in Appendix B – revised.


Identifying information is used before random assignment to make sure participants have not already been enrolled in the study. The project team will use this information later to match study participants to their administrative data records to assess outcomes. In addition, the team will collect detailed contact information to locate participants to complete follow-up surveys.


One program in the evaluation used the CESD-R screening tool during program intake prior to the evaluation to assess eligibility for the program. Including it with the collection of identifying and contact information will streamline study intake procedures for this program. The CESD-R only displays for that program; other programs skip these items. The CESD-R items do not add to the evaluation-related information collection burden; the items are administered before study consent and used only to determine program eligibility, in keeping with the program’s prior intake requirements. The study team will maintain CESD-R scores for those who are eligible for the program (that is, those with scores of more than 16 on the scale) and who consent to participate in the study. The revised consent form indicates that the program may share the eligibility screener score with the study team (Appendix A. Informed consent form – revised).


A question-by-question justification for the items included in the identifying and contact information is presented in Appendix C – revised.

Mode: Baseline survey allows for multiple administration options: by program staff, self-administered by study participants via the web, or by NextGen Project staff via telephone.


Identifying and contact information and responses to the CESD-R questions are provided verbally by study participants and entered into RAPTER® by program staff.


Duration: 25 minutes (total to complete the baseline survey and provide identifying and contact information)

Proposed Phase 2 Instruments

Follow-up data collection


Instrument 3: First follow-up survey – revised



Instrument 4: Second follow-up survey – revised

Respondents: The project team will attempt to survey all study participants.


Content: The follow-up surveys collect data on outcomes of interest, including service receipt, employment, earnings, economic independence, well-being, health status, substance use, and involvement in the criminal justice system; perceptions of the usefulness of the program being evaluated (treatment group only); and updated contact information (first follow-up survey only). The exact questions asked could vary by site depending on the site’s target population.


Purpose: The project team will use survey data to estimate program impacts on outcomes of interest; estimate the program impacts on the services the study participants receive; describe treatment group members’ perceptions of the usefulness of the program being evaluated; and describe the study sample. The updated contact information from the first follow-up survey will be used to assist in locating study participants for the second follow-up survey. A question-by-question justification for the items included in the follow-up surveys is in Appendix D.


The proposed revisions to these surveys under this change request are presented in Appendix Q.1. Summary of requested changes (submitted May 2022).

Mode: Participants will self-administer the surveys via the web; alternatively, NextGen Project staff will administer them via telephone.


Duration: 50 minutes per follow-up survey

Descriptive studies. The descriptive study for each program will describe the following: (1) the community, economic, and program context in which the program operates; (2) the characteristics of the program model, including the target population, services offered, role of partners and employers, theory of change, and plans for sustainability and replication; and (3) the implementation and cost drivers of the program, such as leadership, organizational culture and structure, staffing and staff development, and service delivery. The data collection period for the descriptive study will vary by participating program, typically around 4 to 8 months after the study begins enrolling participants. Table A.3 summarizes the data collection activities for the descriptive studies. If respondents consent to being recorded, the interviewer will audio record discussions with program administrators, supervisors, staff; key partner staff, including employers; and participants.


As noted above, in December 2020 the study gained approval, through a non-substantive change request, to use a subset of the descriptive study instruments with all NextGen Project sites, with changes to those instruments. These included the staff characteristics survey (Instrument 6. Staff characteristics survey – revised), program leadership survey (Instrument 7. Program leadership survey – revised), semi-structured program discussion guide (Instrument 8. Semi-structured program discussion guide – revised), and in-depth participant interview guide (Instrument 10. In-depth participant interview guide – revised). An additional subset of descriptive instruments was approved in March 2021 for all NextGen Project programs, with changes to the instruments, as noted below in Table A.3. These include the service receipt tracking instrument (Instrument 5. Service receipt tracking – revised) and the guide for employer discussions (Instrument 9. Semi-structured employer discussion guide – revised).


Table A.3. Data collection activities for the descriptive studies

Data Collection Activity and Associated Instrument

Respondent, Content, Purpose of Collection

Mode and Duration

Phase 2 Instruments

Treatment group service receipt


Instrument 5: Service receipt tracking – revised

Respondents: Program staff


Content: Information about the treatment group members’ participation in the program. In programs that also provide services to control group members, program staff might also record information on the services control group members receive.


Purpose: To describe the service receipt of treatment group members, including type of service, duration, location, and mode.

Mode: Program staff enter information about services received by study participants through the program into RAPTER®. If a program already collects data on service receipt through its own database, the study uses that existing information.


Duration: 5 minutes per entry

Characteristics of program staff and leaders




Instrument 6. Staff characteristics survey – revised




Instrument 7. Program leadership survey – revised

Respondents: Program staff and leaders.


Content: Staff members’ and leaders’ professional backgrounds, skills, experience, credentials, and perceptions of the program. Leaders’ resource investments and decision-making processes. Changes due to COVID-19.


Purpose: To provide insight into how program structure, staffing, and leadership might affect implementation of the program. Compared with the semi-structured interviews, described below, the surveys will enable the collection of information (1) in a more structured format, (2) on topics that staff and leaders might be uncomfortable talking about in a group setting, and (3) from a broader set of staff and leaders than would have the time to participate in a semi-structured interview.

Mode: Program staff and leaders will self-administer the surveys via the web.


Duration: 25 minutes for staff survey; 15 minutes for leadership survey

Discussions with program staff, partners, and employers




Instrument 8. Semi-structured program discussion guide – revised


Instrument 9: Semi-structured employer discussion guide – revised

Respondents: Program administrators, supervisors, staff; key partner staff, including employers


Content: Semi-structured discussions with program administrators, supervisors, direct service staff, community partners, and specialized treatment providers will provide information about the program’s design and implementation and any COVID-19-related challenges. Semi-structured discussions with employers will collect information about their involvement in developing and executing the programs of interest, including any changes to the employers’ relationships with the programs as a result of the COVID-19 pandemic.


Purpose: To describe each program’s design, staffing, service provision, partnerships, and other details necessary to understand the nature of and context for the programs, and for other programs to replicate them. These discussions will also help inform the interpretation of impact findings.


Mode: The interviews will be conducted in person during site visits, either individually or in small groups. Interviews may also be conducted via telephone or video dependent upon any COVID-related restrictions.


Duration: 90 minutes per administrator; 60 minutes per program supervisor, key partner staff, or employer; 45 minutes for direct service staff

In-depth participant interviews




Instrument 10. In-depth participant interview guide – revised

Respondents: Select study participants


Content: Participants’ background and goals, experiences and challenges finding and retaining employment, experiences with the program, including reasons for disengaging from the program, if applicable. Challenges related to COVID-19.


Purpose: To provide the “stories” that will make the findings from the implementation and impact studies more meaningful. They might also inform the understanding of whether the program was implemented as planned and suggest possible refinements.

Mode: The interviews will be conducted in person during site visits. Interviews may also be conducted via telephone or video dependent upon any COVID-related restrictions.


Duration: 120 minutes

Cost studies. The cost study for each program will (1) provide descriptive information about the amount, sources, and types of its funding, and (2) produce an estimate of the average cost of the program per participant. The average cost of the program per participant will be used in the benefit-cost analysis. In that analysis, the benefits that accrue to program participants, such as increased earnings and reduced receipt of public benefits, will be compared with the cost of providing program services. Data collection for the cost studies will ideally take place around the same time as the data collection for the descriptive studies. These activities are summarized in Table A.4.


In December 2020, the study received approval to use the Phase 2 Excel-based cost workbook to collect cost study data across all NextGen Project sites with no changes proposed to the instrument.


Table A.4. Data collection activities for the cost studies

Data Collection Activity and Associated Instrument

Respondent, Content, Purpose of Collection

Mode and Duration

Phase 2 Instruments

Cost data collection




Instrument 11. Cost workbook


Respondents: Program leader (or a designee)


Content: Excel-based cost workbook to record information on the expenditures associated with the program for a recent 12-month period.


Purpose: To estimate the costs of implementing each evaluated program and to estimate the cost-effectiveness of the programs.



Mode: The project team will ask program leaders for their accounting records or financial reports and obtain as much information as possible from these records. If additional information is needed after review of financial records, the project team will ask the programs to complete the workbook in part or in full, depending on the information required.


Duration: 32 hours

Other Data Sources and Uses of Information

The NextGen Project collects administrative records data for outcomes of interest; this information is already being collected and represents no additional burden for participants or program staff. The project team will collect administrative data on quarterly earnings, receipt of unemployment insurance, and new hires on all study participants from the National Directory of New Hires (NDNH), which is maintained by the Office of Child Support Enforcement at ACF. If applicable, the project team will also collect records for study participants on the receipt of TANF from program data and contact information from state or local TANF agencies. For some programs, administrative data will be collected from SSA on annual taxable earnings and receipt of SSI and Social Security Disability Insurance. In addition, as applicable and informative to the programs’ theories of change, data might also be collected on receipt of Supplemental Nutrition Assistance Program (SNAP) benefits and contact information; receipt of benefits and contact information from the Special Supplemental Nutrition Program for Women, Infants, and Children; state records on child support owed or paid; health care outcomes (Medicare enrollment and claims) from the Centers for Medicare & Medicaid Services; involvement with the criminal justice system from court records; educational attainment and completion from school districts; and receipt of housing benefits (such as participation in a housing choice voucher program) from housing authorities.


The project is also using information collected or expected to be collected under the generic clearance for Formative Data Collections for ACF Research (OMB #0970-0356), including information collected to gather feedback from stakeholders, identify sites, and assess activities and characteristics.


A3. Use of Information Technology to Reduce Burden

This project is using multiple applications of information technology to reduce burden. As described below, information technology is being used to collect baseline data, participant identifying and contact information, and information on service receipt. It will also be used to conduct surveys with program staff and leaders, conduct the two follow-up surveys, and collect cost information from the programs. The semi-structured staff discussions and in-depth participant interviews will be audio recorded, if respondents consent to being recorded. Additionally, interviews may be conducted via telephone or video dependent upon any COVID-related restrictions.

RAPTER®. RAPTER® is a secure, web-based system that program staff use to administer consent to participants, collect their identifying and contact information, conduct random assignment, and enter information on the services received or activities participated in by study participants. The use of check boxes, drop-down menus, and response categories minimizes data entry burden.

Baseline, follow-up, staff, and leadership surveys. All surveys have the capability to be hosted on the Internet via a live secure web-link. To reduce burden, the surveys employ (1) secure log-ins and passwords so respondents can save and complete the survey in multiple sessions, (2) drop-down response categories so respondents can quickly select from a list, (3) dynamic questions and automated skip patterns so respondents only see those questions that apply to them (including those based on answers provided previously in the survey), and (4) logical rules for responses so respondents’ answers are restricted to those intended by the question.

Respondents also have the option to complete the baseline survey and first and second follow-up surveys using computer-assisted telephone interviewing (CATI). CATI reduces respondent burden, relative to interviewing via telephone without a computer, by automating skip logic and question adaptations and by eliminating delays caused when interviewers must determine the next question to ask.

Excel-based workbook for collecting cost data. A Microsoft Excel-based data collection tool will be used to collect cost data. To reduce respondent burden, the project team will ask program leaders for their accounting records or financial reports and obtain as much information as possible from these records to complete the workbook. If additional information is needed after review of financial records, the project team will ask the programs to complete the remaining sections of the workbook. Formatting, data checks, and layout built into the template will assist staff in completing it.


A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency

Information that is already available from alternative data sources will not be collected again for this project. For example, if a program in the study has an existing management information system that collects information needed for this project that is exportable and of sufficient quality, we will accept data from its existing system. In these cases, the project team requests that the program only enter into RAPTER® data that the program is not already collecting.


Although information on employment will be collected from administrative records and via the survey, this information is not duplicative because the two sources differ in accuracy and coverage of jobs. NDNH administrative records will provide information on quarterly earnings from jobs covered by unemployment insurance as well as new hires. The baseline survey and follow-up surveys will ask for information about all jobs held, including those not covered by unemployment insurance. The follow-up surveys will also collect information about the characteristics of the jobs (such as the wage rate, hours worked, and benefits offered) that are not included in the NDNH data.


The follow-up surveys will collect information on whether participants received assistance from public assistance programs such as TANF, SNAP, unemployment insurance, and other assistance programs. However, these surveys will not ask for details about the receipt of these benefits, which we will collect via administrative records. It is important to ask about receipt of benefits on the survey because administrative records will not be available for those respondents who do not provide their Social Security number.


As noted in Section A2, the NextGen Project is actively coordinating with OPRE’s BEES study. OPRE is intentionally and strategically coordinating these projects to prevent duplication of effort; fully capitalize on the opportunity the projects afford for large-scale, rigorous evaluation; advance the knowledge base regarding effective employment strategies for low-income, vulnerable populations; and meet SSA’s priorities across both projects. The projects intentionally included some common questions within instruments. Areas of measurement coordination with the existing BEES data collection instruments are described in the question-by-question justifications for the baseline data collection and follow-up surveys (Appendices B, C, and D, revised). The projects differ in that BEES is especially interested in evaluating programs for individuals struggling with opioid dependency, abuse of other substances, and/or mental health issues, while the NextGen Project is especially focused on evaluating interventions that are market-oriented and/or employer-driven.


A5. Impact on Small Businesses

Although we have not yet recruited all specific programs to be evaluated, small organizations, such as businesses or nonprofit organizations, might be involved in implementing a program to be evaluated. If small organizations are involved, we will minimize the burden for respondents by collecting data at times convenient for respondents and requiring minimal record keeping or written responses.


A6. Consequences of Less Frequent Collection

The project team will collect information only once for the baseline survey and identifying participant information, staff characteristics survey, program leadership survey, semi-structured staff discussions, semi-structured employer discussions, in-depth participant interviews, and the Excel-based workbook for collecting cost data.


The project team will administer two similar follow-up surveys. Collecting data at two future points of time will allow an examination of whether the impacts of the program changed over time and whether changes in intermediary outcomes (such as health or skills) were associated with changes in longer-term outcomes (namely employment and economic independence outcomes). This also reduces the chance of recall error from respondents when collecting information on their receipt of services and jobs held over a period of time, relative to collecting it only once at the end of the follow-up period. Similarly, updated contact information will be collected from respondents upon administration of the first follow-up survey to assist in locating them for the second follow-up survey.


Program staff use the RAPTER® system or their existing management information system to record service receipt for each participant each time he or she receives a service. Staff are asked to enter the information into RAPTER® immediately after the service is provided. Doing so less frequently would contribute to recall error and affect the quality of data collected.


A7. Special Circumstances: now subsumed under A2 (above) and A10 (below)


A8. Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on January 8, 2020, Volume 85, Number 5, pages 906-907, and provided a 60-day period for public comment. A copy of this notice is attached as Appendix P. During the notice and comment period, no substantive comments were received.

ACF published an additional notice in the Federal Register announcing the agency’s intention to request an OMB review of proposed changes to the information collection activity in January 2021. This notice was published on January 6, 2021, Volume 86, Number 3, pages 541-543, and provided a 30-day period for public comment. A copy of this notice is included in Appendix P.1. During the notice and comment period, no substantive comments were received.

ACF published an additional notice in the Federal Register announcing the agency’s intention to request an OMB review of these proposed changes to the information collection activity. This notice was published on March 28, 2022, Volume 87, Number 59, pages 17298-17299, and provided a 30-day period for public comment. A copy of this notice is included in Appendix P.2. During the notice and comment period, no substantive comments were received.

Consultation with Experts Outside of the Study

Experts in their respective fields from OPRE and Mathematica were consulted in developing the design, data collection plan, and instruments for which clearance is requested. Select agency staff within SSA and HHS were also consulted. We also consulted with the BEES project staff to coordinate measurement of key outcomes across projects.


A9. Tokens of Appreciation

The proposed structure of tokens of appreciation for this study is designed to support the retention of respondents over the course of the longitudinal data collection and enhance the quality of information derived from in-depth interviews. OMB approved the initial proposed structure of tokens of appreciation for this study in April 2020. Through this change request, we propose updates to the tokens of appreciation related to the longitudinal surveys. See the Longitudinal Surveys subsection below for additional information.

Study Enrollment

After finishing the study enrollment process, participants will receive a study packet designed to establish their engagement with the study. This packet will include a copy of the consent form, a one-page study flyer that describes upcoming data collection activities (see Appendix G), and a small study-specific item (valued between $1 and $3), such as a magnet, keychain, or screen cleaner, that contains the study logo and contact information for our call center. The purpose of these materials is to establish a positive association with the study and support familiarity when respondents are contacted to participate in an interview. This revised information collection request does not seek changes to the study enrollment packet.

Longitudinal Surveys

Obtaining high response rates for the follow-up surveys is important. The risk of biased impact estimates increases with lower overall survey response rates or larger differences in survey response rates between key research groups (What Works Clearinghouse 2017).


To increase survey participation following successful contact, the initial information collection clearance request proposed tokens of appreciation for survey completion—a $40 gift card for respondents to the first follow-up survey and a $50 gift card for respondents to the second follow-up survey. The dollar amounts originally proposed for the NextGen Project follow-up surveys were based on observational information from recent randomized controlled trials with similar service populations. Table A.5 presents information about the type of data collection, incentive offered, survey duration, timeframe, and response rates obtained in recent studies. Three of these studies used tokens of appreciation of between $40 and $50, as was originally proposed for the NextGen Project.


We are currently seeking two changes to the tokens of appreciation offered for the first follow-up survey: (1) we are requesting approval to increase the token of appreciation for survey completion from $40 to $50; and (2) we are requesting approval to test a $5 prepaid token of appreciation that would be offered before the sample member has responded to the survey. We would evaluate the effectiveness of the prepaid token using an experimental design: one research group would be offered a $5 prepaid token and a $50 postpaid token, and the other research group would be offered no prepaid token of appreciation but would be offered a $55 postpaid token of appreciation. If the experiment shows the prepaid token is effective for the first follow-up survey, we would offer all sample members the $5 prepaid token and a $50 postpaid token of appreciation; if the experiment shows that it is not effective, we would offer only a $50 postpaid token of appreciation and no prepaid token. Justification for each of these requests is provided below.


Justification for increasing the first follow-up survey token to $50. Despite three of the four studies cited in Table A.5 achieving adequate response rates, recent experience from the fourth, the Evaluation of Employment Coaching for TANF and Related Populations (OMB #0970-0506), has directly informed the NextGen Project’s proposed approach. This evaluation includes populations very similar to those participating in NextGen Project evaluations. In March 2018, OMB initially approved a two-tiered structure for the Employment Coaching study with an “early bird” amount that provided survey respondents $35 if they completed the survey within four weeks of the initial notification and $25 if they completed it after four weeks. However, overall response rates were below the target, and concerns arose about differential response (that is, differences in response rates between the treatment and control groups), which could bias impact estimates. Therefore, in March 2020, the study team submitted and OMB approved a non-substantive change request. This request preserved the two-tiered structure among study participants within two study sites; participants from the other four sites were offered $50 for completing each survey, irrespective of whether they completed the survey within the four-week “early bird” period. After the change, the web and phone response rates increased and the treatment-control differential decreased in the four sites in which the $50 amount was offered, but not in the two sites in which it was not. Based on this experience, OMB approved an increase in survey tokens to $50 for all sites in October 2020. If OMB approves the NextGen Project’s requested change to $50, the study team will also update the amount included on the consent form (Appendix A). The revised study flyer and survey notifications (Appendix G) included in this request reflect the $50 amount. Some respondents will receive $55 postpaid tokens during the incentive experiment phase if they are not part of the prepaid token group, as discussed below.


Justification for introducing a $5 prepaid token of appreciation. Prepaid tokens of appreciation may be more effective because they lend credibility to the postpaid token and to the study more generally. In addition, a prepaid token likely creates a sense of reciprocity. Behavioral science literature suggests that the norm of reciprocity requires that we repay in kind what another has done for us; in this context, giving a potential survey respondent five dollars should increase their desire to reciprocate by completing the survey (Falk 2007; Gneezy and List 2006; Cialdini 2007).


Studies show prepaid tokens of appreciation to be more effective than postpaid ones, and the combination of prepaid and postpaid tokens to be more effective than prepaid tokens alone (Singer and Ye 2013; Mercer et al. 2015). Singer and Ye (2013) conducted a systematic review of studies of tokens of appreciation published since 2002, including some unpublished studies. Their review found support for the effectiveness of prepaid tokens of appreciation in locating and contacting respondents, especially under a longitudinal design (Beydoun et al. 2006; Mann et al. 2008). In a meta-analysis of 39 experimental studies, Singer et al. (1999) showed that a prepaid token can yield a higher response rate than a postpaid scheme for interviewer-administered surveys. Cantor et al. (2008) found that prepaid tokens of appreciation between $1 and $5 increase response rates by 2 to 12 percent compared to no prepaid or postpaid tokens of appreciation. Mercer et al. (2015) conducted a meta-analysis of published and unpublished experiments since 1987 and found that prepaid tokens of appreciation are more effective than postpaid ones. Jäckle and Lynn (2008) found that prepaid tokens of appreciation significantly reduced attrition in follow-up waves under a longitudinal design. Hock et al. (2015) found that a $5 prepaid token of appreciation for a hard-to-reach population yielded a higher response rate for the online mode of a phone- and web-administered survey.


In addition to the survey literature, recent results from ACF studies show support for prepaid tokens of appreciation. The Assessing the Cost and Implementation of High Quality Early Care and Education project (OMB #0970-0499) conducted an experiment to test a prepaid token of appreciation in a survey of child care providers. One group of respondents was given $10 in advance of the survey and $10 upon completion; a second group had no prepaid token of appreciation but was given $20 upon completing the survey. Even though both groups were paid the same total amount for completing the survey, the preliminary results showed a significantly higher response rate among the respondents who received the prepaid $10 than among those who did not (81 percent compared to 61 percent). As a result, the study is implementing prepaid tokens of appreciation for future data collections.


Proposed Experiment

Although the survey and behavioral science literature shows strong support for the effectiveness of prepaid tokens of appreciation at increasing overall response rates and reducing differential response rates, less is known about their usefulness specifically in surveys of low-income populations participating in randomized controlled trials of employment programs. To address this knowledge gap, we propose conducting an experiment to test the effectiveness of prepaid tokens of appreciation in the NextGen Project. Because the enrollment periods for the NextGen Project evaluations are more than 12 months long, there will be time to implement an experiment on a small scale, analyze the results, and implement the most effective structure for future sample cohorts for the first follow-up survey.


The experiment will be set up so that the study’s treatment and control groups are each randomized into two incentive groups, and each group will receive the same total dollar amount. The first group will receive a $5 bill attached to the advance letter and a $50 gift card upon completion of the first follow-up survey. The second group will receive only a $55 gift card upon survey completion. We propose that the second group receive $55 so that both groups receive the same total amount and the only difference is the timing of when the amount is paid. The main questions we intend to answer with this design are:


  • Does a prepaid token of appreciation, combined with a postpaid one at survey completion, increase response rates compared to only the postpaid one?

  • Does a prepaid token of appreciation, combined with a postpaid one at survey completion, decrease response rate differentials across study groups, relative to only the postpaid one?


Upon completion of the experiment and analysis of the findings, we will provide a memorandum to OMB with the results. It will include a request to implement either prepaid plus postpaid or postpaid only for the remaining sample for the first follow-up survey.


Justification for revising the tokens for the second follow-up survey. We propose to introduce a $5 prepaid token of appreciation for the second follow-up survey if the experiment shows it is effective for the first follow-up survey. We will provide a memorandum to OMB with the experiment results from the first follow-up survey, as described above, and our plan for any changes to the second follow-up survey tokens of appreciation as a result.

Table A.5. Tokens of appreciation and response rates obtained in similar follow-up surveys

Study: Evaluation of Employment Coaching for TANF and Related Populations, OMB #0970-0506

Instrument: 6- to 12-month follow-up

Duration (minutes): 60

Data collection timeframe: 2018-present

Amount of token of appreciation: Originally, $35 if completed within the first four weeks and $25 after four weeks; revised to $50

Response rate: 41-81 percent depending on site, for cases that have been in the field for six months or longer; 48 to 82 percent treatment; 35 to 81 percent control


Study: Enhanced Transitional Jobs Demonstration, OMB #0970-0413

Instrument: 12-month follow-up

Duration (minutes): 45

Data collection timeframe: 2012-14

Amount of token of appreciation: $40

Response rate: 67 to 82 percent depending on site; 69 to 82 percent treatment; 65 to 81 percent control


Study: Self-Employment Training (SET) Demonstration, full sample, OMB #1205-0505

Instrument: 18-month follow-up

Duration (minutes): 20

Data collection timeframe: 2015-17

Amount of token of appreciation: $50 if completed within the first four weeks; $25 after four weeks

Response rate: 80 percent overall; 83 percent treatment; 78 percent control


Study: YouthBuild, full sample, OMB #1205-0503

Instrument: 12-month follow-up

Duration (minutes): 60

Data collection timeframe: 2012-14

Amount of token of appreciation: $40 if completed within the first four weeks; $25 after four weeks

Response rate: 81 percent overall; 82 percent treatment; 79 percent control

Note: Treatment and control groups in this table refer to the overall evaluation (that is, the original conditions to which sample members were assigned upon enrollment) and not any incentive experiment. The SET sample includes the full survey sample, including the time before and after the conclusion of the incentive experiments described in the text. The TANF Coaching response rates include only those cases that have been in the field for six or more months.

In-depth Interviews
Respondents to the in-depth participant interviews, which are estimated to take 120 minutes on average, will receive a $60 gift card (as approved by OMB in April 2020) to offset the costs of participating in the study. Interview data will not be representative in a statistical sense; they will not be used to make statements about the prevalence of experiences across the entire service populations. However, it is important to recruit participants with a range of background characteristics to capture a variety of possible experiences with these programs. Without offsetting the direct costs respondents incur to participate in the interviews, such as arranging child care, transportation, or time off from paid work, the research team risks that only those individuals able to overcome the financial barriers will agree to an interview, which would reduce the overall quality of the qualitative data collection. This revised information collection request does not seek changes to the gift card amount for the in-depth interviews.


A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing


Personally Identifiable Information

The information provided by or about participants during the baseline data collection, follow-up surveys, service receipt tracking, and in-depth participant interviews will contain participant-level personally identifiable information (PII). This includes names, addresses, email addresses, social media accounts, phone numbers, birth dates, and Social Security numbers. This information is needed to ensure that: the prospective study participant has not already enrolled in the study; the project team can locate study participants to complete the follow-up surveys; and the project team can link participants to their corresponding administrative data. See Section A11 for further details. In addition, the project team will collect the names and email addresses of program staff in order to administer the staff characteristics and program leadership surveys.


Mathematica will share study participants’ information with SSA, which will conduct additional research on how the programs affect earnings and receipt of disability benefits; SSA will conduct this research through 2028. Mathematica will share information such as name, sex, date of birth, and Social Security number so researchers at SSA can locate participants’ records. SSA will use this information only for research. The information will not be used to make decisions about benefits participants receive from SSA, now or in the future. The sharing of information with SSA for these purposes and for the specified timeframe is described to participants in the informed consent form (Appendix A).


Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.


Assurances of Privacy

Mathematica will protect respondents’ privacy to the extent permitted by law and will comply with all Federal and departmental regulations for private information. Mathematica has developed a data safety and monitoring plan that assesses all protections of respondents’ PII. Mathematica will ensure that its employees, subcontractors (at all tiers), and employees of each subcontractor who perform work under this contract are trained on data privacy issues and comply with the above requirements. All study staff with access to PII—including program staff who are entering information about study participants and their service receipt into RAPTER®—receive study-specific training on (1) limitations on disclosure; (2) safeguarding the physical work environment; and (3) storing, transmitting, and destroying data securely. These procedures are documented in training manuals for study staff, and refresher trainings will occur annually.


Respondents are informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. As specified in the contract with ACF for the NextGen Project, Mathematica (the Contractor) will comply with all Federal and departmental regulations for private information.


The project team is in the process of seeking Institutional Review Board (IRB) approval from the Health Media Lab IRB; to date, IRB approval has been granted for four of the five selected programs. An IRB application will soon be submitted for the fifth identified program. The project team also received a Certificate of Confidentiality (CoC) from the National Institutes of Health. The CoC helps assure participants that their information will be kept private to the fullest extent permitted by law.


Data Security and Monitoring

The project team will use Federal Information Processing Standards (FIPS)-compliant encryption (Security Requirements for Cryptographic Modules, as amended) to protect all instances of sensitive information during storage and transmission. The team will securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with FIPS. The team will incorporate this standard into its property management/control system and establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process sensitive information. Any data stored electronically, including audio recordings of discussions with program administrators, supervisors, staff, key partner staff, and participants, will be secured in accordance with the most current National Institute of Standards and Technology requirements and other applicable Federal and departmental regulations. In addition, the project team will submit a plan for minimizing, to the extent possible, the inclusion of PII and other sensitive information on paper records, and for protecting any paper records, field notes, or other documents that contain PII or other sensitive information, including secure storage and limits on access.
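
As a purely illustrative sketch of what FIPS-aligned encryption of study data can look like in practice, the example below applies AES-256-GCM (a FIPS-approved algorithm) using the third-party Python cryptography package. It is not the project's actual tooling; key generation, storage, and transfer would follow the key-management procedures described above.

# Illustrative sketch only: encrypting data with AES-256-GCM, a FIPS-approved algorithm,
# via the third-party "cryptography" package. Not the project's actual tooling.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # in practice, generated and stored per FIPS key-management guidance
nonce = os.urandom(12)                     # unique nonce for each encryption operation
aesgcm = AESGCM(key)

plaintext = b"participant-level records (illustrative placeholder)"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)   # encrypt before storage or transmission
recovered = aesgcm.decrypt(nonce, ciphertext, None)   # decrypt only with access to the managed key
assert recovered == plaintext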


Information shared with researchers at SSA (see discussion above) and exchanged between programs and Mathematica will be sent via a secure file transfer protocol.


At the end of the study, de-identified project data will be archived to make them available to other researchers. Mathematica will work with ACF to develop a comprehensive data archive plan and to produce an archive data file or files. Any restricted- or public-use files will be reviewed for appropriateness of public or restricted release, including appropriate masking techniques for each level of release. A non-disclosure review will also be conducted to ensure that the data cannot be used to re-identify study participants.


A11. Sensitive Information

To evaluate the effectiveness of employment programs for vulnerable populations, it is necessary to ask some sensitive questions. Before starting the baseline and follow-up surveys and the in-depth interviews, all respondents will be informed that their identities will be kept private to the extent permitted by law, that results will only be reported in the aggregate, that their responses will not affect any services or benefits they or their family members receive, and that they do not have to answer any questions that make them uncomfortable.

The sensitive topics in the approved and proposed data collection instruments relevant to this ICR are listed below. These topics were all described in previously submitted and approved justification packages.

  • Respondents’ Social Security numbers. Respondents’ Social Security numbers are necessary to collect administrative data used to estimate impacts on earnings, employment, and public benefit receipt. The consent form informs study participants that the project team might collect administrative data about them. Social Security numbers will also be used to collect information through online databases containing information on the location of study participants for the follow-up surveys. Along with names, birthdates, and other data from baseline surveys, Social Security numbers will be used to verify respondents’ identities for follow-up surveys. The project team does not want to rely on name and address matching (or similar techniques) for collecting administrative data because those techniques fail to match administrative data for a high proportion of participants, introduce unacceptably high uncertainty about match success, or both. This would affect the study’s ability to estimate impacts and draw conclusions for findings that rely on administrative data.

  • Wage rates and earnings. It is necessary to ask about earnings because increasing participants’ earnings is a key goal of these programs. The follow-up surveys ask about each job worked since random assignment, the wage rate, and the number of hours worked per week. This information will be collected on the first and second follow-up surveys.

  • Challenges to employment. We will ask about some challenges to employment caused by COVID-19. This will provide some information about the labor market context of the participants at the time they enroll in the study.

  • Economic hardships. The follow-up surveys ask about economic hardships, such as food insecurity. These outcomes are used to assess respondents’ degree of economic independence and might be affected by the program. Economic hardships might also be discussed as part of the in-depth participant interviews.

  • Disabilities, mental and physical health, and substance misuse. The baseline and follow-up surveys will collect information about disabilities, mental or other health problems, and substance misuse; the severity of those issues; and how much they affect the ability to work. These issues might also be discussed in the in-depth participant interviews. All of these are important potential challenges to finding or maintaining employment and could play a role in the effectiveness of the program. The Center for Epidemiologic Studies Depression Scale Revised (CESD-R) will also be collected for one program during eligibility screening and saved for those determined to be eligible for the program and who consent to participate in the study. This program uses the CESD-R for eligibility screening; it was added to RAPTER® to facilitate the intake process for this program and will be included in the first and second follow-up surveys as a way to measure program impact on mental health.

  • Involvement in the criminal justice system. The baseline survey asks about prior involvement in the criminal justice system, including the number of convictions and felony convictions, details about parole or probation, type of crime committed, and time spent in the most recent incarceration, because such involvement often makes it harder to find employment. The two follow-up surveys will also ask about arrests, convictions, and incarcerations that occurred after random assignment because these outcomes might be affected by the program. Criminal history might also be discussed during the in-depth participant interviews.

  • COVID-19-related challenges. The baseline survey asks whether respondents are fully vaccinated against the coronavirus because vaccination is expected to be associated with employment outcomes. It also asks whether COVID-19 posed specific challenges to employment for study participants or whether the pandemic affected previous employment. The follow-up surveys ask some questions about the effects of the COVID-19 pandemic on getting or keeping employment and whether respondents are vaccinated against the coronavirus.



A12. Burden

Explanation of Burden Estimates

Table A.6 reflects the burden and cost for information collection proposed in Phase 1 of this ICR. There are no changes proposed for Phase 1 burden as part of this change request. However, the cost information in Table A.6 of this request was updated to use more recent wage estimates. Table A.7 reflects the estimated reporting burden and cost for the Phase 2 data collection instruments that this change request seeks approval to administer (previously included in Appendix E in the ICR approved by OMB in April 2020 under OMB #0970-0545; revised in December 2020 with the non-substantive change request; and revised again in March 2021 with the substantive change request). No changes are proposed for Phase 2 burden as part of this change request. The cost information in Table A.7 of this request does reflect updated wage estimates. The burden for completing the data collection for the subset of Phase 2 instruments included in this request falls within the original burden estimates; the proposed changes do not change the estimates.

Details of the estimates for data collections in Phases 1 and 2 of this request are as follows; a brief sketch of the annualization arithmetic appears after the list:

  • Baseline data collection. Baseline data collection involves both study participants and program staff. The burden estimates assume that program staff will assist study participants in baseline data collection, which includes collecting the baseline survey (Instrument 1) and using RAPTER® to collect participant identifying and contact information (Instrument 2).

  • We expect about 10,000 study participants (1,000 in each of 10 programs) will complete baseline data collection. We expect each baseline data collection (inclusive of the baseline survey and RAPTER® identifying and contact information) to last 0.42 hours, for a total of 4,200 burden hours. Annualized over three years, this is 1,400 hours per year for study participants.

  • We assume that 200 program staff across all 10 programs (approximately 20 per program) will perform the baseline data collection. Each staff member will administer the baseline data collection (inclusive of the baseline survey and RAPTER® identifying and contact information) 50 times, and each session is expected to last 0.42 hours, for a total of 4,200 burden hours. Annualized over three years, this is 1,400 hours per year.

  • Service receipt tracking. We anticipate 200 program staff (20 in each of the programs) will enter data on program service receipt into RAPTER®. We expect 250 entries per staff member and expect that each entry will take 5 minutes (0.08 hours), or a total of 1,340 annual burden hours.

  • Staff characteristics survey. We expect to survey 200 program staff who directly interact with participants (20 per program). The survey is expected to take 25 minutes (0.42 hours) to complete, or a total of 28 annual burden hours.

  • Program leadership survey. We expect to survey 50 program leaders (five per program). The survey is expected to take 15 minutes (0.25 hours) to complete, or a total of four annual burden hours.

  • Semi-structured program discussion guide—program leaders. We expect to interview 40 program leaders across all ten programs (approximately four per program). We expect each staff interview to last 1.5 hours on average, or a total of 20 annual burden hours.

  • Semi-structured program discussion guide—program supervisors and partners. We expect to interview 80 program supervisors or partners across all ten programs (approximately eight per program). We expect each interview to last one hour on average, or a total of 27 annual burden hours.

  • Semi-structured program discussion guide—program staff and providers. We expect to interview 80 direct service staff across all ten programs (approximately eight per program). We expect each staff interview to last 0.75 hours on average, or a total of 20 annual burden hours.

  • Semi-structured employer discussion guide. We expect to interview 50 employers’ staff across all ten programs (approximately 5 per program). We expect each interview to last one hour on average, or a total of 17 annual burden hours.

  • In-depth participant interview guide. We expect to interview 200 study participants (20 in each of the ten programs). These interviews are expected to last two hours on average, or a total of 134 annual burden hours.

  • Cost workbook. We expect that 40 program staff (four in each of the ten programs) will enter data on expenditures and costs into Excel. We expect one entry per staff member and expect that each entry will take 32 hours, or a total of 416 annual burden hours.

  • First and second follow-up surveys. We expect to survey 10,000 study participants (1,000 participants per intervention) at two follow-up points. We anticipate an 80 percent response rate or 8,000 respondents to each survey, or 2,667 annualized over three years. We expect each survey to last 50 minutes (0.83 hours), or a total of 2,214 annual burden hours.
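
As noted above, a brief sketch of the annualization arithmetic follows. It computes annual burden hours as the number of respondents, times responses per respondent, times hours per response, divided by the three-year request period; the figures are taken from the bullets above, and small differences from the table values reflect rounding.

# Sketch of the annualized burden arithmetic used in the estimates above.
def annual_burden_hours(respondents, responses_each, hours_per_response, request_years=3):
    """Annual burden = total responses x hours per response, spread over the request period."""
    return respondents * responses_each * hours_per_response / request_years

# Figures drawn from the bullets above; rounding explains small differences from the tables.
print(annual_burden_hours(10_000, 1, 0.42))  # baseline data collection, participants -> 1,400
print(annual_burden_hours(200, 50, 0.42))    # baseline data collection, staff        -> 1,400
print(annual_burden_hours(8_000, 1, 0.83))   # each follow-up survey, participants    -> about 2,213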


Estimated Annualized Cost to Respondents

Phase 1:

The total annual cost for data collection instruments in Phase 1 of this request is $35,882. The total estimated cost figures are computed from the total annual burden hours and an average hourly wage for staff and participants. The wage rate for program staff administering the survey is based on May 2020 employment and wage estimates from the Bureau of Labor Statistics Occupational Employment Statistics survey (http://www.bls.gov/oes/current/oes_stru.htm). The rate used for direct service staff, $18.38, is the mean wage for social and human services assistants under SOC code 21-1093. The average hourly wage of study participants is estimated to be $7.25, the federal minimum wage.
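
For illustration, the annual respondent cost figures in Table A.6 follow directly from multiplying annual burden hours by the average hourly wage, as in the short sketch below.

# Sketch of the respondent cost arithmetic behind Table A.6 (annual hours x hourly wage).
def annual_respondent_cost(annual_hours, hourly_wage):
    return annual_hours * hourly_wage

print(annual_respondent_cost(1_400, 7.25))   # participants -> 10,150.0
print(annual_respondent_cost(1_400, 18.38))  # staff        -> 25,732.0
total = annual_respondent_cost(1_400, 7.25) + annual_respondent_cost(1_400, 18.38)
print(f"Total annual respondent cost: {total:,.0f}")  # 35,882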


Table A.6. Burden and cost for information collection proposed in Phase 1

Instrument | No. of respondents (total over request period) | No. of responses per respondent (total over request period) | Avg. burden per response (in hours) | Total burden (in hours) | Annual burden (in hours) | Average hourly wage rate | Total annual respondent cost
Baseline survey & identifying and contact information – participants | 10,000 | 1 | 0.42 | 4,200 | 1,400 | $7.25 | $10,150
Baseline survey & identifying and contact information – staff | 200 | 50 | 0.42 | 4,200 | 1,400 | $18.38 | $25,732
Estimated annual burden total | | | | | 2,800 | | $35,882


Phase 2:

The total annual cost for data collection instruments in Phase 2 for which we are currently requesting approval is estimated to be $69,628. The total estimated cost figures are computed from the total annual burden hours and an average hourly wage for staff, participants, and employers. Hourly wage estimates were derived from the U.S. Bureau of Labor Statistics May 2020 Occupational Employment Statistics survey (http://www.bls.gov/oes/current/oes_stru.htm).

  • We estimate the average hourly wage for program leaders to be $52.59, the average hourly wage of general and operations managers employed in local government (SOC code 11-1021).

  • The rate used for program and partner supervisors, $36.13, is the mean wage for social and community services managers (SOC code 11-9151).

  • The rate used for direct service staff, $18.38, is the mean wage for social and human services assistants (SOC code 21-1093).

  • The average hourly wage of study participants is estimated to be $7.25, the federal minimum wage.

  • The average hourly wage for employers is estimated as $60.45, the average wage of General and Operations Managers across industries (SOC 11-1021).


Table A.7. Burden and cost for information collection proposed in Phase 2

Instrument | No. of respondents (total over request period) | Annual number of respondents | No. of responses per respondent (total over request period) | Avg. burden per response (in hours) | Annual burden (in hours) | Average hourly wage rate | Total annual respondent cost
Service receipt tracking – staff | 200 | 67 | 250 | 0.08 | 1,340 | $18.38 | $24,629
Staff characteristics survey – staff | 200 | 67 | 1 | 0.42 | 28 | $18.38 | $515
Program leadership survey – program leaders | 50 | 17 | 1 | 0.25 | 4 | $52.59 | $210
Semi-structured program discussion guide – program leaders | 40 | 13 | 1 | 1.50 | 20 | $52.59 | $1,052
Semi-structured program discussion guide – program supervisors and partners | 80 | 27 | 1 | 1.00 | 27 | $36.13 | $976
Semi-structured program discussion guide – program staff and providers | 80 | 27 | 1 | 1.00 | 27 | $18.38 | $496
Semi-structured employer discussion guide – employers | 50 | 17 | 1 | 1.00 | 17 | $60.45 | $1,028
In-depth participant interviews – participants | 200 | 67 | 1 | 2.00 | 134 | $7.25 | $972
Cost workbook – staff | 40 | 13 | 1 | 32.00 | 416 | $18.38 | $7,646
First follow-up survey – participants | 8,000 | 2,667 | 1 | 0.83 | 2,214 | $7.25 | $16,052
Second follow-up survey – participants | 8,000 | 2,667 | 1 | 0.83 | 2,214 | $7.25 | $16,052
Estimated annual burden total | | | | | 6,441 | | $69,628


The total estimated burden for previously approved Phase 1 instruments is 2,800 hours. The total estimated burden for previously approved Phase 2 instruments (2,013 hours) and the Phase 2 instruments for which we currently request approval (4,428 hours) is 6,441 hours. The total estimated burden for all instruments in the study is 9,241 hours.


A13. Costs

There are no additional costs to respondents.

A14. Estimated Annualized Costs to the Federal Government

Phase 1:

The total cost to the Federal government for the data collection activities under the first phase of this ICR will be about $3,109,600. Annualized costs to the Federal government will be about $1,036,533 for the proposed data collection. These estimates of costs are derived from Mathematica’s budgeted estimates and include labor rates, direct costs, and tokens of appreciation for respondents.

PHASE 1

Cost category | Estimated costs
Field work | $1,889,800
Analysis | $527,400
Publications/dissemination | $692,400
Total costs over the request period | $3,109,600
Annual costs | $1,036,533


Phase 2:

The total cost to the Federal government for all Phase 2 data collection activities will be about $12,438,400. Annualized costs to the Federal government will be about $4,146,133 for the proposed data collection. These estimates of costs are derived from Mathematica’s budgeted estimates and include labor rates, direct costs, and tokens of appreciation for respondents.


PHASE 2

Cost category | Estimated costs
Field work | $7,559,200
Analysis | $2,109,600
Publications/dissemination | $2,769,600
Total costs over the request period | $12,438,400
Annual costs | $4,146,133



A15. Reasons for Changes in Burden

The requested changes submitted as part of this request do not change the burden estimates for either Phase 1 or Phase 2.



A16. Timeline

The beginning of participant intake and baseline data collection is staggered by program. Because of current and expected COVID-19-related delays in the study schedule, the first programs began baseline data collection in the summer and winter of 2021. Other programs will begin intake in early 2022. For each program, we expect intake and baseline data collection to continue for about 18 to 30 months. Data collection for the descriptive and cost studies began in 2021 or will begin in 2022, depending on the program. The first follow-up survey will begin in 2022, and the second follow-up survey will begin in 2023.


Findings from the project will be published throughout the study in technical reports and briefs. We anticipate that reporting on the descriptive and cost studies will begin in 2023 and continue through 2024. Reporting on the intermediate impact findings will likely begin in 2024 and continue through 2025. Reporting on final impact findings will likely begin in 2025 and continue through 2027.


We anticipate that data archives (restricted or public use) would become available starting in 2026 and be hosted on a data archive platform such as the Inter-university Consortium for Political and Social Research (ICPSR).



A17. Exceptions

No exceptions are necessary for this information collection.


Attachments:


Instruments

Instrument 1. Baseline survey – revised (submitted May 2022)

Instrument 2. Identifying and contact information – revised

Instrument 3. First follow-up survey – revised

Instrument 4. Second follow-up survey – revised

Instrument 5. Service receipt tracking – revised

Instrument 6. Staff characteristics survey – revised

Instrument 7. Program leadership survey – revised

Instrument 8. Semi-structured program discussion guide – revised

Instrument 9. Semi-structured employer discussion guide – revised

Instrument 10. In-depth participant interview guide – revised

Instrument 11. Cost workbook


Appendices

Appendix A. Informed consent form – revised

Appendix A.1. Bridges consent forms

Appendix B. Question-by-question justification for baseline survey – revised (submitted May 2022)

Appendix C. Question-by-question justification for identifying and contact information – revised

Appendix D. Question-by-question justification for follow-up surveys – revised

Appendix G. Follow-up survey reminders and notifications – revised

Appendix G.1. NextGen Project recruitment materials

Appendix P. Federal Register Notice

Appendix P.1. Federal Register Notice – 30-day request, published January 2021

Appendix P.2. Federal Register Notice – 30-day request, published March 2022

Appendix Q. Summary of requested changes (submitted February 2021)

Appendix Q.1. Summary of requested changes – revised (submitted May 2022)



Supporting Statement A: References

Beydoun, Hind, Audrey F. Saftlas, Kari Harland, and Elizabeth Triche. 2006. Combining conditional and unconditional recruitment incentives could facilitate telephone tracing in surveys of postpartum women. Journal of Clinical Epidemiology 59 (7): 732–38.

Cantor, David, Barbara O’Hare, and Kathleen O’Connor. 2008. The use of monetary incentives to reduce non-response in random digit dial telephone surveys. In Advances in telephone survey methodology, eds. James M. Lepkowski, Clyde Tucker, J. Michael Brick, Edith de Leeuw, Lilli Japec, Paul J. Lavrakas, Michael W. Link, and Roberta L. Sangster, 471–98. New York, NY: Wiley.

Cialdini, Robert B. 2007. Influence: The psychology of persuasion. Rev. ed. New York, NY: Collins.

Falk, Armin. 2007. Gift exchange in the field. Econometrica 75 (5): 1501–11. http://www.jstor.org/stable/4502037.

Gneezy, Uri, and John A. List. 2006. Putting behavioral economics to work: Testing for gift exchange in labor markets using field experiments. Econometrica 74 (5): 1365–84.

Hock, Heinrich, Priyanka Anand, Linda Mendenko, Rebecca DiGiuseppe, and Ryan McInerney. 2015. The effectiveness of prepaid incentives in a mixed-mode survey. Presentation at the annual conference of the American Association for Public Opinion Research.

Jäckle, Annette, and Peter Lynn. 2008. Respondent incentives in a multi-mode panel survey: Cumulative effects on nonresponse and bias. Survey Methodology 34 (1): 105–17.

Mann, Sue L., Diana J. Lynn, and Arthur V. Peterson. 2008. The “downstream” effect of token prepaid cash incentives to parents on their young adult children’s survey participation. Public Opinion Quarterly 72 (3): 487–501.

Mercer, Andrew, A. Caporaso, D. Cantor, and R. Townsend. 2015. How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly 79 (1): 105–29.

Singer, Eleanor, and Cong Ye. 2013. The use and effects of incentives in surveys. ANNALS of the American Academy of Political and Social Science 645: 112–41.

Singer, Eleanor, Nancy Gebler, Trivellore Raghunathan, John Van Hoewyk, and Katherine McGonagle. 1999. The effect of incentives in interviewer-mediated surveys. Journal of Official Statistics 15 (2): 217–30.

What Works Clearinghouse. 2017. Standards handbook, version 4. Available at https://ies.ed.gov/ncee/wwc/Docs/referenceresources/wwc_standards_handbook_v4.pdf.

Footnote: One program selected for the evaluation will involve participants under the age of 18. For the evaluation of this program, informed consent will also be collected from the participant’s parent or guardian, and assent will be collected from the participant. Some interventions might also involve adults or youths with cognitive disabilities. For these interventions, the NextGen Project will rely on determinations, screenings, or assessments made by site staff to ensure the potential participants are capable of understanding the consent process and implications of participating in the study. If program staff determine that a potential participant is unable to understand, that individual will be exempt from the NextGen Project and will not be included in any data collection.

Footnote: As noted in Section A2, the NextGen Project has currently identified five programs and is not actively recruiting new programs. However, we remain open to the possibility of adding programs later if the need arises or an eligible program provides a good fit for the study design. If the expected number of study participants is lower than 10,000 at the end of the intake period, we will provide an update to OMB.
