OPRE Evaluation: Evaluation of Employment Coaching for TANF and Other Related Populations [Experimental impact study and an Implementation study]

OMB: 0970-0506

Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes



Evaluation of Employment Coaching for TANF and Related Populations




OMB Information Collection Request

0970-0506





Supporting Statement

Part A






April 2022








Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officer:

Lauren Deutsch



Part A




Executive Summary


  • Type of Request: This Information Collection Request (ICR) is for an extension to continue ongoing approved data collection activities and to add new activities for the sample enrolled in the Evaluation of Employment Coaching for Temporary Assistance for Needy Families (TANF) and Related Populations (OMB #0970-0506). We are requesting (1) an extension for the previously approved second follow-up survey data collection; (2) new data collection through a third follow-up survey; and (3) new data collection through follow-up semi-structured interviews with program management, staff, supervisors, and participants. We are requesting three years of approval.

  • Progress to Date: Study enrollment is complete; the programs recruited 5,026 people eligible for their services to participate in the study. Approved data collection activities related to the implementation study are complete and analysis is ongoing. The approved first follow-up survey data collection is complete and resulted in 3,255 respondents, for a response rate of 68 percent. The approved second follow-up survey, which is ongoing, has 3,579 respondents to date.

  • Timeline: The study enrollment period was extended through February 2020. Study participants are released for second follow-up survey data collection 21 to 24 months after study enrollment. All study participants were released for second follow-up survey data collection in February 2022. After release, survey data collection with these second follow-up survey participants may take up to six months to complete. Therefore, we will not be able to complete the second follow-up survey data collection by the current end of the approval period in April 2022. The proposed third follow-up survey data collection will take place from September 2022 to August 2024. The proposed semi-structured interview data collection will take place from approximately August 2022 to October 2022.

  • Previous Terms of Clearance: The initial ICR approval required a memo detailing response rate patterns under the initially approved two-tiered follow-up survey incentive. We anticipate submission of that memo in spring 2022.

  • Summary of Changes Requested: We are requesting an extension with no changes for the previously approved second follow-up survey data collection to allow sample members who enrolled at the end of the study intake period to complete the second follow-up survey.

We are proposing new instruments to collect descriptive information about how coaches form trusting relationships with their participants and other key topics that have emerged as important in analysis of previously collected study data. These new instruments include a third follow-up survey for participants and additional follow-up through semi-structured interviews with program management, staff, supervisors, and participants. The survey will provide information on participants at least four years after random assignment and the interviews will enable additional input from employment coaching program staff and participants on processes and perceptions of employment coaching.

We do not intend for this information to be used as the principal basis for public policy decisions.

  • Description of Request: The request includes an extension to complete data collection with the currently approved second follow-up survey (Attachment N) and a request for approval of new instruments, including a third follow-up survey (Attachment Q) and semi-structured interviews with management (Attachment R), staff and supervisors (Attachment S), and participants (Attachment T).

  • Time Sensitivity: The current approval for the second follow-up survey will expire on April 30, 2022. We will not be able to complete the second follow-up survey data collection by that date and are therefore seeking approval before then so that we can continue fielding the survey without disruption.



A1. Necessity for Collection

This data collection will provide information about the effectiveness of employment coaching programs in helping TANF and other low-income populations achieve economic independence. The proposed extension for the second follow-up survey data collection under OMB #0970-0506 will allow for continued follow-up in the evaluation sites. The extension is necessary to complete the second follow-up survey. There are no changes to the previously approved information collection.

The approved data collections for this study provide information about program impacts for the first 21 to 24 months after study enrollment. However, it is possible that the programs continue to generate impacts beyond this period, particularly given that three employment coaching interventions included in the study continue to provide services beyond the period covered by the second follow-up survey. The proposed new information collection through a third follow-up survey under OMB #0970-0506 will provide information about participants at least four years after random assignment. This activity will provide rigorous evidence on whether the coaching interventions are effective, for whom, and under what circumstances over the longer term. The information collected at a later follow-up point will be used to assess how employment coaching might have a continued effect on participants long after they have left coaching programs.

Qualitative analysis of completed approved data collections related to the implementation study suggests the importance of the relationship between the coach and program participant and of certain program features, such as participation incentives. The proposed new information collection through follow-up semi-structured interviews with management, staff, supervisors, and participants under OMB #0970-0506 will enable additional input from employment coaching program staff and participants on the processes and perceptions of employment coaching. The proposed new data collection instruments will provide descriptive information about how coaches form trusting relationships with their participants and other key topics that have emerged as important in analysis of previously collected study data.

There are no legal or administrative requirements that necessitate this collection. ACF is undertaking the collection at the discretion of the agency.


A2. Purpose

Purpose and Use

The information collected through the instruments included in this Information Collection Request (ICR) will be used to learn about coaching interventions in employment programs serving Temporary Assistance for Needy Families (TANF) and other low-income populations. The data collection efforts will provide information on implementation of coaching interventions, the experiences of the program participants who are paired with a coach, and the interventions’ effectiveness at improving outcomes for program participants. They will also provide information on the reasons interventions may or may not be effective, the successes and challenges in implementing them, and potential solutions for addressing those challenges.

This information will be synthesized and disseminated through a series of reports and research briefs intended for diverse audiences, such as program practitioners, policymakers, and other stakeholders interested in employment coaching or programs intended for low-income populations. If the information collection does not take place, policymakers and providers of coaching programs will lack high-quality and long-term information on the effects of the interventions.

The information collected is meant to contribute to the body of knowledge on ACF programs. It is not intended to be used as the principal basis for a decision by a federal decision-maker, and is not expected to meet the threshold of influential or highly influential scientific information.

Research Questions

The instruments included in this ICR will provide data for the study to answer the following research questions:

  1. Do the coaching interventions improve participants’ employment outcomes (such as employment, earnings, job quality, job retention, job satisfaction, and career advancement); self-sufficiency (income, public assistance receipt); and other measures of well-being?

  2. Do the coaching interventions improve measures of self-regulation? To what extent do impacts on self-regulation explain impacts on employment outcomes?

  3. Are the coaching interventions more effective for some groups of participants than others?

  4. How do the impacts of the coaching interventions change over time?

  5. What factors are associated with coaches and participants developing strong, trusting relationships? Do these factors differ across circumstances? Do coaches and participants have different perspectives on these factors?

  6. What role do knowledge and discussions of executive skills play in coaching? Do coaches and participants have different perspectives on the role of these discussions?

  7. What role does paying participants incentives for program participation or for reaching milestones play in coaching? Do coaches and participants have different perspectives on the role of incentives?


Study Design

This ongoing study includes an impact evaluation to provide rigorous evidence on whether the coaching interventions are effective over the longer term, for whom, and under what circumstances. It also includes an implementation assessment that describes the coaching interventions and how they operated, provides information on the contrast between the treatment and control groups, and details challenges to implementing the interventions and solutions for addressing those challenges. The study includes the following coaching interventions: MyGoals for Employment Success in Baltimore (MyGoals Baltimore); MyGoals for Employment Success in Houston (MyGoals Houston); Family Development and Self-Sufficiency (FaDSS) program in Iowa; LIFT in New York City, Chicago, and Los Angeles; Work Success in Utah; and Goal4 It! in Jefferson County, Colorado. Descriptions of these interventions appear in SSB, section B.2.

The impact evaluation of the study is experimental. Participants eligible for the coaching services were asked to consent to participate in the study and, if consent was given, were randomly assigned to one of two groups: a treatment group offered coaching or a control group not offered coaching. Individuals who did not consent to participate in the study were not eligible to receive coaching, were not randomly assigned, and will not participate in the data collection efforts. The control group may receive other services within the program. Both groups will remain eligible for other services offered in the community. For example, the control group may receive regular case management from staff who have not been trained in coaching. With this design, the research groups are likely to have similar characteristics, so differences in outcomes too large to be attributable to chance can be attributed to the coaching intervention. As previously approved, we collected information at baseline (before or during random assignment) from study participants and staff and again about 6 to 12 months after random assignment. The approved second follow-up survey and the new third follow-up survey included in this request will provide information about key outcomes 21 to 24 months after study enrollment and at least 48 months after study enrollment, respectively, and enable estimation of impacts at those two time points. We do not plan to include Work Success in the third follow-up survey data collection because the program’s service provision period is shorter than those of the other programs, and its impacts at 48 months are not expected to differ from its impacts at 21 to 24 months. The new semi-structured interviews are part of the implementation assessment and will provide context for how coaches form trusting relationships with their participants and other factors that are associated with more effective coaching.

Table A.1 provides an overview of the data collection instruments.

Table A.1. Information Collections

Data Collection Activity

Instruments

Respondent, Content, Purpose of Collection

Mode and Duration

Second Follow-Up Survey

Adult Survey

Respondents: Study participants


Content: Receipt of employment coaching services, receipt of support services, barriers to employment, additional education credentials, completion of training program, diploma or training certificate receipt, self-regulation, current employment status and formal employment history, job quality, economic well-being, receipt of TANF, SSI, SSDI, SNAP, WIC, unemployment insurance, or housing assistance, housing status, experience with the criminal justice system


Purpose: To continue to learn about the effectiveness of coaching interventions at improving outcomes for participants in employment programs serving TANF and related populations.

Mode: Web, with phone and field non-response follow-up conducted about 21 months after study enrollment


Duration: 45 minutes

Third Follow-Up Survey

Adult Survey

Respondents: Study participants for the FaDSS, Goal4 It!, LIFT, MyGoals Baltimore, and MyGoals Houston interventions


Content: Barriers to employment, additional education or training, self-regulation, current employment status and formal employment history, job quality, economic well-being, receipt of TANF, SSI, SSDI, SNAP, WIC, unemployment insurance, or housing assistance, housing status, experience with the criminal justice system


Purpose: To examine long-term impacts across outcome domains of interest


Mode: Web, with phone and field non-response follow-up conducted at least 48 months after study enrollment


Duration: 45 minutes


Semi-structured management interviews

Management topic guide


Respondents: Program management


Content: Factors that affect the development of trusting relationships between coaches and participants, role of discussions of executive skills in coaching, structure and role of paying incentives in coaching programs


Purpose: To provide context to findings from earlier implementation analysis and provide additional information relevant to practitioners looking to implement coaching effectively


Mode: Semi-structured interview


Duration: 60 minutes


Semi-structured interviews for staff and supervisors

Supervisor and staff topic guide

Respondents: Program supervisors and staff


Content: Factors that affect the development of trusting relationships between coaches and participants, role of discussions of executive skills in coaching, structure and role of paying incentives in coaching programs


Purpose: To provide context to findings from earlier implementation analysis and provide additional information relevant to practitioners looking to implement coaching effectively


Mode: Semi-structured interview


Duration: 60 minutes


Semi-structured participant interviews

FaDSS and Goal4 It! topic guide

LIFT topic guide

MyGoals topic guide


Respondents: Study participants for the FaDSS, Goal4 It!, LIFT, MyGoals Baltimore, and MyGoals Houston interventions

Content: Factors that affect the development of trusting relationships between coaches and participants, role of discussions of executive skills in coaching, structure and role of paying incentives in coaching programs; topic guide content varies by intervention based on the program model


Purpose: To provide context to findings from earlier implementation analysis and provide additional information relevant to practitioners looking to implement coaching effectively

Mode: Semi-structured interview


Duration: 90 minutes for FaDSS and Goal4 It!

120 minutes for LIFT

150 minutes for MyGoals



Other Data Sources and Uses of Information

In addition to the follow-up surveys, the evaluation will continue to use administrative records data to assess outcomes of interest; this information is already being collected and represents no additional burden for participants or program staff. The project team will collect administrative data on quarterly earnings, receipt of unemployment insurance, and new hires for all study participants from the National Directory of New Hires (NDNH), which is maintained by the Office of Child Support Enforcement at ACF. The project team will also collect records for study participants on TANF benefit receipt from state or local TANF agencies and, where available from the same agency, Supplemental Nutrition Assistance Program (SNAP) benefit receipt.


A3. Use of Information Technology to Reduce Burden

This evaluation is using multiple applications of information technology to reduce burden. For example, the follow-up surveys are hosted on the internet via a live secure web-link. To reduce burden, the surveys employ the following: (1) secure log-ins and passwords so that respondents can save and complete the survey in multiple sessions, (2) drop-down response categories so that respondents can quickly select from a list, (3) dynamic questions and automated skip patterns so that respondents only see those questions that apply to them (including those based on answers provided previously in the survey), and (4) logical rules for responses so that respondents’ answers are restricted to those intended by the question.

Respondents also have the option to complete the follow-up surveys using computer-assisted telephone interviewing (CATI). CATI reduces respondent burden, relative to interviewing via telephone without a computer, by automating skip logic and question adaptations and by eliminating delays caused when interviewers must determine the next question to ask. CATI is programmed to accept only valid responses based on preprogrammed checks for logical consistency across answers.

The proposed semi-structured interviews will be conducted either in-person, by video, or by phone, according to each respondent’s preference and any health concerns. The use of technology in this case is intended to both reduce burden on respondents and address potential participant concerns related to COVID-19.


A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency

Information that is already available from alternative data sources will not be collected again for this evaluation. We will be collecting information related to employment and earnings both through administrative records and directly from study participants. This information is not duplicative because the two sources cover different types of employment. Information on quarterly earnings from jobs covered by unemployment insurance will be obtained from NDNH administrative records. The follow-up surveys ask for earnings across all jobs, including those not covered by unemployment insurance. A number of experimental employment evaluations have found large differences in survey- and administrative-based earnings impacts (Barnow and Greenberg 2015). Therefore, collecting information from both sources is necessary for a full understanding of impacts on earnings. To further identify and avoid duplication, we do not request baseline characteristic information in the second and third follow-up surveys for participants who already provided this information in the first follow-up survey. In the third follow-up survey, we limit the length of the reference period for a number of items in an effort to additionally minimize burden on respondents.

The additional semi-structured interviews we propose conducting will collect information that is not available from any other data source.


A5. Impact on Small Businesses

No small businesses will be involved with this information collection.


A6. Consequences of Less Frequent Collection

A first follow-up survey is available to participants approximately 6 to 12 months after random assignment. The second follow-up survey is available to participants about 21 to 24 months after random assignment. The third follow-up survey would be available to participants about 48 months after random assignment. The second and third follow-up surveys are intended to collect a similar set of outcome data as the first. This will allow an examination of whether the impacts of the program changed over time and whether changes in self-regulation skills were associated with changes in employment and self-sufficiency outcomes. The goal of the third follow-up survey is to learn about the long-term effects of employment coaching. The information collected at a later follow-up point will be used to assess how employment coaching might have a continued effect on participants long after they have left coaching programs. With a follow-up period of 48 months, program participation will be complete and impacts of the program are likely to have reached a steady state.

The additional implementation study data collection for which this request seeks approval will be a one-time data collection.


A7. Now subsumed under 2(b) above and 10 (below)

A8. Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on February 11, 2022, Volume #87, No. 29, page 8019, and provided a sixty-day period for public comment. During the notice and comment period, no substantive comments were received.

Consultation with Experts Outside of the Study

We did not consult with experts outside of the study in developing the data collection instruments.


A9. Tokens of Appreciation

Estimates of program impacts may be biased if respondents differ substantially from non-respondents and those differences are correlated with assignment to the evaluation treatment or control groups. The risk of biased impact estimates increases with lower overall survey response rates or larger differences in survey response rates between the research groups (What Works Clearinghouse 2013). Thus, if low overall response rates or large differential response rates between the research groups are observed, differences between groups on key outcomes might be the result of differences in baseline characteristics among survey respondents and cannot be attributed solely to the effect of the coaching intervention (What Works Clearinghouse 2013).

Concern about the potential for low overall response rates is particularly relevant to this study. The longitudinal nature of the study adds to the complexity of the second and third follow-up survey data collections. Additionally, the coaching interventions are designed for unemployed low-income people. A number of factors could complicate tracking such participants over time. These factors include:

  • Unstable housing.

  • Less use of mortgages, leases, public utility accounts, cell phone contracts, credit reports, memberships in professional associations, licenses for specialized jobs, activity on social media, and appearances in publications such as newspapers or blogs.

  • Use of an alias to get utility accounts because of poor credit and prior payment issues.

  • Use of pay-as-you-go cell phones. These phone numbers are generally not tracked in online databases. Pay-as-you-go cell phone users also switch numbers frequently, which makes contacting them across a follow-up period more difficult.

Differential response rates between the treatment and control groups could bias this study’s impact estimates. Participants assigned to the control group may be less motivated to participate than those assigned to the treatment group because they are not receiving the intervention. They may also feel that the surveys are not relevant to them.

To address these concerns, the study team has implemented a number of procedures, as detailed in Supporting Statement B, section B5. As part of this effort, OPRE proposed and OMB approved tokens of appreciation for this study. In extending the approved data collection related to the second follow-up survey, we will retain the current data collection procedures. This includes the approved $50 tokens of appreciation to study participants for the completion of the second follow-up survey, which is administered about 21 months after study enrollment.

For the new data collection related to the third follow-up survey, to be administered at least 48 months after study enrollment, we propose providing a $5 cash pre-pay enclosed with advance letters and a $65 gift card upon survey response in appreciation of continued participation in the study. Previous research demonstrates that prepaid tokens of appreciation enclosed with advance letters reliably increase response rates (Singer and Ye 2013; Mercer et al. 2015). The proposed increase in the amount of the incentive reflects the fact that survey response is often more difficult to achieve for longer-term follow-up surveys and that inflation has been high in recent months. The increased incentive amount and pre-pay incentive will facilitate survey response and reduce the risk of survey attrition bias in the study’s impact estimates, as discussed in more detail below. Recently, the Supporting Youth to be Successful in Life study received OMB approval (#0970-0574) to provide a $65 gift card to youth at risk of homelessness who complete their 24-month follow-up survey. Supporting Youth to be Successful in Life and the third follow-up survey under this study have similar concerns around both retaining respondents through the study period and locating respondents to complete the follow-up surveys.

As previously approved, we also offered a $60 gift card to respondents who participated in the in-depth interviews for this effort’s implementation study, which are estimated to take 2.5 hours on average. For the new semi-structured participant interviews proposed as part of this request, we propose offering participants gift cards of different amounts based on their study site, which determines the expected length of the participant interview. Specifically, we propose a $60 gift card for MyGoals participants who complete the interviews (average length 2.5 hours), a $50 gift card for LIFT participants (average length 2 hours), and a $40 gift card for FaDSS and Goal4 It! participants (average length 1.5 hours).


A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing

Personally Identifiable Information (PII)

As part of the study, we collected the respondent's first and last name, date of birth, Social Security Number, mailing address, telephone number, and email address; this information may be updated during the data collection period. The purpose of collecting this PII is to:

  • Confirm the identity of the respondent during follow-up efforts,

  • Contact respondents to participate in follow-up efforts,

  • Facilitate locating nonresponsive respondents, and

  • Request administrative data.


Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.


Assurances of Privacy

PII will be kept private to the extent permitted by law. As part of the consent process, respondents were informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law.

Due to the sensitive nature of this research (see A.11 for more information), the evaluation obtained a Certificate of Confidentiality. The Certificate of Confidentiality helps assure participants that their information will be kept private to the fullest extent permitted by law.

Data Security and Monitoring

As specified in the evaluator’s contract, the Contractor shall protect respondent privacy to the extent permitted by law and will comply with all Federal and Departmental regulations for private information. Mathematica has developed a Data Safety and Monitoring Plan that assesses all protections of respondents’ PII. Mathematica and its subcontractor Abt Associates will ensure that all of their employees, subcontractors (at all tiers), and employees of each subcontractor who perform work under this contract/subcontract are trained on data privacy issues and comply with the above requirements. All study staff with access to PII will receive study-specific training on (1) limitations on disclosure; (2) safeguarding the physical work environment; and (3) storing, transmitting, and destroying data securely. These procedures will be documented in training manuals. Refresher training will occur annually.

As specified in the evaluator’s contract, the Contractor shall use Federal Information Processing Standard compliant encryption (Security Requirements for Cryptographic Modules, as amended) to protect all instances of sensitive information during storage and transmission. Mathematica will securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with the Federal Information Processing Standard. Mathematica will ensure that they incorporate this standard into their property management/control system, and establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process sensitive information. Any data stored electronically will be secured in accordance with the most current National Institute of Standards and Technology requirements and other applicable Federal and departmental regulations.


A11. Sensitive Information

Some sensitive questions are necessary in an evaluation of programs designed to affect employment. Before starting the baseline and follow-up surveys, all respondents are and will continue to be informed that their identities will be kept private and that they do not have to answer any question that makes them uncomfortable. Although such questions may be sensitive for many respondents, they have been successfully asked of similar respondents in other data collection efforts, such as in the follow-up surveys and first round of management, staff, and supervisor interviews and participant in-depth interviews already conducted for the Evaluation of Employment Coaching for TANF and Related Populations (OMB #0970-0506), as well as in the Parents and Children Together (OMB #0970-0403) study and the Workforce Investment Act Gold Standard Evaluation (OMB #1205-0504).

The sensitive questions relevant for this request for the second and third follow-up surveys include:

  • Wage rates and earnings. It is necessary to ask about earnings because increasing participants’ earnings is a key goal of coaching interventions. The second follow-up survey asks about each job worked since random assignment, the wage rate, and the number of hours worked per week. The third follow-up survey asks about each job worked during the year before survey response.

  • Challenges to employment. It is important to ask about challenges to employment to understand how programs might be supporting participants and their potentially unmet needs. The second and third follow-up surveys include questions that address challenges to employment.

  • Convictions. Prior involvement in the criminal justice system makes it harder to find employment. The second follow-up survey asks about convictions that occurred before random assignment as baseline information (if participants did not already provide this information in the first follow-up survey) and convictions that occurred after random assignment or since the first follow-up survey as an outcome that may be affected by coaching. The third follow-up survey asks about criminal justice involvement in the year before survey response.

  • Economic hardships. The second and third follow-up surveys ask about economic hardships, such as missing meals or needing to borrow money from friends. These outcomes reflect a lack of self-sufficiency and may be affected by coaching.

  • COVID-19 vaccination status. The third follow-up survey asks about COVID-19 vaccination status. This information will provide context for observed employment patterns, given that vaccinated people might be eligible for more jobs than unvaccinated people and might be less subject to the risk of infection or serious disease.


A12. Burden

Explanation of Burden and Cost Estimates

The following provides information about how we calculated respondent burden for each activity:

  • Second follow-up survey (Attachment N). For this ongoing data collection, we anticipate an average burden of about 45 minutes per completion based on responses to date. This is consistent with our previous estimate. There are 824 remaining potential respondents.

  • Third follow-up survey (Attachment Q). We anticipate an average burden of about 45 minutes per completion, given experience with the first and second follow-up surveys, which are similar in structure and content. We plan to administer the third follow-up survey to all study participants in the FaDSS, Goal4 It!, LIFT, MyGoals Baltimore, and MyGoals Houston interventions who did not firmly refuse to respond to the first or second follow-up surveys and who are not deceased. Of the original 4,273 study participants in these interventions, 4,239 are potential respondents to the third follow-up survey.

  • Semi-structured management interviews (Attachment R). We anticipate that these interviews will take an hour to complete on average based on experience with earlier semi-structured staff interviews conducted as part of the implementation study. We propose conducting 20 interviews with management staff.

  • Semi-structured staff and supervisor interviews (Attachment S). We anticipate that these interviews will take an hour to complete on average based on experience with earlier semi-structured staff interviews conducted as part of the implementation study. We propose conducting 40 interviews with staff and supervisors.

  • Semi-structured participant interviews (Attachment T). We anticipate that the duration of the interviews will vary by site because of differences in the number of questions asked by site. We expect the average interview to last 90 minutes for FaDSS and Goal4 It! participants, 120 minutes for LIFT participants, and 150 minutes for MyGoals participants. We propose conducting approximately 7 interviews with participants from each of the FaDSS, Goal4 It!, LIFT, MyGoals Baltimore, and MyGoals Houston sites.

Estimated Annualized Cost to Respondents

To estimate annualized costs to respondents for the second follow-up survey and the semi-structured interviews with program participants, we used the federal minimum wage as the basis for the program participant hourly wage. For the other semi-structured interviews, we used the U.S. Bureau of Labor Statistics Occupational Employment and Wages median hourly rate for occupation 11-9151, Social and Community Service Managers, as the basis for the management, staff, and supervisor hourly wage.

Table A.2 presents the 618 annual burden hours remaining at the time of this request from the previously approved information collection for which we are requesting an extension.

Table A.2. Annual Burden Estimates -- Burden Remaining From Previously Approved Information Collections

Instrument                 No. of        Responses per   Avg. Burden per    Total Burden   Average       Total
                           Respondents   Respondent      Response (hours)   (hours)        Hourly Wage   Annual Cost
Second follow-up survey    824           1               0.75               618            $7.25         $4,481

Respondent and response counts are totals over the request period.

Estimated Total Annual Burden Hours: 618



Table A.3 provides the estimated annual burden and cost calculations for the third follow-up survey and semi-structured interview data collections included in this ICR. The total annual estimated burden is 1,104 hours.

Table A.3. Annual Burden Estimates -- New Burden Requested Under This Information Collection

Instrument                                                    No. of        Responses per   Avg. Burden per    Total Burden   Annual Burden   Average       Total
                                                              Respondents   Respondent      Response (hours)   (hours)        (hours)         Hourly Wage   Annual Cost
Third follow-up survey                                        4,239         1               0.75               3,179          1,060           $7.25         $7,685
Semi-structured management interviews                         20            1               1                  20             7               $33.46        $234
Semi-structured staff and supervisor interviews               40            1               1                  40             13              $33.46        $435
Semi-structured participant interviews, FaDSS and Goal4 It!   14            1               1.5                21             7               $33.46        $234
Semi-structured participant interviews, LIFT                  7             1               2                  14             5               $33.46        $167
Semi-structured participant interviews, MyGoals               14            1               2.5                35             12              $33.46        $402

Respondent and response counts are totals over the request period.

Estimated Total Annual Burden Hours: 1,104
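The figures in Table A.3 follow a simple formula: total burden equals respondents times responses per respondent times hours per response; annual burden is the total divided by the three-year approval period; and annual cost is annual burden hours times the hourly wage. As a minimal sketch (not part of the submission; the helper name and the round-to-nearest-hour convention are assumptions inferred from the table), the arithmetic can be checked as:

```python
# Sketch of the burden arithmetic behind Table A.3 (hypothetical helper;
# rounding to whole hours/dollars is inferred from the published figures).
REQUEST_YEARS = 3  # three years of approval requested

def burden_row(respondents, responses, hours_per_response, hourly_wage):
    """Return (total_hours, annual_hours, annual_cost) for one instrument."""
    total = round(respondents * responses * hours_per_response)
    annual = round(total / REQUEST_YEARS)
    cost = round(annual * hourly_wage)
    return total, annual, cost

rows = [
    # (respondents, responses per respondent, hours per response, hourly wage)
    (4239, 1, 0.75, 7.25),   # third follow-up survey
    (20,   1, 1.0,  33.46),  # management interviews
    (40,   1, 1.0,  33.46),  # staff and supervisor interviews
    (14,   1, 1.5,  33.46),  # participant interviews, FaDSS and Goal4 It!
    (7,    1, 2.0,  33.46),  # participant interviews, LIFT
    (14,   1, 2.5,  33.46),  # participant interviews, MyGoals
]

annual_total = sum(burden_row(*r)[1] for r in rows)
print(annual_total)  # 1104 annual burden hours, matching Table A.3
```

Each row reproduces the corresponding table entries (for example, the third follow-up survey: 4,239 × 0.75 = 3,179 total hours, 1,060 annual hours, and 1,060 × $7.25 = $7,685).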




A13. Costs

There are no additional costs beyond those outlined in sections A12 and A14.


A14. Estimated Annualized Costs to the Federal Government

The total cost to the federal government for the data collection activities described in this request is estimated at $5,629,544 over the request period, or $1,876,515 annually. These estimates are derived from Mathematica’s budgeted estimates and include labor rates and direct and indirect costs; they are displayed below in Table A.4.

Table A.4. Estimated Annualized Costs

Cost Category                          Estimated Costs
Field Work                             $4,332,938
Analysis                               $608,261
Publications/Dissemination             $688,345
Total costs over the request period    $5,629,544
Annual costs                           $1,876,515
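The annual costs in Table A.4 are simply the total over the request period divided by the three years of approval requested. A minimal sketch of that arithmetic (variable names are illustrative, not part of the submission):

```python
# Annualizing Table A.4's estimated costs over the three-year request period.
costs = {
    "Field Work": 4_332_938,
    "Analysis": 608_261,
    "Publications/Dissemination": 688_345,
}
total = sum(costs.values())   # total costs over the request period
annual = round(total / 3)     # annual costs, rounded to the nearest dollar
print(f"${total:,} total; ${annual:,} annual")  # $5,629,544 total; $1,876,515 annual
```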




A15. Reasons for changes in burden

This request is in part a new information collection: the third follow-up survey and the follow-up semi-structured interviews are new data collection activities. For the continuing collection for which an extension is requested, there is no change in burden.




A16. Timeline

Study enrollment and baseline data collection began in summer 2018, as previously approved by OMB (OMB #0970-0506). Over the duration of the evaluation, a series of reports will be generated, the timing for which is highlighted in Table A.5. We will produce reports on the impact findings for the first, second, and third follow-up periods. Reports on the implementation study include a detailed report describing each program and a report examining the implementation findings across all programs (a cross-site implementation study report). In addition to these reports, this evaluation may provide opportunities for analyzing and disseminating additional information through special topics reports and research or issue briefs. We will also provide a public or restricted-use data file for others to replicate and extend our analyses. Findings related to the proposed semi-structured interviews will be incorporated into the evaluation’s implementation reports and impact analyses and reporting.

Table A.5. Study schedule

Activity                                            Timing*

Data collection
  Sample enrollment and baseline data collection    Spring 2018 through Fall 2019 for FaDSS; Summer 2018 through Fall 2019 for LIFT and Goal4 It!; Spring 2019 through Spring 2020 for Work Success; not applicable for the two MyGoals sites
  Implementation study data collection              Summer 2018 through Fall 2020
  First follow-up survey                            Spring 2018 through Spring 2021
  Second follow-up survey                           Spring 2019 through Summer 2022
  Third follow-up survey                            Fall 2022 through Spring 2024
  Additional management, supervisor, staff, and
  participant semi-structured interviews            Summer 2022 through Spring 2023

Reporting
  Implementation study reports                      2020-2022
  First follow-up findings report                   2022
  Second follow-up findings report                  2023
  Third follow-up findings report                   2025
  Special topics reports on additional interviews   To be determined

*Future dates are dependent on the date of OMB approval of this information collection request.

A17. Exceptions

No exceptions are necessary for this information collection.



Attachments

Attachment N: Second Follow-Up Survey

Attachment Q: Third Follow-Up Survey

Attachment R: Semi-structured Interviews for Management

Attachment S: Semi-structured Interviews for Staff and Supervisors

Attachment T: Semi-structured Interviews for Participants

Attachment U: Third Follow-up Survey Question-by-question Justification

Attachment V: Notifications



