
Job Search Assistance Strategies (JSA) Evaluation - Extension for the Six-Month Follow-up Survey

OMB: 0970-0440




Supporting Statement A

For the Paperwork Reduction Act of 1995: Request for an Extension for the Six-Month Follow-up Survey for the Job Search Assistance Strategies Evaluation




OMB No. 0970-0440





Revised January 2018



Submitted by:

Office of Planning, Research & Evaluation

Administration for Children & Families

U.S. Department of Health and Human Services


Federal Project Officer

Carli Wulff

Table of Contents

Attachments

  A. Six-month Follow-up Survey

  B. OMB 60-Day Notice




Part A: Justification

This document provides supporting statements for the collection of information for the Job Search Assistance (JSA) Strategies Evaluation (hereafter, JSA Evaluation), funded by the Office of Planning, Research, and Evaluation (OPRE) in the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services. The goal of the JSA Evaluation is to determine what types of job search assistance strategies are most effective in helping Temporary Assistance for Needy Families (TANF) and other cash assistance applicants and recipients find employment and increase their earnings. The JSA Evaluation contains both an impact study and an implementation study. To date, the impact study has randomly assigned individuals to contrasting JSA approaches. The study will then compare participant employment and earnings to determine the relative effectiveness of these strategies in moving TANF recipients to work and increasing their earnings. The evaluation will also report on the implementation of these strategies, including measures of services received under each JSA approach, and operational lessons. Abt Associates and its partner Mathematica Policy Research are conducting the evaluation.

In October 2013, the JSA Evaluation received approval from the Office of Management and Budget (OMB) for a first set of data collection instruments for the JSA Evaluation, specifically for the field assessment and site selection process (OMB No. 0970-0440). Instruments approved in that submission included the Discussion Guide for Researchers and Policy Experts, the Discussion Guide for State and Local TANF Administrators, and the Discussion Guide for Program Staff. In November 2014, OMB provided approval for the next set of data collection forms—the Baseline Information Form, Staff Surveys, and Implementation Study Site Visit Guides—under the same approval number. In February 2016, OMB provided approval for three additional data collection efforts related to a follow-up survey administered to sample members—the Contact Update Form, Interim Tracking Surveys, and the JSA six-month follow-up survey instrument—under the same approval number. Approval for these activities expires on February 28, 2018.

This submission seeks OMB approval for the continuation of the JSA six-month follow-up survey instrument. There are no changes requested to this survey. All other information collection under 0970-0440 will be complete by the original OMB expiration date of February 28, 2018. The extension to the follow-up survey is needed because the study enrollment period was lengthier than planned; the enrollment period was originally estimated to span 12 months, but it took 18 months to complete enrollment, leaving insufficient time to complete the six-month follow-up survey. This submission is requesting a four-month extension in order to allow individuals randomly assigned between June and August 2017 to complete the follow-up survey in the same timeframe as earlier enrollees.

The JSA six-month follow-up survey, administered to sample members by telephone, will be a key source for outcomes of interest in the JSA Evaluation. While the principal source of employment and earnings data for the impact study is quarterly Unemployment Insurance records from the National Directory of New Hires (NDNH) (maintained by the Office of Child Support Enforcement (OCSE) at HHS), this follow-up survey will provide critical information on additional measures of interest. This includes the content, mode, and duration of JSA services received; job characteristics related to job quality (e.g. wage, benefits, and schedule); factors affecting the ability to work; public benefit receipt beyond TANF; and household income. Evaluators will use the survey results in both the impact study, to collect data on key outcomes, and the implementation study, to document the JSA services received.

A.1 Necessity for the Data Collection

A.1.1 Study Overview

JSA programs are typically short-term, low-cost programs designed to help job seekers find jobs. Some JSA programs focus on helping job seekers find jobs more quickly than they would on their own, others focus on helping job seekers find better jobs, and some focus on both. TANF programs typically require participation in JSA activities as a condition of receiving cash assistance.

At a general level, there is evidence that JSA strategies are effective in increasing employment, but the impacts are modest (Klerman et al., 2012). However, there are many different approaches to providing job search assistance, and very little evidence regarding which strategies are more effective. For example, is providing JSA in a group setting with instruction on job search techniques more or less effective than having individuals search for work one-on-one with a staff person? Are longer time commitments to search for a job more effective than shorter ones? Given that JSA is an important TANF work activity in all states, ACF sponsored this evaluation to determine the relative effectiveness of various JSA approaches in a large-scale randomized trial across multiple sites.

The impact study component of the JSA Evaluation will use an experimental design to determine the relative effectiveness of contrasting JSA approaches. The evaluation requires TANF applicants to be randomly assigned to one of two JSA approaches, each of which provides a different set of JSA services. This will allow us to measure the incremental benefits of one approach compared to the other. The evaluation will not include a true “no services” control group. In addition to the impact study, there will be an implementation study to document the operation of the JSA approaches, including the context in which they operate; the service delivery structure; and the content, mode, intensity, and duration of services. The evaluation will address the following principal research question:

  • What are the differential impacts of alternative TANF JSA approaches on short-term employment and earnings outcomes?

In addition, the evaluation will address the following secondary research questions:

  • What are the differential impacts of alternative TANF JSA models on: (a) job quality (including wages, work-related benefits, consistency and predictability of hours); (b) public benefits received; (c) family economic well-being; and (d) non-economic outcomes (including motivation to search for a job and psycho-social skills such as perseverance and self-efficacy)?

  • What components of JSA services are linked to better employment outcomes?

  • What are the job search strategies used by successful job seekers?

The evaluation began with a field assessment to identify the contrasts of JSA program features that are of most interest to TANF policymakers and practitioners to test as part of the evaluation, and to identify sites for the evaluation.1 The research team identified three sites to be included in both the implementation and impact study (Genesee and Wayne County, MI; New York, NY; and Sacramento County, CA) and two with an implementation study only (Ramsey County, MN, and Westchester County, NY).

The evaluation consists of two key components: an impact study and an implementation study. The follow-up survey is an important data source for both.

  • Impact Study: The impact study will estimate the effects of contrasting JSA services on TANF recipients’ employment, earnings, and benefit receipt. The impact study will derive primary outcomes from existing data, including the date of new hire and quarterly earnings from NDNH and administrative data on receipt of TANF and Supplemental Nutrition Assistance Program (SNAP) benefits from the state and local TANF agencies that participate in the evaluation. The role of the follow-up survey in the impact study will be to measure outcomes that cannot be measured with these data. This includes: (1) job characteristics including wages and benefits (for those with current/recent employment); (2) public benefit receipt beyond TANF and SNAP; (3) job search-related skills such as perseverance and self-efficacy; (4) attitudes towards employment (including reservation wage and attitudes about job search); (5) factors that affect the ability to work; and (6) household income.

  • Implementation Study: The implementation study will document features and dimensions of the approaches under evaluation (e.g., type and mode of services provided, frequency of contacts between TANF recipients and program staff, sequencing of JSA among other TANF work activities) and program context. For the implementation study, the survey will supplement qualitative data from site visits with data on receipt and use of JSA services as recalled by study sample members. In addition to allowing researchers to measure the fidelity of the JSA services to the JSA approaches, this will allow us to document the strategies that lead to a successful job search and potentially identify services and activities that are linked to better outcomes. Survey-based measures for this analysis include: (1) types and frequency of services received in group instruction on job search provided by the TANF program; (2) types and frequency of services received in one-on-one instruction on job search provided by the TANF program; (3) use of on-line job search tools; (4) receipt of support services, such as child care, that facilitate job search; and (5) job search activities undertaken in the last successful job search.

A.1.2 Legal or Administrative Requirements that Necessitate the Collection

There are no legal or administrative requirements that necessitate the collection.

A.1.3 Overview of Data Collection

Six months after enrollment, all sample members receive an advance letter advising them that interviewers will attempt to call to ask them to participate in the JSA follow-up survey. The follow-up survey measures data on (1) employment status; (2) participation in job search assistance activities; (3) how the current or most recent job was found; (4) job search-related skills; (5) motivation to work/attitudes toward job search; (6) barriers to employment; (7) job characteristics and conditions; (8) job search parameters; and (9) household composition and income. Interviewers administer the six-month follow-up survey by telephone. The average interview length is about 20 minutes and sample members receive a $25 token of appreciation (in the form of a VISA gift card) for completing the survey.


Evaluators contact study participants to complete the follow-up survey six months after random assignment. On average, it takes about 12-16 weeks to complete the interviews with each cohort. The study team initially expected random assignment to end in 2016. However, random assignment occurred on a rolling basis across the three study sites and ended in the last site in August 2017:


JSA Impact Study Site           Random Assignment Start Date   Random Assignment End Date
New York, NY                    October 2015                   October 2016
Sacramento County, CA           April 2016                     June 2017
Genesee and Wayne County, MI    October 2016                   August 2017

The four-month extension to the follow-up survey would allow individuals randomly assigned in the Michigan site between June and August 2017 to complete the follow-up survey in the same timeframe as earlier enrollees.

All other data collection efforts previously approved under 0970-0440 will be complete by the original OMB expiration date of February 28, 2018. These previously approved information collection activities include:

  1. Baseline data collection. The baseline data collection covered by this clearance gathers basic identification, demographic, education, employment and income, and contact information for study participants. The form includes standard items used in prior ACF evaluations and enables the research team to: (1) describe the characteristics of study participants at the time they are randomly assigned to one of the experimental groups; (2) ensure that random assignment was conducted properly; (3) create subgroups for the analysis; (4) provide contact information to locate individuals for potential follow-up surveys; and (5) improve the precision of the impact estimates.

  2. Implementation study site visits. The primary purpose of these visits is to document differences in services and practice within and across contrasting JSA approaches. This activity involves conducting site visits for the purpose of documenting the program context, program organization and staffing, the JSA service components (e.g., assessment, use of self-directed job search, group job search, one-on-one assistance, job development), sequencing and flow of activities, and other relevant aspects of the TANF program (e.g., sanction policy, economic and community context). A second purpose for the site visits was site monitoring to assess whether sites maintained distinctions between contrasting approaches.

  3. JSA staff survey. The on-line survey of TANF supervisors and line staff involved in the provision of JSA and other relevant employment-related services collects data on JSA services and other aspects of the TANF program systematically across the study sites. Researchers will use these survey data for two major purposes: (1) to advance the documentation of each JSA approach under study and (2) to derive many of the measures that will be used in the impact analysis to link program characteristics and implementation factors to program effects.

  4. Contact update form. The contact update form enables the research team to: (1) confirm or update sample member contact information; (2) confirm existing contact information for alternative contact persons; and (3) capture contact information for a new alternative contact person if needed. It is comparable to contact update forms used in prior ACF evaluations, notably the Pathways for Advancing Careers and Education (PACE) Evaluation (OMB # 0970-0397), the Health Professions Opportunity Grants (HPOG) Impact Evaluation (OMB # 0970-0394), and the Career Pathways Intermediate Outcomes (CPIO) study (OMB # 0970-0394 and 0970-0397). The contact update form was part of the welcome packet study participants received after enrollment.

  5. Interim surveys. Sample members receive an invitation to complete an interim survey on a monthly basis. The interim surveys capture important data about current employment and JSA service receipt between the point of enrollment and administration of the six-month follow-up survey. Interim survey administration is multi-modal. For those who give consent to receive text messages from the study, administration occurs on a monthly basis via SMS text messaging.2 Others are invited by email each month to complete an on-line version of the survey. Participants receive $2 as a token of appreciation for their time spent participating in each wave of the interim survey.

The data provided by the JSA follow-up survey are not available through any other source, as described further in section A.4.

A.2 Purpose of the Data Collection

A.2.1 Overview of Data Collection Instruments

This information collection request is for an extension of the previously approved JSA Evaluation six-month follow-up survey. The survey instrument was previously approved under OMB control number 0970-0440 in February 2016. A description of the follow-up survey is below.

        1. JSA Six-Month Follow-up Survey

As part of the impact study, the follow-up survey will provide critical information on additional measures of interest, in terms of service receipt, employment, and other outcomes. The survey is an important source for documenting the content, duration and intensity of job search services received; the type of job search skills that sample members possess; understanding how individuals found employment; obtaining information on employment that is not available from other sources; and obtaining information on income and public benefit receipt. Exhibit A1 summarizes covered measures for each of the domains included in the six-month follow-up survey. The follow-up survey is in Attachment A.

Exhibit A1: Key Domains for Job Search Assistance Strategies Evaluation Six-Month Follow-up Survey


A. Introduction and confirmation of identity

B. Employment status: This section captures information about employment and labor force participation.

C. Job search assistance activities under TANF: This section captures information about the receipt of different job search assistance services to measure fidelity to the two intervention conditions tested. The questions provide information on the content, intensity, and duration of service receipt (such as topics covered during group or one-on-one staff-assisted job search).

D. Tools of last successful job search: This section identifies the job search activities that helped individuals find a job (and applies only to those who found employment).

E. Job search skills: This section provides information on job search-related skills such as perseverance, self-efficacy, and career planning skills.

F. Attitudes towards work and job search: This section gathers data on motivation, measured for example by respondents’ reservation wage, and attitudes toward the job search process.

G. Factors that affect ability to work: This section collects information on barriers to employment to identify factors that may limit the effectiveness of job search services.

H. Job characteristics: This section asks about the characteristics of the current or most recent job for those who do find work, particularly wages, benefits, and schedule.

I. Household structure and income: This section collects information required to determine household poverty status and income.



Evaluators attempt follow-up survey interviews with all sample members by telephone. As described above, the survey window begins on the first day of the calendar month following the six-month anniversary of random assignment and closes 12-16 weeks later. The average interview length is 20 minutes. The research team expects to conduct the survey as a census of the 5,273 sample members enrolled across the three impact study sites. The research team has a target response rate of 80 percent (approximately 4,218 completed interviews). We have also developed strong plans for nonresponse bias analysis and weighting adjustments. (See section B.3.4 of this supporting statement.)

A.2.2 Who Will Use the Information

The primary beneficiaries of this planned data collection effort will be ACF, state and local TANF program administrators, other state and local policymakers, and other federal agencies and policymakers. ACF will use the information to understand what strategies for providing job search assistance are most effective in moving TANF applicants and recipients to work and increasing their earnings. This will be important information in guiding the operation of state and local TANF programs. Secondary beneficiaries of this data collection will be those in public policy and program administration who are interested in understanding effective job search strategies more broadly. Many agencies and programs also provide job search assistance to their clients (for example, the Department of Labor provides job search assistance to Unemployment Insurance (UI) claimants), so there is broad interest in this topic.

A.3 Improved Information Technology to Reduce Burden

The follow-up survey administration uses CATI (computer-assisted telephone interviewing) technology. CATI technology reduces respondent burden, as interviewers can proceed more quickly and accurately through the survey instruments, minimizing the interview length. Computerized questionnaires ensure that the skip patterns work properly, minimizing respondent burden by not asking inappropriate or non-applicable questions. For example, unemployed respondents will skip out of the questions about current job characteristics. Computer-assisted interviewing can build in checkpoints, which allow the interviewer or respondent to confirm responses thereby minimizing data entry errors. Finally, automated survey administration can incorporate hard edits to check for allowable ranges for quantity and range value questions, minimizing out of range or unallowable values.

A.4 Efforts to Identify Duplication

There is minimal duplication of data collection in the evaluation. The six-month follow-up survey asks about employment status even though some employment information is available through NDNH. This is necessary as a screener for questions about job conditions, information not available through NDNH. As part of questions to document all sources of income, the survey also asks about receipt of TANF and SNAP benefits despite the availability of this information from the local TANF office. Reminding respondents of all income sources is part of our strategy for obtaining good measurements of total household income. Another reason is that omitting these two sources from the list of income supports could create difficulties in the interviewing process; more concretely, some respondents might volunteer them and slow the interview if we did not ask about them explicitly.

A.5 Involvement of Small Organizations

The data collection does not involve small businesses or other small entities.

A.6 Consequences of Less Frequent Data Collection

Researchers designed the data collection effort described in this document to provide information on key outcomes of interest over the six-month follow-up period. This activity ultimately enhances researchers’ ability to obtain a complete picture of employment and job search activities during the six months following study enrollment.

No follow up after the six-month survey is planned.

A.7 Special Circumstances

There are no special circumstances for the proposed data collection.

A.8 Federal Register Notice and Efforts to Consult Outside the Agency

A.8.1 Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on November 14, 2017 (Volume 82, Number 218, pages 52735-52736) and provided a 60-day period for public comment. A copy of this notice is included as Attachment B. During the notice and comment period, the government received one request for materials and no comments.

A.8.2 Consultation with Experts Outside of the Agency

Most of the items in the survey are from previously approved data collection instruments. Researchers consulted experts in their respective disciplines (statistics, policy analysis, economics, and survey operations) from ACF and Abt Associates in developing the six-month follow-up survey instrument. Key contributors are listed below.

ACF

Ms. Erica Zielewski, Contracting Officer’s Representative (previous)

Ms. Carli Wulff, Contracting Officer’s Representative

Mr. Mark Fucello, Division Director

Ms. Naomi Goldstein, Deputy Assistant Secretary for Planning, Research and Evaluation

Abt Associates

Ms. Karin Martinson, Project Director, (301) 347-5726

Dr. Stephen Bell, Principal Investigator, (301) 634-1721

Mr. David Judkins, Statistician, (301) 347-5952

Dr. Alison Comfort, Analyst, (617) 520-2937

Ms. Debi McInnis, Survey Operations, (617) 349-2627

A.9 Incentives for Respondents

There are no changes requested to the previously approved incentive structure described in this section.

Tokens of appreciation are a powerful tool for maintaining low attrition rates in longitudinal studies. The use of tokens of appreciation for the JSA Evaluation can help maximize response rates, which is necessary to ensure unbiased impact estimates.

Three factors helped to determine the amounts for each survey:

  1. Respondent burden, both at the time of the interview and over the life of the study;

  2. Costs associated with participating in the interview at that time; and

  3. Other studies of comparable populations and burden.


Our experience with similar study populations shows that they are more likely to respond positively to tokens of appreciation. Previous research has shown that sample members with certain socio-economic characteristics are significantly more likely to become survey respondents when tokens are offered. In particular, sample members with low incomes and/or low educational attainment have proven responsive to incentives, as have minority group members. These characteristics are expected to be heavily represented in this study panel (Duffer et al., 1994; Educational Testing Service, 1991).

The token of appreciation paid for each of the interim tracking surveys ($2) is comparable to that used in the participant contact mailings for the Career Pathways Intermediate Outcomes (CPIO) study (OMB # 0970-0394 and 0970-0397). The follow-up survey token of appreciation is comparable to those used in other studies of similar populations, including the survey conducted for the Health Professions Opportunity Grants (HPOG) Impact Evaluation (OMB # 0970-0394).

Study participants receive small tokens of appreciation during the six-month follow-up period between enrollment and the follow-up survey data collection.

  • Participants receive $2 initially, as part of their welcome packet. Each month, those who complete the interim tracking survey earn an additional $2 for each completed interim survey interview. This money is accrued and paid out prior to the start of the six-month follow-up survey. At the end of each completed interim tracking survey, a thank-you message displays indicating how much money the respondent has ‘banked’ cumulatively to that point and explains when they will receive payment. The accrual system is based on consumer “reward” models that follow a similar accrual. The initial payment demonstrates the study’s commitment and appreciation for the respondents’ time. Text-based surveys are a rapidly developing data collection methodology: they have been commonly used in consumer research and are emerging rapidly in public health research (CDC, 2012). Preliminary studies show promise for reaching and engaging low-income populations (Chang, 2014; Vervloet et al., 2012).



  • Prior to the start of the six-month follow-up survey, the team sends an advance letter explaining the purpose of the survey, the expectations of sample members who agree to complete that survey, and the promise of an additional $25 as a token of appreciation for their participation in the survey. The advance letter also includes payment for the cumulative amount the participant has ‘accrued’ in completing interim surveys—for example, a participant who responds to three of the five interim surveys will receive $6 and someone who responds to all five interim surveys will receive $10.



  • Sample members who complete the six-month follow-up interview receive a check for $25 as a token of appreciation for their participation. In total, enrolled participants can receive between $2 and $37 depending on how many rounds of data collection they complete.
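The payment amounts described above can be cross-checked with a short calculation. This is an illustrative sketch only; the function name and structure are not part of the study materials, and only the dollar amounts come from this supporting statement.

```python
# Token-of-appreciation amounts from the text: a $2 welcome payment,
# $2 per completed interim survey (up to five waves, paid out before
# the follow-up survey), and $25 for the six-month follow-up interview.

WELCOME_PAYMENT = 2      # sent with the welcome packet
PER_INTERIM = 2          # earned per completed interim survey
MAX_INTERIM_WAVES = 5    # monthly interim surveys before the follow-up
FOLLOWUP_PAYMENT = 25    # check for completing the follow-up interview

def total_tokens(interim_completed: int, completed_followup: bool) -> int:
    """Total dollars a participant receives across all data collection."""
    if not 0 <= interim_completed <= MAX_INTERIM_WAVES:
        raise ValueError("interim surveys completed must be between 0 and 5")
    total = WELCOME_PAYMENT + PER_INTERIM * interim_completed
    if completed_followup:
        total += FOLLOWUP_PAYMENT
    return total

print(total_tokens(0, False))  # 2  -- the $2 minimum cited in the text
print(total_tokens(3, False))  # 8  -- $2 welcome + $6 accrued interim payments
print(total_tokens(5, True))   # 37 -- the $37 maximum cited in the text
```

The $6 and $10 accrual examples in the advance-letter bullet correspond to the interim portion alone (3 × $2 and 5 × $2), before the $2 welcome payment and $25 follow-up check are counted.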

A.10 Privacy of Respondents

Abt Associates and Mathematica are very cognizant of and committed to maintaining federal, state, and ACF data security requirements. All Abt Associates and Mathematica study staff will comply with relevant policies related to secure data collection, data storage and access, and data dissemination and analysis.

The JSA research team developed strong protocols to help maintain the privacy of respondents to the extent permitted by law. All research staff working with personally identifiable information (PII) will follow strict procedures to protect private information and they will sign a pledge stating that they will keep all information gathered private to the extent permissible by law. All papers that contain participant names or other identifying information will reside in locked areas and passwords will protect any computer documents containing identifying information.

The JSA interim surveys and six-month follow-up survey are purely voluntary. Prior to the start of each survey, researchers will inform sample members that all of their responses will be kept private, their names will not appear in any written reports, and that responses to the questions are voluntary. Specifically, the research team will take the following specific measures to protect respondents’ privacy:

  • Using rigorous security measures for survey data. Abt Associates and Abt SRBI have established safeguards that provide for the confidentiality of data and the protection of the privacy of the sampled individuals on all of their studies. All data users are aware of and trained on their responsibilities to protect participants’ personal information, including the limitations on uses and disclosures of data. Signed data confidentiality agreements are also required. All personal data (identifiable and de-identified data analysis files) will reside on a secure workstation or server that is protected by a firewall and complex passwords, in a directory that can only be accessed by the network administrators and the analysts actively working on the data. Survey data collected are stored on secure CATI servers. Data transfer to and from Abt and Abt SRBI will occur through Abt’s secure online file transfer platform that utilizes FIPS 140-2 validated cryptographic modules. Researchers assign generic study identifiers – not based on PII – for each study participant to link participant data. PII is removed from all electronic files prior to analysis.

  • Notification of data security breaches. All study partners, including the sites that are participating in the evaluation, are aware that they must notify Abt Associates within one hour of a breach of PII confidentiality, per OMB rules. Study partners are also aware that they must notify Abt within 24 hours from the time any study partner knows of a breach of, or deviation from, the data security plan. Researchers will notify ACF of any data security breaches, including breaches of protocol, no later than 24 hours after Abt staff are made aware of the breach.

  • Restricting access to the study network folder. Secure servers will store all data collected that contains PII for the JSA Evaluation. Access to the study network will be restricted by assigning a password to each relevant staff member.

In addition to these study-specific procedures, the evaluator has extensive corporate administrative and security systems to prevent the unauthorized release of personal records. These systems include state-of-the-art hardware and software for encryption that meets federal standards and other methods of data protection (e.g., requirements for regular password updating), as well as physical security that includes limited key card access and locked data storage areas.

A.11 Sensitive Questions

The survey questions for the JSA Evaluation are not highly sensitive in nature. The most sensitive questions relate to income, public benefit receipt, factors that affect the ability to work, and perceptions of one’s own skills (e.g., perseverance, self-efficacy). Since JSA services should result in increased income and reduced public benefit receipt, these are important domains to measure. The other items are necessary to evaluate the mediating role of these factors in the impact of the contrasting JSA services on economic outcomes. Interviewers will remind study participants during the interviewing process that they may refuse to answer individual items. Interviewers will also provide assurances to participants that their responses will be kept private to encourage candid responses.

A.12 Estimation of Information Collection Burden

The estimated time burden for completing the six-month follow-up survey is 20 minutes.

This information collection request covers the remaining 766 participants who may complete the six-month follow-up survey during the four-month extension period.

Exhibit A-2 presents the estimated remaining reporting burden on study respondents completing the follow-up survey. The burden estimates for the extended follow-up survey assume 766 interviews in total. To place a value on respondents’ time, we calculated average hourly wage rates3 for sample members based on information from the Bureau of Labor Statistics4 and the federal minimum wage:

  • Sample members: The average hourly wage for respondents is based on the average minimum wage rates in the JSA Evaluation site states (at the time of submission) and was calculated by multiplying the average minimum hourly wage ($8.41) by 1.4 to account for the value of fringe benefits when working (equal to 40 percent of the hourly wage).

When members of a respondent group come from multiple job categories, we took an average across the relevant categories, as noted.
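As a check on the figures reported in Exhibit A-2, the burden and cost arithmetic described above can be reproduced directly from the stated assumptions (766 respondents, a 20-minute survey, an $8.41 average minimum wage, and a 1.4 fringe-benefit multiplier); the variable names below are illustrative only:

```python
# Reproduces the burden and cost arithmetic for Exhibit A-2
# using only the assumptions stated in this section.
respondents = 766          # remaining six-month follow-up interviews
minutes_per_response = 20  # estimated survey completion time
avg_min_wage = 8.41        # average minimum wage across JSA site states
fringe_multiplier = 1.4    # adds 40 percent for the value of fringe benefits

annual_burden_hours = round(respondents * minutes_per_response / 60)
hourly_wage = round(avg_min_wage * fringe_multiplier, 2)
total_annual_cost = round(annual_burden_hours * hourly_wage)

print(annual_burden_hours, hourly_wage, total_annual_cost)
# → 255 11.77 3001
```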

Exhibit A-2 Annual Information Collection Activities and Cost

| Instrument | Total/Annual Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Annual Burden Hours | Average Hourly Wage | Total Annual Cost |
|---|---|---|---|---|---|---|
| Participant 6-month Follow-up Survey | 766 | 1 | 0.333 | 255 | $11.77 | $3,001 |
| Total Burden | | | | 255 | | $3,001 |

A.13 Cost Burden to Respondents or Record Keepers

This data collection effort involves no recordkeeping or reporting costs for respondents other than those described in Exhibit A-2 above.

A.14 Estimate of Cost to the Federal Government

The total cost for the previously approved data collection activities and the current information collection request will be $1,517,138. This includes the cost of initial information collection from the field, developing and pretesting data collection instruments and tools, administering the surveys and interviews, and analyzing the follow-up survey data. Annual costs to the Federal government will be $126,428 for the proposed data collection.

A.15 Change in Burden

This request is to extend the data collection period to complete the six-month follow-up data collection.

A.16 Publication Plans and Project Schedule

Exhibit A-3 presents an overview of the project schedule for information collection. It also identifies publications associated with each major data collection activity.

Exhibit A-3 Overview of Project Data Collection Schedule

| Data Collection Activity | Timing | Associated Publications |
|---|---|---|
| Baseline information form | October 2015 – August 2017 | Individual Site Reports |
| Site visits and semi-structured interviews with TANF program staff | June 2016 – July 2017 | Individual Site Reports |
| JSA staff survey | September 2016 – August 2017 | Individual Site Reports |
| Follow-up Survey | May 2016 – May 2018 | Individual Site Reports |
| Participant interim surveys | May 2016 – January 2018 | Individual Site Reports |

The schedule for the individual site reports is as follows:

  • The Ramsey County implementation report will be produced in January 2018

  • The New York City final report will be produced in June 2018

  • The Westchester implementation report will be produced in August 2018

  • The Sacramento final report will be produced in October 2018

  • The Michigan final report will be produced in February 2019

A.17 Reasons not to Display OMB Expiration Date

All instruments created for the JSA Evaluation will display the OMB approval number and the expiration date for OMB approval.

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.















1 This field assessment included semi-structured interviews with state and local TANF administrators, program staff who provide JSA services, and researchers and policy experts. OMB approved discussion guides for these information-gathering interviews under OMB clearance number 0970-0440.

2 SMS stands for “short message service.” This type of texting does not require a smart phone. It allows for an exchange of short messages to be threaded together, a critical feature for the administration of a short survey.

3 Assuming 2,080 FTE hours worked per year.

4 http://www.bls.gov/oes/current/oes_nat.htm

