JSA OMB Supporting Statement A - 11.17.15

Job Search Assistance Strategies (JSA) Evaluation - Contact updates, Interim Surveys and Six-Month Follow-up Survey

OMB: 0970-0440




Supporting Statement A

For the Paperwork Reduction Act of 1995: Approval for the Participant Tracking, Interim Surveys and Six-Month Follow-up Survey for the Job Search Assistance Strategies Evaluation




OMB No. 0970-0440





November 2015



Submitted by:

Office of Planning, Research & Evaluation

Administration for Children & Families

U.S. Department of Health and Human Services


Federal Project Officer

Erica Zielewski

Table of Contents

Attachments

  A. Welcome Packet and Contact Update Form

  B. Interim Surveys

  C. Six-Month Follow-up Survey

  D. Six-Month Follow-up Survey Pre-notification (Advance) Letter

  E. OMB 60-Day Notice




Part A: Justification

This document provides supporting statements for the collection of information for the Job Search Assistance (JSA) Strategies Evaluation (hereafter, JSA Evaluation), funded by the Office of Planning, Research and Evaluation (OPRE) in the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services. The goal of the JSA Evaluation is to determine which types of job search assistance strategies are most effective in helping Temporary Assistance for Needy Families (TANF) and other cash assistance applicants and recipients find employment and increase their earnings. The JSA Evaluation contains both an impact study and an implementation study. The impact study will randomly assign individuals to different approaches for providing job search assistance and then compare their employment and earnings to determine the approaches' relative effectiveness in moving TANF recipients to work and increasing their earnings. The implementation study will document services received under each approach and provide operational lessons. Abt Associates and its partner Mathematica Policy Research are conducting the evaluation.

In October 2013, the JSA Evaluation received approval from the Office of Management and Budget (OMB) for a first set of data collection instruments for the JSA Evaluation, specifically for the field assessment and site selection process (OMB No. 0970-0440). Instruments approved in that submission included the Discussion Guide for Researchers and Policy Experts, the Discussion Guide for State and Local TANF Administrators, and the Discussion Guide for Program Staff. OMB provided approval for the next set of data collection forms—the Baseline Information Form, Staff Surveys, and Implementation Study Site Visit Guides—under the same approval number in November 2014.

This submission seeks OMB approval for three additional data collection efforts related to a follow-up survey administered to sample members:

  • Contact update form. A paper version of this form will be included in a “welcome packet” that is mailed to sample members shortly after study induction to encourage them to complete the survey.1 The purpose of the form is to update sample members’ contact information, including information on alternate contacts, for the study’s follow-up survey.

  • Interim tracking surveys. This activity involves conducting a brief monthly survey with sample members during the first five months after random assignment. The primary purpose of these surveys is to keep contact information current and maintain the sample member’s engagement with the study. The contact information captured will be comparable to that collected with the contact update form. The interim surveys will also provide information on employment and participation in job search support services. These surveys will be conducted via text messaging.2

  • JSA six-month follow-up survey instrument. This survey, administered to sample members by telephone, will be a key source for outcomes of interest in the JSA Strategies Evaluation. While the principal source of employment and earnings data for the impact study is quarterly Unemployment Insurance records from the National Directory of New Hires (NDNH) (maintained by the Office of Child Support Enforcement (OCSE) at HHS), this follow-up survey will provide critical information on additional measures of interest. This includes the content, mode, and duration of job search assistance services received; job characteristics related to job quality (e.g. wage, benefits, and schedule); factors affecting the ability to work; public benefit receipt beyond TANF; and household income. Survey results will be used in both the impact study, to collect data on key outcomes, and the implementation study, to document the JSA services received.

The six-month follow-up survey is the primary focus of this request. The other two data collections are preparatory steps to facilitate a high response rate for the survey.

A.1 Necessity for the Data Collection

A.1.1 Study Overview

JSA programs are typically short-term, low-cost programs designed to help job seekers find jobs. Some JSA programs focus on helping job seekers find jobs more quickly than they would on their own, others focus on helping job seekers find better jobs, and some focus on both. TANF programs typically require participation in job search assistance activities as a condition of receiving cash assistance.

At a general level, there is evidence that JSA strategies are effective in increasing employment, but the impacts are modest (Klerman et al., 2012). There are different approaches to providing job search assistance, however, and very little evidence regarding which strategies are more effective. For example, is providing JSA in a group setting with instruction on job search techniques more or less effective than having individuals search for work one-on-one with a staff person? Are longer time commitments to search for a job more effective than shorter ones? Given that JSA is an important TANF work activity in all states, ACF sponsored this evaluation to determine the relative effectiveness of various JSA approaches in a large-scale randomized trial across multiple sites.

The impact study component of the JSA Evaluation will use an experimental design to determine the relative effectiveness of different JSA approaches. The evaluation requires TANF applicants to be randomly assigned to one of two JSA approaches, each of which provides a different set of JSA services. This will allow us to measure the incremental benefit of one approach compared to the other. The evaluation will not include a “no services” control group. In addition to the impact study, there will be an implementation study to document the operation of the JSA approaches, including the context in which they operate; the service delivery structure; and the content, mode, intensity, and duration of services. The evaluation will address the following principal research question:

  • What are the differential impacts of alternative TANF JSA approaches on short-term employment and earnings outcomes?

In addition, the evaluation will address the following secondary research questions:

  • What are the differential impacts of alternative TANF JSA models on: (a) job quality (including wages, work-related benefits, consistency and predictability of hours); (b) public benefits received; (c) family economic well-being; and (d) non-economic outcomes (including motivation to search for a job and psycho-social skills such as perseverance and self-efficacy)?

  • What components of JSA services are linked to better employment outcomes?

  • What are the job search strategies used by successful job seekers?

As of the time of this submission in October 2015, site selection is still underway. Thus far, one site has started random assignment to the two job search assistance approaches: the TANF program operated by the Human Resources Administration (HRA) in New York, NY. The study is being conducted in four TANF offices across Brooklyn and Queens. In this site, cash assistance applicants are randomly assigned to an approach that requires participation in job search classes and activities for 35 hours per week or to one that requires applicants to meet weekly with a staff person who supervises their job search.

The research team is currently working with four other localities to develop the evaluation design: Sacramento County, CA; Ramsey County, MN; Genesee and Wayne Counties, MI; and Westchester County, NY. We expect that most of these sites will start random assignment in early 2016 and that random assignment will last for a period of 12 months in each site.

The follow-up survey is an important data source for both the impact and the implementation study. For the impact study, the evaluation will also collect administrative data on employment and earnings from the NDNH and on TANF and Supplemental Nutrition Assistance Program (SNAP) benefit receipt from the state and local TANF agencies that participate in the evaluation. The role of the follow-up survey in the impact study will be to measure outcomes that cannot be measured with these data. This includes: (1) job characteristics, including wages and benefits (for those with current/recent employment); (2) public benefit receipt beyond TANF and SNAP; (3) job search-related skills such as perseverance and self-efficacy; (4) attitudes towards employment (including reservation wage and attitudes about job search); (5) factors that affect the ability to work; and (6) household income.

For the implementation study, the survey will supplement qualitative data from site visits with data on receipt and use of JSA services as recalled by study sample members. In addition to allowing researchers to measure the fidelity of the JSA services to the JSA approaches, this will allow us to document the strategies that lead to a successful job search and potentially identify services and activities that are linked to better outcomes. Survey-based measures for this analysis include: (1) types and frequency of services received in group instruction on job search provided by the TANF program; (2) types and frequency of services received in one-on-one instruction on job search provided by the TANF program; (3) use of on-line job search tools; (4) receipt of services that support job search, such as child care; and (5) job search activities undertaken in the last successful job search.

A.1.2 Legal or Administrative Requirements that Necessitate the Collection

There are no legal or administrative requirements that necessitate the collection.

A.1.3 Overview of Data Collection

The success of the six-month follow-up survey is contingent upon a strong respondent connection to the study and accurate contact information. These are particularly important given that cash assistance recipients have historically been a hard-to-reach population in similar survey efforts. To facilitate these outcomes, the study includes a series of monthly contacts with study participants that begin shortly after random assignment and continue until shortly before the six-month follow-up survey. Accordingly, we request approval for three data collection activities:

  1. Contact update form (Attachment A). The contact update form will enable the research team to: (1) confirm or update sample member contact information; (2) confirm existing contact information for alternative contact persons; and (3) capture contact information for a new alternative contact person if needed. It is comparable to contact update forms used in prior ACF evaluations, notably the Pathways for Advancing Careers and Education (PACE) Evaluation (OMB # 0970-0397), the Health Professions Opportunity Grants (HPOG) Impact Evaluation (OMB # 0970-0394), and the Career Pathways Intermediate Outcomes (CPIO) study (OMB # 0970-0394 and 0970-0397). The contact update form will be part of the welcome packet study participants receive after enrollment.

  2. Interim surveys (Attachment B). As described below, sample members will receive an invitation to complete an interim survey on a monthly basis. The interim surveys will capture important data about current employment and JSA service receipt between the point of enrollment and administration of the six-month follow-up survey. Interim survey administration will be multi-modal. For those who give consent to receive text messages from the study, administration will occur via SMS text messaging.3 Others will be invited by email each month to complete an on-line version of the survey. Participants will receive $2 as a token of appreciation for their time spent participating in each wave of the interim survey.

  3. Six-month follow-up survey (Attachment C). Six months after enrollment, all sample members will receive an advance letter advising them that interviewers will attempt to call to ask them to participate in the JSA follow-up survey. The follow-up survey will measure data on (1) employment status; (2) participation in job search assistance activities; (3) how the current or most recent job was found; (4) job search-related skills; (5) motivation to work/attitudes toward job search; (6) barriers to employment; (7) job characteristics and conditions; (8) job search parameters; and (9) household composition and income. Interviewers will administer the six-month follow-up survey by telephone. The expected interview length is about 20 minutes, and sample members will receive a $25 token of appreciation (in the form of a check) for completing the survey.


The data provided by the JSA follow-up survey are not available through any other source, as described further in section A.4.

A.2 Purpose of the Data Collection

A.2.1 Overview of Data Collection Instruments

The JSA Strategies Evaluation includes three data collection instruments in this OMB package – a contact update form, the questionnaire for a series of monthly interim tracking surveys, and a six-month follow-up survey questionnaire. A description of each form follows.

        1. Contact Update Form

At the time of enrollment, all study participants will provide contact information on the previously-approved baseline information form for themselves, as well as for up to three relatives or friends who are likely to know how to reach them. Program staff or the study team4 will enter data from the baseline information form into a secured web-based system, known as the Participant Tracking System (PTS).

All sample members will receive a welcome-to-the-study packet by mail within one month of enrollment. This packet will include a welcome letter, an overview of the study, and a contact update form (with a business reply envelope).5 The contact update form will include the sample member’s contact information from the PTS. This form will allow study participants to review, confirm, and update or correct their contact information (name, address, telephone number(s), and email address(es)) if needed. The form will also confirm or correct the contact information for up to three relatives or friends who will always know how to reach the study participant. (See Attachment A for the contact update form.) Maintenance of up-to-date contact information is crucial to tracking and locating sample members leading up to the follow-up survey.

        2. Interim Tracking Surveys

The interim tracking surveys provide an opportunity to remind participants that they are part of the evaluation and to obtain up-to-date contact information, which should enhance the ability of interviewers both to locate participants and to gain their cooperation during the six-month follow-up data collection. The interim surveys are an important part of the participant tracking efforts because each contact helps to strengthen the participant’s connection to the study and the likelihood of continued participation in data collection activities. The interim surveys will capture data similar to that captured on the contact update form (see Attachment B for the interim surveys). They will allow respondents to either confirm or update their own contact information. They will also allow them to update the phone and email information for any alternate contact person already in the system, or add a new alternate contact person if needed.


In addition to the participant tracking functions, the interim survey provides the opportunity to build a time-series with two key measures at up to five points in time. The outcomes captured include: (1) employment status, and (2) participation in job search assistance activities. The employment data will allow us to perform methods research on the accuracy with which first employment is recalled at six months in the follow-up survey described below. The job search participation data will be used to document the services individuals receive and the service contrast between the two approaches.


        3. JSA Six-Month Follow-up Survey

As part of the impact study, the follow-up survey will provide critical information on additional measures of interest, in terms of service receipt, employment, and other outcomes. The survey is an important source for documenting the content, duration and intensity of job search services received; the type of job search skills that sample members possess; understanding how individuals found employment; obtaining information on employment that is not available from other sources; and obtaining information on income and public benefit receipt. Exhibit A1 summarizes covered measures for each of the domains included in the six-month follow-up survey. The follow-up survey is in Attachment C and the advance letter is in Attachment D.

Exhibit A1: Key Domains for Job Search Assistance Strategies Evaluation Six-Month Follow-up Survey


Domain A: Introduction and confirmation of identity

Domain B: Employment status. This section captures information about employment and labor force participation.

Domain C: Job search assistance activities under TANF. This section captures information about the receipt of different job search assistance services to measure fidelity to the two intervention conditions tested. The questions provide information on the content, intensity, and duration of service receipt (such as topics covered during group or one-on-one staff-assisted job search).

Domain D: Tools of last successful job search. This section identifies the job search activities that helped individuals find a job (and applies only to those who found employment).

Domain E: Job search skills. This section provides information on job search-related skills such as perseverance, self-efficacy, and career planning skills.

Domain F: Attitudes towards work and job search. This section gathers data on motivation, measured for example by respondents’ reservation wage, and attitudes toward the job search process.

Domain G: Factors that affect ability to work. This section collects information on barriers to employment to identify factors that may limit the effectiveness of job search services.

Domain H: Job characteristics. This section asks about the characteristics of the current or most recent job for those who do find work, particularly wages, benefits, and schedule.

Domain I: Household structure and income. This section collects information required to determine household income and poverty status.



The follow-up survey will attempt interviews with all treatment and control group members by telephone. The survey window will begin on the first day of the calendar month following the six-month anniversary of random assignment and close eight weeks later. The estimated interview length is 20 minutes on average. If pretesting shows that the interviews last longer than 20 minutes, then some items will be cut from the instrument. The research team expects to conduct the survey as a census of the 8,000 sample members enrolled across the five sites. The research team has a target response rate of 80 percent (6,400 completed interviews). We have also developed strong plans for nonresponse bias analysis and weighting adjustments. (See section B.3.4 of part B of this supporting statement.)
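For concreteness, the survey window rule described above can be expressed as a short date calculation. The following sketch is purely illustrative; the function name and the use of Python's standard datetime module are ours and do not describe the study's actual data systems.

```python
from datetime import date, timedelta

def survey_window(random_assignment: date) -> tuple[date, date]:
    """Illustrative sketch of the follow-up survey window rule:
    the window opens on the first day of the calendar month following
    the six-month anniversary of random assignment and closes
    eight weeks later."""
    # Six-month anniversary: add 6 to the month, rolling the year forward.
    month = random_assignment.month + 6
    year = random_assignment.year + (month - 1) // 12
    month = (month - 1) % 12 + 1
    # First day of the following calendar month.
    open_month = month + 1
    open_year = year + (open_month - 1) // 12
    open_month = (open_month - 1) % 12 + 1
    window_open = date(open_year, open_month, 1)
    # Close eight weeks (56 days) after opening.
    window_close = window_open + timedelta(weeks=8)
    return window_open, window_close

# Example: a sample member randomly assigned on 15 March 2016 would have
# a window opening 1 October 2016 and closing 26 November 2016.
opens, closes = survey_window(date(2016, 3, 15))
```

Only the month and year of random assignment matter under this rule, so day-of-month overflow (e.g., assignment on the 31st) never arises.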

A.2.2 Who Will Use the Information

The primary beneficiaries of this planned data collection effort will be ACF, state and local TANF program administrators, other state and local policymakers, and other federal agencies and policy makers. ACF will use the information to understand what strategies for providing job search assistance are most effective in moving TANF applicants and recipients to work and increasing their earnings. This will be important information in guiding the operation of state and local TANF programs. Secondary beneficiaries of this data collection will be those in public policy and program administration who are interested in understanding effective job search strategies more broadly. Many agencies and programs also provide job search assistance to their clients (for example, the Department of Labor provides job search assistance to Unemployment Insurance (UI) claimants), so there is broad interest in this topic.

A.3 Improved Information Technology to Reduce Burden

Respondent tracking utilizes both text-based surveys and email, allowing participants to answer questions or provide updated contact information at their convenience. These vehicles reduce burden by decreasing the number of times participants are contacted by the research team and give them flexibility to respond in the mode and at the time most convenient to them.

The follow-up survey administration will use CATI (computer-assisted telephone interviewing) technology. CATI technology reduces respondent burden, as interviewers can proceed more quickly and accurately through the survey instruments, minimizing the interview length. Computerized questionnaires ensure that the skip patterns work properly, minimizing respondent burden by not asking inappropriate or non-applicable questions. For example, unemployed respondents will skip out of the questions about current job characteristics. Computer-assisted interviewing can build in checkpoints, which allow the interviewer or respondent to confirm responses thereby minimizing data entry errors. Finally, automated survey administration can incorporate hard edits to check for allowable ranges for quantity and range value questions, minimizing out of range or unallowable values.

A.4 Efforts to Identify Duplication

There is minimal duplication of data collection in the evaluation. The six-month follow-up survey will ask about employment status despite the availability of some information in this area from the NDNH. This is necessary as a screener for the questions about job conditions, information not available through the NDNH. As part of questions to document all sources of income, the survey will also ask about receipt of TANF and SNAP benefits despite the availability of this information from the local TANF office. Reminding respondents of all income sources is part of our strategy for obtaining good measurements of total household income. In addition, omitting these two sources from the list of income supports could create difficulties in the interviewing process: some respondents might volunteer them and slow the interview if we did not ask about them explicitly.

A.5 Involvement of Small Organizations

The data collection does not involve small businesses or other small entities.

A.6 Consequences of Less Frequent Data Collection

Researchers designed the data collection effort described in this document to help maintain updated contact information for study participants during the follow-up period, strengthen the participant’s connection to the study, and provide information on key outcomes of interest over the six-month follow-up period. All of these activities ultimately enhance the quality of the contact data available to interviewers, helping to maximize response rates to the follow-up survey and obtain a complete picture of employment and job search activities during that time.

Without collecting monthly contact information on study participants, the quality of the contact information available to interviewers for the six-month follow-up survey will deteriorate. A lack of, or outdated, contact information would make it difficult to achieve the target response rates. It could also diminish the participant’s connection to the study, which could make it difficult for interviewers to gain cooperation at the six-month follow-up even if they are successful in locating respondents.

No follow-up after the six-month survey is planned.

A.7 Special Circumstances

There are no special circumstances for the proposed data collection.

A.8 Federal Register Notice and Efforts to Consult Outside the Agency

A.8.1 Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13 and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995)), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on July 15, 2015 (Volume 80, Number 135, pages 41505-41506) and provided a 60-day period for public comment. A copy of this notice is included as Attachment E. During the notice and comment period, the government received one request for materials and no comments.

A.8.2 Consultation with Experts Outside of the Agency

Most of the items in the survey are from previously approved data collection instruments. Experts in their respective disciplines (statistics, policy analysis, economics, and survey operations) from ACF, Abt, and Mathematica Policy Research were consulted in developing the survey instruments (six-month follow-up survey, interim survey) and other materials for which clearance is requested. Key contributors are listed below.

ACF

Ms. Erica Zielewski Contract Officer’s Representative

Ms. Carli Wuff Contract Officer’s Representative

Mr. Mark Fucello Division Director

Ms. Naomi Goldstein Deputy Assistant Secretary for Planning, Research and Evaluation

Abt Associates

Ms. Karin Martinson Project Director (301) 347-5726

Dr. Stephen Bell Principal Investigator (301) 634-1721

Mr. David Judkins Statistician (301) 347-5952

Dr. Alison Comfort Analyst (617) 520-2937

Ms. Debi McInnis Survey Operations (617) 349-2627

A.9 Tokens of Appreciation for Respondents

Tokens of appreciation are a powerful tool for maintaining low attrition rates in longitudinal studies. The use of tokens of appreciation for the JSA Strategies Evaluation can help maximize response rates, which is necessary to ensure unbiased impact estimates.

Three factors helped to determine the amounts for each survey:

  1. Respondent burden, both at the time of the interview and over the life of the study;

  2. Costs associated with participating in the interview at that time; and

  3. Other studies of comparable populations and burden.


Our experience with this study population shows that it responds positively to tokens of appreciation. Previous research has shown that sample members with certain socio-economic characteristics are significantly more likely to become survey respondents when tokens are offered. In particular, sample members with low incomes and/or low educational attainment have proven responsive, as have minority group members. These characteristics are expected to be heavily represented in this study panel (Duffer et al., 1994; Educational Testing Service, 1991).

The token of appreciation paid for each of the interim tracking surveys ($2) is comparable to that used in the participant contact mailings for the Career Pathways Intermediate Outcomes (CPIO) study (OMB # 0970-0394 and 0970-0397). The follow-up survey token of appreciation is comparable to those used in other studies of similar populations, including the survey conducted for the Health Professions Opportunity Grants (HPOG) Evaluation (0970-0394).

Study participants will receive small tokens of appreciation during the six-month follow-up period between enrollment and the follow-up survey data collection.

  • Participants will receive $2 initially, as part of their welcome packet. Each month, those who complete the interim tracking survey will earn an additional $2 for each completed interim survey interview. This money will be accrued and paid out prior to the start of the six-month follow-up survey. At the end of each completed interim tracking survey, a thank you message will display indicating how much money the respondent has ‘banked’ cumulatively to that point and explaining when they will receive payment. The accrual system is based on consumer “reward” models that follow a similar accrual structure. The initial payment demonstrates the study’s commitment and appreciation for the respondents’ time. Text-based surveys are a rapidly developing data collection methodology. They have been commonly used in consumer research and are emerging rapidly in public health research (CDC, 2012). Preliminary studies show promise for reaching and engaging low-income populations (Chang, 2014; Vervloet et al., 2012).



  • Prior to the start of the six-month follow-up survey, the team will send an advance letter explaining the purpose of the survey, the expectations of sample members who agree to complete that survey, and the promise of an additional $25 as a token of appreciation for their participation in the survey. The advance letter will also include a check issuing the cumulative amount the participant has ‘accrued’ in completing interim surveys—for example, a participant who responds to three of the five interim surveys will receive $6 and someone who responds to all five interim surveys will receive $10.



  • Sample members who complete the six-month follow-up interview will receive a check for $25 as a token of appreciation for their participation. In total, enrolled participants can receive between $2 and $37 depending on how many rounds of data collection they complete.
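The payment schedule above reduces to simple arithmetic. The following sketch is illustrative only (the function name is ours) and shows how the $2-to-$37 range arises:

```python
def total_token_amount(interim_surveys_completed: int,
                       completed_followup: bool) -> int:
    """Total tokens of appreciation under the schedule described above:
    $2 in the welcome packet, $2 banked per completed interim survey
    (up to five waves), and $25 for the six-month follow-up interview."""
    if not 0 <= interim_surveys_completed <= 5:
        raise ValueError("there are at most five interim survey waves")
    welcome_packet = 2
    accrued_interim = 2 * interim_surveys_completed
    followup = 25 if completed_followup else 0
    return welcome_packet + accrued_interim + followup

# A participant who completes no surveys still receives the $2 welcome
# payment; one who completes all five interim waves plus the follow-up
# interview receives $2 + $10 + $25 = $37.
```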

A.10 Privacy of Respondents

Abt Associates, Abt SRBI, and Mathematica are very cognizant of and committed to maintaining federal, state, and ACF data security requirements. All Abt Associates and Mathematica study staff will comply with relevant policies related to secure data collection, data storage and access, and data dissemination and analysis.

The JSA research team developed strong protocols to help maintain the privacy of respondents to the extent permitted by law. All research staff working with personally identifiable information (PII) will follow strict procedures to protect private information and they will sign a pledge stating that they will keep all information gathered private to the extent permissible by law. All papers that contain participant names or other identifying information will reside in locked areas and passwords will protect any computer documents containing identifying information.

The JSA interim surveys and six-month follow-up survey are purely voluntary. Prior to the start of each survey, researchers will inform sample members that all of their responses will be kept private, their names will not appear in any written reports, and that responses to the questions are voluntary. Specifically, the research team will take the following specific measures to protect respondents’ privacy:

  • Using rigorous security measures for survey data. Abt Associates and Abt SRBI have established safeguards that provide for the confidentiality of data and the protection of the privacy of the sampled individuals on all of its studies. All data users are aware of and trained on their responsibilities to protect participants’ personal information, including the limitations on uses and disclosures of data. Signed data confidentiality agreements are also required. All personal data (identifiable and de-identified data analyses files) will reside on a secure workstation or server that is protected by a firewall and complex passwords, in a directory that can only be accessed by the network administrators and the analysts actively working on the data. Survey data collected are stored in secure CATI servers. Data transfer to and from Abt and Abt SRBI will occur through Abt’s secure online file transfer platform that utilizes FIPS 140-2 validated cryptographic modules. Researchers assign generic study identifiers – not based on PII – to each study participant to link participant data. PII is removed from all electronic files prior to analysis.

  • Notification of data security breaches. All study partners, including the sites participating in the evaluation, are aware that they must notify Abt Associates within one hour of a breach of PII confidentiality, per OMB rules. Study partners are also aware that they must notify Abt within 24 hours of the time any study partner learns of a breach of, or deviation from, the data security plan. Researchers will notify ACF of any data security breaches, including breaches of protocol, no later than 24 hours after Abt staff are made aware of the breach.

  • Restricting access to the study network folder. Secure servers will store all data collected that contains PII for the JSA evaluation. Access to the study network will be restricted by assigning a password to each relevant staff member.

In addition to these study-specific procedures, the evaluator has extensive corporate administrative and security systems to prevent the unauthorized release of personal records. These systems include state-of-the-art hardware and software for encryption that meets federal standards and other methods of data protection (e.g., requirements for regular password updating), as well as physical security that includes limited key card access and locked data storage areas.

A.11 Sensitive Questions

None of the survey questions for the JSA Evaluation are highly sensitive in nature. The most sensitive items relate to income, public benefit receipt, factors that affect respondents’ ability to work, and perceptions of one’s own skills (e.g., perseverance, self-efficacy). Since job search assistance services should result in increased income and reduced public benefit receipt, these are important domains to measure. The other items are necessary to evaluate the mediating role of these factors in the impact of the contrasting JSA services on economic outcomes. Interviewers will remind study participants during the interviewing process that they may refuse to answer individual items. Interviewers will also assure participants that their responses will be kept private, to encourage candid responses.

A.12 Estimation of Information Collection Burden

The estimated time burden for completing the data collection instruments discussed in this package is as follows:

  • Five minutes for the contact form.

  • Ten minutes for each interim tracking survey (including five minutes for the substantive questions and five minutes for the linked contact updating).

  • Twenty minutes for the six-month follow-up survey (as noted previously, if formal pretesting indicates that the six-month survey will take longer than 20 minutes, we will cut some items). To estimate the length of the survey, we conducted initial mock interviews with research team members and drew on Abt Associates’ previous experience using similar items and forms on numerous other studies, including the Pathways for Advancing Careers and Education (PACE) Evaluation (OMB # 0970-0397) and the Health Professions Opportunity Grants (HPOG) Impact Evaluation (OMB # 0970-0394), both conducted for ACF, and the Green Jobs and Health Care Impact Evaluation (OMB # 1205-0481), conducted for the U.S. Department of Labor.

Exhibit A-2 presents the estimated reporting burden and total cost for study respondents completing all instruments—those previously approved by OMB as well as those included in this data collection request. We note that the burden estimates for the previously approved instruments (the Baseline Information Form, the implementation study site visits and interviews, and the JSA staff survey) have been revised due to a reduction in the number of sites in the evaluation, and thus a reduction in sample size projections. The burden estimates for the baseline forms assumed a total sample of 25,000 across 10 sites. At the time of this submission, we project a total sample of 8,000 participants across five sites.

To place a value on respondents’ time, we calculated average hourly wage rates for various categories of respondents based on information from the Bureau of Labor Statistics and the federal minimum wage:

  • Sample members: The average hourly wage for respondents is based on the average minimum wage rates in the JSA evaluation site states (at the time of submission) and was calculated by multiplying the average minimum hourly wage ($8.41) by 1.4 to account for the value of fringe benefits when working (equal to 40 percent of the hourly wage), yielding $11.77.

  • Implementation Study Site Visit Respondents: Based on a blended rate of the Community and Social Service Occupations wage (SOC 21-0000) of $21.50 per hour, plus a 40 percent adjustment for benefits ($30.01), and the Social and Community Service Manager Occupations wage (SOC 11-9151) of $31.61 per hour, plus a 40 percent adjustment for benefits ($44.25). The blended rate is thus $37.13.

  • JSA Staff Survey Respondents: Based on the same blended rate as implementation study site visit respondents: the Community and Social Service Occupations wage (SOC 21-0000) of $21.50 per hour, plus a 40 percent adjustment for benefits ($30.01), and the Social and Community Service Manager Occupations wage (SOC 11-9151) of $31.61 per hour, plus a 40 percent adjustment for benefits ($44.25). The blended rate is thus $37.13.

When members of a respondent group come from multiple job categories, we took an average across the relevant categories, as noted.
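The burden and cost arithmetic above can be illustrated with a short sketch. This is not part of the official submission; the figures are taken from the estimates in this section (the interim tracking survey row is used as the worked example):

```python
def annual_burden_hours(annual_respondents, responses_per_respondent, hours_per_response):
    """Annual burden hours = annual respondents x responses per respondent x hours per response."""
    return annual_respondents * responses_per_respondent * hours_per_response

# Sample-member wage: average state minimum wage ($8.41) times 1.4
# to account for fringe benefits (40 percent of the hourly wage).
wage = round(8.41 * 1.4, 2)  # $11.77 per hour

# Interim tracking surveys: 1,400 annual respondents, 5 responses each,
# 10 minutes (0.167 hours) per response.
hours = round(annual_burden_hours(1_400, 5, 0.167))  # 1,169 annual burden hours
cost = round(hours * wage, 2)  # $13,759.13 total annual cost
```

The same calculation, with each row's respondent counts, response frequency, and wage rate, produces the remaining entries in Exhibit A-2.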

Exhibit A-2 Annual Information Collection Activities and Cost

| Instrument | Total Number of Respondents | Annual Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Annual Burden Hours | Average Hourly Wage | Total Annual Cost |
|---|---|---|---|---|---|---|---|
| Previously Approved Instruments Still in Use | | | | | | | |
| Baseline Information Form | 8,000 | 4,000 | 1 | 0.2 | 800 | $11.77 | $9,416.00 |
| Implementation study site visits | 150 | 75 | 1 | 1 | 75 | $37.13 | $2,784.75 |
| JSA staff survey | 440 | 220 | 1 | 0.5 | 110 | $37.13 | $4,084.30 |
| Current Request for Approval | | | | | | | |
| Contact update form | 1,200 | 600 | 1 | 0.083 | 50 | $11.77 | $588.50 |
| Interim tracking surveys | 2,800 | 1,400 | 5 | 0.167 | 1,169 | $11.77 | $13,759.13 |
| Participant 6-month Follow-up Survey | 6,400 | 3,200 | 1 | 0.333 | 1,066 | $11.77 | $12,546.82 |
| Total Burden | | | | | 3,270 | | $43,179.50 |

A.13 Cost Burden to Respondents or Record Keepers

This data collection effort involves no recordkeeping or reporting costs for respondents other than those described in Exhibit A-2 above.

A.14 Estimate of Cost to the Federal Government

The total cost for the previously approved data collection activities and the current information collection request will be $1,517,138. This includes the cost of initial information collection from the field, developing and pretesting data collection instruments and tools, administering the surveys and interviews, and analyzing the follow-up survey data. Annual costs to the Federal government will be $379,285 for the proposed data collection.

A.15 Change in Burden

This evaluation involves new data collection that increases the public reporting burden under this OMB number. Section A.12 provides documentation of the increase in burden figures.

A.16 Publication Plans and Project Schedule

Exhibit A-3 presents an overview of the project schedule for information collection. It also identifies publications associated with each major data collection activity.

Exhibit A-3 Overview of Project Data Collection Schedule

| Data Collection Activity | Timing | Associated Publications |
|---|---|---|
| Baseline information form | October 2015 – January 2017 | Job Search Assistance Study Final Report (September 2017) |
| Site visits and semi-structured interviews with TANF program staff | March 2016 – September 2016 | Job Search Assistance Study Final Report (September 2017) |
| JSA staff survey | March 2016 – September 2016 | Job Search Assistance Study Final Report (September 2017) |
| Follow-up Survey | March 2016 – January 2017 | Job Search Assistance Study Final Report (September 2017) |
| Participant interim surveys | March 2016 – December 2016 | Job Search Assistance Study Final Report (September 2017) |

A.17 Reasons not to Display OMB Expiration Date

All instruments created for the JSA Evaluation will display the OMB approval number and the expiration date for OMB approval.

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.
















1 The welcome packet (Attachment A) will include a welcome letter, an overview of the study, and a contact update form (with a business reply envelope).

2 Participants can choose not to give their consent to contact via text message. Those that do not consent to text messaging or do not have cell phones will receive an email invitation to participate in the interim tracking surveys online (see discussion below).

3 SMS stands for “short message service.” This type of texting does not require a smart phone. It allows for an exchange of short messages to be threaded together, a critical feature for the administration of a short survey.

4 The study team will enter baseline data from participants randomly assigned in the New York City site, as this site adapted its existing MIS to incorporate random assignment and is not using the evaluation’s participant tracking system (PTS). The baseline information forms are sent to the evaluation team, which then enters the information in the PTS.

5 As discussed, the JSA Strategies Evaluation participant enrollment period began in October 2015 in New York, NY, prior to OMB approval of the contact update form and use of tokens of appreciation. Thus, the welcome packets for the participants enrolled prior to OMB approval will not include these items. Tracking for these early enrollees will draw on available contact tracking sources until OMB approval is received and this group can be transitioned into the active tracking system as well.

6 Assuming 2080 FTE hours worked per year.

7 http://www.bls.gov/oes/current/oes_nat.htm

8 Source: Bureau of Labor Statistics, National Compensation Survey, 2011: Combined average hourly wage of Community and Social Service Occupations and Social and Community Service Manager Occupations

9 The interim tracking surveys are scheduled to begin in March 2016, or as soon as OMB approval is obtained.

