
Workforce Investment Act (WIA) Adult and Dislocated Worker Programs Gold Standard Evaluation (WIA Evaluation)

OMB #:1205-0504


CONTENTS

PART A: JUSTIFICATION

1. Circumstances Necessitating the Data Collection

2. How, by Whom, and for What Purposes Will the Information Be Used

3. Use of Improved Technology to Reduce Burden

4. Avoiding Duplication of Effort

5. Methods to Minimize Burden on Small Businesses or Entities

6. Consequences of Not Collecting the Data

7. Special Data Collection Circumstances

8. Federal Register Notice

9. Respondent Payments

10. Confidentiality

11. Questions of a Sensitive Nature

12. Estimates of the Annualized Burden Hours

13. Estimates of the Total Annual Cost Burden to Respondents or Record Keepers

14. Estimates of the Annualized Cost to the Federal Government

15. Changes in Burden

16. Publication Plans and Project Schedule

17. Reasons for Not Displaying Expiration Date of OMB Approval

18. Exception to the Certification Statement

APPENDIX A: Authorization for Evaluation, Section 172 of WIA and Section 169 of WIOA

APPENDIX B: Study Registration, Consent, and Contact Information Forms

APPENDIX C: 30-Month Follow-Up Survey Instrument, Results of Survey Pretests, and List of Frequently-Asked Questions

APPENDIX D: Letters and Reminders to Survey Sample Members

APPENDIX E: 60-Day Federal Register Notice


WIA GOLD STANDARD EVALUATION 30-MONTH FOLLOW-UP SURVEY EXTENSION REQUEST

PART A: JUSTIFICATION

Overview

The U.S. Department of Labor’s (DOL) Employment and Training Administration (ETA) is currently undertaking the Workforce Investment Act (WIA) Adult and Dislocated Worker Programs Gold Standard Evaluation (the Evaluation). Although WIA was replaced by the Workforce Innovation and Opportunity Act (WIOA) in July 2014, the Adult and Dislocated Worker programs continue to exist and offer job seekers a similar set of services. Lessons learned from this evaluation will inform policymakers and program administrators as WIOA is implemented. The evaluation was authorized under Section 172 of WIA, with continuing authorization under Section 169 of WIOA (Appendix A).

The overall aim of this evaluation is to determine whether certain adult and dislocated worker services and training funded by Title I of WIA, and now Title I of WIOA (currently the largest source of Federal funding for employment services and training), are effective and whether their benefits exceed their costs. ETA has contracted with Mathematica Policy Research and its subcontractors—Social Policy Research Associates, MDRC, and the Corporation for a Skilled Workforce—to conduct this evaluation. The evaluation was launched in 28 randomly selected Local Workforce Investment Areas (LWIAs) starting in November 2011, and all sites began intake of customers into the study by August 2012.

This will be the third clearance package submitted to the Office of Management and Budget (OMB) for this evaluation. An initial data collection package, approved by OMB in September 2011 (OMB Control Number 1205-0482, Information Collection Reference (ICR) Number 201101-1205-001), requested clearance for a customer study eligibility form, a customer study consent form, baseline data collection through a study registration form and a contact information form, and site visit guides for collecting qualitative information on WIA program processes, services, and training. A second data collection package was approved on January 18, 2013 (OMB Control Number 1205-0504, ICR Number 201208-1205-012) for the collection of additional qualitative data to analyze veterans’ experiences in the 28 randomly selected LWIA sites, two follow-up surveys conducted 15 and 30 months after random assignment, and site-level cost data collected on three forms. In March 2015, OMB approved a non-substantive change request to modify the incentives used for both follow-up surveys (OMB Control Number 1205-0504, ICR Number 201502-1205-001).

This new request is to extend OMB clearance of the final 30-month follow-up survey administration (cleared under OMB Control No. 1205-0504, the second collection described above), which currently expires on January 31, 2016, for an additional six months, to July 31, 2016. This extension will allow additional time to locate sample members for administration of the 30-month survey and hence achieve a higher response rate. There are no proposed changes to the survey instrument or the way it is administered.

This package includes:

  1. Appendix A: Authorization for Evaluation, Section 172 of WIA and Section 169 of WIOA

  2. Appendix B: Study Registration, Consent, and Contact Information Forms

  3. Appendix C: 30-Month Follow-Up Survey Instrument, Results of Survey Pretests, and List of Frequently-Asked Questions

  4. Appendix D: Letters and Reminders to Survey Sample Members

  5. Appendix E: 60-Day Federal Register Notice

1. Circumstances Necessitating the Data Collection

The evaluation examines the impacts of WIA intensive and training services on customers’ outcomes relative to a situation in which customers have access to core services only. It addresses the following research questions:

  1. Does access to WIA intensive services, alone or in conjunction with WIA-funded training, lead adults and dislocated workers to achieve better educational, employment, earnings, and self-sufficiency outcomes than they would achieve in the absence of access to intensive and training services?

  2. Does the effectiveness vary by population subgroup? Is there variation by sex, age, race/ethnicity, unemployment insurance (UI) receipt, prior education level, previous employment history, adult and dislocated worker status, and disability status?

  3. How did the implementation of WIA vary by LWIA? Did the effectiveness vary by how it was implemented? To what extent do implementation differences explain variations in effectiveness?

  4. Do the benefits from intensive and training services exceed program costs? Do the benefits of intensive services exceed their costs? Do the benefits of training exceed its costs? Do the benefits exceed the costs for adults? Do the benefits exceed the costs for dislocated workers?

Random selection of sites. To obtain a nationally representative study sample, the design calls for first randomly selecting study sites. Since LWIAs typically administer local WIA funding and hence determine the services and training provided, an LWIA is considered a “site” in the evaluation. Thirty sites were randomly selected from the set of all LWIAs on the U.S. mainland that serve 100 or more intensive services customers annually. This number of sites allows for precise estimates and a low rate of assignment to the research groups that are not eligible to receive full WIA services (as described below). The random selection was conducted using explicit and implicit stratification to take into account the enrollment levels at each site, the LWIA’s geographic location, and, as a proxy for the focus the site places on training, the proportion of LWIA intensive service customers who receive WIA-funded training. Each of the 30 randomly selected sites was asked to participate in the evaluation, and 26 of these initially selected sites agreed to participate. Four sites declined to participate, and replacement sites were identified for two of these sites (and agreed to participate); therefore, the total number of sites participating in the evaluation is 28. The other two initially selected sites declined to participate too late in the process for replacement sites to be recruited.
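The site selection algorithm itself is documented in the evaluation’s design materials rather than in this statement. As a rough illustration only, the sketch below shows one common way to draw a stratified sample of sites with probability proportional to enrollment, using sorting for implicit stratification. The field names (customers, region, pct_trained) and the use of systematic selection are assumptions for illustration, not the evaluation’s actual procedure.

```python
import random

def select_sites(lwias, n_sites=30):
    """Illustrative systematic PPS draw of local areas.

    lwias: list of dicts with hypothetical keys 'customers' (annual
    intensive-service customers), 'region', and 'pct_trained'.
    """
    # Eligibility screen described above: 100 or more intensive customers per year.
    frame = [s for s in lwias if s["customers"] >= 100]
    # Implicit stratification: order the frame before the systematic pass.
    frame.sort(key=lambda s: (s["region"], s["pct_trained"]))
    total = sum(s["customers"] for s in frame)
    step = total / n_sites
    targets = [random.uniform(0, step) + i * step for i in range(n_sites)]
    picks, cumulative, k = [], 0.0, 0
    for site in frame:
        cumulative += site["customers"]
        while k < n_sites and targets[k] < cumulative:
            picks.append(site)  # a very large site could be selected more than once
            k += 1
    return picks
```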

Random assignment of customers within selected sites. The cornerstone of the impact analysis was random assignment of customers within these 28 randomly selected sites to experimental groups. Experimental evaluations are generally viewed as the “gold standard” for evaluating social programs because, more than any other approach, they minimize the chance that observed differences in outcomes between research groups are due to unmeasured, preexisting differences between their members. The three research groups to which customers (who consented to participate in the study) were assigned were: (1) the full-WIA group—adults and dislocated workers in this group could receive any WIA services and training for which they were eligible; (2) the core-and-intensive group—adults and dislocated workers in this group could receive any WIA services for which they were eligible, but not training; and (3) the core group—adults and dislocated workers in this group could receive only core services and no WIA intensive or training services. Customers remained in their study groups for 15 months after the date they were randomly assigned. Customers who did not consent to participate in the study could receive only core services until intake for the study had ended.
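For reference, random assignment with unequal group sizes of the kind described above can be implemented along the following lines. This is a minimal sketch: the assignment weights shown are placeholders derived from the overall group sizes (roughly 30,000 / 2,000 / 2,000), not the site-specific ratios actually used in the study, and the identifier scheme is hypothetical.

```python
import random

GROUPS = ["full-WIA", "core-and-intensive", "core"]

def assign_group(customer_id, weights=(0.88, 0.06, 0.06)):
    """Illustrative random assignment of a consenting customer to a research group.

    The weights approximate the overall sample shares; the study used
    site-specific assignment ratios that are not reproduced here.
    """
    rng = random.Random(f"wia-eval:{customer_id}")  # reproducible draw per customer
    return rng.choices(GROUPS, weights=weights, k=1)[0]
```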

In most cases, the sample intake period lasted between 12 and 18 months in each site. The length of the intake period was determined in consultation with the Workforce Investment Board and/or LWIA administrators, as some preferred to minimize the intake period. Sample intake began on a rolling basis in November 2011 with most sites starting random assignment by April 2012. Random assignment ended at all sites by April 2013. Before a customer was randomly assigned, he or she completed a study registration form, a contact information form, and a consent form (Appendix B).

Across all sites, about 36,000 customers were randomly assigned. After attrition of about 2,000 customers from the sample (some customers were found ineligible and others withdrew consent), there were about 34,000 customers remaining in the study. Of these, about 2,000 are members of the core group, 2,000 are members of the core-and-intensive group, and about 30,000 are members of the full-WIA group. All members of the core and core-and-intensive groups and a random sample of about 2,000 members of the full-WIA group (a total of 6,000 customers) were asked to complete the 15- and 30-month follow-up surveys.

The data collection for the WIA Evaluation is complete except for the 30-month follow-up survey (Appendix C). We do not expect to have completed the 30-month follow-up survey when the current OMB clearance expires in January 2016. The extension period is necessary to locate and interview the remaining sample. For reference, Table A.1 summarizes all the data collection activities of the evaluation.



Table A.1. Summary of Data Collection Activities for the WIA Evaluation

Type of data needed: Baseline information
Reason data needed: Describe study participants; check that random assignment created groups with similar baseline characteristics; define groups for subgroup analysis; enhance precision of the impact analysis
Sources (for whom collected; status):
  • Study registration forms (all 34,000 study participants; complete)
  • State UI agencies (all 34,000 study participants)

Type of data needed: Services received
Reason data needed: Monitor random assignment; determine the impact of WIA on the receipt of any employment services and training; assign a cost of WIA services and training per participant
Sources (for whom collected; status):
  • State and/or LWIA management information systems (all 34,000 study participants; complete)
  • 15- and 30-month follow-up surveys (6,000 study participants in the survey sample: 2,000 in the core group, 2,000 in the core-and-intensive group, and 2,000 in the full-WIA group)

Type of data needed: Outcomes
Reason data needed: Estimate the impacts of intensive services and training; estimate the benefits of intensive services and training
Sources (for whom collected; status):
  • National Directory of New Hires (all 34,000 study participants; in progress)
  • 15- and 30-month follow-up surveys (6,000 study participants in the survey sample; 15-month survey complete, 30-month survey in progress)

Type of data needed: Implementation data
Reason data needed: Document and describe the implementation of WIA services and training; monitor the implementation of the evaluation
Sources (for whom collected; status):
  • Site visits: interviews with LWIA staff, group interviews with customers, review of program documents, site observations (all 28 participating LWIAs; complete)
  • State and/or LWIA management information systems (all 34,000 study participants)

Type of data needed: Cost data
Reason data needed: Estimate costs of services for the benefit-cost analysis
Sources (for whom collected; status):
  • Cost data collection packages completed by local WIA staff: (1) program costs questionnaire, (2) front-line staff activity log, and (3) resource room sign-in sheet (all 28 participating LWIAs; complete)
  • Accounting data on ITA obligations and expenditures (all 34,000 study participants)

2. How, by Whom, and for What Purposes Will the Information Be Used

An extension is being requested for the 30-month survey. The extension will allow additional time to locate sample members for the administration of this survey and will lead to a higher response rate.

The two follow-up surveys are the main way the evaluation team will collect data on the receipt of services and training (whether funded by WIA or not), as well as on employment and self-sufficiency outcomes. As discussed in item 4 (below), administrative data on these topics fall short of providing the level of detail, coverage, and uniformity across sites needed to conduct a comprehensive and fine-grained analysis of the effectiveness of WIA’s intensive and training services. The data on service receipt will be used to determine the extent to which the offer of services, including training, actually led to an increase in services received. Some customers will not take up the offer of services, and some customers in the core and core-and-intensive groups will be able to access services similar to those restricted by the evaluation from sources other than WIA. Data from the surveys will also be used to compute the average cost of services received for each survey sample member. The data on outcomes will be used to estimate the impact of the services and their benefits.

a. Sampling for the Surveys

All customers randomly assigned to the core or core-and-intensive groups who were not later found ineligible and who did not withdraw consent were included in the survey sample (about 2,000 in each of the core and core-and-intensive groups). A random subset of about 2,000 of the approximately 30,000 full-WIA group members was also included. Sampling 2,000 of the full-WIA group members minimized the cost and the burden on respondents while providing sufficient statistical precision. The random selection of full-WIA members for the survey sample was stratified by site. Within each site, the survey sample size of full-WIA members was about the same as the sample sizes for the core-and-intensive and core groups. Stratification on other characteristics was performed to ensure that the sample is balanced in terms of adult/dislocated worker status, sex, and race/ethnicity and is well matched to the core and core-and-intensive groups on these dimensions.
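A minimal sketch of the site-stratified subsampling described above is shown below. It assumes a hypothetical data layout (a “site” key on each record and a per-site target count) and omits the additional balancing on adult/dislocated worker status, sex, and race/ethnicity for brevity; it is not the evaluation’s actual sampling code.

```python
import random
from collections import defaultdict

def sample_full_wia(full_wia_members, targets_by_site, seed=12345):
    """Illustrative stratified draw of full-WIA members for the survey sample."""
    rng = random.Random(seed)
    by_site = defaultdict(list)
    for member in full_wia_members:
        by_site[member["site"]].append(member)
    sample = []
    for site, members in by_site.items():
        # Target per site: roughly the size of the core and core-and-intensive
        # survey groups in that site, capped by the number available.
        n = min(targets_by_site.get(site, 0), len(members))
        sample.extend(rng.sample(members, n))
    return sample
```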

b. Survey Content

The follow-up survey includes basic screening and tracking questions and detailed modules that obtain information on service receipt, participation in education and training programs, employment and earnings patterns, self-sufficiency, and some customer characteristics. An overview of the key items included in the survey and how they will be used is provided in Table A.2. This extension request proposes no changes to the data collection instrument.

Table A.2. Data Items in the WIA Evaluation Follow-up Surveys


Each item listed below serves one or more purposes in the analysis: as tracking information, a descriptive measure, an outcome measure, and/or a benefit-cost measure.

Personal Identifying and Tracking Information
  • Verify name, date of birth, and last four digits of Social Security number (Section A)
  • Address and telephone numbers of respondent and friend or relative (Section G)

Service Receipt (Section B)

Resource room
  • Number of times resource room visited in American Job Center (Items B3, B5)
  • Number of times resource room visited at another provider (Items B8-B9, B11)

Workshops
  • Attendance in specified staff-intensive workshops in LWIA (Item B15)
  • Number of other workshops attended in American Job Center; average amount of time spent in workshop (Items B16, B18, B20)
  • Number of workshops attended elsewhere; average amount of time spent in workshop; type of provider (Items B21-B22, B24, B26)
  • Topics covered in workshops (Item B27)

Assessments
  • Type of assessments taken (Item B28)
  • Number of assessments taken at American Job Center (Items B29-B31)
  • Number of assessments taken elsewhere; type of provider (Items B32-B33, B35)

Peer support groups
  • Number of peer support groups attended at American Job Center (Items B36, B38)
  • Number of peer support groups attended elsewhere; type of provider (Items B41-B42, B44)

Individualized counseling services
  • Topic of counseling (Item B47b)
  • Number of times met with a counselor at American Job Center; average length of meeting (Items B48-B50, B52b)
  • Number of times met with a counselor at another provider; average length of meeting; type of provider (Items B53-B54, B56, B58)

Support services
  • Type of assistance received (Item B59b)
  • Dollar value of assistance received from an American Job Center (Item B61)
  • Dollar value of assistance received from another provider; type of provider (Items B63-B64)

Education and Training (Section C)
  • Complete history of participation in education and training programs in the past 15 months, including start and stop dates (Items C1-C9)
  • Number of hours per week in program (Items C10-C11)
  • Type of program (educational, occupational skills, English as a Second Language, on-the-job training) (Items C12-C14)
  • Type of provider (Item C16)
  • Total out-of-pocket costs and other sources of funding for programs (Items C17-C23)
  • Whether program was completed, and if not, reasons for not completing (Items C25-C26)
  • Whether a degree, diploma, license, or certification was received (Items C27-C31)
  • Associated assessments or tests required, whether they were taken, and if so, their total cost and sources of payment (Items C32-C36)
  • Type of occupation the program trained for, and whether the customer perceived that the training helped them get a job in that field (Items C15, C37)

Employment Patterns, Job Characteristics, and Earnings (Section D)
  • Complete history of employment in the past 15 months (Items D1, D5, D8-D12, D21-D29)
  • Industry and main duties (Items D2-D3, D6-D7, D34-D35)
  • Number of hours worked per week (Items D13-D15, D30-D32)
  • Earnings in job (Items D4, D37)
  • Employment status: regular, seasonal, contractor, temporary, casual, day laborer, on call (Items D17, D36)
  • Fringe benefits (Items D18, D38)
  • Whether job was unionized (Items D19, D39)
  • Reason for job separations (Item D20)

Income Sources and Household Characteristics (Section E)
  • Number of months of receipt and average amount received per month of SNAP, TANF, SSI, or other cash assistance (Items E1-E3)
  • Number of months received assistance from the Women, Infants, and Children Program (WIC) (Items E1-E2)
  • Total household income (Items E4-E7)
  • Number of people in household, number of children in household (Items E8-E9)

Demographic and Household Characteristics (Section F)
  • Health problems limiting work (Item F1)
  • Receipt of health insurance at baseline and during previous 15 months, type of insurance (Items F2-F5)
  • Race, ethnicity, marital status (Items F7-F9)
  • Educational attainment (Items F10-F11)
  • Whether the respondent has been arrested or convicted of a felony (Items F12a-F13b)

Below, the types of information collected are discussed in approximately the order of their appearance in the survey instruments.

Personal identifying and tracking information. Tracking information to correctly identify the survey sample members and follow up with them at a later date bookends the survey instrument. The survey starts with screening questions to ensure that the sample locating process has identified the correct individual. Respondents are asked to confirm their name, date of birth, and last four digits of their social security number. At the end of the survey, respondents are asked to confirm or update the basic contact information gathered from the sample locating process so that incentive payments (discussed in Part A, Section 9) can be delivered.

Service receipt. Key to the interpretation of the impacts of WIA intensive services on customer employment and self-sufficiency outcomes is the impact of offering these services on actual service receipt across the study groups. While all members of the full-WIA and core-and-intensive groups are offered intensive services, and members of the full-WIA group will be offered training as well, some customers will not access all offered services. In addition, as WIA is not the only funder of employment services and training, sample members may access services funded by sources other than WIA. Hence, it is important to collect data on the amount and type of services and training received by members of all three study groups from all sources.

The surveys collect data on the quantity of employment services, education, and training received since random assignment and whether these services were received at an American Job Center or elsewhere. How the quantity of an activity is measured depends on the type of service, education, or training received; measures may include the number of times accessed, length of time spent in service, and the dollar cost.

Services asked about include:

  • Use of resource rooms. American Job Centers usually have resource rooms that provide local labor market information such as specific job openings and employers and industries that are in need of workers. These resource rooms also provide technological assistance to support a job search such as computers, access to the internet, fax machines, and telephones. Other organizations also provide similar services. The surveys ask about the number of times the customer visited a resource room, and the time spent there, in both American Job Centers and elsewhere.

  • Attendance in workshops. American Job Centers offer workshops on a variety of topics aimed at helping the customer become employed. Most of these workshops require only one staff member and there is little individualized attention. However, we have identified some more intensive workshops in which staff provide one-on-one assistance to customers. As the costs of these workshops are much higher than average, we treat them separately. The surveys ask about attendance in the identified intensive workshops specific to each study site. (Information about the length and intensity of these workshops was collected during on-site interviews with American Job Center staff and, therefore, is not collected in the survey.) The survey questions then collect information on the number and average length of other workshops attended within the American Job Center (typically the core workshop series offered), and then the extent of attendance in workshops provided by other non-WIA funded agencies or organizations.

  • Attendance at peer support groups. Sometimes referred to as job clubs or networking groups, peer support groups are offered by some American Job Centers and other organizations as a means through which participants can share experiences, resources, and leads throughout the job search process. The survey asks about attendance in peer group meetings provided at the American Job Center as well as participation in such group meetings offered by another agency or entity.

  • Completion of assessments. Assessments can be used to determine the level of an individual’s basic skills (such as math or reading), and/or to determine how the interests and abilities of an individual align with particular jobs. The surveys collect information on the type and total number of assessments completed as well as on the agency or entity that provided the assessment(s).

  • Receipt of individualized counseling. The receipt of individualized counseling to support an individual’s job search, career exploration, and training options is an important element of WIA services. The surveys collect information about the receipt and content of counseling sessions. In addition, items on the surveys collect details about the frequency and duration of counseling services by the type of counseling provider (whether within the American Job Center or other organization).

  • Receipt of support services. Sample members may also be eligible for and receive an array of supportive services in the form of cash, vouchers, gift cards, or reimbursements to help them with expenses incurred to look for work or to attend training or school. Questions on the surveys collect information on the purpose of the assistance (such as to purchase books or uniforms, or to cover travel expenses) and the total value of such assistance received from the American Job Center or from other agencies or organizations.

Education and training. Sample members are asked for information about each education and training program they attended from random assignment to the 15-month follow-up (in the 15-month follow-up survey) and between the 15- and 30-month follow-ups (in the 30-month survey). Respondents reached for the first time at the 30-month survey are asked about programs over the full 30-month period. The surveys collect information on the duration of education or training pursued, the type of education or training, the provider, total and out-of-pocket costs, whether the course was completed, and resulting credentials, as further detailed in Table A.2. This information is collected for each program, regardless of whether it was funded by WIA or whether the customer completed the program.

Employment and earnings. Because the goal of the WIA intensive and training services is to improve customers’ labor market outcomes, key outcomes for the evaluation are related to employment and earnings. Given the importance of these outcomes, we collect a complete and detailed history of all jobs held by sample members for 30 months after they were randomly assigned. Items in each of the 15- and 30-month surveys collect basic information about jobs for pay including: earnings (from each job), employment (current status, number of jobs, periods of unemployment); the characteristics of each job held (industry and occupation, hours worked, wage rates, and type of employment agreement); job retention (how long held each job, reasons for job separations); and measures of job quality (the availability of fringe benefits, presence of unions). Detailed information on earnings from each job is critical as earnings represent both a key outcome for the impact analysis as well as the main benefit that will contribute to the benefit-cost analysis.

Self-sufficiency. A goal of employment policy is self-sufficiency for the participant and his/her household. Thus, for the impact analysis, the two surveys collect information on household composition and receipt of public assistance—whether Federal or state—by any member of the household in which the sample member lives. Specifically, the 15- and 30-month surveys collect information on the receipt and amount of public assistance received, such as benefits through SNAP, TANF, or other cash assistance program, and the WIC program.1 In addition, the surveys ask about total household income in aggregate.

Demographic and individual characteristics. Respondents are asked to confirm any items they did not complete on their study registration form, such as their gender, race, and ethnicity. In addition, we ask about the extent and type of health insurance coverage for the sample member over the 30-month study period. Finally, the surveys ask about limitations to work due to disabilities or health problems and about any felony convictions. Limitations to work are important baseline measures because they can affect the impact of the intensive and training services. These items are collected in both the 15- and 30-month surveys.

c. Use of the Survey Data and by Whom

Mathematica will use the information collected from the surveys to carry out the analysis needed to fully assess the effectiveness of WIA intensive and training services. This information will be used by Congress to determine future funding, by Congress and DOL to determine national workforce policy, and by state and local areas to decide on local policy. Specifically, the survey data will be used to conduct two analyses:

Impact analysis of intensive services, training services, and outcomes. Data collected through the survey will be used in the impact analysis to estimate the impact of the offer of WIA intensive and training services on the receipt of services and on employment and self-sufficiency outcomes. The net impacts will be derived by comparing the average outcomes of individuals in each of the three research groups. Three comparisons of outcomes will be made: (1) the core group to the core-and-intensive group, (2) the core-and-intensive group to the full-WIA group, and (3) the core group to the full-WIA group. These comparisons will provide information about the relative impact of intensive services over core services, of training over intensive services, and of training (which by definition includes intensive services) over core services, respectively. In addition to estimating overall impacts, impacts for different subgroups—by age, sex, race/ethnicity, adult/dislocated worker status, and educational and employment background—will be estimated in order to determine who is or is not served well by the program as presently constituted. In addition to the impact of the offer of intensive and training services, the impact of the receipt of intensive and training services will be estimated.
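For concreteness, a minimal sketch of the three pairwise contrasts is shown below. It computes simple differences in group mean outcomes with unadjusted standard errors; the actual analysis would apply regression adjustment, survey weights, and nonresponse adjustments, none of which is reproduced here. The function and variable names are illustrative.

```python
import math
from statistics import mean, variance

def impact(comparison_group, treatment_group):
    """Difference in mean outcomes between two research groups, with a
    simple unadjusted standard error (illustrative only)."""
    diff = mean(treatment_group) - mean(comparison_group)
    se = math.sqrt(variance(comparison_group) / len(comparison_group)
                   + variance(treatment_group) / len(treatment_group))
    return diff, se

# The three comparisons described above (outcome lists are hypothetical):
# impact(core, core_and_intensive)      # intensive services over core services
# impact(core_and_intensive, full_wia)  # training over intensive services
# impact(core, full_wia)                # training and intensive over core services
```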

Benefit-cost analysis. The impact estimates on employment and self-sufficiency outcomes derived from the survey data will be used to measure the benefits from increased employment, greater earnings, and reduced use of other public assistance. In addition, the survey data on service receipt—use of the resource room, workshops, peer-support groups, assessments, individual counseling and supportive services—will be used with cost data to assign a total cost of providing services for each customer. The survey will ask about the cost of education and training programs. The benefit-cost analysis will place a dollar value on each benefit and cost of the program and then summarize in a single statistic all of the diverse impacts and costs associated with WIA service receipt.
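The sketch below illustrates, under assumed dollar-valued components, how the benefit-cost results might be rolled up into a single summary statistic as described above. The component names are hypothetical; the evaluation’s actual accounting framework is more detailed.

```python
def benefit_cost_summary(participants):
    """Illustrative roll-up of per-participant benefits and costs.

    Each record is a dict with hypothetical dollar-valued fields:
    'earnings_gain', 'assistance_reduction', 'service_cost', 'training_cost'.
    """
    benefits = sum(p["earnings_gain"] + p["assistance_reduction"] for p in participants)
    costs = sum(p["service_cost"] + p["training_cost"] for p in participants)
    return {
        "total_benefits": benefits,
        "total_costs": costs,
        "net_benefit": benefits - costs,
        "benefit_cost_ratio": benefits / costs if costs else None,
    }
```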

3. Use of Improved Technology to Reduce Burden

The follow-up surveys are administered by computer-assisted telephone interviewing (CATI). CATI provides many benefits for both the data collectors and the respondents. Using CATI allows greater flexibility in scheduling for survey respondents, making the survey less burdensome for them. CATI programming also enforces skip logic, restricts entries to valid responses, and checks for logical consistency across questions. Interviewers are thus able to correct errors during the interview, eliminating the need for callbacks to respondents, further reducing respondent burden, and keeping costs in check. In cases when field locators are needed (when sample members cannot be reached through multiple attempts by phone), locators are equipped with cell phones and encourage sample members to call a centralized call center where a project-trained interviewer administers the CATI interview. This is less costly and burdensome than paper-and-pencil interviewing, which typically requires longer administration time; turning pages and following skip instructions in a hard-copy questionnaire takes longer for the interviewer, thus increasing respondent burden.

To further minimize burden for respondents, both surveys are preloaded with key information to facilitate data collection. Data such as date of birth and the last four digits of the Social Security number will be used to confirm sample members’ identities. Employer names from the study registration form will frame questions about employment at time of (or just prior to) study intake. Similarly, data collected at the 15-month follow-up is preloaded for surveys conducted at the 30-month follow-up (when applicable).2 Using previously collected data can aid respondent recall and ensure that only new information is collected, thereby reducing burden.

Finally, using CATI and a call scheduler translates into less time burdening the sample member’s household with calls at inappropriate times and/or in incorrect languages. An automated call scheduler will simplify scheduling and rescheduling of calls to respondents and can assign cases to specific interviewers, such as those who are trained in refusal conversion techniques or those who are fluent in Spanish. In addition, CATI almost completely eliminates dialing errors because calls are made through a preview dialer. The preview dialer allows interviewers to review case history notes and the history of dispositions. The interviewer then presses one button to dial the number after reviewing the case (this is akin to one-touch or speed dialing).

4. Avoiding Duplication of Effort

There is no similar prior or ongoing data collection being conducted that duplicates the efforts of the proposed data collection for the evaluation of the Adult and Dislocated Worker programs. Specific efforts have been made to reduce the overall burden on the respondents by making efficient use of baseline data from the study registration form in the follow-up surveys, and supplementing administrative data with the rich and detailed data available only from direct customer surveys.

Some data items included in the follow-up surveys are available from administrative data sources, but not with the same level of detail and coverage as can be obtained from the direct customer surveys. For example, while UI quarterly earnings data were collected for the entire evaluation sample, these administrative data tend to be less accurate than the survey data for several reasons. The UI earnings data do not cover all workers (the data cover 90 percent of all workers); they exclude Federal workers, military staff, self-employed people, railroad employees, workers in service for relatives, most agricultural labor, some domestic service workers, part-time employees of non-profit organizations, insurance and real estate agents on commission, and workers performing what is referred to as casual labor (U.S. Department of Labor, 2004). They also exclude workers whose employers (illegally) fail to report their earnings to the UI agency.

Similarly, administrative data on service receipt were obtained from files maintained by the states with participating sites. The state-maintained files provide information on services and training funded by WIA; however, these data do not include details on the types of services and training received. For example, they record that the customer received an intensive service, but not the type of service. Since the costs of such services differ depending on what specifically is received (for example, a one-on-one counseling session versus attendance at a workshop), it is important to distinguish service receipt at a finer level than is available in the administrative data. Most importantly, these data do not cover intensive and training services that are not funded by WIA; the surveys are the only means of collecting information about non-WIA-funded services received by members of the three research groups.

5. Methods to Minimize Burden on Small Businesses or Entities

Follow-up surveys will be conducted with individuals. The evaluation team will not contact small businesses or entities.

6. Consequences of Not Collecting the Data

The data collection efforts in this extension request are designed to provide information to answer questions of interest to policymakers and program operators. The follow-up surveys serve as a critical source of reliable and consistent data about sample members’ service use, employment, and self-sufficiency for all three study groups. This information is critical in order to be able to assess the types of services the sample members in the core group received in the absence of the WIA Adult and Dislocated Worker programs. It is also critical to collect this information for the sample members of the other two study groups (core-plus-intensive and full-WIA) who may also access intensive and training services from sources other than WIA. It is necessary to have information on the full set of services received by sample members in each of the three study groups to assess the impact of WIA intensive and training services on patterns of service receipt, as well as to contribute critical information to develop cost estimates for the benefit-cost analysis.

The surveys are important for providing information on study participants’ earnings and other employment outcomes that will, again, contribute critical data elements for the impact analysis as well as the benefit-cost analysis. Although earnings and employment data are collected from UI quarterly earnings records, these data are incomplete in ways that will affect the study’s ability to evaluate the impact of WIA intensive and training services (as discussed in section 4 above). For instance, UI earnings data do not contain the dollar value of any fringe benefits the employee might receive. The evaluation team will use the survey data on earnings—rather than the administrative data—to develop estimates of the benefits of each increasing level of WIA service receipt because of the greater completeness and accuracy of these data. Development of these estimates will not be possible without these data.

Without the proposed extension, we will not be able to conduct interviews with all planned respondents for the 30-month survey. The additional time is needed to locate sample members and administer the survey. Without the proposed extension, we estimate that we will be able to complete 3,690 interviews, for a response rate of only 62 percent. With the additional six months, we predict that we will be able to complete 4,920 interviews for an 82 percent response rate. The higher response rate will reduce the likelihood of nonresponse bias.
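The response-rate figures above follow directly from the 6,000-person survey sample; the short, illustrative check below simply reproduces the arithmetic.

```python
SURVEY_SAMPLE = 6_000  # total 30-month survey sample described above

for completes in (3_690, 4_920):
    print(f"{completes:,} completes -> {completes / SURVEY_SAMPLE:.1%} response rate")
# 3,690 completes -> 61.5% response rate (reported above as roughly 62 percent)
# 4,920 completes -> 82.0% response rate
```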

7. Special Data Collection Circumstances

No special circumstances apply to this data collection. In all respects, the data will be collected in a manner consistent with Federal guidelines. There are no plans to require respondents to report information more often than quarterly, to submit more than one original and two copies of any document, to retain records, or to submit proprietary trade secrets.

8. Federal Register Notice

a. Federal Register Notice and Comments

A Federal Register notice announcing plans to submit this data collection extension package to OMB was published on August 14, 2015 (80 FR 48916) consistent with the requirements of 5 CFR 1320.8 (d). The Federal Register notice described the evaluation and provided the public an opportunity to review and comment on the data collection extension plans within 60 days of the publication, in accordance with the Paperwork Reduction Act of 1995. A copy of this 60-day notice is included as Appendix E.

DOL did not receive any comments in response to the Federal Register notice published on August 14, 2015.

b. Consultations Outside of the Agency

DOL and the study team did not engage in any outside consultations for the follow-up survey.

9. Respondent Payments

We will continue to use the respondent payments agreed upon with OMB on March 12, 2015 (ICR Reference Number 201502-1205-001). The approach is to: (1) offer sample members who were paid $40 or $75 for completing the 15-month survey a $75 payment for completing the 30-month follow-up survey; and (2) offer sample members who either did not respond to the 15-month survey or were paid $25 for completing it a $25 incentive to complete the 30-month survey, increasing this to $75 only for sample members who are unresponsive to outreach attempts.

At intake, participants were advised that they could be contacted to complete a survey and that they would receive an incentive payment for survey completion. The letter sent in advance of contacting the sample member for the telephone interview states the incentive amount (see advance letter in Appendix D).

A sample member is deemed to be unresponsive to outreach attempts and hence eligible for the $75 incentive payment only if the sample member has not completed an interview after three months have passed since the first attempt to contact the sample member. Sample members will be offered the $75 in a postcard sent to their home in the fourth month after the first attempt to contact the sample member (non-respondent reminder postcard in Appendix D). The postcard will provide a telephone number for the sample member to call and complete the interview. The sample member will also be called with the $75 offer after the postcard is mailed. Five months after the first attempt to contact the sample member, if the sample member has not completed the interview, field locators will be sent to the last known address of the sample member and offer a $75 incentive for completing the interview.

Incentives can help support high data quality by ensuring high overall response rates and by increasing the response rates from subgroups that are less likely to cooperate with the survey request. Incentives can help achieve high response rates by increasing the sample members’ propensity to respond and can reduce the likelihood that we need to send a field locator to complete the interview (Singer et al. 2000). And, studies have shown that incentives may reduce differential response rates and hence the potential for nonresponse bias (Singer and Kulka 2002). For example, there is evidence that incentives are effective at increasing response rates for people with lower educational levels (Berlin et al. 1992) and low-income and nonwhite populations (James and Bolstein 1990). In addition, a recent study found that incentives increased the participation of sample members who were more likely to be unemployed (Jäckle and Lynn 2007). Further, studies have found that paying incentives does not distort responses and, thereby, impair the quality of the data obtained (as reflected in item nonresponse or the distribution of responses) from groups that would otherwise be underrepresented in the survey (Singer et al. 2000).

Our estimated cost of providing incentives for completion of the 30-month follow-up survey is $184,500, assuming that 75 percent of completers (or 3,690 respondents) will receive an incentive of $25 and 25 percent (or 1,230 respondents) will receive an incentive of $75. This estimate has not changed given the extension of the survey. We estimate that $46,125 of the $184,500 would be distributed during the proposed extension period of February 1, 2016 to July 31, 2016.
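The incentive cost estimate quoted above can be verified with a short calculation; the sketch below is illustrative arithmetic only.

```python
expected_completes = 4_920                  # predicted 30-month survey completes
at_25 = round(expected_completes * 0.75)    # 3,690 respondents receiving $25
at_75 = expected_completes - at_25          # 1,230 respondents receiving $75
total_cost = at_25 * 25 + at_75 * 75
print(at_25, at_75, total_cost)             # 3690 1230 184500
```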

10. Confidentiality

Evaluation researchers have a strong set of methods to ensure that the privacy of data is protected. Mathematica institutes, and researchers must follow, policies related to (1) privacy, (2) physical and technical safeguards, (3) approaches to the treatment of personally identifiable information (PII), and (4) survey related procedures.

a. Policy

All Mathematica and subcontractor evaluation staff will comply with relevant policies related to secure data collection, data storage and access, and data dissemination and analysis. Mathematica’s security policy meets the legal requirements of the Privacy Act of 1974 (System of Records Notices DOL/ETA-15); the Family Educational Rights and Privacy Act of 1974 (the “Buckley Amendment”); the Freedom of Information Act; and related regulations to ensure and maintain the privacy of program participants.

It is the policy of Mathematica to efficiently protect this information and data in whatever medium it exists, in accordance with applicable Federal and state laws and contractual requirements. In conjunction with this policy, all Mathematica staff shall:

  1. Comply with the Mathematica Confidentiality Pledge, which is signed by all full-time, part-time, and hourly Mathematica staff, and with the Mathematica Security Manual procedures to prevent the improper disclosure, use, or alteration of PII. Staff may be subject to disciplinary, civil, or criminal actions for knowingly and willfully allowing the improper disclosure or unauthorized use of PII.

  2. Only access PII and proprietary information in performance of assigned duties.

  3. Notify their supervisor, the project director, and the Mathematica security officer if PII has been disclosed to an unauthorized individual, used in an improper manner, or altered in an improper manner. All attempts by individuals who are not authorized to access the PII to contact Mathematica staff about any study or evaluation will be reported immediately to both the Mathematica project director and the Mathematica security officer.

  4. As part of their contract with DOL, all regular status and on-call staff who have access to PII will adhere to all DOL security requirements, including fingerprinting and background checks.

b. Safeguards

Mathematica has established safeguards that provide for the security of PII and the protection of the data provided by individuals on all of its studies. Safeguards to ensure the privacy of data include:

  1. Facility. The doors to office space and the survey operations center (SOC) are always locked, and all SOC staff are required to display a current photo identification while on the premises. Visitors are required to sign in and out of company offices and are required to wear temporary identification badges while on the premises. Any network server containing PII is in a controlled-access area. All authorized external access is through a protected internet network that is under strict password control.

  2. Network. Data stored on network drives are protected using the security mechanisms available through the network operating system used on Mathematica’s primary network servers: Novell Netware 5–6.5. These versions of Novell Netware are compliant with the C2/E2 Red Book security specifications. Netware is certified at the National Computer Security Center’s Trusted Network Interpretation Class C2 level of security at the network level. The network is protected from unauthorized external access through the PIX Firewall from CISCO. This firewall resides between the network and the communications line over which the corporate internet traffic flows. Access to all network features such as software, files, printers, internet, email, and other peripherals is controlled by user ID and password. Network passwords must be a minimum of eight characters in length and must be a combination of numbers and letters. All user IDs, passwords, and network privileges are revoked within one working day for departing staff and immediately for terminated staff. All staff members are required to log off the network before leaving for the day.

  3. Printers. Printer access is granted to all staff with a valid user identification (ID) and password. The physical hard disks on which the printer queues reside are subject to the same security/crash procedures that apply to the file servers. Print stations are monitored appropriately depending on the sensitivity of the printed output produced. No PII or proprietary data or information may be directed to a printer outside of Mathematica’s offices.

  4. Electronic communication. Ethernet is used for internal email communications over the network. As Ethernet communications use Novell Netware with built-in user ID and password protections and Windows NT Challenge Handshake Authentication Protocols, sensitive information in both email text and attachments may be safely transmitted. Email transfer is also encrypted when sent to or from the Mathematica gateway facility, which allows staff to check and send emails from home. A dedicated private line supports cross-office communications between Mathematica offices.

Research team members who play a role in data collection and analysis will be trained in procedures for safeguarding PII and will be prepared to describe these procedures in full detail and to answer any related questions raised by participants.

c. Treatment of Data with Personal Identifying Information

All data containing PII, including social security number, name, home address, and home telephone number, are considered to be sensitive or private, project-specific data. Specific details regarding the handling and processing of PII for the evaluation are provided next.

  1. Access. Electronic files containing PII are stored in restricted access network directories. Access to restricted directories is limited on a need-to-know basis to staff who have been assigned to and are currently working on the project. When temporarily away from their work area, project staff members close files and applications. Access to workstations will automatically lock within a set period of minutes, and staff must use a password to regain access through the protected screen saver.

  2. Electronic communications. Although the protections offered by internal email are extensive, staff members are instructed not to transmit sensitive information as a regular file attachment to an internal email. Instead, staff members are instructed to use the insert shortcut feature in Outlook to include a shortcut to the file. This allows the receiver to go to the file directly but will not allow access to unauthorized individuals. Additionally, staff members are instructed not to include sample members’ names or other personal identifying information in internal emails so that there is no potential for these to be viewed by others. When information about a sample member is transmitted via email, a Mathematica identification number is used as a reference. To ensure the security of sensitive information sent outside of Mathematica through an email, the sender is obligated to ensure that the recipient is approved to receive such data. When files must be sent as attachments internally or outside of Mathematica, staff are instructed to use WinZip 9.0 (256-bit AES encryption) to password protect the file. When sending sample member name and contact information outside of the company, this information will be included in a secure attachment rather than in the text of the email.

  3. Databases. The databases developed for this study containing PII are password protected and accessible only to staff who are currently working on the project. To access the database, users will first log on to their workstations and then to the database using a separate log-in prompt. The database will be removed and securely archived at the end of the data-processing period.

  4. Public use data files. To allow external verification and replication of the study findings, as well as additional research, public use data files containing key analysis variables created for the evaluation will be produced at the end of the study and formatted to data.gov specifications. These public use files will follow the current OMB checklist on privacy to ensure that they can be distributed to the general public for analysis without restrictions. Steps will be taken to ensure that sample members cannot be identified in indirect ways. For example, categories of a variable will be combined to remove the possibility of identification due to a respondent being one of a small group of people with a specific attribute. Variables that will be carefully scrutinized include age, race and ethnicity, household composition and location, dates pertaining to employment, household income, household assets, and others as appropriate. Variables will also be combined in order to provide summary measures to mask what otherwise would be identifiable information. Although it cannot be predicted which variables will have too few respondents in a category, the study researchers plan not to report categories or responses that are based on cell sizes of less than five (see the illustrative sketch following this list). If necessary, statistical methods will be used to add random variation within variables that would be otherwise impossible to mask. Finally, variables that could be linked to identifiers by secondary users will be removed or masked.
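As an illustration of the cell-size rule mentioned above, the sketch below recodes any category held by fewer than five respondents into a combined category. The function name and threshold handling are assumptions for illustration, not the evaluation’s actual disclosure-avoidance code.

```python
from collections import Counter

MIN_CELL_SIZE = 5  # categories reported only when at least five respondents share them

def suppress_small_cells(values, combined_label="other/suppressed"):
    """Illustrative public-use-file step: collapse rare categories so that no
    respondent can be identified by an attribute shared by fewer than
    MIN_CELL_SIZE people."""
    counts = Counter(values)
    return [v if counts[v] >= MIN_CELL_SIZE else combined_label for v in values]
```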

d. Follow-up Surveys: Privacy and Security

All respondent materials—letters and reminder postcards—include assurances of privacy protection. In addition, as part of the interviewer’s introductory comments to the telephone interview, sample members are told that their responses are anonymous and are given the opportunity to have any questions answered. Interviewers are trained in these procedures and will be prepared to describe them in full detail, if needed, or to answer any related questions raised by participants. For example, the interviewer will explain that the individual’s answers will be combined with those of others and presented in summary form only.

All data items that identify sample members will be kept only by Mathematica, for use in assembling records data and in conducting the interviews. No data received by DOL will contain personal identifiers, thus precluding individual identification.

  • Telephone interviewers for the evaluation survey will be seated in a common, supervised area. As part of the process to verify that the correct sample members have been reached, interviewers will have access to respondents’ names and birthdates, as well as the last four digits of their Social Security Number (SSN). Birth date and the last four SSN digits will be displayed on the computer screen only temporarily, at the beginning of the survey, so that the interviewer can verify the sample member’s identity. Interviewing staff for this project receive training that includes general security and privacy procedures, as well as project-specific training that includes explanation of the highly private nature of this information, instructions to not share it or any PII with anyone not on the project team, and warnings about the consequences of any violations. Telephone interviews are recorded for educational and training purposes only, to aid interview staff in improving their skills, and are then destroyed.

  • Locating. Staff members who work on updating sample member contact information when the original contact information is no longer valid must have access to key identifying information for short periods. These staff members will receive training that includes general security and privacy procedures, as well as project-specific training that includes clear instructions on what data and databases can be accessed and what data are required and can be recorded in a database. In addition, locators may talk to a sample member’s family, relatives, or other references to obtain updated contact information. To protect the sample member, locators are given scripts on what they can and cannot say when using these sources to obtain information. For example, locators will indicate that Mathematica is trying to reach the sample member for an important study sponsored by the DOL, but will not reveal the nature of the study. Postcards will similarly describe Mathematica’s need to reach the sample member.

  • Locating and calling contact sheets. Project team members keep only the minimum amount of printed PII needed to perform assigned duties. Hard-copy materials (such as locating or calling contact sheets) containing data with any individual identifiers (for example, name, street address) are stored in a locked cabinet or desk when not being used. When in use, such materials are carefully monitored by a project supervisor and are never left unattended. At the conclusion of the project, a final disposition of all remaining sample members will be made, and contact sheets and other associated materials will be destroyed.

  • Data files. Electronic files for everyday use are created without personal identifiers. Data and sample files that must contain sensitive data are stored and analyzed on one of Mathematica’s “Secure Data” drives. Specifically, staff working on this project will be instructed to maintain all files with PII in project-specific, encrypted folders on the Mathematica network. Access control lists restrict access on a need-to-know basis and only to project staff members who are specifically authorized to view the sample data (as designated by the project or survey director) to select and process the sample or to process the data files. Sensitive data that are no longer needed in the performance of the project will be magnetically erased or overwritten using Hard Disk Scrubber or equivalent software, or otherwise destroyed.

  • Hard-copy printouts. Sensitive temporary work files used to create hard-copy printouts, which are stored on local hard drives, are deleted periodically. Hard-copy output containing PII is shredded or stored securely once it is no longer needed. Test printouts of data records carrying personal identifiers that are generated during file construction are shredded.

11. Questions of a Sensitive Nature

The follow-up surveys contain some questions that some sample members may consider sensitive. Obtaining information on these potentially sensitive topics is integral to addressing the research questions posed by the study. The survey questions on these topics have been worded as objectively and sensitively as possible, and interviewers are trained to show sensitivity to respondents while remaining impartial. All questions in the current survey, including those deemed potentially sensitive, have been thoroughly pretested, and many have been used extensively in prior surveys with no evidence of harm.

Further, as described in item 10, all participants will be assured of privacy at the outset of the interview and reminded throughout the interview as needed. All survey responses will be held in strict confidence and reported only in aggregate in any reports or publicly available documents, preventing identification of any individual.

The potentially sensitive questions and justifications for their inclusion in the survey instruments are presented in Table A.3.

Table A.3. Justification for Sensitive Questions in the Follow-up Surveys

Question topic: Receipt of financial assistance in support of work, training, or school (Section B, Items B59a-B64)
Justification: Information about financial assistance received to support work or training is important in assessing the impact of the offer of WIA intensive and/or training services on service receipt patterns across the three study groups. In addition, information on the total amount of assistance received is needed to estimate the costs of services for the benefit-cost analysis.

Question topic: Type, location, costs, and completion of training or education program (Section C)
Justification: Specific information about each training or education program in which the sample member participates is essential for: (1) estimating the impacts of the offer of WIA intensive services and training on participation in training across the three study groups; (2) estimating the impacts on the completion of training and receipt of associated degrees or credentials across the study groups; and (3) computing the costs of training (for the individual and for the government) for use in the benefit-cost analysis. These questions have been used frequently in other DOL surveys, including the evaluations of the Individual Training Account Demonstration and the Trade Adjustment Assistance Program, with few issues with nonresponse.

Question topic: Employment history over the study period; characteristics of jobs held; and earnings (Section D)
Justification: Employment and earnings patterns are key outcomes for this evaluation and are necessary for answering the research questions about whether access to WIA intensive services and training achieves better outcomes for individuals than they would experience in the absence of the program. The impact estimates on earnings are an important element of the analysis addressing whether the benefits of WIA services (in the form of increased earnings) exceed program costs. These questions have been used frequently in other DOL surveys, including the Individual Training Account Demonstration and the Trade Adjustment Assistance program demonstration, with few issues with nonresponse.

Question topic: Household income and receipt of public assistance (Section E)
Justification: Total household income and the receipt of public assistance are used to measure self-sufficiency, another key outcome of this evaluation. As with employment and earnings, data on these topics are critical for estimating the impacts of the offer of WIA intensive services and training across the study groups, and these estimates contribute to both the benefit (household income) and cost (receipt of public assistance) sides of the benefit-cost analysis. Household income and the sources and amounts of public assistance have been collected on many national surveys, including the Survey of Income and Program Participation, and have been used frequently in other DOL surveys. The survey questions on this topic are quite brief and are aggregated for the household as a whole to obtain total income and sources of public assistance; in this way, the sample member does not have to disclose which member of the household receives specific benefits.

Question topic: Receipt of health insurance (Section F, Items F2-F5x)
Justification: Receipt of health insurance is an important indicator of self-sufficiency and hence an outcome measure.

Question topic: Individual characteristics including age, race and ethnicity, marital status, and level of education (Section F, Items F7-F11)
Justification: Data on these topics are needed to analyze the impacts of WIA intensive services and training by subgroups of survey respondents. Such an analysis addresses a key research question about whether the effectiveness of WIA varies by population subgroup. Nonresponse to these items is rare.

Question topic: Limitations to work including health problems, arrests, and felony convictions (Section F, Items F1 and F12a-F13b)
Justification: Limitations to work are important baseline measures because they can affect the impact of the intensive and training services. Health problems that affect work and felony convictions are two important barriers to employment. Recognizing the sensitivity of collecting information about arrests and felony convictions, these questions are asked at the end of the survey.

12. Estimates of the Annualized Burden Hours

The requested extension of OMB approval will not increase the total burden of the 30-month survey. The total burden was estimated at 2,460 hours, with an indirect total cost burden of $17,836 (annualized over two years, this is 1,230 hours and $8,918 per year). We originally expected to complete interviews with 4,920 people over two years. We now expect to complete 3,690 interviews before January 31, 2016, and the remaining 1,230 interviews between February 1, 2016 and July 31, 2016. Table A.4 presents the number of respondents, the number of responses per respondent, the average burden hours per response, and the total annual burden hours for the 30-month follow-up survey data collection that will occur during the extension period. We expect to complete 30-minute interviews with 1,230 people, for a total of 615 burden hours. Table A.5 presents annualized estimates of indirect costs to all respondents for the 30-month follow-up survey data collection instrument during the proposed extension. At an average wage of $7.25 per hour (the Federal minimum wage), the cost estimate for this customer burden is $4,459 (615 hours at $7.25 per hour). The minimum wage is used as the opportunity cost of customers' time.
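
As an illustrative check only (not part of the information collection itself), the short Python sketch below reproduces the burden and cost arithmetic stated above; every figure is taken directly from this item.

    # Illustrative check of the burden arithmetic described in item 12.
    interviews_remaining = 1230      # interviews expected February 1 to July 31, 2016
    hours_per_interview = 0.5        # 30-minute interview
    federal_minimum_wage = 7.25      # dollars per hour, used as the opportunity cost

    burden_hours = interviews_remaining * hours_per_interview
    indirect_cost = burden_hours * federal_minimum_wage

    print(burden_hours)              # 615.0 hours (Table A.4)
    print(round(indirect_cost))      # 4459 dollars (Table A.5)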

Table A.4. Annual Burden Estimates for WIA Evaluation 30-Month Follow-up Survey, February 1 to July 31, 2016

Activity: 30-month survey, extension, February 1 to July 31, 2016
Annualized Number of Respondents: 1,230
Number of Responses per Respondent: 1
Average Burden Hours per Response: 0.5 hours (30 minutes)
Total Annual Burden Hours: 615

Table A.5. Monetized Burden Hours for WIA Evaluation 30-Month Follow-up Survey, February 1 to July 31, 2016

Activity/Respondent: 30-month survey, extension, February 1, 2016 to July 31, 2016
Annualized Number of Burden Hours: 615
Type of Respondent: WIA customer
Average Hourly Cost: $7.25
Annualized Indirect Cost Burden: $4,459

13. Estimates of the Total Annual Cost Burden to Respondents or Record Keepers

There will be no direct costs incurred by WIA customers (survey sample members) or WIA staff associated with the follow-up survey. The only indirect cost to respondents is the cost of their time (see Table A.5). Evaluation participants who are selected as survey respondents will not incur any out-of-pocket costs. Telephone calls will be placed at the expense of the evaluation contractor (Mathematica), and respondents who wish to call the interviewers will be provided with a toll-free number billed to Mathematica.

14. Estimates of the Annualized Cost to the Federal Government

The total cost of the WIA Evaluation to the Federal government is $24,416,039. Of this, $24,026,039 will be paid to the contractor and $390,000 will be spent on DOL staff managing the study and overseeing the contractor. Because the WIA Evaluation will last nine years, the annualized cost to the Federal government is $2,712,893 ($24,416,039 ÷ 9 years).

Of the $24,026,039 paid to the contractor, about $1.552 million is for design and planning, $2.498 million is for site recruitment, $4.433 million is for payments to sites and states as compensation for staff time spent on the study, $2.176 million is for training site staff and providing technical assistance throughout the study, $10.309 million is for data collection, and $3.058 million is for analysis and reporting.
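
As an illustrative check only, the following Python sketch verifies that the contractor cost components listed above sum to $24.026 million and that the total evaluation cost annualizes to the figure reported in this item; all values are taken directly from the text.

    # Illustrative check of the cost figures reported in item 14.
    contractor_cost = 24_026_039
    dol_staff_cost = 390_000
    total_cost = contractor_cost + dol_staff_cost     # $24,416,039
    annualized_cost = total_cost / 9                   # nine-year evaluation period

    contractor_components_millions = [
        1.552,   # design and planning
        2.498,   # site recruitment
        4.433,   # payments to sites and states
        2.176,   # training site staff and technical assistance
        10.309,  # data collection
        3.058,   # analysis and reporting
    ]

    print(total_cost)                                      # 24416039
    print(round(annualized_cost))                          # 2712893
    print(round(sum(contractor_components_millions), 3))   # 24.026 (millions of dollars)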

DOL will spend approximately $390,000 on staff salaries to manage the study and oversee the contractor throughout the course of the entire evaluation. (OPM Salary Table 2015: https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2015/DCB_h.pdf.)

Cost of Federal Staff Working on the Evaluation, Based on OPM Salary Table 2015

Grade 14, Step 8: Hourly Basic Rate $63.43; Total Hours 4,928; Total Pay $312,583
Grade 14, Step 10: Hourly Basic Rate $66.85; Total Hours 1,000; Total Pay $66,850
Grade 15, Step 5: Hourly Basic Rate $68.56; Total Hours 100; Total Pay $6,856
Grade 15, Step 10: Hourly Basic Rate $76.04; Total Hours 49; Total Pay $3,726
Total Pay, All Staff: $390,015
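
For completeness, this small Python sketch (illustrative only) reproduces the federal staff cost table above by multiplying each hourly basic rate by the corresponding total hours.

    # Illustrative check of the federal staff cost table (OPM Salary Table 2015 rates).
    staff_rows = [
        # (grade, step, hourly_basic_rate, total_hours)
        (14, 8, 63.43, 4928),
        (14, 10, 66.85, 1000),
        (15, 5, 68.56, 100),
        (15, 10, 76.04, 49),
    ]

    total_pay = sum(rate * hours for _, _, rate, hours in staff_rows)
    print(round(total_pay))     # 390015, matching the $390,015 total above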

15. Changes in Burden

The proposed extension will not change the total amount of burden associated with the administration of the 30-month follow-up survey. Total burden is expected to remain at the approved amount of 2,460 hours (4,920 interviews × 1/2 hour per interview). Given the current response schedule, we expect that 1,230 of the interviews will be conducted during the proposed extension period of February 1, 2016 to July 31, 2016. This equates to 615 burden hours (1,230 interviews × 1/2 hour per interview).

The remaining information collections that were approved in 2013 (Resource Room Sign-in Sheet, Veterans’ Supplemental Study Staff Interviews, Protocols for Focus Groups, Veterans Focus Groups, and Cost Data Collection, Program Costs Questionnaire and Front Line Staff Activity Log) have fulfilled their purpose, and the agency seeks to discontinue their use. This results in a discretionary burden reduction of 11,929 responses and 1,966 hours.

16. Publication Plans and Project Schedule

Table A.6 shows the schedule for the evaluation.

Table A.6. Schedule for the Evaluation

Participant intake period: November 2011 through April 2013
Administration of 15-month follow-up survey: April 2013 to May 2015
Administration of 30-month follow-up survey: June 2014 to July 2016
First impact report submitted: Winter 2015/2016
Final report submitted: Fall 2016

17. Reasons for Not Displaying Expiration Date of OMB Approval

The expiration date of the OMB approval for the survey data collections will be printed on all materials sent to sample members, such as letters and reminder postcards.

18. Exception to the Certification Statement

Exception to the certification statement is not requested.

1 Only receipt of WIC will be collected, not amount, since the WIC package varies by family and the sample member is unlikely to know the benefit amount in dollar terms.

2 When information is missing from either the study registration form or the 15-month survey, we will be unable to preload it. In that case, we will attempt to collect during the interview any information that is missing from these prior data collection efforts.

