
Evaluation of the Young Parents Demonstration Program, Reinstatement with Change

OMB Control Number: 1205-0494

December 2015


SUPPORTING STATEMENT

EVALUATION of the YOUNG PARENTS DEMONSTRATION PROGRAM, REINSTATEMENT WITH CHANGE

1205-0494

A. Justification


Background


The U.S. Department of Labor’s Employment and Training Administration (ETA) is requesting Office of Management and Budget (OMB) review and approval to reinstate with change the Information Collection Request (ICR) 1205-0494, which expired on May 31, 2015, to: (1) collect participant data from organizations that received grants under the Young Parents Demonstration Program (YPDP); (2) conduct semi-structured interviews with key administrators and staff in the demonstration projects to document the structure and implementation of the demonstration intervention; and (3) conduct a follow-up survey of YPDP participants at 18 months after random assignment. In this request, we eliminate the 36-month follow-up survey and make non-material changes to improve the current data collection.


The YPDP is designed to provide educational and occupational skills training that fosters family economic self-sufficiency among young parents (both mothers and fathers) and expectant parents ages 16-24, including, as applicable, those in high-risk categories such as victims of child abuse, children of incarcerated parents, court-involved youth, youth at risk of court involvement, homeless and runaway youth, Native American youth, migrant youth, youth in or aging out of foster care, and youth with disabilities.1 The YPDP grantees are required to develop a mentoring model that includes intensive professional staff mentoring focused on education, employment, and training and tailored to pregnant and parenting teens and young parents. They are to implement this intervention as an additional level of services, above and beyond the existing services they currently provide, specifically intended to increase an individual’s education, job training, and employment.


Individuals enrolling in the program have a 50/50 chance of receiving this additional level of services. Individuals assigned to the treatment group will receive the additional services, while individuals assigned to the control group will receive the existing services offered by the grantee. To evaluate the YPDP mentoring interventions, the education, employment, and other outcomes of the two groups will be compared over time. Random assignment of new participants was completed in June 2014.


The Urban Institute (a nonprofit, non-partisan research organization based in Washington, DC), Capital Research Corporation, Inc. (a for-profit, small business evaluation research firm located in Arlington, VA), Abt Associates (a for-profit research firm located in Boston, MA and Bethesda, MD), and Abt-SRBI (a for-profit survey firm headquartered in New York, NY), referred to from here on as the “evaluation team,” are collaborating on designing and conducting the process/implementation and net impact evaluations of YPDP. Among the evaluation activities being conducted by these firms are the following: (1) provision of assistance to each YPDP site to implement rigorous random assignment procedures to ensure fidelity in the assignment of YPDP participants to treatment and control groups; (2) development and implementation of a web-based Participant Tracking System to collect valid, reliable, and comparable participant-level data across all the grantee sites; (3) ongoing monitoring of enrollment in YPDP and of random assignment procedures; (4) documentation of the structure and implementation of the interventions based on site visits to each YPDP grantee; and (5) administration of follow-up surveys at 18 months after enrollment to obtain employment, earnings, and educational outcomes, as well as participant views on services received and their effectiveness.


Thus, the evaluation involves three types of data collection activities, highlighted below, that are the principal focus of this supporting statement.


Participant Tracking System (PTS). For their pre-existing programs, all grantees maintained data on individual participants they served in a variety of formats such as computerized management information systems, Microsoft Excel spreadsheets, and paper files. To ensure rigorous process and impact evaluations, the evaluation team developed the web-based PTS to establish a uniform system of data collection by standardizing the participant data on demographic characteristics, services received, and outcomes across the sites. The PTS also executes the random assignment procedures. A detailed description of the PTS is provided in Attachment A.


Field-based Implementation Site Visits. The second research task involves conducting visits to the grantee sites aimed at documenting the program environment; the start-up, existing, and bump-up services; participant flow through random assignment and program services; program costs; and grantee perspectives on implementation challenges and intervention effects. A two-person team will spend an average of two days at each grantee site approximately 12 months after the start of the random assignment process and again near the end of the grant period. During each visit, eight key administrators and staff will be interviewed. Although data collection was originally planned as a single visit to each site, the evaluation team decided to split it into two visits to ensure the opportunity to understand program implementation at both earlier and later stages of the grant. The interview guide does not change; the same guide will be used to collect data on program implementation during both visits. However, the use of two visits increases the burden estimate for the visits. The evaluation team will use the semi-structured interview guide (previously approved) presented in Attachment B.


Follow-up Survey. Using Computer Assisted Telephone Interviewing (CATI), a survey team is collecting data from both treatment and control group members on the type and intensity of services received, including the type and strength of the mentoring relationship; perspectives on their participation in the program; and various outcomes relating to employment, education, and parenting. The survey data will be used as part of the impact study to estimate short-term net impacts of YPDP participation, for example, by estimating differences between the control and treatment groups in employment and earnings outcomes at 18 months after random assignment. We are fielding telephone surveys to all study participants (treatment and control) at the sites. The enrollment goal for YPDP across the four sites is 1,633 participants (i.e., slightly over 800 treatment and 800 control group participants). With an 80 percent response rate, this survey effort will yield a total of 1,306 YPDP participants completing the survey. A 36-month follow-up survey was previously approved; however, due to the anticipated challenges with locating these youth after three years, we have eliminated this part of the data collection. Details on the 18-month follow-up survey can be found in section A.2 of this statement. In addition, Attachment C provides a copy of the survey instrument (previously approved) to be used in conducting the 18-month survey.



  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


The FY 2010 Omnibus Appropriations Bill required that the Secretary establish a demonstration project to address the employment and training needs of young parents under the Pilots, Demonstration and Research provisions (section 171) of the Workforce Investment Act of 1998 (WIA).2 Grants were awarded in June 2011 to organizations in Los Angeles, CA; Asheville, NC; Birmingham, AL; and Worcester, MA. These organizations will implement intensive professional staff mentoring focused on education, employment, and training and tailored to pregnant and parenting teens and young parents. Additionally, the Department commissioned Capital Research Corporation and the Urban Institute to perform an evaluation of the demonstration under section 172 of WIA, which is now governed by section 169 of the Workforce Innovation and Opportunity Act of 2014 (WIOA).3 The evaluation will use a rigorous experimental design featuring random assignment of YPDP participants into treatment and control groups.


A web-based PTS has been developed to accomplish random assignment in an efficient and private manner and to allow the ongoing quality control activities necessary to maintain the integrity of the random assignment process. The participant data items collected through the PTS at the time of enrollment are necessary to execute the random assignment process. In this process, each YPDP participant has a 50-percent chance of being assigned to either the treatment (bump-up) or the control group. The PTS has been designed to allow the grantees to collect data on participant characteristics, services received, and a select group of short-term participant outcomes, such as whether the participant is employed and hourly wages received at 6, 12, and 18 months after random assignment. The PTS also contains participant contact information (e.g., address, telephone, and e-mail information, as well as alternative contact information for three individuals who know the participant) to facilitate service provision and the administration of the follow-up surveys.
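To make the assignment logic concrete, the following minimal sketch (in Python, purely illustrative; the PTS is a web application and its actual code is not reproduced here) shows the 50/50 assignment described above:

```python
# Illustrative sketch only: the PTS performs random assignment server-side.
# This hypothetical function mirrors the 50/50 logic described above.
import random

def assign_group(rng: random.Random) -> str:
    """Assign a newly enrolled participant to the 'treatment' (bump-up)
    or 'control' group with equal (50 percent) probability."""
    return "treatment" if rng.random() < 0.5 else "control"

rng = random.Random()  # a production system might use a more carefully managed random source
print([assign_group(rng) for _ in range(10)])
```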


The site visits are an integral part of the process/implementation study for documenting the environment in which each of the four programs is operating, the flow of YPDP participants through random assignment and program services, the types of interventions available to both the control and treatment groups (i.e., existing and bump-up services), and other programmatic characteristics. An in-depth understanding of the structure of each program and the services being offered to treatment and control group members is necessary to establish the extent and nature of differences between the services the two groups receive. Understanding the specifics of how each demonstration site operates and the services delivered is essential to determining whether any differences in outcomes between the treatment and control groups are potentially associated with differences in the interventions received. It is also imperative to understand the details of program operations and services provided should the Department of Labor (or other state or local agencies) seek to replicate part or all of the YPDP interventions in the future.


A follow-up telephone survey at 18 months after random assignment is needed to collect detailed information about employment, earnings, education, parenting, and other outcomes across YPDP participants. Such outcome data will be used to statistically analyze the net impacts of YPDP participation. In addition, this survey effort will provide the ability to gather more qualitative views of participants (in a comparable way) concerning their experiences with program participation.




2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


The follow-up survey of YPDP treatment and control group members at 18 months after random assignment, the Participant Tracking System, and the field-based implementation site visits to YPDP projects are data collection efforts that will continue under this approval. The information to be collected will be used to understand and analyze the impacts of the program overall and for different bump-up services. The key research questions being addressed are shown in Table 1 below, which displays the study questions and the principal data sources to be used to address each of them. The results of the study will be used by policymakers and youth program providers in the Department of Labor to improve services to young parents.

To date, the information has been used by the Department and the grantees to track enrollment and provide a profile of the study participants via data books to each site. Continued and new collection of data under this approval will be used to develop a final report across the YPDP sites.

PTS. The four YPDP sites will continue to use the PTS to collect data on service receipt and outcomes for YPDP participants in the treatment and control groups. The PTS will no longer be used for randomly assigning participants, as enrollment was completed in June 2014. The system ensures that each YPDP site collects valid and comparable data on participant characteristics, service receipt, and outcomes. YPDP sites also use the PTS to: (1) record contact information to facilitate locating participants throughout their involvement in YPDP (e.g., to facilitate case management, mentoring, and service delivery) and for the planned follow-up survey; (2) systematically collect data on services received by YPDP participants during their involvement in the demonstration, and specifically on mentoring services; and (3) collect and record comparable employment and educational outcome data on participants at 6, 12, and 18 months after random assignment. YPDP site staff enter data into the PTS on a secure website; participant information that might identify individuals is immediately encrypted, and the data collection complies with all provisions of the Urban Institute’s Institutional Review Board (IRB). (See Attachment A for a description of the PTS, the informed consent, the mentoring log, and the instructions to grantee staff about using the PTS approved by the Urban Institute’s IRB.)


TABLE 1: KEY EVALUATION QUESTIONS AND DATA SOURCES

KEY EVALUATION QUESTIONS

Data sources: PTS; Site Visits/Process Study; Follow-up Survey; Quarterly Earnings

Question #1: To what extent are there statistically significant differences in employment, education, parenting, and other outcomes for the bump-up (treatment) and control groups?


Question #2: How do net impacts on key outcomes vary across YPDP sites for the treatment and control groups?


Question #3: If it is possible to pool samples across YPDP sites and sample size is adequate for subgroup analyses, how do net impacts on key outcomes (for the treatment and control groups) vary for specific subpopulations of the youth served (e.g., race/ethnicity, gender, educational attainment, and other demographic factors)?


Question #4: Based on the results of the follow-up surveys and other evaluation activities (e.g., the process evaluation), what are the potential reasons for variation in net impacts for treatment and control groups?

Question #5: What types of services/assistance do participants receive, how satisfied are they with the services received (e.g., the strength and intensity of their relationship with their mentor), and to what extent is achievement of employment, education, and other outcomes associated with the types of services received?

Question #6: Based on the results of the follow-up surveys (and cost data collected through the process evaluation), what are the most cost-effective strategies for delivery of services to improve employment, education, and other outcomes for YPDP participants? Are there specific strategies that should be adopted to meet the needs of specific subpopulations of youth? Are there some strategies or subgroups for which the intervention appears ineffective? Are there ways that the intervention can be improved based on these findings?


Implementation Site Visits. A rigorous evaluation requires clear and specific documentation of the existing and bump-up program interventions received by control group and treatment group members in each of the grantee sites. To develop such documentation, a team of two experienced evaluators will visit each site twice, following a specific, detailed field visit protocol and interview guide. The site visit data collection follows a standard protocol for conducting semi-structured interviews with selected staff and administrators. Each individual staff or administrator interview is conducted in a private office or room on-site, following established procedures for maintaining strict individual privacy. Notes from the interview are either handwritten or entered onto a laptop computer. After each visit, the field notes will be stripped of any identifying information to guard against any violations of privacy provisions. Notes will be stored in a secure drive or file cabinet at the Urban Institute or its partners that can be accessed only by the evaluation team. (See Attachment B for the interview instrument.)

Follow-up Survey. The follow-up surveys of YPDP treatment and control group members at 18 months after random assignment will provide standardized and statistically valid data for analyses of the impacts of YPDP services on such outcomes as employment, earnings, and educational attainment. The survey will provide data from randomly assigned treatment and control group members on the following topics (note: see Attachment C for a copy of the 18-month follow-up survey instrument and the specific questions to be asked under each of these topics):


  1. Service Receipt

  2. Mentoring Services

  3. Educational Attainment Since Random Assignment

  4. Employment and Earnings

  5. Receipt of Cash Assistance

  6. Receipt of Food Stamps and Other Assistance

  7. Family Composition/Change

  8. Relationship/Engagement with Children

  9. Food Security

  10. Housing and Housing Security

  11. Family Income/Contact Information


The surveys will be one of several key data sources to support the analysis of the net impact of the YPDP intervention on participants. Survey data will be carefully integrated with data from the PTS, qualitative findings from site visits conducted to each of the YPDP sites, and other administrative data (e.g., Unemployment Insurance wage record data) to report on the outcomes, net impacts, and cost-effectiveness of YPDP. All results and materials developed from the analyses of this data collection effort are intended to reach multiple audiences, including:


  • ETA and DOL staff;

  • Policymakers at the state and federal levels of government looking to design programs and services to be responsive to the needs of parenting youth;

  • Employment and training/workforce development groups and associations; and

  • Community colleges, technical colleges, workforce investment agencies and organizations, other similar training providers, and human service agencies providing a range of services for parenting youth.


The final report produced under the process/implementation and impact analysis components of the YPDP evaluation (which will include the survey results) will be posted on the ETA website. The surveys, and the reports they inform, will help ETA better understand whether and how employment, training, education, and parenting services can and should be implemented at the state and local levels to assist young parents and expectant youth. The survey results will help ETA, states, and local workforce investment areas to identify potential problems/challenges to effectively serving parenting youth. These results will also inform possible changes to policy, strategies, and service delivery to enhance program performance and cost-effectiveness of services provided to young and expectant parents through a wide variety of local service providers including American Job Centers, workforce investment boards, community-based organizations, and educational/training institutions.




3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.

PTS. The PTS is a web-based information system used for gathering background information on participants and services received, as well as for executing random assignment for the evaluation. The use of this technology minimizes the burden on grantee staff to enter data and execute random assignment. YPDP grantee staff enter data into the PTS and used the system to randomly assign participants into the treatment and control groups (random assignment was completed in June 2014). In addition to being trained by the evaluation contractor on how to collect and enter data into the PTS, each site has been provided with a user manual fully documenting PTS data collection and entry procedures. A web-based training module on the YPDP SharePoint site maintained by the evaluation contractor is available to each YPDP site. This training module provides an overview and basic instructions for entering data into the PTS, both for new users and as a refresher for existing system users.

Follow-up Survey. Data collection for the telephone follow-up surveys will be accomplished using CATI. CATI systems collect responses 100 percent electronically. The systems also perform a number of functions to avoid errors, including:


  • providing correct question sequence;

  • automatically executing skip patterns based on prior question answers (which decreases overall interview time and consequently the burden on respondents);

  • recalling answers to prior questions and displaying the information in the text of later questions;

  • providing random rotation of specified questions or response categories (to avoid bias);

  • ensuring that questions cannot be skipped; and

  • rejecting invalid responses or data entries.


The CATI system lists questions and corresponding response categories automatically on the screen, eliminating the need for interviewers to track skip patterns and flip pages. Moreover, the interviewers enter responses directly from their keyboards, and the information is automatically recorded in the computer’s memory. CATI allows the computer to perform a number of critical quality assurance routines that are monitored by survey supervisors, including tracking average interview length, refusal rate, and termination rate by interviewer, and performing consistency checks for inappropriate combinations of answers.
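As an illustration of the skip-pattern and validation behavior described above, the hypothetical sketch below mimics, in greatly simplified form, how a CATI script asks only the applicable questions and rejects invalid entries. The question wording and response codes are invented for illustration; they are not drawn from the YPDP instrument.

```python
# Hypothetical, simplified sketch of CATI skip-pattern and validation logic.
def ask(prompt: str, valid: set) -> str:
    """Re-prompt until the interviewer enters a valid response code."""
    while True:
        answer = input(f"{prompt} {sorted(valid)}: ").strip()
        if answer in valid:
            return answer
        print("Invalid entry; please re-enter.")  # invalid responses are rejected

responses = {}
responses["employed"] = ask("Q1. Are you currently employed? (1=yes, 2=no)", {"1", "2"})
if responses["employed"] == "1":      # skip pattern executed automatically:
    responses["wage_band"] = ask(     # the wage question is asked only of the employed
        "Q2. What is your hourly wage band? (1=under $10, 2=$10-$15, 3=over $15)",
        {"1", "2", "3"},
    )
```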




4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


The information being recorded in the PTS for the YPDP evaluation is not otherwise available in the format required for conducting secure and accurate random assignment in a systematic manner across sites. Each grantee submits quarterly reports to ETA; however, these summary program reports present financial information only and exclude the participant details needed to conduct an impact analysis and the program descriptive information needed to conduct a process/implementation analysis. With the exception of earnings data available from the Unemployment Insurance wage record system maintained by states (which will be an additional source of data used in the YPDP evaluation effort), there are no other sources of information that could be used in place of the PTS and the 18-month follow-up survey to systematically assess outcomes for YPDP treatment and control group members. The available administrative data do not permit systematic and timely analyses of differences between treatment and control groups on, for example, hourly wages, number of hours worked each week, views on the helpfulness of services, changes in housing status, changes in family composition, views on the quality of mentoring services received, and other topical areas that are the focus of the follow-up survey. The existing data are not sufficient to report on the impacts of YPDP services and the extent to which such services make a statistically significant difference in terms of employment, earnings, educational achievement, and parenting outcomes.




5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


The PTS, YPDP implementation field-based data collection, and the 18-month follow-up survey do not impact small businesses or other small entities (other than those that are YPDP grantees). The impact on grantee organizations is minimal, involving staff input of participant information into the web-based PTS (about one hour of staff time per month) and meeting with the evaluation team during the site visits (about 45 minutes for each of up to eight staff members and/or administrators per site).




6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


PTS. The consequences of not collecting information through the PTS are: (1) the integrity of the random assignment process would be compromised if grantees were to handle this activity directly; and (2) valid and comparable data would not be available for analysis across the YPDP sites relating to numbers and types of individuals served, services received, and outcomes.


Implementation Site Visits. The consequence of not conducting the field-based implementation site visits and process analysis is a lack of in-depth information about the nature of the strategies developed and employed at YPDP sites to improve the employment, education, and training outcomes of young parents. Site visits will provide an opportunity to fully document the services being delivered to treatment and control group members, which is an essential part of an experimental design for understanding, for example, whether employment outcomes at various points after random assignment are potentially associated with the varying services received by treatment and control group members. If there are positive net impacts for the treatment group, it will be vital to understand the specific intervention(s) received by treatment group members so that they could potentially be replicated by other employment and training programs serving parenting youth.


Follow-up Survey. Without the follow-up survey, ETA would not have the full range of outcome information needed to gauge the net impacts of YPDP services on treatment group members. For example, the follow-up surveys conducted 18 months after random assignment will allow for systematic comparison of employment, earnings, education, and parenting outcomes for the treatment and control group members to determine net impacts of the YPDP. The surveys will also provide valid and reliable data for understanding variation in types of services received by YPDP participants and the views of YPDP participants concerning the usefulness of services received.


Overall, the federal investment of resources in the YPDP and the requirement to conduct a rigorous evaluation of the implementation strategies and their impacts require the systematic collection of program and participant data, as intended under the relevant sections of WIA and now WIOA (see Attachment D). The integrity of the random assignment process requires a secure and technically sound tracking system. With significant expenditures involved in implementing the YPDP, it is critical to document the different models and projects operating under the initiative, examine and assess the implementation to date, and identify innovative features and potentially promising strategies. The site visits are critical to this evaluation project, as they represent the only opportunity to gather comprehensive information on implementation from all grantees. The survey is an essential component of the evaluation project because it enables the evaluation team to systematically assess and identify which of the strategies have been useful for the participants and to associate service participation with the outcome measures of interest. Given the significant expenditures involved in implementing employment, training, and other human service programs for parenting youth, as well as the importance that Congress, ETA, and the public place on cost-effectively serving parenting youth, the proposed data collection effort is of critical importance.




7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* Requiring respondents to report information to the agency more often than quarterly;

* Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* Requiring respondents to submit more than an original and two copies of any document;

* Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* In connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* Requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* Requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


There are no special circumstances that would cause this information collection to be conducted in any manner listed above.




8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden. Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.

Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years—even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


The original notification of this data collection was published in the Federal Register, Vol. 76, No. 50 (Tuesday, March 15, 2011), pp. 14099–14100, a copy of which is in Attachment E. Readers were given 60 days from the date of publication to submit comments. No comments were received.


The notification of information collection, reinstatement with change, was published in the Federal Register (Vol. 80, p. 39161) on Wednesday, July 8, 2015, a copy of which is in Attachment F. Readers were given 60 days from the date of publication to submit comments. ETA received one comment, from Jean Public, expressing a dislike for all public assistance programs, including job training programs; the comment did not address anything specific to the information collection. A copy of the public comment is included in Attachment I.




9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


Grantee administrators and staff using the PTS or being interviewed during site visits will receive no payments or gifts. Respondents to the follow-up surveys will receive a $25 gift card upon completion of the survey. Research suggests that such incentive payments can make a critical difference in boosting response rates in telephone surveys. The likelihood of survey cooperation increases when respondents perceive a benefit from the survey and the costs of completing the survey are not burdensome. Research suggests that incentives may persuade sample members who otherwise have little reason to participate in the survey. Providing a small incentive adds to the perceived benefits of completing the survey and to the legitimacy of the request. A “token” monetary incentive works best because larger amounts are likely to be confused with a payment (Singer, 2002; Dillman, 1978). Although we anticipate that most participants in the YPDP will have at least some interest in the survey topic due to their participation in the demonstration program, we also anticipate that some will be reluctant or skeptical or have time constraints. We suggest $25 as sufficient to entice reluctant respondents, to add legitimacy to the request for those who may have time constraints, and to compensate respondents for the time and effort expended in responding to the survey.4




10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


No statutory assurances of confidentiality are provided for data entered into the PTS or to respondents interviewed during site visits; however, respondents will be informed that the highest standards will be applied to protect the privacy of all data collected, as described below.


PTS. Information on individual participants entered into the PTS is subject to the highest standards of privacy. (See Attachment A for details of the system and screen shots of the PTS, including the specific data elements being collected through the system on each YPDP participant.) The web-based application resides on a standard Windows server running a web service such as IIS or Apache that is physically located in a monitored, access-controlled, secure server room at the Urban Institute in Washington, DC. The web server has been hardened using a best-practice security hardening checklist and is dedicated solely to the PTS application. Administrator access to the database server is restricted to an authorized Urban Institute server administrator. Accounts on the web server are password protected. Passwords are at least 8 characters long, contain at least 1 special character and 1 number, and do not contain dictionary words; these requirements are enforced upon account creation. Passwords expire every 90 days, and users must create new passwords that fulfill the requirements of the password policy. A dedicated Secure Sockets Layer (SSL) certificate is employed for this project, and all non-encrypted access to the application is restricted. Logging and output files do not contain personally identifiable information (PII) and are limited to generic system and error data. All data transmitted between the grantee sites and the application server travels over the encrypted SSL connection. Data extracts for use by the project team are encrypted with the Pretty Good Privacy software and made available to project team members on the Urban Institute’s Secure File Transfer Protocol site.

The system segregates user data into PII data and project-specific data. PII data such as Social Security Numbers (SSNs) are stored in a separate database table keyed by a system-generated ID number, with the SSN stored in encrypted form. PII data are entered into the system but at no point are displayed or downloadable by users of the system. PII data are stored separately from project-specific data and are available for updating only by the grantee that originally entered the data. Project-specific data are available to the project team in specific extracts and reports.
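A minimal sketch of this segregation design is shown below, using an in-memory SQLite database purely for illustration (the production system uses a SQL Server database, and its actual schema and encryption routines are not reproduced here; the table and column names are invented):

```python
# Illustrative sketch of the PII-segregation design described above:
# identifying data live in a separate table keyed by a system-generated ID,
# with the SSN stored only in encrypted form.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE pii (
        participant_id INTEGER PRIMARY KEY,  -- system-generated ID
        ssn_encrypted  BLOB NOT NULL         -- SSN never stored in the clear
    );
    CREATE TABLE project_data (
        participant_id INTEGER REFERENCES pii(participant_id),
        study_group    TEXT,                 -- 'treatment' or 'control'
        employed_18mo  INTEGER               -- example outcome field
    );
""")
# Analyst extracts would draw only on project_data; the pii table is neither
# displayed nor downloadable, mirroring the access rules described above.
```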

Implementation Site Visits. The administrators and staff interviewed by evaluators on-site will be assured that their responses will not be attributed to them individually in any reports, nor will interview notes be shared with ETA. Individuals will be interviewed separately and in private offices. (See the interview guide in Attachment B for the statement that will be used during site visits to assure respondents of privacy.) To preserve privacy, paper copies of interview notes will be secured in a locked file cabinet. If any notes are recorded on laptop computers, such notes will be stored in a Structured Query Language (SQL) Server database located in an access-controlled server room at the Urban Institute.


Follow-up Survey. The YPDP participants surveyed are assured that their responses will be kept private within the limits of the law (see survey introduction in Attachment C which provides assurance of privacy for respondents). The survey data are stored on an evaluation contractor computer that is protected by a firewall that monitors and evaluates all attempted connections from the Internet. PII data on each survey respondent are maintained in a separate data file and apart from the survey data so that it is not possible to link particular responses to individual respondents. Once the evaluation is completed, all PII data on each respondent will be destroyed. The entire survey database is encrypted so that any data stored will be further protected. Finally, access to any data with identifying information is limited only to contractor staff directly working on the survey. All findings in any written materials or briefings will be presented at the aggregate level and it will not be possible to link specific responses to individual respondents in any way. The contractor will not include any identifying information such as names, addresses, telephone numbers, or SSNs in the database delivered to ETA. The basis for the assurance of privacy for the 18-month follow-up survey is the Privacy Act.




11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


PTS. The PTS, as detailed in Attachment A, records individual identifying information of a sensitive, personal, and private nature, including (1) last and first name; (2) SSN; (3) date of birth; (4) ethnicity and race; (5) marital status; (6) whether the individual is pregnant or expecting a child; (7) number of children; (8) whether the individual is a Temporary Assistance for Needy Families (TANF) or Supplemental Nutrition Assistance Program (SNAP) recipient; (9) name and contact information for three individuals who can be contacted if the program cannot locate the individual; and (10) hourly wage at 6, 12, and 18 months after random assignment.


The reasons why these questions are considered necessary, and the specific uses to be made of this information:


  • The last and first names are needed by YPDP site staff to be able to effectively use and update the data system in a reliable, efficient, and user-friendly manner (e.g., to be able to easily locate an individual’s record and update the record with service and outcome data).


  • The SSN of each YPDP participant is needed by YPDP grantee sites for performance management purposes. The SSN data are immediately encrypted upon entry into the PTS. SSN is included in the PTS to enable sites to use the PTS as a single, stand-alone automated case management system. In addition, the collection of SSN will allow the possibility (at a later date) for individual grantee sites to match individual participants with other administrative data (such as data they may maintain in other data systems managed by the grantee or with other external administrative records systems, such as unemployment insurance wage record data collected by state workforce agencies).


  • Other data of a sensitive nature (including date of birth, ethnicity, race, marital status, whether the individual is pregnant or expecting a child, number of children, and whether the individual is a TANF or SNAP recipient) are needed to support detailed analyses of the types of youth in the treatment and control groups receiving YPDP services, as well as to conduct analyses of services received and educational/employment outcomes by various subgroups. In addition, YPDP sites require data such as the date of birth, number of children, and whether the individual is pregnant or expecting a child to determine whether individuals are eligible to participate in YPDP (e.g., are between 16 and 24 years of age and are parenting or expecting a child).


  • Name and contact information for up to three individuals who know the participant are necessary so that YPDP site administrators and staff can locate the participant should they lose contact, to help ensure the program is able to continue to provide YPDP services, to reduce attrition from the program, and to help sites locate participants to collect outcome and exit information (e.g., employment status at 6, 12, and 18 months after random assignment).


  • Hourly wages for each YPDP participant at 6, 12, and 18 months after random assignment are needed as an important short-term employment outcome measure.


Prior to an individual’s participation in YPDP, demonstration site staff provide a description of the YPDP program initiative and of the informed consent and random assignment processes. All applicants must read and sign an informed consent form prior to being randomly assigned. If the individual is younger than 18 years of age, her/his parent must also sign this form. A copy of the informed consent form for YPDP participants is attached at the end of Attachment A, along with a sample of the PTS forms being used. All participants are also informed that the collection of data for the PTS is voluntary and that they are not required to answer any of the questions on the PTS forms.


Implementation Site Visits. There are no data of a sensitive, personal, or private nature collected in the site visit interview guide.


Follow-up Survey. Several questions of a sensitive, personal, and private nature are asked as part of the 18-month follow-up survey. The following questions may be considered sensitive, personal, or private by participants (see Attachment C for the specific questions): regular hourly pay for the participant’s current or most recent job (Questions 32 and 33); receipt of cash assistance from a state or county welfare program (Question 37), type of assistance received (Question 38), and months of cash assistance received over the past 18 months (Question 39); receipt of SNAP or food stamp benefits (Question 40); receipt of Medicaid or a similar state health program (Question 44) and whether children were covered by Medicaid or a similar state health program (Question 45); number of people living with the participant and changes in family composition over the past 18 months (Questions 48 through 52); marital status and whether it has changed over the past 18 months (Questions 53 and 54); whether the individual is currently living with her/his spouse or partner (Question 55) or her/his parents (Question 56); how many biological children the participant has living in her/his household (Question 57) and whether and how many children live elsewhere (Questions 58 and 59); whether the participant receives or pays child support and, if so, the number of payments and amounts received over the past 18 months (Questions 60 through 65); the extent of the participant’s relationship and engagement with her/his children (Questions 66 through 69); whether the participant was unable to pay her/his mortgage, rent, or utility bills during the past 18 months (Question 73) and whether the participant or her/his children moved in with other people during the past 18 months because of inability to pay mortgage, rent, or utility bills (Question 74); and the combined income of all members of the participant’s family during the past 18 months (Question 77).


Each of the data items identified above to be collected at 18 months after random assignment is necessary to evaluate differences in critical outcomes between parenting youth enrolled in the treatment and control groups. The data permit analyses of impacts between the control and treatment groups on a range of outcome variables that would not otherwise be available without the follow-up survey data, including changes in hourly wages, earnings, and family income; receipt of public assistance; marital status and family composition; and the ability to meet the food and housing needs of the family. Combined with data collected as part of the PTS, it is possible to conduct subgroup analyses of participant outcomes (e.g., by participant characteristics at the time of random assignment). As part of the informed consent process prior to random assignment, participants are asked to provide consent for the collection of follow-up data and are informed of the likelihood that one or more follow-up surveys will be conducted. They are also informed at the time of random assignment that such follow-up surveys are voluntary on their part and that they may decline to answer some or all of the survey questions. Additionally, as reflected in the introduction to the follow-up survey, participants are again informed that the survey is voluntary and that their responses to each question will remain private within the limits of the law (see the introduction to the follow-up survey in Attachment C for the explanation of the purposes of the survey, its voluntary nature, and the assurances of privacy to be provided to each survey respondent).




12. Provide estimates of the hour burden of the collection of information. The statement should:

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under “Annual Cost to Federal Government.”


Table 2 below provides the estimate of the respondent hour burden for (1) collecting data from treatment and control group participants for entry into the PTS; (2) conducting interviews with YPDP site administrators and staff during the process/implementation site visits to the four YPDP sites; and (3) conducting the follow-up surveys with YPDP treatment and control group participants in mentoring sites at 18 months after random assignment into YPDP. Across these data collection efforts, the total hour response burden is 2,116 hours.


The estimate of the hour burden for the collection of participant data for the PTS is 1,633 hours. Enrollment in YPDP is expected to be 1,633 treatment and control group participants. Data will be collected and entered into the PTS for all YPDP participants at six points in time, averaging approximately 10 minutes each: (1) at intake; (2) at random assignment; (3) to record service receipt; and (4) to record employment outcomes at 6, 12, and 18 months after random assignment. The total hour burden per YPDP participant is therefore estimated at one hour. With PTS data being collected on all YPDP participants (a 100 percent response rate), the total estimated response burden for the PTS data collection is 1,633 hours.


The Implementation Field Site Visit Discussion Guide will be used to interview an average of eight administrators and staff twice in each of the four YPDP sites, a total of 32 respondents (64 responses). The estimated response rate is 100 percent, since, when arranging for the site visits, evaluators will confirm scheduled times for interviewing key administrators and staff. The estimated response time for the interviews is an average of 45 minutes. The total estimated response burden for the site visits is 48 hours.


The 18-month follow-up survey will be conducted with a total of 1,633 participants at the YPDP sites with mentoring bump-up services. The estimated response rate is 80 percent, yielding valid responses from a total of 1,306 YPDP participants. The estimated response time for the survey is 20 minutes. This will result in a total estimated response burden of 435 hours for the 18-month survey.
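The burden figures above follow from simple arithmetic; the short sketch below reproduces the calculations using only the estimates stated in this section:

```python
# Worked check of the burden estimates stated above (and shown in Table 2).
pts_hours    = 1633 * 6 * 10 / 60   # 1,633 participants x 6 entries x 10 minutes
visit_hours  = 8 * 4 * 2 * 45 / 60  # 8 staff x 4 sites x 2 visits x 45 minutes
survey_hours = 1306 * 20 / 60       # 1,306 completions x 20 minutes (~435 hours)
total_hours  = pts_hours + visit_hours + survey_hours
print(pts_hours, visit_hours, round(survey_hours), round(total_hours))
# -> 1633.0 48.0 435 2116
```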


The total annualized cost to respondents is also presented in Table 2 below. The total estimated cost for these data collection activities is $44,210, updated using more recent hourly wage averages from the Bureau of Labor Statistics (BLS) and reflecting the elimination of the 36-month follow-up survey. The average hourly wage in that table for the PTS and the follow-up survey is $20.81, based on the BLS average hourly earnings of production and nonsupervisory employees on private, service-providing, nonfarm payrolls (January 2015, U.S. Department of Labor, Bureau of Labor Statistics, available on the department’s website). The hourly wage in that table for the interviews with YPDP administrators and staff during the YPDP site visits is $24.48, based on the BLS average hourly earnings of all employees on private, service-providing, nonfarm payrolls (January 2015, U.S. Department of Labor, Bureau of Labor Statistics, available on the department’s website).


TABLE 2: TOTAL BURDEN OF THIS INFORMATION COLLECTION


Data Collection Activity | Number of Respondents | Frequency | Total Annual Responses | Time Per Response | Total Annual Burden (Hours) | Hourly Rate | Monetized Value of Respondent Time
PTS – Data Collection | 1,633 | 6 | 9,798 | 10 minutes | 1,633 | $20.81 [1] | $33,983
Site Visit Interviews | 32 | 2 | 64 | 45 minutes | 48 | $24.48 [2] | $1,175
18-Month Survey | 1,306 | 1 | 1,306 | 20 minutes | 435 | $20.81 | $9,052
Unduplicated Totals | 2,971 | | 11,168 | | 2,116 | $20.81 | $44,210

[1] U.S. Department of Labor, Bureau of Labor Statistics, Table B-8. Average hourly and weekly earnings of production and nonsupervisory employees on private nonfarm payrolls by industry sector, seasonally adjusted (accessed as of January 2015 at http://www.bls.gov/news.release/empsit.t24.htm).

[2] U.S. Department of Labor, Bureau of Labor Statistics, Table B-3. Average hourly and weekly earnings of all employees on private nonfarm payrolls by industry sector, seasonally adjusted (accessed as of January 2015 at http://www.bls.gov/news.release/empsit.t19.htm).




13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collections services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


The proposed data collection for the PTS, YPDP on-site visits, and follow-up survey will not require respondents to purchase equipment or services or to establish new data retrieval mechanisms. There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection. The field-based implementation/process data collection involves semi-structured interviews covering staff and administrators’ descriptions of services and service delivery, as well as their experiences, opinions, and factual information. The follow-up survey content is based on the respondents’ experiences, opinions, and factual information. Therefore, the only cost to respondents is the time spent being interviewed or entering data into the PTS. These costs are captured in the burden estimates provided in A.12.


(a) We do not expect any total capital and start-up costs.

(b) We do not expect extensive time spent on generating, maintaining, and disclosing or providing the information.




14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.


The only cost to the Federal government associated with this data collection is the overall cost of the contract with Capital Research Corporation and the Urban Institute to conduct the process and impact evaluations of the Young Parents Demonstration Project. The total cost of the YPDP evaluations, to be conducted over a five-year period, is $5,570,000. Dividing the five-year total by five yields an average annual cost of $1,114,000. The costs of the three data collection efforts that are the focus of this supporting statement (i.e., development/implementation of the PTS, conduct of implementation site visits, and conduct of the 18-month follow-up survey) are included in these costs, as are (a) design, implementation, and monitoring of random assignment in the sites, and (b) data analysis and preparation of reports and briefing documents. Table 3 provides an overview of the annual costs of the entire evaluation effort to the federal government.


TABLE 3: TOTAL ANNUALIZED COSTS TO THE FEDERAL GOVERNMENT

Year     Dates        Costs
1        2011-2012    $1,114,000
2        2012-2013    $1,114,000
3        2013-2014    $1,114,000
4        2014-2015    $1,114,000
5        2015-2016    $1,114,000
Total                 $5,570,000


The complete cost of the three data collection activities that are the subject of this submission (i.e., development/implementation of the PTS, conduct of implementation site visits, and conduct of the 18-month follow-up surveys), including contract staff salaries, is estimated at $4,456,000, or 80 percent of the total evaluation budget. Annualized over the five-year period, this amounts to $891,200 per year.
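As a simple arithmetic check on these figures (all values taken from the totals above):

$$0.80 \times \$5{,}570{,}000 = \$4{,}456{,}000, \qquad \$4{,}456{,}000 \div 5 = \$891{,}200.$$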




15. Explain the reasons for any program changes or adjustments reported on the burden worksheet.


This is a request to continue data collection for this project. The burden and cost estimates above have changed for two reasons. First, the evaluation will no longer conduct the 36-month follow-up survey. Second, the evaluation team determined it important to conduct a second round of site visits to track any changes in implementation since the first visits (conducted approximately 12 months after random assignment) and to clarify information as needed. These visits will help ensure that the evaluation can fully explain the programs' implementation and the impacts observed for YPDP participants.




16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


The analysis for the YPDP evaluation will focus on the qualitative field-based implementation analysis, descriptive summary tabulations of participant data drawn from the PTS, and estimation of net impacts of YPDP services based on data collected from the follow-up survey and Unemployment Insurance wage record data. Together, the quantitative and qualitative analyses will describe the structure, operations, and service delivery of the YPDP grant-funded programs; document the pre-existing services and the mentoring interventions; present tables showing the demographic characteristics of treatment and control group members; and estimate net impacts of YPDP services on participant employment, earnings, education, and parenting outcomes. Because the YPDP evaluation employs an experimental design, with YPDP participants randomly assigned to treatment and control groups, the analysis of net impacts will be statistically straightforward (e.g., using t-tests to determine whether treatment and control group means differ significantly on outcomes such as percent employed or total earnings at 18 months after random assignment). Data will be presented in summary formats such as tables, charts, and graphs that allow ETA and other stakeholders to easily understand the range of YPDP programs, the bump-up models employed and their implementation, the number and characteristics of YPDP participants, service utilization, participant outcomes, and net impacts.
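For illustration, the net impact comparison described above amounts to a standard two-sample t-test. In the notation below (ours, not drawn from the evaluation plan), the t-statistic for an outcome such as total earnings at 18 months is

$$ t = \frac{\bar{y}_T - \bar{y}_C}{\sqrt{\dfrac{s_T^2}{n_T} + \dfrac{s_C^2}{n_C}}} $$

where $\bar{y}_T$ and $\bar{y}_C$ are the treatment and control group means, $s_T^2$ and $s_C^2$ the corresponding sample variances, and $n_T$ and $n_C$ the group sizes. The estimated net impact itself is simply the difference in means, $\bar{y}_T - \bar{y}_C$.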


The time schedule for the project, including beginning and end dates of the collection of information and completion of reports, is as follows (reports are marked with an asterisk):


YPDP Program or Evaluation Activity                            Completion Date
YPDP Grants Issued                                             6/30/2011
Enrollment Began                                               2/1/2012
Site Visits Begin                                              6/30/2013
18-Month Follow-up Survey Started                              9/30/2013
Enrollment Ends                                                6/30/2014
Site Visits Completed                                          8/30/2015
18-Month Follow-up Survey Completed                            3/31/2016
Final Report Submitted (PTS, Site Visits, 18-Month Survey)*    6/30/2016


Preliminary analysis of data collected through the PTS, site visits, and survey will begin shortly after each of these data collection efforts is initiated. For example, as data are entered into the PTS, it will be possible to run frequency distributions and cross-tabulations to preliminarily examine participant characteristics, service utilization, and interim outcomes. Similarly, because the follow-up survey is administered through a CATI system, it will be possible to tally results on each survey question as the survey sample builds. Once all YPDP participants have been enrolled, fully served, and exited from YPDP services, and follow-up data on employment have been entered into the system, the evaluation team will be able to conduct complete analyses of the PTS data.
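As a minimal sketch of the kind of preliminary tabulation described above (illustrative only; it assumes a hypothetical CSV export of PTS records with columns named "group" and "employed", neither of which is specified in the evaluation plan):

    # Illustrative only: tabulate a hypothetical export of PTS participant records.
    import pandas as pd

    # Load the hypothetical PTS export (file name is an assumption).
    pts = pd.read_csv("pts_export.csv")

    # Frequency distribution of study group assignment (treatment vs. control).
    print(pts["group"].value_counts())

    # Cross-tabulation of an interim employment outcome by study group,
    # shown as within-group proportions.
    print(pd.crosstab(pts["group"], pts["employed"], normalize="index"))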


A final evaluation report is planned. The report, providing results based on the site visits, the PTS, and the 18-month follow-up survey, will be submitted by June 30, 2016. The final report will categorize grant strategies and bump-up models; analyze enrollment and participant characteristics; assess implementation successes and challenges; analyze service utilization/receipt; provide in-depth analyses of participant outcomes, both overall and by YPDP site; estimate net impacts at mentoring sites only; and provide study conclusions and recommendations. The final report will be submitted to ETA in Microsoft Word and PDF formats (for publication on the ETA website). In addition, public use data files, stripped of individual identifiers, will be submitted to ETA with the final report; these files will include (1) the participant data entered into the PTS and (2) the 18-month follow-up survey data.




17. If seeking approval not to display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The OMB approval number and expiration date will be displayed or cited on all information collection instruments.




18. Explain each exception to the topics of the certification statement identified in "Certification for Paperwork Reduction Act Submissions."


There are no exceptions to the certification statement.




1 U.S. Department of Labor, Employment and Training Administration. 2008. “Young Parents Demonstration Program (YPDP) SGA/DFA PY 08-08,” Federal Register, Vol. 73, No. 193, October 3, 2008 (available over the Internet at: http://edocket.access.gpo.gov/2008/pdf/E8-23319.pdf).

2 Workforce Investment Act of 1998, Title I, Subtitle D, Section 171(b), Public Law 105-220. See Attachment D for the authorizing language in Section 171 that authorized the development of a demonstration project for young parents.

3 Workforce Investment Act of 1998 (WIA), Title I, Subtitle D, Section 172, Public Law 105-220, and Workforce Innovation and Opportunity Act of 2014 (WIOA or The Act), Title I, Subtitle D, Section 169, Public Law 113-128. See Attachment D for the authorizing language on evaluations in Section 172 of WIA and Section 169 of WIOA.

4 Dillman, D. A. (1978), Mail and Telephone Surveys: The Total Design Method, New York: John Wiley and Sons, Inc.; Singer, Eleanor (2002), "The Use of Incentives to Reduce Nonresponse in Household Surveys," in Groves et al. (eds.), Survey Nonresponse, New York: Wiley and Sons. Additional studies assessing the utility of incentive payments for boosting survey response rates include: Singer, Eleanor, John Van Hoewyk, Nancy Gebler, Trivellore Raghunathan, and Katherine McGonagle (1999), "The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys," Journal of Official Statistics 15: 217-230; and Singer, Eleanor, John Van Hoewyk, and Mary P. Maher (2000), "Experiments with Incentives in Telephone Surveys," Public Opinion Quarterly 64: 171-188.
