YouthBuild Evaluation Participant Follow-up Survey, Extension

OMB Control No.: 1205-0503

October 2015

Part A: Supporting Statement


The Impact Evaluation of the YouthBuild Program is a seven-year experimental design evaluation funded by the U.S. Department of Labor (DOL), Employment and Training Administration (ETA), with initial funding support from the Corporation for National and Community Service (CNCS). YouthBuild is a youth and community development program that addresses several core issues facing low-income communities: education, employment, crime prevention, leadership development, and affordable housing. The program primarily serves high school dropouts and focuses on helping them attain a high school diploma or general educational development (GED) certificate and teaching them construction skills geared toward career placement. The evaluation will measure core program outcomes including educational attainment, postsecondary planning, employment, earnings, delinquency and involvement with the criminal justice system, and youth social and emotional development.

The evaluation contract was awarded to MDRC in June 2010. The evaluation began in the fall of 2011 and is scheduled to continue until June 2017. MDRC is the prime contractor; Mathematica Policy Research (Mathematica) and Social Policy Research Associates (SPR) are subcontractors that assisted MDRC with designing the study and implementing random assignment. They will continue to assist with analyzing data collected for the study and reporting the study’s findings. The YouthBuild evaluation design includes an impact component, an implementation component and a cost-effectiveness component. A total of 75 YouthBuild grantees are participating in the evaluation, including 58 programs that received DOL funding in 2011 and 17 CNCS-funded programs that did not also receive DOL funding during the same period.

The evaluation will assess YouthBuild’s operation, participation by youth, and impact on youth’s education, employment, criminal and personal development outcomes. It will also determine the net cost of the impacts generated.

DOL has submitted several requests to the Office of Management and Budget (OMB) as part of the YouthBuild evaluation (see Table A.1). The full package for this study has been submitted in separate parts because data collected through the evaluation’s initial stages informed the development of the subsequent data collection instruments. On June 15, 2011, OMB approved DOL’s request to administer a grantee questionnaire to YouthBuild programs (see ICR Reference #201005-1205-002), designed to provide basic information about how YouthBuild programs are structurally managed and operated relative to other youth training and education programs. The information was to be collected from all 2011 DOL- and CNCS-funded YouthBuild grantees to provide initial information about operations and to develop sufficient information to select a subset of grantees to participate in the impact component of the evaluation.

A second clearance request, to collect baseline and program service and activity data from YouthBuild participants, was approved on April 18, 2011 (see ICR Reference #201008-1205-002). This information was collected via a web-based management information system (MIS) and is a key component of the departmental management of the program that supports case management and performance reporting. The MIS allows program operators to track services, outcomes and random assignment status of youth participating in the YouthBuild evaluation. On March 13, 2012, OMB approved DOL’s request to administer a YouthBuild grantee questionnaire (see ICR Reference #201108-1205-005) to all programs selected to participate in the impact component of the evaluation. A fourth clearance request, which established protocols for conducting YouthBuild site visits, was approved on November 2, 2012 (see ICR Reference #201202-1205-002).

Finally, on December 18, 2012, OMB approved DOL’s request to administer three follow-up surveys (at 12, 30 and 48 months) with youth who were randomly assigned to either the treatment group or control group in the 75 sites selected for the impact component of the evaluation (see ICR Reference #201208-1205-007).

This package requests clearance for a revision to the OMB approval to administer the follow-up surveys, which expires on December 31, 2015. While both the 12- and 30-month follow-up surveys will be completed under the existing approval, some of the 48-month data collection will extend beyond the end of 2015; we are therefore requesting approval through June 2018, the end of our total contract period. In addition, minor modifications are proposed for the 48-month instrument. Those changes were tested, and a summary of pretest findings can be found in Appendices G1 and G2. More specifically, this request seeks clearance for the continued collection of the 48-month follow-up survey (Appendices A and A3), which is discussed in detail below, including the addition of five questions focused on social and emotional development and the removal of 16 questions designed to facilitate future contact (since this is the last survey administration and further contact will not be necessary). This package includes the 48-month follow-up survey and related respondent materials, specifically:

  1. Survey Instrument – Computer Assisted Telephone Interviewing Version, with proposed revisions (Appendix A)

  2. Survey Instrument – Computer Assisted Telephone Interviewing Version with track changes noting changes from previously approved survey (Appendix A1)

  3. Informed Consent Form (Appendix A2)

  4. Survey Instrument – Web Version, with proposed revisions (Appendix A3)

  5. Survey Instrument – Web Version with track changes noting changes from previously approved survey (Appendix A4)

  6. Participant Advance Letter (Appendix B)

  7. Participant Advance Email (Appendix B1)

  8. Participant Interim Letter (Appendix C)

  9. Workforce Investment Act—Section 172 (Appendix D)

  10. Workforce Innovation and Opportunity Act – Section 169 (Appendix D1)

  11. Frequently Asked Questions (Appendix E)

  12. 60-Day Federal Register Notice (Appendix F)

  13. Summary of Pretest Findings 2012 (Appendix G)

  14. 48-Month Survey Pretest Findings 6/2014 (Appendix G1)

  15. 48-Month Survey Pretest Findings 11/2014 (Appendix G2)

  16. Certificate of Confidentiality (Appendix H)

  17. Social Media Outreach Protocol (Appendix I)

  18. 12-Month Incentive Experiment Findings (Appendix J)


Table A.1. OMB Clearance Requests

Request # | Data Collection | Date Approved | OMB Control #
1 | YouthBuild Grantee Site Selection Questionnaire | June 5, 2011 | 1205-0436
2 | YouthBuild Reporting System | April 18, 2011; rev. May 22, 2012 | 1205-0464
3 | Evaluation Grantee Survey | March 13, 2012 | 1205-0488
4 | YouthBuild Evaluation Site Visit Protocols | November 2, 2012 | 1205-0502
5 | Evaluation Participant Follow-up Survey | December 18, 2012 | 1205-0503
6 (current request) | Evaluation Participant Follow-up Survey, Extension | Pending | 1205-0503



A. Justification

1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.

The YouthBuild evaluation design includes an impact component, an implementation component and a cost-effectiveness component. The follow-up surveys are the key data source used to estimate the impacts of the program. Survey data on education and training participation, post-secondary enrollment, employment and earnings, criminal justice involvement, and social and emotional development—data that are not available from other sources—will inform DOL and CNCS of YouthBuild’s effects on a range of youth outcomes.

In addition, the evaluation will examine whether impacts vary with certain program features, such as program fidelity, length of mental toughness orientation (which varies from several hours to several weeks across programs), or strength of post-secondary services. We will use multi-level estimation methods for this analysis, where the unit of analysis is the individual at Level One and the site at Level Two. The site-level impact is then allowed to vary with site characteristics (e.g., implementation strength, program components, and service contrast).
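
To illustrate the structure of this analysis, the sketch below fits a two-level (mixed-effects) model using Python's statsmodels package. The data, variable names (site, treat, impl_strength), and specification are illustrative assumptions for exposition only, not the evaluation team's actual estimation code.

```python
# Illustrative two-level model: individuals (Level One) nested within sites
# (Level Two), with a random treatment slope by site and a cross-level
# interaction that lets the site-level impact vary with a site characteristic.
# All data and variable names below are synthetic and hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_sites, n_per_site = 75, 40
site = np.repeat(np.arange(n_sites), n_per_site)
impl_strength = rng.normal(size=n_sites)[site]      # Level Two covariate
treat = rng.integers(0, 2, size=site.size)          # random assignment flag
site_shift = rng.normal(scale=0.3, size=n_sites)[site]
y = 1.0 + 0.4 * treat + 0.2 * treat * impl_strength + site_shift \
    + rng.normal(size=site.size)

df = pd.DataFrame({"y": y, "treat": treat, "site": site,
                   "impl_strength": impl_strength})

# Random intercept and random treatment slope by site.
model = smf.mixedlm("y ~ treat + treat:impl_strength", df,
                    groups=df["site"], re_formula="~treat")
print(model.fit().summary())
```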

This evaluation of the YouthBuild Program is being carried out under the authority of the Workforce Investment Act, Section 172 (Appendix D), which states that “for the purpose of improving the management and effectiveness of programs and activities…the Secretary shall provide for the continuing evaluation of the programs and activities” (WIA, Sec. 172(a), 1998). The evaluation is also being carried out under the authority of the newer Workforce Innovation and Opportunity Act, Section 169 (Appendix D1), which states that “for the purpose of improving the management and effectiveness of programs and activities…the Secretary…shall provide for the continuing evaluation of the programs and activities under this title” (WIOA, Sec. 169(a)(1)(A), 2014).

2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

Clearance by OMB is currently being requested to complete administration of the 48-month follow-up survey (Appendices A and A3) and the advance and interim letters (Appendices B/B1 and C, respectively). The Frequently Asked Questions (FAQ) document (Appendix E) is available online for participants throughout the study. The data collected in the follow-up surveys will be used in the impact analyses. The FAQ document and the follow-up survey are described in detail below, along with how, by whom, and for what purposes the information collected will be used. The justification for the additional questions and the removal of the contact questions is also described.

a. Advance Letter

The participant advance letter (Appendix B) is sent to all study participants approximately two weeks prior to the email invitation (Appendix B1) that includes a link to the survey. The letter reiterates basic information about the study that sample members received before consenting to participate, including the study sponsorship, the voluntary and protected nature of their responses, the incentive payment, and contact information in case they have questions about the study or the survey. Information about OMB is provided on the back of the letter. The advance letter includes a separate enclosure that provides a user identification and password for accessing the online survey. The link to the survey is sent separately.

b. Interim Letter

Two months before the start of data collection, we send out a reminder letter asking sample members to call our toll-free number, email us, or return an enclosed card to update their contact information (Appendix C). We enclose a pre-paid monetary incentive ($2 bill) with this mailing. A growing body of research suggests that pre-paying sample members with as little as $2 remains an effective way to increase response rates. Beebe et al. (2004) found that a $2 pre-paid incentive increased response rates by close to 10 percent on a mail survey of over 9,000 Medicaid beneficiaries. The study included an oversample of racial and ethnic minorities and found that the incentive had a comparable effect across different racial and ethnic strata.1

c. Frequently Asked Questions

All sample members are provided with a user ID and log in information for the web survey. The FAQ document (Appendix E) is available online for participants for the duration of the study via the web link. The FAQ addresses basic questions that participants might have regarding the study, including who is being invited to complete the surveys, who is sponsoring the study, the voluntary nature of survey responses and the survey team’s plans for protecting the respondents’ information. Interviewers have copies of the FAQs and will be able to answer questions as needed.

d. Follow-up Survey

Approximately two weeks after receiving the advance letter, study participants receive an email with the link to complete the follow-up survey (Appendix A3) on the web. The follow-up surveys are the primary source of outcome data to assess program impacts. These data will inform DOL and CNCS on how YouthBuild programs affect program participants’ educational attainment, postsecondary planning, employment, earnings, delinquency and involvement with the criminal justice system, and youth social and emotional development.

The sample for the follow-up surveys includes 3,436 study participants (program and control group members) who were between the ages of 16 and 24 at the time they consented to participate in the study and were randomly assigned. Random assignment began in August 2011 and was completed in January 2013.2

We expect to achieve an 80 percent response rate, meaning that we will obtain completed surveys from 2,749 respondents to each round of the survey. In order to achieve at least an 80 percent response rate, we will employ the following approach, designed to maximize efficiency:

  • A multi-mode approach that will deploy the most cost-effective modes first. For the YouthBuild evaluation, we are implementing a multi-mode survey that begins on the web, and then moves to more intensive methods – Computer Assisted Telephone Interviewing (CATI) and in-person locating – as part of our nonresponse follow-up strategy. We provide a hardcopy questionnaire to any sample member who requests one but, increasingly, our studies of youth find a preference for completing surveys online. Mathematica recently conducted the College Student Attrition Project (CSAP) for the Mellon Foundation on which over 80 percent of the completed surveys were done online. While we do not anticipate as high a web-completion rate for the YouthBuild evaluation – the YouthBuild sample is more disadvantaged than the CSAP population – our approach emphasizes communicating with youth through the media that they prefer.3

  • A clear, streamlined survey instrument that will make responding easy and ensure accurate responses. The survey is designed to be as brief as possible, with clear, easy-to-answer questions (mostly closed questions with a few open-ended questions).

  • An official letter will gain attention and legitimize the study. An advance letter with log-in information is mailed to sample members, lending credibility to the request and further encouraging participation.

  • A staged outreach strategy for reluctant responders will result in a high conversion rate. Beginning in week 2, we send e-mail reminders to those who have not responded. We conduct follow-up telephone calls (in addition to e-mail reminders) to non-responders beginning in week 4. Mathematica telephone interviewers are trained in refusal conversion. Experienced, expert refusal converters are assigned to work with the reluctant responders to maximize conversion rates. Lastly, field locators attempt to find and gain cooperation from those sample members who cannot be reached by web or telephone. (Additional details and justification for our approach are provided in the description of the incentive experiment below.)

The data collected through the surveys will allow the evaluation team to measure the full range of outcomes associated with participation in the YouthBuild program. YouthBuild programs are based on a youth development framework that emphasizes academic achievement, vocational skills, and personal development as the primary means by which youth attain self-sufficiency through employment. Program participants receive a range of educational services that lead to a high school diploma or GED. These services include basic skills testing and instruction, remedial education, as well as assistance with post-secondary planning and financial aid. Program participants may be elected to represent their peers on a Youth Policy Council, a committee that is charged with making recommendations for improving the YouthBuild program; serve on a Youth Advisory Council; participate in public speaking events on behalf of the program; and gain valuable community service experience by volunteering for a local organization. In addition, participants spend almost half of their time in construction training – rehabilitating or building new housing for low-income and homeless families in their communities – or in other vocational skills training.

Based on our knowledge of the YouthBuild program, the study’s research objectives, and our review of the literature on educational and employment interventions for youth, the survey includes the following five core topics in each of the follow-up surveys:

  • Service Receipt: This section assesses respondents’ participation in education, training, employment and youth development activities, all of which will provide important information about the services that treatment and control group youth received during the study period (from YouthBuild or elsewhere). Information on service receipt is needed to measure rates of participation in YouthBuild and is also critical for accurately measuring the “treatment difference” between the two research groups.

  • Educational attainment: This section collects information on respondents’ highest level of education completed, post-secondary educational plans, and post-secondary enrollment. Increased educational attainment is one of the primary goals of the YouthBuild program.

  • Employment: This section gathers detailed characteristics of current or most recent job, job and earnings history, and job readiness. It is hypothesized that the vocational training and hands-on experience that YouthBuild provides will help participants obtain stable, well-paying jobs, either in construction or in other fields.

  • Criminal justice involvement and delinquency: This section collects information about respondents’ delinquent activities, arrests, convictions, and incarcerations. The YouthBuild program model focuses on social and emotional development and positive engagement with the community. In addition, some programs focus primarily on youth offenders and offer YouthBuild as an alternative to sentencing. It is expected that the program, by providing positive, caring adult role models, positive peers, and educational and vocational services, will result in a reduction in criminal behavior and delinquent activities.

  • Current living arrangements, marital status, and fertility: This section asks respondents about their current living situations, partnership arrangements, and pregnancy histories. These questions will measure the effects that the YouthBuild program has on respondents’ transitions into adulthood.

Each of the surveys also includes topic modules on one or more dimensions of adolescent development such as: social and emotional development, identity development, and health and well-being. YouthBuild is expected to positively influence each of these developmental domains. These modules are included, and identified, in the survey (Appendix A and A3).

For the 48-month survey, ETA would like to delete questions that are intended to allow the evaluation team to maintain contact with survey respondents for future surveys. Since this will be the final cycle of the survey, these questions are no longer necessary, and the added burden of posing them cannot be justified. ETA also proposes adding a few questions to the survey. The questions measure self-efficacy, community involvement and debt management, all of which will enhance the results of the evaluation by adding more nuanced final outcomes that the YouthBuild program attempts to influence. Further description of and rationale for adding these questions to the 48-month survey are provided below:

  • An abbreviated self-efficacy scale. This question includes a scale of six previously tested items, which seek to determine the study members’ belief in their ability to achieve results and be proactive, despite circumstances. Youth development experts note that programs like YouthBuild often have positive effects on youth self-efficacy and self-esteem. Further, self-efficacy has been shown to be a predictor of vocational outcome expectations and can influence career interests, goals, and behaviors. Adding these questions will help inform key evaluation outcomes.

  • Questions about involvement in community activities. Although the survey as administered at 12 and 30 months after enrollment into the evaluation asks study members about civic engagement and leadership roles, process study findings suggest that we were not capturing some roles study members may play in their communities. The two proposed questions capture not only study members’ commitment to the community but also their capacity for upward mobility and economic growth.

  • Debt management. The last proposed question asks whether study participants have repaid any money they owed. Given that assuming financial responsibility is an important step for young adults moving toward self-sufficiency, this question measures study members’ commitment to honoring their financial obligations.

3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

Advanced technology is used in data collection efforts to reduce burden on study participants. We administer a multi-mode survey that begins on the web, and then moves to more intensive methods – Computer Assisted Telephone Interviewing (CATI) and in-person locating – as part of our nonresponse follow-up strategy. The web and CATI surveys are identical (Appendix A shows the questions as they will be structured in both modes), and both have built-in verifications and skip patterns to ensure data are reliable and participants only answer questions that pertain to them. Web surveys reduce the amount of interviewer labor necessary to complete a data collection, and allow respondents to complete the questionnaire on their own schedule, in multiple sittings, and without having to return any forms by mail. We anticipate that not all study participants will be able to complete the survey on the web. For those sample members who do not complete the survey on the web, we attempt to complete it over the phone using CATI. For those sample members who cannot be located by telephone, Mathematica’s specialized locating staff uses searchable databases, directory assistance services, reverse directories, and contact with neighbors and community organizations to obtain current contact information. We also include searches of social network websites (e.g., Facebook and Twitter; please see the Social Media Outreach Protocol in Appendix I), community and college websites, and prison websites. If electronic locating yields no results, field locators attempt to locate sample members at their most recently known address(es). Once a sample member has been located, field locators, who are equipped with cell phones, can dial into Mathematica’s Survey Operation Center (SOC) so that the sample member can complete the CATI survey.

4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in item 2 above.

The YouthBuild evaluation strives to minimize data duplication. The survey focuses on information that is critical to the evaluation but not available from administrative records. For example, MIS data are available to measure participation in YouthBuild activities among program group youth. However, these data are limited to YouthBuild activities, and are not collected for youth in the control group. Assessing the program-control difference in participation in education and employment services is critical for the evaluation. Similarly, employment and earnings information may be available from administrative records,4 but these data do not provide information on job tenure, wages, and other measures of job quality. YouthBuild is also likely to affect youth’s involvement with the criminal justice system; those data will be available only through the follow-up surveys. Finally, information on social and emotional development is not available from any other sources.

5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.

Collection of follow-up information will not impact small businesses or other small entities.

6. Describe the consequences to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles in reducing burden.

The evaluation represents an important opportunity for DOL and CNCS to add to the growing body of knowledge about the impacts of second-chance programs for youth who have dropped out of high school, including outcomes related to educational attainment, postsecondary planning, employment, earnings, delinquency and involvement with the criminal justice system, and youth social and emotional development.

If the information collection is not conducted, Federal program or policy activities will not be informed by high quality information upon which to base critical decisions regarding the impacts of the YouthBuild program, nor will DOL know whether this program is one that has substantial impacts upon its participants. Since the program continues to be funded through various grantees across the country, and DOL may wish to continue funding programs targeted to high school dropouts, it is imperative that rigorous information on the impacts of this program is obtained. The follow-up surveys are the primary source of outcome data to assess program impacts, thus, not conducting the follow-up survey will limit the study team’s ability to assess the impact of YouthBuild programs.

7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

  • requiring respondents to report information to the agency more often than quarterly;

  • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • requiring respondents to submit more than an original and two copies of any document;

  • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

  • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • requiring the use of statistical data classification that has not been reviewed and approved by OMB;

  • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.

There are no special circumstances surrounding data collection. All data will be collected in a manner consistent with Federal guidelines.

8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.

a. Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995, the public was given an opportunity to review and comment through a 60-day Federal Register Notice, published on June 2, 2015 (80 FR 31418). A copy of this notice is attached as Appendix F. In response to the 60-day notice, a member of the public commented that the survey was unnecessary because they believe the YouthBuild program is without value, is too expensive, and is not effective in bringing about change in youth. While we have not made any changes to the survey as a result of this comment, the survey will play a significant role in helping the Employment and Training Administration determine whether the program’s impacts are, in fact, worth the costs, based on any observed changes in those youth who participated in the program.

b. Consultations Outside of the Agency

There have been no consultations on the research design, sample design, data sources and needs, or study reports with anyone outside the evaluation team (MDRC, Mathematica, and SPR). The 48-month survey, which is the subject of this package, has been discussed with staff in the DOL Chief Evaluation Office.

9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

The evaluation’s data collection plan includes respondent payments for completing the follow-up surveys. It is essential to include an incentive in order to maximize the response rate, and it is particularly important with a challenging population5 and a demanding data collection strategy. In a seminal meta-analysis, Singer, et al. (1999) found that incentives in face-to-face and telephone surveys were effective at increasing response rates, with a one dollar increase in incentive resulting in approximately a one-third of a percentage point increase in response rate, on average. They, as well as others, have found some evidence that incentives are useful in boosting response rates among underrepresented demographic groups, such as low-income and minority individuals.6 This is a significant consideration for this study of YouthBuild and the YouthBuild-eligible population. Several studies have found that the use of incentives in early rounds has a positive effect on cooperation rates in subsequent rounds.7 Generally, prepaid monetary incentives were found to be more effective than promised incentives or non-monetary incentives. However, Ferber and Sudman (1974) found that non-monetary, follow-up gifts appeared to be especially effective in panel surveys. Some survey researchers have argued that the use of incentives in panel studies creates an expectation of payment in subsequent rounds. Singer, VanHoewyk and Maher (1998) conducted a series of experiments to determine if incentive payments create expectation effects. They found that respondents who receive a monetary incentive in the first round of a study are more likely than non-incentive respondents to agree with the statement, “People should be paid for doing surveys like this,” but they were also more likely to participate in subsequent non-incentive waves of the survey.

To encourage youth to complete their survey early and online, we conducted an experiment during the 12-month follow-up survey that informed our data collection strategy for subsequent rounds. The experiment assessed the cost effectiveness of offering two different incentive amounts. Specifically, the experiment assessed whether a higher incentive amount encourages sample members to respond early in the field period and to respond via the web, the least costly mode of data collection. We randomly assigned all sample members to one of two incentive conditions: (1) the “early bird” condition and (2) the incentive control condition. The “early bird” group received $40 for completing their survey online within the first four weeks of the field period; thereafter they received $25 for completing the survey regardless of mode of completion. The control group received $25 regardless of when they completed the survey. Our analysis compared the number of completed surveys and the cost per completed survey within the “early bird” window with those completed later in the field period, broken out by incentive condition. We also examined the characteristics of respondents by mode and incentive condition to see if the “early bird special” draws in some subgroups that may otherwise be harder to reach. We found that the $40 incentive condition is associated with greater odds of completing early, reduced costs due to fewer cases being sent to the phones and the field, and potentially greater representativeness among respondents (although not all of these findings were statistically significant). Specifically, those who were offered the $40 incentive had 38 percent higher odds of completing their survey within the first four weeks, compared to those who were offered the $25 incentive. This finding remained significant after controlling for a host of demographic characteristics associated with non-response, such as gender, age, and race (OR = 1.38, p < .01). As a result, we continue to offer the “early bird special” during the 30-month follow-up survey data collection. We shared these findings, along with other findings from the 12-month survey, with ETA shortly after completion of the 12-month follow-up survey (Appendix J).
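
As a rough illustration of the analysis behind the reported odds ratio, the sketch below fits a logistic regression of early web completion on incentive condition and demographic controls. The data, variable names, and coefficients are synthetic assumptions for exposition, not the study's actual data or code.

```python
# Illustrative logistic regression for the incentive experiment: odds of
# completing the survey within the first four weeks, by incentive condition,
# controlling for demographics. All data below are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 3436
early_bird = rng.integers(0, 2, size=n)   # 1 = offered the $40 "early bird"
female = rng.integers(0, 2, size=n)
age = rng.integers(16, 25, size=n)

# Hypothetical data-generating process with a log-odds bump for the $40 offer.
logit_p = -0.8 + np.log(1.38) * early_bird + 0.1 * female - 0.02 * (age - 20)
completed_early = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

df = pd.DataFrame({"completed_early": completed_early,
                   "early_bird": early_bird, "female": female, "age": age})

fit = smf.logit("completed_early ~ early_bird + female + age", df).fit(disp=0)
print(f"Odds ratio for early-bird incentive: {np.exp(fit.params['early_bird']):.2f}")
```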

10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

Respondents’ privacy is protected to the extent permitted by law. Study participants are informed that their participation is voluntary, that information will be kept private to the extent permitted by law, and that information gathered will be presented in summary format and used only for statistical purposes. They are also informed that their answers will not affect their eligibility for any Federal, state or local government programs, or for receipt of benefits from such programs. In addition, the study has obtained a Certificate of Confidentiality from the U.S. government (available in Appendix H), meaning that we do not have to identify individual participants, even under court order or subpoena. This information was conveyed to participants in the Informed Consent Form, which was included in the ICR approved under Reference Number 201208-1205-007.

A range of measures described below will be followed to protect and safeguard the data that are collected.

a. Protection of Personal Information

Mathematica, under contract to MDRC, is responsible for administration of the survey. It is Mathematica and MDRC policy to protect personal information and data, in whatever media they exist, in accordance with applicable Federal and state laws and contractual requirements. All study participants receive unique identification codes, which are stored separately from personally identifying information. Consent forms are also stored separately from identification codes. Baseline data were entered in a database separate from the survey and administrative data, and these databases will never be merged. In conjunction with this policy, we require all staff members to:

  • Comply with a Confidentiality Pledge and Security Manual procedures to prevent the improper disclosure, use, or alteration of confidential information. Staff may be subject to disciplinary and/or civil or criminal actions for knowingly and willfully allowing the improper disclosure or unauthorized use of confidential information.

  • Access confidential and proprietary information only on a need-to-know basis when necessary in the performance of assigned duties.

  • Notify their supervisor, the Project Director, and the organizational Security Officer if confidential information has either been disclosed to an unauthorized individual, used in an improper manner, or altered in an improper manner.

  • Report immediately to both the Project Director and the organizational Security Officer all contacts and inquiries concerning confidential or proprietary information from unauthorized staff and non-research team personnel.

b. Protection of Data

The security protocols also cover all aspects of privacy for hard copy and electronic data. All hardcopy materials are shipped to contractors using Federal Express or an equivalent system that allows for package tracking; if any item is delayed or lost, it is investigated immediately. All completed hardcopy documents and other survey materials are shipped to the SOC, a secure facility that can only be accessed by key card. SOC staff log receipt of the hardcopy documents in a secure database and store all documents containing sensitive information in locked file cabinets or locked storage rooms when not in use. These documents will be destroyed when no longer needed in the performance of the project. All SOC staff are required to comply with security policy and complete yearly refresher trainings on security awareness, including procedures for safeguarding personally identifiable information.

In addition, Mathematica has developed a Disaster Recovery Plan that provides a full contingency/disaster recovery plan for major systems outages. Data use agreements (DUAs) are negotiated on a case-by-case basis. DUAs are tracked in a database on a project-by-project basis to ensure, among other things, that the data collected during the project are destroyed at the end of the project in accordance with the DUA.

All of the major software systems that are used on the project guarantee the security of the data they collect and store. All systems and their associated databases are secured behind the firewall between the local area network and any external internet connection. To the extent that the databases must be accessed outside this local area network, or when data must be shared across the different organizations that comprise the evaluation team, this access is across secure point-to-point connections, and data are not sent across the internet.

c. Background Checks and Security

Evaluation team members working with these data have undergone background checks, which include completing the SF-85 form.

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers these questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

The follow-up surveys include questions that some respondents might find sensitive. These questions ask about delinquent activities, arrests, incarcerations, earnings, drug and alcohol use, and physical and mental health. Collection of this information, though sensitive in nature, is critical for the evaluation because it will allow the study team to assess the impact of YouthBuild activities on outcomes the program is intended to influence, beyond the central outcomes related to education and employment. These areas are important because they reflect youth’s ability to successfully transition into adulthood, affect their overall well-being, and may mediate the effects of the program on education and employment outcomes. All questions on the follow-up survey, including those deemed potentially sensitive, have been pre-tested and used extensively in prior studies with no evidence of harm. All respondents are informed that they can decline to answer any question they do not wish to answer and there are no negative consequences for not participating.

12. Provide estimates of the hour burden of the collection of information. The statement should:

  • Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

  • If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

  • Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage and rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 13.

There will be no burden from this data collection on YouthBuild sites participating in this evaluation. The hour burden of follow-up survey data collection for YouthBuild participants and members of the control group in the 75 sites participating in the impact component is outlined below in Table A.2. We expect that 80 percent (n = 2,749) of the 3,436 sample members will complete the 48-month survey. We anticipate that each completed interview for the 48-month survey will take approximately 35 minutes to administer. The estimated burden for this survey is lower than previously reported to OMB. Although we propose to add five questions, which will add a few minutes to the survey administration, we also plan to omit many of the contact questions previously used for subsequent locating efforts. While these were necessary when administering the 12-month and 30-month surveys, we will not contact study participants following administration of the 48-month survey; thus, we have no further need to impose a burden on respondents in order to update their contact information. Based on these assumptions, the total respondent burden for the 48-month survey is 1,604 hours (2,749 completes × 35 minutes ÷ 60 minutes).

Table A.2. Burden Hour Estimates for YouthBuild Participants

Data Collection Instrument | Number of Respondents/Instances of Collection | Frequency of Collection | Average Time Per Response | Burden (Hours)
48-month survey | 2,749 | Once | 35 minutes | 1,604

As noted above, the total estimate of burden for completion of the follow-up survey is 1,604 hours; all hours would be borne by participants. At an average wage of $7.25 per hour, which is the wage paid to YouthBuild participants for their time spent in vocational training, this represents a time value of $11,629, a reduction of $1,660 from the estimated cost provided in the original request for clearance. Estimates of annualized time value to respondents for the hour burdens for collection of information are noted in Table A.3.
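
A minimal sketch of the burden-hour and time-value arithmetic reported above and in Tables A.2 and A.3 follows; the rounding to whole hours and whole dollars is an assumption about how the published figures were derived.

```python
# Burden-hour and time-value arithmetic for the 48-month survey.
sample_size = 3436
response_rate = 0.80
minutes_per_survey = 35
wage_rate = 7.25  # dollars per hour

completes = round(sample_size * response_rate)              # 2,749 respondents
burden_hours = round(completes * minutes_per_survey / 60)   # 1,604 hours
time_value = round(burden_hours * wage_rate)                # $11,629

print(f"Completes: {completes:,}")
print(f"Burden hours: {burden_hours:,}")
print(f"Time value: ${time_value:,}")
```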

Table A.3. Annualized Costs to YouthBuild Respondents

Data Collection Instrument | Number of Respondents/Instances of Collection | Average Time Per Response | Burden (Hours) | Wage Rate | Total Annual Value
48-month survey | 2,749 | 35 minutes | 1,604 | $7.25 | $11,629



13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).

The proposed data collection will not require the respondents to purchase equipment or services or to establish new data retrieval mechanisms. Survey content is based on study participants’ experiences in YouthBuild and other programs and factual information about their education, employment, involvement with the criminal justice system, living arrangements, marital status, fertility and social-emotional development. Therefore, the cost to respondents solely involves answering the questions on the survey. No capital or start-up costs are anticipated nor does the evaluation team expect extensive time spent on generating, maintaining, disclosing or providing the information.

14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.

The estimated cost to the Federal government for the study design and survey components discussed in this supporting statement can be seen below in Table A.4.

The total cost to the Federal government of carrying out this study is $19,749,515, to be expended over the seven-year period of the study. Of this, $10,368,054 is for the follow-up surveys. The remaining $9,381,461 is for other costs of the study, including design, implementation and monitoring of random assignment, analysis of administrative records data, and reporting.

Table A.4. Estimated Cost to the Federal Government*

Task | Total
48-month follow-up survey | $2,023,916
  Design Survey | $38,108
  Programming | $194,002
  Data Collection | $1,694,688
  Data File | $97,118
Analysis of 48-month follow-up survey | $98,308
Total Cost for this Data Collection Request | $4,146,140

*While OMB clearance lasts for a total of three years, annualized costs to the government are based on costs incurred over the total project period of performance, which is seven years. Design and data file preparation tasks are included in the annualized costs.


In addition, an estimated $200,000 (two staff-year equivalents) will be spent by grade 14 and 15 DOL staff8 managing the study and overseeing the contractor. Since the project will last seven years (including initial preparation, follow-up data collection, analysis and reporting), the annualized staff cost to the Federal government is $28,571 ($200,000 ÷ 7 years = $28,571). The total annualized cost to the government, including staff costs and the costs of data collection, is $620,877 (($4,146,140 + $200,000) ÷ 7 years = $620,877).

15. Explain the reasons for any program changes or adjustments reported on the burden worksheet in Items 13 or 14 of the OMB Form 83-I.

Recruitment and enrollment took longer than originally anticipated – extending from 12 to 18 months in order to achieve the study’s enrollment targets. As a result, the field period for each of the follow-up surveys increased from 16 to 22 months in order to achieve sufficiently high response rates. The lengthier field period, coupled with the slow buildup of sample, increased overall data collection costs.

Additionally, the aforementioned revisions to the 48-month survey, including the addition of five new questions and the removal of 16 questions designed to facilitate future contact (since this is the last survey administration and no further contact will be necessary), resulted in a lower burden estimate than previously proposed. More specifically, our pretests for the 48-month survey found that the survey will take approximately 35 minutes (rather than the 40 minutes originally proposed).



16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and end dates of the collection of information, completion of report, publication dates, and other actions.

The data collection for which this supporting statement is seeking clearance will not result in publicly available records. Data and study progress will be documented internally throughout the project. The evaluation plan includes a range of deliverables and reports. Table A.5 below shows an outline of these deliverables, followed by a fuller explanation for each item.

Table A.5. Schedule of Deliverables

Deliverable | Date
Design Report | June 2012
Process Report | February 2015
Interim Report | August 2016
Final Report | August 2017


Design report. In summer 2012, the evaluation team completed a design report describing in detail the proposed design for the evaluation. The report (included in the previous clearance request) discussed proposed sample sizes, research groups, the random assignment process, and site selection and recruitment. Based on a conceptual model of how YouthBuild might affect youth outcomes, key administrative data to be collected and major topics to be addressed in each of the follow-up surveys were outlined. Finally, the report outlined the proposed analysis plan for the process, impact, and cost-effectiveness studies.

Process report. In February 2015, the evaluation team completed a report describing the findings from the process study. This report documented, for example, the process of recruiting sites for the evaluation, the characteristics of sites that participate, and the process of randomly assigning youth to either the program group or a control group. The report also discussed the characteristics of youth served, the flow of participants through the programs, the delivery of services, youth participation rates, and any challenges to serving participants.9

Interim report. In August 2016 the evaluation team will complete a report describing interim effects of YouthBuild on a range of outcomes. This report will use data from both administrative records and the 12- and 30-month surveys to examine impacts on educational attainment, employment, job characteristics, community involvement, attitudes and aspirations, family structure and living arrangements, and involvement with the criminal justice system. The report may also include an examination of effects for key subgroups of youth.

Final report. In August 2017, the evaluation team will complete the final report documenting longer-term impacts of YouthBuild. Likely outcomes will include participation in education and training, the attainment of educational credentials, employment and earnings, criminal justice involvement, family status and living arrangements, positive behaviors and activities, risky behaviors, health status and other measures of well-being. This report will also examine effects for key subgroups of youth and present an analysis of the effectiveness of certain program components. Finally, the report will present an analysis of the cost-effectiveness of the program.

Public Use File. A public use file of the final data sets used by the evaluation, stripped of individually identifying information, will be produced.10 This file will include instructions for data retrieval, code books presenting means and frequencies for all variables, and documentation of important decisions made in constructing outcome variables.

Program impacts on the outcomes will be estimated using a basic impact model:

Y_i = α + βP_i + δX_i + ε_i

where:

  • Y_i = the outcome measure for sample member i;
  • P_i = one for program group members and zero for control group members;
  • X_i = a set of background characteristics for sample member i;
  • ε_i = a random error term for sample member i;
  • β = the estimate of the impact of the program on the average value of the outcome;
  • α = the intercept of the regression; and
  • δ = the set of regression coefficients for the background characteristics.

We will use a linear regression framework or more complex methods, depending on the nature of the dependent variable and the issues being addressed: logistic regression for binary outcomes (e.g., employed or not), and Poisson or negative binomial regression for outcomes that take on only a few values (e.g., months of employment).
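
The sketch below shows, with synthetic data, how the basic impact model above might be estimated: OLS for a continuous outcome and logistic regression for a binary outcome. The variable names and effect sizes are illustrative assumptions, not the evaluation team's actual analysis code.

```python
# Illustrative estimation of the basic impact model
#   Y_i = alpha + beta * P_i + delta * X_i + epsilon_i
# using OLS for a continuous outcome and logistic regression for a binary
# outcome. All data below are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2749
program = rng.integers(0, 2, size=n)     # P_i: 1 = program group, 0 = control
age = rng.integers(16, 25, size=n)       # X_i: a baseline characteristic
earnings = 9000 + 1200 * program + 150 * (age - 20) + rng.normal(0, 3000, n)
employed = rng.binomial(1, 0.45 + 0.08 * program)

df = pd.DataFrame({"earnings": earnings, "employed": employed,
                   "program": program, "age": age})

# Continuous outcome: the coefficient on `program` estimates beta.
ols_fit = smf.ols("earnings ~ program + age", df).fit()
print(ols_fit.params["program"])

# Binary outcome (e.g., employed or not): logistic regression.
logit_fit = smf.logit("employed ~ program + age", df).fit(disp=0)
print(logit_fit.params["program"])
```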

17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

The expiration date for OMB approval appears to respondents in the recruitment letter.

18. Explain each exception to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submissions.”

No exception to the certification statement is requested for this data collection.


1 Beebe, Timothy, Michael Davern, Todd Rockwood, Donna McAlpine and Kathleen Call (2004). The effects of a prepaid monetary incentive among low income and minority populations. Paper presented at the annual meeting of the American Association of Public Opinion Research, Phoenix, AZ, May 11, 2004. http://www.allacademic.com/meta/p_mla_apa_research_citation/1/1/5/9/6/p115965_index.html

2 We obtained a total sample size of 3,930 youth. The survey sample of 3,436 youth was drawn randomly from the full evaluation sample.

3 Our experience to date confirms that youth will complete the survey online, but not at the rates found among students enrolled in college. In comparison to our CSAP experience, 27.8% of completions of the 12-month follow-up survey (out of an 81.0% response rate) came from the web survey. Aside from our incarcerated sample members, we have not had requests for hardcopy questionnaires.

4 The study team has requested access to the employment and earnings data available from the National Directory of New Hires.

5 Disadvantaged youth are typically a challenging population to locate and contact because they are highly mobile, lack an electronic footprint, and rely on cell phones and/or “throwaway” phones in higher numbers than other segments of the population. For example, the Census Bureau estimates that 30 percent of young adults between the ages of 20-24 and 28 percent of those between the ages of 25-29 moved between 2004 and 2005, the most recent years for which these data are available (http://www.census.gov/population/www/pop-profile/files/dynamic/Mobility.pdf). The locating challenge that these high mobility rates present is exacerbated by youth’s limited “electronic footprint,” meaning the traceable records that adults create as they become employed, gain credit cards, and purchase homes. In February 2010, the seasonally adjusted employment rate for youth aged 16-19 was only 29 percent (http://www.bls.gov/news.release/empsit.t02.htm), an estimated 25 percent of the young adult population have credit cards (Abal, 2007), and around 25 percent of people under 25 years old own their own homes (http://www.census.gov/hhes/www/housing/hvs/annual04/ann04t15.html). Nearly 31 percent of adults under the age of 30 reside in cell-phone only households (http://ajph.aphapublications.org/cgi/reprint/99/10/1806) and disposable cell phones are becoming increasingly popular with this demographic. All of these issues will be relevant for locating and contacting efforts for the 48-month follow-up youth survey.

6 Berlin, M., L. Mohadjer and J. Waksberg (1992). An experiment in monetary incentives. Proceedings of the Survey Research Section of the American Statistical Association, 393-398; de Heer, W. and E. de Leeuw. “Trends in household survey non-response: A longitudinal and international comparison.” In Survey Non-response, edited by R. M. Groves, D. A. Dillman, J. L. Eltinge, and R. J. A. Little. New York: John Wiley, 2002, pp.41-54; Singer, E. and Kulka, R. Studies of Welfare Populations: Data Collection and Research Issues, Panel on Data and Methods for Measuring the Effects of Changes in Social Welfare Programs. Ploeg, Robert A. Moffitt, and Constance F. Citro, Editors. National Academies Press, Washington, DC, 2000, pp. 105-128.; Beebe, Timothy, Michael Davern, Todd Rockwood, Donna McAlpine, and Kathleen Call. 2004. "The Effect of a Prepaid Monetary Incentive Among Low Income and Minority Populations." Conference Papers -- American Association for Public Opinion Research N.PAG. Academic Search Premier, EBSCOhost (accessed September 26, 2011).

7 Ferber, R., and S. Sudman (1974). Effects of compensation in consumer expenditure surveys. Annals of Economic and Social Measurement, 3(2):319-331; Kulka, R.A. (1995, May). The use of incentives to survey hard to reach respondents. Paper prepared for the Council of Professional Associations on Federal Statistics seminar on New Directions in Statistical Methodology, Bethesda, MD; Link, M.W., A.G. Malizo, and T.R. Curtin (2001). Use of targeted monetary incentives to reduce nonresponse in longitudinal surveys. Paper presented at the annual conference of the American Association of Public Opinion Research, Montreal, Quebec, Canada; Martin, E., D. Abreu, and F. Winters (2001). Money and motive: Effects of incentives on panel attrition in the survey of income and program participation. Journal of Official Statistics, 17(2):267-284; Martinez-Ebers, V. (1997). Using monetary incentives with hard-to-reach populations in panel surveys. International Journal of Public Opinion Research, 9(1): 77-86.

8 Estimated using the Office of Personnel Management’s labor rates table.

9 Andrew Wiegand, Michelle S. Manno, Sengsouvanh Leshnick, Louisa Treskon, Christian Geckeler, Heather Lewis-Charp, Castle Sinicrope, Mika Clark, and Brandon Nicholson (2015). Adapting to Local Context: Findings from the YouthBuild Evaluation Implementation Study. New York: MDRC.

10 Data on employment and earnings obtained from the National Directory of New Hires will not be included in the public use file, given the restrictions on their use.

