YouthBuild Follow-Up Survey OMB Supporting Statement, Part A, 9.14.2012

YouthBuild Impact Evaluation: Youth Follow-Up Surveys

OMB: 1205-0503

Part A: Supporting Statement for Paperwork Reduction Act Submission

The Impact Evaluation of the YouthBuild Program is a seven-year experimental design evaluation funded by the U.S. Department of Labor (DOL), Employment and Training Administration (ETA), and the Corporation for National and Community Service (CNCS). YouthBuild is a youth and community development program that addresses several core issues facing low-income communities: education, employment, crime prevention, leadership development, and affordable housing. The program primarily serves high school dropouts and focuses on helping them attain a high school diploma or General Educational Development (GED) certificate and teaching them construction skills geared toward career placement. The evaluation will measure core program outcomes including educational attainment, postsecondary planning, employment, earnings, delinquency and involvement with the criminal justice system, and youth social and emotional development.

The evaluation contract was awarded to MDRC in June 2010. The evaluation began in the fall of 2011 and is scheduled to continue until June 2017. MDRC is the prime contractor; Mathematica Policy Research (Mathematica) and Social Policy Research Associates (SPR) are subcontractors that will assist MDRC with designing the study, implementing random assignment, analyzing data collected for the study, and reporting the study’s findings. The YouthBuild evaluation design includes an impact component, an implementation component and a cost-effectiveness component. All 2011 DOL-funded and CNCS-funded YouthBuild grantees will participate in the implementation component while a random selection of grantees will participate in the impact and cost-effectiveness components.

The evaluation will assess YouthBuild’s operation, participation by youth, and impact on youth’s education, employment, criminal justice, and personal development outcomes. It will also determine the net cost of the impacts generated.

DOL has submitted several requests to the Office of Management and Budget (OMB) as part of the YouthBuild evaluation (see Table A.1). The full package for this study is being submitted in separate parts because data collected through the evaluation’s initial stages informed the development of the subsequent data collection instruments. On June 15, 2011, OMB approved DOL’s request to administer a grantee questionnaire to programs participating in the evaluation (see ICR Reference #201005-1205-002), designed to provide basic information about how YouthBuild programs are structurally managed and operated relative to other youth training and education programs. The information was to be collected from all 2011 DOL- and CNCS-funded YouthBuild grantees to provide initial information about operations and to develop sufficient information to select 83 grantees to participate in the impact component of the evaluation.

A second clearance request, to continue to collect baseline and program service and activity data from YouthBuild participants, was approved on April 18, 2011 (see ICR Reference #201008-1205-002). This information will be collected via a web-based management information system (MIS) and is a key component of the departmental management of the program that will support case management and performance reporting. The MIS allows program operators to track services, outcomes, and, for this evaluation, the random assignment status of youth participating in YouthBuild. On March 13, 2012, OMB approved DOL’s request to administer a YouthBuild grantee questionnaire (see ICR Reference #201108-1205-005). That questionnaire will be administered to all 2011 YouthBuild grantees funded by DOL and CNCS.



Table A.1. OMB Clearance Requests

Request # | Data collection                  | Date approved                     | OMB control #
1         | YouthBuild Grantee Questionnaire | June 5, 2011                      | 1205-0436
2         | YouthBuild Reporting System      | April 18, 2011; Rev. May 22, 2012 | 1205-0464
3         | Grantee Survey                   | March 13, 2012                    | 1205-0488
4         | YouthBuild Site Visit Protocols  | (under OMB review)                | --
5         | Participant Follow-up Survey     | (current request)                 | --


This package requests clearance for three follow-up surveys with youth who were randomly assigned in the 83 sites to either a treatment group or a control group. The surveys will be fielded 12, 30, and 48 months after random assignment. It is understood that OMB clearance to administer either the upcoming grantee survey or to collect qualitative in-person information during future site visits does not imply or constitute clearance to administer the follow-up instruments that are the subject of this clearance request. Specifically, this package requests clearance for the following follow-up surveys and related respondent materials:

  1. Survey Instrument – Computer-Assisted Telephone Instrument (Appendix A)

  2. Informed Consent Form (Appendix A1)

  3. Survey Instrument – Web Version (Appendix A2)

  4. Participant Advance Letter (Appendix B)

  5. Participant Advance Email (Appendix B1)

  6. Participant Interim Letter (Appendix C)

In addition, this package includes:

  1. Workforce Investment Act—Section 172 (Appendix D)

  2. Frequently Asked Questions (Appendix E)

  3. 60-Day Federal Register Notice (Appendix F)

  4. Summary of Pretest Findings (Appendix G)



A. Justification

1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


The YouthBuild evaluation design includes an impact component, an implementation component and a cost-effectiveness component. The follow-up surveys will be the key data source used to estimate the impacts of the program. Survey data on education and training participation, post-secondary enrollment, employment and earnings, criminal justice involvement, and social and emotional development—data that are not available from other sources—will inform DOL and CNCS of YouthBuild’s effects on a range of youth outcomes.

Program impacts on a range of outcomes will be estimated using a basic impact model:

Y_i = α + βP_i + δX_i + ε_i

where: Y_i = the outcome measure for sample member i; P_i = one for program group members and zero for control group members; X_i = a set of background characteristics for sample member i; ε_i = a random error term for sample member i; β = the estimate of the impact of the program on the average value of the outcome; α = the intercept of the regression; and δ = the set of regression coefficients for the background characteristics.

We will use a linear regression framework or a more complex set of methods depending on the nature of the dependent variable and the type of issues being addressed, such as: logistic regressions for binary outcomes (e.g., employed or not); or Poisson or Negative Binomial regressions for outcomes that take on only a few values (e.g., months of employment).
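To make the estimation approach concrete, the minimal sketch below fits each of the model types named above to simulated data using Python's statsmodels package. It is illustrative only: the data, the variable names (treat, age, earnings, employed, months_employed), and the choice of library are assumptions of the example, not part of the evaluation's specified analysis plan.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a toy analysis file (hypothetical): one row per sample member i,
# with treatment indicator P_i ("treat") and a baseline covariate X_i ("age").
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "treat": rng.integers(0, 2, n),
    "age": rng.integers(16, 25, n),
})
df["earnings"] = 9000 + 1500 * df["treat"] + rng.normal(0, 3000, n)
df["employed"] = rng.binomial(1, 0.4 + 0.1 * df["treat"])
df["months_employed"] = rng.poisson(3 + df["treat"])

# Continuous outcome: linear regression, i.e., the basic impact model above.
ols = smf.ols("earnings ~ treat + age", data=df).fit()

# Binary outcome (e.g., employed or not): logistic regression.
logit = smf.logit("employed ~ treat + age", data=df).fit(disp=False)

# Outcome taking on only a few values (e.g., months employed): Poisson model.
pois = smf.poisson("months_employed ~ treat + age", data=df).fit(disp=False)

# In each model, the coefficient on "treat" plays the role of beta,
# the estimated average impact of the program on the outcome.
print(ols.params["treat"], logit.params["treat"], pois.params["treat"])
```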

Second, the evaluation will examine whether impacts vary with certain program features, such as program fidelity, length of mental toughness orientation, or strength of post-secondary services. We will use multi-level estimation methods for this analysis, in which individuals are the unit of analysis at Level One and sites are the unit of analysis at Level Two. The site-level impact, then, is allowed to vary with site characteristics (e.g., implementation strength, program components, service contrast).
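A hypothetical sketch of this two-level setup, again using simulated data and statsmodels, is below. The 83 sites match the study design; the model specification (random site intercepts, with the treatment effect allowed to vary by site) and all numbers are assumptions of the example.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy two-level data: youth (Level One) nested within 83 sites (Level Two).
rng = np.random.default_rng(1)
n_sites, n_per_site = 83, 40
site = np.repeat(np.arange(n_sites), n_per_site)
treat = rng.integers(0, 2, n_sites * n_per_site)
site_shift = rng.normal(0, 1.0, n_sites)            # site-level variation
y = 10 + 2 * treat + site_shift[site] + rng.normal(0, 3, site.size)
df = pd.DataFrame({"y": y, "treat": treat, "site": site})

# Random intercept by site; re_formula="~treat" additionally lets the
# site-level impact vary across sites, so that estimated site impacts can
# later be related to site characteristics (implementation strength, etc.).
model = smf.mixedlm("y ~ treat", df, groups=df["site"], re_formula="~treat")
result = model.fit()
print(result.params["treat"])    # average program impact across sites
```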

This evaluation of the YouthBuild Program will be carried out under the authority of the Workforce Investment Act, Section 172, (Appendix D) which states that “for the purpose of improving the management and effectiveness of programs and activities…the Secretary shall provide for the continuing evaluation of the programs and activities.” (WIA, Sec. 172(a) 1998).

This request seeks clearance only for the follow-up survey (Appendix A) and the participant advance and interim letters (Appendix B and Appendix C, respectively), which are discussed in detail below.

2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

Clearance by OMB is currently being requested to administer the follow-up surveys (Appendix A) and the advance and interim letters (Appendix B/B1 and Appendix C, respectively). The Frequently Asked Questions (FAQ) document (Appendix E) will be available online for participants throughout the study. The data collected in the follow-up surveys will be used to assist in the impact analyses. The advance and interim letters, the FAQ document, and the follow-up surveys are described in detail below, along with how, by whom, and for what purposes the information collected will be used.

a. Advance Letter

The participant advance letter (Appendix B) will be sent to all study participants approximately two weeks prior to the email invitation (Appendix B1) that includes a link to the initial survey. The letter will reiterate basic information about the study which sample members received before consenting to participate, including the study sponsorship, the voluntary and protected nature of their responses, and the incentive payment, and will provide contact information in case they have questions about the study or the survey. Information about OMB is provided on the back of the letter. The advance letter will include a separate enclosure that provides a user identification and password for accessing the online survey. The link to the survey will be sent separately.

b. Interim Letter

Two months before the start of data collection, we will send out a reminder letter asking sample members to call our toll-free number, email us, or return an enclosed card to update their contact information (Appendix C). We will enclose a pre-paid monetary incentive ($2 bill) with this mailing. A growing body of research suggests that pre-paying sample members with as little as $2 remains an effective way to increase response rates. Beebe et al. (2004) found that a $2 pre-paid incentive increased response rates by close to 10 percent on a mail survey of over 9,000 Medicaid beneficiaries. The study included an oversample of racial and ethnic minorities and found that the incentive had a comparable effect across different racial and ethnic strata.1

c. Frequently Asked Questions

All sample members will be provided with a user ID and log-in information for the web survey. The FAQ document (Appendix E) will be available online for participants for the duration of the study via the web link. The FAQ addresses basic questions that participants might have regarding the study, including who is being invited to complete the surveys, who is sponsoring the study, the voluntary nature of survey responses and the survey team’s plans for protecting the respondents’ information. Interviewers will have copies of the FAQs and will be able to answer questions as needed.

d. Follow-up Surveys

Approximately two weeks after receiving the advance letter, study participants will receive an email with the link to complete the follow-up survey (Appendix A) on the web. The follow-up surveys are the primary source of outcome data to assess program impacts. These data will inform DOL and CNCS on how YouthBuild programs affect program participants’ educational attainment, postsecondary planning, employment, earnings, delinquency and involvement with the criminal justice system, and youth social and emotional development.

The sample for the follow-up surveys includes 3,465 study participants (program and control group members) who were between the ages of 16 and 24 at the time they consented to participate in the study and were randomly assigned. Random assignment began in August 2011 and is expected to continue through December 2012.2

We expect to achieve the OMB-required 80 percent response rate, meaning that we will obtain completed surveys from 2,772 respondents in each round of the survey. To achieve at least an 80 percent response rate, we will employ the following approach, designed to maximize efficiency:

  • A multi-mode approach that will deploy the most cost-effective modes first. For the YouthBuild evaluation, we will implement a multi-mode survey that begins on the web and then moves to more intensive methods – Computer Assisted Telephone Interviewing (CATI) and in-person locating – as part of our nonresponse follow-up strategy. We gave careful consideration to offering a mail option; however, youth’s high mobility rates suggest that a mail option would not be cost-effective. Youth are more likely to have a stable email address than a stable mailing address; as a result, we could expend resources mailing out questionnaires to addresses that are no longer valid. We will provide a hardcopy questionnaire to any sample member who requests one, but, increasingly, our studies of youth find a preference for completing surveys online. Mathematica recently conducted the College Student Attrition Project (CSAP) for the Mellon Foundation, on which over 80 percent of the completed surveys were done online. While we do not anticipate as high a web-completion rate for the YouthBuild evaluation – the YouthBuild sample is more disadvantaged than the CSAP population – our approach emphasizes communicating with youth through the media that they prefer.

  • A clear, streamlined survey instrument that will make responding easy and ensure accurate responses. The survey is designed to be as brief as possible, with clear, easy-to-answer questions (mostly closed questions with a few open-ended questions).


  • An official letter that will gain attention and legitimize the study. An advance letter with log-in information will be mailed to sample members, lending legitimacy to the study and further encouraging participation.

  • A staged outreach strategy for reluctant responders that will result in a high conversion rate. Beginning in week 2, we will send e-mail reminders to those who have not responded. We will conduct follow-up telephone calls (in addition to e-mail reminders) to non-responders beginning in week 4. Mathematica telephone interviewers are trained in refusal conversion. Experienced, expert refusal converters will be assigned to work with reluctant responders to maximize conversion rates. Lastly, field locators will attempt to find and gain cooperation from sample members who cannot be reached by web or telephone. (Additional details and justification for our approach are provided in the description of the incentive experiment below.)


The data collected through the surveys will allow the evaluation team to measure the full range of outcomes associated with participation in the YouthBuild program. YouthBuild programs are based on a youth development framework that emphasizes academic achievement, vocational skills, and personal development as the primary means by which youth attain self-sufficiency through employment. Program participants receive a range of educational services that lead to a high school diploma or GED. These services include basic skills testing and instruction, remedial education, as well as assistance with post-secondary planning and financial aid. Program participants may be elected to represent their peers on a Youth Policy Council, a committee that is charged with making recommendations for improving the YouthBuild program; serve on a Youth Advisory Council; participate in public speaking events on behalf of the program; and gain valuable community service experience by volunteering for a local organization. In addition, participants spend almost half of their time in construction training – rehabilitating or building new housing for low-income and homeless families in their communities – or in other vocational skills training.

Based on our knowledge of the YouthBuild program, the study’s research objectives, and our review of the literature on educational and employment interventions for youth, we will include the following five core topics in each of the follow-up surveys:

  • Service Receipt: This section will assess respondents’ participation in education, training, employment and youth development activities, all of which will provide important information about the services that treatment and control group youth received during the study period (from YouthBuild or elsewhere). Information on service receipt is needed to measure rates of participation in YouthBuild and, critically, to accurately measure the “treatment difference” between the two research groups.

  • Educational attainment: This section will collect information on respondents’ highest level of education completed, post-secondary educational plans, and post-secondary enrollment. Increased educational attainment is one of the primary goals of the YouthBuild program.

  • Employment: This section will gather detailed characteristics of current or most recent job, job and earnings history, and job readiness. It is hypothesized that the vocational training and hands-on experience that YouthBuild provides will help participants obtain stable, well-paying jobs, either in construction or in other fields.

  • Criminal justice involvement and delinquency: This section will collect information about respondents’ delinquent activities, arrests, convictions, and incarcerations. The YouthBuild program model focuses on social and emotional development and positive engagement with the community. In addition, some programs focus primarily on youth offenders and offer YouthBuild as an alternative to sentencing. It is expected that the program, by providing positive, caring adult role models, positive peers, and educational and vocational services, will result in a reduction in criminal behavior and delinquent activities.

  • Current living arrangements, marital status, and fertility: This section will ask respondents about their current living situations, partnership arrangements, and pregnancy histories. These questions will measure the effects that the YouthBuild program has on respondents’ transitions into adulthood.

In addition, each of the surveys will include topic modules on one or more dimensions of adolescent development such as: social and emotional development, identity development, and health and well-being. YouthBuild is expected to positively influence each of these developmental domains. These modules are included, and identified, in the survey (Appendix A).

3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

Advanced technology will be used in data collection efforts to reduce burden on study participants. We will use a multi-mode survey that begins on the web and then moves to more intensive methods – Computer Assisted Telephone Interviewing (CATI) and in-person locating – as part of our nonresponse follow-up strategy. The web and CATI surveys will be identical (Appendix A shows the questions as they will be structured in both modes), and both will have built-in verifications and skip patterns to ensure data are reliable and participants only answer questions that pertain to them. Web surveys reduce the amount of interviewer labor necessary to complete a data collection and allow respondents to complete the questionnaire on their own schedule, in multiple sittings, and without having to return any forms by mail. We anticipate that not all study participants will be able to complete the survey on the web. For those sample members who do not complete the survey on the web, we will attempt to complete it over the phone using a CATI survey. For those sample members who cannot be located by telephone, Mathematica’s specialized locating staff will use searchable databases, directory assistance services, reverse directories, and contact with neighbors and community organizations to obtain current contact information. We will also include searches of social network websites (e.g., Facebook and Twitter), community and college websites, and prison websites. If electronic locating yields no results, field locators will attempt to locate sample members using their most recently known address(es). Field locators will be equipped with cell phones and, once a sample member has been located, will be able to dial into Mathematica’s Survey Operation Center (SOC) so that the sample member can complete a CATI survey.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in item 2 above.

The YouthBuild evaluation strives to minimize data duplication. The survey will focus on information that is critical to the evaluation but not available from administrative records. For example, MIS data will be available to measure participation in YouthBuild activities among program group youth. However, these data are limited to YouthBuild activities, and are not collected for youth in the control group. Assessing the program-control difference in participation in education and employment services is critical for the evaluation. Similarly, employment and earnings information may be available from administrative records,3 but these data do not provide information on job tenure, wages, and other measures of job quality. YouthBuild is also likely to affect youth’s involvement with the criminal justice system, data on which will be available only through the follow-up surveys.

5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.

Collection of follow-up information will not impact small businesses or other small entities.

6. Describe the consequences to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles in reducing burden.

The evaluation represents an important opportunity for DOL and CNCS to add to the growing body of knowledge about the impacts of second-chance programs for youth who have dropped out of high school, including outcomes related to educational attainment, postsecondary planning, employment, earnings, delinquency and involvement with the criminal justice system, and youth social and emotional development.

If the information collection is not conducted, Federal program or policy activities will not be informed by high-quality information upon which to base critical decisions regarding the impacts of the YouthBuild program, nor will DOL know whether the program has substantial impacts on its participants. Since the program continues to be funded through various grantees across the country, and DOL may wish to continue funding programs targeted to high school dropouts, it is imperative that rigorous information on the impacts of this program be obtained. The follow-up surveys are the primary source of outcome data to assess program impacts; thus, not conducting them will limit the study team’s ability to assess the impact of YouthBuild programs.

7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

  • requiring respondents to report information to the agency more often than quarterly;

  • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • requiring respondents to submit more than an original and two copies of any document;

  • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

  • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • requiring the use of statistical data classification that has not been reviewed and approved by OMB;

  • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.

There are no special circumstances surrounding data collection. All data will be collected in a manner consistent with Federal guidelines.

8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.

a. Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995, the public was given an opportunity to review and comment through a 60-day Federal Register Notice, published on May 15, 2012 (FR, Vol. 77, No. 94, pp. 28623-28625). A copy of this notice is attached as Appendix F.

We received one comment in June 2012, which is reproduced below:

The document lists a 48 month evaluation, which is too long of a period. They should do continual assessments and maybe one a year later. I think any person who experiences a set back or gets off track would have another occurrence after being rehabbed into a better situation. The impact of any benefit would be cancelled from carrying on with life. Life issues you cannot control would overwhelm the outcome you are looking for.


The reviewer recommends continual assessments. In response, we note that continual assessments are costly and, for this study, unlikely to yield information beyond what we could learn from the first follow-up survey, one year after random assignment. The average YouthBuild program runs for about nine months, which suggests that for our key outcomes of interest – education and employment – a one-year follow-up is appropriate for assessing short-term impacts. For the program or treatment group, we will have some administrative data that may allow us to look at the relationship between time in program, service receipt, and outcomes of interest. However, these data will be fairly limited and will not be available for the comparison group members, who do not receive YouthBuild services. Collecting comparable data from the comparison group would be cost prohibitive. In the absence of these data, we cannot speak to the more immediate impacts of the YouthBuild program that might be identified through more continual assessment.

We also note that the period of performance for the evaluation is 7 years, which includes time to design the study, recruit programs into the study, collect data at three points in time, analyze the data, and write a final report. The 48-month evaluation listed in the document refers to the third and final survey of youth, which is intended to assess long-term impacts of the program. In addition to the 48-month survey, we will measure short-term impacts 12 months after youth are randomly assigned (approximately the same time as most program participants will complete the YouthBuild program), and again at 30 months to measure medium-term impacts. The data collection plan as described will allow us to assess whether impacts are maintained or attenuate over time.

b. Consultations Outside of the Agency

There have been no consultations on the research design, sample design, data sources and needs, or study reports with anyone outside the evaluation team members from MDRC, Mathematica, and SPR.

9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

The evaluation’s data collection plan includes respondent payments for completing the follow-up surveys. It is essential to include an incentive in order to maximize the response rate, and it is particularly important with a challenging population4 and a demanding data collection strategy. In a seminal meta-analysis, Singer et al. (1999) found that incentives in face-to-face and telephone surveys were effective at increasing response rates, with a one-dollar increase in incentive resulting in approximately a one-third of a percentage point increase in response rate, on average. They, as well as others, have found some evidence that incentives are useful in boosting response rates among underrepresented demographic groups, such as low-income and minority individuals.5 This is a significant consideration for this study of YouthBuild and the YouthBuild-eligible population. Several studies have found that the use of incentives in early rounds has a positive effect on cooperation rates in subsequent rounds.6 Generally, prepaid monetary incentives have been found to be more effective than promised incentives or non-monetary incentives. However, Ferber and Sudman (1974) found that non-monetary follow-up gifts appeared to be especially effective in panel surveys. Some survey researchers have argued that the use of incentives in panel studies creates an expectation of payment in subsequent rounds. Singer, Van Hoewyk and Maher (1998) conducted a series of experiments to determine if incentive payments create expectation effects. They found that respondents who receive a monetary incentive in the first round of a study are more likely than non-incentive respondents to agree with the statement, “People should be paid for doing surveys like this,” but they were also more likely to participate in subsequent non-incentive waves of the survey.

To encourage youth to complete their survey early and online, we will conduct an experiment during the 12-month follow-up survey that will inform our data collection strategy for subsequent rounds. The experiment is designed to assess the cost-effectiveness of offering two different incentive amounts. Specifically, the experiment will assess whether a higher incentive amount encourages sample members to respond early in the field period and to respond via the web, the least costly mode of data collection. We will randomly assign all sample members to one of two incentive conditions: (1) the “early bird” condition and (2) the control condition. The “early bird” group will receive $40 for completing their survey online within the first four weeks of the field period; thereafter they will receive $25 for completing the survey, regardless of mode of completion. The control group will receive $25 regardless of when they complete their survey. Our analysis will compare the number of completed surveys and the cost per completed survey within the “early bird” window and later in the field period, broken out by incentive condition. We will also examine the characteristics of respondents by mode and incentive condition to see if the “early bird special” draws in some subgroups that may otherwise be harder to reach. We will share the findings, along with other findings from the 12-month survey, with ETA, CNCS and OMB shortly after completion of the 12-month follow-up survey. Mathematica conducted an incentive experiment on the National Science Foundation’s National Survey of Recent College Graduates. The experiment included eight different incentive conditions, one of which offered a higher incentive to encourage sample members to respond online. Nearly 80 percent of the sample members who were randomly assigned to the higher web incentive condition completed their survey, the highest response rate of all the experimental groups.7
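For concreteness, the sketch below shows how the core cost comparison of the experiment might be tabulated once fielding is complete. The completion counts are invented placeholders; only the $40/$25 incentive amounts come from the design described above.

```python
import pandas as pd

# Hypothetical completion tallies by incentive condition and timing.
# "Early bird" pays $40 for web completes in the first four weeks and $25
# thereafter; the control condition pays $25 throughout.
completes = pd.DataFrame({
    "condition": ["early_bird", "early_bird", "control", "control"],
    "window":    ["first_4_weeks", "later", "first_4_weeks", "later"],
    "n":         [600, 700, 450, 850],       # placeholder counts only
    "incentive": [40, 25, 25, 25],           # dollars per completed survey
})
completes["incentive_cost"] = completes["n"] * completes["incentive"]

# Completed surveys and incentive cost per complete, by condition.
by_condition = completes.groupby("condition").agg(
    n=("n", "sum"), cost=("incentive_cost", "sum"))
by_condition["cost_per_complete"] = by_condition["cost"] / by_condition["n"]
print(by_condition)
```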

10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

Respondents’ privacy will be protected to the fullest extent permitted by law. Study participants will be informed that their participation is voluntary and that information will be kept private to the extent permitted by law, and that information gathered will be presented in summary format and used only for statistical purposes. They will also be informed that their answers will not affect their eligibility for any Federal, state or local government programs, or for receipt of benefits from such programs. A range of measures described below will be followed to protect and safeguard the data that are collected.

a. Protection of Personal Information

It is Mathematica and MDRC policy to efficiently protect personal information and data, in whatever media they exist, in accordance with applicable Federal and state laws and contractual requirements. All study participants will receive unique identification codes which will be stored separately from personally identifying information. Consent forms will also be stored separately from identification codes. Baseline data will be entered in a database separate from the survey and administrative data, and these databases will never be merged. In conjunction with this policy, we require all staff members to:

  • Comply with a Confidentiality Pledge and Security Manual procedures to prevent the improper disclosure, use, or alteration of confidential information. Staff may be subject to disciplinary and/or civil or criminal actions for knowingly and willfully allowing the improper disclosure or unauthorized use of confidential information.

  • Access confidential and proprietary information only on a need-to-know basis when necessary in the performance of assigned duties.

  • Notify their supervisor, the Project Director, and the organizational Security Officer if confidential information has been disclosed to an unauthorized individual, used in an improper manner, or altered in an improper manner.

  • Report immediately to both the Project Director and the organizational Security Officer all contacts and inquiries concerning confidential or proprietary information from unauthorized staff and non-research team personnel.

b. Protection of Data

The security protocols also cover all aspects of privacy for hard copy and electronic data. All hardcopy materials will be shipped to contractors using Federal Express or an equivalent system that allows for package tracking; if any item is delayed or lost it will be investigated immediately. All completed hardcopy documents and other survey materials will be shipped to the SOC, a secure facility that can only be accessed by a key card. SOC staff will receipt the hardcopy documents into a secure database and store all documents containing sensitive information in locked file cabinets or locked storage rooms when not in use. These documents will be destroyed when no longer needed in the performance of the project. All SOC staff are required to comply with security policy and complete yearly refresher trainings on security awareness, including procedures for safeguarding personally identifiable information.

In addition, Mathematica has developed a Disaster Recovery Plan that provides a full contingency/disaster recovery plan for major systems outages. Data use agreements (DUAs) are negotiated on a case-by-case basis. DUAs are tracked in a database on a project-by-project basis to ensure, among other things, that the data collected during the project are destroyed at the end of the project in accordance with the DUA.

All of the major software systems that will be used on the project guarantee the security of the data they collect and store. All systems and their associated databases are secured behind the firewall between the local area network and any external internet connection. To the extent that the databases must be accessed outside this local area network, or when data must be shared across the different organizations that comprise the evaluation team, this access will be across secure point-to-point connections, and data will not be sent across the internet.

c. Background Checks and Security

Evaluation team members working with these data have undergone background checks, which include completing the SF-85 form.

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers these questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

The follow-up surveys include questions that some respondents might find sensitive. These questions ask about delinquent activities, arrests, incarcerations, earnings, drug and alcohol use, and physical and mental health. Collection of this information, though sensitive in nature, is critical for the evaluation because it will allow the study team to assess the impact of YouthBuild activities on outcomes the program is intended to influence, beyond the central outcomes related to education and employment. These areas are important because they reflect youth’s ability to successfully transition into adulthood, affect their overall well-being, and may mediate the effects of the program on education and employment outcomes. All questions on the follow-up survey, including those deemed potentially sensitive, have been pre-tested and used extensively in prior studies with no evidence of harm. All respondents will be informed that they can decline to answer any question they do not wish to answer and that there are no negative consequences for not participating.

12. Provide estimates of the hour burden of the collection of information. The statement should:

  • Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

  • If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.

  • Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage and rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 13.

There will be no burden from this data collection on YouthBuild sites participating in this evaluation. The hour burden of follow-up survey data collection for YouthBuild participants and members of the control group in the 83 sites participating in the impact component is outlined below in Table A.2. It is expected that there will be a total of 8,316 survey responses. We expect that 80 percent (n=2,772) of the 3,465 sample members will complete each of the three follow-up surveys. Youth who do not complete one of the surveys remain eligible for subsequent follow-ups. We anticipate that each completed interview will take, on average, 40 minutes to administer (this estimate is based on the pretest, as explained later). Based on this assumption, the total respondent burden for the three surveys is 5,544 hours (or 1,848 hours per follow-up survey [2,772 completers per survey x 40 minutes ÷ 60 minutes]).
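As a quick check, the short sketch below reproduces the burden arithmetic; all figures come from the text above and Table A.2.

```python
# Burden arithmetic for the three follow-up surveys (figures from the text).
survey_sample = 3465
respondents_per_wave = round(survey_sample * 0.80)  # 2,772 completes per wave
hours_per_wave = respondents_per_wave * 40 / 60     # 40-minute interviews
total_hours = 3 * hours_per_wave                    # three survey waves
print(respondents_per_wave, hours_per_wave, total_hours)  # 2772 1848.0 5544.0
```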


Table A.2. Burden Hour Estimates for YouthBuild Participants

Data Collection Instrument         | Number of Respondents/Instances of Collection | Frequency of Collection | Average Time Per Response | Burden (Hours)
12-month survey                    | 2,772 | Once        | 40 minutes | 1,848
30-month survey                    | 2,772 | Once        | 40 minutes | 1,848
48-month survey                    | 2,772 | Once        | 40 minutes | 1,848
Total for Proposed Data Collection | 8,316 | Three times | --         | 5,544


As noted above, the total estimate of burden for completion of the follow-up surveys is 5,544 hours; all hours would be borne by participants. At an average wage of $7.25 per hour, which is the wage paid to YouthBuild participants for their time spent in vocational training, this represents a cost of $13,398 per survey, or a total cost of $40,194 for the follow-up surveys, though this cost is offset by the incentive payments being provided to survey respondents. Estimates of annualized costs to respondents for the hour burdens for collection of information are summarized in Table A.3. Annualized costs assume that respondents participate in one survey per year over the three year period needed to conduct all three follow-up surveys.
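The cost figures can be checked the same way; the sketch below uses only the wage and burden figures from the text.

```python
# Respondent cost arithmetic for the follow-up surveys (figures from text).
hours_per_wave = 1848
hourly_wage = 7.25           # wage paid during YouthBuild vocational training
cost_per_wave = hours_per_wave * hourly_wage
print(cost_per_wave)         # 13398.0 -> $13,398 per survey wave
print(3 * cost_per_wave)     # 40194.0 -> $40,194 for all three waves
```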




Table A.3. Annualized Costs to YouthBuild Respondents

Data Collection Instrument         | Number of Respondents/Instances of Collection | Average Time Per Response | Burden (Hours) | Wage Rate | Total Annual Cost
12-month survey                    | 2,772 | 40 minutes | 1,848 | $7.25 | $13,398
30-month survey                    | 2,772 | 40 minutes | 1,848 | $7.25 | $13,398
48-month survey                    | 2,772 | 40 minutes | 1,848 | $7.25 | $13,398
Total for Proposed Data Collection | 8,316 | --         | 5,544 | --    | $40,194



13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).

The proposed data collection will not require the respondents to purchase equipment or services or to establish new data retrieval mechanisms. Survey content is based on study participants’ experiences in YouthBuild and other programs and factual information about their education, employment, involvement with the criminal justice system, living arrangements, marital status, fertility, and social-emotional development. Therefore, the costs to respondents solely involve answering the questions on the survey and are summarized above in Table A.3. No capital or start-up costs are anticipated, nor does the evaluation team expect extensive time spent on generating, maintaining, disclosing or providing the information.

14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.

The estimated cost to the Federal government for the study design and survey components discussed in this supporting statement can be seen below in Table A.4.

The total cost to the Federal government of carrying out this study is $15,216,674, to be expended over the seven-year period of the study. Of this, $6,023,988 is for the follow-up surveys. The remaining $9,192,686 is for other costs of the study, including design, implementation and monitoring of random assignment, analysis of administrative records data, and reporting.

Table A.4. Estimated Cost to the Federal Government

Task                                        | Total
12-month follow-up survey                   | $1,975,975
    Design Survey                           | $694,932
    Programming                             | $330,054
    Data Collection                         | $851,050
    Data File                               | $99,939
30-month follow-up survey                   | $1,765,392
    Design Survey                           | $40,553
    Programming                             | $198,629
    Data Collection                         | $1,428,897
    Data File                               | $97,312
48-month follow-up survey                   | $2,023,916
    Design Survey                           | $38,108
    Programming                             | $194,002
    Data Collection                         | $1,694,688
    Data File                               | $97,118
Analysis of 12-month follow-up survey       | $75,024
Analysis of 30-month follow-up survey       | $85,373
Analysis of 48-month follow-up survey       | $98,308
Total Cost for this Data Collection Request | $6,023,988

Note: While OMB clearance lasts for a total of three years, annualized costs to the government are based on costs incurred over the total project period of performance, which is seven years. Design and data file preparation tasks are included in the annualized costs. With the inclusion of these tasks, the data collection costs summarized in this table will be incurred over the seven-year contract period of performance.


In addition, an estimated $200,000 (two staff-year equivalents) will be spent by DOL staff managing the study and overseeing the contractor. Since the project will last seven years (including initial preparation, follow-up data collection, analysis and reporting), the annualized staff cost to the Federal government is $28,571 ($200,000 ÷ 7 years = $28,571). The total annualized cost to the government, including staff cost and the cost of data collection, is $889,141 (($6,023,988 + $200,000) ÷ 7 years = $889,141).
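The annualized figures in this item follow directly, as the short check below shows (all inputs are from the text above).

```python
# Annualized federal cost arithmetic from item 14.
data_collection_cost = 6_023_988   # follow-up survey total (Table A.4)
staff_cost = 200_000               # DOL staff over the project period
years = 7
print(round(staff_cost / years))                           # 28571
print(round((data_collection_cost + staff_cost) / years))  # 889141
```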

15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.

This is a new submission.

16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and end dates of the collection of information, completion of report, publication dates, and other actions.

The data collection for which this supporting statement is seeking clearance will not result in publicly available records. Data and study progress will be documented internally throughout the project. The evaluation plan includes a range of deliverables and reports. Table A.5 below shows an outline of these deliverables, followed by a fuller explanation for each item.


Table A.5. Schedule of Deliverables

Deliverable    | Date
Design Report  | Spring 2012
Process Report | June 2013
Interim Report | September 2015
Final Report   | March 2017


Design report. In Summer 2012, the evaluation team completed a design report describing in detail the proposed design for the evaluation. The report (included in the previous clearance request) discussed proposed sample sizes, research groups, the random assignment process, and site selection and recruitment. Based on a conceptual model of how YouthBuild might affect youth outcomes, the report also outlined the key administrative data to be collected and the major topics to be addressed in each of the follow-up surveys. Finally, the report outlined the proposed analysis plan for the process, impact, and cost-effectiveness studies.

Process report. In June 2013, the evaluation team will complete a report describing the findings from the process study. This report will document, for example, the process of recruiting sites for the evaluation, the characteristics of sites that participate, and the process of randomly assigning youth to either the program group or a control group. The report will also discuss the characteristics of youth served, the flow of participants through the programs, the delivery of services, youth participation rates, and any challenges to serving participants.

Interim report. In September 2015, the evaluation team will complete a report describing interim effects of YouthBuild on a range of outcomes. This report will use data from both administrative records and the 12- and 30-month surveys to examine impacts on educational attainment, employment, job characteristics, community involvement, attitudes and aspirations, family structure and living arrangements, and involvement with the criminal justice system. The report may also include an examination of effects for key subgroups of youth.

Final report. In March 2017, the evaluation team will complete the final report documenting longer-term impacts of YouthBuild. Likely outcomes will include participation in education and training, the attainment of educational credentials, employment and earnings, criminal justice involvement, family status and living arrangements, positive behaviors and activities, risky behaviors, health status and other measures of well-being. This report will also examine effects for key subgroups of youth and present an analysis of the effectiveness of certain program components. Finally, the report will present an analysis of the cost-effectiveness of the program.

Public Use File. A public use file of the data, stripped of individually identifying information, will be produced from the final data sets used by the evaluation. This file will include instructions for data retrieval, code books presenting means and frequencies for all variables, and documentation of important decisions made in the construction of outcome variables.


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

The expiration date for OMB approval will be displayed on all forms completed as part of the data collection.


18. Explain each exception to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I.

No exceptions to the certification statement are requested for this data collection.


1 Beebe, Timothy, Michael Davern, Todd Rockwood, Donna McAlpine and Kathleen Call (2004). The effects of a prepaid monetary incentive among low income and minority populations. Paper presented at the annual meeting of the American Association of Public Opinion Research, Phoenix, AZ, May 11, 2004. http://www.allacademic.com/meta/p_mla_apa_research_citation/1/1/5/9/6/p115965_index.html

2 We expect to obtain a total sample size of approximately 4,200 youth, assuming that the average program will have 50 applicants. The survey sample of 3,465 youth will be drawn randomly from the full evaluation sample.

3 The study team has requested access to the employment and earnings data available from the National Directory of New Hires.

4 Disadvantaged youth are typically a challenging population to locate and contact because they are highly mobile, lack an electronic footprint, and rely on cell phones and/or “throwaway” phones in higher numbers than other segments of the population. For example, the Census Bureau estimates that 30 percent of young adults between the ages 20-24 and 28 percent of those between the ages of 25-29 moved between 2004 and 2005, the most recent years for which these data are available (http://www.census.gov/population/www/pop-profile/files/dynamic/Mobility.pdf). The locating challenge that these high mobility rates present is exacerbated by youth’s limited “electronic footprint,” meaning the traceable records that adults create as they become employed, gain credit cards, and purchase homes. In February 2010, the seasonally adjusted employment rate for youth aged 16-19 was only 29 percent (http://www.bls.gov/news.release/empsit.t02.htm), an estimated 25 percent of the young adult population have credit cards (Abal, 2007), and around 25 percent of people under 25 years old own their own homes (http://www.census.gov/hhes/www/housing/hvs/annual04/ann04t15.html). Nearly 31 percent of adults under the age of 30 reside in cell-phone only households (http://ajph.aphapublications.org/cgi/reprint/99/10/1806) and disposable cell phones are becoming increasingly popular with this demographic. All of these issues will be relevant for locating and contacting efforts for the 12-, 30-, and 48-month follow-up youth surveys.

5 Berlin, M., L. Mohadjer, and J. Waksberg (1992). An experiment in monetary incentives. Proceedings of the Survey Research Section of the American Statistical Association, 393-398; de Heer, W., and E. de Leeuw. “Trends in household survey non-response: A longitudinal and international comparison.” In Survey Non-response, edited by R.M. Groves, D.A. Dillman, J.L. Eltinge, and R.J.A. Little. New York: John Wiley, 2002, pp. 41-54; Singer, E., and R. Kulka. In Studies of Welfare Populations: Data Collection and Research Issues, Panel on Data and Methods for Measuring the Effects of Changes in Social Welfare Programs, M. Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, editors. Washington, DC: National Academies Press, 2000, pp. 105-128; Beebe, Timothy, Michael Davern, Todd Rockwood, Donna McAlpine, and Kathleen Call (2004). “The Effect of a Prepaid Monetary Incentive Among Low Income and Minority Populations.” Conference Papers -- American Association for Public Opinion Research. Academic Search Premier, EBSCOhost (accessed September 26, 2011).

6 Ferber, R., and S. Sudman (1974). Effects of compensation in consumer expenditure surveys. Annals of Economic and Social Measurement, 3(2):319-331; Kulka, R.A. (1995, May). The use of incentives to survey hard-to-reach respondents. Paper prepared for the Council of Professional Associations on Federal Statistics seminar on New Directions in Statistical Methodology, Bethesda, MD; Link, M.W., A.G. Malizo, and T.R. Curtin (2001). Use of targeted monetary incentives to reduce nonresponse in longitudinal surveys. Paper presented at the annual conference of the American Association of Public Opinion Research, Montreal, Quebec, Canada; Martin, E., D. Abreu, and F. Winters (2001). Money and motive: Effects of incentives on panel attrition in the Survey of Income and Program Participation. Journal of Official Statistics, 17(2):267-284; Martinez-Ebers, V. (1997). Using monetary incentives with hard-to-reach populations in panel surveys. International Journal of Public Opinion Research, 9(1):77-86.

7 Mooney, Geraldine M. “National Survey of Recent College Graduates (NSRCG) 2008: Impact of the 2008 Incentive Experiment on 2008 NSRCG Costs.” Report submitted to the National Science Foundation. Princeton, NJ: Mathematica Policy Research, October 2011.


