YouthBuild Site Visit Protocols

OMB: 1205-0502


Part A: SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION


The Impact Evaluation of the YouthBuild Program is a seven-year experimental design evaluation funded by the U.S. Department of Labor (DOL), Employment and Training Administration (ETA), and the Corporation for National and Community Service (CNCS). YouthBuild is a youth and community development program that addresses several core issues facing low-income communities: education, employment, crime prevention, leadership development, and affordable housing. The program primarily serves high school dropouts and focuses on helping them attain a high school diploma or general educational development (GED) certificate and teaching them construction skills geared toward career placement. The evaluation will measure core program outcomes, including educational attainment, postsecondary planning, employment, earnings, delinquency and involvement with the criminal justice system, and youth social and emotional development.

The evaluation contract was awarded to MDRC in July 2010. The evaluation began in the fall of 2011 and is scheduled to continue until June 2017. MDRC is the prime contractor; Mathematica Policy Research (MPR) and Social Policy Research Associates (SPR) are subcontractors that will assist MDRC with designing the study, implementing random assignment, analyzing data collected for the study, and reporting the study’s findings. The YouthBuild evaluation design includes an impact component, an implementation component, and a cost-effectiveness component. All 2011 DOL-funded and CNCS-funded YouthBuild grantees will participate in the implementation component while a random selection of grantees will participate in the impact and cost-effectiveness components.

The evaluation will assess YouthBuild’s operation, participation by youth, and the program’s impact on youths’ education, employment, criminal justice, and personal development outcomes. It will also determine the net cost of the impacts generated.

DOL has submitted several requests to the Office of Management and Budget (OMB) as part of the YouthBuild evaluation (see Table A.1). The full package for this study is being submitted in separate parts because data collected through the evaluation’s initial stages informed the development of the subsequent data collection instruments. On June 15, 2011, OMB approved DOL’s request to administer a grantee questionnaire to programs participating in the evaluation (see ICR Reference #201005-1205-002), designed to provide basic information about how YouthBuild programs are structurally managed and operated relative to other youth training and education programs. The information was to be collected from all 2011 DOL- and CNCS-funded YouthBuild grantees to provide initial information about operations and to develop sufficient information to select 83 grantees to participate in the impact component of the evaluation.  

Table A.1. OMB Clearance Requests

Request # | Data collection | Date approved | OMB control #
1 | YouthBuild Grantee Questionnaire | June 15, 2011 | 1205-0436
2 | YouthBuild Reporting System | April 18, 2011; Rev. May 22, 2012 | 1205-0464
3 | Grantee Survey | March 13, 2012 | 1205-0488
4 (Current request) | YouthBuild Site Visit Protocols | |
5 (Future request) | Participant Follow-up Survey | |

A second clearance request, to continue to collect baseline and program service and activity data from YouthBuild participants, was approved on April 18, 2011 (see ICR Reference #201008-1205-002). This information will be collected via a web-based management information system (MIS) and is a key component of the departmental management of the program that will support case management and performance reporting. The MIS allows program operators to track services, outcomes, and, for this evaluation, the random assignment status of youth participating in YouthBuild. On March 13, 2012, OMB approved DOL’s request to administer a YouthBuild grantee survey (see ICR Reference #201108-1205-005). That survey will be administered to all 2011 YouthBuild grantees funded by DOL and CNCS. A future OMB-PRA package will request clearance for the remaining data collection instruments for the evaluation, specifically the follow-up surveys of study participants. It is understood that OMB clearance of the site visit data collection instruments that are the subject of this clearance request does not constitute clearance of the participant follow-up surveys that will be submitted in the future.

This package requests clearance for data to be collected during site visits to the 83 grantees selected to participate in the impact component of the YouthBuild evaluation. Specifically, it includes the following in-person interview protocols that will be used during those site visits:

Organizational Structure, Program Administration and Operations, Budget and Staffing (Appendix A)

Recruitment, Intake, Assessment, and Enrollment (Appendix B)

Mental Toughness Orientation (Appendix C)

Case Management, Supportive Services, and Follow-up Services (Appendix D)

Academic Services (Appendix E)

Vocational and Construction Training (Appendix F)

Employment Services (Appendix G)

Youth Leadership and Community Service (Appendix H)

Partnerships (Appendix I)

Alternative Youth Services (Appendix J)


The package also includes the following additional data collection instruments:

Cost Data Collection Worksheet (Appendix K)

Youth Focus Group Questionnaire (Appendix L)

Individual Interview Questionnaire (Appendix M)

Classroom Observation Checklist (Appendix N)

Worksite Observation Checklist (Appendix O)

Grantee Information Form (Appendix P)


A. Justification


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


As noted above, the YouthBuild evaluation design includes an impact component, an implementation component, and a cost-effectiveness component. This site visit information collection will produce a substantial amount of information needed for the implementation study and cost-effectiveness analysis. These site visits will take place while applicants are being randomly assigned to the program and while services are being provided to those in the program group. The implementation study will address several major topics important to the success of the evaluation, documenting: 1) the design of the participating YouthBuild programs, the services they offer to youth in the program groups, and the implementation of these services; 2) the key contextual factors in the local communities that may affect services and outcomes for youth in both research groups; and 3) the characteristics of the youth who participate in the study, the experiences of program group youth in YouthBuild, and the dosage of YouthBuild services they receive. Thus, the process study and the data obtained from the site visits are critical for providing context for the impact analysis. In addition, these data will be used as part of a formal analysis examining how program impacts vary with program features.

Evaluation team members will conduct site visits to each of the 83 sites participating in the impact component of the study, 60 funded by DOL and an additional 23 funded by CNCS. These visits are intended to provide information about the design of participating YouthBuild sites, the services they offer to youth in the program and the implementation of those services. They are also intended to provide information about key contextual factors in the local communities that may affect services and outcomes for youth and, finally, provide for the collection of data on the costs of operating YouthBuild.

The data collected through the site visits will also be used in the impact analysis in two ways. First, we may use the process study data to divide the sites into subgroups based on fidelity to the YouthBuild model. In this case, program impacts, for example on educational attainment from the follow-up surveys or on employment from administrative records, will be estimated using a basic impact model, fit separately for high-fidelity and low-fidelity sites:

Yi = α + βPi + δXi + εi

where: Yi = the outcome measure for sample member i; Pi = one for program group members and zero for control group members; Xi = a set of background characteristics for sample member i; εi = a random error term for sample member i; β = the estimate of the impact of the program on the average value of the outcome; α = the intercept of the regression; and δ = the set of regression coefficients for the background characteristics.
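
To illustrate the calculation only, the minimal sketch below shows how this basic impact model could be fit separately for high- and low-fidelity sites using ordinary least squares. The file name, variable names, and fidelity flag are hypothetical placeholders, not the evaluation’s actual data elements or estimation code.

    # Minimal sketch: estimate Yi = a + b*Pi + d*Xi + ei separately by site fidelity.
    # All names below (analysis_file.csv, outcome, program, age, hs_credential, fidelity)
    # are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("analysis_file.csv")  # one row per sample member

    for fidelity_group, subset in df.groupby("fidelity"):  # e.g., "high" vs. "low" fidelity sites
        model = smf.ols("outcome ~ program + age + hs_credential", data=subset).fit()
        print(fidelity_group, "estimated impact (beta):", round(model.params["program"], 3))

In practice, the evaluation’s estimates will follow the analysis plan described in the Design Report (Appendix S).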

Second, the evaluation will also examine whether the strength of implementation or certain program features, such as program fidelity, the length of the Mental Toughness Orientation, or the strength of postsecondary services, are associated with larger impacts, holding other program features constant. Findings from the process analysis as well as discussion with reviewers will guide the selection of program features. We will use multi-level estimation methods for this analysis, where the unit of analysis is the individual at Level One and the site at Level Two. The site-level impact, then, is allowed to vary with site characteristics (e.g., implementation strength, program components, service contrast).
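
As an illustrative sketch only (the final specification will be set out in the evaluation’s analysis plan), this two-level structure can be written in the same notation as the basic impact model above:

Level One (individual i within site j): Yij = αj + βjPij + δXij + εij

Level Two (site j): βj = γ0 + γ1Zj + uj

where Zj is a site-level characteristic (for example, implementation strength or a program component measure), γ1 indicates how the site-level impact βj varies with that characteristic, and uj is a site-level random error.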

This evaluation of the YouthBuild program will be carried out under the authority of the Workforce Investment Act, Section 172 (Appendix Q), which states that “for the purpose of improving the management and effectiveness of programs and activities…the Secretary shall provide for the continuing evaluation of the programs and activities” (WIA, Sec. 172(a) 1998).

This request seeks clearance only for the Interview Protocols, Cost Data Collection Worksheet, Youth Focus Group Questionnaire, Individual Interview Questionnaire, and Grantee Information Form, all of which are discussed below and will be used during the proposed site visits.

2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


Clearance by OMB is currently being requested to conduct interviews during the proposed site visits (Appendices A-J), collect program cost data (Appendix K), conduct focus groups and interviews with YouthBuild participants (Appendices L and M), and complete the Grantee Information Form (Appendix P). The data collected through these means will be used in the implementation and cost-effectiveness components of the study. Each instrument is described in detail below, along with a description of how, by whom, and for what purposes the information will be used. The site visits will take place after sites have conducted random assignment.

a. Interview Protocols


The purpose of the interviews with YouthBuild and partner organization staff is to fully document and understand the flow of participants through the program, the range of services provided, and how staff monitor participants’ progress. Separate protocols (Appendices A-J) have been developed for a range of topics, including: organizational structure, program administration and operations, budget and staffing; recruitment, intake, assessment, and enrollment; Mental Toughness Orientation; case management, supportive services, and follow-up services; academic services; vocational and construction training; employment services; youth leadership and community service; partnerships; and alternative youth services. The protocols are semi-structured scripts that guide site visitors’ data collection while on site and allow them to tailor the questions to each respondent as appropriate. The individuals expected to be interviewed include: YouthBuild program administrators, including those from the sponsoring agency, if relevant; recruitment and intake staff; case managers/counselors; job developers; educational and vocational instructors; and worksite supervisors. Contracted providers and partners providing services to participants, such as employers providing employment opportunities for youth, will also be interviewed (Appendix I).

Furthermore, describing key contextual factors in the local communities that may affect services and outcomes for study participants is one of the primary goals for the evaluation’s implementation analysis; attention to alternative services available is vital to understanding the service difference between members of the program group and the control group. In addition to talking to YouthBuild staff and participants about other services available in the community, site visitors will also contact and interview providers of alternatives to YouthBuild services in the community. Alternative providers include alternative schools, One-Stop Career Centers, and other community-based organizations (Appendix J).

Each interview will last approximately one hour.



b. Cost Data Collection Worksheet

During the site visit interviews, the evaluation team will also collect data on the costs of providing YouthBuild services (Appendix K). Cost information will be collected directly from sites, rather than from DOL, because site-level data collection is more detailed and because many YouthBuild programs combine funding from multiple sources to provide services to youth, which would be missed if cost data were collected from only one source. Assessments of the YouthBuild program will depend not only on the impacts it generates, but also on whether the benefits are deemed to justify the costs, as would be learned from a cost-effectiveness analysis.

Data will be collected to determine the average cost of YouthBuild per program participant. These costs include: administrative expenses; staff salaries and fringe benefits; other direct costs such as materials and supports; and indirect expenses such as rental and facility expenses. Costs of other services accessed by program participants through community partners (such as substance abuse or mental health counseling) and implied dollar value of contributions by participants to the production of housing will also be calculated.

c. Youth Focus Group/Interview Questionnaire

During each site visit, evaluation team members will interview a small number of YouthBuild participants, either in a group of five to six participants or individually. In a randomly selected half of the sites, focus groups of youth in the research sample will be held. In the other half of the sites, one-on-one interviews will be conducted with two youth participants. Interviews will ask participants about the program, its strengths and weaknesses, and its influence on their knowledge development, attitudes, and behaviors.

For the focus groups, site staff will be asked to select a group of youth to participate. Each focus group discussion will last approximately one hour. Site visitors will facilitate the discussions, asking the group to respond to a series of open-ended questions (Appendix L). These discussions will be relatively unstructured in order to allow for open discussion and a candid exchange of ideas.

For the individual interviews, youth will be randomly selected from the research sample in advance of the visits. Each individual interview will last approximately 45 minutes and use questions similar to those used in the focus groups (Appendix M). The youth interviews and focus group discussions will supplement information gathered through observations and interviews with program staff. Data collected during the discussions will also add a youth participant perspective to the evaluation team’s understanding of YouthBuild program services and operations.

d. Classroom and Worksite Observation Checklists

While information collected through direct observation may not be subject to clearance under the Paperwork Reduction Act (5 C.F.R. § 1320.3(h)(3)), we note that during the evaluation team’s site visits, structured observations of YouthBuild education and training activities will supplement discussions with program staff and participants. Systematic observations should provide rich insights into dimensions of YouthBuild programs that may relate to positive outcomes. The Classroom Observation Checklist (Appendix N) will be used in academic and vocational classes and focuses on several dimensions: classroom environment; teacher-student connection; linkage to the vocational (or academic) program; overall instruction; and student engagement. The Worksite Observation Checklist (Appendix O) will be used on construction worksites and assesses worksite management, the quality of activities available on the worksite, and student engagement.

The checklists will provide objective information about aspects of instructional quality that are central to YouthBuild. These data can be used to assess how well programs in the study implement their visions for their instructional component. Furthermore, detailed information about how successful programs provide classroom instruction and hands-on learning opportunities at worksites will be a useful resource for all YouthBuild programs.

e. Grantee Information Form

During the evaluation team’s site visits, evaluation team members will ask the YouthBuild program administrator to complete the Grantee Information Form (Appendix P). This form gathers data on the grantee’s history operating DOL-funded YouthBuild programs and the program’s overall budget, including the availability of non-DOL funding sources. The form also gathers comprehensive data on the YouthBuild program’s staffing structure, including the total number of staff and their roles and level of experience. These data will be used to assess the level of resources available at each program.

3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

The data collection efforts planned for each site visit will not involve automated, electronic, or other information technology collection techniques; rather, data will be collected on site exclusively through interviews and observations.

4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in item 2 above.


The current evaluation strives to minimize data duplication. Interview questions will gather information not available through other data collection efforts, though, in some cases, multiple individuals will be asked similar questions because it is important to obtain multiple perspectives. Focus groups with program participants and individual interviews with one or two participants at each site will collect much more detailed information about program experiences than will be collected through the participant follow-up surveys (the subject of a future clearance request to OMB). The focus group method and individual participant interviews also allow for additional probing, which is not possible through the follow-up surveys.

5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.


Administration of the site visit data collection instruments will create minimal burden on small businesses or other small entities. The evaluation team will work with site staff to plan an efficient yet productive site visit that is most convenient for the organization operating the YouthBuild program. Interviews with staff and others with connections to YouthBuild will last approximately one hour and will be scheduled to accommodate the needs of the respondents. Completion of the Cost Data Collection Worksheet will take approximately two hours per grantee. Completion of the Grantee Information Form will take approximately one hour per grantee. The observation of classroom and worksite activities does not require departure from standard operating practices and imposes no burden on program staff or YouthBuild participants.

6. Describe the consequences to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles in reducing burden.


The evaluation represents an important opportunity for DOL to add to the growing body of knowledge about the impacts of second-chance programs for youth who have dropped out of high school, including outcomes related to educational attainment, postsecondary planning, employment, earnings, delinquency and involvement with the criminal justice system, and youth social and emotional development.

If the information collection is not conducted, Federal program and policy activities will not be informed by the high-quality information needed to fully understand the impacts of the YouthBuild program, nor will DOL and CNCS understand which program components are most likely to produce impacts on participants. In general, it is critical to place the impact findings in the context of the program’s fidelity and strength. If some programs fail to fully implement all of the key components of the YouthBuild model, this will be important context for interpreting the impact findings. More specifically, a key part of the evaluation is to examine how impacts vary by program features, such as instructional quality and intensity of case management. This qualitative information is only available through the site visits. Finally, the site visits will provide important data on alternative services available in the community.

Not collecting information about the implementation of YouthBuild programs through the means outlined above will limit the evaluation team’s ability to understand and interpret the impact findings.

7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

  • requiring respondents to report information to the agency more often than quarterly;

  • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • requiring respondents to submit more than an original and two copies of any document;

  • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

  • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • requiring the use of statistical data classification that has not been reviewed and approved by OMB;

  • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.


There are no special circumstances surrounding the proposed data collection. All data will be collected in a manner consistent with Federal guidelines. There are no plans to require respondents to report information more than quarterly, to prepare a written response to a collection of information within 30 days of receiving it, to submit more than one original and two copies of any document, to retain records for more than three years, or to submit proprietary trade secrets. The evaluation team will conduct only one qualitative data collection site visit to each site participating in the impact component of the evaluation; this site visit will be limited to two days. The information gathered from site visit activities will be used to develop an understanding of how YouthBuild programs are implemented and operated and to inform results of the impact analysis.

8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


a. Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995, the public was given an opportunity to review and comment on this data collection plan through the 60-day Federal Register Notice, published on February 8, 2012 (FR, Vol. 77, No. 26, pp. 6585-6586). A copy of this notice is attached (Appendix R).

No comments were received.

b. Consultations Outside of the Agency

Outside of the evaluation team, there have been no consultations on the research design, sample design, data needs, or data. ETA has, however, collaborated with other DOL agencies and staff, including staff in the DOL Chief Evaluation Office.

9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


The evaluation team does not plan to offer any payments or gifts to program operators, staff, participants, or other individuals interviewed or observed during the site visits described in this clearance request.

10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


All individuals interviewed and observed will be informed that information gathered will not be attributable directly to the respondent and will only be discussed among members of the evaluation team. Terms of the DOL contract authorizing data collection require the contractor to maintain the privacy of all information collected, unless written permission is provided by the program applicant or participant. Accordingly, individual privacy will be protected to the fullest extent permitted by law.

a. Protection of Personal Information

Staff from MDRC, in conjunction with SPR, one of its subcontractors for this evaluation, will conduct the proposed site visits. It is SPR and MDRC policy to protect all information and data, in whatever media they exist, in accordance with applicable Federal and state laws and contractual requirements. In the event that program participant information is recorded, all program participants will receive unique identification codes, which will be stored separately from personally identifying information. Researchers from MDRC and its subcontractors who play a role in data collection and analysis will be trained in proper procedures for data handling and will be prepared to describe these procedures in full detail and to answer any related questions raised by YouthBuild staff and participants. Access to all data that identify respondents will be limited to staff at MDRC and its subcontractors who have a data collection or analysis role in the project, unless written permission is provided by the program applicant or participant. Such data will be needed for assembling records and assuring data alignment. Any data sent to DOL or CNCS will not contain personal identifiers or any other identifier that would allow individual identification of study participants, except as authorized in writing by the program applicant or participant.

In conjunction with MDRC’s data policy, all staff members are required to:

  • Comply with a Confidentiality Pledge and Security Manual procedures to prevent the improper disclosure, use, or alteration of confidential information. Staff may be subjected to disciplinary and/or civil or criminal actions for knowingly and willfully allowing the improper disclosure or unauthorized use of information.

  • Access information only on a need-to-know basis when necessary in the performance of assigned duties.

  • Notify their supervisor, the Project Director, and the organizational Security Officer if information has either been disclosed to an unauthorized individual, used in an improper manner, or altered in an improper manner.

  • Report immediately to both the Project Director and the organizational Security Officer all contacts and inquiries concerning information from unauthorized staff and non-research team personnel.


b. Protection of Data

The security procedures implemented by MDRC and its subcontractors cover all aspects of data handling for hard copy and electronic data. All hardcopy materials will be shipped to the contractors using Federal Express or an equivalent system that allows for package tracking; if any item is delayed or lost, it will be investigated immediately. All completed hardcopy documents will be stored in locked file cabinets or locked storage rooms when not in use. Unless otherwise required by DOL, these documents will be destroyed when no longer needed in the performance of the project.

c. Background checks and security

Evaluation team members working with these data have undergone background checks, which include completing the SF-85 form.

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers these questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


Questions of a sensitive nature will not be asked of YouthBuild program staff, program participants, or others interviewed during the site visits.

12. Provide estimates of the hour burden of the collection of information. The statement should:

  • Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

  • If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.

  • Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage and rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 13.


The hour burden of data collection is outlined in Table A.2.



Table A.2. Burden Estimates for YouthBuild Site Staff and YouthBuild Participants

Data Collection Instrument | Number of Respondents | Frequency of Collection | Number of Responses | Average Time Per Response | Burden (Hours)
Interview Protocols | 83 | 12 | 996 | 1 hour | 996
Focus Group Questionnaire | 231 | 1 | 231 | 1 hour | 231
Individual Youth Questionnaire | 82 | 1 | 82 | 45 minutes | 62
Classroom Checklist | 0 | 1 | | | 0
Worksite Checklist | 0 | 1 | | | 0
Total for Proposed Data Collection (Unduplicated) | 396 | | 1,309 | | 1,289


The 996 hours anticipated for the interview protocols are derived by multiplying the 83 sites that are to be visited as part of this effort by 12 respondents per site (83 x 12 = 996). Because the interviews will last one hour each, 12 is the maximum number of interviews that can be conducted within the two days the evaluation team members will spend at any given site (given that we also propose to conduct observations of classroom and worksite settings). Thus, the 996 respondents represent the maximum number that will be interviewed for this effort. The 166 hours anticipated for completion of the Cost Data Collection Worksheet are derived by multiplying the 83 sites that are to be visited by two hours per site (83 x 2 = 166). Five to six youth are expected to participate in the focus groups in each of 42 sites (one-half of the sites), resulting in an expected 231 focus group participants (42 x 5.5 = 231). These sessions are expected to last one hour. For the individual youth interviews, 82 youth are expected to participate in interviews (two youth in each of 41 sites) that will last 45 minutes, for a total of 61.5 hours. The Classroom Observation Checklist and Worksite Observation Checklist will be completed by the evaluation team site visitor, with no resultant burden imposed on either site staff or YouthBuild participants. The 83 hours anticipated for the Grantee Information Form are calculated by multiplying the 83 sites in the study by one hour per site (83 x 1 = 83).

As noted above, the total estimated burden for completion of the Interviews, Cost Data Collection Worksheet, Focus Group and Individual Questionnaires, Classroom and Worksite Observation Checklists, and Grantee Information Form is 1,538 hours; 293 of these hours would be borne by participants. At an average wage of $7.25 per hour, which is the wage paid to YouthBuild participants for their time spent in vocational training, this represents a total cost of approximately $2,124 for the focus groups and interviews ($7.25 x 293 = $2,124.25). The total cost for interviews of YouthBuild staff and other professionals (996 hours), assuming a wage of $25 per hour for individuals who are interviewed, is $24,900 ($25.00 x 996 = $24,900). The $25 per hour estimate is derived from the Bureau of Labor Statistics’ estimate of the mean hourly wage paid to those in Community and Social Service Occupations ($20.76, as of 2010) plus an estimate for the fringe benefits paid to these workers, estimated at 20% of the individual’s wages. Though any individual staff person’s wage or fringe benefits may be greater or lesser than this estimate, using the average wage estimate for all those within the industry should yield a reliable estimate of the overall burden across all respondents. Further, we use one hour as the estimate of burden for any single respondent because we confine interviews to this length of time so as not to unduly burden any single staff person. The interview protocol design is based on substantial prior interviews we have conducted with program staff (including YouthBuild staff) on unrelated activities. Additionally, because the protocols for different respondents overlap somewhat, if we reach the one-hour point for a given interview but have not yet covered all possible topics, we will conclude the interview with that individual, in order to limit the burden on any single respondent, and address any remaining topics with subsequent respondents. Thus, the one-hour estimate for each interview also represents a maximum, and the 996 total burden hours are a maximum for these interviews overall.

The total cost of completing the Cost Data Collection Worksheet, assuming an average wage of $36 per hour for those completing these forms, is $5,976 ($36 x 166 = $5,976). The $36 per hour estimate is derived from the Bureau of Labor Statistics’ estimate of the mean hourly wage paid to Social and Community Service Managers ($30 as of May 2010) plus an estimate for the fringe benefits paid to these workers, also estimated at 20% of the individual’s wages. The total cost of completing the Grantee Information Form uses the same staff hourly wage assumptions as the Cost Data Collection Worksheet, resulting in $2,988 ($36 x 83 = $2,988). The total cost burden for the data collection included in this request for clearance is $35,988 ($2,124 + $24,900 + $5,976 + $2,988 = $35,988).
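
As a transparency aid only, the short sketch below reproduces the burden-hour and cost arithmetic in this section, using the figures reported above; it is illustrative and not part of the data collection.

    # Reproduces the burden-hour and cost arithmetic described in this section.
    sites = 83

    interview_hours = sites * 12        # 12 one-hour interviews per site = 996 hours
    cost_worksheet_hours = sites * 2    # 2 hours per site = 166 hours
    focus_group_hours = int(42 * 5.5)   # 5-6 youth in half of the sites = 231 hours
    individual_interview_hours = 62     # 82 youth x 45 minutes = 61.5 hours, shown as 62 in Table A.2
    grantee_form_hours = sites * 1      # 1 hour per site = 83 hours

    participant_hours = focus_group_hours + individual_interview_hours                # 293 hours
    total_hours = interview_hours + cost_worksheet_hours + participant_hours + grantee_form_hours  # 1,538 hours

    participant_cost = round(participant_hours * 7.25)   # $7.25/hour -> $2,124
    staff_interview_cost = interview_hours * 25          # $25/hour -> $24,900
    cost_worksheet_cost = cost_worksheet_hours * 36      # $36/hour -> $5,976
    grantee_form_cost = grantee_form_hours * 36          # $36/hour -> $2,988
    total_cost = participant_cost + staff_interview_cost + cost_worksheet_cost + grantee_form_cost

    print(total_hours, total_cost)   # expected: 1538 35988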

13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information.


The proposed data collection will not require the respondents to purchase equipment or services or to establish new data retrieval mechanisms. No capital or start-up costs are anticipated. Nor does the evaluation team expect extensive time to be spent on generating, maintaining, disclosing or providing the information.

14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.


The estimated cost to the Federal government for the study design and components discussed in this supporting statement can be seen below in Table A.3.

The total cost to the Federal government of carrying out this study is $14,957,969, to be expended over the seven-year period of the evaluation contract. Of this, $1,197,996 is for the collection of information for the implementation and cost-effectiveness components of the evaluation, including one site visit to each YouthBuild site participating in the impact component of the evaluation.

Table A.3. Estimated Cost to the Federal Government

Evaluation Contract Task | Cost ($)
Developing site visit protocols, observation checklist, cost data worksheet, and youth focus group and interview questionnaires | 3,428
Conducting site visits | 967,302
Analyzing/coding data/preparing reports | 227,266
Total Contract Cost for this Data Collection Request | 1,197,996


In addition, an estimated $200,000 (two staff-year equivalents) will be spent by DOL staff managing the study and overseeing the contractor. Since the project will last seven years (including initial preparation, follow-up data collection, analysis and reporting), the annualized staff cost to the Federal government is $28,571 ($200,000 ÷ 7 = $28,571).

The total annualized cost for conducting this aspect of the evaluation is $1,226,567.

15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.


This is the fourth submission for data collection for the Impact Evaluation of the YouthBuild Program. It is a one-time request and will count as +1,538 hours toward ETA’s Information Collection Burden.

16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and end dates of the collection of information, completion of report, publication dates, and other actions.


The data collection for which this supporting statement is seeking clearance will not result in publicly available records. However, information gathered for the implementation and cost-effectiveness components will be described in the evaluation’s Implementation Report.

The evaluation plan includes a range of deliverables and reports. Table A.4, below, shows an outline of these deliverables, followed by a fuller explanation of each item.

Table A.4. Evaluation Reports

Reports | Delivery Dates
Design Report | Spring 2012
Implementation Report | Summer 2013
Interim Report | Fall 2015
Final Impact Report | Spring 2017


Design Report. In the spring of 2012, the evaluation team completed the proposed design for the evaluation. The report (Appendix S) discusses proposed sample sizes, research groups, the random assignment process, and site selection and recruitment. Based on a conceptual model of how YouthBuild might affect youth outcomes, the report outlines key administrative data to be collected and major topics to be addressed in each of the follow-up surveys. Finally, the report outlines the proposed analysis plan for the implementation, impact, and cost-effectiveness components of the evaluation.

Implementation Report. In the summer of 2013, the evaluation team will complete a report describing the findings from the implementation component of the evaluation. This report will document, for example, the process of recruiting sites for the evaluation, the characteristics of sites that participate, and the process of randomly assigning youth to either the program group or a control group. The report will also discuss the characteristics of youth served, the flow of participants through the programs, the delivery of services, youth participation rates, and any challenges to serving participants.

Interim Report. In the fall of 2015, the evaluation team will complete a report describing interim effects of YouthBuild on a range of outcomes. This report will use data from both administrative records and the 30-month survey to examine impacts on educational attainment, employment, job characteristics, community involvement, attitudes and aspirations, family structure and living arrangements, and involvement with the criminal justice system. The evaluation team will also attempt to examine effects for key subgroups of youth, which will be documented and described in the report.

Final Report. In the spring of 2017, the evaluation team will complete the final report documenting longer-term impacts of YouthBuild. Likely outcomes will include participation in education and training, the attainment of educational credentials, employment and earnings, criminal justice involvement, family status and living arrangements, positive behaviors and activities, risky behaviors, health status and other measures of well-being. This report will also examine effects for key subgroups of youth and present an analysis of the effectiveness of certain program components. Finally, the report will present an analysis of the cost-effectiveness of the program.

17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The expiration date for OMB approval will be displayed on all forms completed as part of the data collection.

18. Explain each exception to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I.


No exceptions to the certification statement are requested for this data collection.


