Impact Evaluation of the YouthBuild Program

OMB: 1205-0488

Mathematica Policy Research

PART B: COLLECTION OF INFORMATION INVOLVING STATISTICAL METHODS

The U.S. Department of Labor (DOL) has contracted with MDRC to conduct a rigorous evaluation of the 2011 YouthBuild program funded by DOL and the Corporation for National and Community Service (CNCS). The evaluation includes an implementation component, an impact component, and a cost-effectiveness component. All grantees will participate in the implementation component of the evaluation, while only a random selection of grantees will participate in the impact component. This data collection request pertains to a grantee survey to be administered to all 2011 DOL-funded and CNCS-funded grantees as part of the implementation component of the evaluation. In the future, DOL will submit a separate OMB-PRA clearance request for site visits (which are also part of the implementation component) and for a longitudinal series of participant surveys that will be administered as part of the impact component of the evaluation. This request, however, pertains only to the grantee survey. It is understood that approval of this data collection request (to administer the grantee survey) does not imply OMB clearance of site visit protocols or participant follow-up data collection instruments.

The grantee survey will be administered to the full universe of YouthBuild programs that either received a 2011 DOL grant or received a 2011 CNCS grant (but not a DOL grant).

1. Site Selection for the Grantee Survey

DOL and CNCS will require all 2011 YouthBuild grantees to complete the grantee survey.

2. Procedures for the Collection of Information

The grantee survey is a thirty-minute, web-based survey of all 2011 DOL-funded and CNCS-funded grantees. The survey will gather information about the programs and will serve two key purposes. First, it will provide uniform data on a variety of program characteristics to support the implementation analysis of the YouthBuild evaluation. These data will allow us to explore whether there are correlations between outcomes and program characteristics. Second, the survey will help place the impact analysis findings in context by allowing the team to document how the 77 programs participating in the impact component of the evaluation compare to the broader universe of 2011 grantees.

3. Methods to Maximize Response Rates and Data Reliability

We expect to achieve a response rate of 100 percent to the grantee survey. DOL requires its grantees to complete the survey and stated this requirement in its Solicitation for Grant Applications. Similarly, CNCS will require all of its 2011 grantees to complete the survey. In addition to these mandates, we will employ the following approach, which is designed to maximize efficiency and minimize costs:

  • Leveraging DOL’s, CNCS’ and YouthBuild USA’s relationships with grantees to encourage participation. We will work with all three organizations to obtain their help promoting the full engagement of the grantees.

  • A clear, streamlined survey instrument that will make responding easy and ensure accurate responses. The survey has been designed to be as brief as possible, with clear, easy-to-answer questions (mostly closed-ended questions with a few open-ended questions).

  • An official letter that will gain attention and legitimize the study. An advance letter with log-in information will be mailed to grantee sample members, lending legitimacy to the study and further encouraging participation.

A two-stage outreach strategy for reluctant responders will result in a high conversion rate. Beginning in week 2, we will send e-mail reminders to those who have not responded. Beginning in week 4, we will conduct follow-up telephone calls (in addition to e-mail reminders) to non-responders. MPR telephone interviewers are trained in refusal conversion, and experienced refusal converters will be assigned to work with reluctant responders to maximize conversion rates. In addition, DOL and CNCS will reach out to reluctant responders to remind them of their requirement to participate.
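As a purely illustrative sketch (not part of the approved data collection procedures), the two-stage follow-up schedule above can be expressed as a simple rule keyed to the number of weeks since the survey launch; the function name and week thresholds below are assumptions drawn only from the timeline described in this paragraph.

```python
# Illustrative sketch only: maps weeks since survey launch to the follow-up
# step described above (assumed thresholds: e-mail reminders from week 2,
# telephone calls added from week 4).
def followup_step(weeks_since_launch: int, has_responded: bool) -> str:
    if has_responded:
        return "no follow-up needed"
    if weeks_since_launch >= 4:
        return "e-mail reminder plus telephone follow-up"
    if weeks_since_launch >= 2:
        return "e-mail reminder"
    return "no follow-up yet (initial response window)"

# Example: a grantee that has not responded by week 5
print(followup_step(5, has_responded=False))
```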

4. Tests of Procedures or Methods

Pretesting all surveys is vital to the integrity of data collection. We reviewed previously used questions and developed new questions for the evaluation according to the following guidelines:

  • Questions will be worded simply, clearly, and briefly, as well as in an unbiased manner.

  • Key terms and concepts will be readily understandable to respondents.

  • Question response categories will be appropriate, mutually exclusive, and reasonably exhaustive, given the intent of the questions.

  • Questions will be accompanied by clear, concise instructions and probes so that respondents will know exactly what is expected of them.

Pretesting further enhances the quality of the data by ensuring that all questions are understood as intended and is thus an essential part of the survey design process.

Mathematica, under subcontract to MDRC, pretested the grantee survey during March and April of 2011 by administering the survey to selected 2010 grantees in three modes: a paper-and-pencil interview (PAPI) with a telephone debriefing, a telephone interview with a telephone debriefing, and an in-person cognitive interview. Each mode was intended to closely mimic the experience of completing the survey on the web or by telephone. In total, six pretests took place, two in each mode.

The goal of the pretest was to assess how respondents understood the terms and questions presented in the survey, to gauge the accuracy and relevance of our questions, and to determine whether we were missing important elements of YouthBuild programs in our questions. The pretest also allowed us to determine the length of time the survey took to complete in each mode. Pretesting the survey in multiple modes was important because it allowed us to detect variations in how respondents interact with the questionnaire in different settings. Additionally, cognitive interviews helped us to more thoroughly examine respondents' understanding of the questions, including the meaning of specific terms in the context of questions, the details of individual program operations, and what important factors we might be missing in our questions.

Feedback received during the pretests resulted in numerous changes to the grantee survey. Most of these changes related to the following broad issues:

  • Clarification of funding and operations related to DOL vs. other funding sources. The questionnaire initially asked several questions about operating budget, primary funder, and program capacity. Some pretesters reported confusion over how to report these figures because they believed that, as a DOL-funded evaluation, we were primarily interested in program elements tied directly to DOL funding. We clarified this issue by adding language in the introduction, as well as within several questions, specifying that we are interested in learning about budget and operations related to all funding sources.


  • Reporting staff in different program components. We initially asked respondents to provide the number of full- and part-time staff members employed in different components of their program (e.g., as educational instructors, case managers, or worksite coordinators). Respondents found these questions challenging in three ways. First, respondents noted that some full- and part-time staff work across multiple program components, meaning that the respondent could not accurately describe the resources provided by each individual staff member. Second, respondents reported that some of their staff members were compensated by an outside agency, such as AmeriCorps, but worked at their YouthBuild site. Respondents were unsure of how to report these staff members because the question asked about staff “employed” (which they understood to mean “compensated”) by the program. Finally, respondents noted that some staff members worked in a specific program component but did not have a title reflecting that work; thus, they were not sure whether they could report a staff member’s time within that program component. To resolve these three issues, we dropped the questions about the number of full- and part-time staff employed by the site and instead created a table asking for the full-time equivalent (FTE) positions that work at the site within each program component. We also changed the program components from job titles (such as Case Manager) to descriptors of the work itself (such as Case Management).


  • Clarification of reported numbers. Respondents noted that it was challenging to keep track of the numbers they provided for the applicants they received at their site, the MTO enrollees they accepted, and the final participants they enrolled. We resolved this issue by providing a grid after this series of questions showing the numbers the respondents had provided, with the opportunity to revise the numbers as necessary (a sketch of such a grid appears after this list).


  • Reporting number of hours in program activities. Originally, we asked respondents to provide the number of hours that participants spent in a variety of program activities (such as academic classes or construction work) over the course of the entire program. Respondents found this to be a difficult calculation, so later pretest versions of the questionnaire asked respondents to report the number of hours that participants spent in specific program activities during an average month. This calculation was still difficult for respondents, however, and we learned that some programs do not have consistent schedules from month to month, making responses for a “typical” month complicated. We determined that the most important purpose of this question was to capture the balance among key program activities; thus, the question was changed to ask whether educational activities or workforce activities took more time, or whether the two were approximately equal.
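As a purely illustrative sketch of the confirmation grid described in the third bullet above, the grid could echo the three reported counts back to the respondent for review; the field names, ordering check, and example values are assumptions for illustration only, not the wording of the actual instrument.

```python
# Hypothetical sketch of the confirmation grid described above: echo back the
# three reported counts so the respondent can review and revise them.
def confirmation_grid(applicants: int, mto_enrollees: int, final_enrollees: int) -> str:
    rows = [
        ("Applicants received", applicants),
        ("MTO enrollees accepted", mto_enrollees),
        ("Final participants enrolled", final_enrollees),
    ]
    lines = [f"{label:<30}{count:>6}" for label, count in rows]
    # Flag a likely inconsistency for review: each stage should not exceed the prior one.
    if not (applicants >= mto_enrollees >= final_enrollees):
        lines.append("Please review: a later stage exceeds an earlier one.")
    return "\n".join(lines)

# Example with hypothetical counts
print(confirmation_grid(applicants=120, mto_enrollees=80, final_enrollees=60))
```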


After completing the pretests, we merged the feedback from each questionnaire and discussed the best way to implement the appropriate changes. We recorded all pretest feedback, as well as the resulting revisions, in a master spreadsheet. Finally, we documented the duration of each pretest questionnaire to ensure that the average time burden was appropriate for respondents. These documented survey durations are available in Table B1.


Table B1. Pretest Survey Durations

Pretest Mode                     Duration (minutes)
Paper and Pencil 1                       55
Paper and Pencil 2                       30
Phone 1                                  41
Phone 2                                  37
Cognitive 1                             111*
Cognitive 2                             140*
Average Administration Time              41

*Cognitive interviews were not included in the calculation of average questionnaire duration because they were intended to gather data on respondent perceptions and opinions during the course of the interview. Thus, cognitive interviews were artificially lengthy.


The average administration time is 41 minutes, excluding the cognitive interviews as noted. This estimate is artificially high because of a lengthy paper-and-pencil pretest that lasted 55 minutes. During that interview, the respondent asked for input from another staff member at the site, and the two ended up discussing most of the terms in the survey. The respondent indicated that she was very interested in the survey and that her discussion with her colleague reflected intellectual curiosity rather than confusion about the terms. The respondent did not think the survey would have taken as long had she not discussed so many of the terms with her colleague.
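For reference only, the 41-minute average follows directly from the four non-cognitive pretest durations in Table B1; the short sketch below simply reproduces that arithmetic and is not part of the study materials.

```python
# Reproduces the Table B1 average: cognitive interviews are excluded because
# they were artificially lengthy (see the table footnote).
durations = {
    "Paper and Pencil 1": 55,
    "Paper and Pencil 2": 30,
    "Phone 1": 41,
    "Phone 2": 37,
    "Cognitive 1": 111,  # excluded from the average
    "Cognitive 2": 140,  # excluded from the average
}
included = [v for mode, v in durations.items() if not mode.startswith("Cognitive")]
average = sum(included) / len(included)  # (55 + 30 + 41 + 37) / 4 = 40.75
print(round(average))                    # prints 41
```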

5. Individuals Consulted on Statistical Methods

No statistical methods will be used for the analysis of the data collected through the grantee survey.  All data collection will be based on a 100 percent sample of the inference population.  In all reports and other publications and statements resulting from this work, no attempt will be made to draw inferences to any population other than the set of units that responded to the data collection effort. As a result, there are no consultants on the statistical aspects of the design for the grantee survey.
