PART B: DATA COLLECTION ACTIVITIES FOR THE EVALUATION OF AMERICA’S PROMISE JOB-DRIVEN GRANT PROGRAM

OMB No. 1290-0NEW

September 2018

PART B: DATA COLLECTION ACTIVITIES

The Chief Evaluation Office of the U.S. Department of Labor (DOL) has commissioned an evaluation of the America’s Promise Job-Driven Grant Program (America’s Promise). This program aims to create or expand regional partnerships that will identify the needs of specific industry sectors relying on the H-1B visa program to hire skilled foreign workers and prepare the domestic workforce for middle- and high-skilled, high-growth jobs in those sectors. The America’s Promise evaluation offers a unique opportunity to build knowledge about the implementation and effectiveness of these regional partnerships. Mathematica Policy Research and its subcontractor Social Policy Research Associates have been contracted to conduct an implementation and impact evaluation. This package requests clearance for two data collection activities as part of the implementation evaluation:

  1. Grantee survey

  2. Partner network survey

A future information collection clearance request will be submitted with additional implementation evaluation instruments, including protocols for semi-structured telephone interviews and protocols for program stakeholder interviews and focus groups for use during site visits. The impact evaluation will rely on administrative data (no primary data will be collected).

B.1. Respondent universe and sampling

The universe of sites for this evaluation includes the 23 grantees awarded America’s Promise grants. The implementation evaluation includes administration of the grantee survey to all 23 grantees and administration of the partner network survey to a purposively selected subsample of approximately 6 grantees and their regional partners. (A future information collection clearance request will include a purposively selected sample of approximately 12 grantees to participate in site visits involving program stakeholder interviews and focus groups, as well as approximately 11 grantees to participate in telephone interviews with program stakeholders.)

Table 1. Universe and sample counts for evaluation components

| Evaluation component | Number of sites | Type of respondent | Universe of respondents per site | Expected sample per site | Total sample |
|---|---|---|---|---|---|
| Grantee survey | 23 | Grant managers | 1 | 1 | 23 |
| Partner network survey | 6 | Grant manager and lead contact at regional partners | 6 to 100+ | Up to 25 | Up to 150 |



1. Site selection

Each component of the implementation study will include a different set of grantees. The grantee survey will be administered to all 23 grantees. There are no strict criteria for selecting the partner network survey sites, but considerations include (1) the type, maturity, and strength of each partnership; (2) the extent and nature of employer engagement; (3) the H-1B sector of focus; (4) the intensity of the approach; (5) the approach to sector-based career pathways; (6) the target population; (7) strategies for obtaining and using resources for tuition-free training; and (8) strategies for customer-centered design. DOL will prioritize grantees that have developed promising partnerships, such as grantees with active advisory councils, engaged employer partners, partners dispersed throughout the region, or promising new partnerships.

B.2. Procedures for the collection of information

Understanding the effectiveness of the America’s Promise program requires data collection from multiple sources. To collect these data, the study team will field a grantee survey in fall 2018 and a partner network survey in early 2019 and in summer 2020. Below, we describe each data collection activity in detail, followed by a discussion of the analysis methods to be used.

1. Data collection procedures

Grantee survey. The grantee survey will be administered to all 23 grantees in fall 2018 using a web-based format. Login information for the survey will be emailed to the lead contact at each grantee. During the fielding window, the study team will send periodic reminders to lead contacts who have not yet completed the survey.

Partner network survey. The partner network survey will be administered to six grantees and approximately 24 partners for each grantee using a web survey in early 2019 and then again in summer 2020. We will identify a lead contact at each partner and email an invitation to the contact with personalized survey login information. During the fielding window, periodic reminders to complete the survey will be sent to contacts who have not yet returned the survey.

2. Analysis methods for implementation evaluation

The goal of the implementation evaluation is to obtain a comprehensive picture of America’s Promise grantees: how the regional workforce partnerships were developed and maintained, the types and combinations of services the partnerships provided, the characteristics of the target population, and the community contexts of the grantees. To do this, we will analyze document reviews of grant applications and quarterly performance and narrative reports for all 23 grantees; the web-based grantee survey of all 23 grantees; two rounds of the partner network survey with approximately 6 grantees; and site visits to approximately 12 grantees and telephone interviews with approximately 11 grantees (instruments to be included in a future request).

The Consolidated Framework for Implementation Research will be used to guide the analysis of implementation data gathered from all 23 grantees, including the identification of facilitators and barriers.1 The framework was developed to facilitate systematic assessment of the implementation context, revealing the factors that influence implementation, common implementation challenges, and promising strategies for replication. We will describe the analysis in a future clearance package that includes the data collection instruments for the semi-structured telephone and site visit interviews.

Using basic descriptive statistics, the study team will summarize quantitative data from the quarterly performance reports, the grantee survey, and the partner network survey. The analysis of each source will follow a common set of steps: data cleaning, variable construction, and computation of descriptive statistics. To prepare the data for analysis, a series of data checks will be run to examine frequencies and means and to assess the extent of missing data. To facilitate analysis, the team will create variables that address the implementation constructs of interest. Several survey items may be combined into a scale, and responses from different sources or points in time may be compared to identify the level of agreement. For standardized scales, such as the Working Together Survey, the psychometric properties of the variables will be examined to assess whether they meet accepted standards.2
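To make these steps concrete, the sketch below works through them on a hypothetical grantee survey extract. The item names, response scale, and 0.70 reliability threshold are illustrative assumptions rather than the study team’s actual specifications; the code runs the data checks, constructs a scale score, and computes Cronbach’s alpha, a standard internal-consistency measure in line with the accepted psychometric standards cited above.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: internal consistency of items forming one scale."""
    items = items.dropna()
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical extract: three Likert items (1-5) intended to form a
# single "partnership strength" scale (names invented for illustration).
item_cols = ["q1_shared_goals", "q2_communication", "q3_joint_decisions"]
survey = pd.DataFrame({
    "q1_shared_goals":    [4, 5, 3, 4, 2, 5],
    "q2_communication":   [4, 4, 3, 5, 2, 4],
    "q3_joint_decisions": [3, 5, 2, 4, 1, 5],
})

# Step 1: data checks -- frequencies/means and the extent of missing data.
print(survey.describe())
print("share missing:\n", survey.isna().mean())

# Step 2: variable construction -- average the items into a scale score.
survey["partnership_scale"] = survey[item_cols].mean(axis=1)
print(survey["partnership_scale"].describe())

# Step 3: reliability -- compare alpha with the conventional 0.70 threshold.
print(f"Cronbach's alpha = {cronbach_alpha(survey[item_cols]):.2f}")
```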

Network analysis will be used to study grantee partnerships. The partner network survey will include a set of network questions in which respondents report on their relationships, as measured by the frequency, type, and focus of communication, with all other key partner respondents. These data will be used to describe the levels of communication and collaboration between partners, as well as changes over time. For each partnership, visual representations of the relationships between organizations, known technically as sociomatrices and sociograms, will be created (examples of these visuals can be found in Honeycutt 2009).3 Each partner is represented as a node, and connections between two organizations are shown with lines that vary in thickness to represent, for example, the frequency of communication. These visuals will be used to describe the size of the partnerships and to identify any organizations that are isolated from the network.

A series of network statistics will also be calculated to examine various aspects of the partnerships. For example, density is the proportion of existing collaborative ties relative to all possible ties, and reciprocity is the degree to which organizations agree on their shared relationships within the partnership. We will conduct descriptive analyses to examine change over time within each of the six partner network study sites, such as changes in the patterns of interaction represented in the sociograms and changes in network statistics such as density. Interpretation of the results will be grantee specific, based on each grantee’s context and history of collaboration with partners. For example, some grantees may have been able to build on existing, mature partnerships in their communities, whereas others may have recruited new partners for America’s Promise.
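As an illustration, the sketch below builds a small directed network from hypothetical survey ties (the organization names and frequency weights are invented for the example) and computes the density, reciprocity, and isolate measures described above, then draws a simple sociogram with edge widths scaled to communication frequency. It uses the open-source networkx library; the study team’s actual tooling may differ.

```python
import networkx as nx
import matplotlib.pyplot as plt

# Hypothetical ties from one partnership's survey: each tuple is
# (reporting organization, named partner, communication frequency score).
ties = [
    ("Grantee", "College", 5),        ("College", "Grantee", 4),
    ("Grantee", "EmployerA", 3),      ("EmployerA", "Grantee", 1),
    ("Grantee", "WorkforceBoard", 4), ("WorkforceBoard", "Grantee", 4),
    ("College", "EmployerA", 2),      # named but not reciprocated
]

G = nx.DiGraph()
G.add_weighted_edges_from(ties)
G.add_node("CBO")  # a partner that reported no ties -- an isolate

# Density: existing ties as a share of all possible ties.
print(f"density     = {nx.density(G):.2f}")
# Reciprocity: share of ties reported by both organizations.
print(f"reciprocity = {nx.reciprocity(G):.2f}")
print("isolates    =", list(nx.isolates(G)))

# Sociogram: each organization is a node; line thickness reflects frequency.
pos = nx.spring_layout(G, seed=42)
widths = [G[u][v]["weight"] for u, v in G.edges()]
nx.draw_networkx(G, pos, width=widths, node_color="lightsteelblue")
plt.axis("off")
plt.savefig("sociogram.png", dpi=200)
```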

3. Assessing and correcting for survey nonresponse bias

A plan for the assessment and correction of survey nonresponse bias is not necessary for the analysis of the grantee survey and partner network survey data. The response rate for the grantee survey is anticipated to be 100 percent, and, given the nature of the partner network analysis, nonresponse adjustments would not sufficiently account for missing response data. Instead, the study team will focus its efforts on establishing and maintaining contact with sample members to achieve sufficient levels of response.

B.3. Methods to maximize response rates and data reliability

No monetary or nonmonetary incentives will be provided to respondents. The methods to maximize response rates and data reliability are discussed for each instrument included in the request.

1. Grantee survey

Response rates. The grantee survey will be administered to all 23 grantees in fall 2018 to collect information on service delivery models, staffing, staff development, partnerships, and implementation of the core program elements. The study team will email the link to the web survey to the lead contact at each grantee and monitor progress throughout the fielding period. Reminders will be sent weekly to grantees that have not yet fully completed the survey instrument, first by email and then by phone, until the desired response rate is met. In addition, ongoing relationships with grantee staff through grantee meetings, DOL support, and a succinct survey instrument will facilitate a response rate of 100 percent.

Data reliability. Data from completed web surveys will be reviewed throughout the fielding period for accuracy and consistency. The use of the web mode allows for sophisticated skip logic and fills within the instrument, further improving the overall reliability of the data collected.

2. Partner network survey

Response rates. The partner network survey will be administered to six America’s Promise grantees and their key partners (approximately 24 per grantee) in early 2019 and again in summer 2020 to collect information on partnership development and resulting systems change. The study team will email the links to the web survey to the lead contacts at each grantee and their identified partners and will monitor progress throughout fielding. Email reminders will be sent weekly to any contacts who have not yet fully completed the survey instrument. Ongoing relationships between the study team and program staff, particularly those participating in the impact study, as well as DOL support, will facilitate a response rate of at least 80 percent for each round of data collection. Including only key partners that are active in regional efforts will also support response rates. Because of the nature of the network analysis, nonresponse adjustments will not be sufficient to correct for missing response data. Instead, the study team will work with DOL to reach response rate targets, increasing contact with nonrespondents as needed; this extended contact may include phone calls or outreach from program staff to encourage response. The study team has fielded partner surveys for other studies that achieved similar response rates.4,5

Data reliability. Data from completed web surveys will be reviewed throughout the fielding period for accuracy and consistency. The use of the web mode allows for sophisticated skip logic and fills within the instrument, further improving the overall reliability of the data collected.

B.4. Tests of procedures or methods

All instruments were pre-tested to identify any issues in substance or comprehension prior to submission of this package. The grantee survey was pre-tested with two individuals from existing America’s Promise grantees and the partner network survey was pre-tested with two individuals from partner organizations of existing America’s Promise grantees. Following each pre-test, members of the study team conducted a debrief using a standard protocol to determine whether any topic areas were unclear or any words or questions were difficult to understand or answer.

B.5. Individuals consulted on statistical methods

Consultations on the statistical methods used in this study will ensure its technical soundness. The following individuals are being consulted on statistical aspects of the design:

Peter Mueser, PhD

Professor, Department of Economics and Truman School of Public Affairs

University of Missouri

Columbia, MO 65211


Mary Alice McCarthy

Director, Center on Education and Skills

New America

740 15th Street NW, Suite 900

Washington, DC 20005


Margaret Hargreaves

Principal Associate

Community Science

438 N. Frederick Avenue, Suite 315

Gaithersburg, MD 20877



The following individuals consulted on statistical aspects of the design and will also be primarily responsible for actually collecting and analyzing the data for the agency:

Mathematica Policy Research

Ms. Jeanne Bellotti (609) 275-2243

Dr. Sheena McConnell (609) 936-2783

Dr. Robert Santillano (510) 285-4653

Ms. Diane Paulsell (609) 275-2297

Consultant

Dr. Kevin Hollenbeck (269) 343-5541




1 Damschroder, L.A., D.C. Aron, R.E. Keith, S.R. Kirsh, J.A. Alexander, and J.C. Lowery. “Fostering Implementation of Health Services Research Findings into Practice: A Consolidated Framework for Advancing Implementation Science.” Implementation Science, vol. 4, article 50, August 7, 2009.

2 Nunnally, J.C., and I.H. Bernstein. Psychometric Theory, 3rd ed. New York: McGraw-Hill, 1994.

3 Honeycutt, Todd. “Making Connections: Using Social Network Analysis for Program Evaluation.” Issue Brief Number 1. Princeton, NJ: Mathematica Policy Research. November 2009.

4 Cole, Russell, and Margo Rosenbach. “The Integration Initiative Network Survey: Cross-Site Report.” Report submitted to Living Cities. Princeton, NJ: Mathematica Policy Research, August 2012.

5 Angus, Megan Hague, Brittany English, Kevin Hollenbeck, Jeanne Bellotti, Stephanie Boraas, and Sarah Osborn. “Regional Collaboration to Create a High-Skilled Workforce: Evaluation of the Jobs and Innovation Accelerator Challenge Grants.” Draft submitted to U.S. Department of Labor, Employment and Training Administration. Ann Arbor, MI: Mathematica Policy Research, June 24, 2016.
