Part B: Collections of Information Employing
Statistical Methods for the Institutional
Analysis of American Job Centers
November 9, 2015
Submitted to:
Office of Management and Budget
Submitted by:
Chief Evaluation Office
Office of the Assistant Secretary for Policy
United States Department of Labor
200 Constitution Avenue, NW
Washington, DC 20210
CONTENTS
PART B: COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
B.1. Respondent universe and sampling methods
1. AJC sample selection
2. State administrator interview
3. Partner network analysis survey
B.2. Procedures for the collection of information
B.3. Methods to maximize response rates and deal with nonresponse
B.4. Tests of procedures or methods to be undertaken
B.5. Individuals consulted on statistical aspects of design and on collecting and/or analyzing data
REFERENCES
PART B: COLLECTION OF INFORMATION EMPLOYING STATISTICAL
METHODS
The Chief Evaluation Office (CEO) of the U.S. Department of Labor (DOL) has contracted
with Mathematica Policy Research and its partners—Social Policy Research Associates, the
George Washington University, and Capital Research Corporation (hereafter “the study
team”)—to conduct the Institutional Analysis of American Job Centers (AJCs) in order to
examine the full spectrum of institutional features that shape AJCs’ day-to-day operations and
customer experiences. The study will (1) present a comprehensive and systematic description of
AJC funding, organization, administration and management, and service delivery structures and
processes, and (2) examine AJCs' service delivery to their customers, including the services
provided to target populations.
This package requests clearance for three data collection activities conducted as part of the
study: (1) site visits to AJCs and their local workforce investment boards (LWIBs);
(2) interviews with state workforce administrators; and (3) a network analysis survey of AJC
partner organizations.
B.1. Respondent universe and sampling methods
1. AJC sample selection
To select AJCs for site visits, the study team will employ a two-phase sampling approach
that will yield a purposive sample of AJCs. This approach aims to capture geographic diversity,
variation in urbanicity, and variation in the types of administrative entities that operate AJCs.
Phase one. In the first phase of site selection, the study team will select a systematic random
sample of 120 comprehensive AJCs from all AJCs in the United States, implicitly stratifying by
state and urbanicity. DOL’s Office of Workforce Investment (OWI) maintains America’s Service
Locator, a database of the nation’s AJC network. We will obtain from OWI a file containing the
name and location (address) of all AJCs in the United States. We will limit the sample frame to
(1) comprehensive AJCs (that is, omitting affiliate job centers) and (2) AJCs in the 48
contiguous states.
Urbanicity will be identified by using the USDA Rural-Urban Continuum Code
classification for 2013 (RUCC), which distinguishes metropolitan counties by the population size
of their metro area, and nonmetropolitan counties by degree of urbanization and proximity to a
metropolitan area. Each county in the U.S. is assigned a RUCC classification by USDA. To
match the corresponding RUCC classification to each AJC, we will identify the appropriate state
and county Federal Information Processing Standards (FIPS) codes based on the location of the
AJC using Census FIPS code data for counties. We will merge FIPS codes onto the sample
frame by ZIP code.
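As an illustrative sketch of this merge (not the study team's actual program), the following assumes hypothetical file names and column labels (zip, fips, rucc) standing in for the OWI extract, a ZIP-to-county crosswalk, and the USDA RUCC file:

    import pandas as pd

    # Hypothetical file names and layouts; the real inputs come from OWI,
    # a ZIP-to-county crosswalk, and the 2013 USDA RUCC file.
    ajcs = pd.read_csv("ajc_locations.csv")    # name, address, zip
    zip_to_fips = pd.read_csv("zip_fips.csv")  # zip -> state/county FIPS
    rucc = pd.read_csv("rucc_2013.csv")        # fips -> RUCC code

    # Attach a county FIPS code to each AJC by ZIP code, then the county's
    # RUCC urbanicity code.
    frame = (
        ajcs.merge(zip_to_fips, on="zip", how="left")
            .merge(rucc, on="fips", how="left")
    )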
We will then create a systematic sample of AJCs using implicit stratification. By implicit
stratification, we mean sorting the sampling frame by one or more stratification variables before
sampling to help the sample resemble the frame with respect to the distribution of those
characteristics. For our phase 1 sample of 120 AJCs, we will implicitly stratify the sampling frame
by state, and then by level of urbanicity within state.
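A minimal sketch of this selection step, assuming the merged frame above with hypothetical state and rucc columns and an arbitrary fixed random seed:

    import numpy as np

    def systematic_sample(frame, n, seed=0):
        # Implicit stratification: sort the frame by the stratification
        # variables so a systematic draw mirrors their joint distribution.
        ordered = frame.sort_values(["state", "rucc"]).reset_index(drop=True)
        interval = len(ordered) / n                       # skip interval
        start = np.random.default_rng(seed).uniform(0, interval)  # random start
        picks = (start + interval * np.arange(n)).astype(int)     # every k-th unit
        return ordered.iloc[picks]

    phase_one = systematic_sample(frame, n=120)

Sorting before the systematic draw is what makes the stratification "implicit": no explicit strata or allocation are set, yet the 120 selections spread across states and urbanicity levels in proportion to the frame.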
Phase two. In the second phase, the study team will select AJCs using purposive sampling
based on variation in the types of administrative entities that manage AJC operations, geographic
location, and urbanicity. Under the Workforce Investment Act (WIA) of 1998 and the recently
enacted Workforce Innovation and Opportunity Act (WIOA), which replaced WIA, center
operators can be a wide range of entities including postsecondary educational institutions,
employment service agencies, nonprofit organizations, private for-profit agencies, government
agencies, or other business organizations. Center operators can be a single entity or a consortium
of entities. Because there is no centralized data source that maintains data on AJC center operator
type (or other institutional features such as colocation of partners), we will ask the LWIBs that
oversee the AJCs selected in phase one: (1) what type
of center operator currently manages that AJC, and (2) whether the AJC's operator is a
designated consortium or was procured through a competitive process. In the event that we are
unable to make successful contact with the LWIB, we will ask the AJC to provide us with this
basic information.
After obtaining these two items of information for all 120 AJCs, we will select a purposive
sample of 60 comprehensive AJCs to ensure that the AJC study sample captures variation in
center operator types, geographic diversity, and a mix of rural, suburban, and urban areas. The
study team will conduct site visits to these 60 selected comprehensive AJCs. Because phase two of
site selection uses a purposive approach to sampling the 60 comprehensive AJCs for
participation in the study, study findings will apply only to the 60 selected AJCs and will not be
generalizable to the entire AJC population.
2. State administrator interview
The study team will conduct telephone interviews with state workforce administrators in
each state in which there is a selected AJC. We anticipate that the 60 selected AJCs will be
located in as many as 40 states. Some states may contain more than one selected AJC; the study
team will interview the state administrators from these states only once. Therefore, the study
team will interview state administrators in up to 40 states.
The study team will focus on interviewing the state-level administrators of entities that are
most involved in policy and administrative activities that affect AJC operations. We anticipate
that this will typically include a representative from each of two groups of state-level staff:
(1) Employment Service managers and (2) agency staff responsible for enforcing WIA/WIOA
and setting and monitoring policies relevant to AJCs. The study team will contact the state
workforce administrator indicated in DOL records, and request that they identify these
individuals. This means that across the 40 states in which selected AJCs are located, the study
team will interview about 80 administrators.
3. Partner network analysis survey
The network survey is a brief, targeted tool used to explore the strength of relationships
between the key entities (partners) that oversee service delivery within the AJC framework as
part of the overall effort to describe and analyze AJCs as institutions. The short survey will
systematically collect information on select elements of partner interactions (frequency of
communication, level of collaboration, and referral flow) (see instrument 6) and will include
partners that the study team might not be able to meet with directly on site.
The study team will conduct the survey with a purposively selected subset of 30 of the AJCs
visited. The purposes of the survey are to identify different typologies of AJC networks and
explore what these typologies suggest about the themes and variations across AJCs in the
structure and strength of partner networks. The survey is not intended to comprehensively
present the network for each individual AJC. It is an exploratory analysis to further inform
knowledge about AJC networks and their functioning. As such, the survey can fulfill its purpose
and accomplish its goals by focusing on a subset of AJCs that are purposively selected to
represent a range in the types of AJC operating entities, size, and geographic location (states as
well as urban/rural mix).
We will administer the survey to an average of 15 partners identified by the AJC manager in
each of the selected 30 AJCs. To identify the list of partners, the study team will first discuss the
structure of the AJC with AJC managers during initial outreach and previsit planning calls.
During these site contacts, we will ask managers about the appropriate entities and locations for
the delivery of all programs and services—both mandated and voluntary—that are important to
the AJC service delivery structure. During the site visits, the study team will then confirm this
structure and, if necessary, add more partners to develop a comprehensive map of the AJC
service delivery network.
Based on information gathered through the WIA Adult and Dislocated Worker Programs
Gold Standard Evaluation, we expect to identify 15 key partner entities, on average, across AJCs.
The number of partner entities in the WIA evaluation sites ranged from 4 to 30, with a median of
11 and an average of 12. A recent descriptive analysis of AJC services available to low-income
populations at six AJCs found that each had 9 to 16 partners, similar to the numbers identified
by the Gold Standard Evaluation (Wright and Montiel 2010). In addition, drawing from the WIA
Gold Standard Evaluation information and the experience of the study team, we expect to find
the following types of partner entities: (1) state departments of labor or workforce; (2) city or
county departments of labor, workforce, employment, and training; (3) state, city, or county
departments of human or social services; (4) community and/or technical colleges; (5) state or
local housing authorities; (6) state or local offices of the aging; (7) public–private partnership
entities (such as training partnerships); (8) community-based agencies (such as Goodwill or
Community Action Agencies); and (9) private, for-profit contracted providers.
B.2. Procedures for the collection of information
1. Data collection
The data sources and data collection activities for the Institutional Analysis of AJCs are as
follows.
Site visits to AJCs. The most important source of data for this study will be in-person visits
to 60 AJCs. Prior to and during the visits, the study team will use data collection instruments to
document information provided by the AJC and/or its LWIB. Sites will not be asked to complete
the instruments, but will participate in pre- and on-site visit interviews, and be asked to provide
particular documents and materials.
Prior to each site visit, the study team will collect key information about each AJC, such as
its address and hours of operation, and information about the AJC operator, its LWIB, and AJC
partners. Information collected prior to the visits will help the study team identify pertinent
respondents for site visit interviews and efficiently plan and conduct study site visits. Also, this
information will ultimately provide important AJC-level context for the analysis. Before
contacting sites, the study team will gather pertinent data available from AJC and LWIB
websites, and other publicly available sources. During previsit phone calls, the research team
will ask only about outstanding items. We anticipate that the data will be collected across two
1.00-hour phone calls, one each with the LWIB and the AJC manager.
Knowledge about AJCs and their LWIBs gained from reviews of documents can also
significantly increase the efficiency of data collection on site, and assist the study team in
constructing detailed profiles of each AJC in the sample. Further, these documents commonly
contain information that is difficult for respondents to recall from memory and which is
particularly tedious and time-consuming to collect while on site. Hence, during the previsit
telephone calls, the study team will request documents from the LWIB and selected
AJCs, including:
• Financial documents, such as AJC memorandums of understanding (MOUs), resource
sharing agreements (RSAs), and AJC operating budgets, that outline AJC financing
strategies and the role of each partner in supporting AJCs’ operations and service delivery
• Data management reports, performance reports, or other data outputs that present aggregate figures
on customer characteristics, total customers served, percentage of customers receiving select
services, and performance outcomes
• Reports or other materials, such as data system guidelines, procedures, and training
manuals; data dictionaries; and data mapping documents
• AJC and LWIB annual plans and organizational charts.
During previsit calls, the study team will ask the LWIB and AJC managers to indicate from
whom we should collect these relevant materials. We anticipate that this will typically be some
combination of the AJC manager and LWIB staff. If this information cannot be provided prior to
site visits, the research team will collect it during site visits.
Part A of this submission lists the research topics that the study team will explore during site
visits. The study team will obtain information on each topic from multiple respondents, allowing
the study team to capture multiple perspectives so that no single person’s opinions or responses
will be assumed to be fully representative. The study team will interview both managers and line
staff. This will ensure that the study team members understand not only how service delivery and
administrative processes are supposed to work, but also how they actually work. The
respondents are: (a) AJC operator managers and central office staff, (b) AJC managers, (c)
representatives from key partners, (d) AJC line staff, (e) LWIB staff, and (f) local government
workforce administrators.
In each center, the study team will conduct structured observations of AJC layouts and
operations. Site visitors will use observation worksheets to collect information on topics such as:
the location of the AJC (for example, in a mall, a stand-alone building, or in an office building),
the signage for the AJC, the layout of the AJC, where various partner staff are housed at the
center, administrative and MIS functional areas, and whether the center’s layout facilitates or
impedes collaboration and efficient customer flow. Site visitors will also observe what happens
when a customer first walks into the center (for example, how he or she is greeted and guided to
different activities, services, and resources), as well as the resource room.
The AJC data collection materials (see instruments 1 through 3) will guide site visit
preparation, on-site interviews, and observations. The specific questions and length of each
on-site interview will depend on the respondent. On average, teams of three researchers will
spend about three days at each site. No single interview will exceed 2.00 hours, and most will
average between .75 hour and 1.00 hour.
State administrator interviews. The study team will conduct semi-structured telephone
discussions with state workforce administrators in each state for which there is a selected AJC
(we estimate that there will be up to 40 such states). While the work of each AJC partner is
overseen by its requisite state agency (for example, the state human services agency oversees
Temporary Assistance for Needy Families and the Supplemental Nutrition Assistance Program,
and the state workforce agency oversees the WIOA program), the study team will focus on
interviewing administrators of entities that are most involved with AJCs in each of the study
states. This will typically include representatives from two groups of state-level staff: (1)
managers of state-run workforce programs such as the Employment Service, and (2) agency staff
responsible for enforcing WIA/WIOA and setting and monitoring policies relevant to AJCs.
State-level phone interviews will be conducted prior to site visits, providing the study team with
state-level contextual information that will allow for more detailed exploration of topics on site.
As with AJC data collection, the study team will collect data from state agency websites and
other publicly available sources, and will focus on confirming this information during
interviews and asking only about outstanding items. An introductory letter and the interview
protocol are included as instruments 4 and 5, respectively.
Partner network analysis survey. The partner survey will be targeted to the administrator
or manager within each identified entity who has the most comprehensive knowledge of service
delivery decisions related to the AJC within his/her own entity, and of communication about
service delivery issues with other partner entities. This strategy could result in surveying a range
of respondents, from a local director of workforce services who is a state employee, to a manager
of adult literacy programs within a community-based organization. The study team will identify
individuals who fit these criteria, and will obtain contact information for them during the site
visits.
All data collection for the survey will be accomplished via email. The use of simple
electronic delivery allows for self-administration of the AJC partner survey, as well as for
tracking survey completions. Partner contact information, gathered during the AJC site visit, will
be used to distribute the survey to the partners identified by each selected AJC. The survey will
be attached as a PDF to an email introducing its purpose in the study and providing instructions
for its completion and return. Partner respondents can open the PDF attachment to the
introductory email, enter their responses, and forward the email back to the sender with the
document attached, at a convenient time for them. We plan for three additional follow-up
communications with non-respondents through various means: (a) an endorsement email from
the AJC manager, (b) a follow-up email from the AJC manager, and (c) follow-up by the study
team (see instruments 8 through 10).
2. Statistical methodology, estimation, and degree of accuracy
This study does not require statistical methodology or estimation. The data collected from
the site visits, state administrator interviews, and partner survey will be analyzed using
qualitative and descriptive methods. Because the study team will use a partially purposive
approach to sampling the 60 AJCs for participation in the study, study findings will apply only to
the 60 selected AJCs and will not be generalizable to the entire AJC population.
The qualitative and descriptive analysis plan consists of a mixed-method approach with five
steps:
• Organize the qualitative data from state telephone interviews and site visits. To
effectively and systematically manage the volume of data, the study team will develop
structured templates and checklists for site visitors to use to distill the information that they
collect during site visits. Then, the study team will organize the data from the site visits and
the state telephone interviews using qualitative data analysis software, such as ATLAS.ti
(Scientific Software Development 1997).
• Create summary statistics from the performance management reports provided by
sites. Using data from the performance management reports provided by sites, the study
team will calculate summary statistics by AJC in sites where this data is available. The study
team anticipates that the reports will contain data that allow them to describe the percentage
of all customers who are enrolled in WIA, the Employment Service, and other key funding
streams and programs; customers' characteristics; and their service receipt. The study team will
organize this information using a performance management data collection worksheet. This
will ensure that data is documented in a standardized way that allows for systematic analysis
across sites.
• Use the funding and resource-sharing information to calculate the proportion of AJC
infrastructure, management, and services funded by each partner. As with the
performance management data, the study team will use a financial data collection worksheet
to organize the financial information collected from sites. This will ensure that data is
documented in a standardized way. Then, the study team will calculate relevant summary
statistics, and identify themes and patterns in the financial data to complement the
institutional analysis and provide a comprehensive picture of variation in AJC funding and
cost-sharing approaches. Finally, the study team will highlight resource-sharing challenges;
different approaches for defining, sharing, and reporting costs; and particularly
comprehensive AJC resource-sharing approaches, which will be of particular interest to
policymakers.
• Identify themes in the data within and across AJCs. Once all of the site visit data—from
pre-site visit data collection (including financial and performance management reports) and
onsite interviews and observations—and state administrator interview data have been
organized, the study team can examine the data to look for similarities in models of
organization, service delivery, or other characteristics, and then develop typologies of AJCs.
• Conduct a network analysis using data from the network survey of AJC partners to
develop typologies of partner relationships. The study team will use two primary
measures to describe and depict service delivery networks within and across AJCs: density
(interconnectedness) and centrality (prominence). Density is the proportion of possible
relationships that are actually present, and measures the extent to which each partner is
connected with all others across the network as a whole. Centrality can be used to examine a
few different concepts: (a) the prominence of individual entities within the network by
identifying the partner entities that are most sought after (indegree centrality), (b) partners
that play a central role in supporting communication between other partners (betweenness
centrality), and (c) the degree to which entities are approximately equally central to the
network or to which some entities are much more central than others (betweenness
centralization). An illustrative sketch of these computations follows this list.
Using “sociograms,” the study team will illustrate the patterns in the size of partner
networks, the strength of the relationships across partners, and the direction of partnerships.
These sociograms will depict the density and centrality of AJC networks based on (a)
contact frequency, and (b) the level of collaboration among key partners. In addition to
sociograms, the study will produce tables that present network-level characteristics such as
overall density and centralization (measures discussed above), also analyzed separately by
frequency of communication and level of collaboration. Throughout the analysis, the study
team will work to identify different typologies of AJC networks that will display key
differences across AJCs.
The study team will also use the data collected on partner referrals to measure density and
centrality of the network specifically as it relates to the flow of referrals. Using the
typologies derived from earlier analyses, the study team will compare the network measures
of density and centrality for the models based on frequency of communication with those
based on the flow of referrals for select AJCs within each typology. The study team will
examine the measures of prominence for specific partners within the select networks across
the two models for comparison.
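As referenced above, the following is an illustrative sketch (not the study's analysis code) of how the density and centrality measures could be computed with the networkx Python library on a small hypothetical directed network; the partner names are invented, and the Freeman-style centralization normalization is an assumption for illustration:

    import networkx as nx

    # Hypothetical directed network: an edge A -> B means partner A reports
    # communicating with (or referring customers to) partner B.
    G = nx.DiGraph()
    G.add_edges_from([
        ("LWIB", "AJC operator"), ("AJC operator", "LWIB"),
        ("TANF agency", "AJC operator"), ("Community college", "AJC operator"),
        ("AJC operator", "Employment Service"),
    ])

    density = nx.density(G)                     # share of possible ties present
    indegree = nx.in_degree_centrality(G)       # how sought after each partner is
    betweenness = nx.betweenness_centrality(G)  # brokerage between other partners

    # Betweenness centralization (a Freeman-style normalization; networkx has
    # no built-in): how much the most central partner dominates the network.
    b_max = max(betweenness.values())
    centralization = sum(b_max - b for b in betweenness.values()) / (len(G) - 1)

Sociograms of the kind described above could be rendered from the same graph object (for example, with nx.draw), with edges weighted or filtered separately by frequency of communication, level of collaboration, and referral flow.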
3. Unusual problems requiring specialized sampling procedures
There are no unusual problems requiring specialized sampling procedures.
4. Periodic data collection cycles to reduce burden
There will be only one cycle of data collection.
B.3. Methods to maximize response rates and deal with nonresponse
AJC site visits and state administrator interviews. The process to recruit sites for
participation in the study will include an explanation of the nature of the visits, so that state,
AJC, and LWIB staff are aware of what is expected of them when they agree to participate. DOL
will first issue a Training and Employment Notice (TEN) informing states, AJCs, and LWIBs about
the study and encouraging them to participate fully if selected. The study team will notify AJCs
and their LWIBs that they have been selected to participate in the study. The study team will
send each an introductory email that includes a letter, signed by DOL’s Chief Evaluation Officer,
urging them to participate in the study. The study team will do the same for states within which
there is a selected AJC. Site visitors will begin working with site staff well in advance of each
visit to ensure that the timing of the visit is convenient.
The site visits will take place over a period of six months, which will allow flexibility in
timing. Because the visits will involve several interviews and activities, there will be flexibility
in the scheduling of specific interviews and activities to accommodate the particular needs of
respondents and AJC operations. In addition, data collectors will meet with in-person interview
respondents in their own offices or at a location of their choice.
Several well-proven strategies will be used to ensure the reliability of site visit data. First,
site visitors, all of whom already have extensive experience with this data collection method, will
be thoroughly trained in the issues of importance to this particular study, including how to probe
for additional details to help interpret responses to interview questions. Second, this training and
the use of the protocols will ensure that the data are collected in a standardized way across sites.
When appropriate, the protocols will use standardized checklists to further ensure that the
information is collected systematically. Finally, all interview respondents will be assured of the
privacy of their responses to questions.
As described in section B.1, Respondent universe and sampling methods, the study
team will use a partially purposive sampling approach; in the event that an AJC or its LWIB
declines to participate, the study team will use the same selection criteria to purposively select a
replacement AJC from the same region. Similarly, although the study team will make every
effort to arrange interviews that accommodate the scheduling needs of the state-level
administrators, there may be instances when an administrator is unable to meet with the team;
when this happens, the study team will request to meet with the administrator's designees. We
anticipate that these approaches
will result in 95% or higher response rates for AJC site visits and state administrator interviews,
which the study team has achieved while conducting qualitative research that included similar
data collection activities (such as the Workforce Investment Act Adult and Dislocated Worker
Programs Gold Standard Evaluation, the Evaluation of the Summer Youth Employment Initiative
under the Recovery Act, and the Impact Evaluation of the Trade Adjustment Assistance (TAA)
Program, among others). Further, the study team will conduct follow-up interviews by phone
with key respondents who are unavailable to meet during the AJC visit itself.
Partner network analysis survey. To encourage response to the survey, the study team will
use methods that have been successful for numerous other Mathematica studies. We expect an 80
percent response rate for the partner survey, based on recent Mathematica network analysis
surveys such as the Evaluation of the SSI/SSDI Outreach, Access, and Recovery (SOAR)
Initiative, the Healthy Weight Collaborative, and the Community Coalition Leadership Program. Based
on the study team’s previous experience conducting network surveys using approaches similar to
those described below, this response rate can be achieved through a strategy of outreach and
multiple follow-ups, timing and means of data collection, and ease of survey completion.
Outreach materials will be clear and succinct, and convey the importance of the survey data
collection from multiple perspectives. The initial outreach email conveying the survey to each
AJC partner will (1) introduce the study and its purpose, and its inclusion of the local AJC with
which the partner is connected; (2) highlight DOL as the study sponsor; (3) explain the voluntary
nature of participation in the survey; (4) include a DOL website address that sample members
can use to learn more about the study; and (5) provide a contact name, number, and email
address for questions that sample members may have. We will request that the AJC operator at
each selected site send a short endorsement email to encourage the participation of the AJC
partners within their site (we will provide one that can be adapted by the site). A reminder email,
sent two to three weeks after the initial invitation to complete the survey, will contain similar
information to the initial invitation email about the purpose of the study and encourage the AJC
partners to complete the survey.
After four weeks of nonresponse, the AJC operator within the selected site will be asked to
contact any nonresponding AJC partner within their site. Mathematica will provide the
suggested email text, but we will request that the AJC administrator send the email directly to the
AJC partner. Alternatively, we may seek the endorsement of another partner that has participated
in the survey and has a stronger connection with a non-responsive partner entity than the AJC
operator may have. A thank-you/reminder email will be sent to each nonresponder six weeks
after the initial email, thanking those who have since completed the survey and reminding those
who have not to complete and return it. If there is no response eight
weeks after the initial email, a member of the Mathematica team will place a telephone call to the
nonresponders, requesting that they complete the survey and offering to complete it quickly by
phone if they prefer. The emails that will be sent to sample members are included in instrument
numbers 7 through 10.
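Purely as an illustration of how this contact schedule could be tracked, the timeline above might be encoded as data; the structure below is a sketch, not part of any planned system:

    # Weeks after the initial survey email, and the planned contact at each
    # point (the first reminder is actually sent at two to three weeks).
    FOLLOW_UP_SCHEDULE = [
        (2, "reminder email restating the study purpose"),
        (4, "email sent by the AJC operator (or a well-connected partner)"),
        (6, "thank-you/reminder email"),
        (8, "telephone call offering to complete the survey by phone"),
    ]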
The timing of the partner survey shortly after a site visit will support high response rates.
The survey will be administered in waves within two to three weeks of completion of the site
visits to the selected AJCs. The study team will have made some initial contact with a majority
of potential respondents during site visit interviews, lending them a level of comfort with the
purpose and legitimacy of the survey. In addition, administering the survey by email can support
a high response. Because the AJC partners are working professionals, email will be the most
effective means of communication.
Lastly, the AJC survey is designed to be easy to complete. The questions are written in clear
and straightforward language. The average time required for the respondent to complete the
survey is estimated at 10 minutes. The full list of partners will be pre-loaded into the
PDF document to obtain a response that relates to each partner. The PDF will allow for the entry
of responses (only check marks or Xs are necessary) but prevent revision of any other text or
information in the questionnaire. The respondent will be able to view the question matrix with
each possible category of response (across the top) and the full range of partners (down the side)
on one sheet. This approach is commonly used for network data collection to help respondents
consider their levels of connectivity with all partners of the network and assess their relationships
using a common set of considerations regarding the question of interest. The approach can only
be used when the network is known ahead of time and the number of partners is relatively small,
and it has the added advantage of facilitating data entry and analysis in that respondents provide
information about all partners in the network.
Nonresponse will be addressed in two ways. Partners that do not complete the survey at all
will be excluded from the analysis for that AJC. Missing responses on particular questions or for
particular partners will be represented as no communication, collaboration, or flow of referrals in
the analysis. The default for nonresponse is that there is a minimal relationship between the
partners.
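A minimal sketch of this coding rule, assuming a hypothetical respondent-by-partner matrix in pandas (all names are illustrative):

    import numpy as np
    import pandas as pd

    # Hypothetical matrix: rows are partners that returned a survey, columns
    # are all partners; 1 = tie reported, NaN = item left blank.
    ties = pd.DataFrame(
        {"LWIB": [1.0, np.nan], "TANF agency": [np.nan, 1.0]},
        index=["AJC operator", "Community college"],
    )

    # Partners that returned no survey are simply absent as rows (excluded
    # from the analysis for that AJC); blank items on returned surveys are
    # coded as zero, that is, no communication, collaboration, or referrals.
    ties = ties.fillna(0)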
B.4. Tests of procedures or methods to be undertaken
All procedures, instruments, and protocols to be used in the conduct of the AJC evaluation
will be tested to assess the data collection processes, to evaluate the clarity of the questions, and
to identify possible modifications to either question wording or question order that could
improve the quality of the data.
To ensure that the AJC data collection protocol is used effectively as a field guide and that it
yields comprehensive and comparable data across the study sites, senior research team members
will conduct a pilot site visit before any other visits are conducted. The purpose of the pilot test
is to ensure that the field protocol, which will guide field researchers as they collect data on site,
includes appropriate probes that assist site visitors in delving deeply into topics of interest, and
that the protocol does not omit relevant topics of inquiry. Furthermore, use of the protocol during
a pilot site visit can enable the research staff leading this task to assess whether the site visit agenda
that the research team develops—including how data collection activities should generally be
structured during each site visit—is practical, given the amount of data that is to be collected and
the amount of time allotted for each data collection activity. Adjustments to the AJC data
collection protocol will be made as necessary.
Based on the pilot experience, the study team will train all site visitors on the data collection
instruments to ensure a common understanding of the key objectives and concepts as well as
fidelity to the protocols. The training session will cover topics such as the study purposes and
research questions, data collection protocols, procedures for scheduling visits and conducting on-site activities (including a review of interview techniques and procedures for protecting privacy),
and post-visit files and summaries.
Senior research team members will pilot the state administrator phone interview guide in the
same way and for the same purposes as the AJC data collection protocol, before any other
interviews are conducted.
The AJC partner survey will be pre-tested using two to three AJCs with three to four
partners within each AJC, not to exceed nine AJC partner respondents in total. The pretest sites
will be selected from among the AJCs that are likely to be among the first scheduled for site
visits but are not included
in the 30 AJCs selected for the full partner survey. Pre-tests will be conducted using the same
methods as those planned for the full survey administration, using the PDF version of the
survey delivered by email. The participants will be asked to complete the survey
and record the amount of time that it took. Following each pre-test, the study team will debrief
with each participant using a standard debriefing protocol to determine how long the survey
took, whether any words or questions were unclear or difficult to understand and answer,
whether the participant thought that key partners were missing (and who those were), and how
the general flow and sequencing of questions worked.
B.5. Individuals consulted on statistical aspects of design and on collecting
and/or analyzing data
Consultations on the statistical methods used in this study were conducted to ensure the
technical soundness of the study. The following individuals were consulted on the statistical
methods discussed in this submission to the Office of Management and Budget:
Mathematica Policy Research
Dr. Sheena McConnell
Vice President, Director of Human Services Research, Washington, DC Office
1100 1st Street, NE, 12th Floor
Washington, DC 20002-4221
Social Policy Research Associates
Dr. Ron D’Amico
President & Senior Social Scientist
1330 Broadway, Suite 1426
Oakland, CA 94612
George Washington University
Dr. Burt Barnow
Amsterdam Professor of Public Service
Media and Public Affairs Building
805 21st St. NW
Washington, DC 20052
Additional staff responsible for collecting and/or analyzing data are listed in Table B.1
below.
Table B.1. Individuals who will collect and/or analyze data for the
Institutional Analysis of AJCs
Mathematica Policy Research
P.O. Box 2393
Princeton, NJ 08543-2393
(609) 799-3535
Pam Holcomb (Project Director)
Linda Rosenberg
Gretchen Kirby
Jessica Ziegler
Elizabeth Clary
Brittany English
Scott Baumgartner
Katie Bodenlos
Social Policy Research Associates
1330 Broadway Suite 1426
Oakland, CA 94612
(510) 763-1499
Kate Dunham
Deanna Khemani
Mike Midling
Jeff Salzman
Christian Geckeler
Melissa Mack
Jill Leufguen
Hannah Betesh
Miloney Thakrar
Anne Paprocki
David Mitnik
Lydia Nash
Vernice Chavota-Perez
George Washington University
2121 I Street, N.W., Suite 601
Washington, D.C. 20052
Burt Barnow
Capital Research Corporation
1910 N Stafford Street
Arlington, VA 22207
(703) 522-0885
John Trutko
Carolyn O’Brien
REFERENCES
Barnow, Burt, and Christopher King. “The Workforce Investment Act in Eight States.” Prepared
for the U.S. Department of Labor Employment and Training Administration. Albany, NY:
Nelson A. Rockefeller Institute of Government, February 2005.
D’Amico, Ron, Kate Dunham, Annelies Goger, Charles Lea, Nicole Rigg, Sheryl Ude, and
Andrew Wiegand. “Findings from a Study of One-Stop Self-Services: A Case-Study
Approach.” ETA Occasional Paper No. 2011-16. Washington, DC: U.S. Department of
Labor, Employment and Training Administration, 2011.
Dunham, Kate, Annelies Goger, Jennifer Henderson-Frakes, and Nichole Tucker. “Workforce
Development in Rural Areas: Changes in Access, Service Delivery and Partnerships.”
Prepared for the U.S. Department of Labor Employment and Training Administration.
Washington, DC: DOL, 2005.
Holcomb, Pamela, Rosa Maria Castaneda, and John Trutko. “The One-Stop Workforce Delivery
System in Virginia’s Local Workforce Investment Areas: An Assessment.” Prepared for the
Virginia Employment Commission. Washington, DC: The Urban Institute, 2007.
Mack, Melissa. “Strategies for Integrating the Workforce System: Best Practices in Six States.”
Final report submitted to the Washington Workforce Training and Education Coordinating
Board. Oakland, CA: Social Policy Research Associates, September 7, 2006.
Macro, Bronwen, Sherry Almandsmith, and Megan Hague. “Creating Partnerships for
Workforce Investment: How Services Are Provided Under WIA.” Prepared for the U.S.
Department of Labor Employment and Training Administration. Oakland, CA: Berkeley
Planning Associates, September 2003.
Mueser, Peter R., and Deanna L. Sharpe. “Anatomy of Two One-Stops: Camdenton, Missouri
and Columbia, Missouri.” Prepared for the U.S. Department of Labor Employment and
Training Administration, Occasional Paper 2006-08. Columbia, MO: Employment and
Training Administration and University of Missouri-Columbia, 2006.
Pindus, N., K. Robin, K. Martinson, and J. Trutko. “Coordination and Integration of Welfare and
Workforce Development System.” Washington, DC: The Urban Institute and the U.S.
Department of Health and Human Services, 2000.
Salzman, J. “Integration and the WorkSource System.” Final report submitted to the Washington
Workforce Training and Education Coordinating Board. Oakland, CA: Social Policy
Research Associates, September 7, 2006.
Social Policy Research Associates. “The Workforce Investment Act After Five Years: Results
from the National Evaluation of the Implementation of WIA.” Oakland, CA: Social Policy
Research Associates, 2004.
Stack, Treva, and David Steven. “Anatomy of a One-Stop: Baltimore City Eastside Career
Center.” Prepared for the U.S. Department of Labor Employment and Training
Administration, Occasional Paper 2006-07. Employment and Training Administration.
Baltimore, MD: University of Baltimore, 2006.
United States General Accounting Office. “Workforce Investment Act: One-Stop Centers
Implemented Strategies to Strengthen Services and Partnerships, but More Research and
Information Sharing Is Needed.” Report to Congressional Requestors. Washington, DC:
GAO, June 2003.
Werner, Alan, and Kendra Lodewick. “Serving TANF and Low-Income Populations Through
WIA One-Stop Centers.” Cambridge, MA: Abt Associates, 2004.
Wright, David J., and Lisa M. Montiel. “Workforce System One-Stop Services for Public
Assistance and Other Low-Income Populations: Lessons Learned in Selected States.” Report
to the U.S. Department of Labor, Employment and Training Administration. Albany, NY:
Nelson A. Rockefeller Institute of Government, 2010.