Part A: Justification for the Collection of Data
for the Institutional Analysis
of American Job Centers
November 9, 2015
Submitted to:
Office of Management and Budget
Submitted by:
Chief Evaluation Office
Office of the Assistant Secretary for Policy
United States Department of Labor
200 Constitution Avenue, NW
Washington, DC 20210
INSTITUTIONAL ANALYSIS OF AJCs
OMB SUPPORTING STATEMENT: PART A
CONTENTS
Part A: Justification for the Study
A.1. Circumstances making the collection of information necessary
A.2. Purposes and use of the information
A.3. Use of technology to reduce burden
A.4. Efforts to avoid duplication
A.5. Methods to minimize burden on small entities
A.6. Consequences of not collecting data
A.7. Special circumstances
A.8. Federal Register announcement and consultation
A.9. Payments or gifts
A.10. Assurances of privacy
A.11. Justification for sensitive questions
A.12. Estimates of hours burden
A.13. Estimates of cost burden to respondents
A.14. Annualized costs to the federal government
A.15. Reasons for program changes or adjustments
A.16. Plans for tabulation and publication of results
A.17. Approval not to display the expiration date for OMB approval
A.18. Explanation of exceptions
REFERENCES
PART A: JUSTIFICATION FOR THE STUDY
The Chief Evaluation Office (CEO) of the U.S. Department of Labor (DOL) has contracted
with Mathematica Policy Research and its partners—Social Policy Research Associates, the
George Washington University, and Capital Research Corporation— (hereafter “the study
team”) to conduct the Institutional Analysis of American Job Centers (AJCs). The purpose of the
study is to provide information to policymakers and administrators that can be used to accurately
describe the full range of institutional features that shape AJCs’ day-to-day operations and
customer experiences. This package requests clearance for three data collection activities
conducted as part of the study: (1) site visits to AJCs; (2) telephone interviews with state
workforce administrators; and (3) a network analysis survey of AJC partner organizations.
A.1. Circumstances making the collection of information necessary
The Workforce Investment Act (WIA) of 1998 and the recently enacted Workforce
Innovation and Opportunity Act (WIOA), which replaced WIA, provide the framework for
America’s workforce development system. Paramount among the goals of the resulting public
workforce system is the provision of employment and training programs through a streamlined
and coordinated service delivery system. To that end, local workforce investment boards
(LWIBs) must bring together a set of “mandatory partners” and establish American Job Centers
(AJCs, formerly known as One-Stop Career Centers) to provide “one-stop shopping” to job seekers and employers in need of a variety of workforce development resources. States and local
areas are afforded considerable flexibility in structuring the centers and designing and executing
service delivery to meet the needs of their local customers. As policymakers and workforce
administrators continue to look for ways to reduce duplication of services, leverage resources,
and build a strong and collaborative network of partnerships, system stakeholders require a
comprehensive understanding of how the current AJC system operates across the country.
1. American Job Centers
The current AJC system is marked by considerable variation at the local level. All centers,
however, share the goals of providing customers with coordinated and streamlined access to an
array of employment and social services provided by partner agencies, and ensuring that all job seekers have access to core job search services. LWIBs determine the number and location of all
comprehensive and affiliate centers, and select the organizations that operate AJCs through a
competitive procurement process. Operators can include state and local government agencies
(such as a city or county workforce development office), nonprofit organizations (such as
community action agencies or the LWIB itself), community colleges, and for-profit firms; AJCs
may also be administered by a consortium of entities.
Partners are central to the vision of a streamlined system of services in AJCs. More than a
dozen programs (“mandatory partners”) are required to provide services through AJCs. Partners
have flexibility in the way they provide services through the AJC network, and can co-locate
services at the AJC, or make referrals to off-site service or training providers. Most commonly,
Employment Service staff, who are state employees, provide labor market information and job
placement assistance to customers through an AJC’s resource room. Also, Adult and Dislocated
Worker program staff, who are often LWIB or local contractor staff, engage customers in
additional or intensive services to move them to employment, and as necessary provide them
with training funds. Other partners, whether co-located at the center or off site, provide target
populations, such as individuals with disabilities, veterans, and public assistance recipients, with
additional services to help them become employed.
Mandatory AJC partners are required to reach a voluntary agreement, in the form of a
memorandum of understanding, that specifies how the costs of the AJC infrastructure and other
shared costs will be funded and how the partners will deliver services under the system. The extent to which partners
collaborate and the nature of that collaboration varies across AJCs. In some AJCs, partner
agencies remain relatively “siloed” (although they may collaborate on behalf of common
customers) whereas other centers adopt more integrated management, staffing, and service
delivery models.
2. Previous studies
While previous research on AJCs provides important insights into the structural,
organizational, and service delivery characteristics of AJCs, studies have been limited in scope
and many are outdated (Barnow and King 2005; D’Amico et al. 2009; Dunham et al. 2005; GAO
2003; Holcomb et al. 2007; Mack 2006; Macro et al. 2003; Mueser and Sharpe 2006; Pindus et
al. 2000; Salzman 2006; Social Policy Research Associates 2004; Stack and Steven 2006;
Werner and Lodewick 2004; and Wright and Montiel 2010). Some studies have focused on only
a few AJCs or on the operations of a particular center or small group of centers (for example,
U.S. General Accounting Office [GAO] (2003); Mueser and Sharpe (2006)). Other studies
focus on specific populations served by AJCs (such as DOL’s ongoing study “Evaluating the
Accessibility of AJCs for People with Disability”) or exclusively on a subset of individual
programs or partners within the AJC system (such as the WIA Adult and Dislocated Worker
Programs Gold Standard Evaluation), rather than on examining the services provided by the full
set of AJC partners or how the partners work together. None provide a system-wide examination
to allow policymakers and administrators a comprehensive understanding of key institutional
features in the current AJC system and variations in how it operates.
3. Study overview
The goals of the Institutional Analysis of AJCs are to thoroughly understand and
systematically document the institutional characteristics of AJCs, and to identify variations in
funding, service delivery, organizational structure, and administration and performance
management across AJCs. To achieve these goals, the study will (1) present a comprehensive
and systematic description of AJC funding, organization, administration and management, and
service delivery structures and processes, and (2) examine AJC service delivery to its customers,
including those services provided to target populations.
The study framework is shown in Figure A.1. The AJC, depicted as a circle, is the primary
unit of analysis. It comprises the ten key domains that define and shape the ways in which the
AJC system operates and serves its job-seeker and employer customers: (1) administrative
structure; (2) partnerships; (3) performance and strategic management; (4) staffing; (5) physical
environment; (6) MIS capacity and the use of technology, including electronic tools and
resources; (7) service delivery structure and linkages; (8) the program and service mix and
sequencing; (9) outreach; and (10) funding. The three boxes on the left-hand side of the figure
depict external factors that are particularly important for understanding AJC organization and
service provision. These include multiple levels of administration and oversight: national, state,
and local. Other important contextual factors, such as the local labor market conditions and the
socio-economic characteristics of centers’ customers, are depicted as a box surrounding the AJC
and all levels of center oversight.
Figure A.1. Institutional analysis of AJC study framework
The study will address multiple research questions pertaining to each of the AJC domains
and its administration and oversight at multiple levels, and local-level contextual factors that may
also affect AJC operations and service delivery:
Administrative structure and staffing: What is the administrative and management
structure of the AJCs? Which organizations operate the centers? Are they selected through a
competitive process or through a consortium? How do differences in administrative,
management, and staffing structures affect service delivery, program mix, and customer
flow? How does state workforce administrative structure and policy influence AJC
administrative structure and staffing?
Partnerships: Who are the AJC partners, and how does the number and composition of
partner entities vary across AJCs? What is the extent of their involvement in the AJC’s
management and service delivery? What is the nature of the relationship among the various
partners? What roles do state workforce administrative structure and policy play in
facilitating or impeding strong collaborative partnerships among AJC partners?
Service delivery structure and linkages: What are the services provided by the AJC to
job-seeker and employer customers, and how are the services delivered? How do customers
access services, and how does customer flow differ by customer characteristics and needs?
What services are targeted to specific populations, such as women, veterans, UI claimants,
farmworkers, people with disabilities, older workers, dislocated workers, low-income
workers, welfare recipients, and youth? What strategies are used to reach out to potential
customers and what innovative strategies, such as use of technology tools, are used to
deliver services?
Management and performance: What types of performance measures and indicators are
collected about AJCs? How are data combined, reported, and used within and across
programs to measure performance and inform management decisions for the AJC as a
whole? What systems and procedures are in place to promote coverage, quality, and
accessibility of data within and across programs to support service delivery and performance
management? What data on customers served, types of services provided, client
characteristics, and outcomes are readily accessible and can help describe the range of AJCs
included in the study? How is customer feedback obtained and measured?
Funding and resource sharing: What sources of federal, state, local and private funds
support AJC infrastructure, management and administration, and service delivery? What is
the relative share of each of these sources of funding? How do partners share financial and
other resources to administer programs and serve customers?
Community and contextual factors: How do various contextual factors, such as the local
labor market, affect the operations of the AJC? What broader community initiatives or
networks are AJC managers involved in, and what is the nature of their involvement?
Overall: What innovative and/or promising practices have AJCs implemented to manage
their centers and provide services to customers? What challenges have they faced, and how
have they overcome them?
The key elements of the study’s data collection and analysis plan include: (a) selecting 60
comprehensive AJCs that reflect the diversity of centers across the country; (b) thorough data
collection that includes conducting in-depth site visits to the AJCs and their LWIBs,
administering a short network analysis survey of AJC partners, and holding telephone
discussions with state workforce agency staff in states where site visits take place; and (c)
identifying typologies of AJCs and describing the institutional features associated with these
typologies.
4. Data collection activities requiring clearance
This package requests clearance for three data collection activities of the Institutional
Analysis of AJCs: (1) site visits to AJCs; (2) telephone interviews with state workforce
administrators; and (3) a network analysis survey of AJC partner organizations. The data
collection instruments associated with these activities that require Office of Management and
Budget approval are:
1. AJC site visit master protocol. The most important source of data for this study is in-person
visits to comprehensive AJCs and their LWIBs. Data collection before and during
site visits will involve: interviews with LWIB staff, AJC line staff, and partner
representatives; observations; and collection of performance management and financial
reports. The study team will visit 60 comprehensive AJCs and their respective LWIBs.
Teams of two to three researchers will conduct each visit, which will last, on average, three
days.
2. State administrator discussion guide. Telephone discussions with state-level workforce
respondents will provide important contextual information about how state-level factors (for
example, state workforce priorities, state workforce administrative structure, state-level
policy and guidance, and state performance standards) influence AJC organization and
service delivery. To systematically collect this information, the study team will conduct
semi-structured telephone discussions with state-level workforce administrators in each state
for which there is a selected AJC. The study team anticipates interviewing an average of two
state-level respondents in each state, and that each interview will take 60 minutes to
complete.
3. Partner network analysis survey. To gain additional information and insights into AJC
partnership arrangements and relationships, the study team will collect information on
partnerships from a network analysis survey. The short, three-question survey will
systematically collect information on all the AJC partnerships, including partner entities
with which the study team may not be able to meet directly on site. We will conduct the
survey with partners from 30 of the 60 AJCs with which the study team conducts in-depth
site visits. The number of partners surveyed per AJC will likely vary, but the study team
expects to identify and survey 15 key partner entities, on average, for each selected AJC, for
a total of 450 partners. The survey is expected to take 10 minutes to complete.
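The respondent counts and completion time stated above imply the following totals. This is a simple arithmetic sketch; the total-hours figure is derived here for illustration rather than quoted from the study's burden estimates.

```python
# Back-of-the-envelope check of the partner survey figures stated above.
AJCS_SURVEYED = 30       # half of the 60 site-visit AJCs
PARTNERS_PER_AJC = 15    # expected average number of key partner entities per AJC
MINUTES_PER_SURVEY = 10  # expected time to complete the survey

# 30 AJCs x 15 partners = 450 survey respondents, matching the text above.
total_partners = AJCS_SURVEYED * PARTNERS_PER_AJC

# 450 responses x 10 minutes = 4,500 minutes, or 75 hours of total burden.
total_burden_hours = total_partners * MINUTES_PER_SURVEY / 60
```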
Documents used to collect data for each of these activities are located at the end of this
Justification Statement, and include materials for the AJC site visits, state administrator
interviews, and partner network analysis survey. Table A.1 lists each instrument included in this
request.
Table A.1. Data collection activity and instruments included in the request
Data collection activity/instrument
AJC site visits
1. Introductory letter (from DOL to AJC managers introducing the study)
2. Introductory telephone script (conducted by the study team)
3. AJC site visit master protocol
State administrator interview
4. Introductory letter (from DOL to state administrators)
5. State administrator interview protocol
Network analysis survey administration
6. Partner survey
7. Introductory email (from Mathematica to partners introducing the survey)
8. Endorsement email (from AJC managers to partners endorsing the survey)
9. Reminder email (from AJC managers to partners)
10. Reminder email (from Mathematica to partners reminding them to complete the survey)
A.2. Purposes and use of the information
The data collected through activities described in this request will be used to: (1) comprehensively and systematically describe AJC funding, organization, administration and
management, data reporting, and service delivery structures and processes; and (2) examine AJC
service delivery to its customers, including those provided to target populations. These data and
the study team’s analysis will provide important information about how AJCs are structured and
operate. Policymakers can use these data to inform policy discussions, and administrators can
use them to inform management decisions and future planning efforts. Details on the purpose
and use of the information collected for each study activity are provided below. An explanation
of how the study team will analyze and report on all data collected is outlined in Section A.16,
Plans for tabulation and publication of results.
1. Site visits to AJCs
The most important source of data for this study will be in-person visits to 60 AJCs. Data
collection will include the activities described below.
Site visit preparation. Prior to site visits, the study team will collect key information about
each AJC, such as its address and hours of operation, and information about the AJC operator, its
LWIB, and key AJC partners. Information collected during previsit calls will help the study team
identify pertinent respondents for on-site interviews and otherwise plan study site visits, and will
ultimately provide important AJC-level context for the analysis. Before contacting sites, the
study team will attempt to gather pertinent data from AJC and LWIB websites, and other
publicly available sources. During previsit phone calls, the research team will confirm this
information and only request new information about outstanding items. We anticipate that the
data will be collected in two separate one-hour telephone calls—one with an LWIB manager
and one with an AJC manager.
Knowledge about AJCs and their LWIBs gained from document reviews can also
significantly increase the efficiency of data collection on site and assist the study team in
constructing detailed profiles of each AJC in the sample. Further, these documents commonly
contain information that is difficult for respondents to recall from memory and which is
particularly tedious and time-consuming to collect while on site. Hence, prior to site visits,
during previsit telephone calls, the study team will request documents from the LWIB and
selected AJCs, including:
Financial documents. Before site visits, site visitors will request AJC memorandums of
understanding (MOUs), resource sharing agreements (RSAs), AJC operating budgets, and
other existing financial documents. Because there is no dedicated funding stream for the
operational costs of AJCs and services are provided through a variety of partners,
understanding AJC financing strategies and the role of each partner in supporting AJCs’
operations and service delivery is vital to fully understanding how the AJC system
functions.
Performance management reports. As part of this request, the study team will request
existing performance management reports or data outputs pertaining to, and used by, the
AJCs that present aggregate figures on customer characteristics, total customers served,
percentage of customers receiving select services, and performance outcomes. The study
team will also request complementary materials such as data system procedures and training
manuals, data dictionaries, data system guidelines and procedures, data mapping documents,
and data, management, and performance reports. Collected performance management
reports will help the study team examine how AJCs measure performance and the extent of
coordinated approaches to performance management within an AJC, and identify promising
practices across sites. These documents also contain valuable information about the data
systems that support AJC service delivery, and are used for the purposes of performance
management.
Because each AJC routinely submits financial and performance management reports to
funders and oversight agencies, we anticipate that the requested information will be readily
available and not burdensome to obtain. The study team will also request AJC annual plans and
organizational charts and other information (for example, calendars of AJC activities) that
describe the AJC and provide useful background preparation information for the site visit team.
During previsit calls, the study team will ask the site representatives to indicate the individuals
from whom they should collect these materials. We anticipate that this will typically be some
combination of the AJC manager and LWIB staff. If this information cannot be provided prior to
site visits, the research team will collect it during site visits. The script used to guide previsit
calls and the list of documents that the study team will request during the calls are included as
instrument #2 of this submission.
On-site interviews and observations. Section A.1 of this submission contains a bulleted
list of key study research questions by topic. The study team will obtain information on each
topic during site visits and from multiple respondents, allowing them to capture multiple
perspectives so that no single person’s opinions or responses will be assumed to be fully
representative. The study team will interview both managers and line staff. This will ensure that
the team members understand not only how service delivery and administrative processes are
supposed to work, but also how they actually work. Specifically, the respondents are:
AJC operator managers and central office staff. Many AJC operators manage multiple
AJCs. When this is the case, the study team will interview the overall AJC operator
manager, as well as other staff from the operator’s central office who can provide
information about the AJC’s financing and data reporting.
AJC managers. These managers will provide an overview of the institutional arrangements
and a detailed account of the day-to-day operational processes at the comprehensive AJC in
the study sample.
Representatives from key partner agencies. The study team will meet with as many
representatives from AJC partner agencies as possible. We expect that we will meet with
representatives from the WIA/WIOA Adult and Dislocated Worker programs and state
workforce programs (Employment Service, Trade Adjustment Assistance, and Veteran
Employment and Training Service). The study team will also meet with representatives from
community colleges, Adult Education, Vocational Rehabilitation, Temporary Assistance for
Needy Families, and Supplemental Nutrition Assistance Program Employment and Training,
to the extent that they are partners within the study AJC’s network. The study team will also
seek interviews with representatives from other active AJC partners, such as community-based organizations and agencies serving disabled populations.
AJC line staff. One-on-one interviews and group discussions with line staff from the WIA
Adult and Dislocated Worker programs, the Employment Service, and other partner
programs will be conducted with the aim of learning about: the services they provide; client
flow and sequencing of services for all customers, including DOL target populations; how
they interact with each other and share information; reporting and fiscal data they record;
information technology systems they use; and how they use labor market information and
other data in serving customers.
LWIB staff and local government workforce administrators. Since AJCs operate within
networks and policy frameworks established by LWIBs, these interviews will provide data
that will put the subsequent AJC visits in context. The study team will interview the LWIB
executive director and the managers overseeing AJC operations, contracts, finance, and its
management information system (MIS). We will also interview local government agency
representatives in sites where these entities (such as the mayor’s office) play an active role
in local workforce system policy and service delivery, including but not limited to serving
on the WIB and providing additional resources and services to AJC customers.
In each AJC, the study team will conduct structured observations of AJC layouts and
operations. Site visitors will use observation worksheets to collect information on topics such as:
the location of the AJC (for example, in a mall, a stand-alone building, or an office building), the
signage for the AJC, the layout of the AJC, where various partner staff are housed at the center,
administrative and MIS functional areas, and how the center’s layout facilitates or impedes
collaboration and efficient customer flow. Site visitors will also observe what happens when a
customer first walks into the center (for example, how he or she is greeted and guided to
different activities, services, and resources), and the resource room.
The AJC Site Visits Master Protocol (instrument #3) will guide these on-site activities, and
Table A.2 below displays the research topics that the research team will address with each on-site activity. The table also indicates topics that will be addressed in telephone interviews with
state administrators.
2. State administrator interview data
Although the AJC is the primary unit of analysis, state-level factors—such as the workforce
system’s policies and decision-making structure—may play a role in influencing organizational
structure and behavior at AJCs. To systematically collect data on these state-level factors, the
study team will conduct semi-structured telephone discussions with state workforce
administrators in those states in which there is a study AJC. While the work of each AJC partner
is overseen by its requisite state agency (for example, the state division of vocational
rehabilitation that oversees services provided to people with disabilities), the study team will
focus on interviewing the administrators of entities that are most involved with the AJCs in each
of the study states. This will typically include representatives from two groups of state-level
staff: (1) managers of state-run workforce programs such as the Employment Service and (2)
agency staff responsible for enforcing WIA/WIOA and setting and monitoring policies relevant
to AJCs.
Telephone interviews with state-level staff will be conducted prior to site visits, providing
the study team with background information that will allow for more detailed exploration of
topics on site. While these interviews will typically include state workforce agency policy
specialists, state workforce administrators might determine that other staff members are more
appropriate to speak with, given the objectives of the calls and information sought. The protocol
that will guide these discussions is included as instrument #5 of this submission.
Table A.2. Research topics by data collection activity

Data collection activities (table columns): AJC site visit interviews with AJC operator staff, the AJC manager, partner representatives, line staff, the LWIB executive director, the LWIB contracts manager, AJC/LWIB finance staff, AJC/LWIB MIS staff, and local government workforce administrators; site visit observations; and state administrator interviews.

Research topics (table rows), grouped by domain:

Administrative structure and staffing: Administrative structure/operator; Management; Performance incentives; Staffing structure; Staff responsibilities and experiences; Staff supervision; Staff turnover; Staff training; Staff coordination

Partnerships: Partners/partner roles; Partner arrangements/contracts; Partner co-location; Partner goals; Employer engagement; Relationship with other employment service centers

Service delivery and linkages: Customer flow; Assessment of customer needs; Customer referrals; Services for employers; Number of customers served; Programs/initiatives for targeted groups of customers; Cross-program service delivery; Procedures for occupational training

Management and performance: Performance outcomes tracked; Performance data collected; Performance reporting; Use of performance data/outcomes

Funding and resource sharing: Funding sources and cost sharing; Financing structure; Fiscal monitoring and systems

Community and contextual factors: Political factors; Labor market/community conditions; Demographic characteristics

Overall: Innovative/promising practices; Challenges
3. Partner network analysis survey data
The study team will collect information to examine partnerships through a network
analysis survey. The three-question survey will systematically collect information on all of the
AJC partnerships, including partners with whom the study team may not be able to meet directly
on site. As part of the overall effort to describe and analyze AJCs as institutions, analyses of the
survey data will explore the strength of relationships between and among the key partners that
oversee service delivery within the AJC framework. The analysis will add unique information,
beyond that collected during site visit interviews, in that it will describe the extent of
collaborative relationships between and among partners in a visual, accessible way. The network
survey is a very brief, targeted tool. It is not intended to capture details on administrative
structures, formal partner mechanisms, characteristics, or roles of partner entities that would be
better collected during site visit interviews or other means. The survey is included as instrument
#6 of this submission.
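Responses to a survey of this kind can be assembled into a directed network, with each partner's answers forming one row of an adjacency matrix, from which summary measures such as tie density and reciprocity follow directly. The sketch below illustrates the idea; the partner names, the 0-3 collaboration scale, and all ratings are hypothetical, not taken from the actual instrument:

```python
# Illustrative only: partner names, the 0-3 collaboration scale, and all
# ratings below are hypothetical, not drawn from the actual survey instrument.
responses = {
    "WIA Adult":          {"Employment Service": 3, "TANF": 2, "Vocational Rehab": 1},
    "Employment Service": {"WIA Adult": 3, "TANF": 1, "Vocational Rehab": 0},
    "TANF":               {"WIA Adult": 2, "Employment Service": 0, "Vocational Rehab": 2},
    "Vocational Rehab":   {"WIA Adult": 1, "Employment Service": 0, "TANF": 2},
}

def density(responses, threshold=1):
    """Share of possible directed ties rated at or above the threshold."""
    ties = sum(1 for row in responses.values() for s in row.values() if s >= threshold)
    possible = len(responses) * (len(responses) - 1)
    return ties / possible

def reciprocity(responses, threshold=1):
    """Share of reported ties that the other partner also reports."""
    ties = {(a, b) for a, row in responses.items()
            for b, s in row.items() if s >= threshold}
    mutual = sum(1 for a, b in ties if (b, a) in ties)
    return mutual / len(ties) if ties else 0.0

print(round(density(responses), 2))      # 0.75
print(round(reciprocity(responses), 2))  # 0.89
```

Disagreements between the two directions of a pair (as with the Employment Service and TANF above) are themselves informative, which is one reason every partner is asked about every other partner.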
A.3. Use of technology to reduce burden
The Institutional Analysis of AJCs will use multiple methods to collect study information.
AJC site visits. AJC site visit data collection will not require information technology. The
study team will collect pre-site visit information by telephone and will conduct the site visit
interviews in person.
State administrator interviews. The study team will conduct state administrator interviews
by telephone without the use of information technology.
Partner network analysis survey. The partner network analysis survey will be distributed
and returned through email. It does not contain or request sensitive or personally identifiable
information (PII). Given the instrument’s brevity and the fact that it does not request or contain
PII, using a PDF document attached to email is the least burdensome and most accessible means
of collecting the data. Partner respondents can open the PDF attachment to the introductory
email, enter their responses, and forward the email back to the sender with the completed
document attached. They can do so at a convenient time and not be held to a scheduled
appointment, as would be the case if data collection were conducted by phone or in person. Each
AJC partner will be asked the same three questions about the other partners at that AJC.
The use of email allows for self-administration of the AJC partner survey, as well as
tracking of completed surveys. Partner contact information, gathered during the AJC site visit, will
be used to distribute the survey to the partners identified by each selected AJC. The full list of
partners will be preloaded into the PDF document to obtain a response that relates to each
partner. The PDF will allow for the entry of responses (only check marks or Xs are necessary)
but prevent revision of any other text or information in the questionnaire. The survey does not
contain complex skip patterns, and the respondent will be able to view the question matrix with
each possible category of response (across the top) and the full range of partners (down the side)
on one sheet. This approach is commonly used for network analysis data collection to help
respondents consider their levels of connectivity with all partners of the network and assess their
relationships using a common set of considerations regarding the question of interest. The
approach can only be used when the network is known ahead of time and the number of partners
is relatively small, and it has the added advantage of facilitating data entry and analysis in that
respondents provide information about all partners in the network. If the respondent is not able to
complete the survey in one sitting, he or she may save the document and return to it at another
time, further reducing the burden on the respondent.
A.4. Efforts to avoid duplication
The site visit, state interview, and partner network analysis survey data being collected for
the Institutional Analysis of AJCs is not otherwise available from existing sources. While
individual states, LWIBs, and AJCs track and report administrative data on program outcomes,
such quantitative data does not provide insight into how AJCs actually operate. Without
collecting the information specified in the site visit master protocol, state administrator telephone
interview protocol, and partner network analysis survey, a comprehensive institutional analysis
of AJCs cannot occur. Stakeholders would then lack information about the context in which
AJCs operate, the operational challenges AJCs face, and operational best practices. Further, no
data exist with which to assess the range and nature of partner program relationships, which are
central to AJC operations.
No respondent will be asked for the same information more than once. The AJC partner
representatives will not be asked during the semi-structured interviews any questions that they
are asked on the partner network analysis staff survey. The study team will request existing
agency performance management reports in order to construct a descriptive profile of each of the
AJCs in the sample and to fully understand AJC data systems and how the AJC uses data for the
purposes of performance management. The study team will not ask respondents to collect or
report information that is not available from existing reports; further, they will not request that
sites perform any special data tabulations. The study team will conduct abbreviated state
administrator interviews with staff in those states for which recent or current studies, such as the
WIA Gold Standard Evaluation, have examined and documented key aspects about the state role
in areas related to this study’s research domains. Finally, before contacting sites, the study team
will gather pertinent data from AJC and LWIB websites and other publicly available sources;
during previsit phone calls and onsite interviews, the research team will confirm this information
and request new information only about outstanding items.
A.5. Methods to minimize burden on small entities
No small businesses are expected to be involved in data collection. Nonetheless, instruments
have been tailored to minimize burden and collect only critical evaluation information.
A.6. Consequences of not collecting data
The federal investment of resources into the public workforce system and AJCs requires the
systematic collection of comprehensive institutional and partnership data. If the information is
not collected, program stakeholders will not have comprehensive information from a large set of
diverse AJCs from which to examine and describe the full range of AJC institutional features,
including the breadth of services offered; funding sources and leveraging; partnership
relationships and arrangements; services to target populations; and data reporting and
performance management. Without these data, federal policymakers will not have information on promising
AJC practices to meet the needs of diverse customers, nor will they understand how performance
is measured and defined for management purposes.
A.7. Special circumstances
No special circumstances apply to this data collection. In all respects, the data will be
collected in a manner consistent with federal guidelines. There are no plans to require
respondents to report information more often than quarterly, to submit more than one original
and two copies of any document, to retain records, or to submit proprietary trade secrets.
A.8. Federal Register announcement and consultation
1. Federal Register announcement
The 60-day notice [79 FR 44869] to solicit public comments was published in the Federal
Register on August 1, 2014. No comments were received.
2. Consultation outside of the agency
Consultations on the research design, sample design, and data needs were part of the study
design phase of the Institutional Analysis of AJCs. The purposes of these consultations were to
ensure the technical soundness of study sample selection and the relevance of study findings and
to verify the importance, relevance, and accessibility of the information sought in the study.
On December 5, 2013, the study team convened a Technical Working Group (TWG) that
included seasoned workforce professionals and research methodology experts to consult in
developing the design, the data collection plan, and the analysis plan for the study. TWG
members and others from the study team provided input regarding the three activities for which
clearance is requested: site visits, state administrator interviews, and the partner network
analysis survey.
All individuals who provided consultation during the development of the Institutional
Analysis of AJCs are listed below.
Technical Working Group
Maureen Conway, Executive Director, Economic Opportunities Program, The Aspen
Institute
Dr. Christopher King, Senior Research Scientist and Director of the Lyndon B. Johnson
School of Public Affairs, The University of Texas at Austin
Ron Painter, President/CEO, National Association of Workforce Boards
Helen Parker, Workforce Consultant (former Regional Administrator at DOL Employment
and Training Administration)
Dr. Carl Van Horn, Professor of Public Policy and Director, John J. Heldrich Center for
Workforce Development, Rutgers University
Mathematica Policy Research
Dr. Sheena McConnell, Vice President, Director of Human Services Research, Washington,
DC Office
Pamela Holcomb, Senior Researcher
Linda Rosenberg, Senior Researcher
Gretchen Kirby, Senior Researcher
Todd Honeycutt, Senior Researcher
Social Policy Research Associates
Dr. Ron D’Amico, President & Senior Social Scientist
Kate Dunham, Social Scientist and Assistant Director of Workforce Development
George Washington University
Dr. Burt Barnow, Amsterdam Professor of Public Service
A.9. Payments or gifts
The study team does not plan to offer any payments or gifts as incentives to interview or
survey respondents (who include state workforce staff, LWIB staff, AJC operators, and AJC
partner representatives and staff) as part of the data collection efforts described in this clearance
request.
A.10. Assurances of privacy
The Institutional Analysis of AJCs will not collect or report any PII, and is therefore not
subject to the Privacy Act (5 USC 552a). Data requested via existing program administration
reports will be asked for in the aggregate; the study team will not request any individual-level
data. Nonetheless, the study team will adhere to a set of strict approaches to ensure that data and
respondent privacy are protected. All interview respondents will be notified that the information
that they provide is private, that all data reported in project reports will be de-identified, and that
the study team will carefully safeguard study data. All study team site visitors and interviewers
will receive training in privacy and data security procedures.
1. Privacy
AJC site visits and state administrator interviews. AJC and state workforce agency
respondents will not be identified in any reports, nor will interview notes be shared by the study
team with DOL or anyone outside of the project team, except as otherwise required by law. Site
visit interviews and telephone interviews will be conducted in private areas, such as offices or
conference rooms. At the start of each interview, the study team will read the following
statement to assure respondents of privacy and to ask for their verbal consent to participate in the
interview:
Everything that you say will be kept strictly private within the study team.
The study report will include a list of the AJCs and their LWIBs and states
included in the study, as well as a description of the method by which AJCs and
states were selected to participate in the study. All interview data, however, will
be reported in the aggregate and, in our reports, we will not otherwise identify a
specific person, AJC, partner agency, or state unless we are highlighting a
promising practice.
This discussion should take about ___ minutes. Do you have any
questions before we begin? Do you consent to participating in this discussion?
___: I would like to record our discussion so I can
listen to it later when I write up my notes. No one outside the immediate team will
listen to the tape. If you want to say something that you do not want taped, please
let me know and I will be glad to pause the tape recorder. Do you have any
objections to being part of this interview or to my taping our discussion?
This statement can be found at the top of the AJC Site Visit Master Protocol (instrument #3) and
the State Administrator Interview Protocol (instrument #5).
Partner network analysis survey. AJC partner network analysis survey respondents will
not be identified in any reports. The study team will request only the name of the organization on
the network analysis survey instrument; respondents will not be asked to provide their
names. All other data items that identify network analysis survey respondents—job title,
organization name, and location—will be stored in a restricted file accessible only to the study
team. Because the study team is not requesting respondent names as part of the survey, analysis files
will also not contain respondent names. The introduction to the network survey contains a
statement assuring respondents of privacy, “Your responses will be kept private to the extent of
the law. Findings from the survey will be reported in aggregate form only so that no person can
be identified.” This statement can be found at the beginning of the survey (instrument #6).
To further remove any connection between individuals and their partner network analysis
survey responses, each completed survey will be saved immediately upon receipt in a secure
project folder on Mathematica’s restricted network drives. The saved survey will indicate only
the organizational affiliation of the respondent and the partner AJC. The survey document will
then be deleted from the return email to prevent it from being backed up on the email servers.
2. Data security
While the Institutional Analysis of AJCs will not collect or report any Personally
Identifiable Information (PII), Mathematica’s security staff and the study team will work
together to ensure that all data collected as part of the study – including data collected as part of
site visits, during administrator interviews, and through the partner network analysis survey
(including interview recordings) – is handled securely. As frequent users of data obtained from
and on behalf of federal agencies, Mathematica has adopted federal standards for the use,
protection, processing, and storage of data. These safeguards are consistent with the Privacy Act,
the Federal Information Security Management Act, Office of Management and Budget Circular
A-130, and National Institute of Standards and Technology security standards. Mathematica
strictly controls access to information on a need-to-know basis. Data is encrypted in transit and at
rest using Federal Information Processing Standard 140-2 compliant cryptographic modules.
Mathematica will retain the data collected on the Institutional Analysis of AJCs for the duration
of the study. Data processed for the Institutional Analysis of AJCs will be completely purged
from all data storage components of the computer facility in accordance with instructions from
DOL. Until purging of all data storage components, Mathematica will certify that any data
remaining in any storage component will be safeguarded to prevent unauthorized disclosure.
A.11. Justification for sensitive questions
The instruments associated with the Institutional Analysis of AJCs do not contain questions
of a sensitive or personal nature. No personal information will be requested from respondents
interviewed during site visits, other than the number of years served in their current employment
position. The interviews focus on respondents’ knowledge, experiences, and impressions of the
AJC system. Nonetheless, respondents will be informed that they do not have to respond to any
questions that they do not feel comfortable answering.
A.12. Estimates of hours burden
1. Hours by activity
Table A.3 provides the annual burden estimates for each of the three data collection
activities for which this package requests clearance. All of the activities will take place over a
12-month period.
AJC site visits. We expect to conduct two 1.00-hour phone calls with the AJC manager and
an LWIB staff person prior to site visits. The total estimated reporting burden for pre-site visit
data collection will be 120 hours (60 sites X 2 telephone calls per site X 1 respondent per call X
1.00 hour per call).
On site, we expect to conduct interviews with LWIB, local government, AJC, and partner
staff. Interviews will last, on average, 1.50 hours (such as interviews with line staff), and no
single interview will exceed 2 hours (such as interviews with LWIB staff). Most will be one-on-one
interviews, but we anticipate that some of the line-staff interviews will be with small groups
of two to three staff.
We estimate that the maximum total hours for AJC data collection in the 60 sites is 2,820
hours [(60 sites X 1 LWIB staff respondent X 1.00 hour per previsit call) + (60 sites X 1 AJC
manager respondent X 1.00 hour per previsit call) + (60 sites X 5 LWIB staff respondents X 1.50
hours per interview) + (60 sites X 1 local government official X 1.50 hours per interview) + (60
X 2 AJC operator staff X 1.50 hours per interview) + (60 sites X 6 AJC manager and partner
representative respondents X 1.50 hours per interview) + (60 sites X 16 AJC line staff
respondents X 1.50 hours per interview)].
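The site visit burden arithmetic above can be reproduced with a short script (respondent counts and interview lengths are the figures stated in this section):

```python
SITES = 60
PREVISIT, INTERVIEW = 1.00, 1.50  # hours per previsit call / on-site interview

# (respondents per site, hours per response) for each AJC data collection activity
activities = [
    (1, PREVISIT),    # LWIB staff previsit call
    (1, PREVISIT),    # AJC manager previsit call
    (5, INTERVIEW),   # LWIB staff interviews
    (1, INTERVIEW),   # local government official
    (2, INTERVIEW),   # AJC operator staff
    (6, INTERVIEW),   # AJC manager and partner representatives
    (16, INTERVIEW),  # AJC line staff
]

total_hours = sum(SITES * n * hours for n, hours in activities)
print(total_hours)  # 2820.0
```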
State administrator interviews. We expect to conduct an average of two 1.00-hour
telephone interviews with state workforce administrators in up to 40 states. We expect that one
administrator will attend each interview (although, at their discretion, state managers might
invite other staff to participate). Thus, we have estimated the total maximum reporting burden for
the state administrator interview to be 80 hours (40 states X 2 interviews per state X 1
respondent at each interview X 1.00 hour per interview).
Partner network analysis survey. We expect to conduct the network analysis survey of
AJC partners with approximately 450 partners (30 AJCs with 15 partners per site). We expect the
survey to take .17 hours (or approximately 10 minutes) to complete, on average, per respondent.
We will select two of the AJC sites to pre-test the partner survey. Within each site, we will
identify three to four partners for pre-testing, not to exceed nine AJC partners in total across the
two centers.
The total estimated reporting burden for the AJC partners participating in the survey is 76.5
hours (30 AJCs X 15 partners X .17 hours per survey). Including pre-testing time, the total
burden for the partner network analysis survey is 78 hours.
Table A.3. Annual burden estimates for data collection efforts

Respondents | Number of respondents per site | Total number of respondents(a) | Frequency of data collection | Average time per response (hours) | Total maximum burden (hours)

AJC site visits
LWIB staff
- Previsit calls | 1 | 60 | Once | 1.00 | 60
- Site visit interviews | 5 | 300 | Once | 1.50 | 450
Local government official | 1 | 60 | Once | 1.50 | 90
AJC operator staff | 2 | 120 | Once | 1.50 | 180
AJC staff: managers
- Previsit calls | 1 | 60 | Once | 1.00 | 60
- Site visit interviews | 6 | 360 | Once | 1.50 | 540
AJC staff: line staff | 16 | 960 | Once | 1.50 | 1,440
Subtotal | 32 | 1,920 | -- | -- | 2,820

State administrator interviews
State workforce administrators | 2 | 80 | Once | 1.00 | 80

Partner network analysis survey
AJC partner network survey
- Pre-test | -- | 9 | Once | .17 | 1.50
- Administration | 15 | 450 | Once | .17 | 76.5

Total | -- | 2,459 | -- | -- | 2,978

(a) Total number of respondents reflects site visits to 60 AJCs, state administrator interviews in 40 states, and the partner network analysis survey administered to 450 partners from 30 AJCs.
2. Total estimated burden hours
The total estimated maximum hours of burden for the data collection included in this request
for clearance is 2,978 hours (see Table A.3), which equals the sum of the estimated burden for
the AJC data collection, state administrator telephone interviews, and completion of the partner
network analysis survey (2,820 + 80 + 78 = 2,978).
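The total can be checked the same way; a minimal sketch using the estimates above:

```python
site_visits = 2820                    # AJC site visit burden (including previsit calls)
state_interviews = 40 * 2 * 1 * 1.00  # 40 states x 2 interviews x 1 respondent x 1 hour
survey = round(30 * 15 * 0.17, 1)     # 450 administrations at ~10 minutes -> 76.5 hours
pretest = round(9 * 0.17, 1)          # up to 9 pre-test respondents -> 1.5 hours

total = site_visits + state_interviews + survey + pretest
print(total)  # 2978.0
```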
The total monetized burden estimate for this data collection is $74,115 (see Table A.4). The
average hourly wage of miscellaneous community and social service specialists taken from the
U.S. Bureau of Labor Statistics, National Compensation Survey, 2012, is $18.37. Therefore, the
cost estimate for front-line staff from across different AJC partners to participate in site visit
interviews is $26,453. Using the average hourly wage of social and community service managers
taken from the U.S. Bureau of Labor Statistics, National Compensation Survey, 2012
(http://www.bls.gov/ncs/ncspubs_2012.htm), which is $30.99, the maximum cost estimate for
state administrators, local government officials, AJC operator staff, AJC managers and partner
representatives, and LWIB staff to participate in AJC data collection, phone interviews, and the
partner survey is $47,662 ($1,859 + $13,946 + $2,789 + $5,578 + $1,859 + $16,735 + $2,479 +
$46 + $2,371). Note that not all interviewed partner representatives will be manager-level staff;
we used manager-level salaries to estimate maximum monetary burden because managers will be
the highest possible level of partner staff interviewed.
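The component arithmetic behind these figures can be verified directly; a minimal sketch, using the wage rates from the text and rounding each component to whole dollars as in Table A.4:

```python
from decimal import Decimal, ROUND_HALF_UP

def dollars(hours, wage):
    """Hours x hourly wage, rounded to whole dollars (half up)."""
    return int((Decimal(str(hours)) * wage).quantize(Decimal("1"), rounding=ROUND_HALF_UP))

MANAGER_WAGE = Decimal("30.99")  # social and community service managers
LINE_WAGE = Decimal("18.37")     # community and social service specialists

# Burden hours billed at the manager wage: six site visit respondent groups,
# state administrator interviews, and the partner survey (pre-test + administration)
manager_hours = [60, 450, 90, 180, 60, 540, 80, 1.5, 76.5]

manager_cost = sum(dollars(h, MANAGER_WAGE) for h in manager_hours)
line_cost = dollars(1440, LINE_WAGE)  # AJC line staff interviews

print(manager_cost)              # 47662
print(line_cost)                 # 26453
print(manager_cost + line_cost)  # 74115
```

Decimal arithmetic is used so that exact half-dollar products (such as 450 hours x $30.99 = $13,945.50) round predictably.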
Table A.4. Monetized burden hours

Respondents | Total maximum burden (hours) | Type of respondent | Estimated hourly wages | Total indirect cost burden

AJC site visits
LWIB staff
- Previsit calls | 60 | Manager | $30.99 | $1,859
- Site visit interviews | 450 | Manager | $30.99 | $13,946
Local government official | 90 | Manager | $30.99 | $2,789
AJC operator staff | 180 | Manager | $30.99 | $5,578
AJC staff: managers
- Previsit calls | 60 | Manager | $30.99 | $1,859
- Site visit interviews | 540 | Manager | $30.99 | $16,735
AJC staff: line staff | 1,440 | Line staff | $18.37 | $26,453
Subtotal | 2,820 | -- | -- | $69,219

State administrator interviews
State workforce administrators | 80 | Manager | $30.99 | $2,479

Partner network analysis survey
AJC partner network survey
- Pre-test | 1.5 | Manager | $30.99 | $46
- Administration | 76.5 | Manager | $30.99 | $2,371

Total | 2,978 | -- | -- | $74,115
A.13. Estimates of cost burden to respondents
There will be no direct costs to respondents for the Institutional Analysis of AJCs.
A.14. Annualized costs to the federal government
DOL, like most other Federal agencies, uses contracts with firms that have proven
experience with program evaluation to conduct all evaluation activities. Federal employees will
rely on contract staff to perform the majority of the work described in this package, and have no
direct role in: conducting site visit discussions or focus groups, developing study protocols or
designs, the direct collection of data using these instruments, or the analysis or production of
reports using these data. The role of Federal staff is almost entirely restricted to managing these
projects. The costs incurred by contractors to perform these activities are essentially direct
Federal contract costs associated with conducting site visits, discussions and focus groups.
This estimate of Federal costs is a combination of (1) direct contract costs for planning and
conducting this research and evaluation project including any necessary information collection
and (2) salary associated with Federal oversight and project management.
Estimates of direct contract costs. There are three categories of direct costs to the Federal
government associated with conducting this project. These costs are routine and typical for
studies such as this. The first category is design and planning, including external review of the
design by a technical working group of outside subject matter experts. This work is estimated to
cost $579,638. The second category is data collection, which will occur through the project
period, and is estimated to cost $1,783,190. The final category is for analysis and reporting.
This category includes synthesizing the findings into conclusions and production of deliverables
such as reports. This work is estimated to cost $635,994. The total estimated direct costs are
estimated to be:
$579,638 (design) + $1,783,190 (data collection) + $635,994 (reporting) = $2,998,822.
Although this project is expected to have a duration of four years, the annualized direct
contract cost will vary considerably from year to year because the tasks are focused on specific
periods in the project life cycle. The design and planning costs are front-loaded, the data
collection costs will be incurred throughout the project, and the analysis and reporting costs will
occur close to the end of the project. As a basic estimate, the total estimated direct costs can be
divided by the four years of the study to produce an average annualized cost:
$2,998,822 / 4 years of study = $749,705.50 per year in estimated direct contract costs.
Estimates of Federal oversight and project management costs. Staff in the Office of the
Chief Evaluation Officer have regular duties and responsibilities for initiating, overseeing and
administering contracts to perform research and evaluation on behalf of agency programs and
offices. In the event that OMB approves this information collection request, Federal staff would
need to perform certain functions that, while clearly part of their normal duties, would be directly
attributable to this specific research and evaluation project. For purposes of calculating Federal
salary costs, DOL assumes:
1. A Senior Evaluation Specialist, GS-14, step 2, based in the Office of the Chief Evaluation
Officer in Washington, DC, who would earn $53.14 per hour to perform this work and would
spend approximately one-fourth of their annual time (2,080 hours / 4 = 520 hours) on this
project. Total estimated federal costs, on an annualized basis for this individual, are 520 hours
X $53.14/hour = $27,632.80.
2. A Senior Evaluation Specialist, GS-15, step 2, based in the Office of the Chief Evaluation
Officer in Washington, DC, who would earn $62.51 per hour to perform this work and would
spend approximately one-eighth of their annual time (2,080 hours / 8 = 260 hours) on this
project. Total estimated federal costs, on an annualized basis for this individual, are 260
hours X $62.51/hour = $16,252.60.
These wages are drawn from the most current available estimates of wages and salaries
available at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/15Tables/html/DCB_h.aspx.
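The annualized salary arithmetic in the two items above can be reproduced as:

```python
FULL_TIME_HOURS = 2080  # standard federal work year

# (hourly wage, share of annual time) for each Federal staff member
staff = [
    (53.14, 1 / 4),  # Senior Evaluation Specialist, GS-14 step 2
    (62.51, 1 / 8),  # Senior Evaluation Specialist, GS-15 step 2
]

costs = [round(wage * FULL_TIME_HOURS * share, 2) for wage, share in staff]
print(costs)  # [27632.8, 16252.6]
```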
Table A.5. Summary Table of Estimated Federal Costs for the Institutional
Analysis of American Job Centers

Estimates of Direct Contract Costs
Design and Planning for the Study | $579,638
Data Collection | $1,783,190
Analysis and Reporting | $635,994
Subtotal for Direct Contract Costs | $2,998,822

Estimates of Direct Federal Staff Costs
1 GS-15 (1/8 time) | $16,251
1 GS-14 (1/4 time) | $27,633
Subtotal for Federal Oversight and Management | $43,884

Total Cost | $3,042,706
Note that Federal staff costs are based on Salary Table 2015-DCB (Step 2, incorporating the
1.5 percent general schedule increase and a locality payment of 24.22 percent for the locality pay
area of Washington-Baltimore-Northern Virginia, DC-VA-WV-PA); Department of Labor grade
ranges are as of October 2015. Ref: https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/15Tables/html/DCB_h.aspx.
A.15. Reasons for program changes or adjustments
This is a new submission. There is no request for program changes or adjustments.
A.16. Plans for tabulation and publication of results
1. Data analysis
The Institutional Analysis of AJCs will use the rich information collected from all sources to
describe the AJCs and to identify different AJC typologies. The analysis plan consists of a
mixed-method approach with five steps: (1) organize the qualitative data from state telephone
interviews and site visits; (2) create summary statistics from the performance management
reports provided by sites; (3) use the funding information to calculate the proportion of AJC
infrastructure, management, and services funded by each partner; (4) identify themes in the data
within and across AJCs; and (5) conduct a network analysis using data from the network survey of
AJC partners to develop typologies of partner relationships.
Organize the qualitative data. Analyzing qualitative data is inherently challenging because
it requires combining information from different sources, a great deal of which is unstructured.
Compounding this challenge, the proposed study will collect data from 60 AJCs across multiple
states and regions. Our first strategy to manage the volume of data will be to develop structured
templates and checklists for site visitors to use to distill the information that they collect during
site visits. Through these templates, site visitors will respond to specific questions and avoid long
narratives on particular topics of interest. Our second strategy will be to lay an analytic
foundation by organizing the data from the site visits and the state telephone interviews using
qualitative data analysis software, such as Atlas.ti (Scientific Software Development 1997).
Create summary statistics from the data in performance management reports
provided by sites. Using data from the performance management reports provided by sites, the
study team will calculate summary statistics by AJC in sites where this data is available. The
study team anticipates that the reports will contain data that allow them to describe the
percentage of all customers who are enrolled in WIA, the Employment Service, and other key
funding streams/programs, as well as customers’ characteristics and their service receipt. For example, for each
AJC, the study team will estimate the proportion of customers who are veterans and the
proportion who are under 24, 24-55, and older than 55 years of age. Similarly, to the extent sites
are able to provide this data, the study team will calculate the proportion of customers who
receive each type of service (such as intensive services, training, Trade Adjustment Assistance,
Temporary Assistance for Needy Families, supportive services). The study team will organize
this information using a performance management data collection worksheet. This will ensure
that data is documented in a standardized way that allows for systematic analysis across sites.
Use funding information to calculate the proportion of AJC infrastructure,
management, and services funded by each partner. The study team will use AJC funding data
to gain a deeper understanding of the role and importance of each partner in AJCs’ institutional
arrangements and service delivery, within and across sites. Specifically, the team will calculate
(a) how much funding each AJC had in the program year of interest, and the sources of the
funds, and (b) the proportion of AJC infrastructure, management, staffing, and services that were
funded by each partner. As with the performance management data, the study team will use a
financial data collection worksheet to organize the financial information collected from sites.
This will ensure that data is documented in a standardized way. Then, the study team will
calculate relevant summary statistics, and identify themes and patterns in the financial data to
complement the institutional analysis and provide a more comprehensive picture of variation in
AJC funding and cost-sharing approaches. Finally, the study team will highlight resource-sharing
challenges; different approaches for defining, sharing, and reporting costs; and particularly
comprehensive AJC resource-sharing approaches, which will be of particular interest to
policymakers.
Identify themes and patterns in the data. A critical part of the analytic approach will be to
triangulate across multiple sources, including different respondents within an AJC as well as
interview and programmatic data. Both agreement and discrepancy in respondents’ responses, or
across data sources, can provide useful information about how AJCs operate and communicate
across partners and staff.
Once all of the site visit data—from pre-site visit data collection (including financial and
performance management reports) and onsite interviews and observations—and state
administrator interview data have been organized, the study team can examine the data to look
for similarities in models of organization, service delivery, or other characteristics, and then
develop typologies of AJCs. For example, researchers from Mathematica’s WIA Adult and
Dislocated Worker Programs Gold Standard Evaluation have identified two structures for how
AJCs sequence services: in one, all customers must first use the resource room before they can
request more intensive assistance; in the other, all customers meet with a staff member soon
after they enter the AJC. In the Institutional Analysis of AJCs, the study team
would go beyond that conception to determine which customers are enrolled into WIA programs
(versus other funding sources) or co-enrolled, and why.
One simple analytical technique for identifying patterns is to present the data in summary
tables by domain. For example, if the study team expects a correlation between the
likelihood that a customer will receive services from multiple partners and the co-location of
partners at the AJC, they can present in tables the percentage of AJC customers who receive one,
two, or three separate services by the number of partners physically located at the AJC.
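The cross-tabulation described above can be sketched in a few lines. The customer records here are hypothetical placeholders for the site data the study team would collect:

```python
from collections import Counter

# Illustrative customer records: number of distinct services received and
# the count of partners co-located at that customer's AJC (hypothetical data).
customers = [
    {"services": 1, "colocated_partners": 2},
    {"services": 3, "colocated_partners": 6},
    {"services": 2, "colocated_partners": 6},
    {"services": 1, "colocated_partners": 2},
    {"services": 3, "colocated_partners": 6},
]

# Cross-tabulate: count customers by (services received, partners on site).
table = Counter((c["services"], c["colocated_partners"]) for c in customers)

# Convert counts to percentages within each co-location level.
col_totals = Counter(c["colocated_partners"] for c in customers)
percentages = {
    (svc, loc): count / col_totals[loc]
    for (svc, loc), count in table.items()
}
```

Each cell of `percentages` gives the share of customers at a given co-location level who received one, two, or three services, mirroring the summary table described above.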
Conduct a network analysis using data from the partner survey. The partner survey will
explore the structure and strength of the networks that exist to serve AJC customers by assessing
a number of specific characteristics of each AJC service delivery network. The survey will
gather information about the frequency of communication, level of collaboration, and flow of
referrals between entities that will build a picture—both visual and descriptive—of the role of
and connection between partner entities within the AJC service delivery network. The study team
will not request respondents’ names on the network survey instrument, only organization names.
Further, although the team will conduct the analysis separately for each AJC, sites will not be
identified in the presentation of findings; instead, the team will analyze partner networks
across all AJC partners surveyed and develop typologies of networks for presentation. In this
way, results from the network survey will not reveal the identity of any respondent.
The study team will use two primary measures to describe and depict service delivery
networks within and across AJCs: density (interconnectedness) and centrality (prominence).
Density is the proportion of possible relationships that are actually present, and measures the
extent to which each partner is connected with all others across the network as a whole.
Centrality can be used to examine a few different concepts: (a) the prominence of individual
entities within the network by identifying the partner entities that are most sought after (indegree
centrality), (b) partners that play a central role in supporting communication between other
partners (betweenness centrality), and (c) the degree to which entities are approximately equally
central to the network or to which some entities are much more central than others (betweenness
centralization).
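The density and indegree centrality measures can be sketched by hand on a toy directed network (the partner names and ties below are hypothetical; betweenness measures are omitted for brevity):

```python
# Directed ties: (seeker, sought-after partner). Names are hypothetical.
edges = [
    ("TANF", "WIA Adult"),
    ("Wagner-Peyser", "WIA Adult"),
    ("TAA", "WIA Adult"),
    ("WIA Adult", "Wagner-Peyser"),
]
nodes = {"WIA Adult", "Wagner-Peyser", "TANF", "TAA"}
n = len(nodes)

# Density of a directed network: observed ties over all possible
# ordered pairs, n * (n - 1).
density = len(set(edges)) / (n * (n - 1))

# Indegree centrality: the share of other partners that seek out each
# entity, identifying the most sought-after partners.
indegree = {node: 0 for node in nodes}
for _, target in edges:
    indegree[target] += 1
indegree_centrality = {node: deg / (n - 1) for node, deg in indegree.items()}
```

In this toy network, "WIA Adult" is sought out by all three other partners, so its indegree centrality is 1.0, while the network as a whole is sparsely connected.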
Using “sociograms,” the study team will illustrate the patterns in the size of partner
networks, the strength of the relationships across partners, and the direction of partnerships.
These sociograms will depict the density and centrality of AJC networks based on (a) contact
frequency, and (b) the level of collaboration among key partners. In addition to sociograms, the
study will produce tables that present network-level characteristics such as overall density and
centralization (measures discussed above), also analyzed separately by frequency of
communication and level of collaboration. Throughout the analysis, the study team will work to
identify different typologies of AJC networks that will display key differences across AJCs. We
expect that the typologies may be derived based on similar size and composition of the partner
networks, by network measures of density and centralization, or by clustering among specific
partners (as identified by subgroup density).
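Subgroup density, mentioned above as a way to identify clustering among specific partners, can be sketched as the share of possible ties among subgroup members that are actually present (the partners and ties here are hypothetical):

```python
# Hypothetical directed ties among AJC partners.
edges = {
    ("WIA Adult", "Wagner-Peyser"),
    ("Wagner-Peyser", "WIA Adult"),
    ("WIA Adult", "TAA"),
    ("TANF", "Adult Education"),
}

# A candidate subgroup of workforce programs suspected of clustering.
subgroup = {"WIA Adult", "Wagner-Peyser", "TAA"}

# Keep only ties where both endpoints are inside the subgroup.
internal_ties = [(a, b) for (a, b) in edges if a in subgroup and b in subgroup]

# Subgroup density: internal ties over all possible ordered pairs
# among the k subgroup members, k * (k - 1).
k = len(subgroup)
subgroup_density = len(internal_ties) / (k * (k - 1))
```

A subgroup whose internal density is well above the overall network density would signal the kind of clustering that could define a network typology.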
The study team will also use the data collected on partner referrals to measure density and
centrality of the network specifically as it relates to the flow of referrals. Using the
typologies derived from earlier analyses, the study team will compare the network measures of
density and centrality for models based on frequency of communication with those based on the
flow of referrals for selected AJCs within each typology. The study team will then examine the
measures of prominence for specific partners within the selected networks across the two models for
comparison. We expect to find differences in network interconnectedness and in the centrality of
partners between networks based on any communication and those based specifically on service referrals.
2. Publication plan and schedule
Findings from the evaluation will be presented in a final report and one user-friendly policy
brief. Table A.6 shows the schedule for the study.
Table A.6. Schedule for the Institutional Analysis of AJCs
Activity                                              Date
Conduct state administrator telephone interviews      February 2016 – July 2016
AJC data collection                                   February 2016 – July 2016
Network survey administration                         March 2016 – September 2016
Policy brief                                          December 2017
Final report                                          December 2017
A.17. Approval not to display the expiration date for OMB approval
The OMB approval number and expiration date will be displayed or cited on all forms
completed as part of the data collection.
A.18. Explanation of exceptions
No exceptions are necessary for this information collection.
REFERENCES
Barnow, Burt, and Christopher King. “The Workforce Investment Act in Eight States.” Prepared
for the U.S. Department of Labor Employment and Training Administration. Albany, NY:
Nelson A. Rockefeller Institute of Government, February 2005.
D’Amico, Ron, Kate Dunham, Annelies Goger, Charles Lea, Nicole Rigg, Sheryl Ude, and
Andrew Wiegand. “Findings from a Study of One-Stop Self-Services: A Case-Study
Approach.” ETA Occasional Paper No. 2011-16. Washington, DC: U.S. Department of
Labor, Employment and Training Administration, 2011.
Dunham, Kate, Annelies Goger, Jennifer Henderson-Frakes, and Nichole Tucker. “Workforce
Development in Rural Areas: Changes in Access, Service Delivery and Partnerships.”
Prepared for the U.S. Department of Labor Employment and Training Administration.
Washington, DC: DOL, 2005.
Holcomb, Pamela, Rosa Maria Castaneda, and John Trutko. “The One-Stop Workforce Delivery
System in Virginia’s Local Workforce Investment Areas: An Assessment.” Prepared for the
Virginia Employment Commission. Washington, DC: The Urban Institute, 2007.
Mack, Melissa. “Strategies for Integrating the Workforce System: Best Practices in Six States.”
Final report submitted to the Washington Workforce Training and Education Coordinating
Board. Oakland, CA: Social Policy Research Associates, September 7, 2006.
Macro, Bronwen, Sherry Almandsmith, and Megan Hague. “Creating Partnerships for
Workforce Investment: How Services Are Provided Under WIA.” Prepared for the U.S.
Department of Labor Employment and Training Administration. Oakland, CA: Berkeley
Planning Associates, September 2003.
Mueser, Peter R., and Deanna L. Sharpe. “Anatomy of Two One-Stops: Camdenton, Missouri
and Columbia, Missouri.” Prepared for the U.S. Department of Labor Employment and
Training Administration, Occasional Paper 2006-08. Columbia, MO: Employment and
Training Administration and University of Missouri-Columbia, 2006.
Pindus, N., K. Robin, K. Martinson, and J. Trutko. “Coordination and Integration of Welfare and
Workforce Development System.” Washington, DC: The Urban Institute and the U.S. Department of
Health and Human Services, 2000.
Salzman, J. “Integration and the WorkSource System.” Final report submitted to the Washington
Workforce Training and Education Coordinating Board. Oakland, CA: Social Policy
Research Associates, September 7, 2006.
Social Policy Research Associates. “The Workforce Investment Act After Five Years: Results
from the National Evaluation of the Implementation of WIA.” Oakland, CA: Social Policy
Research Associates, 2004.
Stack, Treva, and David Steven. “Anatomy of a One-Stop: Baltimore City Eastside Career
Center.” Prepared for the U.S. Department of Labor Employment and Training
Administration, Occasional Paper 2006-07. Employment and Training Administration.
Baltimore, MD: University of Baltimore, 2006.
United States General Accounting Office. “Workforce Investment Act: One-Stop Centers
Implemented Strategies to Strengthen Services and Partnerships, but More Research and
Information Sharing Is Needed.” Report to Congressional Requesters. Washington, DC:
GAO, June 2003.
Werner, Alan, and Kendra Lodewick. “Serving TANF and Low-Income Populations Through
WIA One-Stop Centers.” Cambridge, MA: Abt Associates, 2004.
Wright, David J., and Lisa M. Montiel. “Workforce System One-Stop Services for Public
Assistance and Other Low-Income Populations: Lessons Learned in Selected States.” Report
to the U.S. Department of Labor, Employment and Training Administration. Albany, NY:
Nelson A. Rockefeller Institute of Government, 2010.
File type: application/pdf
File title: Part A: Justification for the Collection of Data for the Institutional Analysis of American Job Centers
Author: Jessica Ziegler
File created/modified: December 2, 2015