
PART A: JUSTIFICATION for JOB CORPS EVIDENCE-BUILDING PORTFOLIO PROGRAM

OMB No. 1290-0NEW

February 2022


Part A: Justification

The Chief Evaluation Office (CEO) of the U.S. Department of Labor (DOL) seeks approval to collect information for the implementation, outcome, and impact feasibility studies of two Job Corps demonstration pilots. These studies, funded as part of the broader Job Corps Evidence-Building Portfolio Program, aim to understand who the pilots enroll, what services they provide, how these services are implemented, and how the pilots compare with traditional Job Corps centers. The evaluation will also assess outcomes of participants in the demonstration pilots. The project also includes an impact feasibility assessment of each pilot to determine the potential for conducting an impact evaluation of that pilot’s effectiveness or of similar future pilots. MDRC and its subcontractor Abt Associates have been contracted to conduct these studies. This supporting statement is the first and only OMB submission planned for the Job Corps Evidence-Building Portfolio Program. This package requests clearance for four data collection activities as part of these studies:


  1. Program survey of demonstration pilot grantees.

  2. Semi-structured interviews with program staff and staff from employer and partner organizations topic guide.

  3. Participant interviews or focus groups topic guide.

  4. Impact feasibility assessment interviews with demonstration pilot staff topic guide.



1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.

Job Corps is the nation’s largest program serving young people ages 16 to 24 who need comprehensive education, job training, and employment services to connect to the labor force and advance toward stable, high-quality jobs. The program aims to help young people who did not finish high school or are unemployed train for careers and obtain a high school diploma or its equivalency. Job Corps services are typically delivered at 123 centers throughout the United States.

Through the Workforce Innovation and Opportunity Act (WIOA), Job Corps has authority to implement demonstration projects to explore new and innovative ways to serve the young people who are eligible for Job Corps. Current demonstration pilots include the Job Corps Scholars Program and the Idaho Job Corps Demonstration Project. The agreements to operate these pilots require that pilots participate in an independent, third-party evaluation. The implementation and impact feasibility studies will assess the outcomes and implementation of these pilots to provide information to Job Corps about potential strategies for serving this population of young people. Lessons from the demonstration pilots will provide information to Job Corps and programs serving Job Corps-eligible young people about how to potentially strengthen education and employment services for this population.



Citation of sections of laws that justify this information collection:

The Job Corps program is authorized by the Workforce Innovation and Opportunity Act (WIOA) of 2014 (P.L. 113-128). Section 156(a) of WIOA allows the Department of Labor to “carry out experimental, research, or demonstration projects relating to carrying out the Job Corps program” (29 U.S.C. 3206(a)).

Sections 161 and 169 of WIOA require that the Job Corps program undergo a third-party evaluation every five years that examines the general effectiveness of the program, the effectiveness of performance accountability measures, the effectiveness of the structure and mechanisms for delivery of services, the impact of the program on the local community and related programs, and the extent to which the program meets the needs of various demographic groups.

This package requests clearance for four data collection activities that require approval so that collection may begin in February 2022 as part of the implementation evaluation. Given that some of the pilots will cease enrolling or providing services to students in the first half of 2022, it is critical that the research team be able to start fielding the instruments in early 2022 in order to gather data while the pilots are operating in a steady state.

2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

The data gathered through the activities summarized in this request will be used by DOL to comprehensively describe implementation of the demonstration pilots, including who the pilots enroll, what services they provide, how these services are implemented, and how the pilots compare with traditional Job Corps. Although not addressed in the instruments in this package, the evaluation will also assess outcomes of participants in the demonstration pilots through analysis of de-identified data gathered by DOL from the pilots and through existing administrative data sets, which does not require OMB approval. The project also includes an impact feasibility assessment of each demonstration pilot to determine the potential for conducting an impact evaluation of that pilot’s effectiveness or of similar future pilots.


  1. Overview of the evaluation


The Job Corps Evidence-Building Portfolio Program demonstration pilot studies include two components: (1) an implementation and outcome evaluation to understand program implementation and participant outcomes and (2) an impact feasibility assessment to determine the potential for conducting an impact evaluation of each pilot’s effectiveness or of similar future pilots. The research will take place from 2021 to 2024 and will address the following research questions across seven domains:


  1. Program planning and design. What is needed and required for each of the pilots to start up? How does the demonstration pilot compare with Job Corps?

  2. Program services. What types and combinations of services were provided by the pilot? How were the services implemented, such as staffing, training, and use of technology? What were the perceived influences that the implementing agency or local context had on implementation?

  3. Recruitment and enrollment. How did young people enroll in the pilots and what are their characteristics?

  4. Participant experience. What were the experiences of young people who enrolled in the program? To what extent were youth engaged in pilot program activities, and what factors facilitated or constrained their participation?


  5. Cost. What were the costs to provide services?

  6. Student outcomes. What were the outcomes for the young people who participated in the program?

  7. Impact feasibility. What are the options for conducting an impact study of the pilot?

To address the research questions listed above, pilot studies will include the following data collection activities:

  1. Pilot survey data (clearance requested in this package)

  2. Interview and focus group data collection during in-person or virtual site visits (clearance requested in this package)

  3. Document review

  4. Observations of program activities

  5. Participant application, participation and follow-up data provided by pilot sites to ETA

  6. Other administrative data from the National Student Clearinghouse (NSC) and the National Directory of New Hires (NDNH)

The implementation study will answer research questions 1-5. The outcome study will answer question 6. The impact feasibility study will answer question 7. To conduct the implementation study, the research team will collect descriptive information about pilot implementation through staff and participant interviews and focus groups, and through a pilot survey. The pilot survey will be fielded to 30 demonstration sites and satellites (26 Job Scholars sites and 4 Idaho sites). Interviews and focus groups will be conducted with staff and participants, either virtually or in person, at a subset of pilot grantees. The results of the pilot survey will be used to select sites for site visits and interviews. The research will take a tiered approach to data collection, with some sites receiving in-depth site visits in which most staff are interviewed and other sites receiving targeted interviews with a smaller share of staff. Approximately six pilots will be selected for site visits, with approximately 15 interviews during each visit, and approximately 15 additional sites will be selected for virtual interviews of five or fewer staff. The research team will select sites that span the variation in pilot implementation, such as pilot structure, enrollment strategies, types of training offered, urbanicity, region, types of partners, and types of support services provided to participants. The impact feasibility interviews will be conducted in five sites. The research team will also review documents from all pilots and observe program activities at select pilots, either in person or virtually via phone or video, to help describe key program components and participant engagement. Document review and observations do not require clearance.

This ICR includes the pilot survey and the topic guides that will be used to develop pilot-specific interview protocols for the interviews and focus groups in the implementation and impact feasibility studies. De-identified data gathered by DOL and administrative data will be used to answer question 6.


2. Overview of the data collection


Understanding the implementation, outcomes, and impact feasibility of Job Corps’ pilots requires collecting data from multiple sources. A survey will provide general information about implementation and services offered across all the pilots. Interviews with staff, partners, and participants will provide more detailed information about implementation at selected pilots. Though the demonstration pilots are required to provide certain services, each pilot has adapted the program to its local context, including available training programs, partners, and employers. Thus, the research team will use topic guides to develop specific interview protocols for each pilot. Pilot responses to the survey will be used to select sites for interviews to ensure that interviewees reflect the range of variation in how pilots are implementing their programs. This package seeks approval for the pilot survey and topic guides for two types of respondents for the implementation study: program staff and partner staff, and participants. An additional topic guide is included for interviews with pilot staff who can inform the impact feasibility study. Participant interviews will be conducted individually or as focus groups.

  1. Program survey of demonstration pilot grantees. The project will field a program survey to each of the demonstration pilot grantees to gather information about program implementation, service offerings, and staffing. The survey will be fielded to approximately 30 pilot demonstration sites in 2022.


  2. Semi-structured interviews with program staff and staff from selected community partner organizations topic guide. Interviews will be conducted over the phone, by video, or during in-person site visits in 2022. Whether conducted virtually or in person, each interview will last about one hour. Each of the pilot demonstration projects draws on a range of staff and partners to deliver services; thus, interviews may include pilot staff, partner staff, employers, and training and education providers. We estimate that a total of 175 interviews will be conducted across all pilots.


  3. Participant interviews or focus group guide. We will also interview demonstration pilot participants through one-on-one interviews or focus groups. Focus groups or interviews will be limited to a maximum of 175 interviewees across all pilots. These interviews or focus groups may be conducted in person, by video, or over the phone.


  4. Impact feasibility assessment interviews with demonstration pilot staff topic guide. In addition to the implementation and outcome study, the evaluation will gather information from select grantee staff about topics related to the feasibility of conducting an impact study of the demonstration pilot. The team will conduct phone, video, or in-person interviews with grantee staff who are involved in management, enrollment, and program services. The project will conduct approximately 30 interviews across the pilot grantees.


Proposed uses for each data collection activity are described in Table A.1.

Table A.1: Proposed Uses for Data Collection Activities

Data Collection Activity | How the data will be used
Program survey of demonstration pilot grantees | Online survey to gather information from all pilots about key aspects of implementation, including staffing, enrollment, services, partnerships, funding, and data management.
Semi-structured interviews with program staff and staff from selected community partners topic guide | In-person or virtual interviews to gather information about pilot start-up, staffing, services, implementation, recruitment, partners, and local context.
Participant interviews or focus group guide | In-person or virtual interviews or focus groups to gather information about participants’ experiences in the pilot, including background, interest in the program, enrollment, services, and post-program experiences.
Impact feasibility assessment interviews with demonstration pilot staff topic guide | In-person or virtual interviews with pilot staff to gather information about recruitment, enrollment, data systems, and alternative services in the community to assess the feasibility of an impact study.


This data collection will provide useful information to the government about how Job Corps services may be adapted and provided in different contexts, such as community colleges, to better serve young people who are eligible for the program.

3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

The evaluation team will primarily use email to facilitate the logistics and scheduling of the site visits and interviews to reduce the burden on participants. Site visitors for the evaluation will use electronic audio recorders to record the interviews. This will allow the visitors to conduct interviews in the shortest amount of time possible, as they will not be required to use interview time to take notes on the content of the conversation. The pilot survey will be fielded using Qualtrics, an online survey system, and will use skip patterns to minimize burden on respondents by not asking non-applicable questions. The system includes drop-down menus, and response categories will be mostly multiple choice to minimize the data entry required of respondents.
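To illustrate how a skip pattern reduces burden, the sketch below presents a follow-up question only when the screener answer makes it applicable. This is a minimal sketch, not the actual instrument; the question wording and variable names are hypothetical, and the real survey implements this logic within Qualtrics.

```python
# Minimal sketch of skip-pattern logic (hypothetical questions; the actual
# instrument is implemented in Qualtrics with built-in skip logic).

def ask(prompt, choices):
    """Present a multiple-choice question and return the selected answer."""
    print(prompt)
    for i, choice in enumerate(choices, start=1):
        print(f"  {i}. {choice}")
    selection = int(input("Enter choice number: "))
    return choices[selection - 1]

def run_survey():
    responses = {}
    responses["offers_training"] = ask(
        "Does your pilot offer occupational training?", ["Yes", "No"]
    )
    # Skip pattern: respondents who answer "No" never see the follow-up,
    # so they are not asked a non-applicable question.
    if responses["offers_training"] == "Yes":
        responses["largest_field"] = ask(
            "Which training field enrolls the most participants?",
            ["Health care", "Construction", "Information technology", "Other"],
        )
    return responses

if __name__ == "__main__":
    print(run_survey())
```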

4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.

The evaluation will identify and minimize duplication of information available through alternate sources. For example, the evaluation will use available information from the survey, grantee applications, and existing administrative data sets to ensure that data collected through interviews and focus groups are not available elsewhere.

5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.

Interviews may be conducted with employers or program partners who represent small businesses or other small entities. Interviewers will only request information necessary for the evaluation and will minimize burden by restricting the length of interviews to the minimum required time.

6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

If the survey and interviews are not conducted, DOL will not have the information necessary to answer the key research questions of the evaluation. The pilots will operate for a time-limited period; not collecting data now will prevent DOL from gathering data about the pilots while they are operating. As a result, the research team will lack the information needed to answer questions about implementation of services and participant experience. Without this information, DOL and stakeholders will lack the lessons from the demonstration pilots about how to improve Job Corps or implement similar pilots in the future.

7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* Requiring respondents to report information to the agency more often than quarterly;

* Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* Requiring respondents to submit more than an original and two copies of any document;

* Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* In connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* Requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* Requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.

No special circumstances apply to this data collection.

8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.

A 60-day notice to solicit public comments was published in the Federal Register, 86 FR 1528 on January 8, 2021. No comments were received.

Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


Seven individuals, who compose the contractor’s research team, were consulted in developing the design, the data collection plan, and the data collection tools for which clearance is requested.

The research team will also assemble a technical working group consisting of approximately five experts in the following areas: (1) experience with Job Corps and other youth employment and training programs; (2) experience with workforce development and job training; (3) experience with community colleges; and (4) implementation research methods. These experts will review and comment on the implementation study design plan, impact feasibility assessments, and reports arising from this research. The team will also engage subject matter experts to provide consultation as needed on evaluation design.

9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

There are no payments or gifts to program and partner staff, as activities are expected to be carried out in the course of their employment, and no additional compensation will be provided outside of their normal pay. Program participants who take part in interviews or focus groups will receive a $25 gift card.

10. Describe any assurance of privacy provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

Procedures will be strictly followed by the evaluation team to ensure the privacy of participant information to the extent permitted by law. The contractor has been awarded FedRAMP Authorizations to Operate (ATOs) at the moderate impact level issued by the Department of Health and Human Services (HHS) and the FedRAMP Program Management Office (PMO). The following actions will be taken:

  • Survey. Staff survey responses will be collected via Qualtrics, a web-based, FedRAMP-authorized software package in which data are protected to federal FISMA moderate standards. Data will be exported from Qualtrics into a CSV format and securely transferred to the contractor’s private network via Box.


  • Training interviewers in privacy procedures. Evaluation team members interviewing staff, program partners, and participants will follow strict privacy procedures to protect data. All evaluation staff must regularly undergo mandatory human subjects research and data security trainings. Whenever possible, notes will be taken on encrypted laptops and saved on secure servers.


  • Using de-identified data for all interview participants. Any data elements used for recruitment of participants for semi-structured interviews, such as name and telephone number, will be destroyed after completion of the interview. Interview transcripts and resulting reports will not identify respondents by name.


  • Obtaining informed consent from all interview participants. All interview participants will be informed that their participation in interviews is completely voluntary, that their information will be kept private, that members of the research team will take notes during the interviews but names will not be attached to these notes, and that the evaluation team will not reveal statements made by individuals to anyone outside the evaluation team. Staff will be further informed that their participation or non-participation will not affect their employment, while participants will be informed that their participation or non-participation will not affect the services they receive. The evaluation team will obtain verbal informed consent from staff, partners, and study participants prior to conducting interviews or focus groups.


  • Institutional data security protocols. MDRC has strict confidentiality protocols for all research data collection and usage. The evaluation team complies with DOL data security requirements by implementing security controls for processes that it routinely uses in projects that involve sensitive data.


  • IRB approval. The data collection protocols have been approved by MDRC’s accredited IRB. (IRB Registration Number 0003522, Federal-wide Assurance (FWA) Number 00003694).

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

Staff and partner interviews focus on the experiences of program and partner staff in delivering services. Similarly, there are no sensitive questions in the pilot survey. Participant interviews and focus groups will collect demographic data such as gender, living arrangements, employment status, and education history. This information will describe participant backgrounds and reasons for seeking services through Job Corps pilots, which could be considered sensitive. The information will be important for describing participants who take part in interviews. Past evaluations have included similar questions without any evidence of significant harm. As described earlier, all participants will be assured of the privacy of their responses before being asked to answer any questions, and they can skip any questions they do not wish to answer. All data will be reported in aggregate, summary format only, eliminating the possibility of individual identification and ensuring that individual responses are confidential.

12. Provide estimates of the hour burden of the collection of information.

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under “Annual Cost to Federal Government.”



Table A.3 provides annual burden estimates for each of the data collection activities for which this package requests clearance. All of the activities covered by this request will take place over a one-year period. To calculate the estimated cost burden for respondents, average hourly wages from the U.S. Bureau of Labor Statistics, National, State, Metropolitan, and Nonmetropolitan Area Occupational Employment and Wage Estimates for May 2020 were multiplied by the number of hours per respondent type. The following summarizes the annual burden estimates for each of the four data collection activities:

  1. Program survey of demonstration pilot grantees. The survey will be targeted at program directors. The survey will be fielded to all 30 pilot demonstration sites in early 2022. On average, the online survey will take approximately 2 hours per grantee to complete. We expect that in most cases the pilot program director or manager will complete the survey. The total burden for the survey is 60 hours (30 pilots (26 Job Scholars + 4 Idaho) × 2 hours).


  2. Semi-structured interviews with program staff and staff from selected community partner organizations topic guide. Each of the pilot demonstration projects draws on a range of staff and partners to deliver services; thus, interviews may include pilot staff, partner staff, employers, and training and education providers. The team will conduct one site visit (in-person or virtual) to Idaho Job Corps (which may include multiple locations) and will interview approximately 25 staff, partners, and employers associated with the program. Approximately five Job Scholars sites will be selected for site visits, which will also take place virtually or in person and will include approximately 15 interviews each. Approximately 15 Job Scholars sites will be selected for phone interviews, with approximately five staff, partners, or employers participating at each site. Each interview will take approximately 90 minutes to complete. All interviews will take place in 2022. The total burden for staff interviews is 262.5 hours ((1 × 25 for the Idaho site visit + 5 × 15 for the Job Scholars site visits + 15 × 5 for the Job Scholars phone interviews) × 90/60 hours).


  3. Participant interviews or focus group guide. These interviews or focus groups may be conducted in person, online, or over the phone. We will conduct interviews and focus groups with approximately 25 participants at Idaho Job Corps. At each of the five Job Scholars sites selected for site visits, we will conduct interviews and focus groups with approximately 20 participants. At the remaining sites, we will conduct approximately 50 phone interviews with participants. Each interview or focus group will take approximately 90 minutes to complete. All interviews will take place in 2022. The total burden for participant interviews is 262.5 hours ((1 × 25 at Idaho Job Corps + 5 × 20 at the Job Scholars site-visit sites + 50 phone interviews) × 90/60 hours).


  4. Impact feasibility assessment interviews with demonstration pilot staff topic guide. The team will conduct phone, video, or in-person interviews with grantee staff who are involved in management, enrollment, and program services in 2022. The project will conduct 30 interviews across the 27 grantees (the 26 Job Scholars grantees and the Idaho Job Corps grantee). Idaho Job Corps will have approximately five interviews, and the Job Scholars grantees will have approximately 25 interviews; some Job Scholars grantees will have approximately three interviews and some will have none. The total burden for impact feasibility interviews is 45 hours (30 interviews × 90/60 hours).



Table A.3. Estimated Annualized Respondent Hour and Cost Burden

Data Collection Activity | Number of respondents | Number of responses per respondent | Total number of responses | Average burden per response (in hours) | Annual estimated burden hours | Average hourly wage a | Annual monetized burden hours
Program survey of grantees | 30 | 1 | 30 | 120/60 | 60 | $46.87 | $2,812
Semi-structured interview with staff, partners, and stakeholders topic guide | 175 | 1 | 175 | 90/60 | 263 | $46.87 | $12,303
Participant interview or focus group protocol | 175 | 1 | 175 | 90/60 | 263 | $20.17 | $5,295
Impact feasibility topic guide | 30 | 1 | 30 | 90/60 | 45 | $46.87 | $2,109
Unduplicated Total | 410 | -- | 410 | -- | 631 | -- | $22,519

a The hourly wage of $46.87 is the May 2020 median wage for Education Administrators, Postsecondary (see https://www.bls.gov/oes/current/oes119033.htm); $20.17 is the May 2020 median wage across all occupations in the United States (see https://www.bls.gov/oes/current/oes_nat.htm#00-0000).
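As a cross-check on Table A.3, the short script below recomputes the burden hours and monetized costs from the response counts, per-response burdens, and hourly wages given above. This is a verification sketch only; all figures are taken from this supporting statement.

```python
# Recompute Table A.3 totals from the figures in this supporting statement.
# Each row: (activity, total responses, hours per response, hourly wage).
rows = [
    ("Program survey of grantees",            30, 120 / 60, 46.87),
    ("Staff/partner interviews",             175,  90 / 60, 46.87),
    ("Participant interviews/focus groups",  175,  90 / 60, 20.17),
    ("Impact feasibility interviews",         30,  90 / 60, 46.87),
]

total_hours = 0.0
total_cost = 0.0
for activity, responses, hours_each, wage in rows:
    hours = responses * hours_each   # annual burden hours for the activity
    cost = hours * wage              # annualized monetized burden
    total_hours += hours
    total_cost += cost
    print(f"{activity}: {hours:.1f} hours, ${cost:,.2f}")

# Prints 630.0 hours and $22,519.36 in total; Table A.3 reports 631 hours
# because each 262.5-hour row is rounded up to 263 before summing.
print(f"Total: {total_hours:.1f} hours, ${total_cost:,.2f}")
```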


13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collections services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.

There are no direct costs to respondents other than their time.

14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.

The total cost to the Federal government over three years is $902,709.81, and the annualized cost to the federal government is $300,903.27. Costs result from the following categories:

The estimated cost to the federal government for the contractor to carry out the implementation data collection is $840,713.¹ Annualized over the three-year period, this comes to $840,713 ÷ 3 = $280,237.67.

The annual cost borne by DOL for federal technical staff to oversee the contract is estimated to be $20,665.60. We expect the annual level of effort to perform these duties will require 200 hours for one federal GS-14, step 4 employee based in Washington, D.C., earning $64.58 per hour. (See the Office of Personnel Management 2021 Hourly Salary Table at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2021/DCB_h.pdf.) To account for fringe benefits and other overhead costs, the agency has applied a multiplication factor of 1.6:

200 hours × $64.58 × 1.6 = $20,665.60.

Thus the total annualized federal cost is $280,237.67 + $20,665.60 = $300,903.27.

15. Explain the reasons for any program changes or adjustments reported on the burden worksheet.

This is a new information collection.

16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

1. Tabulation

Survey responses will be reviewed for completeness and the team will follow up with respondents that have significant amounts of missing data. Occurrences of missing data that would warrant follow up include: missing data for a full module, missing data for later parts of the questionnaire (partial completes) and patterns of missing data that indicate the respondent systematically skipped questions.  Responses will also be reviewed for other issues that may warrant follow up, such as nonsensical answers or patterns of responses. After all responses have been received and cleaned, the team will conduct descriptive analyses of responses.

For most variables, we expect to produce separate tabulations and means for the Job Scholars sites and the Idaho sites. For example, for survey questions with categorical answers, we would tabulate the distribution of answers across all the Job Scholars sites and then separately for the Idaho sites. For continuous variables, such as how many staff the program had, means would be calculated separately for the Job Scholars sites and the Idaho sites. When variables have different numbers of respondents, the sample size will be noted either in the table or in a table footnote.
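As a minimal sketch of this tabulation approach, the snippet below computes a categorical distribution and a continuous mean separately for the Job Scholars and Idaho site groups, carrying the sample size alongside each estimate. The data frame and column names are hypothetical stand-ins for the cleaned survey extract, not the actual Qualtrics export.

```python
import pandas as pd

# Hypothetical cleaned survey extract; the real data come from Qualtrics.
survey = pd.DataFrame({
    "site_group": ["Job Scholars", "Job Scholars", "Job Scholars", "Idaho"],
    "enrollment_strategy": ["Referrals", "Outreach", "Referrals", "Outreach"],
    "num_staff": [4, 6, 5, 12],
})

# Categorical item: tabulate the distribution separately for each site group.
for group, subset in survey.groupby("site_group"):
    dist = subset["enrollment_strategy"].value_counts()
    print(f"{group} (n={len(subset)}):")
    print(dist, "\n")

# Continuous item: means calculated separately for each site group, with
# the sample size carried along for the table footnote.
print(survey.groupby("site_group")["num_staff"].agg(["mean", "count"]))
```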

The staff and partner interviews, participant interviews, and impact feasibility interviews will be transcribed or written up into standardized templates that reflect the content in the interview topic guide and then coded by theme and analyzed using Dedoose, specialized software used to analyze qualitative data. These interviews will be integrated with the survey findings in order to gain a fuller understanding of pilot implementation and address the key implementation-related research questions using mixed methods.

2. Schedule

Table A.4. Schedule

Activity | Date
Implementation analysis plan | May 2021
Pilot survey fielding | February 2022
Site visits to pilots | March – June 2022
Phone/virtual interviews | March – August 2022
Participation data collection | June 2021 – September 2023
Implementation report | September 2024


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

The OMB approval number and expiration date will be displayed or cited on all forms completed as part of the data collection.

18. Explain each exception to the topics of the certification statement identified in “certification for paperwork reduction act submissions.”

No exceptions are necessary for this information collection.





¹ The total contractor cost includes the cost of the $25 gift cards paid to focus group and interview participants.


