
Implementation Evaluation of the Strengthening Community Colleges Training Grant Program (SCC)

OMB CONTROL NUMBER: 1290-0043

OMB EXPIRATION DATE: 10/31/2025


This is a generic information collection to be added under the umbrella Information Collection Request (ICR) for OMB Control Number 1290-0043.



Implementation Evaluation of the Strengthening Community Colleges Training Grant Program (SCC) Cohort 2 and Cohort 3

1290-0043

Supporting Statement

Part A

January 2024

Submitted By:


Chief Evaluation Office

U.S. Department of Labor


200 Constitution Ave. NW

Room S-4307

Washington, DC 20210


Project Officer(s):

Savi Swick











Introduction

The Chief Evaluation Office of the U.S. Department of Labor (DOL) has commissioned an independent implementation evaluation of the Strengthening Community Colleges Training Grant (SCC) program. The program provides funding to help community colleges expand workforce development programming, build capacity, and improve equity in the skilled workforce for key industry sectors. It supports community college initiatives to develop, implement, and advance evidence-based strategies that address challenges and barriers identified by students and employers by building training-to-workforce pipeline programs that partner community colleges with industry employers. Trewon Technologies, LLC, has been engaged to conduct this implementation evaluation, with a specific focus on SCC Cohort 2 (13 grantees) and SCC Cohort 3 (15 grantees). This document seeks approval for a generic clearance for a formative data collection to support the implementation evaluation of the SCC program for cohorts SCC2 and SCC3. The data collection will employ formative techniques, including semi-structured focus groups, questionnaires, telephone or in-person interviews, and document reviews, to describe and more fully understand the implementation of the SCC program. The findings will inform and support current and future research on SCC program outcomes; however, the findings are not intended to be highly systematic, to establish causal relationships, or to be statistically representative or otherwise generalizable.


Justification


  1. Circumstances Making the Collection of Information Necessary

The United States faces an ongoing equity gap in employing underrepresented workers in advanced manufacturing, computer science, and healthcare. Despite efforts to promote diversity and inclusion within these fields, disparities persist by race, ethnicity, and gender. African Americans, Hispanics, and Native Americans remain underrepresented in advanced manufacturing, making up only 15.5% of its workforce (U.S. Census Bureau, 2021).1 Women hold only 25.9% of computer science jobs (Schwartz et al., 2021).2 Boniol et al. (2019) found that although women make up most of the health care workforce, they remain underrepresented in key leadership roles and high-paying specialties such as surgery.3 These disparities highlight the need for targeted interventions to overcome systemic barriers and promote equitable representation in advanced manufacturing, computer science, and healthcare. The Strengthening Community Colleges Training Grant program allocates funds to enhance the quality of training provided at community colleges for entry into these fields. This evaluation will collect information to describe and understand the early and full implementation of the program at grantee sites. It will identify challenges reported by community colleges during the period of performance and explore grant utilization practices that grantees consider successful.


In July 2020, the Employment and Training Administration announced the availability of approximately $40 million in grant funds under Section 169(c) of the Workforce Innovation and Opportunity Act (WIOA) through the Strengthening Community Colleges Training Grants Funding Opportunity Announcement (FOA). The program's goal was to address two interdependent needs: (1) expanding community college capacity and responsiveness to address identified equity gaps, and (2) meeting the skill development needs of employers across in-demand industries and career pathways, as well as those of underserved and underrepresented workers. These grants strengthen community colleges' capacity to address identified equity gaps and meet employers' skill development needs for in-demand industries and career pathways that lead to quality jobs. Successful programs funded through SCC are intended to produce long-term systemic changes to education and training through collaboration among community colleges, employers, and the public workforce development system, combining education and training with work experience, industry credentials, and career growth. Evaluation of this program is authorized under WIOA.


This proposed information collection meets the following goals of the Department of Labor’s (DOL) generic clearance for formative data collections (1290-0043):

  • Inform the development of DOL research

  • Maintain a research agenda that is rigorous and relevant

  • Ensure research products are as current as possible

  • Inform provision of technical assistance


  2. Purpose and Use of the Data Collection

DOL will use the data gathered through this request to inform and provide context for third-party implementation evaluations of Strengthening Community Colleges grant programs. A final report will summarize the third-party implementation evaluations and will include document analysis of SCC2 and SCC3 grantee proposals, evaluation plans, and interim reports, as well as additional information compiled from clarifying calls with grantees and reviews of grantee program evaluation plans. The information collected through the process described in this document may be used in that report to provide context but will not be its main focus.


Overview of the evaluation

SCC Cohorts 2 and 3 will undergo an implementation evaluation to gain a fuller understanding of program and partnership development in the first and second years of the projects and to address the research questions outlined below:

  • How are grant funds being utilized to improve community college training programs? What reported challenges and barriers have the selected community colleges encountered when implementing the training grant program?

  • How have partnerships between the selected community colleges and industry, workforce boards, and other stakeholders been formed to align training and education programs with labor market demands? Which factors were identified by stakeholders as increasing partnership development and employer involvement in implementation?

  • What types and combinations of activities and services were implemented to increase community colleges' capacity to deliver high-quality, industry-relevant education and training programs that lead to credentials, certificates, or degrees?

  • What are the characteristics of program participants, including individuals from low-income backgrounds, minorities, individuals with disabilities, and other disadvantaged groups recruited for these programs?

  • What measures have programs implemented to facilitate the integration of evidence-based practices and cutting-edge teaching methodologies for improved student outcomes and to support the transition into employment or further study?

  • How has the program been designed, and how has it disseminated information, tools, and resources to optimize the start-up of the training pipeline and employer partnership efforts?


The implementation evaluation will address these research questions through several means: questionnaires with stakeholders from all 13 Cohort 2 grantees and all 15 Cohort 3 grantees; review of grant documents from SCC2 and SCC3 grantees; workforce agency partner/employer questionnaires for all funded programs; and virtual interviews with program stakeholders/employers and participant focus groups with SCC2/SCC3 grantees. This PRA clearance request includes all protocols used in this research project; the titles of all tools/instruments in this document match the titles of the documents submitted for review.


Overview of the data collection

Understanding how the SCC Training Grant program has been implemented requires collecting data from multiple sources. This clearance request contains the implementation evaluation data collection instruments, including protocols for virtual synchronous interviews and focus groups with stakeholders representing the thirteen SCC2 and fifteen SCC3 grantees. Microsoft Teams and/or Zoom interviews will use an in-depth, semi-structured protocol with open-ended question prompts and will address topics relevant to implementing an SCC-funded program. Researchers will prioritize issues pertinent to this analysis, such as community context; organizational structure and administration; recruitment, enrollment, and participant characteristics; SCC services; outside contractor or vendor services; outcomes; sustainability; and quality of partnerships. This package seeks approval for interview protocols for participants, including program staff, employers, workforce agency partners, and program participants. Detailed data collection consent forms and instrument protocols are in the Appendices.


Document Analysis Rubric and Coding Scheme

  • Document review will involve initial and subsequent review of all documents associated with the SCC FOA, the proposals, and the funded projects. It will provide information to support assessment of the fidelity and quality of implementation relative to each program's proposed timeline, original goals and objectives, progress in performance, and original design and prescribed procedures, as well as program efficacy, reported challenges, and the context within which grantees' implemented programs operated (Smith, 2017).4 Modifications made during implementation to alter program start-up or continuation, the level of stakeholder engagement, and an analysis of program resources from financial documents will round out this component of the information collection for the implementation evaluation.


Program Stakeholder Questionnaire Protocol

  • This electronic questionnaire will be distributed, using contact information provided by each grantee site, to stakeholders identified by each grantee. It will gather information from program staff, faculty, workforce partners, employers, and student participants about the roles each plays within the funded programs, expectations regarding program and participation outcomes, and basic demographic information, and it will be administered before samples are selected for interview and focus group data collection (Johnson, 2018).5 Initial items will identify the stakeholder's role, and subsequent questions will adjust to that role and group affiliation (a hypothetical sketch of this branching appears below). The questionnaire includes multiple-choice items and open-ended free-response questions and should take 30 minutes to complete. The analysis will not collect or include identifying data other than grantee affiliation, site information, and general demographic data. At most nine individuals within each stakeholder group will receive the questionnaire.
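To illustrate how the role-based branching might work in practice, the following minimal Python sketch maps each stakeholder role to its follow-up item modules. The module names are invented for this illustration and do not reproduce the actual instrument.

```python
# Hypothetical sketch of the role-based branching described above. Module names
# are invented for illustration and do not reproduce the actual instrument.
COMMON_ITEMS = ["role", "site", "basic_demographics"]

ROLE_MODULES = {
    "program_staff":     ["program_role", "services_delivered", "implementation_challenges"],
    "faculty":           ["curriculum_changes", "teaching_methods"],
    "workforce_partner": ["partnership_formation", "referral_practices"],
    "employer":          ["skill_needs", "hiring_plans"],
    "student":           ["recruitment_experience", "program_expectations"],
}

def questionnaire_items(role: str) -> list[str]:
    """Return the item modules presented to a respondent with the given role."""
    return COMMON_ITEMS + ROLE_MODULES.get(role, [])

print(questionnaire_items("employer"))
# ['role', 'site', 'basic_demographics', 'skill_needs', 'hiring_plans']
```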


Program stakeholder interview protocol (virtual)

  • This protocol provides for semi-structured virtual synchronous interviews with selected grantee managers, staff, and workforce agency partners. The evaluation team will select interview participants using stratified sampling based on information obtained during document review and analysis of the stakeholder questionnaire data. Interviews provide invaluable details about participant experiences with the program and insights into its implementation (Keen et al., 2022).6 The protocol will cover program structure and scope, community context, participant recruitment, service overview, participant characteristics and outcomes, and sustainability. Virtual synchronous interviews will last 60 minutes; the interviewee will provide consent before recording begins, and the researcher will transcribe each session afterward. No more than nine individuals at any site will be selected for interviews.


Employer interview protocol (virtual)

  • Semi-structured interviews using this protocol will collect information on employers' roles in program design and implementation, their perceptions of the implementation of program services, and their hiring plans and future skill needs. Employers have a stake in the implementation and eventual success of the program, and collecting information about their expectations and experiences will inform the overarching understanding of the implemented program (CDC, 2018).7 Interviews will take approximately 60 minutes each via virtual synchronous technology; the interviewee will provide consent before the interview and recording begin, and the researcher will transcribe each session afterward. No more than nine individuals at any funded site will be selected for interviews.


Participant focus group protocol (virtual)

  • This protocol will enable virtual focus groups with small numbers of participants (four to six) at each site to collect data on participant backgrounds, motivations for seeking program services, experiences with program marketing and recruitment, and goals or expectations for program completion (Nobrega et al., 2021).8 To ensure informed consent, the study team will collect signed written permission from all participants at the start of each focus group session. Written consent forms will describe the purpose and design of the study, the data gathered, and the risks and benefits associated with participation. Each focus group will last 90 minutes. The researcher will obtain participants' consent before recording begins and will transcribe each session afterward.


Participant focus group introduction

  • Participants will receive a group introduction at the start of each focus group. Each introduction will take 5 minutes. We will introduce the purpose of the focus group and discuss voluntary participation and the informed consent process. Participant background information is essential in a focus group to thoroughly understand the context through which each participant views the program (Namey et al., 2021).9


The proposed uses for each data collection activity are outlined in Table 1.


Table 1. How data will be used (by data collection activity)

| Data collection activity | How the data will be used |
| --- | --- |
| Document review and analysis | Qualitative analysis of documents associated with the grant, grantee proposals, and funded programs will provide an overview of program implementation across SCC2 and SCC3 and a base of knowledge from which to understand and describe program implementation. |
| Program stakeholder questionnaire (electronic) | A multiple-choice electronic questionnaire for program stakeholders (staff, employers, workforce partners, and students), with open-ended items for more detailed responses, will provide an overview of stakeholder perceptions of and experience with the early and full implementation of the program. |
| Program stakeholder interviews (virtual) | Virtual interviews with SCC grantee staff and partner representatives will collect information on program structure, community context, recruitment and participant characteristics, service overview, pathway potential, program modifications, and expected outcomes and sustainability, and will expand understanding of perceived successes and challenges during the early and full implementation cycles. |
| Employer interviews (virtual) | Virtual interviews with employers will collect information about the employer's role in program design and implementation, their perception of the implementation experience, whether they expect to hire or advance participants, whether participants will acquire the skills required to succeed, and whether their training needs or role in the program has changed since initial planning. |
| Participant focus groups (virtual) | Virtual focus groups with a subset of participants will collect information to describe participant characteristics, reasons for seeking services, experiences with SCC to date, and expected participation outcomes. |



  3. Use of Improved Information Technology and Burden Reduction

The evaluation team will primarily use email to facilitate the logistics and scheduling of data collection activities to reduce the burden on participants. Evaluators will record virtual interviews through the virtual platform, allowing them to conduct interviews in the shortest time possible because they will not need to use interview time to take notes on the conversation's content. Interviewers will use no other information technology.

  4. Non-duplication and Use of Similar Information

The information collected through this request is not available from another source; the data are unique to the SCC program. The implementation evaluation of SCC2 and SCC3 will use available information from grantee applications and existing administrative data sets to ensure that data collected through interviews and focus groups are not available elsewhere.


  5. Impact on Small Business or Other Small Entities

The evaluation team may interview employers or program stakeholders from small businesses or other small entities. Information will be requested only for its intended use, minimizing burden by restricting interviews to the minimum required time.


  6. Consequences of Collecting the Information Less Frequently

This is a one-time data collection. This implementation evaluation represents an important opportunity for DOL to understand the current implementation of the SCC program and to inform potential future research on the program's effectiveness.


  7. Special Circumstances

No special circumstances apply to the proposed data collection efforts.


  8. Federal Register and Efforts to Consult Outside the Agency as required by 5 CFR 1320.8(d)

No public comments are requested for this information collection.


Consultation With Experts Outside of the Study Evaluation Team

The evaluation team consulted with DOL/CEO and site-level SCC program staff on the evaluation design and met with Technical Working Group members to gather feedback.


Table 2 presents information about the individuals serving on the evaluation technical working group. The purpose of the consultation with program staff was to better understand the feasibility of the research design within grantees' regional contexts.


Table 2. Implementation Evaluation Technical Work Group Membership

| Member | Role |
| --- | --- |
| Dr. Aubrey Comperatore | Program Manager |
| Dr. Carolyn Sullins | Principal Investigator |
| Dr. Kathryn Doherty | Implementation Evaluator |
| Dr. Danielle Allen | Evaluation Technical Assistance |
| Dr. Wilnise Horsey | Senior Evaluation Analyst |
| Dr. Yumi Huang | Analyst |
| Dr. Steven Petritis | Research Assistant |
| Easton Bates | Research Assistant |


The Trewon Technologies evaluation team coordinates consultation on the research design and data needs and facilitates discussions with site-level program staff. The consultation aims to ensure the fidelity of evaluation findings and verify the importance, relevance, and accessibility of the information sought in the evaluation. This study will use no external consultants other than the Trewon team.


  9. Incentives for Participants

There are no payments or gifts to program and partner staff, as staff will participate in the course of their employment and will not receive compensation beyond their regular pay. Participants in the virtual focus groups will receive a $25 gift card to increase the likelihood of their attendance.


  10. Privacy for Participants

Information collected will be kept private to the extent permitted by law. The evaluation team complies with DOL data security requirements by implementing security controls for processes routinely used in projects involving sensitive data. Further, the team adheres to all relevant regulations and requirements. Participants will sign an informed consent agreement. Preliminary data collection will not require collecting or storing personally identifiable information (PII); however, initial outreach and scheduling will require collecting names and email addresses, which will be kept in a secure SharePoint location.


Personal identifiers will be removed from the implementation evaluation data as early as possible in the collection process (one common de-identification pattern is sketched below); electronic data will be stored on password-protected computers, while paper files will be kept secure in locked filing cabinets within an office environment. Only authorized research staff will have access to these records; each staff member will sign a confidentiality agreement, and data reports will include only aggregate-level results.
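As an illustration only, the sketch below shows one common de-identification pattern consistent with the practice described above: replacing direct identifiers with neutral study IDs, with the name-to-ID crosswalk stored separately in the secured location. All names, fields, and functions here are hypothetical.

```python
# Hypothetical sketch: replace direct identifiers with study IDs as early as
# possible; the crosswalk would be stored separately under access controls.
import itertools

def build_crosswalk(names):
    """Assign each participant name a neutral study ID (P001, P002, ...)."""
    counter = itertools.count(1)
    return {name: f"P{next(counter):03d}" for name in names}

def deidentify(record, crosswalk):
    """Return a copy of a data record with the name replaced by its study ID."""
    clean = dict(record)
    clean["participant_id"] = crosswalk[clean.pop("name")]
    return clean

crosswalk = build_crosswalk(["Jane Doe"])
record = {"name": "Jane Doe", "site": "Grantee 07", "notes": "..."}
print(deidentify(record, crosswalk))
# {'site': 'Grantee 07', 'notes': '...', 'participant_id': 'P001'}
```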


The following Public Burden Statement will appear on the front cover of the data collection instruments:


Public Burden Statement

In accordance with the Paperwork Reduction Act of 1995, an agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is 1290-0043.


  11. Sensitive Questions

There are no sensitive questions in this data collection.


12A. Estimation of Information Collection Burden


Total burden requested under this information collection

The annualized burden hours and costs to participants are in Table 3 below. The annual estimated burden is 345 hours, and the annual estimated cost is $13,664.


12B. Estimate of Annual Cost to Respondent for Burden Hours

Table 3 provides annual burden estimates for each data collection activity for which this request seeks clearance. Activities covered by this request will take place over two years. To calculate the estimated cost burden to participants, the team multiplied the average hourly wages from the U.S. Bureau of Labor Statistics National, State, Metropolitan, and Nonmetropolitan Area Occupational Employment and Wage Estimates for May 2022 by the number of hours per respondent type. The following section summarizes the annual burden estimates for the five data collection activities (see Table 3). Document review and analysis is not included in these calculations because this evaluation-team activity imposes no burden on stakeholders.


  • Program stakeholder questionnaires (electronic). Researchers will administer grantee and partner staff questionnaires at all grantee sites 12 months after program start-up. On average, the electronic questionnaire will take 30 minutes to complete. At each grantee, the questionnaire will go to grantee staff named in the proposal (the grant manager and other key staff members), non-employer partner staff, employer representatives, and student participants. The total questionnaire burden is difficult to calculate because the number of responses cannot be determined before administration. Estimating 25 participants per grantee site (up to 10 staff and employer participants and up to 15 student participants) yields a population of (28 sites × 10 stakeholders) + (28 sites × 15 student participants) = 700. Applying an average online questionnaire response rate of 44.1% (Wu et al., 2022),10 approximately 300 stakeholders will complete the questionnaire (99 staff, partner, and employer stakeholders plus 201 student participants). Annual estimated burden hours = (99 × 30/60) + (201 × 30/60) = 150 hours. (The arithmetic for all five activities is cross-checked in the sketch following this list.)


  • Program stakeholder interviews (virtual). Researchers will conduct grantee and partner staff interviews for all 28 grantee sites 14 months into program start-up. On average, these virtual interviews with grant managers and other stakeholders will take 60 minutes. One grantee staff member (the grant manager or another key staff member) and one non-employer partner staff member will be interviewed at each site, for two stakeholders per grantee. The total interview burden is 56 hours (2 stakeholders × 28 grantees × 60/60 hours); the annualized burden is 56 hours.


  • Employer interviews (virtual). One employer interview will be conducted virtually at each of the 28 grantee sites, for 28 participants (1 employer × 28 grantees), 14 months into program start-up. These interviews will take 60 minutes to complete. The total burden for the employer interviews is 28 hours (28 participants × 60/60 hours); the annualized burden is 28 hours.


  • Participant focus groups (virtual). Focus groups with a subset of participants will occur 16 months into program start-up. Each focus group will take 90 minutes to complete. Five participants are expected to participate at each of 14 sites, for 70 participants (5 participants × 14 grantees). The total burden is 105 hours (70 participants × 90/60 hours); the annualized burden is 105 hours. Participants will receive a $25 gift card for the focus group sessions, included in administration costs in #14 below.


  • Participant focus group introduction. Participants will be given a group introduction at the start of each focus group. Each introduction will take 5 minutes to complete. Five participants are expected to participate at 14 sites, for 70 participants (5 participants × 14 grantees). The total burden is 6 hours (70 participants × 5/60 hours); the annualized burden is 6 hours.
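As a cross-check on the burden figures above, the arithmetic can be reproduced directly. The short Python sketch below is illustrative only (it is not part of the ICR) and uses the respondent counts, per-response times, and May 2022 wages from Table 3; small differences from the table's totals reflect rounding.

```python
# Illustrative cross-check of the annualized burden arithmetic (not part of the
# ICR). Counts, hours, and wages are taken from Table 3.
WAGE_STAFF = 55.38  # May 2022 median wage, Education Administrators, Postsecondary
WAGE_ALL = 29.76    # May 2022 median wage, all U.S. occupations

activities = [
    # (activity, respondents, hours per response, hourly wage)
    ("Questionnaires (staff/partners/employers)",  99, 0.5,    WAGE_STAFF),
    ("Questionnaires (student participants)",     201, 0.5,    WAGE_ALL),
    ("Program stakeholder interviews",             56, 1.0,    WAGE_STAFF),
    ("Employer interviews",                        28, 1.0,    WAGE_STAFF),
    ("Participant focus groups",                   70, 1.5,    WAGE_ALL),
    ("Focus group introduction",                   70, 5 / 60, WAGE_ALL),
]

hours = sum(n * h for _, n, h, _ in activities)
cost = sum(n * h * w for _, n, h, w in activities)
print(f"Annual burden hours: {hours:.0f}")       # ~345 hours, matching Table 3
print(f"Annual monetized burden: ${cost:,.0f}")  # ~$13,700 (Table 3: $13,664)
```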


Table 3. Estimated Annualized Respondent Hour and Cost Burden

| Data collection activity | Number of participants | Responses per respondent | Total responses | Average burden per response (hours) | Annual estimated burden hours | Mean hourly wage* | Annual monetized burden |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Program stakeholder questionnaires (electronic) | 300 | 1 | 300 | 0.5 | 150 | $55.38 / $29.76 | $5,708** |
| Semi-structured program stakeholder interviews (virtual) | 56 | 1 | 56 | 1 | 56 | $55.38 | $3,101 |
| Semi-structured employer interviews (virtual) | 28 | 1 | 28 | 1 | 28 | $55.38 | $1,551 |
| Participant focus groups (virtual) | 70 | 1 | 70 | 1.5 | 105 | $29.76 | $3,125 |
| Participant focus group introduction | 70 | 1 | 70 | 0.08 | 6 | $29.76 | $179 |
| Unduplicated total | 300 | -- | -- | -- | 345 | $55.38 / $29.76 | $13,664 |

*The hourly wage of $55.38 is the May 2022 median wage across Education Administrators, Postsecondary (see https://www.bls.gov/oes/current/oessrcst.htm); $29.76 is the May 2022 median wage across all occupations in the United States.

**(49.5 × $55.38) + (100.5 × $29.76).


13. Cost Burden to Respondents or Record Keepers

There are no additional costs to participants.


14. Annual Cost to the Federal Government

The estimated total cost to the federal government for the contractor to carry out the data collection activities included in this request is $43,150 (see footnote 11).


15. Change in Burden

This is an individual information collection under the umbrella formative generic clearance for DOL research (1290-0043).


16. Collection, Tabulation, and Publication

The expected schedule for the implementation study activities is presented below in Table 4.


Table 4. Estimated Timeline for Implementation Study Activities

| Activity | Expected timeline |
| --- | --- |
| Data collection: distribute questionnaires, conduct interviews and focus groups | March 2024 – December 2026 |
| Data analysis | January 2026 – December 2026 |


Data collection and reporting will continue annually throughout the OMB-approved clearance timeframe for this study.


Analysis plan

This implementation evaluation of the Strengthening Community Colleges Training Grant seeks to describe and understand the implementation of funded grantee programs. Results will be used only to describe the implementation of SCC2 and SCC3 grantee programs. The SCC theoretical framework draws from the program evaluation and implementation science literature and guides the data analysis. The framework includes the concept of fidelity, which refers to implementing programs as planned (Carroll et al., 2007).12 The fidelity analysis for the Strengthening Community Colleges Training Grant will identify discrepancies between design and implementation, providing insight for improvement measures.


Implementation context is equally significant; program outcomes can depend heavily on it (Durlak & DuPre, 2008).13 The evaluation can assess this environment by considering elements such as organizational capacity, stakeholder engagement, and leadership support. The framework also encompasses adaptation, since programs must meet the unique needs and circumstances of a population or environment (Aarons et al., 2011).14 The implementation evaluation will therefore describe and provide information to more fully understand the Strengthening Community Colleges Training Grant program.


Evaluation frameworks must also consider sustainability by examining factors that affect program viability and long-term success (Scheirer & Dearing, 2011),15 including available resources, integration with existing organizational practices, and stakeholder commitment. The theoretical framework for analyzing the implementation evaluation data for the Strengthening Community Colleges Training Grant program thus includes fidelity, context, adaptation, and sustainability, which together help the evaluation provide a complete picture of implementation.


Data analysis will be primarily qualitative, using multiple data sources (Creswell, 2018; Flick, 2018).16,17 The data will be analyzed using well-established qualitative methods, such as coding interviews for themes. Coding will involve creating and organizing a coding scheme according to the key research questions and topics outlined in the implementation evaluation design plan, drawing on best-practice literature on implementation research and the factors that affect implementation (Creswell, 2018; Damschroder et al., 2009).18 The evaluation team will then code the data using qualitative analysis software (Creswell, 2018; Flick, 2018). To ensure reliability across team staff, all coders will review an initial set of documents to identify discrepancies (Creswell, 2018; Miles et al., 2014; Braun & Clarke, 2006),19,20 as illustrated in the sketch below. These data will be used to describe and understand why partnerships formed as they did, explore perceived implementation challenges, and uncover promising practices (Creswell, 2018; Fixsen et al., 2005).21
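As an illustration of the reliability check described above, the minimal sketch below computes Cohen's kappa for two coders' labels on a shared set of excerpts; low agreement would flag discrepancies for reconciliation. The code labels and data are hypothetical, and the evaluation's actual coding software may compute agreement differently.

```python
# Hypothetical sketch of an intercoder reliability check on double-coded excerpts.
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders' labels over the same excerpts."""
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    labels = freq_a.keys() | freq_b.keys()
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Invented labels from an initial double-coded review round
coder_1 = ["fidelity", "context", "fidelity", "sustainability", "adaptation"]
coder_2 = ["fidelity", "context", "adaptation", "sustainability", "adaptation"]
print(f"kappa = {cohens_kappa(coder_1, coder_2):.2f}")  # 0.74; review mismatches
```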


Publications

The findings from this data collection will be used to inform SCC grantees, program administrators, and other interested parties about what has been learned to date about the SCC program. For this reason, the findings will likely be provided in reports made available publicly, though such publication is not the primary purpose of the data collection and will not be the main focus of the reports.


The research team will produce a final report summarizing the information collected as described here and as part of separate third-party evaluations. This final report, expected in early 2027, will summarize findings from these interactions with stakeholders at grantee sites and inform a comprehensive description and understanding of the implementation of SCC-funded projects. The report will also draw on the review of grantee proposals, interim clarifying phone calls, and technical assistance sessions. The analysis of data collected through the instruments described in this document may be used to provide context but will not be the main focus of any reports.


17. Reason(s) Display of OMB Expiration Date is Inappropriate

DOL is not seeking an exception to the display of the expiration date. All instruments will display the OMB approval expiration date.


18. Exceptions to the Certificate of the Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.

1 U.S. Census Bureau. (2021). Annual Business Survey: Workforce Demographics by Race and Ethnicity (ABS-WDE). https://www.census.gov/programs-surveys/abs/workforce.html.



2 Schwartz, D., Clarkwest, A., Hashizume, M., Kappil, T., & Strawn, J. (2021). Building Better Pathways: An Analysis of Career Trajectories and Occupational Transitions. Report prepared for the U.S. Department of Labor, Chief Evaluation Office. Rockville, MD: Abt Associates.



3 Boniol, M., McIsaac, M., Xu, L., Wuliji, T., Diallo, K., & Campbell, J. (2019). Gender equity in the health workforce: Analysis of 104 countries. Working Paper 1. Geneva: World Health Organization.

4 Smith, K., Finney, S., & Fulcher, K. (2017). Actionable Steps for Engaging Assessment Practitioners and Faculty in Implementation Fidelity Research. Research & Practice in Assessment, 12, 71–86.


5 Johnson, L. (2018). Surveying stakeholders in educational programs: A comprehensive approach. Educational Research Quarterly, 41(2), 22–37.



6 Keen, S., Lomeli-Rodriguez, M., & Joffe, H. (2022). From Challenge to Opportunity: Virtual Qualitative Research During COVID-19 and Beyond. International Journal of Qualitative Methods, 21, 16094069221105075. https://doi.org/10.1177/16094069221105075.


7 Centers for Disease Control and Prevention. (2018). Data Collection for Program Evaluation: Interviews. CDC Evaluation Brief. Available at https://www.cdc.gov/healthyyouth/evaluation/pdf/brief17.pdf.



8 Nobrega, S., El Ghaziri, M., Giacobbe, L., Rice, S., Punnett, L., & Edwards, K. (2021). Feasibility of Virtual Focus Groups in Program Impact Evaluation. International Journal of Qualitative Methods, 20. https://doi.org/10.1177/16094069211019896.




9 Namey, E., Guest, G., O'Regan, A., Godwin, C. L., Taylor, J., & Martinez, A. (2021). How does qualitative data collection modality affect disclosure of sensitive information and participant experience? Findings from a quasi-experimental study. Quality & Quantity. https://doi.org/10.1007/s11135-021-01217-4. PMID: 34493878; PMCID: PMC8412398.



10 Wu, M. J., Zhao, K., & Fils-Aime, F. (2022). Response rates of online surveys in published research: A meta-analysis. Computers in Human Behavior Reports, 7, 100206.



11 Total contractor cost is 2 × 345 (the annualized total burden hours for respondents calculated in Table 3) × $60/hour contractor cost, plus the cost of $25 gift cards for 70 focus group participants ($1,750).

12 Carroll, C., Patterson, M., Wood, S., Booth, A., Rick, J., & Balain, S. (2007). A conceptual framework for implementation fidelity. Implementation Science, 2, 40. https://doi.org/10.1186/1748-5908-2-40.


13 Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3-4), 327-350. https://doi.org/10.1007/s10464-008-9165-0.


14 Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health, 38(1), 4–23. https://doi.org/10.1007/s10488-010-0327-7.


15 Scheirer, M. A., & Dearing, J. W. (2011). An agenda for research on the sustainability of public health programs. American Journal of Public Health, 101(11), 2059-2067. https://doi.org/10.2105/AJPH.2011.300193.

16 Creswell, J. W. (2018). Research design: Qualitative, quantitative, and mixed methods approaches. Sage Publications.


17 Flick, U. (2018). An introduction to qualitative research. Sage Publications Limited.


18 Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 50.


19 Miles, M. B., Huberman, A. M., & Saldaña, J. (2014). Qualitative data analysis: A methods sourcebook. Sage Publications.


20 Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.


21 Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. University of South Florida, Louis de la Parte Florida Mental Health Institute.
