Understanding the Value of Centralized Services Study

OMB: 0970-0587


Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes



Understanding the Value of Centralized Services Study




OMB Information Collection Request

New Collection





Supporting Statement

Part A






November 2021






Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Marie Lawrence

Kathleen Dwyer



Part A




Executive Summary


  • Type of Request: This Information Collection Request is for a new collection. We are requesting one year of approval.


  • Description of Request: The Understanding the Value of Centralized Services Study is a descriptive study that intends to document different models for centralizing services; the advantages, disadvantages, and costs of centralizing services; and the perspectives of staff and clients. The data collection effort will include one-time individual and group interviews with program staff; focus groups with program clients; and observations of activities at centralized community resource centers (CCRCs). We do not intend for this information to be generalized to a broader population. We do not intend for this information to be used as the principal basis for public policy decisions.


  • Time Sensitivity: The contract for this project ends in September 2022.







A1. Necessity for Collection

This study is being conducted pursuant to the directive from Congress to the Administration for Children and Families (ACF) to “research how centralized community resource centers, which allow citizens to apply for several Federal social services in a single location, can reduce the burden on constituents and ensure the cost-effective allocation of Federal resources” (H. Rept. 116-62 Departments of Labor, Health and Human Services, and Education, and Related Agencies Appropriations Bill, 2020). ACF has contracted with MEF Associates and Mathematica, as a subcontractor, to conduct this study.



A2. Purpose

Purpose and Use

This information collection consists of one-time site visits to three sites, which will involve group and individual interviews with staff and focus groups with clients. The site visits will also include direct observations of program services. The data collection is part of a study designed to better understand social service delivery in a centralized location, in response to the Congressional directive in H. Rept. 116-62 Departments of Labor, Health and Human Services, and Education, and Related Agencies Appropriations Bill, 2020. The objectives of this research are to:

  1. Document different models for centralizing services

  2. Document the advantages, disadvantages, and costs of centralizing services, from the perspective of staff and clients

  3. Help establish a basis for future research and evaluation in this area


Findings from this research will be used to contribute to the body of knowledge about the advantages, disadvantages, and costs of providing services in a single location for families and individuals with low incomes. The research team will not use the data to make statistical inferences or generalize findings beyond the study sample. However, the information gathered will provide an initial understanding of centralized service provision for families and individuals with low incomes, which will help inform ACF’s future research and evaluation in this area. Findings from this study will also be used to inform Congress and community stakeholders of different approaches that have been used to centralize services to reduce constituents’ burden and contribute to the cost-effective allocation of resources.


The information collected is meant to contribute to the body of knowledge on ACF programs. It is not intended to be used as the principal basis for a decision by a federal decision-maker, and is not expected to meet the threshold of influential or highly influential scientific information.


Research Questions

  1. What is the range of models that have been used to provide centralized social services?

  2. What do we know about the different models used to deliver services centrally?

    • What are the benefits and challenges from the perspectives of staff and clients?

    • How does centralizing services in a single location enable or hinder access to services for potential clients? How does this vary for different groups, if at all?

    • What are the costs and benefits associated with different models of centralized services?

    • Do different models work better for particular programs, in different settings, or for different populations? What role does community context play?

    • How does centralization contribute to cost-effective allocation of resources for different models?

  3. What is the motivation for centralizing services? How does the impetus for centralization relate to the types or models of centralization?

  4. How are services being coordinated virtually, and how does virtual coordination differ from physical co-location? How does virtual coordination complement centralized services provided in person? What are the costs and benefits of virtual co-location? What lessons can practitioners learn from the COVID-19 pandemic as it relates to centralized services?


Study Design

At each of the three sites, the research team will conduct a one-time data collection effort including in-person staff interviews, client focus groups, and program observations. The three sites will operate different models of centralization (as described in Supporting Statement B, Section B2) and will be varied across dimensions such as primary service domain, type of primary agency, and geographic region. Participating sites will not be representative of the target population (i.e., all staff and clients of CCRCs), and results are not intended to be generalized to this population.


This approach is intended to surface a diverse set of staff and client perspectives on the advantages, disadvantages, and costs associated with centralizing services. The research team will document approaches and lessons learned from a small number of sites, to provide examples on the range of centralization models, process and costs for centralizing services, and benefits and challenges for staff and clients. The research team plans for all data collection to occur in person; however, recognizing that there may be public health constraints in place at the time of the site visits, the team is also planning for the possibility of conducting the site visits virtually (i.e., via videoconferencing).


See Supporting Statement B, Section B1 for additional detail, including a description of limitations of the data collection.


Data Collection Activities

The entries below describe each data collection activity; the instrument used; the respondents, content, and purpose of the collection; and the mode and duration.

Data collection activity: Semi-structured interview

Instrument 1: Interview Guide for Leadership and Administrative Staff

Respondents: Up to six leadership staff at each of three centralized community resource centers (CCRCs)


Content:

  • Respondent background

  • Organization and site characteristics

  • Services offered at the CCRC, in person and virtually

  • Characteristics of clients served

  • Community context

  • History of centralized services at the CCRC and process of centralization

  • Funding sources and cost of centralizing services

  • Organizational partnerships

  • Data collection and reporting on client outcomes

  • Challenges, perceptions, and lessons learned about centralized service approach


Purpose: To document centralized service provision, the process for centralizing services, and benefits, challenges, and costs of centralization, from the perspective of leadership staff.

Mode: in-person*


Duration: 75 minutes (average)

Data collection activity: Semi-structured interview

Instrument 2: Interview Guide for Frontline Staff

Respondents: Up to 16 frontline staff at each of three CCRCs


Content:

  • Respondent background

  • Characteristics of clients served

  • Community context

  • Services offered at the CCRC, in person and virtually

  • Typical client flow

  • Organizational partnerships

  • Data collection and reporting on client outcomes

  • Challenges, perceptions, and lessons learned about centralized service approach


Purpose: To document centralized service provision and benefits, challenges, and costs of centralization, from the perspective of frontline staff.

Mode: in-person*


Duration: 75 minutes (average)

Data collection activity: Semi-structured interview

Instrument 3: Interview Guide for Finance Staff

Respondents: Up to three finance staff at each of three CCRCs


Content:

  • Respondent background

  • Funding sources and the cost of centralizing services

  • Organizational partnerships

  • Challenges, perceptions, and lessons learned about centralized service approach from a funding/finance perspective


Purpose: To document centralized service provision and benefits, challenges, and costs of centralization, from the perspective of finance staff.

Mode: in-person*


Duration: 60 minutes (average)

Data collection activity: Semi-structured interview

Instrument 4: Interview Guide for IT/Data Staff

Respondents: Up to three IT/data staff at each of three CCRCs


Content:

  • Respondent background

  • Data systems and metrics used to track client participation, outcomes, and demographic information

  • Organizational partnerships

  • Challenges, perceptions, and lessons learned about centralized service approach


Purpose: To document centralized service provision and benefits, challenges, and costs of centralization, from an IT/Data staff perspective.

Mode: in-person*


Duration: 60 minutes (average)

Data collection activity: Focus group

Instrument 5: Focus Group Guide for Clients

Respondents: Up to 10 clients from each of three CCRCs


Content:

  • Services received, in person and virtually

  • Impressions of services received at the CCRC and how clients feel about accessing services centrally

  • Challenges and barriers to participation

  • Recommendations for other clients and staff members


Purpose: To document client experiences and benefits and challenges of centralized services from a client perspective.

Mode: in-person*


Duration: 90 minutes

*As noted above, the research team is also planning for the possibility of conducting the site visits virtually via videoconference if in-person visits are not possible due to COVID-19.


Other Data Sources and Uses of Information

In addition to the information collection described above, the research team will conduct observations of program activities. These could include “front-door” observations of the lobby, program service observations such as orientations or program workshops, or observations of partnership meetings. These observations will contribute to the study by documenting how clients experience the program, how CCRC staff communicate services and programs to clients, and how staff from different agencies collaborate to deliver services. The research team will also review aggregate, readily available data from sites related to program services and client demographics, which will add context to the findings from the primary data collection.


The research team is in the process of conducting a review of the literature regarding centralized service provision for families and adults with low incomes. The final report for this project will synthesize findings from the literature review and the site visits to address the project’s research questions.



A3. Use of Information Technology to Reduce Burden

The information from site visits will be collected through semi-structured discussions, a format that is not conducive to electronic submission of responses. The research team will audio record interviews, with the permission of the respondent, to assist with written note-taking.


While the research team hopes to conduct all site visits in person, they will also plan for the possibility of conducting interviews and focus groups remotely, depending on the public health situation at the time of the site visits. If necessary, the research team will conduct interviews, focus groups, and observations via videoconferencing tools, such as Zoom. If there are public health concerns, use of videoconferencing tools will reduce burden on respondents by enabling them to meet with the research team safely in a remote setting.


A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency

Based on consultations with experts and federal stakeholders, we are not aware of any existing data that would meet the goals of this information collection. This study aims to document staff and client perspectives on the centralized approach to service delivery, from a small number of sites. We have engaged with research teams from several other ACF projects focused on integrated approaches to service delivery1, and we will avoid sites that have been involved in in-depth studies as part of recent research projects. While this study is related to the other studies, it has a distinct focus on co-location of services, which will add to ACF’s understanding of centralized services approaches not addressed by other research efforts. To the extent possible, the research team will review information available from existing data sources, such as administrative data and publications, to avoid asking respondents questions the team can find answers to elsewhere. When necessary, the team will confirm information from other sources with respondents.



A5. Impact on Small Businesses

The primary organizations involved in this study are CCRCs that provide multiple social services in a single physical location, some of which may be small organizations. The research team will minimize burden for all entities, including those that could be considered to be small organizations, by requesting only the information required to achieve the study’s objectives and scheduling interviews and focus groups at times that are convenient for staff and clients. There should be no adverse impact for any organizations participating in the study.



A6. Consequences of Less Frequent Collection

This is a one-time data collection.


A7. Now subsumed under 2(b) above and 10 (below)



A8. Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on September 23, 2021, Volume 86, Number 182, page 52909, and provided a sixty-day period for public comment. During the notice and comment period, no comments were received.


Consultation with Experts Outside of the Study

The study team consulted with several outside experts on the study design and data collection instruments. The experts are listed in Table A8.1.


Table A8.1 List of Experts Consulted

Name | Affiliation
Yulonda Griffin | Mecklenburg County Health and Human Services
Allison Holmes | Annie E. Casey Foundation
Joseph T. Jones | Center for Urban Families
Sherri Killins Stewart | BUILD Initiative



A9. Tokens of Appreciation

No tokens of appreciation are proposed for interview respondents, all of whom will be staff at the CCRCs. Focus group participants, all of whom are CCRC clients with low incomes, will each receive a $50 gift card as a token of appreciation. The goal of the token of appreciation is to help with the recruitment of respondents. The team expects the token of appreciation to encourage clients to participate in the focus groups and to offset any incidental costs incurred, such as transportation or childcare costs needed for participation.


Focus group data are not intended to be representative in a statistical sense, in that they will not be used to make statements about the prevalence of experiences in the population of individuals accessing centralized services. However, it is important to secure participants with a range of background characteristics to capture a variety of possible experiences. All participants will be individuals who have accessed services at centers that serve individuals with low incomes; the target population therefore has low income. Without offsetting the direct costs respondents incur to attend the focus groups, such as arranging child care or transportation, the research team increases the risk that only individuals able to overcome financial barriers will participate in the study. Participants will receive a $50 gift card to account for incidental expenses such as transportation and/or child care that may otherwise prevent their participation.


The amount proposed is based on research on tokens of appreciation and the study team’s experiences with other federal studies. Research has shown that a token of appreciation can be an effective way to increase study participation among populations including individuals from low-income2 or low-education households3, which are demographics of interest in the VOCS study. A token of appreciation of $50 is also supported by research. A study comparing how different amounts affect willingness to participate in qualitative research found that participants offered a $50 token of appreciation were more likely to participate in research than those who were offered smaller tokens of appreciation, but a rate higher than $50 had diminishing returns.4 Given that previous studies have found tokens of appreciation of $50 effective in increasing willingness to participate in 90-minute qualitative research activities, we believe this is an appropriate amount for the time and cost associated with participation in focus groups but is not so high as to appear coercive for potential participants.



A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing

Personally Identifiable Information

The research team will collect names of program staff identified as potential interview respondents to keep track of the notes and interviews for the data cleaning process. The team will remove these names from interview notes or transcripts after data cleaning and before transmitting data. The team will not collect full names during focus groups.


None of the interview questions solicits personally identifiable information (PII). However, participants’ responses, in combination, may be identifiable. For example, the research team will ask respondents to identify their role and the number of years they have held that role; taken together, this information could make responses identifiable.


Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.


Assurances of Privacy

Information collected will be kept private to the extent permitted by law. Respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. As specified in the contract, the research team will comply with all Federal and Departmental regulations for private information.


The research team plans to audio-record interviews after receiving verbal permission from the respondent to do so. Respondents will be notified of the study’s plan to destroy the notes and recordings at the end of the study and can ask the interviewer to pause or stop the recorder at any time during the interview. The consent script and information sheet are included in Appendices A-C.


Data Security and Monitoring

The research team will protect respondent privacy to the extent permitted by law and will comply with all Federal and Departmental regulations for private information. The research team has developed a Data Safety and Monitoring Plan that assesses all protections of respondents’ PII. The research team will ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor who perform work under this contract/subcontract are trained on data privacy issues and comply with the above requirements.


As specified in the contract, the research team will use Federal Information Processing Standards (FIPS) compliant encryption (Security Requirements for Cryptographic Modules, as amended) to protect all instances of sensitive information during storage and transmission. The research team will securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with FIPS. The research team will ensure that this standard is incorporated into its property management/control system and will establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process sensitive information. Any data stored electronically will be secured in accordance with the most current National Institute of Standards and Technology (NIST) requirements and other applicable Federal and Departmental regulations. In addition, the research team must submit a plan for minimizing, to the extent possible, the inclusion of sensitive information on paper records and for the protection of any paper records, field notes, or other documents that contain sensitive information or PII, ensuring secure storage and limits on access.
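The contract language above does not name a specific tool or library. Purely as an illustrative, hypothetical sketch of encryption-at-rest for a sensitive file, the example below uses AES-256-GCM (a FIPS-approved algorithm) via the Python cryptography package; it is not the research team’s actual implementation, and FIPS compliance in practice also depends on using a validated cryptographic module and the key-management procedures described above.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical illustration only: encrypt an interview-notes file with AES-256-GCM.
key = AESGCM.generate_key(bit_length=256)  # in practice, generated/stored per the key-management plan
aesgcm = AESGCM(key)

with open("interview_notes.txt", "rb") as f:  # placeholder filename
    plaintext = f.read()

nonce = os.urandom(12)  # 96-bit nonce, unique for every encryption operation
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

with open("interview_notes.txt.enc", "wb") as f:
    f.write(nonce + ciphertext)  # store the nonce alongside the ciphertext

# Decryption reverses the process:
recovered = aesgcm.decrypt(nonce, ciphertext, None)
assert recovered == plaintext
```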


A11. Sensitive Information 5

The research team may collect sensitive information related to receipt of economic assistance or immigration/citizenship status. While uncovering this type of information is not the focus of our study, it may be relevant to the types of services that focus group respondents (clients of CCRCs) receive from the centers. This is important for informing how specific types of services or programs, including public assistance programs, can be integrated with other services and provided in a single location. It is also possible that respondents may speak critically of partners, services, or programs during conversations. Questions about respondents’ experiences are necessary to inform a broader understanding of how different types of models and partnerships contribute to centralized services. The IRB review is in process and will be finalized by the time data collection begins.



A12. Burden

Explanation of Burden Estimates

The research team will recruit staff from three CCRCs to participate in individual and group interviews and clients to participate in focus groups. The research team plans to interview up to 28 staff and conduct focus groups with up to 10 clients from each site. The team will also conduct observations of program services.


These one-time site visits will take place between January and March of 2022. We estimate interviews will average 75 minutes (for leadership/admin and frontline staff) and 60 minutes (for finance and IT/data staff), and focus groups will last up to 90 minutes. The planned observations will not impose any burden on individuals.


We estimate respondents’ hourly wage rates by category using the Bureau of Labor Statistics’ Occupational Employment Survey, 2020. We then add 40 percent to the base average hourly wage rate to account for benefits, for all respondents except clients (see below). We estimate the total burden to be 146 hours, at a total respondent cost of $4,229.73.
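To make the arithmetic behind these estimates easy to verify, the following is a minimal sketch (not part of the study materials) that reproduces the burden hours and respondent costs in the table below from the per-instrument figures. The only assumption beyond the table itself is the rounding convention: total hours per instrument appear to be rounded up to the nearest whole hour (e.g., 18 x 1.25 = 22.5, reported as 23) before multiplying by the hourly wage.

```python
import math

# Per-instrument figures from the burden table below:
# (respondents, responses per respondent, hours per response, hourly wage).
# Staff wage rates already include the 40 percent markup for benefits;
# the client rate ($7.25) does not.
instruments = {
    "Leadership/Admin Interview": (18, 1, 1.25, 50.58),
    "Frontline Staff Interview":  (48, 1, 1.25, 35.13),
    "Finance Interview":          (9,  1, 1.00, 35.13),
    "IT/Data Interview":          (9,  1, 1.00, 35.13),
    "Client Focus Groups":        (30, 1, 1.50, 7.25),
}

total_hours = 0
total_cost = 0.0
for name, (n, responses, hours_each, wage) in instruments.items():
    # Assumption: hours are rounded up to the nearest whole hour,
    # which matches the reported 23 hours for 18 x 1.25 = 22.5.
    hours = math.ceil(n * responses * hours_each)
    cost = hours * wage
    total_hours += hours
    total_cost += cost
    print(f"{name}: {hours} hours, ${cost:,.2f}")

print(f"Total: {total_hours} hours, ${total_cost:,.2f}")  # 146 hours, $4,229.73
```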


Estimated Annualized Cost to Respondents


Instrument | No. of Respondents (total over request period) | No. of Responses per Respondent (total over request period) | Avg. Burden per Response (in hours) | Total/Annual Burden (in hours) | Average Hourly Wage Rate | Total Annual Respondent Cost
Leadership/Admin Interview | 18 | 1 | 1.25 | 23 | $50.58 | $1,163.34
Frontline Staff Interview | 48 | 1 | 1.25 | 60 | $35.13 | $2,107.80
Finance Interview | 9 | 1 | 1 | 9 | $35.13 | $316.17
IT/Data Interview | 9 | 1 | 1 | 9 | $35.13 | $316.17
Client Focus Groups | 30 | 1 | 1.5 | 45 | $7.25 | $326.25
Total | | | | 146 | | $4,229.73



A13. Costs

There are no additional costs to respondents.


A14. Estimated Annualized Costs to the Federal Government

The total/annual cost for the data collection activities under this request will be $285,333.


Cost Category | Estimated Costs
Field Work | $106,684
Analysis | $49,422
Publications/Dissemination | $129,227
Total/annual costs over the request period | $285,333



A15. Reasons for changes in burden

This is a new information collection request.


A16. Timeline

Activity | Date
Site Visits | Scheduled one month after OMB approval, and all three site visits will be completed within two months
Data Analysis Completed | Three weeks after each site visit
Site Briefs | Four months after each site visit
Site Briefs Published | Five months after last site visit is completed
Final Report Completed | Five months after site visits completed
Final Report Published | Six months after site visits completed



A17. Exceptions

No exceptions are necessary for this information collection.


Attachments

Instrument 1: Interview Guide for Leadership and Administrative Staff

Instrument 2: Interview Guide for Frontline Staff

Instrument 3: Interview Guide for Finance Staff

Instrument 4: Interview Guide for IT/Data Staff

Instrument 5: Focus Group Guide for Clients

Appendix A: Staff Consent

Appendix B: Client Consent

Appendix C: Observation Consent

Appendix D: Observation Guides



1 Next Steps for Rigorous Research on Two-Generation Programs (NS2G) - OMB: 0970-0356; Assessing Models of Coordinated Services for Low-Income Children and Their Families (AMCS) - OMB: 0970-0356; Building Capacity to Evaluate Community Collaborations to Strengthen and Preserve Families - OMB: 0970-0531 and OMB: 0970-0541; Head Start Connects: Individualizing and Connecting Families to Comprehensive Family Support Services - OMB: 0970-0538

2 Singer, Eleanor and Richard A. Kulka. (2002). “Paying Respondents for Survey Participation.” Studies of Welfare Populations: Data collection and Research Issues. Panel on Data and Methods for Measuring the Effects of Changes in Social Welfare Programs, edited by Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro. Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.

3 Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. (2000). Experiments with incentives in telephone surveys. Public Opinion Quarterly, 64:2, 171-188.

4 Kelly, Bridget, Marjorie Margolis, Lauren McCormack, Patricia A. LeBaron, and Dhuly Chowdhury. (2017). What Affects People’s Willingness to Participate in Qualitative Research? An Experimental Comparison of Five Incentives. Field Methods, 29:4, 333-350.

5 Examples of sensitive topics include (but are not limited to): social security number; sex behavior and attitudes; illegal, anti-social, self-incriminating and demeaning behavior; critical appraisals of other individuals with whom respondents have close relationships, e.g., family, pupil-teacher, employee-supervisor; mental and psychological problems potentially embarrassing to respondents; religion and indicators of religion; community activities which indicate political affiliation and attitudes; legally recognized privileged and analogous relationships, such as those of lawyers, physicians and ministers; records describing how an individual exercises rights guaranteed by the First Amendment; receipt of economic assistance from the government (e.g., unemployment or WIC or SNAP); immigration/citizenship status.


