
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes





Understanding the Value of Centralized Services Study



OMB Information Collection Request

New Collection

OMB: 0970-0587





Supporting Statement

Part B



November 2021









Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Marie Lawrence

Kathleen Dwyer


Part B


B1. Objectives

Study Objectives

The purpose of this study is to understand the advantages, disadvantages, and costs of providing services for families and individuals with low incomes in a single location. The study includes a literature review and site visits to three sites to answer the following research questions:

  1. What is the range of models that have been used to provide centralized social services?

  2. What do we know about the different models used to deliver services centrally?

    • What are the benefits and challenges from the perspectives of staff and clients?

    • How does centralizing services in a single location enable or hinder access to services for potential clients? How does this vary for different groups, if at all?

    • What are the costs and benefits associated with different models of centralized services?

    • Do different models work better for particular programs, in different settings, or for different populations? What role does community context play?

    • How does centralization contribute to cost-effective allocation of resources for different models?

  3. What is the motivation for centralizing services? How does the impetus for centralization relate to the types or models of centralization?

  4. How are services being coordinated virtually, and how does virtual coordination differ from physical co-location? How does virtual coordination complement centralized services provided in person? What are the costs and benefits of virtual co-location? What lessons can practitioners learn from the COVID-19 pandemic as it relates to centralized services?


The proposed study will conduct site visits that include individual and group interviews with program staff and focus groups with clients at centralized community resource centers (CCRCs) to document different models for centralizing services; the advantages, disadvantages, and costs of centralizing services; and the perspectives of staff and clients. Site visits will also include direct observations of program services at CCRCs.


Generalizability of Results

This study is intended to present an internally valid description of the service population and the implementation of centralized service models in the chosen sites, not to promote statistical generalization to other sites or service populations. The data may be used by service providers or researchers seeking to understand how different models of centralization are operationalized.


Appropriateness of Study Design and Methods for Planned Uses

Findings from the site visits will contribute to understanding the different types of centralization, the reasons sites centralize their services, and the benefits and challenges for staff and clients. With three site visits, the research team will highlight findings from a small group of sites and share lessons learned from these specific sites. The team will not use the data to make statistical inferences or to produce generalizable findings. The site visits are not intended to produce a representative sample of all sites providing centralized services. In addition, the study is descriptive and does not include an impact study; therefore, findings from the study should not be used to assess outcomes of individuals who participate in services at CCRCs. All products associated with the study will include these limitations in the discussion of findings.



As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.


B2. Methods and Design

Target Population

For each of the three selected site visit sites, the research team (MEF Associates and Mathematica) will collect information from program leadership; program frontline staff; IT and finance staff; and clients of the site, including clients who accessed multiple services and those who did not.


The sampling frame for each site will be the roster of staff who oversee the site, staff who provide services directly to clients, and staff who work on IT/data and finance issues related to the site, as well as a subset of clients who receive services from the site.


The research team will use non-probability, purposive sampling to identify potential respondents who can provide information on the study’s research questions. Because participants will be purposively selected, they will not be representative of staff at CCRCs or of clients who receive services from CCRCs. For clients, the research team aims to capture variation in experiences to understand the range of client experiences related to centralized service provision.


Respondent Recruitment and Site Selection

The study will include three sites for site visits. Site selection began with a scan of potential sites based on recommendations from experts, stakeholders, and colleagues; research team knowledge; and the literature review. The research team conducted online research to learn more about the sites and narrowed the list to 14 sites based on this initial scan. The research team then conducted an in-depth scan of the 14 sites before refining its suggestions to three recommended sites and three to four alternative sites. The in-depth scan included calls to nine programs. The program calls were tailored to each program, with specific questions about their services to fill gaps in the information available online. The criteria for site selection include variation on key features such as type of site; type of primary agency (e.g., Community Action Agency, County Agency, One Stop); tenure, degree, and type of centralization; primary service domain; type of ACF programs offered; presence of virtual services or applications; size of site; and demographics of clients served. The selected sites will not be representative of the population of CCRCs, but the research team hopes to capture variation in the types of models implemented. Centralization models may differ across characteristics such as physical space, client flow, staffing structure, partnerships, data sharing, and funding.


To recruit sites, the research team will reach out to the selected sites and ask if they would like to participate in the site visits. The team will provide information on the timing, duration, and plan for the visits (e.g., whom the team would like to meet with and whether the visit will be in person or virtual, depending on public health considerations at the time), as well as a description of the reports and publications in which the site would be highlighted. If a primary selected site does not agree to participate, the research team will reach out to an alternative site. Once sites have agreed to participate, the research team will work with each site’s point of contact to identify staff to participate in interviews. The team will also work with site staff to recruit participants for focus groups. The team will share the project description on OPRE’s website with potential site staff and participants.


B3. Design of Data Collection Instruments

Development of Data Collection Instruments

The research team developed the data collection instruments to meet the objectives of the study and answer the research questions. The instruments are designed to reduce burden on respondents by asking questions only of individuals who are in a position to respond, and they include guidance for interviewers on tailoring the questions. Because site recruitment has not yet occurred and the structure of the sites to be visited is unknown, it is difficult to determine which staff roles exist at sites of different structures and which protocol would be appropriate for each role. The guidance included in the protocols will help ensure that the interview questions can be shortened and tailored for each respondent, as appropriate.


The research team, ACF, and the project’s experts reviewed the data collection instruments for length, content, and flow. Meetings with stakeholders also informed the content and data collection approach for the protocols, though the research team did not ask stakeholders to review the protocols directly.


All instruments will contribute to the four research objectives outlined in B1.


B4. Collection of Data and Quality Control

The research team will conduct all data collection. All interviews will be conducted by pairs of team members, with one leading the interview and the other taking notes. With the permission of the respondent, the research team will record the interviews to assist with notetaking. The team will conduct site visitor training prior to the site visits, covering the protocols and consent forms, the plan for scheduling visits, and logistics such as recording, notetaking, and tokens of appreciation. After the visit to the first site, the site visitors will share their experiences and lessons learned with other research team staff, which will inform the visits to the remaining two sites.


The research team will recruit sites as described above. Once sites are identified, the team will work with the point of contact at the site to set up interviews with respondents and create a schedule that minimizes burden on the site while meeting study objectives. The research team anticipates that all site visits will occur in person but is also planning for the possibility of conducting virtual visits (e.g., via videoconferencing software), depending on public health constraints at the time of the site visits.



B5. Response Rates and Potential Nonresponse Bias

Site Selection

The site visits, including the interviews and focus groups, are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion. Response rates will not be calculated or reported.


The burden estimates account for the following possibilities:

  • The research team expects that some programs or respondents may invite multiple staff members to scheduled interviews. Because the research team does not always know in advance how many respondents will attend and does not want to exclude potentially helpful respondents from the interviews, the burden estimates account for the possibility that multiple respondents may attend a single interview.

  • To date, the sites for this project have not been selected. Therefore, there is a wide range of possibilities in terms of size and the number of staff members that the research team may interview.

  • It is possible that all invited focus group members will attend a focus group. The burden estimates therefore include all 10 invitees for each site so that the research team will not have to exclude respondents if all invited clients attend.


Non-Response

Because participants will not be randomly sampled and findings are not intended to be representative, non-response bias will not be calculated. Respondent demographics will be documented and reported in written materials associated with the data collection. The research team will seek focus group participants who are interested in participating and will plan to recruit 8-10 focus group participants per site to yield focus groups of 6-9 individuals. As noted above, the burden estimates account for the possibility that all 10 invitees attend.



B6. Production of Estimates and Projections

The data will not be used to generate population estimates, either for internal use or dissemination.



B7. Data Handling and Analysis

Data Handling

The research team will monitor all data collection. The research team will take notes onsite and prepare a site summary following the visit. The team will use the recordings to fill in any gaps and double-check the information. The two site visitors will review and compare notes and will follow up with the site to ask any questions that remain.


The research team will also prepare site briefs for dissemination. Prior to the release of the brief about each site, the research team will share it with the site for review to confirm that all the information included is correct.


Data Analysis

The research team will analyze the interview, focus group, and observation notes based on themes that align with the interview protocols and research questions. The research team will analyze how these themes appear across respondents and, as appropriate, across sites. The team will also use on-site notetaking approaches, such as field note templates, that capture immediate impressions and data and that will contribute to the data analysis. During the site visits, the research team will also ask for existing aggregated data on program services and demographics of clients. If available, the research team will use this information to contextualize the findings from the primary data analysis. All analyses will be descriptive.


The study will be pre-registered on the Open Science Framework.


Data Use

The data will be described in four public-facing reports: three site briefs (one for each site visited) and a final report synthesizing all findings, including those from the literature review. Two additional dissemination products are still to be determined. The products will share findings about centralization that will contribute to the field’s understanding of the benefits, challenges, and costs of centralizing services and of how staff and clients experience centralized services. The products will describe the research team’s data collection and analysis methods to contextualize the findings.



B8. Contact Persons

Co-Principal Investigator

Mary Farrell

[email protected]


Project Director and Instrument Development Task Lead

Kimberly Foley

[email protected]


Deputy Project Director and Site Visit and Analysis Task Lead

Carly Morrison

[email protected]



Attachments

Instrument 1: Interview Guide for Leadership and Administrative Staff

Instrument 2: Interview Guide for Frontline Staff

Instrument 3: Interview Guide for Finance Staff

Instrument 4: Interview Guide for IT/Data Staff

Instrument 5: Focus Group Guide for Clients

Appendix A: Staff Consent

Appendix B: Client Consent

Appendix C: Observation Consent

Appendix D: Observation Guides






