Virtual Human Service Delivery under COVID-19: Scan of Implementation and Lessons Learned
ASPE Generic Information Collection Request
OMB No. 0990-0421
Supporting Statement – Section B
Submitted: June 1, 2020
Program Official/Project Officer
Pamela Winston
Social Science Analyst
U.S. Department of Health and Human Services
Office of the Assistant Secretary for Planning and Evaluation
200 Independence Avenue, SW, Washington, D.C. 20201
202-774-4952
Section B – Data Collection Procedures
Respondent Universe and Sampling Methods
Two types of qualitative data collection will be conducted with respondents:
Virtual discussions with professionals involved in human services programs that have transitioned to virtual delivery. ASPE’s contractor, Mathematica, will conduct discussions with professional staff in approximately 18 programs in total across an estimated six states (about two to four programs per state). We will conduct individual telephone or video interviews (with possibly some small group discussions if respondents prefer this format) with an estimated four respondents in each of the program sites included in the scan. The professionals are expected to include: (1) program administrators or managers at the community, local, Tribal, or state levels; and (2) frontline staff, such as caseworkers or home visitors.
Focus groups by videoconference or telephone with participants in the study programs. Mathematica will also conduct about six focus groups via a videoconferencing platform such as Webex or by telephone with individuals who have experienced program services by virtual methods, in particular since Spring 2020 when many human services programs shifted their services to primarily virtual methods. We will, in particular, seek a convenience sample of participants who have experienced programs both in person (prior to the shift spurred by the pandemic) and virtually (after the shift was implemented). Participants may include, for example: (1) parents who participate with their children in Head Start, (2) parents participating in TANF or home visiting programs, and (3) young adults receiving services to transition out of foster care.
Program Selection
We will use a purposive sample of program sites selected according to a range of criteria, including: program type, geographic region, degree of urbanicity (i.e., rural, urban, and suburban), demographics, and preliminary indications from existing information and program experts of innovative or otherwise noteworthy approaches to virtual human services delivery. The information we collect is not generalizable to human services programs more broadly, or to specific states and localities.
We will use several sources of information to select program sites for the scan. These include findings from ASPE’s earlier intramural work on virtual case management, as well as recent expert and practitioner panels and other resources on virtual human services delivery. In addition to these sources of information, ASPE staff will ask federal program staff and selected informants at the national level (fewer than 10) to recommend program sites that we should explore (e.g. those that may be using innovative strategies, such as different virtual modes for different client populations in an effective and appropriate way).
We will select six states, with two to four program sites in each, for virtual semi-structured discussions and focus groups. We have tentatively identified California, Colorado, Ohio, Iowa, New York, South Carolina, West Virginia, and Wisconsin as states of interest that meet our selection criteria, but may revise this target list after speaking further with federal program staff and other key informants at the national level. We will also identify two alternate states, since not all target states may be able to participate.
Within each state, we expect to focus the scan on two to four program sites (an average of three for each state). Program areas of particular interest include: child welfare/foster care, income and employment assistance, early learning and development (e.g. Head Start), home visiting, and family strengthening. However, we may revise this program list based on information from federal staff and other informants.
To understand the range of perspectives and practices with regard to virtual human services delivery, we aim to identify programs that are delivered at the community or county level, and at the Tribal level. In some cases, such as TANF, programs may also be delivered at the state level. In addition to recommendations from federal staff, Mathematica will ask state-level participants in the initial discussions to recommend local, Tribal, or community programs. We also expect to include rural program sites, where lessons on remote service delivery may be particularly relevant in the long term, and program sites where the demographics indicate the likelihood of challenges such as a lack of access to the Internet or appropriate devices (the digital divide). In addition, we will include programs in states that reflect the country’s regional diversity.
Recruiting professionals for virtual semi-structured discussions
Recruiting professionals for the individual or small-group discussions will generally begin with a review of our data sources for prior research related to virtual human services delivery. For example, state and local program administrators have participated in webinars about the shift from in-person to virtual service delivery. We will also consider individuals recommended to us through key informant interviews with federal staff, as well as with a limited number (fewer than 10) of staff at national organizations such as the American Public Human Services Association or the National Governors Association. As we identify and speak with respondents, we will also use snowball sampling, i.e., asking them to recommend other professionals at the state, local, Tribal, or community levels whom we may recruit for the study. While we expect ASPE staff to conduct the interviews with federal and national organization staff, we will work closely with Mathematica in developing the list of potential study respondents.
Mathematica staff will first reach out via email to state, local, or Tribal program administrators or managers (see Attachment A: Recruitment Emails) to inform them of the study and ask whether they or one of their colleagues would be willing to participate in an interview regarding their policies, practices, and perspectives on virtual human services delivery. We will also ask them to refer us to other appropriate respondents in local, Tribal, or community programs—both administrators and frontline staff. These recommendations will be added to those we receive from federal program staff and other informants at the national level. For each of the expected 18 program sites, we will aim to recruit about two state or local administrators and approximately two frontline staff, for a total of 72 respondents across the 18 program sites in 6 states.
Recruiting program participants for focus groups
Mathematica will recruit up to seven participants for each of about six focus groups conducted by videoconference (although a telephone call-in number will also be provided, in case that is preferable for participants). Participants will either be adults (e.g. parents of children in Head Start programs or recipients of TANF benefits or home visiting services) or young adults (e.g. youth over 18 transitioning out of foster care), for a total of about 42 participants across the six focus groups. If possible, we will conduct one focus group in Spanish. We expect to recruit the participants in each group from a single program site, allowing us to understand their perspectives on the specific approach to virtual human services delivery in their program location and to complement what we learn from other respondents at that site. If it is not possible to recruit a full focus group using this method, we will shift as needed, such as to a small group discussion with as many participants as can be recruited. We expect the discussion itself to last about 60 minutes, but have estimated an additional 30 minutes for each respondent to become familiar with the videoconferencing platform (if that is the mode used) and to complete the necessary consent and incentive forms and the short demographic questionnaire (Attachment C) prior to the discussion.
To recruit participants for the focus groups, Mathematica will reach out to the program sites for their assistance in informing potential participants about the focus groups. Mathematica staff will develop digital recruitment flyers to be disseminated by the program sites through their normal communication channels (e.g. emails, online program newsletters, webinars, word of mouth). Program staff will be asked to mention the opportunity to participate in the groups to program participants, stressing that participation is entirely voluntary and entirely unrelated to the program’s services or activities, or to participants’ benefits. People interested in participating in the focus groups will likely sign up by calling the contractor’s 800 recruitment number. Mathematica will follow up by phone and email with people who have indicated an interest in participating, again stressing the voluntary nature of their participation, confirming their interest and suitability for the study, and providing logistical information. We will seek primarily to identify participants who received services in person before Spring 2020 and are participating in largely virtual program services at the time of data collection. We will also seek a general mix of races, ethnicities, genders, and ages across the focus groups.
Mathematica will provide a $25 gift card to each focus group participant as an incentive to participate. Evidence shows that remuneration bolsters recruitment and attendance at small group interviews. Working individuals are busy people, and low-income parents and others face additional barriers to participating in focus groups. To ensure that our incentive is not coercive, the consent scripts indicate, and interviewers will be trained to make very clear, that participants who choose to leave the group early or prefer not to respond to certain questions will still receive the $25 gift card. Mathematica staff will track disbursed gift cards by completing a log each time a card is sent to a participant.
Procedures for the Collection of Information
Mathematica will collect data on policies, practices, and perceptions about virtual human services delivery approaches put in place during the COVID-19 pandemic and maintained through the time of data collection, via a scan of approximately 18 human services program sites in about 6 states. Data collection will consist of (1) virtual semi-structured discussions (telephone or video) with about 72 respondents across the 6 states (approximately four staff in each of the 18 program sites); and (2) six virtual focus group discussions with a total of about 42 program participants (one discussion with a program in each of the 6 states).
Teams of two Mathematica researchers each will collect the data, and they will be trained in the consent and data collection processes. Each team will be composed of one senior researcher to lead the interviews and one junior researcher to assist with scheduling, notetaking, and other supportive roles.
Before each semi-structured interview or focus group, the senior researcher will read a consent script to the participants to inform them about the objectives of the study, the voluntary nature of their participation, and any risks that may be involved. As the senior researcher concludes reading the consent script, he or she will ask the participants for their verbal consent to participate in the study and for their permission to record the interviews for the purpose of notes or transcription. We will have one consent script for the professional respondent interviews and one consent script for the focus groups (both may be found in Attachment B). In addition, for the focus groups, we will invite participants to complete a brief anonymous demographic questionnaire (Attachment C) at the beginning of the focus groups (the time to complete it is part of the 90 minutes overall for the focus group discussions).
Virtual semi-structured discussions with program sites
As mentioned, the discussions will be virtual, entailing approximately 4 semi-structured interviews for each of approximately 18 program sites in 6 states. These discussions will be conducted via videoconferencing or telephone with professionals, including administrators and frontline staff, who have experience with and knowledge of virtual human service delivery policies and practices in their program and location. Using virtual data collection methods allows us to save on travel costs, to better accommodate participants’ schedules, and, of course, to conduct data collection at a time of potential continued physical distancing in some locations. Based on our experience, we also do not think that virtual methods will detract from the quality of the data collection. The discussions will focus on understanding policies, practices, and perceptions about virtual human services delivery, not on observing interactions among staff or clients (which would not be possible at this time).
We will use two basic guides for the semi-structured discussions, one for administrators/managers and one for frontline staff. Each guide can be adapted to the professional perspective and knowledge of the different types of respondents. The discussions are expected to run for about 60 minutes, and we expect to recruit one to three participants per interview.
Focus group interviews
In addition, data collection includes conducting about 6 focus group interviews via videoconferencing (or telephone), primarily with those who have experienced both in-person and virtual human services delivery as a program participant. We plan to conduct one focus group for one program site in each of the 6 study states, and the groups will address participants’ experiences with virtual (and in-person) services in their programs. If possible, we will conduct one focus group in Spanish. We expect the discussion itself to last about 60 minutes, with another 30 minutes estimated for participants to become familiar with the videoconferencing platform (if necessary) and to complete the consent and incentive forms and the short demographic questionnaire (Attachment C) prior to and after the discussion. Each focus group is therefore estimated at 90 minutes. We have developed one discussion guide for all focus groups, which will be adapted to the perspectives and knowledge of the different respondents in the particular session (see Attachment B, Discussion Guides). To protect their confidentiality, participants will use a pseudonym during the focus group discussions and on the demographic questionnaire, and all data will be de-identified so that individual respondents cannot be identified. As mentioned, we are aiming to have about 7 participants in each focus group, for a total of about 42 participants across the 6 focus groups.
Methods to Maximize Response Rates and Deal with Nonresponse
We believe this data collection will be of interest to community, local, Tribal, state, and federal stakeholders, as well as program participants, increasing the likelihood of response. Issues related to virtual human services delivery, and its potential benefits and challenges for program systems and participants, have come up repeatedly since March 2020, when the pandemic and state and local public health orders drove agencies to rapidly adopt virtual methods in their interactions with clients. Multiple stakeholders have noted in recent months the potential for virtual methods to improve the quality of service delivery for many program participants and service systems in the long term. We therefore believe that the subject matter of this study is likely to be of significant interest, helping us recruit participants. In addition, conducting the interviews by video or telephone at respondents’ convenience should lower the burden of participation and maximize response. For all those recruited to participate, we will send confirmation and reminder emails to encourage follow-through.
With respect to the non-professional respondents (program participants), in addition to confirmation and reminder emails, we expect the gift cards used as incentives will encourage response. In addition, we will seek to identify individuals interested in sharing their experiences with these evolving service delivery systems.
Test of Procedures or Methods to be Undertaken
All contractor staff who will be leading and participating in semi-structured interviews and focus groups will receive training on the discussion guide protocols to ensure consistent interviews across sites. All senior researchers have subject matter expertise in the range of human services the study addresses and have extensive experience conducting qualitative interviews. Discussion guide protocols have been developed by ASPE staff in close coordination with the contractor.
Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
ASPE subject matter experts with technical and statistical expertise were consulted on the development of this exploratory data collection. This includes: (1) Pamela Winston, Social Science Analyst at HHS/ASPE, email: [email protected], phone: 202-774-4952; (2) Amanda Benton, Social Science Analyst at HHS/ASPE, email: [email protected], phone: 202-834-6545; (3) Jennifer Tschantz, Social Science Analyst at HHS/ASPE, email: [email protected], phone: 202-260-2865; (4) Annette Waters, Social Science Analyst at HHS/ASPE, email: [email protected], phone: 202-260-0196; and (5) Kelly Kinnison, Director of the Division of Family and Community Policy at HHS/ASPE, email: [email protected], phone: 202-868-9279.
LIST OF ATTACHMENTS – Section B
Note: Attachments are included as separate files as instructed.
Attachment A: Recruitment emails
Attachment B: Semi-structured discussion guides including consent scripts
Attachment C: Demographic questionnaire