State Child Welfare Data Linkages Descriptive Study



OMB Information Collection Request

New Collection





Supporting Statement

Part B



May 2022









Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officer:

Jenessa Malin

Part B

B1. Objectives

Study Objectives

The State Child Welfare Data Linkages Descriptive Study (SCW Descriptive Study), conducted by Mathematica on behalf of the Administration for Children and Families (ACF) Office of Planning, Research, and Evaluation (OPRE), will provide OPRE with information on the extent to which states connect their child maltreatment data to other data sets, whether by linking or integration; information on how any connected data sets are created, managed, and used; and supports or challenges to linking data. As described in Supporting Statement A, the study will survey state child welfare directors and staff who engage with connected data efforts and will interview staff at a variety of levels who are involved with state and county connected data efforts. Each of these data collection activities provides a different perspective on the implementation and use of connected data in the child welfare context. The study will improve understanding of how administrative data are used to examine child maltreatment incidence and related risk and protective factors.

Generalizability of Results

This study is intended to present internally valid descriptions of the use of connected data in child welfare agencies, not to promote statistical generalization to other agencies.

Appropriateness of Study Design and Methods for Planned Uses

The study is descriptive in nature. Both qualitative and quantitative data sources will serve to capture the experience of state child welfare agency staff with connected data efforts:

  • Initial Survey of child welfare directors (Instrument 1)

  • Connected Data Survey (Instrument 2)

  • Interview Guide for individuals responsible for connected data efforts (Instrument 3)

This study design and the methods used will allow us to answer the research questions (see section A2) and will provide insight into the overarching study objective, demonstrating how connected data efforts are implemented and used by state child welfare agencies. Information from the SCW Descriptive Study may be disseminated through briefs, reports, and other publicly available products. We will develop products that are useful to a variety of audiences, including child maltreatment researchers as well as state and federal agency staff. Survey data may be archived, as appropriate.

This study does not assess impact and should not be used to assess participants’ outcomes. Written products associated with the study will include key limitations. As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.

B2. Methods and Design

Target Population

All child welfare agencies from the 50 states, the District of Columbia, and Puerto Rico are eligible for inclusion in the study. Target respondents include the directors of state child welfare agencies and additional state and county agency staff with knowledge of and experience working with their agencies’ connected data efforts. We will invite 52 state and territory child welfare agency directors to complete the initial survey (Instrument 1),1 and we estimate that 42 will complete the survey (81%). The target respondents for the connected data survey (Instrument 2) are staff in child welfare agencies, identified by the directors, who work closely with a connected data set. Because the survey is tailored to a specific data sharing agreement, in some cases one individual will receive multiple surveys (if they are the contact person for multiple connected data sets). We will send the connected data survey (Instrument 2) to all 52 states and territories, and each state or territory will receive up to 4 surveys to account for multiple data sets, for up to 208 survey administrations.2 We will invite staff from all states with connected data sets, estimated to be up to 42 states covering up to 78 connected data sets, to complete Instrument 2, and we estimate that staff from 34 states, covering about 63 connected data sets, will complete the survey (81%). State child welfare agency directors, along with other state agency staff and some county agency staff, are eligible for the interviews (Instrument 3). The target respondents include up to 120 individuals: a director for each state and territory and additional state agency staff in each state, along with up to 10 county staff and 8 county child welfare directors involved in data sharing agreements with the state agencies. We estimate that we will complete 96 interviews in 42 states, for a response rate of 80%. We aim to obtain variation in staff experiences to understand how connected data are used within each state.
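
For reference, a minimal sketch of the arithmetic behind the completion estimates above; the counts are taken directly from this section and are planning estimates, not results.

```python
# Planning estimates from Section B2; the response rates quoted in the text
# follow directly from these counts.

invited_directors = 52       # 50 states, DC, and Puerto Rico (Instrument 1)
expected_directors = 42      # expected completed initial surveys
print(f"Instrument 1: {expected_directors / invited_directors:.0%}")      # ~81%

eligible_data_sets = 78      # estimated connected data sets in up to 42 states
expected_data_sets = 63      # expected completed connected data surveys (Instrument 2)
print(f"Instrument 2: {expected_data_sets / eligible_data_sets:.0%}")     # ~81%

invited_interviewees = 120   # directors, state staff, and county staff (Instrument 3)
expected_interviews = 96     # expected completed interviews
print(f"Instrument 3: {expected_interviews / invited_interviewees:.0%}")  # 80%
```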

Sampling

Respondent recruitment for the initial survey (Instrument 1) will be based on the list of state child welfare agencies and their directors compiled by the Children’s Bureau on the Child Welfare Information Gateway site. Because of expected variation in practices across states, we will contact respondents from each state rather than selecting a subsample of states from which to draw. We do not know which characteristics may be associated with which behaviors, so we cannot be certain that a sample would be representative. In addition, this is the first time this information is being collected, and we have a well-defined population to survey.

The study will use snowball sampling from the Instrument 1 sample. Respondents for the connected data survey (Instrument 2) will be individuals whom the Instrument 1 respondents identified as working directly with connected data efforts and for whom contact information was provided. The sample for the interviews (Instrument 3) will comprise the samples from Instruments 1 and 2, as well as any additional contacts identified during those earlier surveys as working on the connected data efforts of focus. We will invite the child welfare director and one respondent to Instrument 2 for an interview, which may be conducted jointly or separately, and up to two county staff if the state indicates that a county is involved in connected data sets separately from the state. These interviews will provide insight from the perspectives of the state agency and its technical staff, and possibly the county agency and its technical staff.
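
To illustrate how the interview sample builds on the earlier instruments, the sketch below assembles a hypothetical Instrument 3 contact list for one state from its Instrument 1 and 2 respondents plus any additional contacts they name. The function, field names, and example records are illustrative assumptions, not part of the study's actual systems.

```python
def build_interview_sample(instrument1_respondents, instrument2_respondents):
    """Assemble one state's Instrument 3 contact list (illustrative only).

    Each respondent record is assumed to be a dict with 'name' and 'role' keys
    and an optional 'additional_contacts' list naming other staff who work on
    the state's connected data efforts.
    """
    contacts = []
    for respondent in instrument1_respondents + instrument2_respondents:
        contacts.append({"name": respondent["name"], "role": respondent["role"]})
        # Snowball step: add anyone the respondent identified as working on
        # the connected data effort of focus.
        contacts.extend(respondent.get("additional_contacts", []))
    # De-duplicate by name in case the same person responded to both surveys.
    unique = {contact["name"]: contact for contact in contacts}
    return list(unique.values())


# Hypothetical respondents from one state.
director = {"name": "A. Director", "role": "State child welfare director",
            "additional_contacts": [{"name": "C. Analyst", "role": "Data unit lead"}]}
data_contact = {"name": "B. Staff", "role": "Connected data survey contact"}
print(build_interview_sample([director], [data_contact]))
```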

B3. Design of Data Collection Instruments

Development of Data Collection Instruments

We developed the data collection instruments based on the essential data required to answer the priority research questions, as listed in section A2. We developed the surveys and interview guide based on information learned during the design stage of the SCW Descriptive Study.

The team closely examined all instruments designed for this study to confirm that they were streamlined and did not collect duplicative data. The instruments have also been reviewed by technical experts. Additionally, we conducted a pretest of all instruments in fall 2021 with seven current and former child welfare agency staff familiar with their agency’s connected data to collect information about their clarity, completeness, and burden. As part of the study background, the pretest respondents were told that the first survey would be sent to state child welfare directors for them or their designees to complete; the second survey would be sent to staff with more technical knowledge of connected data efforts; and the interviews would be conducted with child welfare directors and technical staff. We estimated the survey burden based on their average completion time and made revisions to the instruments based on feedback from the pretest respondents.
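
As a simple illustration of how pretest timings translate into a burden estimate, the sketch below averages completion times for seven hypothetical pretest respondents. The minutes shown are invented for illustration and are not the figures reported in Supporting Statement A, Table 2.

```python
# Hypothetical pretest completion times for one instrument, in minutes.
pretest_minutes = [22, 18, 25, 20, 19, 24, 21]  # seven pretest respondents

average_minutes = sum(pretest_minutes) / len(pretest_minutes)
burden_hours_per_response = average_minutes / 60

print(f"Average completion time: {average_minutes:.1f} minutes "
      f"({burden_hours_per_response:.2f} hours per response)")
```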

Table 1 presents a crosswalk between the data collection instruments and the study aims.

Table 1. Crosswalk between data collection instruments and study aims

Study Aims                                               Instrument 1  Instrument 2  Instrument 3
Which states link or share their child welfare data?         X             X
How are linked data sets created, managed, and used?         X             X             X
What are supports or challenges to linking data?                            X             X

Note: Instrument 1 = Initial survey of state child welfare directors; Instrument 2 = Survey of connected data efforts; Instrument 3 = Interviews with individuals responsible for connected data efforts.


B4. Collection of Data and Quality Control

Web-based surveys

Using the low-burden, web-based platform Confirmit, we will administer surveys to child welfare directors and state and county agency staff (Instruments 1 and 2). ACF will contact target respondents one week before we email the survey. We will then send staff email invitations to complete the survey, along with a secure web link. We will also send multiple reminders over each data collection period, as needed. If we have not received responses from respondents by the closing weeks of data collection, we will call them to remind them to complete their surveys at their earliest convenience. These communications are included in Appendices A and B. The project team will monitor survey data for quality and completeness throughout the data collection period.
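
As an illustration of the outreach sequence described above, the sketch below lays out one possible timeline relative to the survey launch and close dates. The specific intervals and dates are assumptions; the text specifies only an ACF contact one week before launch, multiple reminders as needed, and phone calls in the closing weeks.

```python
from datetime import date, timedelta

def outreach_schedule(launch: date, close: date):
    """Sketch of one possible survey outreach timeline (intervals are assumptions)."""
    return [
        (launch - timedelta(weeks=1), "ACF advance notice to target respondents"),
        (launch, "Email invitation with secure web link to the survey"),
        (launch + timedelta(weeks=2), "First email reminder"),
        (launch + timedelta(weeks=4), "Second email reminder"),
        (close - timedelta(weeks=2), "Phone reminder to remaining nonrespondents"),
    ]

for when, step in outreach_schedule(date(2022, 6, 1), date(2022, 8, 1)):
    print(when.isoformat(), "-", step)
```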



Interviews

We will conduct interviews virtually over a secure WebEx call. ACF will contact target respondents via email one week before Mathematica reaches out to schedule the interview. If we do not receive a response, we will send up to four email reminders and call respondents to remind them to schedule the interview. These communications are included in Appendix C. We will record the interviews, with the consent of the interviewee, and a second staff person will attend to take notes. Transcripts will be produced through WebEx, edited by the notetaker, and approved by the interviewer.

Interview guides will be used for every interview for consistency, but they will be tailored slightly for each respondent to reflect the specific information collected during the surveys. The interview guides include a consent statement at the beginning. Project staff will read the consent statement to respondents before beginning the interviews. The consent statement provides assurances that the information shared will be reported in a manner that will not identify individual respondents. It will also inform respondents that their states may be identified in public reports of study findings based on the collected data or in archived data; however, public reports and archived data will not identify respondents by name. Staff will ask participants to provide verbal consent after reading the consent statement.

B5. Response Rates and Potential Nonresponse Bias

Response Rates

The surveys and interviews are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion. Response rates will be calculated for the surveys overall, but not for specific items.

Nonresponse

Because participants will not be randomly sampled and findings are not intended to be representative, a nonresponse bias analysis will not be conducted. Respondent demographics will be documented and reported in written materials associated with the data collection.

B6. Production of Estimates and Projections

The data will not be used to generate population estimates, either for internal use or dissemination.

B7. Data Handling and Analysis

Data Handling

We will perform quality assurance checks for reliability on coded qualitative data. We will review quantitative data for completeness and accuracy of data entry, and we will program edit checks into the web-based surveys to minimize entry errors.
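
To illustrate the kind of edit checks that can be programmed into a web-based survey, the sketch below flags a few hypothetical problems in a single response. The field names and rules are examples only, not the study's actual checks or Confirmit's API.

```python
def edit_check(response: dict) -> list:
    """Return edit-check flags for one survey response (fields are illustrative)."""
    flags = []
    # Range check: number of connected data sets should be a small, non-negative count.
    n_sets = response.get("num_connected_data_sets")
    if n_sets is None or not 0 <= n_sets <= 20:
        flags.append("num_connected_data_sets missing or out of range")
    # Consistency check: follow-up items should be blank if no data sets are reported.
    if n_sets == 0 and response.get("data_set_names"):
        flags.append("data set names reported but count is zero")
    # Completeness check: administration type must be one of the expected values.
    if response.get("administration_type") not in {"state", "county", "hybrid"}:
        flags.append("administration_type missing or unrecognized")
    return flags


# Example: a response that trips the consistency check.
print(edit_check({"num_connected_data_sets": 0,
                  "data_set_names": ["Medicaid claims"],
                  "administration_type": "state"}))
```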

Data Analysis

After data collection concludes, we will analyze each type of data (qualitative and quantitative) according to the procedures described below. We will analyze each data source separately before combining the descriptive themes that emerge to identify findings supported by multiple data sources.

Quantitative Data

Quantitative data from Instruments 1 and 2 will be analyzed in the same manner. Data will be tabulated at the state level and presented in maps, tables, or figures as appropriate. Responses will be presented as counts or percentages, both overall across states and by administration type. There are three administration types: state, county, and hybrid. Differences between state administrations and county or hybrid administrations will not be tested for statistical significance because of the expected small number of county and hybrid administration types.
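
A minimal sketch of the planned tabulations, assuming a hypothetical state-level data frame with one row per responding state; the column names and values are illustrative.

```python
import pandas as pd

# Hypothetical state-level extract (one row per responding state).
df = pd.DataFrame({
    "state": ["AL", "CA", "CO", "NY"],
    "administration_type": ["state", "county", "state", "hybrid"],
    "links_child_welfare_data": [True, True, False, True],
})

# Overall count and percentage of states reporting a connected data effort.
overall = df["links_child_welfare_data"].agg(["sum", "mean"])
print(f"Overall: {int(overall['sum'])} states ({overall['mean']:.0%})")

# Counts and percentages by administration type (no significance testing,
# given the small expected number of county and hybrid administrations).
by_admin = (df.groupby("administration_type")["links_child_welfare_data"]
              .agg(count="sum", percent="mean"))
print(by_admin)
```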

Qualitative Data

Interviews will be recorded with the consent of interviewees in preparation for coding.

We will use a combination of predetermined and emerging codes to capture constructs of interest as discussed in the interviews. Predetermined codes are developed prior to interview coding and revised as needed. We will code high-level themes to help document current state or county approaches to connected data and inform future efforts to strengthen the development and use of connected data.

Mathematica will use the qualitative analysis software NVivo to manage and code transcripts. To ensure reliability across coders, coders will be trained in the predetermined codes and issued a codebook for reference. All team members will code an initial document and compare codes to identify and resolve discrepancies. As the coding proceeds, the team will meet periodically to discuss and resolve any issues or differences.
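
As one simple way to quantify agreement during the initial comparison of coders, the team could compute percent agreement on the codes applied to the same segments; the sketch below is illustrative and is not a description of the study's actual reliability procedure, which the text describes only as comparing codes and resolving discrepancies.

```python
def percent_agreement(coder_a: list, coder_b: list) -> float:
    """Share of coding decisions on which two coders agree (illustrative)."""
    if len(coder_a) != len(coder_b):
        raise ValueError("Coders must code the same segments")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)


# Hypothetical codes applied by two coders to the same eight transcript segments.
coder_a = ["barriers", "supports", "uses", "barriers", "governance", "uses", "supports", "uses"]
coder_b = ["barriers", "supports", "uses", "supports", "governance", "uses", "supports", "barriers"]
print(f"Percent agreement: {percent_agreement(coder_a, coder_b):.0%}")  # 75%
```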

Following the coding process, we can search the codes to identify major themes and specific ideas. Using a program such as NVivo will also enable us to retrieve coded data by type of study participant, enabling comparisons across administration types. We will aggregate the themes, removing from the analysis any theme that is relevant to only three or fewer states, to protect anonymity.
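
The sketch below illustrates the anonymity rule described above: themes reported in only three or fewer states are dropped from the aggregated analysis. The themes and state sets are hypothetical.

```python
# Hypothetical mapping from coded theme to the set of states where it appeared.
theme_states = {
    "data sharing agreements ease linkage": {"CO", "MD", "NY", "OH", "WA"},
    "staff turnover limits data use": {"KS", "NM"},  # only two states; withheld
    "dedicated analytics units support use": {"CA", "FL", "IL", "TX"},
}

# Report only themes that appear in more than three states, to protect anonymity.
MIN_STATES = 4
reportable_themes = {theme: states
                     for theme, states in theme_states.items()
                     if len(states) >= MIN_STATES}

for theme, states in reportable_themes.items():
    print(f"{theme}: reported in {len(states)} states")
```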

Interviews provide context for quantitative findings from the surveys. In the final reports, interview data will be used to present themes and to add descriptive context to counts and percentages. Every key finding will be supported by one or more example statements from interviews, drawn from a range of states.

Data Use

The SCW Descriptive Study team will use the information collected to provide insight into the overarching research questions. The team will use the data to develop a series of publicly disseminated deliverables, which may include briefs, reports, and other products for a variety of audiences. The briefs may focus on findings from a single data collection activity or may be thematic, drawing on data from multiple data collection activities. A technical appendix will detail all data collection activities and include the instruments, research questions, and response rates.

General limitations of the resulting study data will be included in materials and presentations. Written products will clearly state that this study aims to present a high-quality description of connected data efforts in participating child welfare agencies, not to promote statistical generalization to other agencies.

Survey data collected from this study will be archived as appropriate. We will identify the appropriate archive, develop a plan, and properly document all data for storage. We will dispose of sensitive collected data that will not be archived.

B8. Contact Persons

Mathematica is conducting this project under contract number 47QRAA18D00BQ/75ACF121F80021. Mathematica developed the plans for this data collection. Experts in their respective fields from the ACF Office of Planning, Research, and Evaluation and Mathematica helped develop the design, data collection plan, and instruments for which we request clearance.

The following people can answer questions about this data collection effort:

Staff from Mathematica will be responsible for collecting, processing, and analyzing the information for the Office of Planning, Research, and Evaluation.

Attachments

Instruments

Instrument 1: Initial Survey

Instrument 2: Connected Data Survey

Instrument 3: Interview Guide

Appendices

Appendix A: Outreach to State Child Welfare Directors

Appendix B: Outreach to Respondents for the Connected Data Survey

Appendix C: Outreach to Child Welfare Directors, State Staff, and County Staff for Interviews

Appendix D: Public Comment on Federal Register Notice 2022-02928



1 We will invite the child welfare agency directors from all 50 states, the District of Columbia, and Puerto Rico to complete the initial survey.

2 Our estimates for burden hours in Supporting Statement A, Table 2, define respondent by survey administration and not necessarily by the number of different people completing the survey.
