ACF Behavioral Interventions to Advance Self-Sufficiency Next Generation (BIAS-NG) Project


Generic Data Collection (OMB Number 0970-0502)


SUPPORTING STATEMENT PART A



Revision June 2019
















Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201






 Executive Summary


  • This information collection request (ICR) is for revisions to the ACF Behavioral Interventions to Advance Self-Sufficiency Next Generation (BIAS-NG) Project Overarching Generic (#0970-0502). Under this generic clearance, interventions have been and will continue to be developed in the program area domains of Temporary Assistance for Needy Families (TANF) and child welfare. This revision would also allow for collection of data in the Early Head Start/Head Start program area.


  • Status of Study: The information collected under this generic clearance is intended to inform the diagnosis and design, as well as the evaluation, of 9 behavioral interventions that will be rigorously tested in the BIAS-NG project. Due to the rapid and iterative nature of this work, the Office of Planning, Research, and Evaluation (OPRE) at the Administration for Children and Families (ACF) sought and received approval for a generic clearance to conduct this research.

    • Under this generic clearance, in the approved domains (TANF and child welfare), diagnosis and design have been completed for four interventions in three sites. For these three sites, evaluation is underway, and implementation research instruments were approved by the Office of Management and Budget (OMB) in three separate individual generic information collection requests. Additionally, diagnosis and design is ongoing at two additional sites across TANF and child welfare.

    • For this revision, the design, methods, instruments, and analytic approach remain the same as in the approved overarching generic clearance package. We request two changes: adding a third domain, Early Head Start/Head Start, and increasing the overall burden estimate, because this revision requests up to two new sites and the original burden table was calculated for up to six sites.




A1. Necessity for the Data Collection

OPRE seeks OMB approval to add a third domain to our approved pilot generic clearance to conduct interviews, focus groups, and surveys with regional, state, and local agencies as part of the Behavioral Interventions to Advance Self-Sufficiency Next Generation (BIAS-NG) Project. The BIAS-NG project is applying behavioral insights to a range of ACF programs in order to design and test interventions intended to improve the operations and efficacy of human services programs. We are seeking to add Early Head Start/Head Start (EHS/HS) program administrators, staff, and clients for the same types of information collections as approved for the TANF and child welfare (CW) domains. The purpose of these data collection efforts is to inform the design of and to better understand the mechanisms and effects of interventions informed by behavioral science and intended to improve program outcomes.


This submission provides revised supporting statements to include the third domain, EHS/HS. There are no changes to the proposed types of data to be collected, types of respondents, methods for collection, or proposed uses of the information.

Study Background

The September 2015 Executive Order “Using Behavioral Insights to Better Serve the American People” stated that “A growing body of evidence demonstrates that behavioral science insights -- research findings from fields such as behavioral economics and psychology about how people make decisions and act on them -- can be used to design government policies to better serve the American people” and encouraged federal agencies to “develop strategies for applying behavioral science insights to programs and, where possible, rigorously test and evaluate the impact of these insights.” In keeping with this directive, OPRE is conducting the BIAS-NG project, which uses behavioral insights to design and test interventions intended to improve the operations and efficacy of human services programs. BIAS-NG builds on a prior OPRE project, the Behavioral Interventions to Advance Self-Sufficiency (BIAS) project, which relied exclusively on administrative data to test the short-term impact of small “nudge” interventions in human services programs. BIAS-NG goes beyond the BIAS project by applying behavioral insights to additional ACF programs and by moving beyond simple “nudges”: it helps programs be more self-reflective about how they present choices and options to participants, tests alternative approaches to presenting those options, and, importantly, collects qualitative information from program staff and participants to better understand the mechanisms and effects of behavioral interventions. Information collected from interviews, focus groups, and surveys with program staff and participants will first enable the research team to better diagnose problems amenable to behavioral interventions. Based on this information, the research team will design relevant interventions. Information collected during the implementation of the interventions will indicate whether each intervention was successful and, just as importantly, why or why not.


OPRE sought and received generic clearance to conduct these interviews, focus groups, and surveys over a period of three years. The BIAS-NG study is structured so that each specific intervention is designed in consultation with agency leaders; the timeframes are shorter than in many evaluations because the outcomes of interest are proximate to the intervention point; and these studies often lend themselves to rapid-cycle evaluation, in which testing a particular intervention design can inform subsequent tests of related program improvement efforts.


The iterative and rapid nature of these tests poses a challenge to complying with the timeline for seeking full approval of each individual information collection activity subject to the Paperwork Reduction Act (PRA). Thus, OPRE sought and received generic clearance to conduct this work. For each specific information collection under this generic approval, instruments have been and will continue to be tailored to the specific intervention and the specific site; once a set of instruments for a particular test is developed, and prior to use in the field, OPRE submits Supporting Statements Part A and B, along with the specific instruments to be used, to OMB for approval. Each specific information collection may include up to two submissions: first, a submission for the formative stage research, to include supporting statements (Stage 3 in Exhibit 1 below); and second, a submission for the test and evaluation materials, to include supporting statements (Stage 4 in Exhibit 1 below).

Legal or Administrative Requirements that Necessitate the Collection

There are no legal or administrative requirements that necessitate the collection. ACF is undertaking the collection at the discretion of the agency.


A2. Purpose of Survey and Data Collection Procedures

Overview of Purpose and Approach

  • The goal of this generic IC is to conduct qualitative and descriptive quantitative research to identify and understand the psychological and behavioral factors that can affect the effectiveness of human service programs.

  • Intended use of the resulting data is to identify ways to apply behavioral insights that have the potential to improve the delivery and/or quality of services administered by human service agencies in the areas of Child Welfare, TANF, and EHS/HS.

  • Qualitative data has been and will continue to be collected using rapid assessment methods, including semi-structured qualitative interviews, focus groups, direct observations, and document reviews.

    • This qualitative data has been and will continue to be supplemented with administrative data the agencies are already collecting.

  • The populations to be studied include regional, state, and local TANF, CW, and EHS/HS program administrators, staff, and clients.

  • Qualitative data has been and will continue to be analyzed using qualitative analysis methods, such as coding interviews for themes relevant to psychological and behavioral barriers to service delivery, uptake, and quality.


Generic Information Collections (GenICs) submitted under this control number will meet the following criteria:

  • A full Supporting Statement A and Supporting Statement B have accompanied and will continue to accompany each of the GenICs submitted under this generic clearance. These include:

    • A discussion of the respondents. Administrators, staff, and clients are the subjects of our research during this IC.

    • Information about the context of each specific IC. Researchers speak with and conduct surveys with specific populations in a particular geographic location/setting/agency.

    • A description of the planned qualitative data collection including submission of the specific instruments for review. Instruments include focus group/interview protocols and short surveys specific to each informant group (agency administrators, staff, and clients).

    • A description of the qualitative analyses planned. Audio recordings and notes from interviews/focus groups will be analyzed for patterns and themes.

    • A description of the administrative data that the agencies are already collecting and that the project will utilize. It is important to note that collecting administrative data does not and will not impose a burden on respondents or record keepers, as we ask sites to provide data as it currently exists. We have not and will not be requesting that it be provided in any particular format that is different from the format in which the agency typically keeps it.

    • A description of the planned intervention associated with each specific IC.

    • Information about planned communication about the findings. Study outcomes will be communicated to state and national stakeholders in a position to consider and implement site-specific improvements to ACF agency programs.

  • Final proposed instruments have accompanied and will continue to accompany each of the GenICs submitted under this generic clearance.

  • Any supplementary materials (advance letters, emails, etc.) have accompanied and will continue to accompany each of the GenICs submitted under this generic clearance, as appropriate.


The study is designed to develop tools to: apply behavioral insights to ACF human services programs; design and test interventions informed by behavioral science; encourage rapid cycle tests that may lead to further improvements in human services programs; and enable regional, state, and local program staff to learn skills to engage in behavioral diagnosis and design, and conduct rigorous tests of future interventions. The interventions we design for this study have addressed and will continue to address problems that have broad relevance for TANF and Child Welfare programs and, following approval of this current request, EHS/HS programs. While we intend for the specific findings from each intervention to provide information that could be useful in the design and operation of programs that provide similar services to similar populations, findings from these interventions will be suggestive and preliminary. The limitations of such findings will be made clear in any related communications.


The majority of the work in each site is conducted in five phases. Exhibit 1 provides an overview of the process in each site, which consists of planning phases to determine the program area domains and learn about the problems of interest to stakeholders (Phase 1) and identify sites (Phase 2). In Phase 3 we engage with administrators, program staff, and clients through interviews (via telephone or in person) and/or focus groups. These interactions are needed to develop the interventions to test. During Phase 4 we conduct implementation research with sites, interviewing administrators, program staff, and clients to better understand how the test is being implemented. The bullets below provide more detail on the work during each phase.





Planning Phases

TANF and Child Welfare (currently approved under OMB #0970-0502)

  • Phase 1 (late 2015 – 2018):

    • Select Program Area Domains

      • The TANF and Child Welfare domains were pre-selected by ACF and were included under the original approval for generic clearance.

    • Define the Problem Areas in Each Domain

      • To ensure that our pilot interventions do not address problems idiosyncratic to a particular program, we identified a set of problems that broadly affect TANF and Child Welfare programs.


  • Phase 2 (mid 2016 – 2019):

    • Identify up to 6 Sites

      • As of Q2 2019, 5 sites have been identified across TANF and Child Welfare. Interest in participating in BIAS-NG has been high, and systematic recruitment of sites has not been necessary.


Generic Information Collection Phases

  • Phase 3: Diagnose up to 6 Sites and Design 9 Tests (early 2017 – 2019)

    • Conduct behavioral diagnosis and design at each of the 6 sites

      • Behavioral diagnosis and design is a procedure in which we examine the process related to the problem of interest to better understand the factors that may be inhibiting the desired outcomes, and then design solutions informed by behavioral science research to help improve those outcomes. For example, through this process we have identified barriers that TANF recipients may face that contribute to their lack of engagement in welfare-to-work programs.

      • This phase involves reviewing preexisting administrative data from each site and conducting site observations in order to identify bottlenecks and determine when and how an intervention would be most useful. While it has not yet been necessary, for future sites we may complete the first round of interviews/focus groups and surveys included under this clearance.


  • Phase 4: Conduct 9 Evaluation Tests (mid 2017 – 2023¹)

    • Conduct evaluation of the designed intervention.

      • Three sites across the two approved domains have launched their evaluations.

    • The mixed methods evaluations consist of implementation, impact, and cost research.

      • The implementation studies rely in part on the second round of interviews/focus groups and surveys included under this clearance.

      • For the three sites that have launched evaluations, we have submitted and received approval for implementation research as individual information collections requests under the generic clearance.


Dissemination Phase

  • Phase 5: Disseminate Findings and Archive Data (2020 – 2024).

    • Write briefs describing the results of all 9 tests.



Early Head Start/Head Start

Note: The phases for the EHS/HS domain mirror those previously approved for TANF and CW.

  • Phase 1 (late 2018 - mid 2019 ):

    • Define the Problem Areas in the Third Domain

      • To ensure that our pilot interventions do not address problems idiosyncratic to a particular program, we will identify a set of problems that broadly affect EHS/HS programs.


  • Phase 2 (2019):

    • Identify Up to 2 Sites

      • Identify up to 2 sites in the third-identified domain (EHS/HS). As experience in the first two domains has shown, interest in participating in BIAS-NG has been high, and it is not expected that systematic recruitment of sites will be necessary.


Generic Information Collection Phases

  • Phase 3: Diagnose and Design Interventions for up to 3 Tests (2019-2020)

    • Conduct behavioral diagnosis and design at each site.

      • Behavioral diagnosis and design is a procedure in which we examine the process related to the problem of interest to better understand the factors that may be inhibiting the desired outcomes, and then design solutions informed by behavioral science research to help improve those outcomes. For example, through this process we can identify barriers that families may face that contribute to their lack of engagement in programs.

      • During this phase we plan to review preexisting administrative data from each site and may complete the first round of interviews/focus groups and surveys included under this clearance in order to best identify the bottlenecks, and when and how an intervention would be the most useful.

  • Phase 4: Conduct 2 Evaluation Tests (2020 – 2022²)

    • Conduct evaluation of the designed intervention(s).

    • The mixed methods evaluation will consist of implementation, impact, and cost research.

      • The implementation study will rely in part on the second round of interviews/focus groups and surveys included under this clearance.


Dissemination Phase

  • Phase 5: Disseminate Findings and Archive Data (early 2022 – 2024)

    • Write briefs describing the results of all EHS/HS tests.


In addition to collecting data from administrators, staff, and clients with focus groups, interviews, and surveys, we will also supplement this information with administrative data the agencies are already collecting. Collecting administrative data will not impose a burden on respondents or record keepers, as we ask sites to provide data as it currently exists. We will not be requesting that it be provided in any particular format that is different from the format in which the agency typically keeps it. In addition, we will not be asking more than nine individuals to provide the administrative data.


Research Questions

For the purposes of designing the intervention and conducting an evaluation of its implementation, we have conducted and will continue to conduct interviews, focus groups, and surveys with administrators, staff, and clients. These qualitative data collection activities are critical to designing an effective intervention, allowing the research team to properly diagnose ways in which agencies are not maximizing their impact for the populations they serve. These activities allow the team to gather structured in-depth information to understand the program process from both the administrative and client perspectives. Focus groups and interviews are essential to identifying the points in the outreach and delivery of services, or in the client’s experiences, that are most amenable to a behavioral intervention. They allow the BIAS-NG team to map a correspondence between the insights of behavioral science and the on-the-ground implementation of programs and subsequent client experiences.


These qualitative data collection activities are also essential to conducting implementation research: describing and documenting each site’s intervention and how it operated, and providing information about the contrast in treatment between the research groups, both whether the planned contrast between the treatment and control conditions occurred (implementation fidelity) and how the treatment as implemented actually differed from the status quo (implementation contrast). This information is critical to interpreting the findings of our interventions.

Please see Attachments A.1-A.5 for sample interview, focus group, and survey questions. Once sites are selected and instruments are tailored for each site, and for both Phase 3 and Phase 4, we will submit individual IC requests with additional detail about the site, the final tailored instruments, and the site-specific study methodology.


Study Design

Phase 3: Diagnosis and Design

During Phase 3, we have collected and will continue to collect qualitative data from administrators, staff, and clients via focus groups, interviews, and surveys, which helps to inform our intervention design. Changes to instruments used by the federal study team have been and will continue to be submitted to OMB for approval. We also collect administrative data from agency management information systems (MIS) to better understand client experiences with the program and identify points where service delivery might need improvement.

Phase 4: Evaluation Tests


Impact Study


During Phase 4, we have designed and will continue to design and conduct impact analyses of behavioral interventions. Such interventions have included or may include, but are not limited to:

  • participant reminders, such as emails, text messages, or telephone calls to facilitate the completion of a particular action;

  • implementation prompts, which encourage participants to make a plan for when they are going to complete an action;

  • easy tracking tools for clients to make it simpler for them to show they are meeting program requirements;

  • self-affirmation exercises to counter individuals’ tendency not to complete an action if they perceive it as a threat to their self-conception or identity;

  • restructured work flows and processes to improve service delivery;

  • automatic enrollment, which defaults eligible participants into a program so that they must opt out rather than opt in;

  • pre-population of forms to make it easier and faster for clients to complete lengthy or confusing forms; and

  • co-location of services to reduce the barriers associated with traveling to multiple offices for different benefits.

It is possible that, in conjunction with some of the behavioral interventions, sites may decide to change what data they collect and/or the questions they ask the public to answer. Such decisions will be controlled by the sites, not the project. Our framework of selecting sites within the domains of TANF, Child Welfare, and EHS/HS and targeting similar problems across these sites could also provide opportunities for replication and help determine whether similar interventions are effective in different settings pursuing the same outcomes. When appropriate, we have used and may continue to use factorial or sequential study designs to assess the effectiveness of each intervention component with the goal of building the most efficient intervention possible.
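As a purely illustrative sketch of the factorial logic referenced above, and not a description of any site’s actual study design, the hypothetical Python example below randomly assigns participants to the four cells of a 2x2 factorial test; the two component names are invented for illustration.

    # Hypothetical sketch: 2x2 factorial random assignment.
    # Component names are invented; this is not a site-specific design.
    import random

    COMPONENTS = ("reminder", "plan_prompt")  # two hypothetical intervention components

    def assign_factorial(participant_ids, seed=0):
        """Assign each participant independently to receive (or not) each component."""
        rng = random.Random(seed)
        return {
            pid: {component: rng.random() < 0.5 for component in COMPONENTS}
            for pid in participant_ids
        }

    # The four resulting cells support estimating each component's effect and their interaction.
    for pid, cell in assign_factorial(range(8)).items():
        print(pid, cell)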

Implementation Study

Additionally, in Phase 4, we have begun to and will continue to conduct an implementation study to describe and document each site’s intervention and how it operates, and to provide information about the contrast in treatment between the research groups: both whether the planned contrast between the treatment and the control condition occurred (implementation fidelity) and how the treatment as implemented actually differed from the status quo (treatment contrast). This information is important for interpreting the findings of the impact study. Exhibit 2 presents research questions that have been and will continue to be addressed by information collection in Phase 4. Changes to instruments used by the federal study team have been submitted and approved for the first three sites and will continue to be submitted to OMB for approval. Phase 4 also includes a cost analysis.


Exhibit 2: Research Question and Instrument Matrix

Research Questions | Administrator interviews/focus groups | Staff interviews/focus groups | Client interviews/focus groups | Client survey | Staff survey
How are sample members identified and recruited for the intervention? | X | X | | | X
To what extent were the interventions implemented with fidelity? | X | X | X | X | X
For example, what are the patterns of participation (if appropriate as a proximal measure) and do these patterns adhere to the intervention design? | X | X | | | X
What were the challenges and barriers the site experienced? | | X | | | X
How did the system within which the program operates influence implementation? | X | X | | | X
What is the organizational culture and how does it support or hinder responses to the behavioral intervention? | X | X | | | X
To what extent did the intervention require collaboration between multiple agencies or units, and what worked well and what did not? | X | X | | | X
What are the participant perspectives on their response to the intervention? | | | X | X |


A3. Improved Information Technology to Reduce Burden

Planning for site visits has been and will continue to be done collaboratively with each of the sites. We have used and will continue to use conference calls and emails to the extent possible to minimize burden.

The interviews have been and will continue to be conducted either individually or as a focus group. To minimize burden, we hold semi-structured group discussions (focus groups), rather than individual conversations, whenever possible. For example, one group discussion may be held with multiple front-line workers at the same or similar levels, such as case workers or outreach specialists. A separate group discussion may be held with supervisors of front-line staff. A third discussion group may include staff at the management or administrative level, such as directors of offices or agencies. If there is a single staff member at a particular level, however, an individual discussion is held. Staff at each of these levels often have different perspectives and thus different experiences. Group discussions have allowed and will continue to allow us to reduce the length of time spent at the site while still obtaining valuable feedback from staff with a range of experiences. The surveys have been and will continue to be administered on the web, on mobile devices, or in person while the research team is on-site.


A4. Efforts to Identify Duplication

The information collection requirements for this study have been carefully reviewed to determine what information is already available from existing studies and program documents and what needs to be collected for the first time. Although information from existing sources improves our understanding of the planning process, ACF does not believe that it provides sufficient information on how TANF, Child Welfare, and EHS/HS agencies interact with their clients. This data collection is intended to yield new and useful information about TANF, Child Welfare, and EHS/HS processes. The interviews and focus groups support a deeper exploration of patterns seen in the survey and/or administrative data or review of documents.


A5. Involvement of Small Organizations

While all the sites have not yet been chosen, staff and families at small EHS/HS centers may be part of this data collection effort if they are a sub-grantee of the chosen EHS or HS grantee. If we need to conduct interviews with individuals in small centers, we will schedule interviews at times that are convenient in order to minimize disruption of daily activities.


A6. Consequences of Less Frequent Data Collection

Rigorous evaluation of innovative initiatives is crucial to building evidence of what works and how best to allocate scarce government resources. These data collection undertakings represent an important opportunity for ACF to both learn about activities associated with TANF, Child Welfare, and EHS/HS, and to design behavioral interventions to improve service delivery and uptake.


Not collecting information from the three categories of respondents (administrators, staff, and clients) during Phase 3 would limit the government’s ability to design appropriately targeted interventions that match the barriers administrators, staff, and clients face in the quest for optimal service delivery. Not collecting information during Phase 4 would hinder the government’s ability to learn how interventions were implemented and whether, and to what degree, the interventions had the desired outcomes.


A7. Special Circumstances

There are no special circumstances for this data collection.


A8. Federal Register Notice and Consultation

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on May 23, 2017, Volume 82, Number 98, page 23572, and provided a 60-day period for public comment. A copy of this notice is included as Attachment 1. No substantive comments were received during the notice and comment period. A thirty-day comment period is available to provide comments on the addition of the EHS/HS domain. This notice was published on DATE, Volume XX, Number XX, page XXXX. Comments are directed to OMB.

Consultation with Experts Outside of the Study

We have consulted and may continue to consult with relevant stakeholders and experts on the study design and data collection instruments. When needed, specific consultants will be identified in each Generic IC.


A9. Incentives for Respondents

In order to support data collection representing a range of experiences, we currently offer clients participating in focus groups, interviews, and surveys a gift card worth up to $20. Incentives are intended to offset the financial burden that may result from travel, additional cell-phone data or phone minutes, or child care costs associated with participation in focus groups, interviews, and surveys.


The overarching incentive amount originally approved in this generic clearance was $20. Under the proposed increase, we plan to continue using $20 as the default, especially in situations where we are able to access clients during an already scheduled meeting or appointment at the site. However, based on experiences in the field to date, we have found that the $20 incentive may not be sufficient to support an adequate response rate in all situations in which we will be conducting client interviews and focus groups. This is likely to be especially true when the study team asks clients to attend a separate meeting to participate in interviews or focus groups and/or when the client is a parent with young children. For example, in the Allegheny County child welfare site, only four respondents out of 13 scheduled completed a client interview, even after several reminder calls, as $20 was not enough to offset an extra trip to the child welfare office, including costs for child care and transportation.

Incentives have not been and will not be used as a substitute for other best-practice persuasion strategies designed to increase participation, such as explanatory advance letters, endorsements by people or organizations important to the population being surveyed, and assurances of privacy.


We have included and will continue to include a written justification in the specific generic IC request for any planned incentives or tokens of appreciation. We have secured and will continue to secure Institutional Review Board (IRB) approval for the use and monetary value of incentives prior to fielding surveys and hosting focus groups. Additional information has been and will continue to be provided in each individual generic ICR.


A10. Privacy of Respondents

All respondents who participate in research under this clearance have been and will continue to be read a statement that explains the study, informs individuals that their participation is voluntary, and describes the extent of their privacy as respondents. (See Attachments A.1-A.5.) Participants are and will continue to be told verbally that their conversations will not be shared in a form that identifies them with anyone outside the research team. As ACF’s prime contractor, MDRC implements all data collection activities. If data collection activities are performed by a subcontractor, that subcontractor has maintained and will continue to maintain the same standards of privacy as required by MDRC. Information has been and will continue to be kept private to the extent permitted by law and in accordance with current federal information security standards and other applicable regulations.

MDRC employees are required to maintain and process quantitative and qualitative data in designated project folders on the MDRC network. With the exception of the temporary storage of data during onsite collection, MDRC employees are not allowed to download, keep, or process individual-level data on the hard drives of their MDRC workstations or any other storage. Information is not and will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.

The project Data Manager organizes BIAS-NG project folders and supervises storage of BIAS-NG data files on a “need-to-know” basis. Following standard MDRC practice, the project Data Manager and project programmers replace all PII in incoming source data with a randomly generated project ID number. These files are also saved in secure folders with access limited on a “need-to-know” basis. Thereafter, most data processing for the project is performed on analysis files that have been stripped of PII. All reports, tables, and printed materials are limited to presentation of aggregate numbers. MDRC has destroyed and will continue to destroy all paper and electronic records containing PII when no longer needed for research purposes, in accordance with funder and contractual requirements as well as MDRC retention policies.
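The sketch below is a minimal, hypothetical Python illustration of the kind of PII separation described above; it is not MDRC’s actual procedure, and the file layout and identifier field names are assumptions. It replaces identifier columns with randomly generated project IDs and writes the crosswalk linking IDs back to PII to a separate, access-restricted file.

    # Hypothetical sketch of replacing PII with random project IDs before analysis.
    # Field names and file layout are assumptions; this is not MDRC's production process.
    import csv
    import secrets

    PII_FIELDS = {"name", "ssn", "date_of_birth", "address"}  # assumed identifier columns

    def pseudonymize(source_path, analysis_path, crosswalk_path):
        """Strip PII columns, assign random project IDs, and write a separate crosswalk."""
        with open(source_path, newline="") as src:
            rows = list(csv.DictReader(src))

        analysis_rows, crosswalk_rows = [], []
        for row in rows:
            project_id = secrets.token_hex(8)  # randomly generated project ID
            crosswalk_rows.append(
                {"project_id": project_id, **{k: row[k] for k in PII_FIELDS if k in row}})
            analysis_rows.append(
                {"project_id": project_id, **{k: v for k, v in row.items() if k not in PII_FIELDS}})

        # The analysis file is stripped of PII; the crosswalk is stored separately in an
        # access-restricted location, mirroring the "need-to-know" practice described above.
        for path, data in ((analysis_path, analysis_rows), (crosswalk_path, crosswalk_rows)):
            if not data:
                continue
            with open(path, "w", newline="") as out:
                writer = csv.DictWriter(out, fieldnames=list(data[0].keys()))
                writer.writeheader()
                writer.writerows(data)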


A11. Sensitive Questions

There are no sensitive questions in this data collection.


A12. Estimation of Information Collection Burden

Exhibit 3 provides details about how the estimated burden hours and costs were calculated for the third domain. Exhibit 4 shows the previously approved burden estimates for the first two domains: TANF and child welfare. Exhibit 5 shows the total burden estimates across all three domains. We base the third-domain estimates on the assumption that we would conduct three tests in the EHS/HS domain. An EHS/HS “site” is an EHS or HS grantee or delegate agency. On average, a grantee consists of approximately five centers. Client perspectives will be obtained by talking to parents at the EHS or HS centers.


During the Diagnosis and Design Phase (Phase 3), we plan to talk to the grantee’s administrators, but potentially also to talk to or survey center directors and center-level staff to understand the behavioral barriers facing families. We anticipate interviewing, individually or in focus groups, a maximum of:

  • 8 grantee administrators at up to 3 sites for a total of 24 people

  • 10 center directors per site at up to 3 sites, for a total of 30 people

  • 20 frontline staff at 5 centers per site at up to 3 sites, for a total of 300 people

  • 20 parents at 5 centers per site at up to 3 sites, for a total of 300 people.

We plan to administer surveys to up to:

  • 20 parents at 5 centers per site at up to 3 sites for a total of 300 people.

  • 2 staff members at 5 centers per site at up to 3 sites, for a total of 30 staff surveys.

During the Evaluation Phase (Phase 4), we anticipate interviewing approximately twice the number of respondents per category as in Phase 3, up to:

  • 48 administrators,

  • 60 center directors,

  • 600 staff, and

  • 600 parents.


We plan to survey up to:

  • 100 parents at up to 20 centers in up to 3 sites (6,000 total parents).

  • 10 staff at up to 20 centers in each of up to 3 sites (600 total staff).


These are the numbers of people to whom we intend to offer the survey, not the number of people who will actually respond (we will strive for the 80 percent response rate standard). As discussed in Part B, we will endeavor to reduce burden on individual respondents by asking only relevant questions. Accordingly, we think that the estimate below represents an upper bound on potential burden.


We calculated the overall burden per respondent by multiplying the frequency of response by the time to complete each data collection item. We anticipate that interviews and focus groups for administrators, staff, and clients (Attachments A.1-A.5) will each take 1 hour to complete. We anticipate the client and staff surveys to each take approximately 15 minutes to complete online. The information collection for both phases is specific to each site and is not intended to continue once the study is over.
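To make the arithmetic behind Exhibit 3 explicit, the following Python sketch (an illustration only, not part of the approved package) applies the calculation described above to the Phase 3 rows: survey counts are discounted to the 80 percent response rate standard, burden hours equal respondents times responses per respondent times hours per response, and cost equals burden hours times the average hourly wage.

    # Illustrative sketch of the Exhibit 3 burden arithmetic (third domain, Phase 3).
    # Figures come from the text above; the script itself is only an illustration.
    RESPONSE_RATE = 0.80   # target response rate applied to survey counts
    STAFF_WAGE = 23.10     # community and social service occupations, BLS May 2017
    CLIENT_WAGE = 8.57     # U.S. average minimum wage, DOL, July 1, 2018

    # (instrument, people offered, responses per respondent, hours per response, wage, is_survey)
    PHASE_3 = [
        ("Administrator interviews/focus groups", 24, 1, 1.0, STAFF_WAGE, False),
        ("Center director interviews/focus groups", 30, 1, 1.0, STAFF_WAGE, False),
        ("Staff interviews/focus groups", 300, 1, 1.0, STAFF_WAGE, False),
        ("Client interviews/focus groups", 300, 1, 1.0, CLIENT_WAGE, False),
        ("Client survey", 300, 1, 0.25, CLIENT_WAGE, True),
        ("Staff survey", 30, 1, 0.25, STAFF_WAGE, True),
    ]

    total_hours = total_cost = 0.0
    for name, offered, responses, hours_per_response, wage, is_survey in PHASE_3:
        respondents = offered * RESPONSE_RATE if is_survey else offered
        burden_hours = respondents * responses * hours_per_response
        cost = burden_hours * wage
        total_hours += burden_hours
        total_cost += cost
        print(f"{name}: {respondents:.0f} respondents, {burden_hours:.0f} hours, ${cost:,.2f}")

    print(f"Phase 3 totals: {total_hours:.0f} hours, ${total_cost:,.2f}")

Running the same arithmetic over the Phase 4 rows and summing both phases reproduces the Exhibit 3 totals of 3,348 burden hours and $45,954.00.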


Exhibit 3: Additional Burden Hours (third domain)

Instrument | Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Total Burden Hours | Average Hourly Wage | Total Cost

PHASE 3: DIAGNOSIS AND DESIGN
Administrator interviews/focus groups | 24 | 1 | 1 | 24 | $23.10 | $554.40
Center director interviews/focus groups | 30 | 1 | 1 | 30 | $23.10 | $693.00
Staff interviews/focus groups | 300 | 1 | 1 | 300 | $23.10 | $6,930.00
Client interviews/focus groups | 300 | 1 | 1 | 300 | $8.57 | $2,571.00
Client survey | 240* | 1 | 0.25 | 60 | $8.57 | $514.20
Staff survey | 24* | 1 | 0.25 | 6 | $23.10 | $138.60

PHASE 4: EVALUATION
Administrator interviews/focus groups | 48 | 1 | 1 | 48 | $23.10 | $1,108.80
Center director interviews/focus groups | 60 | 1 | 1 | 60 | $23.10 | $1,386.00
Staff interviews/focus groups | 600 | 1 | 1 | 600 | $23.10 | $13,860.00
Client interviews/focus groups | 600 | 1 | 1 | 600 | $8.57 | $5,142.00
Client survey | 4,800* | 1 | 0.25 | 1,200 | $8.57 | $10,284.00
Staff survey | 480* | 1 | 0.25 | 120 | $23.10 | $2,772.00

Total | 7,506 | | | 3,348 | | $45,954.00


*Survey number of respondents is calculated at the target 80 percent response rate standard.


For the previously approved TANF and child welfare domains (Exhibit 4), during the Diagnosis and Design Phase (Phase 3) we anticipate meeting with approximately 4 administrators per site, at 2 sites per year for each of the 3 years, for a total of 24 people. We anticipate meeting with up to 8 frontline staff per site, at 2 sites per year, for each of the 3 years, for a total of 48 people. We plan to meet with up to 8 clients per site, at 2 sites per year, for each of the 3 years, for a total of 48 people. We plan to administer surveys to up to 100 clients per site, at two sites per year, for each of the 3 years, for a total of 600 clients. We anticipate administering surveys to up to 20 staff members at 2 sites for each of the 3 years, for a total of 120 staff surveys.


During the Evaluation Phase (Phase 4) in those domains, we anticipate meeting with approximately twice the number of respondents per category as in Phase 3 (48 administrators, 96 staff, and 96 clients). We plan to survey ten times the number of clients as in Phase 3: 1,000 clients at each of the 6 sites (6,000 total clients). We anticipate surveying the same number of staff as in Phase 3: 20 staff at each of the 6 sites (120 total staff). These are the numbers of people to whom we intend to offer the survey, not the number of people who will actually respond (we will strive for the 80 percent response rate standard, although experience in Allegheny County has shown this may be difficult to achieve). As discussed in Part B, we will endeavor to reduce burden on individual respondents by asking only relevant questions. Accordingly, we think that the estimate below represents an upper bound on potential burden.

Exhibit 4: Previously Approved Burden Hours (TANF and Child Welfare)

Instrument | Total Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Total Burden Hours | Average Hourly Wage | Total Cost

PHASE 3: DIAGNOSIS AND DESIGN
Administrator interviews/focus groups | 24 | 1 | 1 | 24 | $23.10 | $554.40
Staff interviews/focus groups | 48 | 1 | 1 | 48 | $23.10 | $1,108.80
Client interviews/focus groups | 48 | 1 | 1 | 48 | $8.57 | $411.36
Client survey | 600* | 1 | 0.25 | 150 | $8.57 | $1,285.50
Staff survey | 120* | 1 | 0.25 | 30 | $23.10 | $693.00

PHASE 4: EVALUATION
Administrator interviews/focus groups | 48 | 1 | 1 | 48 | $23.10 | $1,108.80
Staff interviews/focus groups | 96 | 1 | 1 | 96 | $23.10 | $2,217.60
Client interviews/focus groups | 96 | 1 | 1 | 96 | $8.57 | $822.72
Client survey | 6,000* | 1 | 0.25 | 1,500 | $8.57 | $12,855.00
Staff survey | 120* | 1 | 0.25 | 30 | $23.10 | $693.00

Total | 7,200 | | | 2,070 | | $21,750.18

*Survey number of respondents is calculated at the target 80 percent response rate standard.



Exhibit 5: Total Burden Hours (TANF, Child Welfare, and EHS/HS)

Instrument | Total Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Total Burden Hours | Average Hourly Wage | Total Cost

PHASE 3: DIAGNOSIS AND DESIGN
Administrator interviews/focus groups | 48 | 1 | 1 | 48 | $23.10 | $1,108.80
Staff interviews/focus groups | 378 | 1 | 1 | 378 | $23.10 | $8,731.80
Client interviews/focus groups | 348 | 1 | 1 | 348 | $8.57 | $2,982.36
Client survey | 840* | 1 | 0.25 | 210 | $8.57 | $1,799.70
Staff survey | 144* | 1 | 0.25 | 36 | $23.10 | $831.60

PHASE 4: EVALUATION
Administrator interviews/focus groups | 96 | 1 | 1 | 96 | $23.10 | $2,217.60
Staff interviews/focus groups | 756 | 1 | 1 | 756 | $23.10 | $17,463.60
Client interviews/focus groups | 696 | 1 | 1 | 696 | $8.57 | $5,964.72
Client survey | 10,800* | 1 | 0.25 | 2,700 | $8.57 | $23,139.00
Staff survey | 600* | 1 | 0.25 | 150 | $23.10 | $3,465.00

Total | 14,706 | | | 5,418 | | $67,704.18

*Survey number of respondents is calculated at the target 80 percent response rate standard.



Total Cost

We estimate the average hourly wage for staff to be the average hourly wage of “community and social service occupations” taken from the U.S. Bureau of Labor Statistics, May 2017 National Occupational Employment and Wage Estimates ($23.10). To compute the total estimated cost for clients in the third domain, the total burden hours were multiplied by $8.57, the U.S. average minimum wage, calculated from the U.S. Department of Labor, Minimum Wage Laws in the States, updated July 1, 2018. The estimated total cost for the third domain is $45,954.00 and for all three domains is $67,704.18.


A13. Cost Burden to Respondents or Record Keepers

The data collections proposed under this generic ICR involve imposing time burdens on very busy administrative and frontline staff in human services agencies. Based upon our experience in the field to date under this package, we propose offering a small honorarium of $20 to program staff participating in future data collections under this generic ICR, in recognition of the time and professional expertise they contribute to the studies. These honoraria are intended to both encourage staff participation and recognize their efforts to support a timely and high-quality data collection.


A14. Estimate of Cost to the Federal Government

The total cost for the data collection activities under this current request will be approximately $7,416,426. Annual costs to the Federal government will be approximately $2,472,142.


A15. Change in Burden

This request is to revise the umbrella generic to include an additional domain (EHS/HS) and therefore additional potential individual GenICs under the umbrella generic. As a result, the burden estimates have increased.


A16. Plan and Time Schedule for Information Collection, Tabulation and Publication

Time Schedule and Publication

The estimated time schedules include proposed efforts beyond the current expiration date. As noted previously, we plan to submit a request for an extension of the overarching generic, along with updates on the status of information collections; work beyond the current expiration date is dependent on approval of that extension request.

Exhibit 6A: TANF and Child Welfare Generic IC and Publications Time Schedule

Phase | Activity | Approximate Schedule
Phase 3 | Diagnosis and Design | early CY 2017 – CY 2019
Phase 4 | Evaluation | mid CY 2017 – CY 2023
Phase 5 | Dissemination | CY 2020 – CY 2024


Phase 3: Diagnosis and Design: This phase involves the development of site-specific diagnosis and design of behavioral intervention(s) and an evaluation plan using a collaborative process with the site, behavioral science and program content experts, and ACF staff. During this time period we will undertake Phase 3 for five total sites.


Phase 4: Evaluation: Phase 4 consists of implementing the behavioral intervention(s) and evaluating them. During this time period we will undertake Phase 4 for five total sites, with up to two tests per site, for a total of up to 8 tests.


Phase 5: Dissemination: Dissemination efforts during the period of this clearance include site-specific reports, infographics, dissemination products aimed at practitioners, sharing findings at conferences, and publicizing our findings and our work on social media.

Exhibit 6B: EHS/HS Generic IC and Publications Time Schedule

Phase | Activity | Approximate Schedule
Phase 3 | Diagnosis and Design | CY 2019 – CY 2020
Phase 4 | Evaluation | CY 2020 – CY 2022
Phase 5 | Dissemination | early CY 2022 – CY 2024


We recognize that this extends beyond the current period over which the overarching generic was approved. We will submit an extension request at an appropriate date; data collection beyond the current expiration date is dependent on approval of that request.

Phase 3: Diagnosis and Design: This phase involves the development of site-specific diagnosis and design of behavioral intervention(s) and an evaluation plan using a collaborative process with the site, behavioral science and program content experts, and ACF staff. During this time period we will undertake Phase 3 for up to two sites.


Phase 4: Evaluation: Phase 4 consists of implementing the behavioral intervention(s) and evaluating them. During this time period we will undertake Phase 4 for up to two sites, with one or two tests per site.


Phase 5: Dissemination: Dissemination efforts during the period of this clearance include site-specific reports, infographics, dissemination products aimed at practitioners, sharing findings at conferences, and publicizing our findings and our work on social media.

A17. Reasons Not to Display OMB Expiration Date

All instruments will display the expiration date for OMB approval.


A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.

¹ We will submit a request for an extension for the overarching generic, along with updates on the status of information collections at that time.

² We will submit a request for an extension for the overarching generic, along with updates on the status of information collections at that time.


