OMB: 0970-0375

Cross-Site Evaluation of the Children’s Bureau’s Grantee Cluster:

Supporting Evidence-Based Home Visiting

to Prevent Child Maltreatment (EBHV)


Supporting Statement, Part B

For OMB Approval



September 23, 2009






B. STATISTICAL METHODS (USED FOR COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS)

This section addresses each of the five points outlined in Part B of the Office of Management and Budget (OMB) guidelines for the cross-site evaluation of Supporting Evidence-Based Home Visiting to Prevent Child Maltreatment (EBHV), a grantee cluster of the Children's Bureau within the Administration for Children and Families (ACF), U.S. Department of Health and Human Services (DHHS). The submission requests clearance for data collection instruments and procedures, plans for data analysis, and reporting of findings from the Cross-Site EBHV Evaluation. As noted in Part A, the Cross-Site EBHV Evaluation is a five-year project. The current submission requests clearance for the first three years of the evaluation; ACF will submit an extension request at the beginning of the third year to continue these activities through the last two years of the project.

B.1. Respondent Universe and Sampling Methods

The Cross-Site EBHV Evaluation will collect information from 17 EBHV grantees. The evaluation will augment existing evidence by identifying successful strategies for adopting, implementing, and sustaining high-quality home visiting programs to prevent child maltreatment. It will do so by supporting rigorous local evaluations and by using data from those local evaluations and cross-site research to assess participant, program, and systems outcomes. The Cross-Site EBHV Evaluation will use seven instruments and two products to gather data from one or more respondent groups (for example, a grantee, a home visitor), described in Table B.1. Data will be collected from all 17 grantees for each instrument. Any sampling to identify respondents will occur within a grantee; a consistent sampling approach will be applied across the grantees.


Depending on the instrument type and respondent group, respondents will be identified either from the full universe (no sampling) or purposefully. Four instruments and two products entail no sampling. The EBHV grantee systems web-based data entry and the grantee data quality progress table will gather information from the universe of 17 grantees. All participants and their respective home visitors who are part of a grantee's local evaluation will complete the participant-home visitor and home visitor-participant relationship questionnaires. Thus, for purposes of the cross-site evaluation, no sampling is associated with data collection for these instruments. The same is true for the two required products grantees will submit (local analytic reports and data files). Purposeful identification will be used to select respondents for site visit interviews, using the EBHV grantee and key staff-partner interview guide, and for the EBHV grantee-partner network survey. In each group, purposeful selection is appropriate because insights and information can come only from individuals with particular roles or knowledge, such as EBHV lead grantee staff or home visitors participating in the local evaluation. Purposeful selection will also be used for the agency fidelity/cost web-based data entry, but for this instrument the grantee will select the agencies to be included in its grant activities.


TABLE B.1

SUMMARY OF SAMPLING APPROACH

Instrument: EBHV grantee and key staff-partner interview guide

Individual or small-group interview
Respondents: EBHV grantee staff; EBHV key staff (local evaluation team; home visiting program manager); EBHV partners (private- and state-level funders; partners and committee members; referral sources)
Sampling approach: Purposeful identification. The cross-site evaluation team obtains names and contact information from grantees and arranges individual or small-group interviews.

Focus group
Respondents: EBHV key staff (home visiting supervisor; home visitors)
Sampling approach: Purposeful identification. For the participating home visiting providers of each grantee, the cross-site evaluation team obtains names and contact information. All individuals will be invited to the focus group, as small numbers are anticipated.

Instrument: EBHV grantee systems web-based data entry
Respondents: EBHV grantees
Sampling approach: No sampling

Instrument: EBHV agency fidelity/cost web-based data entry
Respondents: EBHV grantee agencies
Sampling approach: Purposeful identification. Agencies are selected by grantees.

Instrument: EBHV grantee data quality progress table
Respondents: EBHV grantees
Sampling approach: No sampling

Instrument: Participant-home visitor relationship questionnaire
Respondents: Participants
Sampling approach: No sampling

Instrument: Home visitor-participant relationship questionnaire
Respondents: Home visitors
Sampling approach: No sampling

Instrument: EBHV grantee-partner network survey
Respondents: EBHV grantees and partners
Sampling approach: Purposeful identification. The cross-site evaluation team obtains names and contact information from grantees; individuals will receive the survey from the cross-site evaluation team.

Product: Local analytic reports
Respondents: EBHV grantees (no additional burden)
Sampling approach: No sampling

Product: Data file for NDACAN
Respondents: EBHV grantees (no additional burden)
Sampling approach: No sampling

NDACAN = National Data Archive on Child Abuse and Neglect

B.2. Procedures for the Collection of Information

The Cross-Site EBHV Evaluation will collect data using seven instruments and will also gather two products. Procedures for the seven instruments that impose burden on respondents are detailed below, by instrument.


EBHV Grantee and Key Staff-Partner Interview Guide. The cross-site evaluation will conduct two site visits to each grantee and interview grantee and key staff as well as partners. The first site visit will occur in spring 2010 and the second in spring 2012. To facilitate communication between the cross-site evaluation project and grantees, a grantee liaison was assigned at the beginning of the grant period. The liaison is a senior member of the cross-site evaluation team. To facilitate planning and ensure familiarity, site visits will be conducted by each grantee's liaison and a junior cross-site evaluation team member. Each visit will last multiple days to provide sufficient time to complete data collection activities. It is anticipated that 11 to 14 interviews will be completed during each visit.


EBHV Grantee Systems Web-Based Data Entry. Grantees will use the web-based data entry system to provide requested data on a semiannual basis. Every six months, grantees will enter data into the web-based system related to grantee-specific goals in the systems change domain. Grantees will spend on average 1 hour completing each entry for this task.


EBHV Agency Fidelity/Cost Web-Based Data Entry. Agencies implementing the home visiting programs associated with each grantee will use the web-based data entry system to provide requested data on a monthly basis. Additionally, agencies will enter data collected from the participant-home visitor and home visitor-participant relationship questionnaires (described below). Much of these data will be entered once, on a rolling basis as new families enroll, with some monthly updating. Grantees will spend on average 9 hours each month completing this task.


EBHV Grantee Data Quality Progress Table. Grantees will complete a data quality progress table that will capture information such as response rates and missing data for the family and child outcomes data collected by grantees. This information will alert the cross-site evaluators to possible technical assistance needs concerning the family and child outcomes data collection. Grantees will complete the progress table approximately four times a year at key data collection points (the middle and end of each data collection wave) and will spend on average 4.25 hours each reporting period.


Participant-Home Visitor Relationship and Home Visitor-Participant Relationship Questionnaires. At the local grantee sites, participants in the local evaluation and their respective home visitors will each complete a relationship questionnaire. The participant-home visitor relationship questionnaire is a modified version of the client short-form of the Working Alliance Inventory (WAI; Santos 2005, modifying Horvath 1994 and Tracey and Kokotovic 1989). The home visitor-participant relationship questionnaire is a modified version of the therapist short-form of the WAI. Each paper-and-pencil instrument contains 12 items rated on a 7-point scale and takes approximately 15 minutes to complete.


EBHV Grantee-Partner Network Survey. The cross-site evaluation team will identify each grantee's partners, who will be asked to complete each round of the grantee-partner network survey. The cross-site evaluation team will coordinate distribution and collection of the survey, but assistance from grantee staff may be necessary if response rates are low. The web-based survey will take approximately 25 minutes to complete.


B.3. Methods to Maximize Response Rates and Deal with Nonresponse

The Cross-Site EBHV Evaluation expects to obtain a very high response rate (80 percent or more) for all instruments used. The grantee liaison will serve as a link, working with EBHV grantees as needed to address nonresponse. Strategies for maximizing response in the data collection efforts are described below.


EBHV Grantee and Key Staff-Partner Interview Guide. All interviews conducted with key grantee staff and partners will occur during site visits. It is anticipated that all grantees will agree to participate in these visits. Our past experience indicates that participation rates are typically close to 100 percent. To help ensure high participation, we will coordinate with the grantees, key staff, and partners to determine convenient dates for these visits.


EBHV Grantee Systems Web-Based Data Entry and EBHV Agency Fidelity/Cost Web-Based Data Entry. We anticipate that all EBHV grantees and agencies will complete data entry into the web-based system. To support their effort, the cross-site evaluation team will provide three main types of help: (1) a user's manual, (2) system orientation through conference calls with the grantees, and (3) ongoing technical assistance. Additionally, we will provide direct assistance, tailored to each grantee or agency, to those that fall behind in data entry or fail to provide the needed data. This may include sending reminders and requests to grantees or agencies via email or U.S. mail, conducting telephone follow-up to ensure reminders and requests are received and reviewed, and providing direct assistance through one of the cross-site evaluation team's data liaisons.


EBHV Grantee Data Quality Progress Table. We anticipate that all EBHV grantees will complete the data quality progress table. The cross-site grantee liaisons will hold regular monitoring calls to support rigorous evaluations, using the progress table to guide those meetings. If needed, we will follow up with grantees that do not provide these tables prior to a scheduled meeting. Assistance will be tailored to each grantee and may include sending reminders and requests via email or U.S. mail and conducting telephone follow-up to ensure reminders and requests are received and reviewed.


Participant-Home Visitor Relationship and Home Visitor-Participant Relationship Questionnaires. The web-based data entry system will include a built-in prompt to alert grantees about participants and home visitors requiring the relationship questionnaire. This prompting will build from each family's enrollment date so that grantees receive a note about families requiring the questionnaire at the beginning and end of home visiting services. It is anticipated that this will increase the timely administration of the paper-and-pencil questionnaires by grantees. Additionally, the cross-site evaluation team will monitor the monthly web-based uploads to assess the number of questionnaires submitted for participants and home visitors. If needed, the cross-site evaluation team will follow up with grantees that fall behind in data entry or fail to provide the needed data. Assistance will be tailored to each grantee and may include sending reminders and requests via email or U.S. mail, conducting telephone follow-up to ensure reminders and requests are received and reviewed, and providing direct assistance through one of the cross-site evaluation team's data liaisons.


EBHV Grantee-Partner Network Survey. The cross-site evaluation team will solicit the assistance of grantees to field the web-based network survey and obtain a high response rate. Grantees will provide the team with a list of network survey respondents for their site and their contact information. The team will then inform respondents about the survey and ask them to complete it. If concerns about response rates develop within a grantee, the cross-site evaluation team will work with lead grantee staff to identify strategies for increasing receipt of completed surveys. If needed, the cross-site evaluation team will ask lead grantee staff to send an email to all network survey respondents in their site encouraging their response. This should assist with response rates, as lead grantee staff will have personal relationships with their partners and can draw on those relationships to encourage responses. Additionally, the cross-site evaluation team will deploy survey staff with expertise in obtaining responses to conduct one round of telephone follow-up with nonrespondents if email reminders and requests from the grantee prove ineffective. Finally, the network surveys will be fielded around the time of the site visits. As a last resort, the team conducting the site visit can follow up directly with partners at that time.

B.4. Test of Procedures or Methods to be Undertaken

Most of the instruments to be used in the Cross-Site EBHV Evaluation build on existing measures and previous experience from other studies completed by the cross-site evaluation team. Consequently, pretesting of previously used instruments or measures is not planned. For example, the EBHV grantee and key staff-partner interview guide will build on interview guides used in similar studies such as the Early Head Start Enhanced Home Visiting Pilot Evaluation. The modules in the EBHV grantee web-based data entry system will build on existing data collection completed for the national home visiting program models used by the grantees. Each model has existing methods for tracking program implementation and services, and incorporating these into the data entry system builds on those proven methods. Additionally, the planned collection of family characteristics aligns with data collected in previous surveys, for example, the Early Head Start Research and Evaluation Project. The EBHV grantee data quality progress table is modeled on prior experience on the Analytic and Technical Support for Advancing Education Evaluations project for the Institute of Education Sciences, U.S. Department of Education, which monitored randomized controlled trials conducted by regional education laboratories. The participant-home visitor relationship and home visitor-participant relationship questionnaires are existing, structured instruments that do not require pretesting.


The one new instrument, the EBHV grantee-partner network survey, will include a pretest. The cross-site evaluation team will identify one to five grantees and ask a member of their staff to pilot test the survey for the cross-site evaluation. Additionally, usability testing of the EBHV web-based data entry system will be conducted with up to five grantees.

B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The cross-site evaluation team is led by Melissa Lim Brodowski, CB/ACF/DHHS, project officer; co-project directors Kimberly Boller, Mathematica Policy Research, Inc., and Deborah Daro, Chapin Hall at the University of Chicago; and Debra Strong, Mathematica, project manager. The plans for statistical analyses for this study were developed by Mathematica with support provided by Heather Koball, Mathematica. The plans for the implementation process component were developed by Mathematica with support provided by Heather Zaveri, Mathematica.

REFERENCES

Horvath, Adam O. “Empirical Validation of Bordin’s Pantheoretical Model of the Alliance: The Working Alliance Inventory Perspective.” In The Working Alliance: Theory, Research, and Practice, edited by Adam O. Horvath and Leslie S. Greenberg. New York: Wiley, 1994.

Santos, Robert G. “Development and Validation of a Revised Short Version of the Working Alliance Inventory.” Unpublished doctoral dissertation. Winnipeg, Manitoba: University of Manitoba, 2005.

Tracey, Terence J., and Anna M. Kokotovic. “Factor Structure of the Working Alliance Inventory.” Psychological Assessment: A Journal of Consulting and Clinical Psychology, vol. 1, no. 3, 1989, pp. 207-210.



File Type: application/msword
Author: Dawn Smith
Last Modified By: DHHS
File Modified: 2009-09-23
File Created: 2009-09-23
