Project LAUNCH Supporting Statement Part B

OMB: 0970-0373







Project LAUNCH Cross-Site Evaluation


Supporting Statement Part B for OMB Approval


September 17, 2009








B. STATISTICAL METHODS (USED FOR COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS)

B.1 Respondent Universe and Sampling Methods


The cross-site evaluation will collect information from all 18 Project LAUNCH grantees (6 grantees funded in September 2008 and 12 grantees funded in September 2009). As part of their grant requirements, grantees are expected to participate in cross-site data collection activities, including annual site visits and semi-annual web-based data reporting on systems development and services delivery. Accordingly, the study will cover the universe of funded grantees, and results will be fully representative of that pool. In addition, because the grantees include projects serving a variety of individuals in a wide range of locations (urban, rural, tribal), it may be possible to draw meaningful inferences about how their experiences compare with those of other similarly situated communities. While the results are not fully generalizable, they may offer insights into promising or best practices in the field for a given type of community and set of available resources.


Annual Site Visit Interviews

Purposeful selection will be used to identify respondents for the annual site visit interviews. This approach is appropriate because the needed information can come only from individuals with particular roles in, or knowledge of, the community and its systems.


Semi-Annual Web-Based Data Reporting: Systems Measures

Project LAUNCH staff for each grantee will complete the systems measures for that site. No sampling is required, as this information will be reported by the universe of Project LAUNCH grantees.


Semi-Annual Web-Based Data Reporting: Services Measures

Data on the universe of children, families, and providers participating in Project LAUNCH services will be collected semi-annually through the services instrument on the web-based reporting system. Grantees will report aggregate information on children and families served under different service models (e.g., home visitation), including demographic information. They will also report aggregate information on providers who participate in training or other activities offered by Project LAUNCH, and on changes in provider practice and provider settings.


Additional grantee-specific aggregate child and family outcomes will be reported by the grantees as part of their local evaluations. Grantees will select the outcomes they wish to examine based on the evidence-based practices they are implementing at the local level and will report aggregate outcome data on all children and families who receive a particular service or services.




B.2 Procedures and Methods for the Collection of Information


Annual Site Visits

The evaluation will include annual site visits to all Project LAUNCH grantees. One- or two-person teams from the cross-site evaluation senior staff will conduct the site visits, which are expected to take 1.5 to 2 days to complete.


Together, these site visitors will speak with key staff involved in implementing and overseeing Project LAUNCH activities. Likely respondents include the State Child Wellness Coordinator, State Wellness Council members, the State ECCS Project Director, the Local Child Wellness Coordinator, Local Wellness Council members, the local evaluator, and local service providers.


Information collected on site will cover the following broad categories:


  • General Program Information

  • Community/Tribal Context

  • Strategic Planning Process

  • Child Wellness Council and Program Oversight

  • Project Outreach

  • Project LAUNCH Service Delivery

  • Workforce Development and Capacity Building

  • System Coordination

  • Cultural Competence

  • Quality Monitoring and Improvement

  • Sustainability and Replication


The interviews will follow discussion guides, which will have been populated in advance of the site visits, to the extent possible, with information gathered from existing sources. During site visits, evaluators will verify information about:


  • The target population and the population served

  • Implementation of Project LAUNCH activities at the State, Tribal, and local levels

  • Infrastructure development and systems changes

  • Collaboration and coordination across child-serving systems, including partnerships established

  • Strategies developed and adopted to promote implementation of evidence-based promotion and prevention programs

  • Staff training procedures

  • Barriers encountered and how they were addressed

  • Achievements


Semi-annual Web-based Data Reporting

Grantee staff will participate in semi-annual web-based data reporting activities. These data will be used to describe the Project LAUNCH activities that have occurred over the most recent six-month reporting period and will correspond to SAMHSA’s project reporting periods (e.g., April 1–September 30, October 1–March 31). Grantees will provide data through a secure website built on the Decision Support 2000+ platform. Although the evaluation will not dictate which Project LAUNCH staff member will be tasked with providing these data, we anticipate that the State Child Wellness Coordinator, the Local Child Wellness Coordinator, and/or the local evaluator at each grantee will assume primary responsibility for this effort. Grantees will be asked to report on systems development activities at the State/Tribal and local levels, and on services delivery at the local level.


Systems Development Activities at the State, Tribal, and Local Levels:

  • The degree to which Project LAUNCH has assisted in the development and implementation of an integrated system of care for children and families; collaboration and coordination with other agencies; workforce development; and sustainability of LAUNCH activities over time

  • Activities undertaken to improve the local service system (e.g., trainings conducted, collaboration/infrastructure development activities, quality monitoring activities)

  • Activities undertaken to improve cultural competence in local service systems and service delivery


Service Delivery at the Local Level:

  • For each service model being implemented (e.g., home visitation):

      • Program model and core components of the model

      • Proportion of program funded by LAUNCH

      • Whether the program is targeted or universal and, if targeted, characteristics of the target children or families

      • Number of families in the targeted area eligible for services

      • Number of eligible families and children served in a given reporting period

      • Number of staff in the community who could be delivering the service

      • Number of staff in the community who are delivering the service

      • Demographics of children and families in the service

      • Fidelity of implementation of the program model

      • Type and frequency of training provided

  • Provider and provider setting changes

Quality Control


Annual Site Visits

We have instituted a variety of methods to ensure the quality of the data collected in the site visits. All site visitors will attend training before annual interviews are conducted, which will cover site visit protocols, definitions for terms used in the interview guides, and any questions about the data collection instrument or procedures. Staff will review strategies for working with diverse populations and for ensuring that data are collected in a manner that is culturally sensitive and respects the backgrounds and traditions of Project LAUNCH staff. During site visits, the cross-site evaluation team will not have direct contact with service recipients.


Semi-annual Web-based Data Reporting

Evaluation team members will work closely with Project LAUNCH staff to ensure the quality of the data collected in the secure electronic data capture system. Each Project LAUNCH grantee has been assigned two liaisons from the cross-site evaluation team who will be trained on the data reporting system by the developer of the system. Both the developer of the web-based system and the two site liaisons will be available to provide technical assistance to the grantees as they have questions about the data reporting requirements. (The two liaisons will be the same two individuals who conduct the annual site visit to their assigned grantee.)


The evaluation team’s technical assistance pertaining to online data collection will ensure that the data provided are of the highest quality. The assistance will focus on the cross-site evaluation data collection protocols, websites, and tracking forms that grantees will use to collect and report data. This will include demonstration of these protocols and websites, practice in using them, troubleshooting, and ongoing assistance throughout the evaluation to ensure complete and reliable data collection and submission. Assistance will cover data cleaning and data checks, issues of privacy, and timelines for data submission. A manual will be provided to grantees with detailed instructions for reporting each data element and measure (including definitions).



B.3 Methods to Maximize Response Rates and Deal with Nonresponse


Project LAUNCH grantees are required to participate in the cross-site evaluation as a condition of receiving Project LAUNCH funding. Grantees recognize that they are involved in a new and innovative initiative, and that results from this initiative could have a profound impact on the field.


The grantees are expected to comply with the requirement to submit data to the electronic data reporting system, in part because the data will be fed back to them through reports generated by the system. These reports will allow grantees to use the data in their own local evaluations, for example, by helping them track their efforts over time. Our expected response rate for this effort is 100 percent, well above the 75 percent threshold discussed in the OMB guidelines.


B.4 Test of Procedures or Methods to be Undertaken


Wherever possible, the evaluation uses measures that have been previously developed and tested and that have demonstrated validity and reliability. The Cross-Site Evaluation Team has extensive experience developing new instruments and incorporating existing ones where appropriate. Systems development questions were selected based on a comprehensive review of existing measures from a number of other studies. Questions on child and family characteristics were derived from existing national surveys, including the Early Head Start Research and Evaluation Study and the State and Local Area Integrated Telephone Survey (SLAITS). To ensure that we have selected the most appropriate measures, the draft instruments have been vetted with project consultants, local evaluators from the first cohort of 6 grantees, and Federal staff from ACF and SAMHSA. Usability testing of the web-based data entry system will be conducted by evaluators from the first cohort of grantees.




B.5 Individuals Collecting and/or Analyzing Data


Abt Associates Inc. and its RE-AIM subcontractor, Kaiser Permanente, are conducting this project under contract to ACF. The plans for statistical analyses for this study were developed by Abt Associates. The team is led by Deborah Klein Walker, Principal Investigator; and Barbara Goodson, Evaluation Design Lead. At the Federal level, Maria Woolverton is the ACF COTR for the Cross-Site Evaluation and Jennifer Oppenheim and Marna Hoard are the SAMHSA grantee Project Officers.


