Data Collection for the Identification of Comparison Groups for the National Community Centers of Excellence in Women's Health (CCOE) Program

OMB: 0990-0271


Office On Women’s Health


U.S. Department of Health and Human Services

5600 Fishers Lane, Room 16A-55

Rockville, Maryland 20857


(301) 443-1402

(301) 443-1384 (Fax)

[email protected]

Barbara F. James

Director, National Community Centers of Excellence in Women’s Health Program


October 27, 2006



Table of Contents

OMB Supporting Statement

A. JUSTIFICATION (Sections 1 – 18)

1. Circumstances Making the Collection of Information Necessary

2. Purpose and Use of the Information

3. Use of Information Technology and Burden Reduction

4. Efforts to Identify Duplication and Use of Similar Information

5. Impact on Small Businesses or Other Small Entities

6. Consequences of Collecting the Information Less Frequently

7. Special Circumstances Relating to the Guidelines of 5 CFR § 1320.5(d)(2)

8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

9. Explanation of Any Payment or Gift to Respondents

10. Assurance of Confidentiality Provided to Respondents

11. Justification of Sensitive Questions

12. Estimates of Hour Burden Including Annualized Hourly Costs

13. Estimates of Total Annual Cost Burden to Respondents or Recordkeepers

14. Estimate of Annualized Cost to the Federal Government

15. Explanation for Program Changes or Adjustments

16. Plans for Tabulation and Publication and Project Time Schedule

17. Reason(s) Display of OMB Expiration Date is Inappropriate

18. Exceptions to Certification for Paperwork Reduction Act Submissions

B. DESCRIPTION OF INFORMATION COLLECTION

1. Respondent Universe and Sampling Methods

2. Procedures for the Collection of Information

3. Methods to Maximize Response Rates and Deal with Nonresponsiveness

4. Test of Procedures or Methods to be Undertaken

5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data



OMB Supporting Statement

A. JUSTIFICATION (Sections 1 – 18)

1. Circumstances Making the Collection of Information Necessary

The Department of Health and Human Services Office on Women’s Health (OWH) is seeking a revised clearance to conduct four data collection efforts as part of its updated evaluation of the National Community Centers of Excellence in Women’s Health (CCOE) program. This data collection is authorized under Section 301 of the Public Health Service Act (42 U.S.C. 241).

For additional detail regarding this evaluation, see Attachment I, the Evaluation Methodology for the National Community Centers of Excellence in Women’s Health Program National Evaluation: Round II. The National Evaluation: Round II is designed to measure the effectiveness and growth of the CCOE program since the initial National Evaluation (Round I) conducted in 2003. The program consists of community-based grantee organizations located across the country. A total of fourteen CCOEs have been funded, each with multiple community partners. The Round II evaluation will mirror the evaluation processes and utilize the same data collection instruments used by OWH for the Round I evaluation. The Round II evaluation will measure how successful the CCOEs have been at meeting their overarching goal: the development of an integrated, innovative, community-based, interdisciplinary, and comprehensive health care delivery system that extends quality health care services to women of all ages and racial/ethnic groups. The success of the CCOEs will be measured by reviewing the baseline data from Round I and comparing that to new data collected during Round II. This review and comparison of data will provide a comprehensive picture of the CCOE programs, and their outcomes relative to the established program goals.

OWH’s evaluation methodology includes multiple phases and varied layers of data collection that build upon one another, using the same approach utilized for Round I. A strong mix of qualitative and quantitative data gleaned from multiple sources provides a solid basis for developing a comprehensive picture of the CCOE program. Comparing this information to data collected during Round I will help in developing a comprehensive picture of the CCOE program’s evolution and growth since 2003. A key element of this data collection effort, just as with Round I, is to obtain input from multiple sources, each with a different experience and knowledge of the CCOE. The four data collection instruments are:

  • CCOE Center Director and Program Coordinator Survey;

  • CCOE Community Partner Survey;

  • CCOE Client Survey; and

  • CCOE Site Visit Protocol.

Each CCOE has a designated Center Director and Program Coordinator. Obtaining their updated input via the CCOE Center Director and Program Coordinator Survey will provide essential information on the network of services each CCOE has developed, the extent to which those services are integrated, and any growth and progress toward program goals the CCOEs have achieved since Round I. This information is not available by any other means.
A copy of the CCOE Center Director and Program Coordinator Survey is included as Attachment II.

Partner input via the CCOE Community Partner Survey will provide additional information about the services offered to CCOE clients. Surveying partner organizations will allow OWH to develop a current comprehensive understanding of the CCOE’s structure and the resources that have been linked together to develop each CCOE’s network of services. Additionally, it will help OWH understand how services and activities are integrated within each CCOE and how that integration has improved and evolved since Round I. The CCOE partners will also provide additional utilization information for the services and activities they provide on behalf of the CCOE. Using Round I data as a baseline, new information on the services and activities offered by the CCOE partners collected during Round II will help gauge growth and development of the CCOEs. Like the CCOE Center Director and Program Coordinator data, this information is not available by any other means. A copy of the CCOE Community Partner Survey is included as Attachment III.


The CCOE Client Survey is a key element in the data collection effort. Client feedback will provide a snapshot of individual experiences relating to the access and delivery of services at the current time and will be compared to the baseline responses gathered in the Round I evaluation to determine if patients are more satisfied with various services offered through the CCOEs. This information is not available by any other means. A copy of the CCOE Client Survey is included as Attachment IV.

Site visits will allow an independent assessment of the CCOEs that will provide a more detailed and thorough appraisal of the CCOEs than can be achieved through surveys, and they will provide valuable insight into the current day-to-day operations of each CCOE. Information gathered during the site visits will allow for further refinement, validation, and understanding of survey data gathered during the initial CCOE evaluation. Additionally, the Evaluation Team will obtain copies of distributed literature (e.g., smoking cessation tips, how to eat healthy), visit and interview a sample of partners, and observe any ongoing CCOE activities. This information is not available by any other means. A copy of the CCOE Site Visit Protocol is included as Attachment V.

The National Evaluation Round II is crucial to OWH’s decision-making process regarding the continued existence, design, and funding levels of the CCOE program. Round II evaluation findings will validate results from the National Evaluation and enable OWH to see the impact of the CCOE designation and funding on program-related outcomes. Additionally, because the CCOE program was developed as part of the national effort to eliminate health disparities, Round II evaluation findings will help OWH understand how the program is contributing to closing the health disparities gap and to improving care for underserved populations of women across the country.

2. Purpose and Use of the Information

Information (e.g., survey questions) provided by each of the data collection instruments will be used to answer the CCOE evaluation research questions, which are outlined in the Evaluation Methodology (see Attachment I). All data will be used to evaluate the overall performance of the CCOE program in relation to its goals, objectives, and growth over time relative to the baseline established during Round I.

The CCOE Center Director and Program Coordinator Survey will gather both qualitative and quantitative descriptive data, which will be compared to the baseline data collected previously. Topics to be included are:

  • Description of CCOE structure;

  • Description of any new CCOE activities and services (if applicable);

  • Information related to the integration of women’s health care delivery;

  • Training activities;

  • Community-based research;

  • Public education and outreach;

  • Leadership development activities;

  • List of partners, including description of type of services offered and contact information;

  • Best practices or lessons learned (if applicable); and

  • Client and/or community demographics.


The CCOE Community Partner Survey will provide additional information on CCOE services and help ensure that a comprehensive and up-to-date picture of the CCOE services and activities is understood. This will, in turn, help OWH better understand the growth of CCOE relationships in the community, integration with partners, and expansion of service offerings since Round I. The data requested will be both qualitative and quantitative and will include:

  • Descriptions of partner activities and services;

  • Service utilization and activity participation statistics;

  • Descriptions of communication channels between the partner organization, the CCOE and other CCOE partners;

  • Perceptions on the partner organization’s level of integration with the CCOE and other partners; and

  • Best practices or lessons learned (if applicable).


CCOE Client Survey responses will provide feedback on the impact of the CCOEs on clients' personal lives as well as on the surrounding community. Additionally, the evaluation criteria used in the CCOE program evaluation rely heavily on CCOE client feedback. Exhibit 1 presents some of the program evaluation criteria and a brief rationale for using the client survey as a means of measuring each criterion.

Exhibit 1. Evaluation Criteria Requiring a Client Survey

Evaluation Criteria | Rationale for a Client Survey
Service includes a full range of care | A CCOE may provide a full range of services; however, OWH needs to measure patient awareness and use of these services
Service delivery network demonstrates improvement in access to targeted communities | Clients can report how accessible the CCOE is and whether they use transportation assistance
Training activities are provided on topics aimed at improving women’s health | Clients can provide insight into how useful and relevant these classes are to them
Subjects of materials and outreach activities are relevant to the needs of the community | Clients will be the best source for identifying subjects of community interest and need

The CCOE Client Survey will provide the following types of patient information deemed necessary for a thorough evaluation and comparison of all evaluation criteria for Round II:

  • Access to services;

  • Types of services utilized;

  • Satisfaction with CCOE services; and

  • Client demographics.

Lastly, the use of CCOE Site Visits will provide OWH with critical information on CCOE structure and operations that are hard to assess via surveys or other indirect measurements. Key items that the Evaluation Team will review during the site visit include:

  • Record-keeping systems;

  • Information technology infrastructure;

  • Accessibility of CCOE facilities;

  • Demonstrated evidence of growth; and

  • Staff input on quality and type of services offered, success factors, barriers, and key lessons learned.

3. Use of Information Technology and Burden Reduction

The CCOE Center Director and Program Coordinator Survey and the CCOE Community Partner Survey will be administered as web-based surveys. Internet-based (or web) surveys provide a cost-effective mechanism for gathering information and offer several benefits for the evaluation effort. They reduce the time and level of effort needed from respondents by allowing them to key in their answers directly, and they reduce the time and level of effort needed to analyze responses because responses are automatically entered into an electronic database.

Information technology will not play a large role in the administration of the Client Survey. Technology is not an appropriate solution for conducting the client survey because the population base comprises individuals who are often transient, have low literacy levels, and may lack access to a computer. The facilities where this survey will be administered also do not have computers available for this effort.

During the in-person Site Visits, data will be collected through focus groups and interviews and recorded or typed directly into an electronic data collection tool by the site visit team.

All raw data collected from each of the data collection instruments will be stored in an Access database.

4. Efforts to Identify Duplication and Use of Similar Information

No effort to collect similar data is being conducted within the agency. OWH staff also sought similar studies from all outside sources and determined that no similar data collection efforts have been made outside the agency.

5. Impact on Small Businesses or Other Small Entities

This data collection effort may impact small entities, as many of the CCOE grantees are small businesses or non-profit organizations that are independently owned and operated. To minimize the impact on these groups, the CCOE Center Director and Program Coordinator Survey, CCOE Community Partner Survey, and CCOE Site Visits will be administered only once. The CCOE Client Survey will be administered over a five- to six-month period and will require ongoing support. OWH provided each of the CCOE Centers with additional funding for one of their staff members to devote a portion of their time to supporting client survey efforts. Each CCOE Center Director was made aware of the evaluation schedule and understands that responding to this evaluation effort is mandatory and part of their required activities in return for receiving OWH funding.

6. Consequences of Collecting the Information Less Frequently

This is a one-time data collection effort, and respondents will be asked to participate only once (e.g., a client will not have to fill out two client surveys). If this collection is not conducted, OWH’s ability to accurately measure and evaluate the impact of the CCOE program against its stated objectives will be negatively affected. Failure to perform this data collection effort as part of the overall Round II evaluation will limit the validity of the Round I results and delay OWH’s movement toward achieving its program goals. This could result in unnecessary modifications to the existing program or in unwarranted budget cuts or expenditures for the CCOE program.

7. Special Circumstances Relating to the Guidelines of 5 CFR § 1320.5(d)(2)

The proposed survey fully complies with all guidelines of 5 CFR § 1320.5 (d) (2). The information collection will not be conducted in a manner:

  • Requiring respondents to report information to the agency more often than quarterly;

  • Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • Requiring respondents to submit more than an original and two copies of any document;

  • Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

  • In connection with a statistical survey that is not designed to produce valid and reliable results that can be generalized to the universe of the study;

  • Requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

  • That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or,

  • Requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.


8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

The 60-day data collection notice for the CCOE evaluation was published in the Federal Register on September 1, 2005 (Volume 70, Number 169, page 52102). This notice can be found at: http://frwebgate3.access.gpo.gov/cgi-bin/waisgate.cgi?WAISdocID=56360615183+3+2+0&WAISaction=retrieve.

The original 30-day data collection notice was published on pages 71309-71310 on November 28, 2005 (Volume 70, Number 227). This notice can be found at: http://frwebgate3.access.gpo.gov/cgi-bin/waisgate.cgi?WAISdocID=56360615183+2+2+0&WAISaction=retrieve.

Additionally, OWH hired the consulting firm Booz Allen Hamilton Inc. (Booz Allen), which is experienced in managing and conducting program evaluations. The Booz Allen team assisted in the development of the survey instruments and evaluation methodology for the Round I CCOE evaluation and in any language updates for Round II. Booz Allen provided expertise on issues including the availability of data, frequency of collection, clarity of instructions, record keeping, confidentiality, disclosure of data, reporting format, and necessary data elements.

9. Explanation of Any Payment or Gift to Respondents

There will be no payment, gift, or reimbursement given to respondents for time spent completing surveys.

10. Assurance of Confidentiality Provided to Respondents

A statement of confidentiality is included on the first page of each data collection instrument. The confidentiality statement informs respondents that any information provided will be kept confidential and will not be disclosed to anyone other than OWH (which is sponsoring the evaluation) without their consent. Only aggregated data will be reported to OWH. The confidentiality statement reads as follows:

The information you provide will be kept confidential, and will not be disclosed to anyone but the Department of Health and Human Services (DHHS) Office on Women’s Health (OWH) in the aggregate, except as otherwise required by law. If you have any questions on how to complete this survey please contact: Fatima Riaz at [email protected] or (240) 314-5675.

Furthermore, respondents for any of the four data collection instruments will not include their names (only their organization’s name, if applicable). The use of an Internet-based survey to administer two of the data collection instruments will help to ensure confidentiality of survey responses for these two efforts. It will allow respondents to key in their responses and submit them directly to the contracted Evaluation Team. Once submitted, survey responses will automatically be uploaded into a database for analysis. No individual identifiers will be requested. For the patient survey, respondents will be provided with a private area to complete their responses and will not have to enter any personal identifiers. Responses will be stored in a secure location for delivery to the Evaluation Team and then keyed into a database for analysis. While the site visit team will record the names of those who participate in each focus group, this information will be kept confidential, and no information attributable to an individual will be reported to OWH.

The Evaluation Team will house all data collected in a database located within their secure facilities. They will ensure that no information is shared with any entities outside of OWH, and will delete all individual identifiers prior to sharing any data outside of the Evaluation Team. All of these processes were utilized effectively during Round I.

11. Justification of Sensitive Questions

There are no questions believed to be sensitive. However, if a respondent is uncomfortable responding to a question for any reason, he or she will not be required to complete that question.

12. Estimates of Hour Burden Including Annualized Hourly Costs

Exhibit 2 indicates the estimated burden for each data collection activity and the total across all activities for Round II. A total of 5,824 respondents, mostly clients, will participate in this effort. The average response time across all data collection activities is estimated to be 12 minutes. The total annual burden is estimated to be 1,594 hours. Burden to respondents will be in terms of their time only; there are no monetary annualized hourly costs for respondents. More specific information about each data collection activity is listed in Exhibit 2.

Exhibit 2. Estimated Burden

Data Collection Activity | Number of Respondents | Estimated Response Time | Estimated Annual Burden Hours
CCOE Center Director and Program Coordinator Survey | 14 | 30 minutes (0.5 hours) | 7 hours
CCOE Community Partner Survey | 196 | 25 minutes (0.42 hours) | 82 hours
CCOE Client Survey | 5,600 | 15 minutes (0.25 hours) | 1,400 hours
CCOE Site Visit | 14 | 7.5 hours (1.5 hours per focus group) | 105 hours
TOTAL | 5,824 | 12 minutes (0.2 hours) per response, across all respondents and data collection activities | 1,594 hours


The CCOE Center Director and Program Coordinator Survey will be targeted to 14 respondents (i.e., one response per CCOE grantee organization). It is a one-time survey with a total estimated burden of 7 hours. The Round I survey instrument was pilot tested and adjusted to allow completion within 30 minutes. Since Round I was successfully completed within the estimated time frame and Round II will use the same survey instruments (with minor modifications made to update language as needed and with a few questions modified to focus on growth since Round I), the estimated time frame will remain the same.

The CCOE Community Partner survey will be administered to approximately 14 partners per CCOE depending on the number and types of partners each CCOE utilizes. This estimate is based on the information available about the number and service mix offered by current CCOE partners. The total number of respondents will be approximately 196 partners. It is a one-time survey with an estimated 82 annual burden hours. The Round I survey instrument was pilot tested and adjusted to allow completion within 20 to 30 minutes (25 minutes average). Since Round I was successfully completed within the estimated time frame and Round II will use the same survey instruments (with minor modifications made to update language as needed and with a few questions modified to focus on growth since Round I), the estimated time frame will remain the same.

The CCOE Client Survey will be targeted to approximately 5,600 respondents (approximately 400 clients at each of the 14 CCOEs). It is a one-time survey with an estimated 1,400 annual burden hours. The Round I survey instrument was pilot tested and modified to allow completion within 15 minutes. Again, since Round I was successfully completed within the estimated time frame and Round II will use the same survey instruments (with minor modifications made to update language as needed and with a few questions modified to respond to areas currently of interest to OWH), the estimated time frame will remain the same.

The CCOE Site Visits will include approximately 14 respondents (each CCOE center network is considered a single entity or respondent). It is a one-time data collection effort with an estimated 7.5-hour burden at each center, spread across multiple individuals and multiple focus groups at each center. The focus groups, which will take place over the course of two days, will each last one to one and a half hours. The total estimated burden is 105 hours. Since Round I was successfully completed within the estimated time frame and Round II will use the same survey processes (with minor modifications made to update language as needed and with a few questions modified to focus on growth since Round I), the estimated time frame will remain the same.

Using Department of Labor (DOL) 2005 wage rates, the annualized cost to respondents for the hour burden of these collections of information is provided in Exhibit 3. Because the professions of the respondents will vary greatly for each data collection activity and are not easily categorized into the general categories available on the DOL website, two broad groupings were used to characterize respondents. The total cost burden across all data collection activities is estimated to be $27,439.82.

Exhibit 3. Estimated Annualized Cost to Respondents

Data Collection Activity | Type of Respondent | Total Burden Hours | Mean Hourly Wage Rate | Estimated Total Respondent Costs
CCOE Center Director and Program Coordinator Survey | White Collar (excluding sales) | 7 hours | $24.03 | $168.21
CCOE Community Partner Survey | White Collar (excluding sales) | 82 hours | $24.03 | $1,970.46
CCOE Client Survey | Blue Collar | 1,400 hours | $16.27 | $22,778.00
CCOE Site Visit | White Collar (excluding sales) | 105 hours | $24.03 | $2,523.15
Total | | | | $27,439.82
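As a purely illustrative arithmetic check of Exhibits 2 and 3, the short Python sketch below recomputes the burden hours and respondent costs from the figures shown above. The script, its variable names, and the rounding convention (burden hours rounded to whole hours before costing, which matches the exhibit totals) are assumptions for illustration and are not part of the evaluation plan.

```python
# Hypothetical check of the burden and cost arithmetic in Exhibits 2 and 3.
# Figures are taken from the exhibits above; everything else is illustrative.

activities = [
    # (activity, respondents, minutes per response, assumed mean hourly wage)
    ("Center Director and Program Coordinator Survey", 14, 30, 24.03),
    ("Community Partner Survey", 196, 25, 24.03),
    ("Client Survey", 5600, 15, 16.27),
    ("Site Visit", 14, 450, 24.03),  # 7.5 hours per participating CCOE
]

total_hours = 0
total_cost = 0.0
for activity, respondents, minutes, wage in activities:
    hours = round(respondents * minutes / 60)  # burden hours, rounded as in Exhibit 2
    cost = hours * wage                        # respondent cost = burden hours x wage
    total_hours += hours
    total_cost += cost
    print(f"{activity}: {hours:,} hours, ${cost:,.2f}")

print(f"Total: {total_hours:,} hours, ${total_cost:,.2f}")
# Expected output ends with: Total: 1,594 hours, $27,439.82
```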

13. Estimates of Total Annual Cost Burden to Respondents or Recordkeepers

Time and effort will be the only burden to respondents who participate in the evaluation. Participants will incur no direct financial cost for responding to the data collection initiatives.

14. Estimate of Annualized Cost to the Federal Government

The estimated cost of administering all four data collection efforts once is $220,898. Exhibit 4 outlines the cost of each data collection effort and the cost associated with each of its major functions.

Exhibit 4. Cost of the Proposed Study

CCOE Center Director and Program Coordinator Survey Activity | Cost
Program and Update Web-based Survey and Update Database | $11,196
Conduct Survey | $5,258
Analyze and Report Results | $9,167
Subtotal | $25,621

CCOE Community Partner Survey Activity | Cost
Program and Update Web-based Survey and Maintain Survey Database | $11,196
Conduct Survey | $5,258
Analyze and Report Results | $9,167
Subtotal | $25,621

CCOE Client Survey Activity | Cost
Update Administrator Training and Sampling Plan | $8,285
Provide Ongoing Monitoring of Administrators | $3,428
Create English and Spanish Versions of Client Survey | $2,000
Administer Surveys | $5,258
Complete Data Entry | $2,519
Maintain Survey Database | $2,628
Analyze and Report Results | $9,167
Subtotal | $33,285

CCOE Site Visits Activity | Cost
Plan and Coordinate Site Visits | $1,848
Travel | $40,499
Conduct Site Visits | $79,477
Maintain Database of Results | $2,628
Analyze and Report Results | $11,919
Subtotal | $136,371

GRAND TOTAL | $220,898


The overall cost for developing and administering the CCOE Center Director and Program Coordinator Survey for Round II is approximately $25,621. This includes programming the web-based survey, developing the database, administering the survey, and analyzing and reporting survey results.

The overall cost of the CCOE Community Partner Survey for Round II is approximately $25,621. This includes costs for programming the web-based survey, maintaining the survey database, administering the survey, and analyzing the survey results.

The CCOE Client Survey for Round II will be administered during a five to six month period (depending on the individual CCOE’s ability to gather enough data). The overall cost to train CCOE staff, revise the individual CCOE sampling plans, make Spanish surveys available, administer the survey, perform data entry, maintain the survey database, and analyze and report survey results is approximately $33,285.

The CCOE Site Visit Protocol for Round II will be a one-time data collection effort at each CCOE and with some of their community partners. The overall cost to plan and coordinate all site visits, conduct the site visit, maintain the survey database, and analyze and report findings is approximately $136,371.

15. Explanation for Program Changes or Adjustments

There are no changes in burden. This is a revision of a previously approved OMB project. The revised project includes two new CCOEs that will participate in the data collection efforts. As in Round I, the two new CCOEs will provide information concerning their growth and attainment of OWH goals and objectives.

16. Plans for Tabulation and Publication and Project Time Schedule

Exhibit 5, Project Time Schedule, outlines the major milestones in the project timeline.

Exhibit 5. Project Time Schedule

Activity | Time Period
Submit Federal Register Notice and Obtain OMB Clearance | October 2006 – November 2006
Data Collection | December 2006 – March 2007
Complete Analysis | April 2007 – May 2007
Provide Evaluation Report & Executive Summary | June 2007


Publication

There are no plans to publish data obtained from this information collection effort. Round II evaluation findings will be summarized in a comprehensive Evaluation Report and Executive Summary developed by Booz Allen for OWH.

Analysis Plan

A comprehensive analysis plan links each survey question to OWH’s goals and objectives for the Round II evaluation and identifies whether each question will be analyzed quantitatively or qualitatively. The analysis plan further identifies how the different types of data will be analyzed. Survey results will be analyzed using the statistical software SAS, where appropriate, and compared to Round I data. Quantitative data will be analyzed using frequencies, descriptive statistics, cross-tabulations, and, where appropriate, significance testing (e.g., Analysis of Variance (ANOVA) or t-tests). Qualitative responses will be reviewed and aggregated based on key themes. The reported findings will include aggregate CCOE results as well as individual CCOE summary results. Detailed information on the specific analyses to be conducted is provided below.
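For illustration only, the brief sketch below shows, in Python rather than SAS, the kinds of quantitative summaries described above (frequencies, descriptive statistics, a cross-tabulation, and a comparison to Round I baseline data). The data, column names, and values are entirely hypothetical; the production analyses will be run in SAS as stated above.

```python
# Illustrative sketch only; hypothetical data and placeholder column names.
import pandas as pd
from scipy import stats

# Hypothetical Round II responses.
round2 = pd.DataFrame({
    "ccoe_id": [1, 1, 2, 2, 3, 3],
    "satisfaction": [4, 5, 3, 4, 5, 4],                        # e.g., 1-5 scale
    "uses_transport": ["Yes", "No", "Yes", "Yes", "No", "No"],
})

# Frequencies (percentages) for a categorical item.
print(round2["uses_transport"].value_counts(normalize=True) * 100)

# Descriptive statistics for a quantitative item.
print(round2["satisfaction"].describe())

# Cross-tabulation of a categorical item by CCOE.
print(pd.crosstab(round2["ccoe_id"], round2["uses_transport"]))

# Significance test against a hypothetical Round I baseline, where appropriate.
round1_satisfaction = [3, 4, 3, 4, 3, 4]
t_stat, p_value = stats.ttest_ind(round2["satisfaction"], round1_satisfaction)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```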

CCOE Center Director and Program Coordinator Survey

Round II survey results will be analyzed using frequency of responses (percentages), descriptive statistics, and bivariate correlations. Results from Round I will be compared with the new information collected during Round II. Additionally, a qualitative analysis will be conducted for all open-ended questions. Data will be reported both in aggregate and by individual CCOE; however, data will not be used for comparison between CCOEs.

The Evaluation Team will begin analysis of the survey data once all CCOEs have completed the survey. Survey data will then be linked to archival data (quarterly and annual CCOE reports). The combined data (survey and archival) will first be assessed for data quality (i.e., checking for missing data, extreme outliers, and/or data that do not fit expectations). Frequencies and mean scores will then be computed for all quantitative data. Correlations will be computed between similar questions on the Partner and Program Coordinator surveys and other surveys (with different target audiences) used in the CCOE evaluation. Qualitative data, such as descriptions of best practices or improvement ideas, will be reviewed and aggregated based on key themes. The reported findings will include aggregate CCOE results, as well as individual CCOE summary results.
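The short sketch below illustrates, with hypothetical data, the data-quality screen and the bivariate correlation between similar items on different surveys noted above. It is a sketch only, not part of the analysis plan, and the item names are placeholders.

```python
# Hypothetical sketch: data-quality check and a bivariate correlation
# between similar "integration" items from two surveys, merged by CCOE.
import pandas as pd

merged = pd.DataFrame({
    "coordinator_integration": [4, 3, 5, 4, 2, None],  # placeholder 1-5 item
    "partner_integration":     [5, 3, 4, 4, 2, 3],     # placeholder 1-5 item
})

# Data quality: count missing values and out-of-range scores per item.
print(merged.isna().sum())
print(merged[(merged < 1) | (merged > 5)].count())

# Bivariate (Pearson) correlation between the two items.
print(merged["coordinator_integration"].corr(merged["partner_integration"]))
```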

CCOE Community Partner Survey

A majority of the survey results will be analyzed through a qualitative analysis of open-ended questions. Quantitative data will be analyzed using frequencies of responses (percentages). Partner data will be reported in aggregate at the CCOE program level. Partner data will be summarized and included with individual CCOE data gathered through the CCOE Center Director and Program Coordinator Survey. Data analysis is not focused on individual partner organizations. First, data quality will be assessed (i.e., checking for missing data, extreme outliers, and/or data that do not fit expectations). Frequencies will then be computed for all quantitative data. Qualitative data, such as descriptions of partner services and activities, will be reviewed and aggregated based on key themes.

CCOE Client Survey

For each question, excluding open-ended questions, frequencies will be generated to ensure data quality. Any data quality issues, such as extreme outliers, will be resolved prior to analysis. All questions with the response categories “Yes,” “To Some Extent,” or “No” will be coded numerically and then averaged. Additionally, composite scores will be calculated for each respondent for each of the four major survey sections (e.g., health care services and research study). Significance testing, specifically a one-way Analysis of Variance (ANOVA), will be used to test for differences in each survey section across patient demographic characteristics. Differences are anticipated across the following demographic categories: health status, age, race/national origin, type of insurance, educational level, and income. The aggregate CCOE report will also highlight trends based on the ANOVA analyses.
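The following minimal sketch illustrates, with hypothetical data and placeholder item names, the numeric coding of the “Yes”/“To Some Extent”/“No” responses, the section composite score, and the one-way ANOVA described above. The specific numeric codes are assumptions for illustration; the actual analysis will be conducted in SAS.

```python
# Illustrative sketch of the client survey coding, composite scoring, and
# one-way ANOVA; all data, item names, and groupings are hypothetical.
import pandas as pd
from scipy import stats

coding = {"Yes": 2, "To Some Extent": 1, "No": 0}  # assumed numeric coding

clients = pd.DataFrame({
    "age_group": ["18-34", "18-34", "35-54", "35-54", "55+", "55+"],
    "svc_q1": ["Yes", "To Some Extent", "No", "Yes", "Yes", "No"],
    "svc_q2": ["Yes", "Yes", "To Some Extent", "No", "Yes", "To Some Extent"],
})

# Code the section's items numerically and average them into a composite score.
section_items = ["svc_q1", "svc_q2"]
clients["services_composite"] = clients[section_items].replace(coding).mean(axis=1)

# One-way ANOVA comparing the composite score across a demographic grouping.
groups = [g["services_composite"].values
          for _, g in clients.groupby("age_group")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```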

CCOE Site Visit Protocol

Most data gathered during the site visits will be qualitative. Data will be aggregated first by type of question (if multiple focus groups answer the same question) and then grouped by partner organization for individual CCOE results. The data will then be aggregated across all CCOEs to determine national-level responses. Data will be summarized for individual CCOEs and at the national level in five main categories: day-to-day operations, CCOE governance, CCOE services, lessons learned/best practices, and other feedback and comments. There will not be any between-CCOE comparisons. Any additional questions or comments added to the site visit protocol as a result of prior evaluation findings will be analyzed in the same manner described above.

17. Reason(s) Display of OMB Expiration Date is Inappropriate

OMB expiration dates will be displayed on all materials.

18. Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification statement identified in item 19 “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I.

B. DESCRIPTION OF INFORMATION COLLECTION

1. Respondent Universe and Sampling Methods

The applicable population (universe) and sampling method vary for each data collection effort but are the same as those used during the Round I evaluation. Further detail for each data collection activity is provided below.

CCOE Center Director and Program Coordinator Survey


The respondent universe includes designated leadership at each CCOE. This leadership consists of a Center Director and a Program Coordinator who administer the CCOE program within their organization. These individuals will be responsible for completing the CCOE Center Director and Program Coordinator Surveys. All fourteen CCOEs will be asked to participate because data on each CCOE are integral to conducting a meaningful evaluation, especially because each CCOE serves a very different patient population. Baseline information will be compared to Round II evaluation data to obtain a comprehensive picture of the CCOEs’ progress toward goals and growth over time. Sampling is not appropriate, and there are too few CCOEs to employ a psychometrically sound sampling strategy. The anticipated response rate is 100%, as participation is a requirement for receiving the CCOE designation and associated funding from OWH (please see B.2 for further justification of this expected response rate). This collection has not been conducted previously.

CCOE Community Partner Survey

The respondent universe includes designated points of contact at all CCOE partner organizations that have been in partnership with a CCOE for 3 months or more. Due to the wide variety in the number and types of services offered by partners at each CCOE, the sampling strategy for the partner survey will be developed in detail after a review of Phase I data. The sampling strategy chosen will be based on each CCOE’s current number and mix of partners to ensure that a consistent, comprehensive, and accurate picture of partner activities is developed. One potential sampling strategy consists of sampling all partners for each CCOE; this was the methodology used during Round I after a review of the volume and service mix of CCOE partners. If the number or mix of partners is considerably higher than during Round I, the Evaluation Team will consider alternative sampling strategies. For example, the Evaluation Team could sample one partner organization for each service type offered. Alternatively, another potential strategy could entail surveying a stratified random sample of partners that provide services in each of the CCOE core components in which partner organizations are involved at each CCOE.

The Evaluation Team anticipates sampling no more than 12 to 15 partner organizations for each CCOE based on existing information regarding the numbers and service mix of current CCOE partners. Because the CCOE program is continuously evolving and is continually adding, replacing, or losing partner organizations, the sampling strategy will be determined based on the number of CCOE partners and the service mix offered at the time of the evaluation. This strategy is cost-effective, does not overburden the CCOE network with an undue amount of data collection, and still enables OWH to gather a comprehensive and up-to-date picture of the CCOE program and its services. Overall response rates are estimated at 85-90%.
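For illustration, the sketch below shows how the stratified option described above could work in practice, drawing one randomly selected partner organization per service type. The partner names, service types, and helper function are hypothetical and not part of the sampling plan itself.

```python
# Hypothetical sketch of a stratified random sample of partner organizations,
# selecting up to one partner per service type (stratum).
import random

partners = [
    {"name": "Partner A", "service_type": "primary care"},
    {"name": "Partner B", "service_type": "primary care"},
    {"name": "Partner C", "service_type": "health education"},
    {"name": "Partner D", "service_type": "transportation"},
    {"name": "Partner E", "service_type": "health education"},
]

def stratified_partner_sample(partners, per_stratum=1, seed=0):
    """Randomly select up to `per_stratum` partners from each service type."""
    rng = random.Random(seed)
    strata = {}
    for partner in partners:
        strata.setdefault(partner["service_type"], []).append(partner)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(per_stratum, len(members))))
    return sample

for partner in stratified_partner_sample(partners):
    print(partner["service_type"], "->", partner["name"])
```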

CCOE Client Survey

The respondent universe is all registered clients of the CCOE program who are 18 years of age or older. Every CCOE client will have an equal probability of selection and will not be re-sampled during the six-month data collection effort. Participation is voluntary; any patient may decline to participate without penalty. Respondents will be chosen using a random sampling strategy. Each CCOE will be asked to administer the survey to 400 patients; therefore, the number of patients sampled weekly at each CCOE will differ based on patient volume. The survey effort will target an 85% response rate and sample CCOE patients accordingly. Because the achieved response rate depends on the number of patients who decline to participate, refusals will be monitored throughout the survey period. Results from the survey can be generalized to each CCOE’s patient population because a random sampling scheme will be used.

CCOE Site Visit Protocol

The respondent universe includes CCOE leadership (at each CCOE), such as Center Directors, CCOE staff, and partner organization leadership and staff. All CCOE leadership will participate in the Center Director introductory brief and debriefing meetings. The eleven CCOEs and three Ambassadors for Change (AFC)[1] will be asked to participate because additional information from each CCOE is necessary to clarify data gathered during the CCOE Center Director and Program Coordinator Survey and to understand some of the unique processes and tools developed at each CCOE. Therefore, sampling is not appropriate, and there are too few CCOEs to employ a psychometrically sound sampling strategy.

However, not all CCOE staff will be invited to participate in the focus groups. Focus groups will consist of no more than six to eight staff. Staff will be chosen so as to minimize any disruptions in service at the CCOE. Focus groups will consist of a representative sample of CCOE staff; however, participation will be limited to staff who are working at the time of the site visit and are available to meet.

In choosing which partner facilities to visit, Booz Allen will first consider the number of community partners each CCOE has (key factors influencing this decision include the physical distance between facilities). Booz Allen, with feedback from the CCOE, will select partners based on frequency of interaction with the CCOE and volume of CCOE patients. For each partner facility chosen, a senior leader or liaison partner will be asked to participate in an interview. The senior leader or liaison partner will choose employees to participate in one of two focus groups. Similar to the CCOE staff focus groups, employees will be chosen so as to minimize any disruptions in service at the partner facility. A representative sample of employees will be chosen to participate.

2. Procedures for the Collection of Information


The procedure for collecting data varies for each data collection instrument; however, the collection procedures will be similar or identical to those used for the Round I evaluation. Any differences will incorporate lessons learned during Round I. The procedures to collect data are described below. Each data collection instrument will be administered one time only; thus, data will be collected only once.

CCOE Center Director and Program Coordinator Survey

The Evaluation Team will send each CCOE Center Director and Program Coordinator an email inviting him or her to complete a survey for his or her CCOE. The email will include detailed instructions on how to access the online survey, what data are needed, and the time frame provided for completion. A point of contact on the Evaluation Team will be provided to answer questions regarding the survey or the CCOE evaluation. The CCOEs will have approximately one month to gather the requested information and to complete and submit the survey at their convenience. Based on pilot testing, completing the survey is estimated to take approximately 30 minutes on average. (This time may vary by CCOE depending on how many partners and services are offered and how easy it is to generate utilization information from their information technology system.) Throughout this period, the Evaluation Team will be actively involved, providing survey-related support and monitoring survey response rates.

CCOE Community Partner Survey

The Evaluation Team will send an email invitation to each partner organization chosen to participate. The email will include detailed instructions on how to access the online survey, what data are needed, and the time frame provided for completion. A point of contact on the Evaluation Team will be provided to answer questions regarding the survey or the CCOE evaluation. The partners will have approximately one month to gather the requested information and to complete and submit the survey at their convenience. Based on pilot testing, it will take approximately 20-30 minutes to complete the survey; this will vary according to the type of services the partner offers CCOE patients, with completion time decreasing as the number of services offered decreases. The Evaluation Team will enlist CCOE leadership to actively encourage partner participation in the evaluation and to follow up with their partners to ensure timely completion of the survey, a strategy that worked well during Round I. Throughout this period, the Evaluation Team will be actively involved with the survey process, providing survey-related support and monitoring survey response rates. No unusual circumstances that would require specialized sampling procedures are anticipated.

CCOE Client Survey

Data will be collected from CCOE clients via a point-of-service pen-and-paper instrument. Four hundred patients from each CCOE will be surveyed over a six-month time frame. Participants will be randomly selected[2] to ensure that every patient who visits the CCOE during the survey administration period has an equal probability of selection. The survey has been designed to allow completion within 15 minutes, and survey administrators will be available to answer patient questions regarding the survey or to translate from English to Spanish, if necessary. Before administering the survey, CCOE staff will review the Client Survey Administration Training Guide and participate in an hour-long training session. This will acquaint the survey administrators with the methodology and overall sampling strategy to be used at their CCOE.[3] Additionally, Booz Allen will remain in close contact with the CCOEs so that they can effectively modify the sampling strategy when necessary and monitor data quality.

The purpose of this survey is twofold. Gathering information on the current patient population will validate the findings from Round I, and the surveys will also provide a current understanding of this population’s perception of the services offered or coordinated through the CCOE. There are no a priori hypotheses regarding group differences; therefore, the sample will not be stratified by demographic characteristic. The sampling plan considers not only the desired number of survey participants (400 per CCOE) but also the client volume at each CCOE during the survey administration period. For example, if CCOE #1 provides services to 400 patients a month, it will see 2,400 patients during the 6-month administration period. If we estimate that 10% of those asked to participate will decline, the sampling ratio becomes roughly 1:5; in other words, at CCOE #1 the survey administrator will ask every 5th client whether she would like to participate in the survey. The brevity of the survey, anonymity, and the in-person survey administration should reduce client nonresponse. However, when a client elects not to participate or has been previously surveyed, the survey administrator will revert to the original sampling plan and invite the next identified client to participate.
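The hypothetical helper below simply works through the sampling-interval arithmetic from the example above (400 clients per month over six months, with an assumed 10% refusal rate, yields an interval of roughly every 5th client). The function name and parameters are illustrative; the actual interval for each CCOE will be fixed in its customized sampling plan.

```python
# Hypothetical worked version of the sampling-interval arithmetic above.

def sampling_interval(monthly_volume, months=6, target_completes=400,
                      expected_decline_rate=0.10):
    """Return k such that inviting every k-th client yields ~target_completes."""
    expected_clients = monthly_volume * months                    # e.g., 400 * 6 = 2,400
    invitations_needed = target_completes / (1 - expected_decline_rate)  # ~444
    return max(1, int(expected_clients // invitations_needed))

# CCOE #1 from the example: 400 clients per month over six months.
print(sampling_interval(400))  # -> 5, i.e., invite every 5th client
```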

CCOE Site Visit Protocol

Each site visit will take approximately two to three days, including travel time. Four to five interviews and focus groups will take place at each CCOE. Interviews will be conducted with leadership and staff. Two to three interviews and focus groups will occur with each partner organization selected during the CCOE site visit. Each of these focus groups/interviews will last no more than 1.5 hours. This protocol mimics the one successfully used during Round I.

The Booz Allen Evaluation Team will send two staff members to conduct each site visit. This ensures that there is always one individual available to record information during all interviews, discussions, and demonstrations that take place during the site visit. One of the team members conducting the site visit will be a certified clinician, public health expert or other Subject Matter Expert (SME) in the medical and/or public health fields. The other team member will be skilled in facilitation and data collection to ensure that all topics of interest are addressed appropriately during the site visit.

An interview protocol will be used to guide the site visits. The Booz Allen Evaluation Team members will interview staff and gather documentation to clarify, validate, and obtain more information on data previously submitted through the CCOE and/or partner surveys, as needed. Information gathered for each protocol question will be entered into a template (either Word or Excel) to facilitate the data interpretation process.

3. Methods to Maximize Response Rates and Deal with Nonresponsiveness

One of the requirements for receiving a CCOE designation and accompanying OWH funding is participation in the CCOE program evaluation. This requirement served as an incentive for ensuring that the Center Directors and Program Coordinators responded to this survey effort during Round I, for which there was 100% participation. Each of the CCOEs in existence at the time was actively involved in the development of the evaluation methodology and the survey instruments for Round I and had multiple opportunities to provide input on these tools. The CCOEs have all been informed of the upcoming Round II evaluation and are aware that participation is a requirement related to their grant funding. We anticipate that the implementation of all of these strategies will help to ensure that all CCOEs complete a survey, providing a response rate of 100%.

CCOE partners were very responsive to the CCOE Community Partner Survey during Round I. CCOEs have developed cooperative relationships with their partners in order to offer integrated services under the seamless model of care that each of the CCOEs is working to develop. In many cases, these relationships are defined by memoranda of understanding or actual contractual arrangements. During Round I each CCOE reached out to their partners to ascertain their willingness and ability to participate in a data collection effort. We anticipate that a high response rate will be achieved for Round II with the use of the same methods used in Round I and described here.

The use of an Internet-based survey that allows respondents to directly key in their responses will help to maximize response rates for both the CCOE Center Director and Program Coordinator Survey and the CCOE Community Partner Survey. The survey was constructed for ease of use. Clear and concise instructions, similar to those used in Round I with updates made to reference Round II appropriately, will be included with the surveys. The Booz Allen Evaluation Team will be available to provide instructions and guidance on how to complete the survey and to answer general questions as needed. The Evaluation Team will monitor response rates and work with CCOEs and their partners to ensure completion. These methods were utilized successfully during Round I, and we anticipate their successful use during Round II as well.

Non-response is not anticipated to be a factor in the CCOE Site Visits, since meeting times with CCOE staff and partner organizations will be planned prior to the site visit. There were no issues with participation or response during Round I, so this method has already proven successful. In addition, the CCOEs have experience with hosting site visits through their regularly scheduled OWH site visits for program review and technical assistance. The Booz Allen Evaluation Team will work with each CCOE and partner organization to determine which staff will be included in the interviews and focus groups for Round II. As with the previous evaluation effort, the team will work with the CCOEs to structure the site visit so as to minimize any disruptions in service.

OWH is planning to provide additional funding to each of the CCOEs for one of their staff members to support the client survey data collection efforts for this evaluation. This method was used successfully during Round I. Just as with Round I, a point-of-service (in-person) survey administration will maximize response rates for the CCOE Client Survey. Prior to taking the survey, the potential respondent will be informed of the purpose of the survey and how the collected information will be used. They will also be informed that their responses will be kept anonymous. The survey is designed to be administered in a short period of time (10-15 minutes), and it will be administered immediately after a patient agrees to take the survey. The survey administrator(s) will be available to answer questions and translate to the primary language spoken in the local community, if necessary. Each of these techniques was utilized during Round I, and it is anticipated that they will help to increase the likelihood of client participation in Round II as well.


4. Test of Procedures or Methods to be Undertaken

The CCOE Center Director and Program Coordinator Survey was pilot tested with multiple CCOEs prior to the Round I evaluation. The specific CCOEs that participated in the pilot testing of this instrument include Northeast Missouri Health Council, Inc., Kirksville, Missouri; St. Barnabas Hospital and Healthcare System, New York, New York; NorthEast Ohio Neighborhood Health Services, Inc., Cleveland, Ohio; and Hennepin County Department of Primary Care, Minneapolis, Minnesota. During the pilot test, the length of time to complete the survey was determined, as well as individual CCOE leadership reaction to survey content, clarity, and design. Feedback from this pilot test was used to refine the survey instrument. Additionally, this instrument was utilized during Round I of the National Evaluation with each of the 12 CCOEs in existence at the time, administered using the same methods and approach planned for Round II.

The CCOE Community Partner survey was pilot tested with 9 CCOE partners who were partnering with three of the CCOE programs (Northeast Missouri Health Council, Inc, Kirksville, Missouri; Hennepin County Department of Primary Care, Minneapolis, Minnesota; Northeastern Vermont Area Health Education Center, St. Johnsbury, Vermont) prior to the National Evaluation. During the pilot test, length of time to complete the survey was determined, as well as partner reaction to survey content, clarity, and design. Feedback from this pilot test was used to make refinements to the survey instrument. This instrument was also utilized during the Round I National Evaluation, using the methodology for Round II as described in this supporting statement.

The CCOE Client Survey was pilot tested with 9 CCOE patients at an existing CCOE (NorthEast Ohio Neighborhood Health Services, Inc., Cleveland, Ohio). During this pilot test, length of time to complete the survey was determined, as well as patient reaction to survey content, clarity, and design. Feedback from this pilot test was used to make refinements to the survey instrument. This instrument was also utilized during Round I, administered using the methods described.

No pretest was conducted for the CCOE site visits prior to Round I of the National Evaluation; however, OWH’s previous experience with conducting site visit reviews for the CCOE program and other similar programs for grant program management purposes was taken into consideration when determining the structure and methods for conducting the CCOE site visits. These methods were implemented successfully during Round I and will be repeated for Round II.

5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Statistical Aspects Contact

Barbara F. James

Director, National Community Centers of Excellence in Women’s Health Program

Office on Women’s Health

(301) 443-1402


Data Collection/Analysis and Statistical Contact

Fatima Riaz

Associate

Booz Allen Hamilton

240-314-5675

Attachments:

  1. Evaluation Methodology

  2. CCOE Center Director and Program Coordinator Survey

  3. CCOE Community Partner Survey

  4. CCOE Client Survey

  5. Site Visit Protocol


[1] As of FY 2007, seven AFCs are in existence; however, four of them will have transitioned to AFC status in October 2006 and will not have sufficient experience functioning as an AFC to respond to the evaluation questions from an AFC perspective at the time data collection is anticipated to begin. Thus, they will be treated as full CCOEs for the purposes of this evaluation and will be asked to respond to evaluation questions from a CCOE perspective.

[2] As stated in B.1, stratified random sampling will be employed for CCOEs where the survey will be administered at more than one clinical care site.

[3] A customized sampling strategy will be developed for each CCOE based on the monthly volume of patients and type of CCOE services offered.


