WISEWOMAN OMB Part B_11 07 2014


WISEWOMAN National Program Evaluation: Implementation Assessment

OMB: 0920-1068



WISEWOMAN National Program Evaluation


Supporting Statement


Part B: Statistical Methods


November 7, 2014









Contact: Tiffany Dearman

Division of Heart Disease and Stroke Prevention

Centers for Disease Control and Prevention

Atlanta, Georgia

Phone number: 770-488-8151

Email address: [email protected]



This page left blank for double-sided copying.

CONTENTS

B. Statistical Methods

1. Respondent Universe and Sampling Methods

2. Procedures for the Collection of Information

3. Methods to Maximize Response Rates and Deal with Nonresponse

4. Test of Procedures or Methods to be Undertaken

5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data



Attachment A: Authorizing Legislation

A1: Breast and Cervical Cancer Mortality Prevention Act of 1990

A2: Public Health Service Act

Attachment B: Proposed Outcomes, Measures, and Data Sources

Attachment C: Program Survey Instrument and Supplementary Documents

C1: Program Survey

C2: Program Survey Invitation Email

C3: Program Survey Reminder Email

Attachment D: Network Survey Instrument and Supplementary Documents

D1: Network Survey

D2: Network Survey Invitation Email

D3: Network Survey Reminder Email

Attachment E: Site Visit Data Collection Instruments and Supplementary Documents

E1: Discussion Guide - Group Interview with Key Administrative Staff

E2: Discussion Guide - Group Interview with Healthy Behavior Support Staff

E3: Discussion Guide - Group Interview with Staff at Partner Clinical Providers

E4: Discussion Guide - Community Partners

Attachment F.1: Federal Register Notice

Attachment F.2: Summary of Public Comments

This page left blank for double-sided copying.

TABLES

B.1 Summary of the WISEWOMAN data collection activities

B.2 Individuals consulted on methods



This page left blank for double-sided copying.

B. Statistical Methods

1. Respondent Universe and Sampling Methods

Table B.1 summarizes the potential respondent universe, targeted respondents, and methods for selecting respondents for the three proposed new data collection activities. Below, we discuss the targeted respondents and methods in more detail for each data collection activity.

Table B.1. Summary of the WISEWOMAN data collection activities

Program survey

  • Respondent universe: All 22 WISEWOMAN awardees

  • Targeted respondents: All 22 WISEWOMAN awardees (program directors, program managers, and data managers may collaborate to complete sections of the survey)

  • Method for selection: Not applicable; all awardees will be asked to complete the survey.

Network survey

  • Respondent universe: Partners of all 22 WISEWOMAN awardees

  • Targeted respondents: All 22 WISEWOMAN awardees and their partners (10 project partners per awardee)

  • Method for selection: Awardees will assist in identifying potential partners that actively collaborate with them, such as other federally funded programs in the state, local chapters of nonprofit organizations, and other community-based organizations.

Site visits

  • Respondent universe: All 22 WISEWOMAN awardees

  • Targeted respondents: Program staff and partners across 18 selected WISEWOMAN awardee sites (7 respondents per site)

  • Method for selection: Sites identified through the program survey, awardee reports, and CDC project officer interactions as having emerging and best practices in prioritized areas for evaluation. Program directors at each site will be asked to help identify key program staff and partners best suited to provide the information sought through the site visits.



Attachment C includes the program survey instrument. Attachment D includes the network survey instrument. Attachment E includes the site visit data collection instruments.

Program survey

All WISEWOMAN awardees will be asked to complete the program survey to ensure that consistent information is collected about implementation across the awardees and to enable analysis of variation in implementation to contribute to the summative evaluation. There are 22 WISEWOMAN awardees (20 states and 2 tribal organizations) with four-year cooperative agreements from 2013 to 2017. Program directors, program managers/coordinators, and data managers may collaborate to complete the sections of the survey. The program survey will be self-administered through an editable PDF. Data collection is anticipated for Program Years 2 and 4 to capture early and mature implementation. A pre-test of the program survey instrument was conducted with a small subset of awardees in Program Year 1. Results of the pre-test are provided in Section 4.

Network survey

The network survey will be fielded with awardees and program partners at each of the 22 awardee sites. We anticipate including approximately 10 partners per awardee for a total of 220 across the awardees. The total number of respondents will be up to 242 (22 awardees plus 220 partners). Collecting information from all awardees and from a range of the various partners will provide information that is representative of the WISEWOMAN awardees and their partners. In addition, the information can be combined with outcomes from all awardees to be used in the summative evaluation in identifying community and program characteristics associated with changes in outcomes.

Awardees will assist in identifying potential partners that actively collaborate with them, such as other federally funded programs in the state (for example, the Breast and Cervical Cancer Early Detection Program, Million Hearts Initiative, Diabetes Program, and Tobacco Cessation Program); local chapters of nonprofit organizations (such as the American Heart Association); and other community-based organizations (such as the YMCA and Weight Watchers). If an awardee identifies more than 10 partners, we will randomly select 10. Partners will complete the self-administered survey through a web-based application during Program Years 2 and 4. The pre-test of the network survey instrument was conducted with a small subset of awardees in Program Year 1. Results from the pre-test are provided in Section 4.
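
The partner selection rule described above (include all partners when an awardee names 10 or fewer; otherwise draw a simple random sample of 10) can be sketched as follows. This is an illustrative sketch only; the function name, the partner labels, and the fixed seed are hypothetical, not part of the evaluation protocol.

```python
import random

def select_partners(candidates, cap=10, seed=None):
    """Select up to `cap` partners from an awardee's candidate list.

    If the awardee identifies `cap` or fewer partners, all are included;
    otherwise a simple random sample of size `cap` is drawn.
    """
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    if len(candidates) <= cap:
        return list(candidates)
    return rng.sample(candidates, cap)

# Example: a hypothetical awardee names 13 collaborating organizations
candidates = [f"Partner {i}" for i in range(1, 14)]
selected = select_partners(candidates, cap=10, seed=42)
print(len(selected))  # 10
```

With 22 awardees each contributing at most 10 partners, this rule yields the up-to-220 partner total cited in the text.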

Site visits

We propose to conduct up to six site visits per year in Program Years 2 through 4 (18 total visits) to assess implementation across the awardees. Sites will be selected based on emerging and best practices in prioritized areas for evaluation as identified through the program survey, awardee reports, and CDC project officer interactions. During the site visits, in-person interviews will be conducted with key program staff (for example, program directors, program managers, data managers, and lifestyle programs (LSP) and health coaching (HC) coordinators) and their partners (for example, clinical and LSP/HC providers, American Heart Association, quitline, and community-based organizations). The program directors at each site will be asked to help identify key program staff and partners that are best suited to provide the information sought through the site visits. Conducting site visits at 18 sites with seven respondents per site should provide information that is representative of the broad range of WISEWOMAN awardees while fitting within the project resources and not placing an unreasonable burden on the respondents or awardees.

We anticipate conducting seven interviews per site: one administrator; two additional key program staff; two providers; and two partners. This target would result in 42 interviews at 6 sites per year and 126 total interviews in three years across the 18 sites. The number of key informant interviews that can ultimately be scheduled per site within the allotted time will depend on the availability of respondents, project resources, and the amount of travel time required between interviews.
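
The interview totals above follow directly from the per-site mix. As a quick arithmetic check (the role labels mirror the text; the variable names are illustrative):

```python
# Planned interview mix per site, as described in the text
per_site = {"administrator": 1, "key program staff": 2, "providers": 2, "partners": 2}

interviews_per_site = sum(per_site.values())     # 7 interviews per site
sites_per_year, years = 6, 3
per_year = interviews_per_site * sites_per_year  # 42 interviews per year
total = per_year * years                         # 126 interviews across 18 sites
print(interviews_per_site, per_year, total)      # 7 42 126
```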

2. Procedures for the Collection of Information

Each component of the evaluation is unique, and as such, the procedures will be described separately below.

Program Survey

No statistical methods will be used to draw a sample for this survey, as it will include the full universe of all 22 WISEWOMAN awardees. The survey will be administered twice, in Program Years 2 and 4, using the same questionnaire and instrument in each round of administration. The second round of administration will be used to measure change across the variables of interest. The survey will be conducted by CDC’s evaluation contractor.

This survey will be conducted using an editable PDF, in a self-administered format sent by email. The survey will take approximately one hour to complete. Awardee program managers are the target respondents, but they may choose to delegate sections of the survey to other knowledgeable staff at their discretion. Respondents will be provided with detailed instructions on how to complete the questionnaire, both at the beginning of the instrument and through visual cues throughout. CDC’s evaluation contractor will provide a toll-free telephone number, as well as a dedicated email address for project technical assistance. This information will be printed on the questionnaire. Trained project staff will respond to any questions that potential respondents have about specific items in the questionnaire or about participation overall.

The 8-week field period will launch with an invitation email that includes the editable PDF (Attachment C.2). All non-responders will receive up to four email reminders (Attachment C.3), sent in weeks 2 through 8, and reminder telephone calls placed by trained project staff in weeks 4 through 8. These staff will confirm the contact information on record, ensure the invitation was received, and encourage participation. They will also address any concerns raised about participation in the survey, both through non-response follow-up and through responses to inquiries received via the study’s toll-free telephone number and email address. Based on responses to similar surveys, a response rate of 100 percent is anticipated for this effort.

As completed cases are returned, they will be reviewed for completeness prior to data entry. If a completed questionnaire is missing critical items, or if other questions arise in the review, the contractor will reach out to respondents for clarification. After roughly half of the awardees have returned completed questionnaires, CDC’s contractor will review the existing data and conduct an initial data cleaning. This step will help identify potential issues for cleaning and analysis early so that strategies may be employed to correct these issues. Data will be cleaned based on specifications developed by CDC and its evaluation contractor, and the final data file will be used for analysis.

Network Survey

No statistical methods will be used to draw a sample for this survey, as it will be defined by the responses provided by the 22 WISEWOMAN awardees, each of whom will identify up to 10 organizations with whom they partner most closely to implement their grant. As such, the sample size could include up to 242 organizations (22 awardees and up to 220 partners). Each will be contacted to complete a 30-minute questionnaire in Program Years 2 and 4. Administrators from each of these partner organizations are the target respondents, but they may choose to delegate sections of the survey to other knowledgeable staff, at their discretion.

The survey will be conducted by CDC’s contractor. It will be collected in a self-administered format using web-based survey software, with non-response follow-up conducted by telephone. The web-based application will undergo rigorous testing by programmers, CDC’s evaluation contractor, and CDC staff prior to launch. CDC’s contractor will provide a toll-free telephone number, as well as an email address for project technical assistance. This information will be provided in all emails and letters and will appear at the bottom of the screen in the web survey. Trained project staff will respond to any questions that potential respondents have about specific items in the questionnaire or about participation overall. Prior to the launch of each round of survey administration, the awardees will send an advance notification/endorsement email to the points of contact identified as the target respondents for their partners. This email will let the partners know the survey is coming, establish credibility, and relay the importance of participation.

The 12-week field period will launch with an invitation email, which will include a hyperlink to the web survey (to reduce burden), as well as the sample member’s unique password (Attachment D.2). One week later, a follow-up email will provide the link and password again. All non-responders will receive up to six email reminders (Attachment D.3), sent in weeks 2 through 12, and reminder telephone calls placed by trained project staff in weeks 8 through 12. These staff will confirm the contact information on record, ensure the invitation was received, and encourage participation. They will also address any concerns raised about participation in the survey, both through non-response follow-up and through responses to inquiries received via the study’s toll-free telephone number and email address. Based on responses to similar surveys, a response rate of 80 percent is anticipated for this effort. If needed, awardees may be asked to reach out to their partners before the close of the field period to encourage participation in the survey or to respond to concerns about the legitimacy of the effort.

The web-based application will allow respondents to stop at any time, save the survey, and return to it later, reducing burden by allowing them to complete it at their convenience. In addition, internal skip patterns and range checks will be programmed into the survey to ensure the accuracy of data and that respondents do not answer questions unnecessarily. The web-based software will also provide the opportunity to clarify responses or flag out-of-range values in real time. For example, if a respondent leaves an item blank, or if a response falls outside the anticipated range of values, pre-programmed text will appear to guide the respondent, minimize item-level non-response, and promote better data quality overall. Once respondents complete the questionnaire, they will click a “submit” button and the status of the case will be updated in real time. This real-time synchronization is important in the final phase of the field period to avoid placing undue burden (follow-up phone calls or emails) on those who have already participated. The web-based application also identifies partially completed questionnaires, along with the last item completed. This information will be used to tailor non-response follow-up, as needed.
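
The blank-item and range checks described above can be illustrated with a minimal validation sketch. The item identifier, the allowable range, and the prompt wording below are hypothetical; they are not the actual survey specifications.

```python
def validate_response(item_id, value, ranges):
    """Flag a blank or out-of-range answer so the respondent can correct it
    before submission, mirroring the in-survey checks described above."""
    if value is None or value == "":
        return f"Item {item_id}: a response is required."
    lo, hi = ranges[item_id]
    if not (lo <= value <= hi):
        return f"Item {item_id}: please enter a value between {lo} and {hi}."
    return None  # no issue; accept the response

# Hypothetical item with an allowable range of 0-100
ranges = {"Q12": (0, 100)}
print(validate_response("Q12", 150, ranges))  # out-of-range prompt
print(validate_response("Q12", 85, ranges))   # None (accepted)
```

A production web survey would attach checks like these to each item and display the returned message in the respondent's browser; the sketch shows only the decision logic.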

If a completed questionnaire is missing critical items, or if other questions arise in the review, the contractor will reach out to respondents for clarification. After roughly 10 percent of sampled organizations have completed the questionnaire via the web, CDC’s contractor will review the existing data and conduct an initial data cleaning. This step will help identify potential issues for cleaning and analysis early so that strategies may be employed to correct them. Data will be cleaned based on specifications developed by CDC, and the final data file will be used for analysis.

No individually identifiable information will be collected. The evaluation contractor will assist CDC in administering the survey and will develop a web-based survey that operates through secure servers; data will be stored securely by the contractor. At the end of data collection and analysis, the evaluation contractor will securely transmit survey data to CDC, and the data will be permanently destroyed on the contractor’s servers.

Site visits

Site visits will be conducted with 18 of the 22 WISEWOMAN awardees. These visits will take place across three years (Program Years 2, 3, and 4) with six visits occurring each year. No statistical methods will be used to identify the awardees selected for the site visits. Instead, they will be selected based on emerging best practices in prioritized areas for the evaluation as identified through awardee reports, discussions with CDC staff, and the program survey from Program Year 2.

Lead administrators at each site will work with CDC’s contractor to identify optimal dates and times for the visits. They will also notify applicable staff that the contractor will be reaching out to set up appointments. Prior to each visit, all staff identified as target participants will be contacted by email or telephone to schedule an appointment. All interviews will be conducted in person. Up to 15 minutes is assumed in the burden estimates for time spent engaging with CDC’s contractor to book and confirm interview appointments; the remainder of the time will be spent completing the consent form and in the interview itself. Interviews with WISEWOMAN administrative staff will last up to 60 minutes, with one conducted per site. Interviews with administrators of up to two partner organizations will be completed during each visit, and each is expected to last up to 30 minutes. Up to two 30-minute interviews will be conducted with the awardee organization’s “healthy behavior” staff, and up to two 30-minute interviews will be conducted with WISEWOMAN health care providers. We anticipate an average of 7 interviews per site, for a total of 126 interviews across the 18 selected WISEWOMAN sites. At each site, we will attempt to schedule interviews over a day and a half, within regular work hours. A member of CDC’s evaluation contractor’s staff will lead the interviews. Interviews will be audio recorded if key informants agree. After each visit, audio recordings will be transcribed and then coded for analysis.

3. Methods to Maximize Response Rates and Deal with Nonresponse

The data collection procedures discussed below are designed to maximize response rates and to promote the accuracy and completeness of information collected across each component of the evaluation.

Program and network surveys

The program and network surveys are anticipated to have minimal non-response because of the value awardees and partners place on the evaluation and its role in helping them serve the populations of interest. Because participation in the surveys is seen as a value-added endeavor, non-response is less likely to stem from unwillingness or a failure to recognize the value of participation than from difficulty finding time to complete the questionnaire amid the heavy workload carried by program administrators. Because the questionnaires are designed in a self-administered format, they can be completed across multiple sessions, as time permits. As noted in section B2, CDC’s evaluation contractor will use highly trained staff to reach out to non-responders by telephone to address any barriers to participation. In addition, all correspondence will include the email address and telephone number for a help desk available to answer any questions that respondents may have throughout the field period.

To collect high-quality data in each of the surveys, the following components are included as part of the planned data collection:

  • Review and follow-up (as needed) for submitted program survey questionnaires to ensure high-quality data: Each completed questionnaire will be reviewed to ensure that 1) the returned questionnaire meets the criteria established to be designated as complete; 2) responses were provided for all critical variables; 3) skip patterns were followed; and 4) responses fell within allowable ranges.

  • Use of a web-based survey application (network survey): By deploying the network survey in a web-based application, we seek to provide a way to collect high quality and consistent data and minimize burden by (1) routing respondents through the form, thus avoiding routing errors; (2) including range checks so that out-of-range values are checked and flagged for respondents to correct immediately; and (3) including consistency checks to ensure that the respondent’s answers are consistent throughout the questionnaire. We will develop clear instructions and program the web-based application to be as intuitive as possible. The toll-free number and email address will be provided on each screen, should respondents have questions about the instrument or should they experience any difficulties in navigation on the web.

  • Reviewing data frequencies. Frequency reviews are an important tool in ensuring data quality. To determine whether the instrument is performing as specified, frequencies will be reviewed early in the field period. If programming errors are detected (for example, erroneous skip logic or inadequate range specifications), CDC’s contractor will correct the errors immediately. If missing data need to be retrieved from respondents, staff will be instructed to follow up and obtain the information.

  • Minimizing non-response bias. We anticipate minimal non-response to the surveys, as participation is integral to their work in implementing WISEWOMAN program activities.

Site visits

Response to the site visits is expected to be high because of the interest of WISEWOMAN awardees in the evaluation. Interviews will be scheduled at the convenience of the key informant. Based on experience with similar activities and the typically high level of motivation among WISEWOMAN staff and their partners, we anticipate that at least 95 percent of key informants will complete interviews during the site visits. Interview notes will be supplemented with transcriptions of the audio recordings, which allow the interviewer to engage actively with the respondent without having to take detailed notes; the transcriptions can then be coded and analyzed.

4. Test of Procedures or Methods to be Undertaken

Testing will occur for both of the survey efforts in the evaluation. The site visit data collection instruments will not undergo testing prior to data collection because they are more qualitative in nature and interviewers can adjust the instruments to meet the time limits of the interview.

The program survey was pre-tested with a convenience sample of administrators from three awardees. Because the universe for the survey is small, the pre-test sample was also small, to minimize burden for the pre-test participants (who will also complete the questionnaire during the evaluation itself). The questionnaire was completed on paper, with a debriefing conducted by a professionally trained member of the evaluation contractor’s team. Participants were asked to provide feedback on any questions they felt were confusing, overly burdensome, or difficult to answer accurately. The length of administration was timed to verify the estimates provided in Section A; completion times ranged from 65 minutes to 166 minutes, with a mean of 108 minutes. To reduce burden, we pared down the survey to an average of 60 minutes by eliminating items that were deemed non-essential or exceptionally burdensome, as well as items available from other sources, though not on a systematic basis. Additionally, we revised survey items for clarity. These revisions included:

  • Updating the survey introduction text to suggest that grantees have program documents and annual reports in hand before beginning the survey

  • Adding a “don’t know” option to the series of questions about team-based care

  • Clarifying questions to consistently use the phrase “clinical services” to describe services offered by medical health providers

  • Splitting the question about waist-hip measurements into two items, to collect separately whether grantees collect hip measurements and whether they collect waist measurements

  • Adding the probe “If you do not share evaluation findings, please select ‘no’ for all” to provide further instruction to respondents at items D5 and D6

  • Adding response options to a number of multiple-response items, based on input from respondents, to expand coverage of possible responses

The network survey was pre-tested with a convenience sample of five administrators from partner organizations of four awardees. To identify respondents for the network survey pre-test, we contacted the four WISEWOMAN grantees participating in the program survey pre-test and asked each to provide contact information for two or three organizations with which they partner. Specifically, we requested that each grantee provide a variety of partners, including medical providers, non-profits, and other community-based organizations. Each of the four grantees responded with two or three contacts, for a total of 11 partners in the sample; we anticipated that several non-responses would result in nine or fewer respondents. We sent an introductory email to each identified individual and received seven responses from individuals willing to assist and one refusal due to lack of time; three partner agencies did not respond. We then sent the survey to the seven respondents who agreed to participate.

By the end of the three-week pre-test period, we received completed surveys from five individuals: one behavioral health provider, two health care provider organizations, and two individuals from offices within state health departments. The questionnaire was completed on paper and was timed by section and overall. The self-reported survey lengths ranged from 14 minutes to 46 minutes, excluding any breaks, with an average of 27.4 minutes per completed survey. This timing was consistent with the expected survey length of 30 minutes.

After completing the questionnaire, respondents sent feedback on specific items of interest, as well as on any questions they felt were confusing, overly burdensome, or difficult to answer accurately. Based on the results of the pre-test, the instrument underwent revisions. First, we added instructions for respondents working within state health departments. Second, we added a response option to the series of questions about service areas. Third, we added a “No Barriers” option to the questions about types of barriers that respondents’ organizations experience. Fourth, we clarified items that respondents found confusing or overly wordy.

5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

CDC staff from the WISEWOMAN program and the Applied Research and Evaluation Branch (AREB) and staff from Mathematica Policy Research were consulted about the substantive, methodological, and statistical aspects of the study. Their recommendations were incorporated into the study design and instruments on an ongoing basis. Table B.2 lists the individuals consulted.

Staff members from Mathematica designed the data collection tools with input from CDC. Mathematica staff will also be responsible for overseeing and executing the collection of data with input from CDC staff. The data analysis will be led by Mathematica in close consultation with CDC staff throughout the process. Tiffany Dearman, Eileen Chappelle, and Derrick Gervin are responsible for receiving and approving contract deliverables.



Table B.2. Individuals consulted on methods

Eileen Chappelle, Health Scientist
Evaluation and Program Effectiveness Team, AREB
[email protected]
770-488-8144

Derrick Gervin, Health Scientist
Evaluation and Program Effectiveness Team, AREB
[email protected]
770-488-5004

Isam Vaid, Health Scientist
Division for Heart Disease and Stroke Prevention
WISEWOMAN Program
[email protected]
770-488-8000

Alexis Kaigler, Project Officer
Division for Heart Disease and Stroke Prevention
WISEWOMAN Program
[email protected]
770-488-5261

So O’Neil, Senior Researcher
Mathematica Policy Research
[email protected]
617-301-8975

David Jones, Senior Researcher
Mathematica Policy Research
[email protected]
617-674-8351

Allison Hedley Dodd, Senior Researcher
Mathematica Policy Research
[email protected]
202-484-4687

Holly Matulewicz, Survey Researcher
Mathematica Policy Research
[email protected]
617-674-8362


