
Supporting Statement Part B


Evaluation of AHRQ’s Guide to Clinical Preventive Services


OMB No. 0935-0106


Version 11-7-2007



Introduction


To meet the objectives of this project, AHRQ will employ an evaluation intended to provide AHRQ with valuable information about recipients’ knowledge of, attitudes toward, and use of the Guide in clinical practice. Multi-method data collection procedures (e.g., surveys, semi-structured interviews) will be employed throughout the evaluation.


B.1. Respondent Universe and Sampling Methods

Attachment A-1 presents the proposed time schedule for this project. Attachment B-1 presents the respondent universe, sampling methods, and proposed sample sizes for the target audience identified for this evaluation. The sampling methods have been carefully developed to ensure the collection of data from a large number of respondents in a relatively short time, at the least cost and with minimum burden to respondents. It is expected that the selected sampling methods will provide a response rate of at least 80 percent.


An existing list of “customers” (e.g., a list of individuals who previously agreed to participate when they ordered the publication being evaluated and/or who self-select to respond to a Web-linked survey) will be used as the sampling pool.




B.2. Information Collection Procedures/Limitations of the Study


Statistical Methodology for Stratification and Sample Selection

The primary purpose of this project is to gather information from recipients of a specified publication, the 2005 and 2006 versions of the Guide to Clinical Preventive Services (the Guide), in order to (1) inform future product versions or enhancements and (2) examine the extent to which, and how, recipients have used the Guide in clinical practice.

To meet the objectives of this evaluation, AHRQ will employ a multi-method data collection approach, which includes surveys and semi-structured interviews. The multi-method approach balances the burden on respondents. For instance, the surveys will collect information from the target audiences quickly and with minimal time demands, while the semi-structured interviews will collect in-depth qualitative data from the target audiences. The sample selection by data collection methodology and the estimated sample sizes for the target audience are provided in Attachment B-1.

Data Collection Procedures

As stated earlier, this evaluation will use a multi-method data collection procedure. Surveys (mailed, emailed, Web-linked, and telephone surveys) and semi-structured interviews will be used to collect data from target audience members. The data collection activities are described below, followed by a description of the procedures for each of the data collection methods.

Mailed, Emailed, and Telephone Surveys: Data collection for the mailed, emailed, and telephone surveys will follow the Tailored Design Method (TDM) (Dillman, 2000) (see section B.3). The specific steps of the data collection procedure are organized around a series of contacts with the target audience. Once contact is made with a member of a target audience, the success of that contact is monitored (i.e., tracked) and steps for future contact are made. The following subsections provide an example of how the Tailored Design Method is used to collect mailed-survey data. Similar procedures are used to collect emailed or telephone data; however, contact with respondents is made through email or the telephone rather than through the mail.

Example: Tailored Design Method for Mailed Surveys

In preparation for mailing the surveys, all surveys will be coded with a unique identifying number to track responses (see section A.10). As discussed in section A.10, the purpose of this unique identifying number is to track responses to determine whether additional follow-up is needed to achieve the desired response rate. Data collection will begin by mailing a questionnaire package to respondents. Included in this package will be an introductory letter to respondents and a survey (see Attachment B-6 for the Introductory Letter and Attachment B-2 for the Survey). Attached to the back of each survey will be a stamped, pre-addressed return envelope for respondents to use. The introductory letter will be from AHRQ. It will explain the purpose of the survey, encourage a response as soon as possible, assure the respondent of confidentiality, and provide the Contractor’s "800" number in case any questions arise. Respondents will be asked to complete and return the questionnaire.

Within 2 weeks of the initial mailing, all respondents will be sent a reminder postcard (see Attachment B-6). The card will thank those who have already responded and ask those who have not responded to complete the survey and return it as soon as possible. It will ask those who have lost or misplaced their survey to call the "800" number for a replacement. Requests for replacements will be sent by priority mail within 48 hours. This added emphasis, it is believed, will increase the likelihood that someone who has called for a replacement questionnaire will complete and return it. Mailing the postcard to all participants is more cost-effective than sorting through the list of respondents, and it also provides an opportunity to thank those who have participated.

Within 4 weeks of the initial mailing, a replacement survey will be sent to all those who have not responded. These participants will be identified through the tracking system. This package will contain the survey and a new cover letter. The letter will explain that a survey was sent previously, that a response has not been received, and that a replacement is being provided for the participant’s convenience in case the original was lost or misplaced. Respondents will be reminded that the survey can also be completed using the electronic version of the instrument (see section A.3).


In order to ensure a high response rate, telephone follow-up will be used to contact all potential respondents who have not responded after receiving the replacement survey. To prepare for this contact, staff will be trained in a follow-up call procedure. Within 2 weeks of sending the replacement survey, calling of all non-respondents will begin. Calls will be monitored by supervisory staff on a random basis. A maximum of 6 attempts to contact each non-respondent will be made. Incentives will be mailed following the survey’s completion.
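To illustrate how the unique identifying numbers and the tracking system described above might be used to schedule these contacts, a minimal sketch follows. It is an illustration only, not a description of the Contractor’s actual tracking software; the Python class, field names, and example dates are hypothetical, while the 2-week, 4-week, and subsequent telephone follow-up intervals and the maximum of 6 call attempts mirror the procedure described above.

# Illustrative sketch only: a hypothetical representation of the TDM contact
# schedule described above. Class and field names are invented for this example.
from dataclasses import dataclass
from datetime import date

@dataclass
class SurveyRecord:
    tracking_id: str        # unique identifying number printed on the survey
    mailed_on: date         # date the initial questionnaire package was mailed
    responded: bool = False
    call_attempts: int = 0  # telephone follow-up attempts (maximum of 6)

    def next_action(self, today: date) -> str:
        """Return the follow-up step currently due, per the schedule above."""
        if self.responded:
            return "no further contact needed"
        weeks_since_mailing = (today - self.mailed_on).days // 7
        if weeks_since_mailing < 2:
            return "wait"
        if weeks_since_mailing < 4:
            return "send reminder postcard"
        if weeks_since_mailing < 6:
            return "send replacement survey package"
        if self.call_attempts < 6:
            return "telephone follow-up call"
        return "close out as non-respondent"

# Example: a non-respondent whose survey was mailed 5 weeks ago
record = SurveyRecord(tracking_id="0142", mailed_on=date(2007, 11, 7))
print(record.next_action(date(2007, 12, 12)))  # -> "send replacement survey package"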


Web-Linked Surveys: Potential participants will be recruited through a graphic placed on two AHRQ Web pages that contain versions of the Guide to Clinical Preventive Services; clicking on the graphic will take visitors to the survey. The graphic will be designed in a manner consistent with past successful recruiting efforts by other Federal agencies to encourage participants to respond to the survey. The graphic will remain in place for approximately 3 months. To the extent possible, individuals already on listservs for the Guide to Clinical Preventive Services may also be directed to the link.


Semi-structured Interviews: Once potential participants have been selected, researchers will make telephone contact with each potential participant to request participation in the semi-structured interview component of the evaluation (see Attachment B-3 for the Semi-structured Interview Telephone Recruitment Script). This introductory telephone call will explain the purpose of the evaluation, identify AHRQ as the sponsoring agency, explain all interviewing procedures, assure the respondent of confidentiality, and request participation in the evaluation. For those individuals who agree to participate, an interview will be scheduled at their convenience (see Attachment B-4 for the Interview Guide). Interviews will take place at the time and place arranged during the introductory phone call. With the permission of the respondent, all interviews will be audio-taped for later transcription. Incentives will be provided to participants after the interview is complete.


Data Collection Instruments

All measurement instruments will be designed to achieve the evaluation's five objectives:

  1. To determine the extent to which the target audience accepts these guides.

  2. To determine target audiences’ attitudes toward these guides.

  3. To determine how the target audience uses these guides and the extent to which their use has improved care.

  4. To learn ways to strengthen the content and format of future versions of these AHRQ guides.

  5. To determine the extent to which the target audience is aware of AHRQ and its role in the healthcare field.

Specifically, all measurement instruments (surveys and semi-structured interview guides) will be designed to measure respondents’ knowledge of, attitudes toward, and use of the AHRQ publication being assessed, as well as to determine the target audiences’ awareness of AHRQ and its role in the healthcare field.

Mailed, Emailed, and Telephone Surveys: One survey design will be used for the mailed, emailed, and telephone data collection methodologies, and two surveys will be used for the Web-linked data collection methodology. Hard copies of the survey will be used for the mailed data collection methodology (although potential respondents will be offered the opportunity to answer the mailed survey online). The electronic version of the survey will be used for the telephone and emailed data collection methodologies (see Attachment B-2 for a hard copy of the survey). The electronic versions of the survey will contain exactly the same questions as the hard copy; the only difference is that respondents will access the survey electronically. For the Web-linked surveys, potential respondents will self-select to participate by clicking on a link on the respective AHRQ Web pages. In order to optimally use resources, the online version of the survey will be completed upon receipt of OMB approval.

The survey will include demographic questions (e.g., age, gender, education, ethnicity, race, occupation). These data will be collected in order to better understand the characteristics of users and non-users of AHRQ's products and resource materials. The remaining questions are included to achieve specific data collection goals in order to meet the evaluation objectives listed above. For the survey, response sets are organized to make each question easy to complete. For example, skip patterns are used to move respondents through the survey quickly by allowing them to skip questions they need not answer. Efforts were made to keep the instrument as brief as possible and to ensure that the screener questions and skip patterns are accompanied by a clear set of instructions. The survey will also contain color images of the Guide, when applicable. The images will help orient respondents to the subject of the surveys.
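As an illustration of the skip-pattern logic described above, the minimal sketch below shows how a screener response can route a respondent past questions that do not apply. The question numbers and routing rule are hypothetical and are not drawn from the actual instrument in Attachment B-2.

# Hypothetical skip pattern for illustration only; question numbers are invented.
def next_question(current_question: int, answer: str) -> int:
    """Return the number of the next question to ask, applying the skip rule."""
    # Example rule: if the respondent reports never having used the Guide
    # (screener question 3 answered "No"), skip the usage questions (4-9).
    if current_question == 3 and answer.strip().lower() == "no":
        return 10
    return current_question + 1

print(next_question(3, "No"))   # -> 10 (usage questions skipped)
print(next_question(3, "Yes"))  # -> 4  (usage questions asked)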



Limitations of the Study


The purpose of the Evaluation of AHRQ’s Guide to Clinical Preventive Services is to evaluate the use and usefulness of the Guide in promoting evidence-based preventive services. To that end, the project is designed to obtain information from the intended audiences of the Guide. The methodology is designed to maximize the number of respondents who could provide meaningful information, so that the project purpose can be achieved and the research questions answered. Despite the application of rigorous methods, some limitations to the study design exist; these will be disclosed to facilitate the interpretation of the results. They include response bias and self-report bias. The possibility of response bias exists in that the clinicians who respond to the survey may not be representative of the targeted audience. Self-report and recall limitations, which are common concerns in retrospective studies, may also affect the results and will be considered when interpreting them; the self-reported nature of the survey means that results may be limited by the potential inaccuracy of human recall. The results will be published in a Final Report submitted to AHRQ.


B.2.1. Statistical Methodology for Stratification and Sample Selection

See Section B.2.



B.2.2. Estimation Procedure

See Section B.2.


B.2.3. Degree of Accuracy Needed for the Purpose Described in the Justification

See Section B.2.


B.2.4. Unusual Problems Requiring Specialized Sampling Procedures

See Section B.2.


B.2.5. Use of Periodic (Less Frequent Than Annual) Data Collection Cycles

See Section B.2.



B.3. Methods for Maximizing the Response Rate and Addressing Issues of Nonresponse


Maximizing response rates is a key priority for this project. The researchers will employ a variety of strategies to ensure that acceptable response rates are achieved (i.e., an 80% response rate). Response rates will be determined by dividing the number of respondents who complete the surveys and semi-structured interviews by the total number of potential respondents (i.e., those who received a survey, were asked to participate in an interview, etc.) and multiplying by 100.
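For illustration, the short sketch below applies this calculation to hypothetical counts; the figures are examples only and are not projections for this evaluation.

# Response rate as defined above: completed responses divided by potential
# respondents, multiplied by 100. The counts below are hypothetical.
completed_responses = 400      # completed surveys and semi-structured interviews
potential_respondents = 500    # received a survey or an interview request
response_rate = (completed_responses / potential_respondents) * 100
print(f"Response rate: {response_rate:.0f}%")  # -> Response rate: 80%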

The specific strategies that will be used to maximize response rates are:

  1. Incentive offers

  2. Tailored Design Method (TDM)

  3. Research/evaluation introduction, cover letter, related product(s), and e-mail message

The study introduction, cover letters and related products and resource materials were discussed in Section B.2. Incentive offers and TDM are discussed below.



Incentive Offers

According to Dillman (1978, 2000), the use of incentives has a positive impact on response rates. Although participation in this project is voluntary, respondents are likely to perceive a time cost and burden associated with their participation. We therefore intend to offer non-monetary incentives to encourage responses and to increase response rates. Generally, incentives will be distributed based on the methodology used. A range of gifts will be identified for the evaluation activities (e.g., survey participants might be offered an AHRQ pen, whereas healthcare professionals who participate in a semi-structured interview might be offered a decorative calculator/clock). Pilot tests will be conducted with potential participants to determine the appropriateness of each potential incentive. It is anticipated that incentives will range in value from $2 to $4, depending on the type of incentive used. As part of instrument pilot testing, we will ask respondents to comment on the type and value of incentive that is appropriate for participating in the data collection activity.

The Tailored Design Method (TDM)

The Tailored Design Method (TDM) is structured around the Total Design Method developed by Dillman (1978). The Total Design Method was updated by Dillman (2000) and renamed the Tailored Design Method. This data collection process has been used extensively since its publication in 1978 and has proven successful in obtaining consistently high response rates. The TDM prescribes virtually all the details involved in conducting mail, email, and telephone surveys, from the type of questions to ask to the number and timing of follow-ups. The TDM was used to assist in the development of all survey instruments as well as in the design of the multi-staged effort needed to obtain complete information from the respondents surveyed. Previous experience with the TDM provides confidence that its use will result in the desired response rates. This confidence is based on the results of using a similar approach with similar populations that achieved 80% response rates (CSAT, 2002). In addition, use of the TDM during the "Assessment of NIDA's Public Health Information Publication" project (2005) demonstrated that the TDM is successful when applied to an evaluation project examining products and resource materials.



B.4. Tests of Procedures or Methods


Pilot tests of all measurement instruments and data collection procedures were conducted with sub-samples of the target population. The number of participants per instrument ranged from 2 to 4. The pilot study participants for a particular instrument were asked to complete the measurement instruments (i.e., the mailed/emailed and Web-linked surveys). They were then asked to comment on the clarity of the questions and to identify any problems or issues (i.e., with the content and format of the questionnaires). Participants were also asked to provide feedback on the appropriateness of the questions for the intended audience. A similar procedure was used for pilot tests of the semi-structured interview guides and procedures: respondents were asked to answer the questions, provide feedback on the clarity of the questions and the guide, identify any questions or issues associated with the intended procedure, and comment on the appropriateness of the questions and procedures for the intended audience. Attachment B-10 provides a summary of pilot test feedback for each of the measurement instruments tested and outlines the changes that were made to the data collection instruments based on this feedback.


B.5. Names and Telephone Numbers of Individuals Consulted


Responsibility for Data Collection and Statistical Aspects

Johnson, Bassin & Shaw International, Inc. (JBS) will conduct all data collection procedures. JBS is also responsible for all data analysis for the project. The representative of the Contractor responsible for overseeing the planned data collection and analysis is:

Susan Hayashi, Ph.D.

Project Director

Johnson, Bassin & Shaw International, Inc.

8630 Fenton Street, 12th Floor

Silver Spring, MD 20910

Tel: 301-495-1080

[email protected]

Agency Responsibility

Within the agency, the following individual will have oversight responsibility for all contract activities, including the data analysis:

Randie Siegel, M.S.

Director of Publishing

The Agency for Healthcare Research and Quality

540 Gaither Road

Rockville, MD 20850

Tel: 301-427-1852

[email protected]
