OMB Clearance Package
SUPPORTING STATEMENT
Part B
Evaluation of Patient-Centered Outcomes Research Trust Fund–Training Program
November 2019
Agency for Healthcare Research and Quality (AHRQ)
Table of Contents
B. Collections of Information Employing Statistical Methods
2. Information Collection Procedures
The Patient-Centered Outcomes Research Trust Fund–Training Program (PCORTF–TP) evaluation data collection comprises two surveys (Attachments A-B) and one key informant interview guide (Attachment C). Study participants include scholars, mentors, and investigators who have received awards under, or participated in, the grant award mechanisms listed below, which are designed to increase skill and capacity for conducting comparative effectiveness research (CER) as applied to patient-centered outcomes research (PCOR).
Surveys
PCORTF–TP grant award programs, survey respondents, and corresponding Form Names are displayed in Table 1:
Table 1. PCORTF–TP Grant Award Mechanisms, Survey Respondents, and Form Names

| PCORTF–TP Grant Award Mechanism | K Awardees | K12 Scholar Appointees | Mentors |
|---|---|---|---|
| Mentored individual research scientists | Early Career Investigators | N/A | Primary Mentors |
| Mentored individual clinical investigators | Early Career Investigators | N/A | Primary Mentors |
| Mentored individual mid-career and senior investigators | Mid-Career & Senior Investigators | N/A | Primary Mentors |
| Mentored and independent support for individual scientists to transition to independence | Early Career Investigators | N/A | Primary Mentors |
| Mentored institutional early career scholars | N/A | Early Career Scholars | Primary Mentors |

Form Names: K Awardees and K12 Scholar Appointees receive the Attachment A K Awardee Survey/K12 Scholar Survey (52 items); Primary Mentors receive the Attachment B K Awardee/K12 Scholar Primary Mentor Survey (24 items).
AFYA will conduct a non-random convenience sample survey, meaning the sample will consist of individuals who voluntarily participate during the recruitment period. No probability sampling strategy will be employed, because AHRQ seeks feedback from as many individuals associated with the PCORTF–TP as possible. Jointly, the AHRQ project lead and AFYA staff will contact these individuals to introduce the PCORTF–TP survey effort and invite their participation.
The two surveys are designed with skip logic to ensure respondents see only items applicable to their grant award. Each survey will be fielded online via a web-based tool that provides customizable surveys for AHRQ.
K Awardee Survey/K12 Scholar Survey
Attachment A provides the 52-item survey for K01, K08, K18, and K99/R00 awardees (collectively referred to as “K Awardees”) and K12 scholars. This survey will be administered to volunteer participants representing the 147 individuals in the program respondent categories listed below. The population sizes reflect our current knowledge of the number of people in each individual awardee and scholar (appointee) category:
K01 Early Career Investigators: N=8
K08 Early Career Investigators: N=21
K18 Mid-Career and Senior Investigators: N=15
K99/R00 Early Career Investigators: N=9
K12 Scholars: N=94 (representing 15 programs across 2 funding cycles)
K Awardee/K12 Scholar Primary Mentor Survey
The second survey (Attachment B) will be administered to volunteer participants representing the 128 individuals in the mentor categories listed below. The population sizes in the list below reflect our current knowledge of the number of people in each mentor category.
K01 Primary Mentors: N=8
K08 Primary Mentors: N=21
K18 Primary Mentors: N=13
K99/R00 Primary Mentors: N=9
K12 Primary Mentors: N=77
Key Informant Interviews
The PCORTF–TP grant award mechanism, key informant interview respondents, and corresponding Form Name are displayed in Table 2:
Table 2. PCORTF–TP Grant Award Mechanism and Key Informant Interview Respondents

| PCORTF–TP Grant Award Mechanism | Key Informants |
|---|---|
| Mentored institutional early career scholars | Program Directors |

Form Name: Attachment C Key Informant Interview Guide for K12 Program Directors (10 items).
The key informant interviews will ask the K12 program directors to discuss their perceptions of how the program contributes to the field of CER/PCOR, how training has supported these contributions, what factors contribute to program sustainability, and what key lessons have been learned during implementation. This group will consist of 13 K12 program directors. Attachment C shows the key informant interview protocol.
The questions ask participants to provide qualitative details about their experiences in the training and research programs, and the impacts of the support received from PCORTF–TP. Interviews are expected to take 60 minutes. They will be recorded for reference.
Surveys
AHRQ will initiate correspondence with program participants by sending an email message introducing the survey effort and the AFYA evaluation team contacts; AFYA will send subsequent email invitations to scholars and mentors (Attachments E-G). The correspondence will contain a hyperlink for easy access to the online survey. To maximize the response rate, AFYA will send follow-up notices midway through the survey data collection period and 1 week before the end of this period. Each reminder email will provide the hyperlink to access the survey, the estimated time (in minutes) needed to complete it, and the impending deadline for submission of responses.
The web-based survey will be accessible to the target audience 24 hours a day for a total of 4 weeks. Upon entering the survey, respondents will view an introduction page that explains the survey objectives and stresses the importance of participation, followed by a page providing specific instructions on how to complete the survey. Respondents will answer survey items by clicking pre-coded options for closed-ended items and by typing in text boxes for open-ended items. The survey's skip logic ensures respondents complete only the questions relevant to them.
Following data collection, questionnaire responses will be compiled and formally assessed for data quality to produce a finalized database for statistical analyses. Incomplete response data pose a substantial threat to confident interpretation and generalization of study results. We will exclude surveys in which respondents answered fewer than 25 percent of the total number of questions.
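As a minimal sketch of this exclusion rule (the file name and the "q"-prefixed item-column naming convention are assumptions for illustration, not part of the actual data management plan):

```python
import pandas as pd

# Sketch of the 25-percent completion rule: exclude any survey in which
# fewer than 25% of the total questions were answered.
COMPLETION_THRESHOLD = 0.25

responses = pd.read_csv("survey_export.csv")  # one row per respondent (assumed file)
item_cols = [c for c in responses.columns if c.startswith("q")]  # assumed item naming

# Completion rate = answered items / total number of questions.
answered = responses[item_cols].notna().sum(axis=1)
responses["completion_rate"] = answered / len(item_cols)

analysis_set = responses[responses["completion_rate"] >= COMPLETION_THRESHOLD]
excluded_count = len(responses) - len(analysis_set)
print(f"Retained {len(analysis_set)} surveys; excluded {excluded_count}.")
```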
This will be a one-time data collection, and respondents will not be contacted subsequent to their survey submission.
Interviews
The key informant interview guide (Attachment C) was developed to elicit responses from K12 program directors about their experience with the research training elements of the PCORTF–TP K12 program. Key informant interviews will be conducted by phone, with an experienced AFYA interviewer conducting all interviews. Each interview will last approximately 60 minutes and will be digitally recorded using our XO Conference Call resources. Interview summaries and any additional notes will be stored on secure laptop computers provided by AHRQ.
A qualitative software package will be used to construct a database of interview responses; the software also will be used for coding and analysis. A preliminary code list will be defined and then revised based on review of the data, which may identify additional common themes. The revised coding scheme will be used to code the appropriate text fragments in each of the transcripts. Coding will be performed by two research associates, and intercoder agreement will be tested by double coding an initial set of interviews. Once 80 percent agreement has been reached, coding will proceed. Analysis will draw on the coded text and on software queries that identify text fragments tagged with particular codes of interest. A memo for each major code or topic of interest will be prepared, and these memos will be used to draft the summary document.
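For illustration only, the percent-agreement test on a double-coded set of fragments could be computed along these lines (the code labels and data below are hypothetical stand-ins for the project's actual code list):

```python
# Illustrative percent-agreement check for the double-coded interviews.
# Each list holds the code each coder assigned to the same sequence of
# text fragments from the initial set of interviews (hypothetical data).
coder_a = ["mentoring", "sustainability", "barriers", "lessons", "mentoring"]
coder_b = ["mentoring", "sustainability", "training", "lessons", "mentoring"]

matches = sum(a == b for a, b in zip(coder_a, coder_b))
agreement = matches / len(coder_a)
print(f"Intercoder agreement: {agreement:.0%}")

# Coding proceeds once agreement reaches 80 percent; otherwise the coders
# reconcile discrepancies and refine the code list before continuing.
if agreement < 0.80:
    print("Below threshold: reconcile discrepancies and re-test.")
```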
This will be a one-time data collection, and respondents will not be contacted subsequent to their key informant interview participation.
Surveys
For the PCORTF–TP evaluation survey, the expected response rate is 10 percent to 68 percent, based on estimates in the literature for comparable email-initiated online surveys (RAND, 2002).4 The response rate for the current project will depend on the willingness of the targeted individuals who receive AHRQ-related information to respond to the survey. Further, for any survey, an acceptable response rate is often not achieved from the initial invitation alone.
AHRQ’s prior, smaller-scale evaluation of its Career Development (K) Award training program yielded a robust 76% response rate on its grantee survey.5 Following a similar administration process, the current project is expected to maximize its response rate through an initial email from AHRQ to program participants introducing the survey effort and the AFYA evaluation team contacts, followed by time-staggered AFYA email notices of the opportunity to participate. The AFYA survey announcement will be sent a total of three times over the 4-week survey implementation period: an initial announcement followed by two additional announcements (at the midpoint and 1 week before the end of the survey). The email invitation will emphasize the importance of responses for an accurate evaluation and for AHRQ’s strategic planning and support. Explaining the relevance of the issues a survey addresses has been found to correlate strongly and positively with response rates for email and web-based surveys (Sheehan & McMillan, 1999).6 The language to be used in the email notification of the survey and the invitations is included in Attachments E-G.
We will hold the key informant interviews at times most convenient for respondents, and conducting the interviews via teleconference eliminates the burden of traveling to a specific location.
Measuring Non-Response Bias
We will compare early survey responses with those obtained from individuals who respond late (after the last email notification) on key survey metrics. Studies7,8 have shown that late respondents, or those who respond only after several attempts, tend to share characteristics with individuals who do not respond at all (non-respondents). Any differences between these subsets of survey respondents will provide a measure of the potential non-response bias. In addition, because we are using a mixed-methods approach (i.e., survey and interviews) to collect data for the relevant metrics, we also will compare responses obtained from the key informant interviews to the survey responses to assess the degree to which the survey data may reflect non-response bias. We will assess non-response bias as a function of the different grant mechanisms and whether the projects are completed or ongoing.
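A hedged sketch of the early-versus-late comparison follows (the file name, the "wave" flag, and the "key_metric" column are hypothetical; the actual analysis may use different tests depending on each metric's type):

```python
import pandas as pd
from scipy import stats

# Hypothetical sketch: compare early vs. late respondents on one key metric.
# "wave" flags whether a survey arrived before or after the final email
# notification; "key_metric" stands in for any survey measure of interest.
df = pd.read_csv("survey_analysis_set.csv")

early = df.loc[df["wave"] == "early", "key_metric"].dropna()
late = df.loc[df["wave"] == "late", "key_metric"].dropna()

# Welch's two-sample t-test; a significant difference between waves signals
# potential non-response bias, since late respondents tend to resemble
# non-respondents.
t_stat, p_value = stats.ttest_ind(early, late, equal_var=False)
print(f"Early mean={early.mean():.2f}, late mean={late.mean():.2f}, p={p_value:.3f}")
```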
To test the web-based survey procedures, AFYA will conduct a pre-test of the web-based survey with a sub-sample of no more than nine potential respondents, including SWG members. Our SWG members are experts representing multiple PCORTF–TP funding mechanisms. During pre-testing, we will ask volunteers to “think aloud” as they answer each question, allowing the research team to examine respondents’ thought processes as they hear, interpret, and decide on answers. The results of the pre-testing will be used to refine the survey prior to field-testing. In the event that fine-tuning is required, OMB will be notified in a memorandum with a copy of the final version of the web-based survey.
We also will test the data capture procedures to ensure that each web-enabled survey captures and renders responses correctly. Two members of our project team will do this by manually completing 10 surveys (on hard copy) in parallel with the online data entry component and comparing the outputs to confirm that all data were captured correctly.
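The output comparison could be scripted along the following lines (a sketch only; the file names and the use of pandas are assumptions, and the same check could be performed by visual inspection):

```python
import pandas as pd

# Sketch of the data-capture check: key the 10 hard-copy surveys into a
# separate file, then compare them cell by cell against the web tool's
# export (file names are assumptions for illustration).
manual = pd.read_csv("manual_entry.csv", index_col="survey_id")
online = pd.read_csv("web_export.csv", index_col="survey_id")

# DataFrame.compare returns only the cells that differ; it is empty when
# every field was captured correctly. Both files must share the same
# survey_id index and column layout.
mismatches = manual.compare(online)
if mismatches.empty:
    print("All 10 surveys captured correctly.")
else:
    print(f"Discrepancies found in {len(mismatches)} survey(s):")
    print(mismatches)
```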
AFYA, Inc., will serve as the primary consultant for statistical aspects of the design and analysis of the web-based survey. Dr. Michelle Tregear, AFYA’s project director, is the primary point of contact for statistical design and analyses. She can be reached at [email protected] or 301-957-3040.
4 Schonlau, M., Fricker, R. D., & Elliott, M. N. (2002). Conducting Research Surveys via E-mail and the Web. Santa Monica, CA: RAND Corporation, p. 142. ISBN 0-8330-3110-4.
5 Agency for Healthcare Research and Quality Individual Career Development (K) Award Program. (2016). Rockville, MD: Agency for Healthcare Research and Quality. http://www.ahrq.gov/funding/training-grants/kaward/evaluation-report.html
6 Sheehan, K. B., & McMillan, S. J. (1999). Response variation in e-mail surveys: An exploration. Journal of Advertising Research, 39(4), 45-54.
7 Lahaut, V. M., Jansen, H. A., van de Mheen, D., Garretsen, H. F., Verdurmen, J. E., & van Dijk, A. (2003). Estimating non-response bias in a survey on alcohol consumption: Comparison of response waves. Alcohol and Alcoholism, 38(2), 128-134.
8 Bose, J. (2001). Nonresponse bias analyses at the National Center for Education Statistics. Proceedings of Statistics Canada Symposium 2001: Achieving Data Quality in a Statistical Agency: A Methodological Perspective. 8 pp.