Survey Sample Plan

Clearance for A-11 Section 280 Improving Customer Experience Information Collection

OMB: 2900-0876



COVID-19 VSignals Survey

Sampling Methodology Report



Prepared by

Veteran Experience Office

Version 1.3

June 2020


Executive Summary

The COVID-19 Surveys are designed to measure the Veteran customer experience during the COVID-19 pandemic. VA understands that COVID-19 has significantly impacted the medium, frequency, and overall experience of Veterans' healthcare. These COVID-19 surveys are designed to begin to understand how Veterans' care has been impacted during this time and whether they would feel safe returning to VA facilities.

There are three COVID-19 Surveys: (1) the first survey aims to understand what will make Veterans feel safe to return to an in-person healthcare appointment; (2) the second survey aims to measure the Veterans’ experience as it relates to COVID-19 after they have been seen for an in-person outpatient appointment; (3) the third survey aims to measure the Veterans’ experience with Telehealth during the COVID-19 pandemic.

Veteran experience data will be collected using an online transactional survey disseminated via an invitation email sent to selected beneficiaries. Data collection will occur every two weeks. After the survey has been distributed, recipients have two weeks to complete it. Invitees will receive a reminder email after one week.

The survey questionnaire is brief and contains general Likert-scale (a scale of 1-5 from Strongly Disagree to Strongly Agree) questions and free-text questions to assess customer satisfaction.

The sample will be distributed across the Telehealth and Outpatient populations. The first survey will be sent to Veterans who have either scheduled or cancelled a Telehealth or Outpatient appointment in the past 30 days but have not yet had that appointment. The second survey will be sent to Veterans who recently had an Outpatient appointment. The third survey will be sent to Veterans who recently had a Telehealth appointment.

This report describes the methodology used to conduct the COVID-19 Survey. Information about quality assurance protocols, as well as limitations of the survey methodology, is also included in this report.

Part I – Introduction

A. Background

The Enterprise Measurement and Design team (EMD) is part of the Veterans Experience Office (VEO). The EMD team is tasked with conducting transactional surveys of the Veteran and Beneficiary population to measure their satisfaction with the Department of Veterans Affairs' (VA) numerous benefit services. Thus, its mission is to empower Veterans by rapidly and discreetly collecting feedback on their interactions with such VA entities as NCA, VHA, and VBA. VEO surveys generally entail probability samples that contact only the minimal number of beneficiaries necessary to obtain reliable estimates. This information is subsequently used by internal stakeholders to monitor, evaluate, and improve beneficiary processes. Beneficiaries are always able to decline participation and have the ability to opt out of future invitations. A quarantine protocol is maintained across all VEO surveys to limit the number of times a beneficiary may be contacted and to prevent survey fatigue.

In order to continue to provide quality service to Veterans, VEO has been commissioned to measure the satisfaction of Veterans as it relates to the healthcare they receive during the COVID-19 pandemic.


B. Basic Definitions

Coverage

The percentage of the population of interest that is included in the sampling frame.

Measurement Error

The difference between the response coded and the true value of the characteristic being studied for a respondent.

Non-Response

Failure of some respondents in the sample to provide responses in the survey.

Transaction

A transaction refers to the specific time a beneficiary interacts with the VA that impacts the beneficiary’s journey and their perception of VA’s effectiveness in caring for beneficiaries.

Response Rate

The ratio of participating persons to the number of contacted persons. This is one of the basic indicators of survey quality.

Sample

In statistics, a data sample is a set of data collected and/or selected from a statistical population by a defined procedure.

Sampling Error

Error due to taking a particular sample instead of measuring every unit in the population.

Sampling Frame

A list of units in the population from which a sample may be selected.

Reliability

The consistency or dependability of a measure, often quantified using the standard error.



C. Application to Veterans Affairs

Customer experience and satisfaction are usually measured at three levels: 1) to provide enterprises the ability to track, monitor, and incentivize service quality; 2) to provide service-level monitoring and insights; and 3) to give direct point-of-service feedback. This measurement may bring insights and value to all stakeholders at VA. Front-line VA leaders can resolve individual feedback from beneficiaries and take steps to improve the customer experience; meanwhile, VA executives can receive real-time updates on systematic trends that allow them to make changes.



Part II – Methodology

A. Target Population, Frame, and Stratification

The target population of the COVID-19 Survey is different for each survey. The target population for the first survey (General) is any Veteran who has either scheduled or cancelled an outpatient or telehealth appointment in the past 30 days but has not yet had that appointment. The target population for the second survey (Post-In-Person Appointment) is any Veteran who received an in-person outpatient appointment in the past 7 days. Finally, the target population for the third survey (Telehealth) is any Veteran who received an At-Home-or-Mobile Telehealth appointment in the past 7 days.

For all three surveys, a random sample of outpatient and telehealth beneficiaries will be drawn. The beneficiary is the primary sampling unit and is randomly selected from the population according to a stratified design with a fixed allocation. The strata consist of appointment type for the Post-In-Person Appointment Survey and telehealth modality for the General Survey. These strata are defined explicitly and contain allocation targets which may fluctuate with monthly changes in the population. To ensure demographic representation, the sampling within each stratum is also proportional with regard to Age Group and Gender. Additionally, the Post-In-Person Appointment and Telehealth surveys will also be proportional by VAMC.
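To make the proportional-within-stratum allocation concrete, the following is a minimal sketch (in Python, using pandas) of splitting a fixed per-stratum invitation target across Age Group and Gender cells in proportion to each cell's share of the stratum population. The column names, strata, counts, and targets are illustrative assumptions, not the production population file or allocation.

```python
# Illustrative sketch only: proportional allocation of a fixed per-stratum
# target across Age Group x Gender cells. All values are assumed for the example.
import pandas as pd

population = pd.DataFrame({
    "stratum":   ["Outpatient"] * 6 + ["Telehealth"] * 6,
    "age_group": ["<45", "<45", "45-64", "45-64", "65+", "65+"] * 2,
    "gender":    ["M", "F"] * 6,
    "count":     [800, 200, 1500, 400, 2200, 300,
                  600, 250, 900, 350, 700, 150],
})

# Assumed fixed allocation per explicit stratum (invitations per month).
stratum_targets = {"Outpatient": 850, "Telehealth": 990}

def proportional_targets(pop, targets):
    """Split each stratum's fixed target across demographic cells in
    proportion to the cell's share of the stratum population."""
    out = pop.copy()
    stratum_total = out.groupby("stratum")["count"].transform("sum")
    out["cell_share"] = out["count"] / stratum_total
    # Rounded cell targets may need a small adjustment to sum exactly
    # to the stratum target.
    out["target_invites"] = (
        out["cell_share"] * out["stratum"].map(targets)
    ).round().astype(int)
    return out

print(proportional_targets(population, stratum_targets))
```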

Table 1. Target Population

General (Safe Return to In-Person Care)

Outpatient Population:
Sub-Pop 1a: Any Veteran who has scheduled an outpatient appointment in the past 30 days but has not yet had that appointment.
Sub-Pop 2a: Any Veteran who has cancelled an outpatient appointment in the past 30 days (or had an appointment cancelled).

Telehealth Population:
Sub-Pop 1b: Any Veteran who has scheduled a telehealth appointment in the past 30 days but has not yet had that appointment.
Sub-Pop 2b: Any Veteran who has cancelled a telehealth appointment in the past 30 days (or had an appointment cancelled).

Post-In-Person Appointment (After Outpatient Appointment)

Outpatient Population: Any Veteran who has had an Outpatient appointment in the past 7 days is eligible for this survey, regardless of whether they have also had a telehealth appointment.

Telehealth Population: N/A. The target population for this survey is outpatient; however, Veterans who recently had a telehealth appointment are not excluded from eligibility for this survey.

Telehealth (After Telehealth-at-Home or Mobile Appointment)

Outpatient Population: N/A. The target population for this survey is telehealth; however, Veterans who recently had an outpatient appointment are not excluded from eligibility for this survey.

Telehealth Population: Any Veteran who has had a Telehealth At-Home or Mobile appointment in the past 7 days is eligible for this survey, regardless of whether they have also had an outpatient appointment.



B. Sample Size Determination

This survey aims to collect enough responses per month to begin to understand the Veteran Customer Experience under COVID-19. However, in order to manage the service recovery of free-text responses collected through this survey, the number of targeted responses will be limited. Therefore, VEO's traditional approach of developing a probabilistic survey with a 90-95% confidence level and a 3-5% margin of error (MOE) will not apply to this survey. Instead, this survey will be used to develop qualitative and quantitative survey results that are not meant to be statistically significant.

Below, Table 2 and Tables 2A through 2C show the target populations for the General, Post-In-Person Appointment, and Telehealth Surveys. The General Survey will target both outpatient and telehealth scheduling or cancellation encounters in the past 30 days; its estimated response rates are drawn from the existing Telehealth Veteran Survey and Outpatient Services Surveys. The Post-In-Person Appointment Survey will cover all stop codes (visit types) for in-person outpatient appointments within the past 7 days; its estimated response rate is drawn from the existing Outpatient Services Surveys. The Telehealth Survey will cover telehealth encounters in the past 7 days for the At-Home-or-Mobile appointment type; its estimated response rates are drawn from the existing Telehealth Veteran Survey. In each case, the number of monthly invitations is approximately the target number of responses divided by the expected response rate (for example, 300 / ~0.18 ≈ 1,700 invitations for the Post-In-Person Appointment Survey).

Table 2. Target Population Figures Across Three Surveys:

Survey | Number of Monthly Survey Invitations | Estimated Monthly Responses | Estimated Response Rate
General (Safe Return to In-Person Care) | 1,840 | 300 | ~16%
Post-In-Person Appointment (After Outpatient Appointment) | 1,700 | 300 | ~18%
Telehealth (After Telehealth-at-Home or Mobile Appointment) | 2,000 | 300 | ~15%
Total | 5,540 | 900 | ~16%



Table 2A. Proposed Monthly Sample Targets by Stratum for General Survey

Sub-Pop 1a: Any Veteran who has scheduled an outpatient appointment in the past 30 days but has not yet had that appointment.
Survey Stratum: Telehealth Appointment Scheduling | Monthly Survey Invitations: 420 | Estimated Monthly Responses: 75 | Estimated Response Rate: ~18%

Sub-Pop 2a: Any Veteran who has cancelled an outpatient appointment in the past 30 days (or had an appointment cancelled).
Survey Stratum: Telehealth at Home or Mobile Appointment | Monthly Survey Invitations: 500 | Estimated Monthly Responses: 75 | Estimated Response Rate: ~15%

Sub-Pop 1b: Any Veteran who has scheduled a telehealth appointment in the past 30 days but has not yet had that appointment.
Survey Stratum: Telehealth Appointment Scheduling | Monthly Survey Invitations: 420 | Estimated Monthly Responses: 75 | Estimated Response Rate: ~18%

Sub-Pop 2b: Any Veteran who has cancelled a telehealth appointment in the past 30 days (or had an appointment cancelled).
Survey Stratum: Telehealth at Home or Mobile Appointment | Monthly Survey Invitations: 500 | Estimated Monthly Responses: 75 | Estimated Response Rate: ~15%

Total: Monthly Survey Invitations: 1,840 | Estimated Monthly Responses: 300 | Estimated Response Rate: ~16%



Table 2B. Proposed Sample Targets by Stratum for Post-In-Person Appointment Survey

Sub-Population: Any Veteran who has had an Outpatient appointment in the past 7 days is eligible for this survey, regardless of whether they have also had a telehealth appointment.
Survey Stratum: All Stop Codes for In-Person Visits | Monthly Survey Invitations: 1,700 | Estimated Monthly Responses: 300 | Estimated Response Rate: ~18%




Table 2C. Proposed Sample Targets by Stratum for Telehealth Survey:

Sub-Population: Any Veteran who has had a Telehealth At-Home or Mobile Appointment in the past 7 days is eligible for this survey, regardless of whether they have also had an outpatient appointment.
Survey Stratum: Home or Mobile Telehealth Appointment | Monthly Survey Invitations: 2,000 | Estimated Monthly Responses: 300 | Estimated Response Rate: ~15%





The sample will be drawn using a systematic sampling methodology. This statistically valid approach allows the team to balance the sample across several variables, such as age, gender, and geographic location, if desirable. These balancing variables are often referred to as implicit strata. In the coming wave, the VEO team will begin to leverage this capability because, though the effect on margin of error is difficult to measure, this methodology has been shown to improve the accuracy of estimates, stabilize weights, and reduce the variability that makes trends difficult to interpret.
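A minimal sketch of systematic selection with implicit stratification is shown below: the frame is sorted on the balancing variables and every k-th record is taken from a random start, which spreads the sample across those variables. The variable names and the toy frame are assumptions for illustration, not the production sampling frame.

```python
# Illustrative sketch only: systematic sampling after sorting on implicit strata.
import numpy as np
import pandas as pd

def systematic_sample(frame, n, sort_cols, seed=None):
    """Sort the frame on the balancing variables (implicit strata), then
    select every k-th record starting from a random point in the first interval."""
    rng = np.random.default_rng(seed)
    ordered = frame.sort_values(sort_cols).reset_index(drop=True)
    k = len(ordered) / n                       # sampling interval
    start = rng.uniform(0, k)                  # random start within the first interval
    positions = np.floor(start + k * np.arange(n)).astype(int)
    return ordered.iloc[positions]

# Toy frame with assumed balancing variables (facility, age group, gender).
rng = np.random.default_rng(0)
frame = pd.DataFrame({
    "age_group": rng.choice(["<45", "45-64", "65+"], size=1000),
    "gender":    rng.choice(["M", "F"], size=1000),
    "vamc":      rng.choice(["506", "528", "688"], size=1000),
})
sample = systematic_sample(frame, n=100, sort_cols=["vamc", "age_group", "gender"], seed=42)
print(sample["vamc"].value_counts())           # roughly proportional across facilities
```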

Each email address encountered is validated in several ways:

  • Validation that the email address has a valid structure

  • Comparison with a database of bad domains

  • Correction of common domain spellings

  • Comparison with a database of bad emails including

    • Opt outs

    • Email held by multiple beneficiaries

  • Comparison to a database of valid TLDs (e.g., “.com”, “.edu”)

Veterans' email addresses are extracted from the Corporate Data Warehouse (CDW).
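A minimal sketch of the email checks listed above follows. The regular expression, the example block lists, the domain-correction map, and the set of accepted TLDs are assumptions for illustration, not VEO's production validation rules or reference databases.

```python
# Illustrative sketch only: the lists, regex, and corrections here are assumed
# stand-ins for the production reference databases described above.
import re

VALID_TLDS = {"com", "edu", "gov", "mil", "net", "org"}            # assumed TLD list
BAD_DOMAINS = {"bounce.example"}                                   # assumed bad-domain list
BAD_EMAILS = {"optout@example.com"}                                # opt-outs, shared addresses, etc.
DOMAIN_FIXES = {"gmial.com": "gmail.com", "yaho.com": "yahoo.com"}

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")               # basic structural pattern

def clean_email(address):
    """Return a corrected address if it passes every check, otherwise None."""
    address = address.strip().lower()
    if not EMAIL_RE.match(address):                   # valid structure
        return None
    local, domain = address.rsplit("@", 1)
    domain = DOMAIN_FIXES.get(domain, domain)         # correct common domain misspellings
    address = f"{local}@{domain}"
    if domain in BAD_DOMAINS:                         # bad-domain database
        return None
    if address in BAD_EMAILS:                         # bad-email database (opt outs, shared)
        return None
    if domain.rsplit(".", 1)[-1] not in VALID_TLDS:   # valid TLD
        return None
    return address
```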


C. Data Collection Methods

Recruitment occurs every two weeks for all Survey Types. Veterans will have two weeks to complete the survey. A reminder email is sent after one week to non-respondents, to remind them that the survey is available for another week. Once the Veteran completes the survey, their response data is immediately available within Veterans Signals (VSignals).

D. Reporting

Researchers will be able to use the Veteran Signals (VSignals) system for interactive reporting and data visualization. VA employees with a PIV card may access the system at https://va.voice.medallia.com/sso/va/. Access to the COVID-19 Dashboards within VSignals will be approved by the COVID-19 stakeholders. Data are depicted within time series plots to investigate trends. Finally, filter options are available to assess scores at varying time periods and within the context of other collected variable information.

Recruitment occurs once every two weeks for all Survey Types, but the sample is optimized for monthly analysis. Therefore, results should be analyzed where at least four weeks (two waves) of data are available. Short-interval estimates are less reliable for small domains (e.g., VAMC-level) and should only be considered for aggregated populations. Monthly estimates will have larger sample sizes, and therefore higher reliability. Estimates over longer periods are the most precise but will take the greatest amount of time to obtain and are less dynamic, in that trends and short-term fluctuations in service delivery may be missed. Users examining subpopulations should be particularly diligent in assuring that insights stem from analysis with sufficient sample in the subpopulations being examined or compared.

E. Quality Control

To prevent errors and inconsistencies in the data and the analysis, quality control procedures will be instituted at several steps of the survey process. Records will undergo cleaning during population file creation. The quality control steps are as follows; an illustrative sketch appears after the list.

  1. Records will be reviewed for missing sampling and weighting variable data. When records with missing data are discovered, they will be either excluded from the population file or put into separate strata upon discussion with subject matter experts.

  2. Any duplicate records will be removed from the population file to both maintain the probabilities of selection and prevent the double sampling of the same beneficiary.

  3. Invalid emails will be removed.
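As referenced above, a minimal sketch of these cleaning steps, assuming the population file is loaded as a pandas DataFrame containing the Appendix 1 variables, might look like the following; the required-variable list and de-duplication keys are assumptions for the example.

```python
# Illustrative sketch only: this version simply excludes records with missing
# sampling/weighting data; in practice such records may instead be routed to a
# separate stratum after review with subject matter experts.
import pandas as pd

def clean_population_file(pop):
    required = ["SURVEY_TYPE", "DOB", "GENDER"]        # assumed sampling/weighting variables
    # 1. Remove records with missing sampling/weighting variable data.
    cleaned = pop.dropna(subset=required)
    # 2. Remove duplicate records so the same beneficiary cannot be sampled twice.
    cleaned = cleaned.drop_duplicates(subset=["FIRST_NM", "LAST_NM", "DOB", "EMAIL"])
    # 3. Remove records whose email address failed validation (assumed to be
    #    stored as a missing EMAIL value by the earlier validation step).
    cleaned = cleaned[cleaned["EMAIL"].notna()]
    return cleaned
```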

The survey sample loading and administration processes will have quality control measures built into them.

  1. The survey load process will be rigorously tested prior to the launch of the survey to ensure that sampled beneficiaries are not inadvertently dropped or sent multiple emails.

  2. The email delivery process is monitored to ensure that bounce-back records will not hold up the email delivery process.

The sum of the weighted respondents will be compared to the overall population count to confirm that the records are being properly weighted. When the sum does not match the population count, weighting classes will be collapsed to correct this issue.



F. Sample Weighting, Coverage Bias, and Non-Response Bias

Weighting is commonly applied in surveys to adjust for nonresponse bias and/or coverage bias. Nonresponse is defined as the failure of selected persons in the sample to provide responses. This is observed in virtually all surveys, in that some groups are more or less prone to complete the survey. The nonresponse issue may cause some groups to be over- or under-represented. Coverage bias is another common survey problem, in which certain groups of interest in the population are not included in the sampling frame. These beneficiaries cannot participate because they cannot be contacted (no email address is available). In both cases, the exclusion of these portions of beneficiaries from the survey contributes to the measurement error. The extent to which the final survey estimates are skewed depends on the nature of the data collection processes within an individual line of business and the potential alignment between beneficiary sentiment and their likelihood to respond.

Survey practitioners recommend the use of sample weighting to improve inference on the population, so that the final respondent sample more closely resembles the true population. It is likely that differential response rates may be observed across different age and gender groups. Weighting can help adjust the demographic representation by assigning larger weights to under-represented groups and smaller weights to over-represented groups. Stratification can also be used to adjust for nonresponse by oversampling the subgroups with lower response rates. In both types of adjustment, weighting may result in a substantial correction to the final survey estimates when compared to direct estimates in the presence of non-negligible sampling error.
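For illustration only (as noted below, weighting will not be applied to these surveys), the following sketch shows a minimal cell-based nonresponse adjustment, together with the weight-total check described in the Quality Control section. The cells, counts, and column names are assumptions for the example.

```python
# Illustrative sketch only: weight respondents in each Age Group x Gender cell
# up to that cell's population count. All figures are assumed for the example.
import pandas as pd

population = pd.DataFrame({
    "age_group": ["<45", "<45", "65+", "65+"],
    "gender":    ["M", "F", "M", "F"],
    "pop_count": [4000, 1000, 6000, 500],
})
respondents = pd.DataFrame({
    "age_group": ["<45"] * 30 + ["<45"] * 20 + ["65+"] * 80 + ["65+"] * 10,
    "gender":    ["M"] * 30 + ["F"] * 20 + ["M"] * 80 + ["F"] * 10,
})

# Weight = cell population / cell respondents, so under-represented cells
# (here, younger Veterans) receive larger weights.
cell_counts = respondents.groupby(["age_group", "gender"]).size().rename("n_resp").reset_index()
weights = population.merge(cell_counts, on=["age_group", "gender"])
weights["weight"] = weights["pop_count"] / weights["n_resp"]
respondents = respondents.merge(weights[["age_group", "gender", "weight"]],
                                on=["age_group", "gender"])

# Quality control check: weighted respondents should sum to the population total;
# if not, weighting classes would be collapsed.
assert round(respondents["weight"].sum()) == population["pop_count"].sum()
```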

Due to the low number of responses in this survey (in order to effectively manage service recovery of free-text responses), weighting will not be feasible and will not be applied. As mentioned in Part II, Section B, this survey is not meant to be statistically significant.



G. Quarantine Rules

VEO seeks to limit contact with Veterans as much as possible, and only as needed to achieve measurement goals. These rules, summarized in Table 5 and illustrated in the sketch that follows it, are enacted to prevent excessive recruitment attempts on Veterans. VEO also monitors Veteran participation within other surveys to ensure Veterans do not experience survey fatigue. All VEO surveys offer respondents the option to opt out, ensuring they are no longer contacted for a specific survey.

Table 5. Proposed Quarantine Protocol

Quarantine Rule | Description | Elapsed Time
Repeated Sampling for the COVID-19 Surveys | Number of days between receiving one invite and receiving another for the COVID-19 Surveys. | 4 weeks
Other Surveys | Veterans who have recently completed other VEO surveys will not be selected for 30 days. | 30 days
Opt Outs | Persons indicating their wish to opt out of either phone or online surveys will no longer be contacted. | Indefinite
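The sketch below illustrates how the quarantine rules in Table 5 might be applied when screening a beneficiary before a draw; the field names and the contact-history structure are assumptions for the example, not VEO's production quarantine system.

```python
# Illustrative sketch only: apply the Table 5 quarantine rules to a beneficiary's
# assumed contact history before allowing selection.
from datetime import date, timedelta

COVID_QUARANTINE = timedelta(weeks=4)          # repeated COVID-19 survey invitations
OTHER_SURVEY_QUARANTINE = timedelta(days=30)   # other recent VEO surveys

def is_eligible(beneficiary, today=None):
    """Return True if the beneficiary may be invited under the quarantine rules."""
    today = today or date.today()
    if beneficiary.get("opted_out"):                                 # opt-outs: never contact
        return False
    last_covid = beneficiary.get("last_covid_invite")
    if last_covid and today - last_covid < COVID_QUARANTINE:         # 4-week COVID-19 rule
        return False
    last_other = beneficiary.get("last_other_veo_survey")
    if last_other and today - last_other < OTHER_SURVEY_QUARANTINE:  # 30-day cross-survey rule
        return False
    return True

# Example: a COVID-19 invitation two weeks ago keeps the beneficiary quarantined.
print(is_eligible({"opted_out": False,
                   "last_covid_invite": date.today() - timedelta(weeks=2),
                   "last_other_veo_survey": None}))   # prints False
```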




Part III – Assumptions and Limitations

Coverage Bias due to Email-Only Data Collection

Since the COVID-19 Survey is email-only, there is a segment of the target population that cannot be reached by the survey. This corresponds to persons who lack access to the Internet, who do not have an email address, or who elect not to share their email address with VHA. Such Veterans may have different levels of general satisfaction with the service they received. Moreover, email addresses are currently obtained from VHA health records, and this process may also contribute to coverage bias because only Veterans who have recently accessed VA healthcare are contacted.



Low Number of Responses for COVID-19 Survey

This survey aims to collect enough responses per month to begin to understand the Veteran Customer Experience under COVID-19. However, in order to manage the service recovery of free-text responses collected through this survey, the number of targeted responses will be limited. Therefore, VEO's traditional approach of developing a probabilistic survey with a 90-95% confidence level and a 3-5% margin of error (MOE) will not apply to this survey. Instead, this survey will be used to develop qualitative and quantitative survey results that are not meant to be statistically significant.


Appendix 1. List of Data Extraction Variables


Variables

SURVEY_TYPE
FIRST_NM
LAST_NM
SERVICE_DATE
DOB
GENDER
EMAIL
STOP_CODE
MODALITY




Appendix 2. Survey Questions


General Survey

  1. I trust [Facility Name] to provide safe health care. (Likert-Scale)

  2. I would benefit from VA information on pre-visit preparations for protecting against COVID-19. (Likert-Scale)

  3. I prefer to receive VA information in the following manner: (respondents can select multiple options)

    1. Mail

    2. Email

    3. Text Message

    4. Phone Call

    5. Other

  4. When you consider your options for your care, do you prefer a video Telehealth visit, phone visit, or in-person visit? (respondents can select one option):

    1. Video Telehealth

    2. Phone

    3. In-Person

    4. No Preference

  5. Please share any concerns you have about returning to VA for in-person health care. Please do not include any personally identifiable information, Social Security Number, Veteran ID, or medical information, but do provide details about your experience. (Free-Text Response limited to 400 characters)

  6. Can VA contact you about your feedback? (respondents can select one option):

    1. Yes, VA can contact me about my patient experience.

    2. No, I do not want VA to contact me about my patient experience.

  7. Would you like to volunteer your demographic information to help VA better serve you? (respondents can select one option):

    1. Yes

    2. No

*Demographic Questions if Veterans Select “Yes” to Question 7:

  1. Are You Hispanic or Latino? (respondents can select one option):

    1. Yes

    2. No

  2. What is your race? Please choose one or more. (respondents can select multiple options)

    1. White

    2. Black or African American

    3. Asian

    4. Native Hawaiian or Other Pacific Islander

    5. American Indian or Alaska Native




Post In-Person Survey

  1. The screening procedures while entering the VA facility made me feel safe. (Likert-Scale)

  2. I observed my health care team using hand sanitizer and/or proper hand washing procedures. (Likert-Scale)

  3. My health care team connected on a personal level and made me feel valued while social distancing. (Likert-Scale)

  4. The cleanliness of the facility met my expectations for a safe health care environment. (Likert-Scale)

  5. Would you like to provide additional feedback with a concern, compliment, or recommendation about your experience(s) with [Facility Name]? Please select from one of the following options. (respondents can select one option):

    1. Compliment

    2. Concern

    3. Recommendation

    4. No Additional Feedback

  6. Use the text box below to enter details of the additional feedback (optional). Please do not include any personally identifiable information, Social Security Number, Veteran ID, or medical information, but do provide details about your experience. (Free-Text Response limited to 400 characters)

  7. Are you aware of the virtual care options (video Telehealth visit, phone visit) that VA offers? (respondents can select one option):

    1. Yes

    2. No

  8. When you consider your options for your care, do you prefer a video Telehealth visit, phone visit, or in-person visit? (respondents can select one option):

    1. Video Telehealth

    2. Phone

    3. In-Person

    4. No Preference

  9. Can VA contact you about your feedback? (respondents can select one option):

    1. Yes, VA can contact me about my patient experience.

    2. No, I do not want VA to contact me about my patient experience.

  10. Would you like to volunteer your demographic information to help VA better serve you? (respondents can select one option):

    1. Yes

    2. No

*Demographic Questions if Veterans Select “Yes” to Question 10:

  1. Are You Hispanic or Latino? (respondents can select one option):

    1. Yes

    2. No

  2. What is your race? Please choose one or more. (respondents can select multiple options)

    1. White

    2. Black or African American

    3. Asian

    4. Native Hawaiian or Other Pacific Islander

    5. American Indian or Alaska Native




Telehealth Survey

  1. The VA staff gave me information about connecting to my video Telehealth appointment. (Likert-Scale)

  2. Connecting to my VA Video Connect appointment was easy. (Likert-Scale)

  3. After I connected to my appointment, the overall quality of the video Telehealth visit remained good. (Likert-Scale)

  4. I was able to see the provider clearly by video. (Likert-Scale)

  5. I was able to hear the provider clearly by video. (Likert-Scale)

  6. At the beginning of the video Telehealth visit, the provider addressed privacy concerns. (Likert-Scale)

  7. I felt confident that the video Telehealth visit addressed my needs and the reason for the visit. (Likert-Scale)

  8. Overall, I am satisfied with the video Telehealth visit. (Likert-Scale)

  9. When you consider your options for your care, do you prefer a video Telehealth visit, phone visit, or in-person visit? (respondents can select one option):

    1. Video Telehealth

    2. Phone

    3. In-Person

    4. No Preference

  10. I trust Telehealth as part of my overall VA healthcare. (Likert-Scale)

  11. Would you like to provide additional feedback with a concern, compliment, or recommendation about your experience(s) with [Facility Name]? Please select from one of the following options. (respondents can select one option):

    1. Compliment

    2. Concern

    3. Recommendation

    4. No Additional Feedback

  12. Use the text box below to enter details of the additional feedback (optional). Please do not include any personally identifiable information, Social Security Number, Veteran ID, or medical information, but do provide details about your experience. (Free-Text Response limited to 400 characters)

  13. Can VA contact you about your feedback? (respondents can select one option):

    1. Yes, VA can contact me about my patient experience.

    2. No, I do not want VA to contact me about my patient experience.

  14. Would you like to volunteer your demographic information to help VA better serve you? (respondents can select one option):

    1. Yes

    2. No

*Demographic Questions if Veterans Select “Yes” to Question 14:

  1. Are You Hispanic or Latino? (respondents can select one option):

    1. Yes

    2. No

  2. What is your race? Please choose one or more. (respondents can select multiple options)

    1. White

    2. Black or African American

    3. Asian

    4. Native Hawaiian or Other Pacific Islander

    5. American Indian or Alaska Native








