Survey Sample Methodology

Clearance for A-11 Section 280 Improving Customer Experience Information Collection

OMB: 2900-0876



Service Level Measurements ECCC Clinical Contact Center Survey

Sampling Methodology Report



Prepared by

Veteran Experience Office

Version 1

June 2020

Executive Summary

The ECCC Clinical Contact Center Survey is designed to measure customer experience after contacting one of the clinical care contact centers.

Veteran experience data are collected using an online transactional survey disseminated via an invitation email sent to randomly selected beneficiaries. Data collection occurs once per week, with invitations sent within 8 days of the beneficiary's call to the ECCC Clinical Contact Center. The questionnaire is brief and contains general Likert-scale questions (a scale of 1-5, from Strongly Disagree to Strongly Agree) to assess customer satisfaction, as well as questions assessing the knowledge, speed, and manner of the interaction. After the survey has been distributed, recipients have two weeks to complete it and receive a reminder email after one week.

The overall sample size for the ECCC Clinical Contact Center Survey is selected to optimize the reliability of the monthly survey estimates for each survey type given the amount of available sample, while remaining conscious of the burden placed on the Veteran. In this case, the largest cohort (Nurse Triage) is targeted to achieve a +/-4% margin of error at a 95% confidence level. The survey will be sent to a representative sample of Veterans. Once data collection is completed, the participant responses to the online survey will be weighted.

This report describes the methodology used to conduct the ECCC Clinical Contact Center Survey. Information about quality assurance protocols, as well as limitations of the survey methodology, is also included in this report.

Part I – Introduction

A. Background

The Enterprise Measurement and Design (EMD) team is part of the Insights and Analytics (I&A) division within the Veterans Experience Office (VEO). The EMD team is tasked with conducting transactional surveys of the Veteran population to measure satisfaction with the Department of Veterans Affairs' (VA) numerous benefit services. Its mission is to empower Veterans by rapidly and discreetly collecting feedback on their interactions with VA entities such as NCA, VHA, and VBA. VEO surveys generally entail probability samples that contact only the minimal number of Veterans necessary to obtain reliable estimates. This information is subsequently used by internal stakeholders to monitor, evaluate, and improve beneficiary processes. Veterans are always able to decline participation and may opt out of future invitations. A quarantine protocol is maintained across all VEO surveys to limit the number of times a Veteran may be contacted and thereby prevent survey fatigue.

The VEO team designed two surveys related to the ECCC Clinical Contact Center. The standard survey will be administered to beneficiaries who receive services directly through the Clinical Contact Center (for RN Triage and Pharmacy). The Licensed Independent Provider (LIP) Survey will be administered to beneficiaries who receive telephonic care through contracted clinical care specialists.

In order to continue to provide quality services to Veterans, VEO has been commissioned to measure satisfaction with the ECCC Clinical Contact Center. To accomplish this goal, VEO proposed to conduct a brief transactional survey with selected Veterans who have received telephonic care through the ECCC Clinical Contact Center. The core survey consists of eight questions developed using human-centered design, focusing on Veterans' experience with regard to their recent encounter and centered on the factors of Trust, Ease, Effectiveness, Helpfulness, Quality, and Emotion. These Likert-scale (1-5) questions were designed through extensive Veteran input and recommendations from subject matter experts in the VA. Veterans also have an opportunity to provide a free-text response about their experience.1

Veterans are selected to participate in the survey via an invitation email. A link is enclosed so the survey may be completed using an online interface, with customized participant information. Data are collected on a weekly basis and the survey is reported on a monthly basis. The purpose of this document is to outline the planned sample design and provide a description of the data collection and the sample sizes necessary for proper reporting.


B. Basic Definitions

Coverage

The percentage of the population of interest that is included in the sampling frame.

Measurement Error

The difference between the response coded and the true value of the characteristic being studied for a respondent.

Non-Response

Failure of some respondents in the sample to provide responses in the survey.

Transaction

A transaction refers to a specific instance in which a Veteran interacts with the VA that impacts the Veteran's journey and their perception of VA's effectiveness in caring for Veterans.

Response Rate

The ratio of participating persons to the number of contacted persons. This is one of the basic indicators of survey quality.

Sample

In statistics, a data sample is a set of data collected and/or selected from a statistical population by a defined procedure.

Sampling Error

Error due to taking a particular sample instead of measuring every unit in the population.

Sampling Frame

A list of units in the population from which a sample may be selected.

Reliability

The consistency or dependability of a measure, often expressed in terms of the standard error.

C. Application to Veterans Affairs

Customer experience and satisfaction are usually measured at three levels: 1) the enterprise level, to track, monitor, and incentivize service quality; 2) the service level, for monitoring and insights; and 3) the point of service, for direct feedback. This measurement may bring insights and value to all stakeholders at VA. Front-line VA leaders can resolve individual feedback from Veterans and take steps to improve the customer experience; meanwhile, VA executives can receive real-time updates on systemic trends that allow them to make changes. The objectives of the ECCC Clinical Contact Center Survey are:

1) To collect continuous customer experience data.

2) To help field staff and the national office identify areas of improvement.

3) To understand emerging drivers and detractors of customer experience.

Part II – Methodology

A. Target Population, Frame, and Stratification

The target population of the ECCC Clinical Contact Center Survey is defined as all Veterans who have received telephonic care through the ECCC Clinical Contact Center in the preceding weeks; these Veterans are eligible for participation.

The sample frame is prepared by extracting population information directly from VHA's Corporate Data Warehouse. These extracts are also used to obtain universe figures for the sample weighting process. The Veteran is the primary sampling unit and is randomly selected from the population according to a stratified design. The primary stratification will be the type of contact, which falls into three strata: Nurse Triage, Pharmacy, and Licensed Independent Provider (LIP). The survey will also utilize implicit stratification, or balancing, by age, gender, and location.

  1. Sample Size Determination

To achieve a certain level of reliability, the required sample size is calculated as follows (Lohr, 1999).

For a population that is large, the equation below is used to yield a representative sample for proportions:

$$ n_0 = \frac{z^2\,pq}{e^2} $$

where

  • z = the critical Z score, which is 1.96 under the normal distribution when using a 95% confidence level (α = 0.05).

  • p = the estimated proportion of an attribute that is present in the population, with q = 1 − p.

  • Note that pq attains its maximum value when p = 0.5, or 50%. This is what is typically assumed in surveys where multiple measures are of interest. When examining measures closer to 100% or 0%, less sample is needed to achieve the same margin of error.

  • e = the desired level of precision, or margin of error. For example, for the ECCC Clinical Contact Center Survey the targeted margin of error is e = 0.04, or +/-4%.

For a population that is relatively small, the finite population correction is used to yield a representative sample for proportions:

$$ n = \frac{n_0}{1 + \frac{n_0}{N}} $$

where

  • n_0 = the representative sample for proportions when the population is large.

  • N = population size.
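As an illustration only (not part of the production system; the function name and defaults are assumptions), a minimal Python sketch of these two calculations might look like:

```python
import math

from scipy.stats import norm


def sample_size(moe, confidence=0.95, p=0.5, N=None):
    """Representative sample size for a proportion (Lohr, 1999).

    moe        -- desired margin of error, e.g. 0.04 for +/-4%
    confidence -- confidence level, e.g. 0.95
    p          -- assumed proportion (0.5 is the conservative default)
    N          -- population size; when given, the finite population
                  correction n = n0 / (1 + n0 / N) is applied
    """
    z = norm.ppf(1 - (1 - confidence) / 2)   # critical Z, 1.96 at 95%
    n0 = z**2 * p * (1 - p) / moe**2         # large-population formula
    if N is not None:
        n0 = n0 / (1 + n0 / N)               # finite population correction
    return math.ceil(n0)


print(sample_size(0.04))             # large population: about 601
print(sample_size(0.04, N=14508))    # finite monthly population: smaller
```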



The margin of error surrounding the baseline proportion is calculated as:

$$ e = z\sqrt{\frac{pq}{n}\cdot\frac{N-n}{N-1}} $$

where

  • z = 1.96, the critical Z score under the normal distribution when using a 95% confidence level (α = 0.05).

  • N = population size.

  • n = representative sample size.

  • p = the estimated proportion of an attribute that is present in the population, with q = 1 − p.
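Correspondingly, a minimal sketch of the margin-of-error check under the same assumptions (illustrative names; scipy supplies the critical value):

```python
import math

from scipy.stats import norm


def margin_of_error(n, N, confidence=0.95, p=0.5):
    """Margin of error for a proportion estimated from n of N units."""
    z = norm.ppf(1 - (1 - confidence) / 2)   # critical Z score
    fpc = (N - n) / (N - 1)                  # finite population correction
    return z * math.sqrt(p * (1 - p) / n * fpc)


# Roughly +/-4% for 560 monthly responses out of ~14,508 monthly callers
print(round(margin_of_error(560, 14508), 4))
```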



Estimates from the population files drawn for the first 5 months of 2020 indicate that, in an average month, 16,508 calls are made to the ECCC Clinical Contact Center by Veterans. The proposed sample plan is designed to achieve an MOE of +/-4% at 95% confidence for the Nurse Triage stratum. For the LIP and Pharmacy strata, there is insufficient sample to achieve this level of accuracy. The plan, therefore, is to achieve an MOE of +/-5% at 80% confidence for LIP by using most, if not all, of the available sample. Pharmacy calls have the least available sample; to address this, the plan recommends using all available sample.

Table 1A indicates the population figures based on numbers from that period, as well as estimated population with email addresses on file and the proportion that is likely to be usable after removing duplicates and quarantine rules across VEO surveys.

Table 1A. Target Population Figures, Sample Size, and Email Contacts

| | Estimated Monthly Callers | Estimated Monthly Callers w/ Email Addresses | Estimated Monthly Callers w/ Email Addresses Available After Exclusion Rules and Deduplication | Target MOE2 | Confidence | Minimum Monthly Responses Needed | Response Rate | Minimum Monthly Sample Needed |
|---|---|---|---|---|---|---|---|---|
| Nurse Triage | 14,508 | 8,175 | 6,540 | 4.00% | 95% | 560 | 18% | 3,107 |
| LIP | 1,677 | 1,082 | 866 | 5.00% | 80% | 143 | 18% | 791 |
| Pharmacy | 323 | 147 | 117 | 13.00% | 80% | 21 | 18% | 117 |


Table 1B shows the estimated weekly sample frame, sample availability, and minimum target sample sizes. Minimum targets are rounded upward to assure the prescribed accuracy is achieved.

Table 1B. Weekly Sample Availability and Sample Needs

| | Estimated Weekly Callers w/ Email Addresses Available After Exclusion Rules and Deduplication | Minimum Weekly Sample Needed | Rounded Weekly Sample Target | Sampling Rate |
|---|---|---|---|---|
| Nurse Triage | 1,505 | 129 | 720 | 47.8% |
| LIP | 199 | 33 | 185 | 93.0% |
| Pharmacy | 27 | 5 | 27 | 100.0% |


The sample will be drawn using a systematic sampling methodology. This statistically valid approach allows the team to balance the sample across several variables such as age, gender, and location. These balancing variables are often referred to as implicit strata. This approach has been shown to stabilize trends and improve the accuracy of estimates.
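A minimal sketch of systematic sampling with implicit stratification, assuming a pandas frame whose column names (age, gender, location) are illustrative, not the production schema:

```python
import numpy as np
import pandas as pd


def systematic_sample(frame, n, seed=None):
    """Draw a systematic sample of size n with implicit stratification.

    Sorting by the balancing variables before taking every k-th record
    spreads the sample proportionally across those variables.
    """
    rng = np.random.default_rng(seed)
    ordered = frame.sort_values(["age", "gender", "location"]).reset_index(drop=True)
    k = len(ordered) / n                 # fractional sampling interval
    start = rng.uniform(0, k)            # random start within the first interval
    picks = (start + k * np.arange(n)).astype(int)
    return ordered.iloc[picks]
```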

Email addresses will be acquired by matching Veteran ID numbers to the VBA's Enterprise Data Warehouse (EDW) and the VHA's Corporate Data Warehouse (CDW). The CDW will be prioritized if the two sources produce different valid email addresses. Each email address encountered is validated in several ways (a sketch of these checks follows the list):

  • Validation that the email address has a valid structure

  • Comparison with a database of bad domains

  • Correction of common domain misspellings

  • Comparison with a database of bad emails, including:

    • Opt outs

    • Emails held by multiple Veterans

  • Comparison with a database of valid TLDs (e.g., “.com”, “.edu”)
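The following sketch illustrates checks of this kind; the pattern, domain lists, and correction map are illustrative placeholders rather than the production rules:

```python
import re

EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # basic structural check
BAD_DOMAINS = {"bounces.example.net"}                      # placeholder list
DOMAIN_FIXES = {"gmial.com": "gmail.com"}                  # common misspellings
VALID_TLDS = {"com", "edu", "gov", "org", "net", "mil"}    # placeholder list


def clean_email(address, opt_outs=frozenset(), shared=frozenset()):
    """Return a corrected address, or None if any validation rule fails."""
    address = address.strip().lower()
    if not EMAIL_PATTERN.match(address):
        return None                              # invalid structure
    local, domain = address.rsplit("@", 1)
    domain = DOMAIN_FIXES.get(domain, domain)    # fix common domain typos
    address = f"{local}@{domain}"
    if domain in BAD_DOMAINS or address in opt_outs or address in shared:
        return None                              # bad domain, opt-out, or shared email
    if domain.rsplit(".", 1)[-1] not in VALID_TLDS:
        return None                              # unrecognized top-level domain
    return address
```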



  2. Data Collection Methods

Invitations will be sent out each week to assure that initial invites arrive within eight days of the Veteran's call to the ECCC Clinical Contact Center. Caller information will be regularly extracted from the VHA's Corporate Data Warehouse (CDW). The extraction process will be executed and validated by the Office of Performance Improvement and Assessment (PA&I), with the population extracts sent to VEO twice a week. Invitations will be sent on Mondays. Invitees who have not completed the survey will receive a reminder after one week. The survey will remain open for a total of two weeks. Survey responses are available within VSignals as soon as feedback is submitted.

  3. Reporting

Researchers will be able to use the Veteran Signals (VSignals) system for interactive reporting and data visualization. VA employees with a PIV card may access the system at https://va.voice.medallia.com/sso/va/. Scores may be viewed by Age Group, Gender, and Race/Ethnicity in various charts for different perspectives. They are also depicted within time-series plots to investigate trends. Finally, filter options are available to assess scores over varying time periods and within the context of other collected variables.

Recruitment is continuous, but the results should be combined into a monthly data file for more precise estimates at the call-center level. Short-interval estimates are less reliable for small domains (i.e., VAMC-level) and should only be considered for aggregated populations. Monthly estimates will have larger sample sizes and therefore higher reliability. Estimates over longer periods are the most precise but take the greatest amount of time to obtain and are less dynamic, in that trends and short-term fluctuations in service delivery may be missed. Users examining subpopulations should be particularly diligent in assuring that insights stem from analyses with sufficient sample in the subpopulations being examined or compared.

  4. Quality Control

To prevent errors and inconsistencies in the data and the analysis, quality control procedures will be instituted at several steps of the survey process. Records will undergo cleaning during population file creation. The quality control steps are as follows (a sketch of these steps appears after the list).

  1. Records will be reviewed for missing sampling and weighting variable data. When records with missing data are discovered, they will either be excluded from the population file or placed into separate strata after discussion with subject matter experts.

  2. Any duplicate records will be removed from the population file to both maintain the probabilities of selection and prevent the double sampling of the same Veteran.

  3. Invalid emails will be removed.
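A minimal pandas sketch of these cleaning steps, showing the exclusion branch of step 1 only (column names are illustrative assumptions):

```python
import pandas as pd


def clean_population(pop):
    """Apply the population-file quality control steps described above."""
    required = ["veteran_id", "age", "gender", "location", "contact_type"]
    pop = pop.dropna(subset=required)               # 1. drop records missing key data
    pop = pop.drop_duplicates(subset="veteran_id")  # 2. one record per Veteran
    return pop[pop["email"].notna()]                # 3. drop records without an email
```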

The survey sample loading and administration processes will have quality control measures built into them.

  1. The survey load process will be rigorously tested prior to the induction of the survey to ensure that sampled customers are not inadvertently dropped or sent multiple emails.

  2. The email delivery process is monitored to ensure that bounce-back records do not hold up email delivery.

The weighting and data management quality control checks are as follows:

  1. The sum of the weighted respondents will be compared to the overall population count to confirm that the records are being properly weighted. When the sum does not match the population count, weighting classes will be collapsed to correct this issue.

  2. The unequal weighting effect will be used to identify potential issues in the weighting process. Large unequal weighting effects indicate a problem with the weighting classes, such as a record receiving a large weight to compensate for nonresponse or coverage bias.

  5. Sample Weighting, Coverage Bias, and Non-Response Bias

Weighting is commonly applied in surveys to adjust for nonresponse bias and/or coverage bias. Nonresponse is defined as the failure of selected persons in the sample to provide responses. This is observed in virtually all surveys, in that some groups are more or less prone to complete the survey. Nonresponse may cause some groups to be over- or under-represented. Coverage bias is another common survey problem, in which certain groups of interest in the population are not included in the sampling frame. Here, the reason these Veterans cannot participate is that they cannot be contacted (no email address is available). In both cases, the exclusion of these portions of Veterans from the survey contributes to measurement error. The extent to which the final survey estimates are skewed depends on the nature of the data collection processes within an individual line of business and the potential alignment between Veteran sentiment and likelihood to respond.

Survey practitioners recommend the use of sample weighting to improve inference on the population, so that the final respondent sample more closely resembles the true population. It is likely that differential response rates will be observed across age and gender groups. Weighting can help adjust the demographic representation by assigning larger weights to underrepresented groups and smaller weights to overrepresented groups. Stratification can also be used to adjust for nonresponse by oversampling subgroups with lower response rates. With either adjustment, weighting may result in substantial corrections to the final survey estimates when compared to direct estimates in the presence of non-negligible sampling error.

Weights are updated live within the VSignals reporting platform3. Proportions are set based on the distribution from the previous month.4

If we let wij denote the sample weight for the ith person in group j (j = 1, 2, and 3), then the cell weighting (CW) formula is:

$$ w_{ij} = \frac{N_j}{r_j} $$

where Nj is the population count and rj is the number of respondents in group j.
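A small sketch of this computation, using the Table 1A strata counts purely as illustrative inputs (the group labels and helper name are assumptions):

```python
def cell_weights(pop_counts, resp_counts):
    """Cell weights: population count over respondent count per group."""
    return {g: pop_counts[g] / resp_counts[g] for g in pop_counts}


weights = cell_weights(
    pop_counts={"nurse_triage": 14508, "lip": 1677, "pharmacy": 323},
    resp_counts={"nurse_triage": 560, "lip": 143, "pharmacy": 21},
)
# Weighted respondents then sum back to the population in each group,
# which is the validation check described below.
print(weights)
```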

As part of the weighting validation process, the weights of persons in each age and gender group are summed and verified to match the universe estimates (i.e., population proportions). Additionally, we calculate the unequal weighting effect, or UWE (see Kish, 1992; Liu et al., 2002). This statistic indicates the amount of variation that may be expected due to the inclusion of weighting. The unequal weighting effect estimates the percent increase in the variance of the final estimate due to the presence of weights and is calculated as:

$$ \mathrm{UWE} = cv^2 = \left(\frac{s}{\bar{w}}\right)^2 $$

where

  • cv = coefficient of variation of all weights.

  • s = sample standard deviation of the weights.

  • w̄ = sample mean of the weights wij.
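A short sketch of this validation statistic (a hypothetical helper, not VEO production code):

```python
import numpy as np


def unequal_weighting_effect(weights):
    """UWE = cv**2, the squared coefficient of variation of the weights.

    Approximates the proportional increase in the variance of survey
    estimates attributable to unequal weights (Kish, 1992).
    """
    w = np.asarray(weights, dtype=float)
    cv = w.std(ddof=1) / w.mean()   # sample SD over sample mean
    return cv**2


print(unequal_weighting_effect([1.0, 1.2, 0.8, 1.5]))  # about 0.07 here
```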



  6. Quarantine Rules

VEO seeks to limit contact with Veterans as much as possible, contacting them only as needed to achieve measurement goals. The following rules are enacted to prevent excessive recruitment attempts. VEO also monitors Veteran participation in other surveys to ensure Veterans do not experience survey fatigue. All VEO surveys offer respondents the option to opt out, ensuring they are no longer contacted for a specific survey.


Table 5. Proposed Quarantine Protocol

| Quarantine Rule | Description | Elapsed Time |
|---|---|---|
| Past Waves | Number of days between completing any VEO survey online and receiving another invitation. | 30 days |
| Active Waves | Number of days between receiving an invitation to a VEO survey and receiving another invitation. | 14 days |
| Anonymous | Callers explicitly wishing to remain anonymous will not be contacted. | N/A |
| Opt Outs | Persons indicating their wish to opt out of either phone or online surveys will no longer be contacted. | N/A |



Part III – Assumptions and Limitations

A) Population Estimation Error

The population estimates for this survey include some uncertainty due to: 1) fluctuation in call volumes due to the current pandemic (COVID-19); 2) an increase over time in the use of telemedicine; and 3) potential policy shifts (e.g., greater reliance on contractors or LIPs). The estimates attempt to account for these factors. Nonetheless, a large amount of uncertainty remains. To address this risk, we recommend evaluating the sample plan over time to determine how well the estimates hold up.

B) Coverage Bias due to Email-Only Data Collection

Since the ECCC Clinical Contact Center Survey is email-only, there is a segment of the population of ECCC Clinical Contact Center callers that cannot be reached by the survey. This segment corresponds to persons who lack access to the internet, do not have an email address, or elect not to share their email address with the VA. Such beneficiaries may have different levels of satisfaction with the service they received.










Index 1. Survey Questions


| Standard Survey Questions | A-11 Customer Experience Domains |
|---|---|
| I waited a reasonable amount of time to speak to an agent. | Efficiency/Speed |
| It was easy to reach the right person about my need. | Ease/Simplicity |
| The agent took a reasonable amount of time to address my need. | Efficiency/Speed |
| I understood the information provided by the [contact center agent]. | Employee Helpfulness |
| The agent I interacted with was helpful. | Employee Helpfulness |
| The issue that I contacted [contact center] about on [date/today] was resolved. | Quality |
| I am satisfied with the service I received from [contact center]. | Satisfaction |
| I trust VA to fulfill our country's commitment to Veterans. | Confidence/Trust |


| LIP (Telehealth) Survey | A-11 Customer Experience Domains |
|---|---|
| I am satisfied with the care I received during this interaction / appointment. | Satisfaction |
| It was easy to reach the right person about my need. | Ease/Simplicity |
| The issue that I contacted [contact center] about on [date/today] was addressed. | Quality |
| The VA telehealth provider I interacted with was helpful. | Employee Helpfulness |
| I understood the information provided by the VA telehealth provider. | Employee Helpfulness |
| After my virtual visit, I knew what I needed to do next. | Ease/Simplicity |
| I waited a reasonable amount of time to speak to a VA telehealth provider. | Efficiency/Speed |
| The VA telehealth provider took a reasonable amount of time to address my need. | Efficiency/Speed |
| If you sought a telehealth appointment at VA, tell us the main reason why: I did not want to go to VA in person because of COVID-19 concerns; I was given the option to have a telehealth appointment because it would meet my care needs; Other. | N/A |
| I trust VA to fulfill our country's commitment to Veterans. | Confidence/Trust |



Index 2. References

Choi, N.G., & Dinitto, D.M. (2013). Internet Use Among Older Adults: Association with Health Needs, Psychological Capital, and Social Capital. Journal of Medical Internet Research, 15(5), e97.

Kish, L. (1992). Weighting for unequal P. Journal of Official Statistics, 8(2), 183-200.

Lohr, S. (1999). Sampling: Design and Analysis. Boston, MA: Cengage Learning.

Liu, J., Iannacchione, V., & Byron, M. (2002). Decomposing design effects for stratified sampling. Proceedings of the American Statistical Association’s Section on Survey Research Methods.












1 The LIP Survey differs from the core questionnaire, with the exception of a universal Trust question. It contains an additional Likert-scale question and a question regarding the reason for seeking telehealth services.

2 MOE measures assume that non-response to the survey is randomly distributed.

3 Realtime weighting may cause some distortions at the beginning of each cycle due to empty cells or random variance in small sample distributions.

4 Using the previous month's data is a design option for handling the problem of setting targets prior to fielding each month. An alternative design is to set targets from annualized estimates to create more stability month to month. If the population is known to fluctuate from month to month, past-month population estimates may not be the optimal solution.


