VBA Call Center Satisfaction Survey

Supporting Statement B for:

OMB #2900-0744

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS



1. Provide a numerical estimate of the potential respondent universe and describe any sampling or other respondent selection method to be used. Data on the number of entities (e.g., households or persons) in the universe and the corresponding sample are to be provided in tabular format for the universe as a whole and for each stratum. Indicate expected response rates. If this has been conducted previously, include actual response rates achieved.


The target population for this survey is as follows:

  • Survey of Satisfaction with the VBA Call Center Experience: gathers satisfaction data from clients (customers) who recently contacted a VBA National Call Center, the Education Call Center, or the National Pension Call Center.

This section describes the universe of respondents for the survey and the sampling plan that will be employed to achieve a representative sample.


Sample Design


Each call center will collect caller data (name, phone number) every third business day throughout the month, for up to six days per month (with the exception of holidays). A minimum of 4,500 call records will be provided each month for each call center in order to complete 333-334 interviews per call center and to ensure a representative random sample for the requisite 3,000 total surveys per month.
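
As a minimal sketch of this arithmetic (assuming the nine call centers implied by the stated figures; the count is not given explicitly in this paragraph), the monthly and annual targets can be reproduced as follows:

```python
# Sketch of the sampling targets described above. The number of call
# centers (9) is inferred from 3,000 monthly surveys / 333-334 per
# center; it is an assumption, not a figure stated in this paragraph.

RECORDS_PER_CENTER_PER_MONTH = 4_500  # minimum call records provided
N_CENTERS = 9                         # inferred from the stated totals

interviews_per_center = 3_000 / N_CENTERS                       # ~333.3, i.e., 333-334
annual_surveys = 3_000 * 12                                     # 36,000, per Table 5
annual_records = RECORDS_PER_CENTER_PER_MONTH * N_CENTERS * 12  # 486,000, per Table 5

print(interviews_per_center, annual_surveys, annual_records)
```

The annual record count (486,000) and completed-survey count (36,000) reconcile with Table 5 below.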


The National Call Centers and the National Pension Call Center will collect caller data using the Customer Relationship Management/Unified Desktop-Optimized (CRM/UD-O) system. For these call centers, caller data for every call received on a collection day is delivered to the contractor1 for computer-assisted telephone interviews (CATI). The Education Call Center has not implemented the CRM/UD-O system and will collect sample records manually until CRM/UD-O is implemented. In order to minimize the burden of manual sample collection on call center representatives at the Education Call Center, call center managers provide a rotation of 20 call center representatives to collect caller data on each sample collection day. The rotation is utilized to ensure a representative sampling of incoming calls. On the sample collection day, each selected representative will collect data for all of the incoming calls they receive. The collected data will be sent to the contractor, who will randomly select 500 records from each call center for interviewing. CATI interviews are conducted until 55-56 interviews per call center are completed from each sample file. This ensures that interviews are collected throughout the month and that the requisite 333-334 surveys per call center per month constitute a representative random sample (36,000 total surveys annually).
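
The following is an illustrative sketch of the per-file selection step described above, not the contractor's production CATI system: draw 500 random records from one center's sample file, then attempt interviews until the per-file quota of 55-56 completes is met. The attempt_interview callable is a hypothetical stand-in for a CATI dialing attempt.

```python
import random

PER_FILE_DRAW = 500   # records randomly selected per call center file
PER_FILE_QUOTA = 56   # 55-56 completed interviews per sample file

def select_sample(call_records, seed=None):
    """Randomly select up to PER_FILE_DRAW records from one center's file."""
    rng = random.Random(seed)
    return rng.sample(call_records, min(PER_FILE_DRAW, len(call_records)))

def dial_until_quota(sample, attempt_interview):
    """Dial sampled records until the per-file quota is reached.

    attempt_interview is a hypothetical callable standing in for a
    CATI attempt; it returns True when an interview is completed.
    """
    completes = []
    for record in sample:
        if attempt_interview(record):
            completes.append(record)
        if len(completes) >= PER_FILE_QUOTA:
            break
    return completes
```

Because each sample file yields only 55-56 completes, interviews accumulate across the month's collection days toward the 333-334 per-center total.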


The contractor is a member of CASRO (Council of American Survey Research Organizations)2 and adheres to the CASRO Code of Standards and Ethics for Survey Research, an internationally cited set of standards that has long been the benchmark for the market research industry. As noted, the contractor is subject to change during the life of the program. Future contractors may hold different survey research memberships, which would be reported through administrative changes to the collection.


Table 5 displays the number of clients to be surveyed annually, the expected response rate, the FY2017 response and cooperation rates, and the expected yield of completed Call Center Satisfaction Surveys. VA anticipates a response rate of 40%.



TABLE 5:

CALL CENTER SATISFACTION SURVEY, EXPECTED RESPONSE RATE AND SURVEY YIELD

| Number of Customers | Expected Response Rate | FY2017 Response Rate3 | FY2017 Cooperation Rate4 | Completed Surveys Expected |
| --- | --- | --- | --- | --- |
| 486,000 | 40% | 46% | 85% | 36,000 |



VA and the contractor also align their practices with the Office of Management and Budget (OMB) Guidance on Agency Survey and Statistical Information Collections, January 20, 2006: “Response rates have been calculated in a wide variety of ways, making comparisons across different surveys difficult. Recently, there have been attempts to standardize the calculation of response rates to provide a common basis for comparison. For example, the American Association for Public Opinion Research (AAPOR) has provided a set of six standard definitions of response rates as well as other formulas for calculating cooperation rates, refusal rates, and contact rates. The variations in response rate calculations depend on how partial responses are considered and how cases of unknown eligibility are handled. Agencies are encouraged to use the AAPOR standard formulas in calculating and reporting response rates in their ICRs; however, agencies may use other formulas as long as the method used to calculate response rates is documented in the [information collection request] ICR.”


This ICR focuses on the cooperation rate as the measure of survey response. VA and the contractor also align their practices with the OMB Standards and Guidelines for Statistical Surveys (September 2006) and Questions and Answers When Designing Surveys for Information Collections (modified October 2016; originally published January 2006).
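
For reference, the response and cooperation rate formulas cited in footnotes 3 and 4 can be written out directly. The sketch below mirrors the disposition categories named in those footnotes; any counts supplied to it would be hypothetical.

```python
def aapor_rr3(completes, mid_terminates, refused, language_problem,
              take_off_list, active_numbers, e=0.90):
    """AAPOR Response Rate 3, per footnote 3: 90% of still-active
    (unresolved) numbers are counted as eligible in the denominator."""
    return (completes + mid_terminates) / (
        completes + mid_terminates + refused + language_problem
        + take_off_list + e * active_numbers)

def cooperation_rate(completes, screened_ineligible, mid_terminates,
                     refused, take_off_list):
    """Cooperation rate, per footnote 4. screened_ineligible stands in
    for the three screen-out categories (named respondent works for VA,
    did not contact the VA, respondent called more than once that day)."""
    return (completes + screened_ineligible) / (
        completes + screened_ineligible + mid_terminates
        + refused + take_off_list)
```

Because the cooperation rate excludes never-reached active numbers from its denominator, it runs higher than Response Rate 3 (85% versus 46% in Table 5).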



2. Describe the procedures for the collection of information, including: Statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose in the proposed justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The Call Center Satisfaction Survey will entail simple random sampling of 4,500 client interactions per month from each National Call Center (NCC), the National Pension Call Center (NPCC), and the Education Call Center (ECC). VBA is using a 95% confidence interval for categorical variables for the survey. No unusual procedures will be required to draw a representative sample meeting these criteria.
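
As a sketch of the precision this design implies, the half-width of a 95% confidence interval for a proportion from a simple random sample, at the conservative p = 0.5 and without a finite-population correction, can be computed as follows (sample sizes taken from the design above):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% CI for a proportion from a simple random
    sample of size n (no finite-population correction)."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(334), 3))    # ~0.054: one call center, one month
print(round(margin_of_error(3_000), 3))  # ~0.018: all centers, one month
```

This suggests monthly per-center estimates are precise to roughly ±5 percentage points, and pooled monthly estimates to roughly ±2 points.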


3. Describe methods used to maximize the response rate and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


VBA will obtain the services of a contractor to develop, administer, and analyze this survey.


Strategies to Maximize Response Rates


VBA will employ a variety of methods to minimize respondent burden and to maximize survey response rates. This section identifies the strategies to be employed to reach these objectives. Each strategy is outlined below.


Strategy #1 to Maximize Response Rates: Conducting Cognitive Labs/Pre-Testing of Surveys


The contractor conducted cognitive tests of the survey instrument prior to fielding the initial survey in 2010. Cognitive lab feedback was implemented to ensure the survey was easily understood and correctly interpreted. Detail on the tests can be found in the response to question 4.


Strategy #2 to Maximize Response Rates: Maintaining a Toll-Free Survey Hotline


During the period that the survey is in the field, the contractor will provide and maintain a toll-free telephone line to answer any questions respondents or VA employees may have about the survey (e.g., how to interpret questions and response items, the purpose of the survey, etc.). An after-hours voice messaging system will be available to receive messages after regular business hours.


Strategy #3 to Maximize Response Rates: Excluding Questions of a “Sensitive” Nature


None of the questions included in the survey are sensitive or private in nature, which will encourage compliance.


Strategy #4 to Maximize Response Rates: Maintaining Confidentiality and Opt-In Provisions

Survey respondents will be assured that their responses will not impact their current or future eligibility for benefits. Additionally, the survey is designed to provide anonymous feedback. Respondents may elect to have their comments associated with their name for additional file review by VBA to provide further representative training or refine processes. Responses are reported to VA anonymously unless there is a danger to the respondent or others (e.g., threats of self-harm, assault, illegal activities) or the respondent opts in.


Strategy #5 to Maximize Response Rates: Secure Networks and Systems


The contractor has a secure network infrastructure that will be validated by VBA information security officers (ISOs). The servers are protected by a strong firewall system, and the operations center is in a secure, temperature-controlled environment where network services are continually monitored by automated real-time programs to ensure the integrity and availability of all critical components. All key servers are supported by a backup power supply that can continue to run the systems in the event of a power outage. Additionally, the contractor will be immediately alerted if critical monitoring thresholds are exceeded, so the contractor can respond proactively before outages occur.


Strategy #6 to Maximize Response Rates: Providing a Clear Explanation of the Purpose of the Study and How the Data Collected Will Be Used


If respondents raise questions about the study during the interviews, the contractor will provide VBA-approved explanations of the purpose of the study and how the study results will be used. Respondents will be assured that their responses will not impact their current or future eligibility for benefits and will not be linked to them unless they opt in.


Approach to Examine Non-Response Bias


Two types of non-response can affect the interpretation and generalization of survey data: item non-response and unit non-response. Item non-response occurs when one or more survey items are left blank in an otherwise completed, returned questionnaire. Item non-response is of limited concern for this survey because satisfaction items (i.e., ratings of knowledge, courtesy, responsiveness, etc.) tend to be highly correlated (r ≥ .70) and are therefore collinear. Moreover, the Call Center Satisfaction Survey has a missing item response rate of 0% (an item response rate of 100%) because the survey is administered by phone and respondents are asked to answer all questions, subject to skip patterns. The second type, unit non-response, is non-participation by an individual who was intended to be included in the survey sample but failed to respond. Unit non-response, the failure to complete the survey, is what is generally recognized as the source of survey non-response bias. The Call Center Satisfaction Survey has a unit response rate of 46%.


Non-response bias refers to the error expected in estimating a population characteristic when some members of the target population are not included, or are under-represented, in the resulting sample of survey data. Stated more technically, non-response bias is the difference between a survey estimate and the actual population value. Two factors determine the degree to which non-response bias will impact the accuracy and reliability of the resulting survey estimate. The first factor is the overall amount of non-response (i.e., the overall response rate). The second factor is whether there are meaningful differences between those who respond and those who do not on the key outcome measures. Since the overall response rate for the Call Center Satisfaction Survey is high and the sample is representative of the total population, non-response bias is unlikely to impact the accuracy and reliability of the survey estimate.
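
To make these two factors concrete, a standard decomposition expresses the bias in the respondent mean as the product of the non-response rate and the respondent/non-respondent difference on the outcome. The sketch below illustrates this with hypothetical figures:

```python
# Minimal sketch of the standard non-response bias decomposition:
# bias(ybar_r) = (1 - RR) * (ybar_r - ybar_nr),
# where RR is the unit response rate, ybar_r the respondent mean, and
# ybar_nr the non-respondent mean on the outcome of interest.

def nonresponse_bias(response_rate, mean_respondents, mean_nonrespondents):
    return (1 - response_rate) * (mean_respondents - mean_nonrespondents)

# Hypothetical example: at the survey's 46% unit response rate, a
# 0.2-point respondent/non-respondent gap on a satisfaction rating
# would bias the estimate by 0.54 * 0.2 = 0.108 points.
print(nonresponse_bias(0.46, 8.1, 7.9))
```

When the respondent/non-respondent difference is small, even a modest response rate produces little bias; this is why the follow-up comparisons described below matter as much as the response rate itself.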


There are two approaches to addressing the effects of non-response bias. One is to minimize the chance of non-response at the data collection stage; this may involve introducing measures that aim to maximize the response rate. The other is to make statistical adjustments at the survey follow-up stage, once all of the data have been collected. Both approaches are described in the paragraphs that follow.


Since it is not always possible to measure the actual bias due to unit non-response, there are strategies for reducing non-response bias by maximizing response rates across all types of respondents. In the face of a long-standing trend of declining response rates in survey research (Steeh, 1981; Smith, 1995; Bradburn, 1992; De Leeuw & Heer, 2002; Curtin & Presser, 2005), these strategies include:


  • Contacting respondents a maximum of 6 times – varying the call time in the respondents’ time zone to increase the likelihood of completing the survey and scheduling callbacks during respondent requested times.

  • Use of well-designed questionnaires and the promise of confidentiality.

  • Providing a contact name and telephone number for inquiries.



Applying these strategies to the administration of the survey will be crucial for maximizing response rates across all respondent types (see the section on maximizing response rates above).


Non-response follow-up analyses can help identify potential sources of bias and can help reassure data users, as well as the agency collecting and releasing the data, of the quality of the data collected. The examination of potential non-response bias will be conducted as follows:


  • Compare the Demographics of Respondents to the Demographics of Non-Respondents. To examine the presence of non-response bias, VA will compare the demographics of responders (i.e., those who responded to the VBA Call Center Satisfaction Survey) to those of non-responders (i.e., those who were sampled but did not respond).


  • The comparison between responders and non-responders will be made on the following variables for this survey, each listed with the rationale for its analysis; an illustrative sketch of such a comparison follows this list.

    • Reason for or outcome of the call center contact: the reason for contacting VBA, or the resulting outcome of the contact (e.g., claim denied), may make some clients more or less likely to complete the survey.

    • Degree and depth of interaction with call centers: respondents and non-respondents may differ with respect to the degree and depth of their interaction with VBA call centers.

    • Call Center: participants who interact with certain call centers may respond to the survey at a higher rate than those who interact with other call centers.
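
Below is an illustrative sketch of one such comparison: a chi-square test of whether response status is independent of call center. The counts are hypothetical, the scipy library is assumed to be available, and the same approach applies to the other variables listed above.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts. Rows: respondents, non-respondents;
# columns: three call centers.
counts = [
    [120, 110, 104],  # respondents, by call center
    [130, 155, 141],  # non-respondents, by call center
]
chi2, p_value, dof, expected = chi2_contingency(counts)

# A small p-value would indicate that response propensity differs by
# call center, flagging a potential source of non-response bias.
print(f"chi2={chi2:.2f}, p={p_value:.3f}, dof={dof}")
```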



Based on the steps discussed above, VBA will identify issues with respect to non-response bias for the survey.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions of 10 or more individuals.


The contractor conducted cognitive labs with three test users to determine whether respondents understand the survey questions and answer choices as intended. Working closely with VBA, the contractor drew a small pool of names from potential participants in the survey for inclusion in the cognitive labs. Cognitive lab participants were drawn from the same population that will be used for the main study. The contractor submitted the list of potential participants to VBA for review and approval. Once identified, the contractor contacted participants by telephone and asked them to participate.


Once the participants were selected, VA conducted cognitive lab sessions aimed at identifying needed additions or refinements to the questionnaire. Cognitive labs are one-on-one sessions with potential survey participants, in which respondents are asked to complete the questionnaire while thinking aloud. The primary purpose of these sessions was to gather feedback on survey questions and answer choices to ensure they are easily understood and correctly interpreted. Outcomes of cognitive labs include, but are not limited to: addition or omission of specific questions, changes to wording of questions, clarification of question response options, addition of response options, and changes to ordering of questions.


In accordance with the Memorandum for the Heads of Executive Departments and Agencies and Independent Regulatory Agencies dated July 22, 2016, the Office of Information and Regulatory Affairs notes in Flexibilities under the Paperwork Reduction Act for Compliance with Information Collection Requirements that “Non-substantive changes may also be used to facilitate and finalize larger changes to a particular collection, as long as the public is provided with some opportunity to comment on possible options or changes, as well as the circumstances that will trigger those options, as part of the original approval.”


VBA acknowledges that non-substantive changes will be required to align the survey instruments with current VBA processes. VBA anticipates that two to three questions on each form in this information collection may require wording changes each year. Wording changes will include updates to naming conventions (e.g., websites and benefits programs), internal process changes (e.g., form delivery options, application options), Agency Performance Goal changes, and refinement of questions to better reflect the intention of the agency. Questions and components of the survey (e.g., introduction and closing) may be moved based on respondent feedback received during CATI interviews. Response options, and the coding for those options in the CATI system, may be altered. Respondent feedback will also be used to determine when questions should be skipped or omitted based on previous line-item responses.


VBA also anticipates refinement of questions based on customer feedback during CATI interviews. Examples of refinement include splitting a question into multiple parts for clarity and asking open-capture questions to better target respondents' core needs. Questions may also be added to future VBA contact center customer service instruments submitted under this control number, and any new associated burden will be accounted for in such submissions.



Additionally, survey questions may require further cognitive labs and focus groups with more than five participants. There may be multiple rounds of cognitive labs or focus groups to refine the survey questions. Cognitive labs, if required, will not receive separate public comment notice. Final survey instruments and updated respondent burdens will be provided in accordance with OMB standards.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The following is a list of the persons involved in the survey.


Statistical Design Contractors

Mr. Greg Truex, MPP, J.D. Power (contractor), (805) 418-8078

Ms. Tara Porter, Project Manager, J.D. Power (contractor), (202) 383-3707

Jay Myers, Statistician, J.D. Power (contractor), (805) 418-8078




Statistical Design and Analysis

Maribel Aponte, Survey Statistician, VA, (202) 266-4688

Pamela Liverman, Assistant Director, VBA, (202) 560-9439

Dawna Quick, Program Analysis, VBA, (202) 530-9397

Kathleen Reavy, Contacts Operations Manager, VBA, (215) 842-2000, ext. 4253

Regina Yount, BAS Training Chief, (202) 530-9229


1 The current contractor is J.D. Power. Contract personnel are subject to change and will be updated through administrative changes. The term “contractor” is used throughout the document.

2 CASRO represents more than 300 companies and market research operations in the United States and abroad. http://www.casro.org/

3 AAPOR Response Rate 3 formula = (Completes + Qualified Mid-Terminates) / (Completes + Qualified Mid-Terminates + Refused + Language Problem + Term: Take Me Off the List + (0.90 × Active Numbers))

4 Cooperation Rate formula = (Completes + Named Respondent Works for VA + Did Not Contact the VA + Respondent Called More Than Once That Day) / (Completes + Named Respondent Works for VA + Did Not Contact the VA + Respondent Called More Than Once That Day + Qualified Mid-Terminates + Refused + Term: Take Me Off the List)


