Board of Veterans' Appeals Voice of the Veteran Appellant Surveys

OMB: 2900-0816

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Provide a numerical estimate of the potential respondent universe and describe any sampling or other respondent selection method to be used. Data on the number of entities (e.g., households or persons) in the universe and the corresponding sample are to be provided in tabular format for the universe as a whole and for each stratum. Indicate expected response rates. If this collection has been conducted previously, include the actual response rates achieved.


The target population for this survey is:


  • Survey of Satisfaction with the Board of Veterans’ Appeals Appellant Experience: Veterans who have been issued a Board decision on an appeal.


The universe of respondents for the survey and the sampling plan that will be employed to achieve a representative sample are outlined below.


The target population of the Board appellant experience satisfaction study will be those appellants who received a decision on their appeal in FY 2013. In FY 2012, the Board received 49,611 appeals and issued 44,300 decisions, of which 12,334 involved hearings. The Board will provide names and telephone numbers of appellants who have been issued a Board decision in FY 2013. The sample will be stratified based on the population and the hierarchies inherent in the sample (e.g., military rank, status, region). Interviewing will be conducted quarterly, with the sample selected based on the date of the decision; the Contractor will sample the list equally over four interviewing periods.


Table 5 displays the number of appellants to be surveyed, the expected response rate, and the expected yield of completed appellant surveys. Assuming a 90% incidence and a 30% cooperation rate, VA anticipates a response rate of 27%.


TABLE 5:

APPELLANT SATISFACTION SURVEY, EXPECTED RESPONSE RATE AND SURVEY YIELD

Number of Veterans | Expected Response Rate | Completed Surveys Expected
44,300             | 27%                    | 11,782 (telephone)
11,782             | 25%                    | 2,945 (eSurvey)
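
For illustration, the arithmetic behind Table 5 can be sketched in a few lines of Python. All figures come from the table and the assumptions stated above; the reading that the 11,782 telephone completes form the eSurvey invitation pool is inferred from the table layout, not stated elsewhere in this document.

    # Illustrative arithmetic behind Table 5; figures come from the table and
    # the stated assumptions, not from new data.
    incidence = 0.90      # assumed share of contacts reaching an eligible appellant
    cooperation = 0.30    # assumed share of eligible appellants who complete
    response_rate = incidence * cooperation
    print(f"Anticipated telephone response rate: {response_rate:.0%}")  # 27%

    # Note: 44,300 x 0.27 = 11,961, slightly above the 11,782 telephone
    # completes reported in Table 5; the table figure is taken as given.
    esurvey_pool = 11_782                        # telephone completes (Table 5)
    esurvey_completes = int(esurvey_pool * 0.25)
    print(f"Expected eSurvey completes: {esurvey_completes:,}")  # 2,945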


2. Describe the procedures for the collection of information, including: Statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose in the proposed justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The appellant satisfaction study will entail simple random sampling of the 44,300 appellants who have received decisions from the Board. The Board is using a 95% confidence interval for categorical variables for the survey. No unusual procedures will be required to draw a representative sample meeting these criteria.
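
For reference, the precision implied by the expected yield under simple random sampling can be sketched as follows; the z = 1.96 critical value and the p = 0.5 worst case are standard assumptions for a 95% confidence level, not figures taken from the survey design.

    import math

    # Minimal sketch: half-width of a 95% confidence interval for a proportion
    # (categorical variable) under simple random sampling.
    def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
        return z * math.sqrt(p * (1 - p) / n)

    n_completes = 11_782  # expected telephone yield from Table 5
    # p = 0.5 gives the widest (most conservative) interval.
    print(f"Margin of error at p = 0.5: +/-{margin_of_error(0.5, n_completes):.1%}")
    # -> roughly +/-0.9 percentage points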


3. Describe methods used to maximize the response rate and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


The Board will obtain the services of a Contractor to develop, administer, and analyze this survey.


Strategies to Maximize Response Rates


The Board will employ a variety of methods to minimize respondent burden and maximize survey response rates. This section identifies the strategies to be employed to reach these objectives; each is outlined below.


  • Strategy # 1 to Maximize Response Rates: Using Phone and Web Technology for Ease of Contact and Response


The Contractor will reach out to appellants via telephone. A maximum of 7 attempts will be made on every phone number until the requisite number of surveys has been completed. Attempts will be made during different parts of the day so as to reach the maximum number of appellants.


Appellants who agree and have an email address will receive an email invitation requesting them to participate in the research. This email invitation will include a URL and password.


The sample will be de-duplicated, and Veterans who appear on the sample list multiple times will receive only one invitation to take the survey.
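
A minimal sketch of such a de-duplication step is shown below; the record fields are hypothetical stand-ins for whatever identifiers the Board's list actually contains.

    # Keep the first record seen for each unique appellant identifier, so each
    # Veteran receives exactly one invitation.
    def dedupe(sample_records):
        seen = set()
        unique = []
        for record in sample_records:
            key = record["appellant_id"]  # hypothetical identifier field
            if key not in seen:
                seen.add(key)
                unique.append(record)
        return unique

    records = [
        {"appellant_id": "A-1", "email": "vet1@example.com"},
        {"appellant_id": "A-1", "email": "vet1@example.com"},  # duplicate listing
        {"appellant_id": "A-2", "email": "vet2@example.com"},
    ]
    print(len(dedupe(records)))  # 2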


The web address at which the surveys are posted will be included in all email notifications.


Both the phone and web-based surveys will be developed with the end user in mind; a user-friendly form will help maximize response rates.


The online survey technology will incorporate several features to maximize response rates and respondent usability. These include a password system, which prevents any one person from completing more than one survey and allows respondents to begin the survey and return later to finish it.
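
One way such a password system could work is sketched below; the storage layer and function names are hypothetical illustrations, not a description of the Contractor's actual system.

    # Each respondent gets a one-use password; partial answers persist so the
    # survey can be resumed, and a completed survey cannot be submitted twice.
    surveys = {}  # password -> {"completed": bool, "answers": dict}

    def issue_password(password: str) -> None:
        surveys[password] = {"completed": False, "answers": {}}

    def open_survey(password: str) -> dict:
        session = surveys.get(password)
        if session is None:
            raise PermissionError("Unknown password")
        if session["completed"]:
            raise PermissionError("Survey already submitted")
        return session  # saved answers allow the respondent to resume later

    def submit(password: str) -> None:
        surveys[password]["completed"] = True  # blocks a second completion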


  • Strategy # 2 to Maximize Response Rates: Maintaining a Toll-Free Survey Hotline


During the period that the survey is in the field, the Contractor will provide and maintain a toll-free telephone line to answer any questions respondents and regional office points of contact may have about the survey (e.g., how to interpret questions and response items, the purpose of the survey, how to get another survey if their copy has been lost or damaged). Project staff will be available to answer telephone calls during regular business hours (8:30 a.m. to 6:00 p.m. ET). A voice messaging system will be available to receive messages after regular business hours so that after-hours calls can be returned within 24 hours.


  • Strategy # 3 to Maximize Response Rates: Excluding Questions of a “Sensitive” Nature


None of the questions included in the survey is sensitive or private in nature, which will encourage compliance.


  • Strategy # 4 to Maximize Response Rates: Assuring and Maintaining Confidentiality

Survey respondents will be assured that their personal anonymity will be maintained. Upon completion of the field period, the Contractor will destroy any customer information in its possession to ensure that all customer information is kept private to the extent of the law.


  • Strategy # 5 to Maximize Response Rates: Secure Networks and Systems


The Contractor will have a secure network infrastructure that will protect the integrity of the databases, the survey application, and all associated server resources. The servers must be protected by a strong firewall system and the operations center must be in a secure temperature-controlled environment with video surveillance, where network services are continually monitored by automated real-time programs to ensure the integrity and availability of all critical components. All key servers will be supported by a backup power supply that can continue to run the systems in the event of a power outage. Additionally, the Contractor must be immediately alerted if critical monitor thresholds are exceeded, so that they can proactively respond before outages occur.


Approach to Examining Non-Response Bias


Two types of non-response can affect the interpretation and generalization of survey data: item non-response and unit non-response. Item non-response occurs when one or more survey items are left blank in an otherwise completed, returned questionnaire. In most satisfaction surveys, however, missing data rates on satisfaction items need to be at or above 50% before item non-response negatively impacts the results. This is because satisfaction items (i.e., ratings of knowledge, courtesy, responsiveness, etc.) tend to be highly correlated (r ≥ .70) and therefore collinear. Unit non-response is non-participation by an individual who was included in the survey sample but failed to respond to the survey. Unit non-response – the failure to return a questionnaire – is what is generally recognized as survey non-response bias.


Non-response bias refers to the error expected in estimating a population characteristic when some groups of the population are not included, or are under-represented, in the resulting sample of survey data. Stated more technically, non-response bias is the difference between a survey estimate and the actual population value. Two factors determine the degree to which non-response bias will impact the accuracy and reliability of the resulting survey estimate. The first factor is the overall amount of non-response (i.e., the overall response rate). Typically, response rates of 25% or better provide survey estimates that are within an acceptable margin of error for generalizing to the overall population. However, this is not always the case if the second factor is also present: where there are meaningful differences between those who respond and those who do not on the key outcome measures, or on other measures that predict or explain those outcomes, then even high response rates may not adequately minimize non-response bias.
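
The relationship described above is often written as a simple product: the bias in a respondent-based mean equals the non-response rate times the difference between respondent and non-respondent means. A minimal sketch, using hypothetical 0-10 satisfaction scores:

    # bias = (1 - response_rate) * (mean of respondents - mean of non-respondents)
    # The satisfaction means below are hypothetical illustrations only.
    def nonresponse_bias(response_rate, mean_respondents, mean_nonrespondents):
        return (1 - response_rate) * (mean_respondents - mean_nonrespondents)

    # At the anticipated 27% response rate, a half-point gap between responders
    # and non-responders would shift the estimate by about 0.37 points:
    print(nonresponse_bias(0.27, 8.0, 7.5))  # 0.365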


There are two approaches to tackling the effects of non-response. One is to minimize the chances of non-response at the data collection stage, which may involve introducing measures that aim to maximize the response rate. The other is to make statistical adjustments at the follow-up stage, after all the data are collected. Both approaches are described in the paragraphs that follow.
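
One common form of the statistical adjustment mentioned above is a weighting-class adjustment, sketched here with hypothetical strata and counts:

    # Respondents are up-weighted by the inverse of their stratum's response
    # rate, so under-represented groups count proportionally more in estimates.
    sampled   = {"east": 500, "west": 500}  # appellants selected per region
    responded = {"east": 200, "west": 100}  # completed surveys per region

    weights = {s: sampled[s] / responded[s] for s in sampled}
    print(weights)  # {'east': 2.5, 'west': 5.0}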


Since it is not always possible to measure the actual bias due to unit non-response, there are strategies for reducing non-response bias by maximizing response rates across all types of respondents. In the face of a long-standing trend of declining response rates in survey research (Steeh, 1981; Smith, 1995; Bradburn, 1992; De Leeuw & Heer, 2002; Curtin & Presser, 2005), these strategies include:


  • The Contractor will attempt to contact respondents a maximum of 7 times over a 3-day period, varying the call time in the respondents’ time zone to increase the likelihood of completing the survey.

  • Use of an extended survey field period to afford response opportunities to subgroups having a propensity to respond late (e.g., males, the young, the full-time employed).

  • Use of well-designed questionnaires and the promise of confidentiality.

  • Providing a contact name and telephone number for inquiries.

Applying these strategies to the administration of the survey will be crucial for achieving high response rates across all respondent types (see Strategies 1-5).


Non-response follow-up analyses can help identify potential sources of bias and can help reassure data users, as well as the agency collecting and releasing the data, of the quality of the data collected. The approach to examining the presence of non-response bias will be conducted as follows:


  • Compare the demographics of respondents to the demographics of non-respondents. To examine the presence of non-response bias, the Board will compare the demographics of responders (i.e., those who responded to the Board appellant satisfaction survey) to those of non-responders (i.e., those who did not).



  • The comparison between responders and non-responders will be made on the following variables for this survey (a sketch of one such comparison follows this list):

    • Region – it is possible that participants from a certain part of the country (i.e., region) may respond to the survey at a higher rate than those from another part of the country. The Contractor will derive regions by converting telephone numbers to ZIP codes and mapping those to regions. For cell phones (which the Contractor can identify), respondents will be asked for their ZIP code as part of the survey.

    • Gender – it is possible that participants of one gender (e.g., male) may respond at a higher rate than their counterparts. Gender will be recorded by the interviewer; respondents will be asked to state their gender only if the interviewer cannot determine it.
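
A minimal sketch of one such responder/non-responder comparison, using a chi-square test of independence (the document does not name a specific test, and the counts below are hypothetical):

    from scipy.stats import chi2_contingency

    #                   male    female
    observed = [[ 4_800,  1_200],   # responders
                [22_500,  5_300]]   # non-responders
    chi2, p_value, dof, expected = chi2_contingency(observed)
    print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
    # A small p-value would flag a gender imbalance between responders and
    # non-responders, i.e., potential non-response bias on that variable.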


Based on the steps discussed above, the Board will identify issues with respect to non-response bias for the survey.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions of 10 or more individuals.


No testing is required.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The Board has contracted with J.D. Power and Associates to administer this survey. The persons involved are listed below.


  • Mrs. Gina Pingitore, PhD, J.D. Power and Associates, 805-418-8043

  • Mr. Greg Truex, MPP, J.D. Power and Associates, 202-383-3511

  • Ms. Tara Wutke, J.D. Power and Associates, 202-383-3707

