
Supporting Statement Part B for

VBA Loan Guaranty Service Lender Satisfaction Survey

OMB 2900-0711


B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS



1. Provide a numerical estimate of the potential respondent universe and describe any sampling or other respondent selection method to be used. Data on the number of entities (e.g., households or persons) in the universe and the corresponding sample are to be provided in tabular format for the universe as a whole and for each stratum. Indicate expected response rates. If this has been conducted previously include actual response rates achieved.


As noted in part A, this information collection comprises a suite of customer satisfaction surveys of the Veterans Benefits Administration (VBA) Loan Guaranty (LGY) Service. The respective target populations for these surveys are as follows:


  • Survey of Lender Satisfaction with the VA Home Loan Guaranty Process (i.e., Lender Survey): gathers satisfaction data from lending institutions that participated in the VBA LGY program during the past fiscal year. [Note that the survey title will include the FY in which the loans were guaranteed; e.g., the FY 2018 Lender Survey will survey lenders who made 12 or more loans in FY 2017.]


This section describes the universe of respondents for each survey, and the sampling plan that will be employed to achieve a representative sample for each survey.


Lender Survey


For the Lender survey, useful customer satisfaction data can only be obtained from lenders who are thoroughly familiar with the VA Home Loan Program. Using the population of all participating LGY lenders would likely yield a large number of inexperienced lenders and would therefore not serve the purpose of the survey. To ensure useful data, the survey population will be limited to lenders who have processed 12 or more VA loans in the prior 12 months. All such lenders can be assumed to have the familiarity with the VA Home Loan Program required to provide useful data. Using this threshold also facilitates comparison of findings with previous iterations of the Lender survey, since the same selection criterion was used previously.


The first stage is therefore to identify, from the population of all participating lenders, those lenders who meet the 12-loan criterion. Based on administrative data from VA, the size of this universe is approximately 1,100. Project resources allow VA to survey the full census of lenders meeting the 12-loan threshold. Surveying the census of lenders making 12 or more loans, rather than drawing a sample, eliminates concerns regarding sampling error for this survey, making this an attractive methodological choice.


The table below displays the universe of qualifying lenders, the expected response rate, and the expected yield of completed Lender surveys. We anticipate a response rate of 25% for FY18, similar to the response rate achieved in the FY09 Lender Survey.


LENDER SURVEY UNIVERSE, EXPECTED RESPONSE RATE AND SURVEY YIELD

No. of Lenders Making 12+ VA Loans in FY17 | Expected Response Rate | Completed Surveys Expected
1,100                                      | 25%                    | 275
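
The expected survey yield in the table is simply the product of the qualifying-lender census and the expected response rate; as a worked check:

\[ \text{Completed surveys expected} = N \times r = 1{,}100 \times 0.25 = 275 \]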


2. Describe the procedures for the collection of information, including: Statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose in the proposed justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The Lender survey will be sent to the universe of eligible respondents because this population is small and because 95% confidence intervals are desired for the statistical estimates of customer satisfaction that are produced.
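
As a minimal sketch of how such a 95% confidence interval could be computed from the completed surveys (the satisfaction proportion below is an illustrative placeholder, not a survey result, and the finite population correction reflects the fact that the completed surveys represent a large fraction of the 1,100-lender universe):

    import math

    # Illustrative figures only: N is the universe of qualifying lenders,
    # n the expected number of completed surveys, and p_hat a hypothetical
    # observed satisfaction proportion (not an actual survey result).
    N = 1100
    n = 275
    p_hat = 0.80

    # Standard error with a finite population correction, since completed
    # surveys represent a large fraction of the universe being surveyed.
    fpc = math.sqrt((N - n) / (N - 1))
    se = math.sqrt(p_hat * (1 - p_hat) / n) * fpc

    z = 1.96  # normal critical value for a 95% confidence interval
    lower, upper = p_hat - z * se, p_hat + z * se
    print(f"95% confidence interval: {lower:.3f} to {upper:.3f}")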


3. Describe methods used to maximize the response rate and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


LGY will develop, administer, and analyze this survey. The development of the survey will reflect the comments, suggestions, and results from previous iterations of the survey.


Strategies to Maximize Response Rates


LGY will employ methods to minimize respondent burden and to maximize survey response rates. This section identifies the strategies to be employed to reach these objectives. Each strategy is outlined below.


  • Strategy # 1 to Maximize Response Rates: Using Web Technologies for Ease of Response


The Lender survey will be web-based and administered online. The web address where the survey is posted will also be included in the mailings indicated below; lenders will then connect to the appropriate web page and complete the survey. Because it is reasonable to expect that all lending institutions have computers and Internet connections, administering the Lender survey online will maximize the timeliness, efficiency, and response rate of data collection.


The web-based Lender survey will be developed with the end user in mind, with the goal of providing a user-friendly website in which to complete the survey.


The online survey technology will incorporate several features to maximize response rates and respondent usability. These include a password system, which prevents any one person from completing more than one survey. Other features include user-friendly drop-down boxes, internal links to the directions throughout the survey, and internal links to key terms and definitions.
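
As a minimal, hypothetical sketch of the duplicate-prevention idea (illustrative only; it does not describe the actual survey platform): each lender receives a single-use access code, and a submission is accepted only if its code was issued and has not already been used.

    # Hypothetical illustration of single-use survey access codes;
    # not the production survey application.
    issued_codes = {"LENDER-0001", "LENDER-0002"}  # placeholder codes
    used_codes = set()

    def accept_submission(access_code: str) -> bool:
        """Accept a submission only for a valid, not-yet-used access code."""
        if access_code not in issued_codes or access_code in used_codes:
            return False
        used_codes.add(access_code)
        return True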


  • Strategy # 2 to Maximize Response Rates: Conduct Cognitive Labs/Pre-testing of Surveys


LGY staff will conduct cognitive labs with three or more test users for the survey to determine whether respondents understand the survey questions and answer choices as intended. LGY staff will draw a small pool of names from potential participants in the survey for inclusion in the cognitive labs. Cognitive lab participants will be drawn from the same population that will be used for the main study. The contractor will submit the list of potential participants to VBA for review and approval. Once potential participants have been identified, LGY staff will contact them by telephone and ask them to participate. Cognitive lab sessions will take place in the metropolitan Washington, DC area.


Once the participants have been selected, LGY staff will conduct cognitive lab sessions aimed at identifying needed additions or refinements to the questionnaire. Cognitive labs are one-on-one sessions with potential survey participants, in which respondents are asked to complete the questionnaire while thinking aloud. The primary purpose of these sessions is to gather feedback on survey questions and answer choices to ensure they are easily understood and correctly interpreted. Outcomes of cognitive labs include, but are not limited to: addition or omission of specific questions, changes to wording of questions, clarification of question response options, addition of response options, and changes to ordering of questions.


LGY staff will prepare a summary report of the cognitive testing session for the web version of the Lender customer satisfaction survey. The results of the cognitive labs will be taken into account when revising and finalizing the survey questionnaires.


  • Strategy # 3 to Maximize Response Rates: Maintaining a Survey Hotline


During the period that the survey is in the field, LGY staff will provide and maintain a telephone line to answer any questions respondents and Regional Office points of contact may have about the survey (e.g., how to interpret questions and response items, the purpose of the survey, how to get another survey if their copy has been lost or damaged). Project staff will be available to answer telephone calls during regular business hours (8:30 a.m.-6:00 p.m. ET). A voice messaging system will be available to receive messages after regular business hours so that after-hours calls can be returned within one business day.


  • Strategy # 4 to Maximize Response Rates: Excluding Questions of a “Sensitive” Nature


The fact that none of the questions included in the survey is sensitive or private in nature is expected to encourage participation.


  • Strategy # 5 to Maximize Response Rates: Assuring and Maintaining Confidentiality

Respondents for the survey will be assured that their personal anonymity will be maintained. For the Lender survey, each response will be identified by its corresponding ‘Lender ID number’; thus, each response will be attributed to a specific lending agency, not an individual. Respondents will be informed of this fact in the initial pre-notification letter and subsequent survey notification letters.


  • Strategy # 6 to Maximize Response Rates: Secure Networks and Systems


LGY will have a secure network infrastructure that will protect the integrity of the databases, the survey application, and all associated server resources. The servers are protected by a strong firewall system and the operations center is in a secure temperature-controlled location, where network services are continually monitored by automated real-time programs to ensure the integrity and availability of all critical components. All key servers will be supported by a backup power supply that can continue to run the systems in the event of a power outage.


Approach to Examine Non-Response Bias


Non-response bias refers to the error expected in estimating a population characteristic based on a sample of survey data that under-represents certain types of respondents. Stated more technically, non-response bias is the difference between a survey estimate and the actual population value. The non-response bias associated with an estimate consists of two components: the amount of non-response and the difference in the estimate between respondents and non-respondents. While high response rates are always desirable in surveys, they do not guarantee low non-response bias in cases where the respondents and non-respondents are very different. Two types of non-response can affect the interpretation and generalizability of survey data: item non-response and unit non-response. Item non-response occurs when one or more survey items are left blank in an otherwise completed, returned questionnaire. Unit non-response is non-participation by an individual who was intended to be included in the survey sample. Unit non-response, the failure to return a questionnaire, is what is generally recognized as the source of survey non-response bias.
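
One common way to express this two-component structure for a respondent mean (a standard textbook approximation, not a formula specific to the VA surveys) is:

\[ \mathrm{Bias}(\bar{y}_r) \approx \frac{n_{nr}}{n}\,(\bar{y}_r - \bar{y}_{nr}) \]

where \(\bar{y}_r\) is the estimate based on respondents, \(\bar{y}_{nr}\) is the corresponding value for non-respondents, and \(n_{nr}/n\) is the non-response rate; the bias is small when either the non-response rate or the respondent/non-respondent difference is small.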


There are two approaches to addressing the effects of non-response. One is to minimize the chances of non-response at the data collection stage; this may involve introducing measures that aim to maximize the response rate. The other is to make statistical adjustments at the follow-up stage, after all the data are collected. Both approaches are described in the paragraphs that follow.


Because it is not always possible to measure the actual bias due to unit non-response, the emphasis is on strategies that reduce the potential for non-response bias by maximizing response rates across all types of respondents. In the face of a long-standing trend of declining response rates in survey research (Steeh, 1981; Smith, 1995; Bradburn, 1992; De Leeuw & Heer, 2002; Curtin & Presser, 2005), these strategies include:


  • Use of notification letters, reminder letters and postcards.

  • Use of novelty in correspondence such as reminder postcards designed in eye-catching colors.

  • Use of well-designed questionnaires and the promise of confidentiality.

  • Providing a contact name and telephone number for inquiries.

Applying these strategies to the administration of the VA LGY Lender Customer Satisfaction Survey will be crucial for achieving high response rates across all respondent types (see the section on maximizing response rates above).


Non-response follow-up analyses can help identify potential sources of bias and can help reassure data users, as well as the agency collecting and releasing the data, of the quality of the data collected. The examination of potential non-response bias will be conducted in two steps:


    • Step 1: Compare the demographics of respondents to the previous VA LGY Surveys (i.e., those who responded) with the demographics of non-respondents (i.e., those who did not respond).

    • Step 2: The comparison between the two groups mentioned above will be made on the following variables for the survey of lenders:

      • Size of Lender – it is possible that respondents and non-respondents may differ with respect to the size of the lender.

      • Length of Time in Industry – it is possible that respondents and non-respondents may differ with respect to the length of time in the industry.

Based on the two steps discussed above, we will identify issues with respect to non-response bias.
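
As a minimal sketch of the Step 2 comparison (the file name and column names below are hypothetical placeholders, not actual VA data elements):

    import pandas as pd
    from scipy import stats

    # Hypothetical frame file: one row per lender in the survey universe,
    # with a response indicator and the two comparison variables.
    frame = pd.read_csv("lender_frame.csv")

    respondents = frame[frame["responded"] == 1]
    nonrespondents = frame[frame["responded"] == 0]

    # Welch t-tests: do respondents and non-respondents differ, on average,
    # in lender size and length of time in the industry?
    for var in ["annual_va_loan_volume", "years_in_industry"]:
        t_stat, p_value = stats.ttest_ind(
            respondents[var], nonrespondents[var], equal_var=False
        )
        print(f"{var}: t = {t_stat:.2f}, p = {p_value:.3f}")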


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions of 10 or more individuals.


LGY staff will conduct cognitive labs with at least three, but not more than 10, test users for the survey to determine whether respondents understand the survey questions and answer choices as intended. LGY staff will draw a small pool of names from potential participants in the survey for inclusion in the cognitive labs. Cognitive lab participants will be drawn from the same population that will be used for the main study. Once potential participants have been identified, LGY staff will contact them by telephone and ask them to participate. Cognitive lab sessions will take place in the metropolitan Washington, DC area.


Once the participants have been selected, we will conduct cognitive lab sessions aimed at identifying needed additions or refinements to the questionnaire. Cognitive labs are one-on-one sessions with potential survey participants, in which respondents are asked to complete the questionnaire while thinking aloud. The primary purpose of these sessions is to gather feedback on survey questions and answer choices to ensure they are easily understood and correctly interpreted. Outcomes of cognitive labs include, but are not limited to: addition or omission of specific questions, changes to wording of questions, clarification of question response options, addition of response options, and changes to ordering of questions.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Our LGY contact person is Carleton Sea, 202-632-8827.



