Part B - Veteran Housing Survey


Loan Guaranty Service (LGY) Foreclosure Impact Survey - Veterans Recently Separated

OMB: 2900-0754


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Provide a numerical estimate of the potential respondent universe and describe any sampling or other respondent selection method to be used. Data on the number of entities (e.g., households or persons) in the universe and the corresponding sample are to be provided in tabular format for the universe as a whole and for each stratum. Indicate expected response rates. If this collection has been conducted previously, include the actual response rates achieved.


The target populations for the Veteran Housing Survey are as follows:


  • 1) Veterans who have filed a claim for disability benefits in the last five years (January 2004 – December 2009)


  • 2) Veterans who have recently separated from service (within the last 12 months, May 2008 - April 2009)



There will be just one survey, but the number and types of questions answered will vary slightly based on the target population.


This section describes the universe of respondents for the survey and the sampling plan that will be employed to achieve a representative sample.


1) Veterans who have filed a claim for disability benefits in the last five years

Surveying this population will enable LGY to respond to the second element of P.L. 110-389, Section 502. Specifically, Congress requests “An assessment of the effects of any lag or delay in the adjudication by the Secretary of claims of veterans for disability compensation on the capacity of veterans to maintain adequate or suitable housing.”


Several pieces of data can be collected outside the survey (e.g., claim filed date, claim adjudicated date, disability rating), which will help maintain the objectivity of the overall results. However, the effects of delays in the adjudication of compensation benefits on a veteran’s housing situation cannot be determined through existing data; the question must be asked directly.


The eligible survey population will include veterans who filed a disability claim in the last five years, regardless of award decision or rating decision. The eligible population includes veterans making original or reopened claims. It does not include veterans making Dependency and Indemnity Compensation (DIC), pension, or hospitalization claims. The eligible population totals approximately 3.5 million veterans.


Table 3 displays the number of veterans who have filed a claim in the last five years to be surveyed, the expected response rate, and the expected yield of completed surveys. VA anticipates a response rate of 30%.



TABLE 3:

VETERANS FILING CLAIMS WITHIN LAST FIVE YEARS,
EXPECTED RESPONSE RATE AND SURVEY YIELD

Number of Veterans    Expected Response Rate    Completed Surveys Expected
3,000                 30%                       900




2) Veterans who have recently separated from service (within the last 12 months)

Surveying this population will enable LGY to respond to the first element of P.L. 110-389, Section 502: “A general assessment of the income of veterans who have recently separated from the Armed Forces.” There are currently no data detailing the income levels of this veteran population or how these veterans’ income levels affect their housing situations.


The eligible population will include all service members who separated from active duty or reservist service in the last 12 months (taking into account all reasons for separation except death). There are an estimated 245,000 veterans who separated in the last 12 months. VA expects a response rate of 30%.


Table 4 displays the sampling frame, the expected response rate, and the expected yield of completed surveys.






2. Describe the procedures for the collection of information, including: Statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose in the proposed justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The Veteran Housing Survey will entail stratified random sampling of 3,000 veterans across the two population segments outlined in Section 1. VA will construct 95% confidence intervals for categorical variables for all surveys. No unusual procedures will be required to draw a representative sample meeting these criteria.
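
As a rough illustration of the degree of accuracy this design yields, the Python sketch below computes the expected number of completed surveys and the corresponding margin of error at the 95% confidence level, using the figures from Table 3. The worst-case proportion (p = 0.5), the 1.96 critical value, and the optional finite population correction are standard assumptions, not figures stated in this document.

    import math

    def margin_of_error(n_completed, p=0.5, z=1.96, population=None):
        """Half-width of a 95% confidence interval for a proportion."""
        se = math.sqrt(p * (1 - p) / n_completed)
        if population:
            # Optional finite population correction; negligible here because the
            # eligible population (roughly 3.5 million) dwarfs the sample.
            se *= math.sqrt((population - n_completed) / (population - 1))
        return z * se

    n_sampled, response_rate = 3000, 0.30
    n_completed = round(n_sampled * response_rate)           # 900 expected completed surveys
    print(round(margin_of_error(n_completed) * 100, 1))      # about 3.3 percentage points

Under these assumptions, roughly 900 completed surveys would support estimates of categorical variables within about plus or minus 3.3 percentage points at the 95% confidence level.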



3. Describe methods used to maximize the response rate and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


LGY has obtained the services of a contractor to develop, administer, and analyze the survey.


Strategies to Maximize Response Rates


LGY will employ methods to minimize respondent burden and to maximize survey response rates. Each strategy is outlined below.


  • Strategy # 1 to Maximize Response Rates: Using Web Technologies for Ease of Response


Veterans will have the option to complete the Veteran Housing Survey on paper or via a web-based form. The web address at which the survey will be posted will be included in all of the mailing notifications, as indicated below. The initial notification will include a cover letter, a URL, and a password. LGY expects most surveys will be completed online, which will maximize the timeliness, efficiency, and response rate of data collection.


Both the paper and web-based surveys will be developed with the end user in mind; a user-friendly form will help maximize response rates.


The online survey technology will incorporate several features to maximize response rates and respondent usability. These include a password system, which prevents any one person from completing more than one survey and allows respondents to begin the survey and return later to finish it. Other features include user-friendly drop-down boxes, internal links to the directions throughout the survey, and internal links to key terms and definitions.
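
A minimal sketch of this access-control logic is shown below, assuming a simple in-memory store; the function names and data layout are illustrative, since the contractor's actual survey platform is not specified in this document.

    # Sketch of the password system described above (illustrative only).
    responses = {}   # password -> {"answers": {...}, "submitted": bool}

    def open_survey(password, issued_passwords):
        """Admit a respondent only if the password was issued and not yet used to submit."""
        if password not in issued_passwords:
            return None                              # unknown password: no access
        record = responses.setdefault(password, {"answers": {}, "submitted": False})
        if record["submitted"]:
            return None                              # already completed: blocks a second survey
        return record                                # saved answers let the respondent resume later

    def save_progress(record, question_id, answer):
        record["answers"][question_id] = answer      # partial responses persist between sessions

    def submit(record):
        record["submitted"] = True                   # locks the password after completion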


  • Strategy # 2 to Maximize Response Rates: Using Advance and Follow-Up Mailings to Publicize the Surveys and Encourage Response


LGY will use a two-step survey and follow-up process to administer the survey (see Table 7 below). An increase in the overall response rate is the major advantage of using this process. The use of a follow-up letter tends to increase the response rate by 5 to 8 percentage points.









TABLE 7:

MAILING PROCESS AND MATERIALS

Mailing    Mailing Material
#1         Notification/cover letter w/ URL & password and paper survey
#2         Reminder notification w/ URL & password
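
To illustrate how the follow-up mailing contributes to the overall 30% response-rate target, the sketch below projects completed surveys under the 5 to 8 percentage-point uplift cited above; the assumed 24% first-mailing response rate is an illustrative input, not a wave-by-wave figure stated in this document.

    # Projection of the two-step mailing's effect on survey yield (illustrative).
    sample_size = 3000
    initial_rate = 0.24                      # assumed response to mailing #1
    reminder_uplift = (0.05, 0.08)           # uplift range cited for the follow-up letter

    for uplift in reminder_uplift:
        completes = sample_size * (initial_rate + uplift)
        print(f"uplift of {uplift:.0%}: about {completes:.0f} completed surveys")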





  • Strategy # 3 to Maximize Response Rates: Conduct Cognitive Labs/Pre-testing of Surveys


The contractor will conduct cognitive labs with three test users to determine whether respondents understand the survey questions and answer choices as intended. LGY will provide the contractor with lists of potential test users. The contractor shall be responsible for securing the participation of test users from this list. Prior to user testing, the contractor shall provide LGY staff with a list of the selected test users.


VA will conduct cognitive lab sessions aimed at identifying needed additions or refinements to the questionnaire. Cognitive labs are one-on-one sessions with potential survey participants, in which respondents are asked to complete the questionnaire while thinking aloud. The primary purpose of these sessions is to gather feedback on survey questions and answer choices to ensure they are easily understood and correctly interpreted. Outcomes of cognitive labs include, but are not limited to: addition or omission of specific questions, changes to wording of questions, clarification of question response options, addition of response options, and changes to ordering of questions.


The contractor will prepare a summary report of the cognitive testing sessions for the Veteran Housing Survey. The results of the cognitive labs will be taken into account when revising and finalizing the survey questionnaire.


  • Strategy # 4 to Maximize Response Rates: Maintaining a Toll-Free Survey Hotline


During the period that the survey is in the field, the contractor will provide and maintain a toll-free telephone line to answer any questions respondents may have about the survey (e.g., how to interpret questions and response items, the purpose of the survey, how to get another survey if their copy has been lost/damaged). Project staff will be available to answer telephone calls during regular business hours (8:30 a.m.-6 p.m. ET). A voice messaging system will be available to receive messages after regular business hours so after-hours calls can be responded to within 24 hours.


  • Strategy # 5 to Maximize Response Rates: Excluding Questions of a “Sensitive” Nature


None of the questions included in the survey are sensitive or private in nature, which will encourage compliance.

  • Strategy # 6 to Maximize Response Rates: Assuring and Maintaining Confidentiality

Survey respondents will be assured that their personal anonymity will be maintained. All hard-copy questionnaires will be scannable and will consist of approximately six pages, printed back to back, with a numeric Litho-Code on the front and back covers. Veterans will be provided unique passwords that will allow the contractor to identify when a respondent has completed the survey and exclude that respondent from further reminder letters or postcards.
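
The sketch below illustrates how those unique passwords can be used to drop completed cases from the reminder mailing; the record layout and field names are assumptions for illustration, not part of the contractor's specification.

    # Build the mailing #2 list by excluding veterans whose passwords show a completed survey.
    def reminder_list(sampled_veterans, completed_passwords):
        return [veteran for veteran in sampled_veterans
                if veteran["password"] not in completed_passwords]

    sampled_veterans = [{"name": "Veteran A", "password": "A1B2"},
                        {"name": "Veteran B", "password": "C3D4"}]
    print(reminder_list(sampled_veterans, completed_passwords={"A1B2"}))   # only Veteran B is mailed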


  • Strategy # 7 to Maximize Response Rates: Secure Networks and Systems


The contractor will have a secure network infrastructure that will protect the integrity of the databases, the survey application, and all associated server resources. The servers must be protected by a strong firewall system, and the operations center must be in a secure, temperature-controlled environment with video surveillance, where network services are continually monitored by automated real-time programs to ensure the integrity and availability of all critical components. All key servers will be supported by a backup power supply that can continue to run the systems in the event of a power outage. Additionally, the contractor must be immediately alerted if critical monitoring thresholds are exceeded, so that staff can respond proactively before outages occur.


Approach to Examine Non-Response Bias


Non-response bias refers to the error expected in estimating a population characteristic based on a sample of survey data that under-represents certain types of respondents. Stated more technically, non-response bias is the difference between a survey estimate and the actual population value. The non-response bias associated with an estimate consists of two components – the amount of non-response and the difference in the estimate between the respondents and non-respondents. While high response rates are always desirable in surveys, they do not guarantee low non-response bias in cases where the respondents and non-respondents are very different. Two types of non-response can affect the interpretation and generalization of survey data: item non-response and unit non-response. Item non-response occurs when one or more survey items are left blank in an otherwise completed, returned questionnaire. Unit non-response is non-participation by an individual who was intended to be included in the survey sample. Unit non-response – the failure to return a questionnaire – is what is generally recognized as the source of survey non-response bias.
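
A minimal worked expression of this two-component decomposition is sketched below; the simple-mean setting and the numbers are hypothetical, used only to make the relationship concrete.

    def nonresponse_bias(mean_respondents, mean_nonrespondents, nonresponse_rate):
        """Approximate bias of the respondent mean: the non-response rate times the
        difference between respondent and non-respondent means."""
        return nonresponse_rate * (mean_respondents - mean_nonrespondents)

    # Hypothetical illustration: 70% unit non-response, respondents report a mean
    # income of $40,000 while non-respondents' true mean is $36,000.
    print(nonresponse_bias(40_000, 36_000, 0.70))   # the respondent mean overstates by $2,800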


There are two approaches to tackling the effects of non-response. One is to minimize the chances of non-response at the data collection stage; this may involve introducing measures that aim to maximize the response rate. The other is to make statistical adjustments at the follow-up stage, once all the data are collected. Both approaches are described in the following paragraphs.


Since it is not always possible to measure the actual bias due to unit non-response, there are strategies for reducing non-response bias by maximizing response rates across all types of respondents. In the face of a long-standing trend of declining response rates in survey research (Steeh, 1981; Smith, 1995; Bradburn, 1992; De Leeuw & Heer, 2002; Curtin & Presser, 2005), these strategies include:


  • Use of notification letters, duplicate survey mailings, reminder letters and postcards.

  • Use of novelty in correspondence such as reminder postcards designed in eye-catching colors.

  • Use of an extended survey field-period to afford opportunities to respond for subgroups having a propensity to respond late (e.g., males, young, full-time employed).

  • Use of well-designed questionnaires and the promise of confidentiality.

  • Providing a contact name and telephone number for inquiries.

Applying these strategies to the administration of the survey will be crucial for achieving high response rates across all respondent types (see the section on maximizing response rates above).


Non-response follow-up analyses can help identify potential sources of bias and can help reassure data users, as well as the agency collecting and releasing the data, of the quality of the data collected. The approach to examining the presence of non-response bias will be conducted as follows:


  • Compare the Demographics of Respondents to the Veteran Housing Survey with the Demographics of Non-Respondents. To examine the presence of non-response bias, VA will compare the demographics of responders (i.e., those who responded to the Veteran Housing Survey) to those of non-responders (i.e., those who did not respond).



The comparison between responders and non-responders will be made on the following variables for this survey:

    • Region – it is possible that participants from a certain part of the country (i.e., region) may respond to the survey at a higher rate than those who are from another part of the country.

    • Gender – it is possible that participants of one gender (e.g., male) may respond at a higher rate than their counterparts.

    • Disability rating – it is possible that participants within a certain range of disability ratings may respond at a higher rate than participants in another range.

    • Separation date – it is possible that participants who separated from the military at a later date may respond at a higher rate than participants who separated at an earlier date.


Based on the steps discussed above, VA will identify issues with respect to non-response bias for the survey.
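
As an illustration of how such a responder/non-responder comparison might be carried out, the sketch below applies a chi-squared test of independence to hypothetical counts by gender; the counts, the 0.05 threshold, and the choice of scipy's chi-squared test are illustrative assumptions, not requirements stated in this document.

    from scipy.stats import chi2_contingency

    # Rows: responders vs. non-responders; columns: gender categories from the sampling frame.
    observed = [[520, 380],     # responders:     male, female
                [1300, 800]]    # non-responders: male, female

    chi2, p_value, dof, expected = chi2_contingency(observed)
    if p_value < 0.05:
        print("Gender distribution differs between responders and non-responders;")
        print("estimates may warrant a weighting adjustment on this variable.")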


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions of 10 or more individuals.


The contractor will conduct cognitive labs with three or more test users to determine whether respondents understand the survey questions and answer choices as intended. Working closely with VBA, the contractor will draw a small pool of names of potential participants for inclusion in the cognitive labs. Cognitive lab participants will be drawn from the same population that will be used for the main study. The contractor will submit the list of potential participants to VBA for review and approval. Once identified, the contractor will contact potential participants by telephone and ask them to participate. Cognitive lab sessions will take place in the metropolitan Washington, DC area.


Once the participants have been selected, VA will conduct cognitive lab sessions aimed at identifying needed additions or refinements to the questionnaire. Cognitive labs are one-on-one sessions with potential survey participants, in which respondents are asked to complete the questionnaire while thinking aloud. The primary purpose of these sessions is to gather feedback on survey questions and answer choices to ensure they are easily understood and correctly interpreted. Outcomes of cognitive labs include, but are not limited to: addition or omission of specific questions, changes to wording of questions, clarification of question response options, addition of response options, and changes to ordering of questions.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The LGY contact persons are:

  • Mike Frueh, 202-461-9521

  • Matthew Jamrisko, 202-461-9323

  • Carleton Sea, 202-461-9523

LGY has contracted the services of JD Power & Associates (JDPA) to administer the survey. JDPA contacts are as follows:

  • Greg Truex, 805-418-8522
