
Veterans Employability Survey (VERS)

Response to OMB call

OMB: 2900-0694


Mr. Kraemer,


Thank you for the call yesterday to discuss your thoughts on the Employment Histories Survey of Recently Discharged Veterans. Your comments and Brian’s were both helpful.


During the call, you noted that the assurance of confidentiality description was appropriate, but it did not address the Privacy Act. The Office of Policy and Planning will collect records from which information can be retrieved by the name of the individual, so we are required to comply with the Privacy Act of 1974, 5 U.S.C. § 552a (2000). Our system of records is number 43VA008, Veterans, Dependents of Veterans, and VA Beneficiary Survey Records. Our system of records was last amended on October 13, 2000 (Vol. 65, No. 199).


During the call, Brian asked about the design effects and response rates that were reported by Mike Battaglia. I shared your questions with Mike, and his complete answers follow exactly as he provided them:


Q. Were design effects and response rates taken into account in the power calculations?


A. The pilot survey represents one of the first efforts at the VA to collect longitudinal data on veterans—specifically data on employment experiences—and to learn from the experience to help design future longitudinal data collection efforts.  A two-wave panel sample, with a 12-month time interval, will be used for the pilot study.  The sample design involves equal sample sizes of regular service veterans and National Guard/Reserve veterans, with the objective of obtaining 970 Wave 1 interviews with each group.  Within each group, the sample will be equally split between veterans discharged from December 2005 to January 2006 and December 2004 to November 2005.  This secondary stratification is important in that it will provide samples of recently discharged veterans as well as veterans who were discharged as long as 24 months prior to Wave 1 data collection.  The design for each group is an equal-allocation stratified element sample as contrasted with a proportionate stratified sample design.  Depending on the distribution of the population across the two discharge periods, the sampling variance will be increased compared with a proportionate allocation.
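
As an aside, the variance inflation from equal allocation can be illustrated with Kish's unequal-weighting approximation. In the short sketch below, the 65/35 population split across the two discharge-period strata is a hypothetical value chosen only for illustration; it is not an estimate from the study.

    # Hypothetical illustration of the weighting design effect produced by an
    # equal-allocation (rather than proportionate) stratified sample.
    # The 65/35 population split is assumed for illustration only.
    pop_shares = [0.65, 0.35]      # assumed population shares of the two strata
    sample_shares = [0.50, 0.50]   # equal allocation used in the design

    # Kish's approximation: deff_w = sum(a_h * w_h^2) / (sum(a_h * w_h))^2,
    # where a_h is the sample share and w_h = W_h / a_h is the relative weight.
    weights = [p / s for p, s in zip(pop_shares, sample_shares)]
    num = sum(a * w ** 2 for a, w in zip(sample_shares, weights))
    den = sum(a * w for a, w in zip(sample_shares, weights)) ** 2
    print(round(num / den, 2))     # about 1.09 under this assumed split

The inflation grows as the population split departs from 50/50.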


At Wave 2, we hope to complete about 780 interviews with each group, based on our Wave 1 and Wave 2 response rate assumptions.  We determined this sample size by using McNemar's test for the equality of paired proportions in a 2 x 2 contingency table (employed versus not employed at Wave 1 and at Wave 2).  This approach involves testing the significance of the difference in employment status of veterans at the first wave with the second wave, 12 months later.  Although a larger difference may be observed, we assumed a relatively small five percentage point change in employment status for the pilot study.  Using the 0.05 significance level for a two-tailed test, a sample size of 780 interviews is needed for each group at the second wave to achieve 80 percent power.  One aspect of the power calculation for McNemar’s test is the assumed proportion of discordant pairs.  This statistic is the proportion of pairs in the 2 x 2 table that are discordant—that is, employed/not employed and not employed/employed.  We assumed a value of 0.20 for the power calculations, but, of course, one purpose of the pilot study is to determine the proportion of respondents actually changing employment status between the two waves.  A simple random sample of about 620 pairs is required to achieve 80 percent statistical power for McNemar’s test.  We assumed a design effect of roughly 1.25 to calculate the sample size of 780. 
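
As a quick check on this arithmetic, the sketch below reproduces these figures with the standard normal-approximation sample-size formula for McNemar's test. The inputs are the assumptions stated above (0.05 two-tailed significance, 80 percent power, a five percentage point change, a 0.20 discordant proportion, a 1.25 design effect, and 80 percent retention from Wave 1 to Wave 2); the formula itself is a common approximation and may differ slightly from the exact method used in the power calculations.

    from scipy.stats import norm

    # Assumptions stated above.
    alpha, power = 0.05, 0.80
    delta = 0.05        # change in employment status to detect
    psi = 0.20          # assumed proportion of discordant pairs
    deff = 1.25         # assumed design effect
    retention = 0.80    # expected Wave 2 retention of Wave 1 respondents

    z_a = norm.ppf(1 - alpha / 2)   # about 1.96
    z_b = norm.ppf(power)           # about 0.84

    # Pairs needed under simple random sampling (normal approximation).
    n_srs = (z_a * psi ** 0.5 + z_b * (psi - delta ** 2) ** 0.5) ** 2 / delta ** 2
    n_wave2 = n_srs * deff          # inflate for the design effect
    n_wave1 = n_wave2 / retention   # Wave 1 completes needed per group

    print(round(n_srs), round(n_wave2), round(n_wave1))   # about 626, 782, 977

These values are consistent with the approximately 620 pairs, 780 Wave 2 interviews, and 970 Wave 1 interviews per group cited above.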


The short answer is that the design effect and the response rates were taken into account in the power calculations.  At the same time, we realize this pilot test is one of the first longitudinal efforts with veterans undertaken by the VA, and we expect the pilot survey to provide considerable information about the accuracy of these assumptions.  If the response rates are lower than expected, the design effects are higher, the proportion of discordant pairs is higher, or the difference in the percentage employed is smaller, there is always the option of combining the two groups and focusing more of the analysis on the entire sample.


Q. Is the response rate for Wave 1 too optimistic and the response rate for Wave 2 too pessimistic?


A. We assumed a 20 percent attrition rate between Wave 1 and Wave 2 so that, among those veterans interviewed at Wave 1, 80 percent would also be interviewed at Wave 2.  It is possible that we will achieve a higher response rate at Wave 2 but, given that the proposed effort is a pilot study, a conservative estimate was used.  It is always difficult to estimate ahead of time the number of veterans who will move during the 12-month period between the two waves.  If the response rate at Wave 2 is higher than 80 percent, the increased rate will enhance our ability to detect a five percentage point difference in employment with 80 percent power, even if some of the other assumptions enumerated above do not hold.


For Wave 1 we assumed a 72 percent response rate.  We hope to obtain accurate locating/contact information for 80 percent of the initial sample.  For those sample veterans with accurate locating/contact information, we expect to be able to achieve a high response rate of around 90 percent.  The locating/contact assumption will also be tested in the pilot study.  The DMDC database will provide us with the equivalent of a “last known” address and telephone number.  For some veterans, this information will be a parents’ residence; those individuals may still reside at that location, or the parents may be able to provide us with new addresses and telephone numbers.  For veterans with spouses, “last known” addresses may be less likely to still be their current residences.  We have proposed an optional task that would allow us to employ well-established tracking procedures to help locate current addresses of those individuals who have moved.  To date, this optional task has not been funded.
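
The arithmetic behind the 72 percent figure, and the initial sample size it implies, can be sketched as follows; the implied initial sample per group is an inference shown only for illustration and is not a figure stated above.

    # Wave 1 response-rate assumptions stated above; the implied initial
    # sample size per group is an inference, shown for illustration only.
    locating_rate = 0.80      # share of the sample with accurate contact information
    completion_rate = 0.90    # completion rate among veterans who are located
    wave1_target = 970        # Wave 1 interviews sought per group

    wave1_response_rate = locating_rate * completion_rate
    print(round(wave1_response_rate, 2))              # 0.72
    print(round(wave1_target / wave1_response_rate))  # about 1347 sampled veterans per group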


Among those veterans with accurate current addresses and telephone numbers, we think a 90 percent interview completion rate is not unreasonable.  We believe that the $10 incentive for completion of the Wave 1 interview and the use of an advance letter on VA letterhead will be two key aspects of the survey methodology that will make achieving a high interview completion rate possible.

__________________________________  


You also requested that we share the reports from this study that we make public. As described in PL 108-454 (211), four reports will come from this study: the Ideal Data Inventory (6/15/06), the Extant Data Analysis (12/15/06), the Private Sector Recommendations (2/22/07), and the Final Report, which will include the survey results (6/15/08). Please feel free to request any of these after their completion dates.


If there are any other issues I can address on this study, please let me know.


Thank you,

David



David M. Paschane, Ph.D.

Office of Policy and Planning

U.S. Department of Veterans Affairs

202-273-6784 | 202-256-5763


