Justification B (Form 10-21089)

Survey of Post-Deployment Adjustment Among OEF and OIF Veterans

OMB: 2900-0727

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Provide a numerical estimate of the potential respondent universe and describe any sampling or other respondent selection method to be used. Data on the number of entities (e.g., households or persons) in the universe and the corresponding sample are to be provided in tabular format for the universe as a whole and for each stratum. Indicate expected response rates. If this collection has been conducted previously, include actual response rates achieved.


Although it is estimated that approximately 1.6 million troops have served in Operation Enduring Freedom (OEF) or Operation Iraqi Freedom (OIF), we will be sampling from the subgroup of OEF/OIF veterans who have become eligible for VA healthcare. This subgroup forms the basis of a roster maintained by the VA's Environmental Epidemiology Service; the numerical estimate of the universe of veterans in this roster is 751,273 as of August 2007. In order to focus on service members involved in direct ground combat operations, we will restrict our sample to those who served in the Army or Marines (active duty or Reserves/Guard). Sampling will begin by taking a random sample of 5500 veterans from the roster of OEF/OIF veterans, using information provided by the Environmental Epidemiology Service, to obtain our target sample of at least 4000 respondents. Males and females will be considered separate populations and will be sampled in equal numbers (2750 each), with the ultimate goal of obtaining at least 2000 male and at least 2000 female respondents. Given our experience conducting surveys among military samples, and the experience of Education Development Center (EDC), our survey research firm, with survey methodology and implementation, we estimate that a 20-minute survey with our population should yield at least a 70-75% response rate.
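The gender-stratified draw described above can be sketched as follows. The stratum sizes and per-stratum targets are taken from this document; the roster itself is simulated with placeholder record indices (an assumption for illustration; the real draw would use records from the Environmental Epidemiology Service file):

```python
import random

# Stratum sizes from the roster (August 2007) and per-stratum targets.
STRATA = {"male": 667299, "female": 83593}
SAMPLE_PER_STRATUM = 2750  # 5500 invitations in total

def draw_stratified_sample(strata, n_per_stratum, seed=2007):
    """Draw an equal-size simple random sample from each stratum."""
    rng = random.Random(seed)
    sample = {}
    for name, population_size in strata.items():
        # Stand-in for roster records; real sampling would draw actual
        # veteran records rather than integer indices.
        sample[name] = rng.sample(range(population_size), n_per_stratum)
    return sample

sample = draw_stratified_sample(STRATA, SAMPLE_PER_STRATUM)
total_invited = sum(len(ids) for ids in sample.values())  # 5500
```

Sampling the two genders as separate populations, rather than proportionally, is what guarantees the 2750 female invitations despite women making up only about 11% of the roster.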


The table below shows the population and sample of OEF/OIF veterans stratified by gender. As an alternative to a more complex sampling design involving stratification on additional variables, adjustment will be made in the regression models for race, geographic region, and other demographic variables.


Samples of male and female OEF/OIF veterans

                Male                          Female
    Population      Sample        Population      Sample
    667299          2750          83593           2750

The rationale for a target sample size of at least 4000 participants is twofold. Survey responses from at least 4000 participants will be sufficient to conduct analyses across a range of combat exposure experiences (broadly defined as including exposure to low, moderate and high levels of combat) and sexual trauma experiences (broadly defined as a range of coercive sexual experiences including, but not limited to, rape) consistent with our primary aims. It is estimated that approximately 30% of deployed male and female OEF/OIF service members will experience low levels of combat (Mental Health Advisory Team [MHAT] IV Operation Iraqi Freedom 05-07 Final Report). While no previous studies have examined sexual trauma among deployed OEF/OIF personnel, data from an active duty sample suggest that 23% of men and 54% of women will experience some form of unwanted sexual attention, including sexual harassment and sexual assault (Department of Defense Armed Forces 2002 Sexual Harassment Survey Final Report). However, it is also clear from this research that the most severe forms of combat and sexual trauma experiences will be relatively rare among both male and female veterans. For example, in a sample of deployed OEF/OIF personnel, it was estimated that while 33% of men had experienced high levels of combat, only 3% of women had done so (MHAT Final Report). Similarly, in a non-deployed active duty sample, while 3% of women experienced rape during the preceding year, only 1% of men experienced rape during the same timeframe (Armed Forces 2002 Sexual Harassment Survey).
Accordingly, in addition to allowing us to conduct detailed analyses of the more common combat and sexual trauma experiences, a sample of 2000 men and 2000 women will also enable us to conduct statistical analyses among the subgroups that have experienced the most severe, least common forms of these events, without burdening more people than necessary to conduct the analyses proposed in our primary aims.
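As a rough check on this rationale, the prevalence estimates cited above imply the following expected subgroup sizes in a sample of 2000 men and 2000 women (a back-of-the-envelope sketch; the percentages are the published estimates quoted in the text):

```python
# Expected counts implied by the cited prevalence estimates, assuming
# 2000 male and 2000 female respondents are obtained.
N_MEN = N_WOMEN = 2000

rates = {
    ("high combat", "men"): 0.33,        # MHAT IV Final Report
    ("high combat", "women"): 0.03,      # MHAT IV Final Report
    ("rape, past year", "men"): 0.01,    # 2002 Sexual Harassment Survey
    ("rape, past year", "women"): 0.03,  # 2002 Sexual Harassment Survey
}

expected = {
    (event, group): round(rate * (N_MEN if group == "men" else N_WOMEN))
    for (event, group), rate in rates.items()
}
# Even the rarest cells (roughly 20-60 respondents) remain large enough
# for descriptive subgroup analyses.
```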


Terms of Clearance: Approved consistent with VA changes outlined in supplemental documents submitted to OMB and included in the docket. VA shall report to OMB on response rates achieved and the results of non-response bias analyses conducted upon completion of this survey.


In the initial sampling frame of 6000, 940 potential participants did not have a valid address on file and no additional address information could be identified via location searches (e.g., Intelius, WhitePages.com). Of the remaining 5060 potential participants, 123 were ineligible (e.g., were not deployed as part of OEF/OIF) and 213 declined participation. An additional 4 participants were removed from the final sample because information from the administrative database used for weighted analyses was unavailable. The final sample consisted of 2344 participants (1137 male and 1207 female Veterans), representing a response rate of 48.6% after correcting for estimated ineligibility among nonresponders.
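One way to reproduce the 48.6% figure from these disposition counts is sketched below. It assumes the 940 invalid-address cases are removed from the frame and that the eligibility rate observed among cases with known status is applied to the remaining nonresponders, in the spirit of an AAPOR RR3-style correction; the study's exact formula is not stated here, so treat this as an illustrative reconstruction:

```python
# Disposition counts reported above.
frame = 6000       # initial sampling frame
no_address = 940   # no valid address could be located
ineligible = 123   # determined ineligible (e.g., not deployed OEF/OIF)
refused = 213      # declined participation
completes = 2348   # returned surveys (2344 retained after 4 removals)

# Nonresponders of unknown eligibility, after dropping invalid addresses.
nonreturns = frame - no_address - ineligible - refused - completes

# Eligibility rate among cases whose status was determined, applied to
# nonresponders (assumption, in the spirit of AAPOR RR3).
elig_rate = (completes + refused) / (completes + refused + ineligible)
response_rate = completes / (completes + refused + elig_rate * nonreturns)
# response_rate is approximately 0.486, matching the reported 48.6%
```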


In an effort to identify the extent to which this sample approximates the full sampling frame on key demographic characteristics, we compared survey responders to nonresponders on demographic and military characteristics drawn from administrative records data. Differences between responders and nonresponders were small and not meaningful with regard to gender (phi = -0.021), race (Cramer’s V = 0.069), military rank (officer vs. enlisted; Cramer’s V = 0.146), military branch (Cramer’s V = 0.055), and duty status (active duty vs. Reserves/Guard; Cramer’s V = 0.093). The difference between responders and non-responders on age was a small-to-medium effect (Cohen’s d = -0.445), indicating that responders were approximately 4 years older, on average, than non-responders.
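The effect sizes reported above can be computed from administrative summaries with standard formulas. A sketch follows; the means, standard deviations, and cell counts used in the example call are illustrative assumptions, not the study's data:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference using a pooled standard deviation."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

def cramers_v(table):
    """Cramer's V for a two-way contingency table given as a list of rows."""
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    chi2 = sum(
        (obs - row_tot[i] * col_tot[j] / n) ** 2 / (row_tot[i] * col_tot[j] / n)
        for i, row in enumerate(table)
        for j, obs in enumerate(row)
    )
    k = min(len(table), len(table[0])) - 1
    return math.sqrt(chi2 / (n * k))

# Hypothetical ages with nonresponders listed first, so a 4-year age
# advantage for responders yields a negative d, as reported above.
d = cohens_d(30.0, 9.0, 2600, 34.0, 9.0, 2344)  # about -0.44
```

Note that for a 2 x 2 table, Cramer's V equals the absolute value of phi, which is why both statistics appear in the comparisons above.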


2. Describe the procedures for the collection of information, including:

  • Statistical methodology for stratification and sample selection

  • Estimation procedure

  • Degree of accuracy needed

  • Unusual problems requiring specialized sampling procedures

  • Any use of less frequent than annual data collection to reduce burden


The data collection process will begin with a sampling frame, stratified by gender, comprised of records held by the Department of Veterans Affairs Environmental Epidemiology Service. This service has agreed to use its automated file of OEF/OIF veterans to provide the names, dates of birth, social security numbers, last known addresses and telephone numbers of possible participants, following the requirements for distribution across gender. The addresses we obtain from the Environmental Epidemiology Service will be compared to IRS records to ensure that the list of addresses we have is the most current for potential participants. Education Development Center (EDC), a not-for-profit organization with a history of working with the VA, will provide additional location services by working with an address update service such as Intelius to verify or update address information. Once the master list of names and addresses has been checked, EDC will begin the mail survey procedures outlined below.


EDC is an international not-for-profit organization with more than 325 projects dedicated to enhancing learning, promoting health, and fostering a deeper understanding of the world. For more than four decades EDC has been a pioneer, building bridges among research, policy, and practice. Working closely with clients at the local, state, national, and international levels, EDC brings extensive experience conducting surveys and needs assessments to inform program development, culturally relevant curricula, training materials, interventions, and policy planning. EDC researchers and program developers bring expertise addressing positive mental health; suicide, violence and injury prevention; the prevention of alcohol, tobacco and other drug abuse, as well as a host of related health and education issues. The EDC team is experienced in multiple types of data collection around sensitive and confidential issues, and protocols are in place for monitoring quality assurance, protecting and storing data files, and training staff on human subjects issues.


The survey mailing methodology we will employ has long been accepted as an effective strategy for obtaining a high response rate (Blumberg, H.H., Fuller, C., Hare, A.P. [1974]. Response rates in postal surveys. Public Opinion Quarterly, 38, 113-123; Dillman, D.A. [2007] Mail and Internet Surveys: The Tailored Design Method, Hoboken, NJ: Wiley & Sons; James, J.M. & Bolstein, R. [1990] The effect of monetary incentives and follow-up mailings on the response rate and response quality in mail surveys. Public Opinion Quarterly, 54, 346-361). The method is as follows:


  • Veterans will receive an introductory letter alerting them that they will soon be receiving a survey and that their responses are important and appreciated.

  • A second mailing, approximately one week after the first, will include the first copy of the survey, an explanatory cover letter and a small incentive. The cover letter will explain the purpose of the study, assure confidentiality, emphasize the voluntary nature of participation, provide a mechanism for “opting out” of further mailings, describe risks and benefits, and provide other information relevant to informed consent and protection of human subjects. A completed, returned survey will be considered participant consent. Returned, unopened letters will be used to identify incorrect addresses.

  • Approximately one week after the initial survey, a thank-you/reminder postcard will be sent to all participants.

  • A replacement survey will be sent to non-respondents 2-4 weeks after the original survey was mailed; this mailing will include a cover letter indicating that we have not yet received a returned survey, and urging participants to respond.

  • Finally, participants who have not returned the survey 2-4 weeks after the second survey mailing will receive a third and final copy sent via priority mail.


Data entry will be overseen by EDC and subcontracted to an experienced firm with a proven track record of high quality, timely completion. All data will be double entered and verified. Participants will be instructed not to place their names on their surveys. Responses in the database will be coded only with a participant identification number assigned by EDC; participant contact information for mailings will be stored separately from survey data and securely stored in password protected files. The data entry firm will not have access to this contact information.


Section A.16 above provides detailed information on the approaches to the analysis of data from this survey effort. Collection of the data utilizing this strategy will give the investigators the ability to answer the questions highlighted in the specific aims.


3. Describe methods to maximize response rate and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


Obtaining a good response rate is one of our highest priorities, and we have taken a number of steps aimed at attaining the highest response rate possible. First, we are aware that survey length can influence participation. In constructing the survey, we have balanced the need to obtain information that will allow us to address our specific aims with the need to keep the survey as short as possible. We have eliminated items not central to research questions, formatted the instrument to make it as easy to complete as possible, and included skip patterns to reduce respondent burden. (Edwards, P., Roberts, I., Clarke, M., DiGuiseppi, C., Pratap, S., Wentz, R., Kwan, I. & Cooper, R. [2007] Methods to increase response rates to postal questionnaires. Cochrane Database of Systematic Reviews, 4.) Second, a cash incentive of $5 will be sent with the first survey mailing. Sending a cash incentive has been shown to double the odds of response, and sending an unconditional incentive (i.e., an incentive sent with the initial survey and not contingent on survey return) has been shown to increase the odds of response by 61% (Edwards et al., 2007). Third, the cover letter and survey will be personalized with each respondent's name. This small gesture has been shown to increase the odds of response by 16% (Edwards et al., 2007). Fourth, as stated above in section 2, a reminder card and up to two additional surveys will be sent to participants to give them multiple chances to participate. This is a widely used and accepted strategy, considered to be one of the most effective methods of increasing response to mail surveys (James, J.M. & Bolstein, R. [1990] The effect of monetary incentives and follow-up mailings on the response rate and response quality in mail surveys. Public Opinion Quarterly, 54, 346-361).
Finally, it has been shown that potential participants are between 2 and 3 times more likely to respond to a questionnaire that is of interest to them than to one that is not. Given that this survey asks about an experience (serving in OEF or OIF) of great relevance to respondents' lives, we expect strong interest among potential respondents.
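Because the cited studies report their effects as odds ratios, the implied gain in the response rate depends on the baseline. The sketch below converts each cited odds ratio to an absolute rate under an assumed 50% baseline (the baseline is our assumption for illustration, not a figure from the literature):

```python
def apply_odds_ratio(base_rate, odds_ratio):
    """Response rate implied by multiplying the baseline odds by an OR."""
    odds = base_rate / (1 - base_rate)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

BASELINE = 0.50  # assumed baseline response rate

cash_incentive = apply_odds_ratio(BASELINE, 2.00)   # doubling the odds
unconditional = apply_odds_ratio(BASELINE, 1.61)    # +61% odds
personalization = apply_odds_ratio(BASELINE, 1.16)  # +16% odds
# Roughly 0.67, 0.62, and 0.54, respectively, under this baseline.
```

The same odds ratios translate into smaller absolute gains at baselines far from 50%, which is one reason odds ratios rather than rate differences are the standard way these effects are reported.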


While data collection is occurring, there will be regular review and troubleshooting of the process by EDC, so that patterns and problems in mailings and response rates can be detected and quickly addressed. Weekly meetings will be held with the study management staff and EDC to discuss any problems that may arise related to non-response.


Because this study will use random sampling to select participants from the population of interest and has minimal inclusion/exclusion criteria, our results will be generalizable to the population of veterans studied. The major threat to generalizability when conducting a mail survey is non-response bias: those who do not respond to the survey may differ systematically from those who do. We address this issue by giving careful consideration to survey methodology prior to the start of the study and by ensuring that procedures are in place to obtain the highest response rate possible.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions of 10 or more individuals.


Once approved by VA Boston's Institutional Review Board for the protection of human subjects, the survey instrument will be administered to a small number of individuals prior to the formal initiation of data collection; refinements can then be made as needed. This pilot testing of the questionnaire will involve 9 or fewer participants.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


All decisions about statistical aspects of the design were made by the Principal Investigators, Dr. Amy Street (857-364-5998), a research psychologist who has experience conducting large-scale studies, including statistical design and analyses, and Jaimie Gradus, MPH (857-364-6688) who has extensive training in both Epidemiology and Biostatistics.

