
Development of the

Deployment Risk and Resilience Inventory (DRRI)

VA Form 10-21087

OMB 2900-0730



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Provide a numerical estimate of the potential respondent universe and describe any sampling or other respondent selection method to be used. Data on the number of entities (e.g., households or persons) in the universe and the corresponding sample are to be provided in tabular format for the universe as a whole and for each strata. Indicate expected response rates. If this has been conducted previously include actual response rates achieved.



                        Male         Female      Total

Respondent universe     1,340,526    165,683     1,506,209

Corresponding sample    750          750         1,500


The respondent universe includes all individuals who have been deployed to Iraq or Afghanistan since 2001. A response rate of 75% is expected, so an initial sampling frame of 2,000 individuals should yield 1,500 participants (750 male and 750 female participants).
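As a minimal illustration of the sampling-frame arithmetic above (treating the 75% figure as an assumed response rate; variable names are illustrative only), a brief Python sketch:

    import math

    # Figures taken from the text above: a target of 1,500 completed surveys
    # (750 male, 750 female) and an expected response rate of 75%.
    target_completes = 1500
    expected_response_rate = 0.75

    # Size of the initial sampling frame needed to yield the target sample.
    frame_size = math.ceil(target_completes / expected_response_rate)
    print(frame_size)  # 2000 individuals in the initial sampling frame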


  2. Describe the procedures for the collection of information, including:

  • Statistical methodology for stratification and sample selection

  • Estimation procedure

  • Degree of accuracy needed

  • Unusual problems requiring specialized sampling procedures

  • Any use of less frequent than annual data collection to reduce burden


Statistical methodology for stratification and sample selection

The survey will involve administering DRRI scales to two stratified national random samples of 750 OEF/OIF veterans each. Women and National Guard/Reservist personnel will be oversampled relative to their representation in the population to enhance dispersion in deployment experiences and thus provide sufficient individual-difference variability for the psychometric analyses (Nunnally & Bernstein, 1994). Relative to women’s proportion in the population (11% based on DMDC figures; C. Park, personal communication, May 2, 2006), women will be oversampled to yield a 50% female-50% male gender distribution. Relative to the representation of Regular Active Duty and National Guard/Reservists in the population (70% and 30%, respectively, based on DMDC figures; C. Park, personal communication, May 2, 2006), the full sample will consist of 50% Regular Active Duty and 50% National Guard/Reservists.
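As a rough Python sketch of the stratified allocation described above (the population proportions are those cited from DMDC; the 50/50 targets are the study's design choices; variable names are illustrative only), the per-stratum targets and oversampling factors work out as follows:

    # Population proportions cited from DMDC in the text above, and the
    # equal-allocation design proportions for each stratification variable.
    n_per_wave = 750
    population_props = {"female": 0.11, "male": 0.89,
                        "guard_reserve": 0.30, "active_duty": 0.70}
    design_props = {"female": 0.50, "male": 0.50,
                    "guard_reserve": 0.50, "active_duty": 0.50}

    # Targeted number of participants in each stratum and the sampling factor
    # relative to that group's share of the OEF/OIF population.
    for group, design_p in design_props.items():
        target_n = round(n_per_wave * design_p)
        factor = design_p / population_props[group]
        print(f"{group}: target n = {target_n}, sampling factor = {factor:.1f}")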


Estimation procedure and Degree of accuracy needed

The sample size of 750 participants per wave of data collection was purposefully selected so that there would be sufficient power for each form of data analysis proposed. For both sets of CTT analyses (initial psychometric analyses in Phase I and confirmation of psychometric properties in Phase II), there will be data from 750 participants. According to Nunnally & Bernstein (1994), item analyses should proceed using a 10-to-1 respondents-to-items ratio (per construct). This ratio is considered sufficient to achieve stable estimates of item characteristics, especially item-total correlations and internal consistency reliability coefficients. With item sets for DRRI scales ranging from approximately 20 to 35 items, the minimum sample size needed for these analyses is 350 (maximum number of items (35) x 10 = 350 respondents). Therefore, a sample size of 750 per wave of data collection should be more than sufficient for these analyses.
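A minimal Python sketch of the 10-to-1 sample-size check described above (item counts are the approximate figures given in the text; names are illustrative):

    # Approximate item counts for DRRI scales per the text above
    # (roughly 20 to 35 items per construct).
    max_items_per_scale = 35
    respondents_per_item = 10  # Nunnally & Bernstein (1994) guideline

    # Minimum sample needed for stable item-level statistics, compared with
    # the planned sample of 750 participants per wave.
    minimum_n_ctt = max_items_per_scale * respondents_per_item
    print(minimum_n_ctt)         # 350
    print(750 >= minimum_n_ctt)  # True: planned sample exceeds the minimum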


There will also be a sample size of 750 participants available for both sets of the IRT analyses (initial item characteristics based on the first survey and confirmation of item characteristics based on the second survey). Given that a sample size of 500 is considered the minimum acceptable for IRT analyses (Reise & Yu, 1990), it is anticipated that a sample size of 750 OEF/OIF veterans per phase will be more than sufficient. This sample size will also facilitate the exploration of secondary research questions about military subgroups (e.g., women, Active Duty versus National Guard/Reservist personnel) and other research questions of interest to the research team.
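Combining the two criteria above, a simple adequacy check under the stated assumptions (the thresholds are those cited in the text) might look like:

    # Minimum sample sizes cited above: 350 for the classical item analyses
    # (10-to-1 rule) and 500 for the IRT analyses (Reise & Yu, 1990).
    planned_n_per_wave = 750
    thresholds = {"CTT item analyses": 350, "IRT analyses": 500}

    for analysis, minimum_n in thresholds.items():
        adequate = planned_n_per_wave >= minimum_n
        print(f"{analysis}: minimum {minimum_n}, "
              f"planned {planned_n_per_wave}, adequate = {adequate}")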


Unusual problems requiring specialized sampling procedures

There are no unusual problems anticipated for the current study. Procedures used to enhance response rates are described in the next section.


Any use of less frequent than annual data collection to reduce burden


This data collection activity will occur one time only (with two versions of the survey instrument administered to separate samples of 750 potential participants).


The number of potential participants completing the opt-out form has been considerably lower than anticipated (originally estimated at N=500 and later revised to N=450). This estimate has been further updated to reflect the anticipated final number of opt-outs (N=180).


To improve the generalizability of findings to women, women were oversampled to yield a 50% female-50% male distribution, as opposed to 25% female-75% male. This change has been reflected.


3. Describe methods to maximize response rate and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


As with any study using a survey technique, there are several potential limitations, most of which center on the ability to achieve acceptable response rates. Several steps will be taken to maximize response rates. Perhaps the greatest constraint is the amount of time that one can reasonably expect a respondent to contribute to a study. In this regard, data collection will be split into two waves to minimize the number of items each participant is asked to complete. Only half of the sample (n = 750), the number required to ensure adequate power for hypothesis testing, will be asked to complete both the DRRI scales and all health measures. The budget also includes $30 to be offered to all potential participants as a token of appreciation. The estimated time burden does not exceed one hour for either version of the survey, and thus it is not anticipated that the length of the survey will be a problem.


In addition to this sensitivity to survey length, the application of a widely accepted multi-stage mailing procedure that has been used successfully in prior research should further enhance response rates. A modification of Dillman’s (2007) and Mangione’s (1998) well-regarded mail survey procedures will be applied to both waves of data collection. Specifically, for each wave of data collection, a letter will be mailed as an invitation to participate in the study. The letter will explain the purpose of the study, assure the confidentiality of all responses provided, emphasize the voluntary nature of participation, state an estimated time to complete the survey instrument, provide a mechanism to withdraw prior to receiving the questionnaire, emphasize that the interest is in group data rather than any particular person’s individual standing, provide information on risks and benefits, and otherwise conform to standards for the protection of human subjects. A postcard that can be returned to indicate that an individual does not want to be contacted again will also be included in this mailing. Approximately two weeks later, all potential participants will receive the assessment package with a cover letter that reiterates the points included in the introductory letter. A cover page detailing all elements of consent will be appended to the beginning of the questionnaire. A brief demographic sheet will also be included to obtain data on background and military characteristics for the purpose of describing the sample and making group comparisons. Consistent with Dillman’s (2007) recommendations for repeated contacts with targeted respondents, a reminder postcard will be mailed two weeks later, followed by a second mailing of the assessment package to non-respondents two weeks after that and a final reminder postcard two weeks later. Consistent with evidence that response rates are higher when incentives are used, the first mailing of the survey will also include $30 as a token of appreciation. Similar studies involving the administration of mail surveys to military veteran samples have resulted in quite reasonable response rates [e.g., up to 87% (M. Murdoch, personal communication, December 2, 2005)].
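As an illustrative Python sketch only, the two-week contact schedule described above can be laid out relative to an arbitrary start date (the date below is assumed purely for illustration and is not part of the study plan):

    from datetime import date, timedelta

    # Hypothetical start date, chosen only for illustration; the two-week
    # spacing between contacts follows the mailing procedure described above.
    start = date(2012, 6, 1)

    contacts = [
        (0, "Introductory letter with opt-out postcard"),
        (2, "Assessment package, cover letter, and $30 incentive"),
        (4, "Reminder postcard"),
        (6, "Second assessment package to non-respondents"),
        (8, "Final reminder postcard"),
    ]

    for weeks, description in contacts:
        print(f"{start + timedelta(weeks=weeks)}: {description}")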


The sampling frame for both waves of data collection will be secured from DMDC. This procedure, the use of DMDC for national-level surveys of military and veteran populations, has been employed repeatedly by the research team and colleagues in the National Center for PTSD for studies of female military personnel, Gulf War I veterans, Bosnia veterans, Somalia veterans, and National Guard military personnel. Thus, it is a well-established method for reaching and obtaining the participation of military and veteran samples, and it is believed that it will be effective in gaining a national sample of OEF/OIF veterans for this project.


Once the sampling frame from DMDC is secured, names and social security numbers will be submitted to an Internal Revenue Service (IRS) address search, through a Department of Veterans Affairs Environmental Epidemiology Service (EES) interagency agreement with the IRS. This method is expected to be highly effective in obtaining valid addresses for the development of the DRRI and will enable the investigators to reach more potential participants than are needed for the study.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions of 10 or more individuals.


Due to timing constraints related to funding, we were limited in our ability to perform the pre-testing and cognitive testing (previously anticipated N=29) as planned. As such, the respondents and corresponding burden hours for pre-testing and cognitive testing have been removed from item #12 on Supporting Statement A.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Dr. Lynda King, a research psychologist at the Women's Health Sciences Division of the National Center for PTSD, and Dr. Daniel King, a research psychologist at the Behavioral Science Division of the National Center for PTSD, were consulted on all statistical aspects of the design. Their work telephone number is (857) 364-4938.


Dr. Dawne Vogt, a research psychologist at the Women's Health Sciences Division of the National Center for PTSD, will be responsible for directing collection and analysis of the data. Her telephone number is (857) 364-5976.


All data will be collected by the research team at the Women's Health Sciences Division of the National Center for PTSD in the VA Boston Healthcare System.


