
Morbidity study of former Marines, dependents, and employees potentially exposed to contaminated drinking water at USMC Base Camp Lejeune

Supporting Statement

Part B

November 2010


Project Officers:


Perri Ruckart, MPH

Epidemiologist, Agency for Toxic Substances and Disease Registry

770-488-3808 telephone

[email protected]

770-488-7187 fax


Frank Bove, ScD

Epidemiologist, Agency for Toxic Substances and Disease Registry

770-488-3809 telephone

[email protected]

770-488-7187 fax



Table of Contents

B. Collections of Information Employing Statistical Methods

B.1. Respondent Universe and Sampling Methods

B.2. Procedures for the Collection of Information

B.3. Methods to Maximize Response Rates and Deal with Nonresponse

B.4. Tests of Procedures or Methods to be Undertaken

B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

B. Collections of Information Employing Statistical Methods

B.1. Respondent Universe and Sampling Methods

As required by H.R. 4986, the National Defense Authorization Act for Fiscal Year 2008, the Agency for Toxic Substances and Disease Registry (ATSDR) must survey all individuals (active duty, civilian, and dependents) who can be identified as having been served by contaminated drinking water at Camp Lejeune. Contamination of two drinking water systems at the base began by the mid-1950s and continued until at least February 1985, when the most contaminated wells were shut down. The available data that can be used to identify those who lived or worked at the base include the Defense Manpower Data Center (DMDC) personnel database, the parents and children (who are now adults) included in the 1999-2002 ATSDR survey of 12,598 births that were carried or conceived at Camp Lejeune during 1968-1985, and those who have registered with the United States Marine Corps (USMC) or provided contact information to ATSDR in order to be notified of future health activities at the base.

The computerized personnel databases at the DMDC that can be used to identify active duty Marines and Navy personnel and civilian employees stationed at Camp Lejeune are available starting in 1975 for active duty Marines and Navy personnel and in 1972 for civilian employees. The USMC created a notification registry to assist in its efforts to notify former Marines, dependents, and civilians who lived or worked at Camp Lejeune prior to 1986 about the drinking water contamination. As of September 28, 2009, more than 140,000 individuals had registered with the USMC; however, there is considerable overlap among the DMDC-identified cohorts, the ATSDR 1999-2002 survey, and the USMC registry.

To improve the credibility of the study, it is necessary to include an external, unexposed comparison group that is similar in all respects to the Marines and civilian workers at Camp Lejeune except for exposure to VOC-contaminated drinking water. ATSDR proposes to randomly sample from the DMDC personnel databases approximately 50,000 Marines and 10,000 civilians stationed or employed at Camp Pendleton anytime during the period 1975-1985 who were never stationed or employed at Camp Lejeune during the period of drinking water contamination. The size of the proposed sample will ensure that sufficient numbers of unexposed active duty and civilian employees are available to achieve satisfactory statistical power for the diseases under study. Camp Pendleton was chosen for the comparison population because the base is similar to Camp Lejeune: Camp Pendleton provides training for Marines residing west of the Mississippi, while Camp Lejeune provides training for Marines residing east of the Mississippi, and, like Camp Lejeune, Camp Pendleton has toxic waste sites. The major difference is that Camp Pendleton did not have a contaminated drinking water supply. Additionally, the available personnel records are similar for both bases.

In order to have an unbiased sampling frame, the study population will consist of those identified by computerized databases (i.e., DMDC and the 1999-2002 ATSDR survey) as having lived or worked at Camp Lejeune during the period of drinking water contamination, plus the comparison sample from Camp Pendleton. The “registered group,” consisting of individuals identified solely by the fact that they registered with the USMC, will not be included in the study population because they may constitute a biased sample (e.g., those who registered may be more likely to have health problems, and more likely to know they were exposed, than those who did not register). Instead, those identified solely because they registered with the USMC will be analyzed separately, primarily in a descriptive manner, and their self-reported diseases will not be confirmed.

In summary, the health survey will be mailed to the study population consisting of:

  1. active duty Marines and Navy personnel identified from the DMDC computerized personnel database as having been stationed at Camp Lejeune anytime during the period 1975-1985;

  2. civilians identified from the DMDC computerized personnel database as having worked at Camp Lejeune anytime during the period December 1972 to December 1985;

  3. respondents and children (now all adults) in the 1999-2002 ATSDR survey; and

  4. the sample of active duty Marines and civilians from Camp Pendleton.

A locating firm will be used to obtain correct addresses for the study subjects. ATSDR’s goal is to achieve a high participation rate (e.g., 65%) using intensive methods involving several mail reminders and a phone reminder. A response rate of 65% may be a realistic goal because a recent review of 13 health surveys estimated an average response rate of 65% (Nakash et al. 2006) and the median response rate for Gulf War-related survey research is about 65% (Hotopf and Wessely 2005). However, a mailed survey of Navy active duty women with a 1993 pregnancy that evaluated occupational and environmental exposures and adverse pregnancy outcomes achieved only a 56% response rate among those who were reached by the mailing (Hourani and Hilton 2000). Additionally, the mailed survey of the Millennium Cohort (256,400 sampled from U.S. military personnel) achieved a response rate of about 36% (Ryan et al. 2007). Therefore, a more realistic goal for the participation rate may be somewhere in the range of 35%-50%.

As required by law, health surveys will be mailed to those who registered with the USMC. Health surveys completed by those who were identified solely because they registered with the USMC (i.e., the “registered group”) will be analyzed separately.

Type of Respondent (Number of entities)

Former active duty Marines and Navy personnel stationed on base any time during June 1975 to December 1985 – Camp Lejeune: 210,000

Former civilian workers who worked on base any time during December 1972 to December 1985 – Camp Lejeune: 8,000

Former dependents (now all adults) and former Marines who lived at Camp Lejeune and were identified only through the 1999-2002 ATSDR survey: 29,000

Former active duty Marines and Navy personnel stationed on base any time during 1975-1985 – Camp Pendleton: 50,000

Former civilian workers who worked on base any time during 1975-1985 – Camp Pendleton: 10,000

“Registered group”: 50,000

Total: 357,000


B.2. Procedures for the Collection of Information

Using Dillman’s Tailored Design Method (Dillman 2007), participants will be mailed a personalized pre-notice letter signed by the highest ranking officer of the USMC (see Attachment D) explaining that a survey will be arriving soon and encouraging participation. A personalized letter of invitation (see Attachment E), a hardcopy survey (see Attachment C), and a preaddressed stamped return envelope will be mailed one to two weeks after the pre-notice letter; the letter of invitation will also direct participants to a web-based version of the survey if they prefer to answer on-line. An e-mail invitation (see Attachment F) will also be sent when an e-mail address is available. If a study participant is deceased, the survey will be mailed to the next of kin, if available.

Within two weeks, a stamped postcard reminder/thank you (see Attachment G) will be sent via U.S. mail, as well as an e-mail reminder/thank you (see Attachment H) if possible. A second survey, mailed with a letter similar to the initial survey mailing (see Attachment I), and a second e-mail reminder (see Attachment J) if possible, will be sent to those participants who have not responded within four weeks after receiving the postcard reminder. This mailing will include a postcard on which people who choose not to respond to the survey can indicate the reason(s) for their non-response. Telephone reminders (see Attachment K) will also be conducted if participants have not responded to the survey within two weeks after the second mailing. Registrants will be mailed only the pre-notice and invitation letters (see Attachments D and E); the full Dillman Tailored Design Method will not be employed for them. Data will be collected only one time from each respondent. Informed consent, either hardcopy or electronic, will be obtained from the participants (see Attachment L). Registrants will have a separate informed consent (see Attachment M).

To address quality control, all electronically entered information obtained from hardcopy surveys will be reviewed for incorrectly entered data. Internal consistency and validity programs will be used to identify and correct coding and data entry errors. Data entry will be verified for accuracy by using software data-match features. The web-based survey will include prompts to alert participants if they incorrectly answer or skip questions; drop-down boxes that present ranges of possible answers; and electronic skip patterns that automatically skip irrelevant questions.

For a sample size calculation, the values of the alpha error, beta error, and minimum meaningful effect size are selected, and the required sample size is calculated. However, because the number of exposed subjects cannot be increased, and the alpha and beta errors should be set as low as possible, the only parameter that can vary is the meaningful effect size. Table 1 in the protocol (Attachment N) provides estimates of the minimum meaningful effect size (i.e., the incidence rate ratio or “RR”) for various cancers assuming an alpha error of 0.10 (i.e., equivalent to using a 90% confidence interval), a beta error of 0.10 (i.e., 90% statistical power), and various estimates of exposure prevalence in the study population. The expected incidences of the cancers in the unexposed group are based on the age-specific 1999-2004 U.S. cancer incidence rates (all genders and race/ethnicity groups combined) from the National Program of Cancer Registries and on estimates of the person-time contributed by the unexposed population to each 5-year age grouping after a 10-year lag to account for a latency period. The table assumes that the survey is sent to 247,000 individuals from Camp Lejeune and 60,000 from Camp Pendleton and that the overall response rate for the survey is 65%.
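
For illustration only, the following minimal sketch (in Python) shows how a minimum detectable rate ratio can be approximated for fixed exposed and unexposed group sizes using a standard two-proportion normal approximation. It is not the calculation that produced Table 1 of the protocol; the function name and the example values for exposure prevalence, baseline risk, and response rate are assumptions chosen for demonstration.

from statistics import NormalDist

def min_detectable_rr(n_exposed, n_unexposed, baseline_risk,
                      alpha=0.10, power=0.90, step=0.001):
    """Smallest rate ratio detectable with the stated two-sided alpha and power,
    using the classical two-proportion normal approximation."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # 1.645 for a two-sided alpha of 0.10
    z_beta = NormalDist().inv_cdf(power)           # 1.282 for 90% power
    rr = 1.0
    while rr < 50:                                 # safety cap for pathological inputs
        rr += step
        p1 = rr * baseline_risk                    # risk among the exposed
        p0 = baseline_risk                         # risk among the unexposed
        p_bar = (n_exposed * p1 + n_unexposed * p0) / (n_exposed + n_unexposed)
        se_null = (p_bar * (1 - p_bar) * (1 / n_exposed + 1 / n_unexposed)) ** 0.5
        se_alt = (p1 * (1 - p1) / n_exposed + p0 * (1 - p0) / n_unexposed) ** 0.5
        if p1 - p0 >= z_alpha * se_null + z_beta * se_alt:  # detectable difference reached
            return round(rr, 3)
    return None

# Hypothetical example: 247,000 Camp Lejeune and 60,000 Camp Pendleton surveys, a 65%
# response rate, 50% exposure prevalence among Lejeune respondents, and a baseline
# cumulative incidence of 0.2% for a given cancer (all values assumed for illustration).
lejeune_respondents = int(247_000 * 0.65)
pendleton_respondents = int(60_000 * 0.65)
exposed = int(lejeune_respondents * 0.5)
unexposed = (lejeune_respondents - exposed) + pendleton_respondents
print(min_detectable_rr(exposed, unexposed, baseline_risk=0.002))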

B.3. Methods to Maximize Response Rates and Deal with Nonresponse

A review of mailed health surveys concluded that a 60% response rate when surveying the general population is the standard for “acceptability,” although achieving this standard requires considerable effort and resources associated with pre-contact, incentives, or reminder postcards or calls (Rosoff et al. 2005). A 1997 review of 321 mail surveys published in medical journals in 1991 estimated an average response rate of about 60%, with surveys of physicians and “non-physicians” having average rates of 54% and 68%, respectively (Asch et al. 1997). A more recent review of 13 health surveys estimated an average response rate of 65% (Nakash et al. 2006). Several mailed surveys of military personnel have been conducted. The median response rate for Gulf War-related survey research is about 65% (Hotopf and Wessely 2005). A mailed survey of pregnancy outcomes among Gulf War veterans achieved a 70% response rate (Kang et al. 2001). However, a mailed survey of Navy active duty women with a 1993 pregnancy that evaluated occupational and environmental exposures and adverse pregnancy outcomes achieved only a 56% response rate among those who were reached by the mailing (Hourani and Hilton 2000). Finally, the mailed survey of the Millennium Cohort (256,400 sampled from U.S. military personnel) achieved a response rate of about 36% (Ryan et al. 2007). The goal of achieving a response rate of 65% was recommended by a March 2008 expert panel of epidemiologists. However, a response rate of 40% seems more realistic based on recent surveys. To achieve a high response rate, ATSDR will use the intensive methods associated with Dillman’s Tailored Design Method for mailed surveys.

An introductory letter signed by the highest ranking officer of the USMC is likely to increase participation. A personalized letter of invitation, a hardcopy survey, and a preaddressed stamped return envelope will be mailed one to two weeks after the pre-notice letter; the letter of invitation will also direct participants to a web-based version of the survey if they prefer to answer on-line. An e-mail invitation will also be sent when an e-mail address is available. Within two weeks, a stamped postcard reminder/thank you will be sent via U.S. mail, as well as an e-mail reminder/thank you if possible. A second survey, mailed with a letter similar to the initial survey mailing, and a second e-mail reminder if possible, will be sent to those participants who have not responded within four weeks after receiving the postcard reminder. Telephone reminders will also be conducted if participants have not responded to the survey within two weeks after the second mailing. Nonresponse bias can be assessed by comparing early and late responders.

Even though intensive methods will be used to increase participation rates and convert non-responders, non-response bias is still a concern. To partly address this issue, the study will 1) include only those identified a priori from the DMDC personnel databases and the ATSDR 1999-2002 survey; 2) use Dillman’s Tailored Design Method for mailed surveys; and 3) use a letter signed by the highest ranking USMC officer to encourage participation in the study. However, even a high participation rate will not be sufficient to rule out possible biases due to non-response. Therefore, sensitivity analyses will be conducted to assess the likelihood and magnitude of potential selection (or non-response) biases.

Initially, the sensitivity analyses will compare those who participate and those who do not on variables available from the personnel and family housing databases to identify risk factors associated with response. Next, participation rates will be stratified by several factors, including exposure grouping (Camp Lejeune exposed, Camp Lejeune unexposed, Camp Pendleton), a categorical variable for duration of exposure, rank/pay grade (e.g., officer vs. enlisted), subgroup-by-base stratum (Marines/civilian employees/dependents at Camp Lejeune; Marines/civilian employees at Camp Pendleton), and other demographics (e.g., age, race/ethnicity, sex, education level). The participation rate will be defined as the number of completed surveys divided by the total number of sampled individuals for whom a current address is available. Logistic regression analyses will also be conducted to identify predictors of response/non-response and early/late response (Steffen et al. 2008).
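
As an illustration of the analyses described above, the following sketch computes stratified participation rates and fits a logistic regression for predictors of response using pandas and statsmodels. The input file, data frame, and column names are hypothetical placeholders; the actual analysis files and variable names will differ.

import pandas as pd
import statsmodels.formula.api as smf

frame = pd.read_csv("sampled_individuals.csv")       # hypothetical sampling frame
# Restrict to those with a current address, per the participation-rate definition above.
located = frame[frame["address_available"] == 1]

# Participation rate stratified by exposure group, rank/pay grade, and subgroup-base stratum.
rates = (located.groupby(["exposure_group", "rank_grade", "subgroup_base"])
                ["completed_survey"].mean())
print(rates)

# Logistic regression identifying predictors of response (cf. Steffen et al. 2008).
model = smf.logit(
    "completed_survey ~ C(exposure_group) + C(rank_grade) + C(subgroup_base)"
    " + age + C(race_ethnicity) + C(sex) + C(education_level)",
    data=located,
).fit()
print(model.summary())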

B.4. Tests of Procedures or Methods to be Undertaken

To determine the optimum length of the survey instrument that still answers the research questions of interest, the survey was pilot tested on five volunteers. Pilot testing found that the survey took an average of 45 minutes to complete and that some of the skip patterns needed to be changed.

An expert panel of four to six scientists with extensive expertise in epidemiological studies of cohorts and/or health survey research involving mailed surveys will be assembled by the contractor and will meet quarterly until the study is completed. ATSDR, the USMC/Department of Navy (DON), and the ATSDR Camp Lejeune Community Assistance Panel (CAP) will nominate candidates for the expert panel. Panel members must have no financial conflict of interest.

The panel will evaluate the ongoing progress of the first phase of the morbidity study – the mailing of the health surveys and the resulting participation rates for the cohorts. The panel will also consider the power calculations and evaluate the results of the sensitivity analyses. Based on the power calculations, the progress of the first phase, and the sensitivity analyses, the panel will make recommendations concerning how to proceed with the rest of the study. ATSDR will take into account the panel’s recommendations in determining how to proceed with the completion of the study. The first phase will continue until all efforts to increase participation (including phone contact reminders) are exhausted, as specified in B.2, Procedures for the Collection of Information.

It is likely that no single piece of evidence or specific analysis will be sufficient to provide the basis for the panel’s recommendations. For example, selection bias in the morbidity study is possible even with a high participation rate (≥65%), while a study with a low participation rate may have minimal selection bias (Groves 2006; Galea and Tracy 2007). Moreover, published mail survey studies have widely varying response rates, likely due to differences in the populations surveyed and in survey administration methods. In the early 1990s, a 60% response rate for mail surveys was suggested as a “standard for acceptability” (Evans et al. 2004). One review of 13 mailed health surveys conducted prior to 2005 estimated an average response rate of 65% (Rosoff et al. 2005; Nakash et al. 2006). A recent meta-analysis of 39 mailed surveys obtained an average response rate of 45%, with response rates ranging from 10% to 89% (Shih and Fan 2008). Given that recent mailed health surveys of military populations have achieved response rates of between 30% and 40% (Kang et al. 2009; Ryan et al. 2007), a more realistic goal for the study may be to achieve a participation rate of at least 40%.

If the decision is made to proceed with the rest of the study, including the medical records confirmation and the data analyses, then additional sensitivity analyses will be performed to assess selection bias. For example, the likelihood and magnitude of selection bias can be indirectly assessed by comparing exposure-disease association measures (i.e., rate ratios and exposure-response trends) for specific, confirmed cancers in the morbidity study with the preliminary results for those cancers in the mortality study of former Marines and civilians potentially exposed at Camp Lejeune. Cancers that are not known or suspected to be associated with the drinking water exposures (e.g., colon/rectal, prostate, stomach, and melanoma) would be evaluated. If, for several cancers, substantial discrepancies that are not biologically plausible are found between the results of the mortality study and the morbidity study (e.g., for a specific cancer, the mortality study has an SMR close to 100 but the morbidity study has an RR greater than 2.0), then this may be evidence of bias in the morbidity study. However, in addition to selection bias, disease information bias, in particular under-reporting of diseases by the Camp Pendleton comparison population, could produce discrepancies between the morbidity study and mortality study results. Although substantial under-reporting is not expected for cancers, under-reporting in the Camp Pendleton sample will be evaluated by comparing the incidence of reported, confirmed specific cancers in the Camp Pendleton sample with incidence rates from the Surveillance, Epidemiology, and End Results (SEER) program and from a cancer incidence study of veterans (Harris et al. 1989). Under-reporting by the Camp Lejeune respondents will be assessed in the same manner.
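
For illustration, a comparison of this kind can be summarized as a standardized incidence ratio (SIR), as in the following minimal sketch. The person-years, reference rates, and case count shown are hypothetical placeholders, not study data or SEER values.

# Observed confirmed cancers in a comparison sample relative to the number expected
# from external reference rates (SIR = observed / expected).  All inputs are assumed.
person_years = {"40-44": 120_000, "45-49": 150_000, "50-54": 90_000}   # hypothetical
reference_rate_per_100k = {"40-44": 12.0, "45-49": 20.0, "50-54": 35.0}  # hypothetical
observed_cases = 95                                                      # hypothetical

expected = sum(person_years[age] * reference_rate_per_100k[age] / 100_000
               for age in person_years)
sir = observed_cases / expected
print(f"expected = {expected:.1f}, SIR = {sir:.2f}")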

Another sensitivity approach to evaluate the impact of potential selection bias will be to determine what level of bias would have to be present to explain differences between groups. For diseases having elevated rate ratios (e.g., RRs > 2.0), we will determine the amount of selection bias that would be necessary to produce the observed RRs if the true RR = 1, using several different scenarios with the following assumptions (an illustrative calculation follows these scenarios):

  • Responders have a higher disease rate than non-responders regardless of exposure status (Tao et al. 2007)

  • Exposed responders have a higher disease rate than exposed non-responders, Camp Pendleton responders, or Camp Pendleton non-responders
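
The following minimal sketch illustrates this type of calculation using the standard selection-bias factor for a cohort study. The response probabilities shown are assumed values for one hypothetical scenario, not study estimates.

def observed_rr(true_rr, s_dis_exp, s_nodis_exp, s_dis_unexp, s_nodis_unexp):
    """Observed RR = true RR x bias factor, where the bias factor is the ratio of
    response probabilities among diseased vs. non-diseased in the exposed group,
    divided by the same ratio in the unexposed group."""
    bias_factor = (s_dis_exp / s_nodis_exp) / (s_dis_unexp / s_nodis_unexp)
    return true_rr * bias_factor

# Hypothetical scenario: exposed respondents with the disease participate at 60% and
# exposed respondents without the disease at 35%, while unexposed (Camp Pendleton)
# participation is 40% regardless of disease status.
print(observed_rr(1.0, 0.60, 0.35, 0.40, 0.40))   # about 1.71 observed from a true RR of 1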

Confirming diagnoses will minimize information bias due to over-reporting of conditions. However, confirmation may not be possible for all reported conditions of interest. To assess the extent of information bias due to the inability to confirm diagnoses, the percentages of (1) medical record confirmation, (2) medical record disconfirmation, and (3) no available medical record will be compared between the unexposed and exposed groups for the diseases of interest. In addition, sensitivity analyses will be conducted that include diagnoses for which no confirmation was possible as well as confirmed diagnoses, to determine whether inclusion of the non-confirmed diagnoses modifies exposure-response relationships. To minimize bias due to under-reporting of conditions (e.g., a problem that might occur among the Camp Pendleton cohorts), the pre-notice letter and the letter accompanying the health survey questionnaire will avoid mentioning the hypotheses under investigation and will not indicate who is considered exposed or unexposed.
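
For illustration, the comparison of confirmation outcomes between exposed and unexposed respondents could be summarized as in the following sketch, which uses hypothetical counts and a chi-square test of homogeneity; the actual counts will come from the medical records review.

from scipy.stats import chi2_contingency

#                     confirmed  disconfirmed  no record available
counts = [[120,        15,           40],    # exposed (Camp Lejeune), hypothetical counts
          [ 90,        10,           55]]    # unexposed (Camp Pendleton), hypothetical counts

chi2, p_value, dof, _ = chi2_contingency(counts)
print(f"chi-square = {chi2:.2f}, degrees of freedom = {dof}, p = {p_value:.3f}")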

Because they are not included in the study, surveys completed by the “registered group” (i.e., those identified solely because they registered with the USMC) will be analyzed separately, primarily in a descriptive manner (i.e., demographics and the percent reporting each disease). In addition, confirmation of reported diseases will not be sought for the participants who are in the “registered group”.

B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The data collection was designed by ATSDR staff. Data collection will be self-administered and conducted via paper surveys or web-based surveys. Data analyses will be conducted by ATSDR staff. The Principal Investigators will include Frank J. Bove, ScD and Perri Zeitz Ruckart, MPH; both are epidemiologists within the Division of Health Studies at ATSDR. An expert panel of epidemiologists who were convened in March 2008 to discuss appropriate methods for future studies of Camp Lejeune populations has agreed to meet as needed to discuss decision points and other methodological issues that arise. The panel members are:

1. Han Kang, Dr. PH
Director, Environmental Epidemiology Service

Department of Veteran Affairs

202-254-0370

[email protected]


2. Kyle Steenland, PhD

Professor, Department of Environmental Health
Rollins School of Public Health
Emory University
404-712-8277

[email protected]


3. Elizabeth Delzell, SD

University of Alabama at Birmingham - School of Public Health

Department of Epidemiology & International Health
Birmingham, AL 35294-0022

205-934-5857

[email protected]


4. Richard Clapp, ScD, MPH

Professor, Environmental Health
Boston University
617-638-4731
[email protected] or [email protected]


5. Kenneth P. Cantor, Ph.D., M.P.H.

Senior Investigator, National Cancer Institute

Executive Plaza South, Room 8106
301-435-4718
[email protected]


6. Maria Schymura, Ph.D

Director, New York State Cancer Registry

518-474-2255

[email protected]


7. Chris Rennix, CIH, Sc.D

Division Officer, Epidata Center Division

Health Promotion and Preventive Medicine Department

Navy and Marine Corps Public Health Center

757-953-0955

[email protected]


The following individuals served as peer reviewers of the protocol; their comments and recommendations led to clarifications and additional detail on data analysis being added to the protocol.


1. Elizabeth Delzell, SD                              

        University of Alabama at Birmingham - School of Public Health

        Department of Epidemiology & International Health
        Birmingham, AL 35294-0022

        (205) 934-5857

        Email:  [email protected]
      

2. Han Kang, Dr. PH
        Director, Environmental Epidemiology Service

        Department of Veteran Affairs

        202-254-0370

        Email:  [email protected]


3. Leslie Stayner, PhD

        Division of Epidemiology and Biostatistics

        University of Illinois Chicago School of Public Health (M/C 923)

        1603 West Taylor St, Room 971

        Chicago, IL 60612

        Email:  [email protected]



References


Asch DA, Jedrziewski MK, Christakis NA. Response rates to mail surveys published in medical journals. J Clin Epidemiol 1997; 50(10):1129-36.


Dillman DA. Mail and Internet surveys: The tailored design method (2nd ed., 2007 update). Hoboken, NJ: John Wiley & Sons, 2007.


Evans BR, Peterson BL, Demark-Wahnefried W. No difference in response rate to a mailed survey among prostate cancer survivors using conditional versus unconditional incentives. Cancer Epidemiology Biomarkers & Prevention. 2004;13:277-278.


Galea S, Tracy M. Participation rates in epidemiologic studies. Ann Epidemiol 2007;17:643-653.


Groves RM. Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly 2006;70:646-675.


Harris RE, Hebert JR, Wynder EL. Cancer risk in male veterans utilizing the Veterans Administration medical system. Cancer 1989;64:1160-1168.


Hotopf M, Wessely S. Can epidemiology clear the fog of war? Lessons from the 1990-91 Gulf War. Int J Epidemiol 2005; 34:791-800.


Hourani L, Hilton S. Occupational and environmental exposure correlates of adverse live-birth outcomes among 1032 US Navy women. J Occup Environ Med 2000; 42:1156-1165.


Kang H, Magee C, Mahan C, Lee K, Murphy F, Jackson L, Matanoski G. Pregnancy outcomes among U.S. Gulf War veterans: A population-based survey of 30,000 veterans. Ann Epidemiol 2001; 11:504-11.


Kang HK, Li B, Mahan CM, Eisen SA, Engel CC. Health of US veterans of 1991 Gulf War: a follow-up survey in 10 years. J Occup Environ Med. 2009;51:401-410.


Nakash RA, Hutton JL, Jorstad-Stein EC, Gates S, Lamb SE. Maximising response to postal questionnaires – A systematic review of randomised trials in health research. BMC Medical Research Methodology 2006; doi:10.1186/1471-2288-6-5.


Rosoff PM, Werner C, Cliff EC, Guill AB, Bonner M, Demark-Wahnefried W. Response Rates to a Mailed Survey Targeting Childhood Cancer Survivors: A Comparison of Conditional versus Unconditional Incentives. Cancer Epidemiol Biomarkers Prev 2005; 14(5):1330-2.


Ryan MAK, Smith TC, Smith B, Amoroso P, Boyko EJ, Gray GC, Gackstetter GD, Riddler JR, Wells TS, Gumbs G, Corbeil TE, Hooper TI. Millennium Cohort: enrollment begins a 21-year contribution to understanding the impact of military service. J Clin Epidemiol 2007; 60:181-91.


Shih TH and Fan X. Comparing response rates from web and mail surveys: a meta-analysis. Field Methods 2008;20:249-271.


Steffen AD, Kolonel LN, Nomura AM, Nagamine FS, Monroe KR, Wilkens LR. The effect of multiple mailings on recruitment: the Multiethnic Cohort. Cancer Epidemiol Biomarkers Prev 2008;17(2):447–54.


Tao X, Massa J, Ashwell L, Davis K, Schwab M, Geyh A. The World Trade Center clean up and recovery worker cohort study: Respiratory health amongst cleanup workers approximately 20 months after initial exposure at the disaster site. J Occup Environ Med 2007;49:1063-1072.






