Prospective Studies of US Military Forces: The Millennium Cohort Study

OMB: 0703-0064


SUPPORTING STATEMENT – PART B

B.  COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

1.  Description of the Activity

As shown in the table below, the responder universe for the Millennium Cohort Study consists of a probability-based sample of active-duty, Reserve, and National Guard members of the US military, identified through service rosters as of October 1, 2000 (Panel 1), October 1, 2003 (Panel 2), October 1, 2006 (Panel 3), October 1, 2010 (Panel 4), and October 1, 2016 (Panel 5). Individuals invited to participate are not chosen based on location. While most invited individuals reside in the United States, Service members can be stationed or deployed to almost any area of the world. Invitations are mailed to the current postal address of the Service member, regardless of city, state, or country.


| Panel | Dates Enrolled | Years of Service at Enrollment | Oversampled Groups | Roster Size (Date) | Number Contacted | Total Enrolled (% of contacted) |
|---|---|---|---|---|---|---|
| 1 | Jul 2001-Jun 2003 | All durations (cross-section of military population) | Females, National Guard/Reserves, and prior deployers* | 256,400 (Oct 2000) | 213,949 | 77,047 (36%) |
| 2 | Jun 2004-Feb 2006 | 1-2 years | Females and Marine Corps | 150,000 (Oct 2003) | 122,410 | 31,110 (25%) |
| 3 | Jun 2007-Dec 2008 | 1-3 years | Females and Marine Corps | 200,000 (Oct 2006) | 153,649 | 43,439 (28%) |
| 4 | Jun 2011-Apr 2013 | 2-5 years | Females and Married | 250,000 (Oct 2010) | 246,230 | 50,052 (20%) |
| 5 | 2017-2018 | TBD | Females and Married | 500,000 (Oct 2014) | -- | -- |
| Family Panel 1 | Jun 2011-Jul 2013 | N/A | Males | N/A | 22,417 | 9,921 (44%) |
| Family Panel 2 | 2017-2018 | N/A | TBD | N/A | -- | -- |

*Prior deployers were personnel who had served in Southwest Asia, Bosnia, or Kosovo after 1997 (see Section 2).



Selected Baseline Characteristics of Millennium Cohort Study Responders Compared with Invited Sample

| Characteristic | Panel 1 Responders (N=77,047) | Panel 1 Invited (N=256,248) | Panel 2 Responders (N=31,110) | Panel 2 Invited (N=150,000) | Panel 3 Responders (N=43,440) | Panel 3 Invited (N=200,000) | Panel 4 Responders (N=50,052) | Panel 4 Invited (N=250,000) |
|---|---|---|---|---|---|---|---|---|
| Age (years); mean ± sd | 33.8 ± 9.2 | 31.0 ± 9.0 | 23.9 ± 5.2 | 23.1 ± 4.4 | 24.0 ± 4.2 | 23.2 ± 3.6 | 26.4 ± 4.9 | 25.2 ± 4.1 |
| Years of Service; mean ± sd | 12.1 ± 8.0 | 10.1 ± 7.7 | 1.4 ± 0.5 | 1.4 ± 0.5 | 2.5 ± 1.2 | 2.4 ± 1.1 | 3.7 ± 1.5 | 3.6 ± 1.3 |
| Sex; N (%) | | | | | | | | |
| Male | 56,415 (73.2) | 194,749 (76.0) | 19,167 (61.6) | 112,139 (74.8) | 27,941 (64.3) | 152,264 (76.1) | 36,331 (72.6) | 200,000 (80.0) |
| Female | 20,632 (26.8) | 61,499 (24.0) | 11,943 (38.4) | 37,861 (25.2) | 15,499 (35.7) | 47,736 (23.9) | 13,721 (27.4) | 50,000 (20.0) |
| Service Branch; N (%) | | | | | | | | |
| Army | 36,481 (47.4) | 112,573 (43.9) | 14,995 (48.2) | 61,588 (41.1) | 15,798 (36.4) | 81,168 (40.6) | 22,657 (45.3) | 127,663 (51.1) |
| Navy | 13,435 (17.4) | 50,114 (19.6) | 4,941 (15.9) | 24,982 (16.7) | 6,746 (15.5) | 32,464 (16.2) | 7,165 (14.3) | 38,437 (15.4) |
| Marine Corps | 3,941 (5.1) | 18,446 (7.2) | 2,576 (8.3) | 30,000 (20.0) | 6,802 (15.7) | 50,000 (25.0) | 4,742 (9.5) | 35,837 (14.3) |
| Air Force | 22,357 (29.0) | 72,110 (28.1) | 8,276 (26.6) | 30,201 (20.1) | 12,918 (29.7) | 32,879 (16.4) | 14,260 (28.5) | 43,846 (17.5) |
| Coast Guard | 833 (1.1) | 3,005 (1.2) | 322 (1.0) | 3,229 (2.2) | 1,176 (2.7) | 3,489 (1.7) | 1,228 (2.5) | 4,217 (1.7) |
| Education; N (%) | | | | | | | | |
| Less than high school diploma | 4,719 (6.2) | 13,030 (7.3) | 2,488 (8.0) | 14,978 (10.0) | 4,443 (10.2) | 24,236 (12.1) | 6,683 (13.4) | 40,598 (16.2) |
| High school diploma | 32,875 (43.1) | 97,901 (54.6) | 21,370 (68.7) | 109,813 (73.2) | 30,857 (71.0) | 153,068 (76.5) | 30,993 (61.9) | 174,662 (69.9) |
| Some college | 19,274 (25.3) | 41,387 (23.1) | 1,894 (6.1) | 7,550 (5.0) | 987 (2.3) | 2,885 (1.4) | 2,121 (4.2) | 6,483 (2.6) |
| Bachelor's degree | 12,518 (16.4) | 17,289 (9.7) | 3,521 (11.3) | 9,439 (6.3) | 5,111 (11.8) | 12,802 (6.4) | 7,677 (15.3) | 20,739 (8.3) |
| Master's/PhD degree | 6,820 (9.0) | 7,156 (4.0) | 770 (2.5) | 1,981 (1.3) | 745 (1.7) | 1,594 (0.8) | 1,416 (2.8) | 2,996 (1.2) |
| Unknown | 0 (0.0) | 2,438 (1.4) | 1,067 (3.4) | 6,239 (4.2) | 1,297 (3.0) | 5,415 (2.7) | 1,162 (2.3) | 4,522 (1.8) |


As shown in the table above, the responder universe for the Millennium Cohort Study is generally representative of the Service members invited to join at the time of recruitment. As is typical in survey research, older individuals and female Service members were more likely to enroll in the Cohort. Because the invited sample for Panel 1 comprised a cross-section of Service members who were on active rosters at the time of recruitment, these individuals had more years of service than the subsequently enrolled Panels 2, 3, and 4. In addition, Panel 1 responders had about 2 more years of service than the invited sample, which is consistent with the pattern of older personnel being more likely to respond. Air Force members were more likely to respond across all four panels, while Marine Corps personnel were less likely to respond at each recruitment; this likely reflects the younger age of Service members in the Marine Corps. Also consistent with survey research dynamics, individuals with higher levels of education were more likely to respond, which is reflected in the Millennium Cohort enrollment, most notably among Service members with a bachelor's degree or higher.

Selected Baseline Characteristics of Family Study Responders Compared with Invited Sample (Panel 1)

| Characteristic | Responders (N=9,921) | Invited (N=28,603) |
|---|---|---|
| Age (years); mean ± sd | 28.5 ± 5.8 | 27.6 ± 5.1 |
| Years of Service; mean ± sd | 3.8 ± 1.7 | 3.8 ± 1.6 |
| Sex; N (%) | | |
| Male | 1,301 (13.1) | 21,984 (76.9) |
| Female | 8,620 (86.9) | 6,619 (23.1) |
| Service Branch; N (%) | | |
| Army | 4,581 (46.2) | 12,949 (45.3) |
| Navy/Coast Guard | 1,689 (17.0) | 4,698 (16.4) |
| Marine Corps | 936 (9.4) | 2,649 (9.3) |
| Air Force | 2,715 (27.4) | 8,307 (29.0) |
| Education; N (%) | | |
| Less than high school diploma | 133 (1.3) | 51 (0.2) |
| High school diploma | 1,151 (11.6) | 5,477 (19.1) |
| Some college | 4,610 (46.5) | 15,469 (54.1) |
| Bachelor's degree | 2,854 (28.8) | 5,414 (18.9) |
| Master's/PhD degree | 1,173 (11.8) | 2,192 (7.7) |

The responder universe for Panel 1 of the Family Study consists of the spouses of married personnel who enrolled in Panel 4 of the Millennium Cohort Study. Panel 2 of the Family Study will consist of the spouses of married personnel who will be invited to enroll in Panel 5 of the Millennium Cohort Study. As shown in the table above, the responders are generally representative of the invited spouse sample. Despite oversampling of male spouses, female spouses were more than twice as likely to respond. Additionally, spouses with some college were more likely to respond than those with less or more education.




Follow-up Response Rates: Total Number of Responders per Cycle (% follow-up rate)

| Panel (# in Panel) | 2004-2006 | 2007-2008 | 2011-2013 | 2014-2016 |
|---|---|---|---|---|
| Panel 1 (n=77,047) | 55,021 (71%) Wave 2 | 54,790 (71%) Wave 3 | 51,678 (67%) Wave 4 | 51,146 (66%) |
| Panel 2 (n=31,110) | Enrollment of Panel 2 | 17,152 (55%) Wave 2 | 15,149 (49%) Wave 3 | 14,793 (48%) |
| Panel 3 (n=43,439) | | Enrollment of Panel 3 | 22,071 (51%) Wave 2 | 19,991 (46%) |
| Panel 4 (n=50,052) | | | Enrollment of Panel 4 | 27,233 (54%) |
| Family Panel 1 (n=9,921) | | | Enrollment of Panel 1 | 6,789 (68%) |



Panel 1 of the Millennium Cohort Study targeted 256,400 Service members, of whom 213,949 had valid addresses allowing for a study contact attempt. By July 1, 2003, 77,047 (36%) Service members had returned a Panel 1 baseline questionnaire. Among the 77,047 Panel 1 participants, 55,021 (71%) submitted the first follow-up survey, 54,790 (71%) submitted the second follow-up survey, 51,678 (67%) submitted the third follow-up survey, and 51,146 (66%) submitted the fourth follow-up survey. A total of 67,294 (87%) Panel 1 participants have submitted at least one follow-up survey.


Panel 2 of the Millennium Cohort Study targeted 150,000 Service members, of whom 122,410 were determined to have valid addresses allowing for a study contact attempt. Of those, 31,110 (25%) returned a Panel 2 baseline questionnaire. Among Panel 2 participants, 17,152 (55%) submitted the first follow-up survey, 15,149 (49%) submitted the second follow-up survey, and 14,793 (48%) submitted the third follow-up survey. A total of 21,448 (69%) of Panel 2 participants have submitted at least one follow-up survey.


Panel 3 of the Millennium Cohort Study targeted 200,000 Service members of whom 153,649 had documented valid addresses allowing for study contact attempt. Of those, 43,439 (28%) returned a Panel 3 baseline questionnaire, 22,071 (51%) completed their first follow-up, and 19,991 (46%) submitted their second follow-up survey. A total of 24,241 (56%) of Panel 3 participants have submitted at least one follow-up survey.


Panel 4 of the Millennium Cohort Study targeted 250,000 Service members of whom 246,230 had documented valid addresses allowing for study contact attempt. Of those, 50,052 (20%) returned a Panel 4 baseline questionnaire and 27,233 (54%) have submitted their first follow-up survey.


In 2011 the Family Study targeted 22,417 spouses of Panel 4 Service members who submitted a baseline Millennium Cohort Study questionnaire. Of those, 9,921 (44%) returned a Family Study Panel 1 baseline questionnaire and 6,789 (68%) have submitted their first follow-up survey.


During the 2017-2019 survey cycle, we expect to collect follow-up surveys for Panels 1-4 of the Millennium Cohort Study as well as Panel 1 of the Family Study. We will also attempt to contact an additional 500,000 Service members in order to enroll 90,000 new participants into Panel 5 of the Millennium Cohort Study, and 82,500 spouses of the married Panel 5 Service member target population into Panel 2 of the Family Study.


To date, a total of 1,339 deaths have occurred within the Panel 1 responder group, 242 deaths have occurred within the Panel 2 responder group, 264 deaths have occurred within the Panel 3 responder group, and 134 deaths within the Panel 4 responder group. There are no known deaths within the Family Study Panel 1 responder group.


We estimate that approximately 55,652 of the participants from Panels 1, 2, 3, and 4 who respond to the 2017-2019 survey will no longer be military Service members. Of the spouses who are enrolled in the Family Study, an estimated 4,464 will complete a follow-up survey between 2017 and 2019 and will be members of the public. All invited Panel 5 participants will be active duty Service members. For Panel 2 of the Family Study, we estimate that approximately 74,250 of the invited 82,500 spouses will not be Service members at the time of their enrollment. Therefore, for the Millennium Cohort Study and the Family Study combined, an estimated total of 134,366 participants (55,652 + 4,464 + 74,250) who complete a survey between 2017 and 2019 will be members of the public.


Since our first OMB approval in September 2003, and throughout the course of the study, the proportion of military versus public participants has shifted, and will continue to shift, in favor of members of the public as Service members separate from the military.



2.  Procedures for the Collection of Information


The Millennium Cohort Study consists of Service members randomly selected from a large, representative military sample obtained from the Defense Manpower Data Center. A probability-based random sampling process is employed, with oversampling of certain subgroups to ensure sufficient statistical power to address small subgroups of the population reasonably well in a population-based setting.
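
To illustrate the general form of this design, a minimal sketch follows. It is purely schematic: the stratum labels and sampling fractions shown are hypothetical and are not the actual Millennium Cohort design parameters.

    import pandas as pd

    def draw_panel_sample(roster: pd.DataFrame, fractions: dict, seed: int = 0) -> pd.DataFrame:
        """Stratified random sampling with oversampling of selected strata.

        `roster` is assumed to hold one row per eligible Service member with a
        'stratum' column; `fractions` maps each stratum to its sampling fraction.
        """
        pieces = []
        for stratum, fraction in fractions.items():
            members = roster[roster["stratum"] == stratum]
            pieces.append(members.sample(frac=fraction, random_state=seed))
        sample = pd.concat(pieces)
        # The design weight is the inverse of the stratum's selection probability.
        sample["design_weight"] = sample["stratum"].map(
            {s: 1.0 / f for s, f in fractions.items()}
        )
        return sample

    # Hypothetical fractions: women and married members sampled at higher rates.
    example_fractions = {"female": 0.30, "married_male": 0.20, "other_male": 0.10}

Members of oversampled strata carry smaller design weights, so weighted estimates remain representative of the full roster.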


In Panel 1, military personnel who had served in Southwest Asia, Bosnia, and Kosovo after 1997 were oversampled. Additionally, Reserve, National Guard, and female service personnel were oversampled to ensure sufficient statistical power to investigate hypotheses in these smaller subgroups of the military population. In Panels 2 and 3, Marines and female Service members were oversampled to ensure sufficient statistical power to investigate hypotheses in these smaller subgroups. In Panel 4, women and married Service members were oversampled to support the enrollment of the concurrent Family Study. In Panel 5, we will oversample women and married Service members to support the enrollment of Panel 2 of the Family Study.


Millennium Cohort Study and Family Study participants are asked to complete a questionnaire every 3 years, allowing longitudinal information to be collected without burdening participants with annual questionnaires.


3.  Maximization of Response Rates, Non-response, and Reliability

Considerable effort is focused on response and retention rates. Response rates for both the Millennium Cohort Study and the Family Study are maximized principally through the use of modified Dillman mail and electronic survey methods (Dillman, 1978). Recruitment and marketing materials are reviewed at each survey cycle, and new strategies are implemented as appropriate and necessary.


The Millennium Cohort Study and Family Study teams have undertaken numerous efforts to ensure maximum response rates to surveys. The teams consult with survey experts, and new approaches are implemented and evaluated during each survey cycle. Each participant is sent a paper questionnaire and/or directions to complete the questionnaire online, in addition to email requests to participate, depending on whether he or she responded to the last questionnaire or email request sent. Each questionnaire mailing or email request to participate is followed approximately two weeks later by a reminder postcard or reminder email. Participants are contacted over a longer period of time (approximately 18 months) in case a participant is deployed or has recently moved, to allow time for the current address to be updated.


If we receive notice that the address at which we attempted to reach a follow-up participant or newly invited individual is not valid, and we do not receive updated address information before the end of the survey cycle, we classify that person as a non-responder. At the conclusion of the survey cycle, if a participant or newly invited individual has not responded to any invitations and we have not received notice that the address at which we attempted to reach them was invalid, we infer that the questionnaire or email request was received and that the person has chosen not to participate; they are then classified as non-responders. Individuals who are being contacted for enrollment into a new panel and are classified as non-responders are removed from any further contact attempts.

New or “good” addresses are sought from the Internal Revenue Service (through a contract with the National Institute for Occupational Safety and Health [NIOSH]) and the Defense Manpower Data Center (DMDC), as well as from self-reported respondent updates.


Response and retention rates are of utmost importance to Millennium Cohort Study and Family Study investigators. Much effort has been focused on investigating the type and use of incentives, the wording of invitations, email contacts, and twice-yearly postcard and email contacts on Veterans Day and Memorial Day (Millennium Cohort Study) or the Month of the Military Child and National Military Family Month (Family Study). Past incentives included specially designed t-shirts with study logos and phone cards. For the 2011-2013 survey cycle, incentives included a $5 gift card (Starbucks, Subway, Amazon.com, Walmart) or a Millennium Cohort hat or coin. The Family Study offered $10 survey completion incentives from Starbucks, Subway, or Shutterfly. For the 2014-2016 cycle, a combination of pre-incentives and completion post-incentives was used for both the Millennium Cohort Study and the Family Study. During the 2014-2016 cycle, the Millennium Cohort Study conducted an experiment to determine the most effective type of incentive for increasing response rates within each panel. Findings from this experiment will be used to inform future data collection cycles.


Invitations and email contacts are specifically designed based on service branch, separation status, and other demographics, and are vetted through study team members. Twice-yearly contacts on Veterans Day and Memorial Day (Millennium Cohort Study) or the Month of the Military Child and National Military Family Month (Family Study) serve to keep study participants engaged as well as to solicit updated contact information. Additionally, the 2007-2008 enrollment cycle deployed a “welcome to the cohort” campaign that sent welcome cards to enrolling cohort members describing the scope and length of the study.

The first telephone survey of non-responders was performed after the Millennium Cohort Pilot Study in 2000 and focused on survey content and reasons for nonresponse. More recently, in 2005, a telephone survey of non-responders was conducted during the 2004-2006 data collection effort. The submitted report, “MilCohort Nonresponse Study Final Report”, describes the Millennium Cohort Telephone Study of 3,000 non-responders conducted by the Research Triangle Institute (RTI). This sub-study consisted of telephone calls to Panel 1 participants who had not completed a 2004-2006 questionnaire and resulted in a 31% response rate. In addition to asking questions regarding reasons for nonresponse, the phone survey asked about incentives, participant contacts, and collected information on health status. For those with bad phone numbers (e.g., disconnected, wrong number), RTI completed a thorough investigation to obtain up-to-date phone numbers. This additional contact information was given to the Millennium Cohort Study team after the completion of the RTI survey and report.

Chapter 3 of RTI's “MilCohort Nonresponse Study Final Report” discussed telephone questionnaire results and presented recommendations for improving response rates. Overall, the recommendations covered six main areas: study materials, panel maintenance, tracking sample members, incentives, telephone prompting, and future nonresponse studies. Many of these suggestions were incorporated into the 2007-2008, 2011-2013, and 2014-2016 survey efforts. Study materials emphasized that participants had the option to complete surveys on the web or by paper, and were tailored to be service specific and based on current military status (i.e., separated/retired or still serving). The continued use of biannual postcards and the use of the National Change of Address Service through the US Postal Service enhanced panel maintenance and the tracking of sample members. In addition, automated telephone messaging was used among consented participants to encourage them to complete their survey. As a result of these additional marketing strategies during the 2007-2008 cycle, 8,259 Panel 1 participants who did not complete their first follow-up survey completed their second follow-up survey.

Additionally, during the 2014-2016 survey cycle, the Family Study developed a telephone-based outreach method to retain participants. A maximum of three phone calls were made to all non-responders who provided a phone number on their baseline survey. If the participant was not reached by the third call, a voicemail message was left. The main objectives of the phone calls were to remind participants about the follow-up survey and to gather updated contact information. A total of 746 Family Study participants were called, of whom 167 (22%) completed their follow-up survey and an additional 41 (5%) initiated their follow-up survey.

Toward the end of the 2017-2019 survey cycle, we will conduct a non-response survey among Panels 1-5 non-responders to ask about their impressions of the study, their reasons for not participating, and their general health. These data will be used in the design of the next survey cycle to maximize retention. The non-response survey will be developed based on response data from the upcoming 2017-2019 survey cycle. Upon finalization of the non-response survey, a separate Information Collection Request will be submitted to OMB for approval.

The previous terms of clearance required that the Millennium Cohort Program “continually examine potential nonresponse and attrition bias in each wave of this study, particularly for key health-related variables and outcomes. DoD will continue this research and will provide updated results from these ongoing bias analyses to OMB with future submissions”. The current package includes a plan for a sampling weight strategy to address issues associated with potentially nonrandom non-response and attrition. Working with an experienced survey methodologist with expertise in creating and applying survey weights for similar datasets, DoD will develop an appropriate set of weights, and will make them available along with the data (including a documentation of how the weights were developed).

Longitudinal studies, such as the Millennium Cohort Study and Family Study, provide the capability to prospectively analyze relationships between exposures and outcomes. A cohort study of military Service members and Veterans over an extended period of time provides the unique opportunity to examine the temporal relationships between service-related experiences, including deployment, and subsequent health and behavioral outcomes, which is not possible using time-series or cross-sectional samples. However, in longitudinal studies, nonresponse and non-random attrition are potential sources of bias. Analyses based on multi-wave panel studies can be heavily compromised by non-random sample attrition. While Millennium Cohort Study and Family Study participants are given the opportunity to complete follow-up questionnaires regardless of whether they completed the previous follow-up questionnaire, the number of responders who do not participate in each subsequent wave of data collection (wave nonresponse) will most likely accumulate over time, which may undermine the precision of any research undertaken using such samples. Unless nonresponse is random, attrition may lead to bias, as there are important factors that influence response propensity. Attrition is often correlated with observable characteristics such as age, education, health, and economic well-being, as well as other unknown or unobserved factors. This non-random attrition can result in samples that include only a selected group of individuals over time, which can bias estimates, since nonresponse can often be associated with the variables of interest. However, non-random attrition does not necessarily lead to attrition bias. Attrition bias is model specific and, as previous studies have shown, biases might be absent even if attrition rates are high.


The Millennium Cohort Study team has previously examined potential bias from initial enrollment and attrition. Demographic data on all invited personnel have been examined to determine differences in distributions between responders and non-responders. While overall these investigations have demonstrated the responders to be demographically representative of the invited sample (Ryan et al. 2007), some factors, including certain demographic and military characteristics, have been found to be associated with a greater likelihood of enrollment. While some of these differences are significant, most of them are quite small and are similar to patterns found in other surveys of military and non-military populations, such as MIDUS and NHANES. These investigations have informed our statistical weighting techniques. Similar to other national studies that use probability sampling and weighting to the Current Population Survey, the Millennium Cohort Study weights the study sample to the entire US military population using DMDC records. In addition to creating sampling weights, principled missing data techniques will be used to mitigate bias that may have emerged from attrition. Details on these imputation procedures are described below in sections A and B. Further examination found few health differences between Millennium Cohort Study responders and non-responders when comparing healthcare utilization preceding study invitation (Wells et al. 2008; Horton et al. 2013). In addition, nonresponse to the follow-up questionnaires has not resulted in any appreciable biases, as reflected by comparing measures of association for selected outcomes, including PTSD, depression, and eating disorders, using complete case and inverse probability weighted methods (Littman et al. 2010). Please see the tables below for summaries of these investigations.

Selected Demographic Characteristics from Ryan et al. 2007; Millennium Cohort: Enrollment Begins a 21-Year Contribution to Understanding the Impact of Military Service

| Characteristics | Responders* (n=77,047), N (%) | Invited Cohort* (N=256,400), % |
|---|---|---|
| Sex | | |
| Male | 56,415 (73.2) | 76.0 |
| Female | 20,632 (26.8) | 24.0 |
| Age (years) | | |
| 17-24 | 14,559 (18.9) | 30.8 |
| 25-34 | 27,083 (35.2) | 35.4 |
| 35-44 | 25,400 (33.0) | 25.1 |
| >44 | 9,975 (13.0) | 8.6 |
| Service Branch | | |
| Army | 36,481 (47.4) | 44.0 |
| Navy & Coast Guard | 14,268 (18.5) | 20.8 |
| Marine Corps | 3,941 (5.1) | 7.2 |
| Air Force | 22,357 (29.0) | 28.1 |
| Education | | |
| Less than HS | 4,722 (6.1) | 7.6 |
| HS diploma or less | 32,957 (42.8) | 50.4 |
| Some college | 19,655 (25.5) | 23.6 |
| Bachelor's degree | 12,722 (16.5) | 11.6 |
| Graduate school | 6,986 (9.1) | 5.4 |
| Paygrade | | |
| Enlisted | 59,318 (77.0) | 84.6 |
| Officer | 17,729 (23.0) | 15.4 |

*Responders and Invited Cohort members were from Panel 1 of the Millennium Cohort Study's baseline enrollment survey (2001-2003).


Selected Demographic Characteristics from Wells et al. 2008; Prior Health Care Utilization as a Potential Determinant of Enrollment in a 21-year Prospective Study

| Characteristics | Responders* (n=21,067), N (%) | Non-responders* (N=47,036), N (%) |
|---|---|---|
| Sex | | |
| Male | 15,143 (71.9) | 35,304 (75.1) |
| Female | 5,924 (28.1) | 11,732 (24.9) |
| Age (years) | | |
| 17-24 | 3,534 (16.8) | 15,520 (33.0) |
| 25-34 | 4,721 (22.4) | 12,988 (27.6) |
| 35-44 | 7,034 (33.4) | 11,067 (23.5) |
| ≥45 | 5,778 (27.4) | 7,461 (15.9) |
| Service Branch | | |
| Army | 7,794 (37.0) | 15,115 (32.1) |
| Navy & Coast Guard | 5,230 (24.8) | 12,770 (27.2) |
| Marine Corps | 1,620 (7.7) | 5,091 (10.8) |
| Air Force | 6,423 (30.5) | 14,060 (29.9) |
| Education | | |
| HS diploma or less | 10,974 (52.1) | 29,459 (62.6) |
| Some college | 5,842 (27.7) | 12,745 (27.1) |
| Bachelor's degree | 2,134 (10.1) | 2,842 (6.0) |
| Graduate school | 2,117 (10.1) | 1,990 (4.2) |
| Paygrade | | |
| Enlisted | 17,500 (83.1) | 43,328 (92.1) |
| Officer | 3,567 (16.9) | 3,708 (7.9) |

*Response and non-response were evaluated among active duty personnel who were not deployed to a combat area during the year prior to the enrollment invitation for the first panel of the Millennium Cohort Study (2001-2003).

Selected Demographic Characteristics from Horton et al. 2013; The Impact of Deployment Experience and Prior Healthcare Utilization on Enrollment in a Large Military Cohort Study

| Characteristics | Panel 2 (2004-2006) Responders (n=31,110), N (%) | Panel 2 Non-responders (N=118,393), N (%) | Panel 3 (2007-2008) Responders (n=43,440), N (%) | Panel 3 Non-responders (N=156,231), N (%) |
|---|---|---|---|---|
| Sex | | | | |
| Male | 19,167 (61.6) | 92,614 (78.2) | 27,941 (64.3) | 152,029 (76.1) |
| Female | 11,943 (38.4) | 25,779 (21.8) | 15,499 (35.7) | 47,642 (23.9) |
| Age (years) | | | | |
| 17-20 | 9,317 (30.0) | 40,055 (33.8) | 6,005 (13.8) | 29,762 (19.0) |
| 21-22 | 6,693 (21.5) | 28,973 (24.5) | 13,916 (32.0) | 60,243 (38.6) |
| 23-24 | 5,248 (16.9) | 19,688 (16.6) | 9,108 (21.0) | 32,441 (20.8) |
| >24 | 9,852 (31.7) | 29,677 (25.1) | 14,411 (33.2) | 33,785 (21.6) |
| Service Branch | | | | |
| Army | 14,995 (48.2) | 46,447 (39.2) | 15,798 (36.4) | 65,269 (41.8) |
| Navy & Coast Guard | 5,263 (16.9) | 22,847 (19.3) | 7,922 (18.2) | 27,971 (17.9) |
| Marine Corps | 2,576 (8.3) | 27,363 (23.1) | 6,802 (15.7) | 43,140 (27.6) |
| Air Force | 8,276 (26.6) | 21,736 (18.4) | 12,918 (29.7) | 19,851 (12.7) |
| Education | | | | |
| Some college or less | 25,752 (82.8) | 106,183 (89.7) | 36,287 (83.5) | 143,632 (91.9) |
| Bachelor's or higher | 4,291 (13.8) | 7,056 (6.0) | 5,856 (13.5) | 8,488 (5.4) |
| Unknown | 1,067 (3.4) | 5,154 (4.4) | 1,297 (3.0) | 4,111 (2.6) |
| Paygrade | | | | |
| Enlisted | 27,482 (88.3) | 112,366 (94.9) | 38,455 (88.5) | 149,509 (95.7) |
| Officer | 3,628 (11.7) | 6,027 (5.1) | 4,985 (11.5) | 6,722 (4.3) |



Although previous work has indicated that minimal bias has been introduced into the Millennium Cohort Study by initial and follow-up nonresponse, continued investigation of factors leading to non-random attrition is key to adjusting for nonresponse and producing unbiased estimates. Given the evolving complexity of the Millennium Cohort Study methodology, we consulted with methodologists experienced in modern techniques for handling missing data. Based on their recommendations, principled missing data statistical solutions have been introduced to minimize nonresponse bias in descriptive analyses and in the creation of population estimates. In particular, principal component auxiliary (PCAUX) variables (Howard, Rhemtulla, & Little, 2015), a recent advance in principled missing data treatments, are being employed. Our handling of nonresponse bias will employ a combination of weighting schemes as well as principled missing data treatments such as multiple imputation (MI) and full-information maximum likelihood (FIML).


Principled missing data techniques offer several advantages over panel weighting schemes. They are able to address item nonresponse as well as wave nonresponse. Panel weights can be created to adjust for attrition (or for respondents who otherwise miss an entire wave). In contrast, techniques such as multiple imputation can also address item nonresponse by respondents who are present in a wave but fail to answer some items. Weighting in multivariable analyses is also debatable because only a few variables, usually demographic, are typically used to create the weights, and it is plausible that the ideal variables for creating weights would vary across outcomes. For example, in longitudinal analyses, information on an outcome such as illness in a prior wave is plausibly much more informative than demographic variables from the baseline. Principled missing data treatments avoid this limitation.


PCAUX methodology solves issues of model complexity and computational limitations that have previously limited the use of principled missing data treatments in large or longitudinal datasets. The ideal missing data model is inclusive of all variables in the dataset that might be informative about missing data patterns. Variables included in the missing data model that are not of interest to the analytic model are called auxiliary variables, and they can greatly reduce bias and increase power (Collins, Schafer, & Kam, 2001; Graham, 2009). However, there is a tension between fully inclusive missing data models and practical limitations on computation. By extracting a limited number of principal components (typically 10 to 13) to employ as auxiliary variables, inclusive information is provided with a parsimonious number of auxiliary variables. In addition to serving as informative auxiliary variables in FIML analyses, PCAUX variables can also be used as the predictor variables when generating multiple imputations.
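
The following is a minimal sketch of this procedure in Python, included for illustration only; it assumes numeric auxiliary and analysis arrays, and the study's production implementation of the Howard, Rhemtulla, and Little (2015) approach may differ in software and detail. The auxiliary variables are reduced to a small set of principal component scores, which then serve as predictors when generating multiple imputations.

    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer, SimpleImputer
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    def pcaux_multiple_imputation(aux, analysis, n_components=10, n_imputations=5):
        """Use principal component scores of auxiliary variables (PCAUX) as
        predictors in multiple imputation of the analysis variables."""
        # 1. Single-impute and standardize the auxiliaries so PCA can be applied.
        aux_filled = SimpleImputer(strategy="mean").fit_transform(aux)
        aux_std = StandardScaler().fit_transform(aux_filled)
        # 2. Extract a parsimonious set of principal components (the PCAUX scores).
        pcaux = PCA(n_components=n_components).fit_transform(aux_std)
        # 3. Multiply impute the analysis variables, conditioning on the PCAUX scores.
        imputations = []
        for m in range(n_imputations):
            imputer = IterativeImputer(sample_posterior=True, random_state=m)
            completed = imputer.fit_transform(np.column_stack([analysis, pcaux]))
            imputations.append(completed[:, : analysis.shape[1]])
        return imputations

Each of the completed datasets would then be analyzed separately and the results combined using standard multiple imputation rules.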


The Millennium Cohort Study team has previously calculated response rates, created sample and design weights, and has conducted initial response bias investigations for each panel as well as a few select longitudinal response bias investigations.


Moving forward, the Millennium Cohort Study team will obtain PCAUX variables for each cohort, as well as multiple imputations of each cohort, to ensure unbiased estimates of population parameters. These procedures will be reapplied systematically to each cohort after each additional wave of data collection is completed, so that newly provided data can better inform previous waves' missingness and address item and survey nonresponse in the newly collected data.


A. Procedures to Investigate Nonresponse for Millennium Cohort Study Panels at Baseline:

1. Response rates will be calculated using standard formulas from the OMB Standards and Guidelines for Statistical Surveys (2006). For example:

  • Unweighted unit response rates (RRU), calculated as the proportion of those eligible for the survey at baseline who responded.

2. Nonresponse (declined participation) bias will be estimated and described by comparing responders to non-responders based on variables available from electronic personnel files available on all Service members via Defense Manpower Data Center. These variables will include factors such as demographic characteristics, deployment histories, and medical/healthcare data. For example, methods to do this will include, but will not be limited to, multivariable logistic regression to describe the propensity to respond or not respond.

3. Baseline weights will be calculated using the propensity score for response in order to minimize response bias. These combined sampling and non-participation weights will be used in future studies, where appropriate. (A schematic illustration of these steps appears after this list.)
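
The sketch below illustrates steps 1-3 in schematic form; the variable names are hypothetical and the production analyses are not limited to this specification. It computes the unweighted unit response rate, fits a multivariable logistic regression for the propensity to respond, and combines the inverse predicted propensity with the sampling design weight.

    import pandas as pd
    import statsmodels.api as sm

    def baseline_nonresponse_weights(frame: pd.DataFrame, covariates: list) -> pd.Series:
        """frame: one row per eligible invited member, with DMDC-derived covariates,
        a 0/1 'responded' indicator, and a sampling 'design_weight'."""
        # Step 1: unweighted unit response rate (responders / eligible invitees).
        rru = frame["responded"].mean()
        print(f"RRU = {rru:.1%}")  # e.g., Panel 1: 77,047 / 213,949 = 36%

        # Step 2: multivariable logistic regression for the propensity to respond.
        X = sm.add_constant(frame[covariates].astype(float))
        propensity = sm.Logit(frame["responded"], X).fit(disp=0).predict(X)

        # Step 3: combined baseline weight = design weight x inverse response propensity.
        return frame["design_weight"] / propensity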


B. Procedures to Investigate Nonresponse and Attrition for Millennium Cohort Study Panels at Follow-Up:

1. Response rates will be calculated using standard formulas from the OMB Standards and Guidelines for Statistical Surveys (2006). For example:

  • Longitudinal response rates, calculated as the proportion of responders at wave 1 (baseline) who responded at a specific subsequent wave (a worked example follows this list).

2. Nonresponse and attrition bias will be examined and described by comparing responders to non-responders based on previous survey data. These variables will include factors obtained at the baseline survey and beyond, such as behavioral, mental, and physical health factors, as well as demographic characteristics, deployment histories, and medical/healthcare data, and in certain cases response to previous waves. For example, methods to do this will include, but will not be limited to, multivariable multinomial logistic regression to calculate the propensity for response, nonresponse, or death. We will separately model attrition due to death (or attrition by other causes, when appropriate) versus nonresponse since determinants will likely differ between these groups and may be differentially associated with outcomes of interest.

3. PCAUX variables and multiple imputations will be obtained for each cohort after each wave of data collection. PCAUX variables can be used in FIML modeling whenever appropriate, and models may also include baseline-weights. As an alternative, analyses can be conducted across multiply imputed datasets.
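
As a worked example of the longitudinal response rate in step 1, using the figures reported above for Panel 1 (77,047 baseline responders): 55,021 / 77,047 ≈ 71% at the first follow-up and 51,146 / 77,047 ≈ 66% at the fourth follow-up.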


The Family Study team partnered with Abt Associates to similarly create design and nonresponse weights for the Family Study. The use of these weights is critical for generating representative and unbiased statistical estimates for the target population. Because Panel 4 of the Millennium Cohort Study was oversampled for married and female respondents, sample design weights were generated first. As with the Millennium Cohort Study, nonresponse adjustments to the Family Study weights were then performed to reduce nonresponse bias.


Propensity weighting, a straightforward extension of the propensity score theory of Rosenbaum and Rubin (1983) that was incorporated into survey nonresponse problems by David et al. (1983), was used to adjust for nonresponse to the Family Study. The availability of military records and survey data for the spouse-paired Millennium Cohort Study Panel 4 Service member provided a unique opportunity to examine potential bias associated with nonresponse in unusual detail. Given the large amount of information available on respondents and non-respondents from the spouse-paired Millennium Cohort Study Panel 4 Service member that could be included in the Family Study response propensity model, we applied an empirical process to guide variable selection (Rizzo, Kalton, and Brick 1996; Smith et al. 2001).


References


Collins, L. M., Schafer, J. L., & Kam, C. M. (2001). A comparison of inclusive and restrictive strategies in modern missing data procedures. Psychological Methods, 6(4), 330-351. doi:10.1037//1082-989x.6.4.330

Dong, Y. R., & Peng, C. Y. J. (2013). Principled missing data methods for researchers. Springerplus, 2, 17. doi:10.1186/2193-1801-2-222

David, M., Little, R. J. A., Samuhel, M. E., and Triest, R. K. (1983). Nonrandom nonresponse models based on the propensity to respond, Proceedings of the Business and Economic Statistics Section, American Statistical Association, 168-173.

Dillman DA. Mail and Telephone Surveys: The Total Design Method. New York: Wiley; 1978. xvi, 325.

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The Tailored Design Method.

Enders, C. K. (2010). Applied missing data analysis: New York : Guilford Press, [2010].

Graham, J. W. (2003). Adding missing-data-relevant variables to FIML-based structural equation models. Structural Equation Modeling, 10(1), 80-100. doi:10.1207/s15328007sem1001_4

Graham, J. W. (2009). Missing Data Analysis: Making It Work in the Real World Annual Review of Psychology (Vol. 60, pp. 549-576). Palo Alto: Annual Reviews.

Groves, R. M. (2006). Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly, 70 (5), 646–675.

Horton JL, Jacobson IJ, Littman AJ, Alcaraz JE, Smith B, and Crum-Cianflone NF. The impact of deployment experience and prior healthcare utilization on enrollment in a large military cohort study. BMC Medical Research Methodology. 2013 Jul 11;13:90.

Howard, W. J., Rhemtulla, M., & Little, T. D. (2015). Using Principal Components as Auxiliary Variables in Missing Data Estimation. Multivariate Behavioral Research. doi:10.1080/00273171.2014.999267

Littman AJ, Boyko EJ, Jacobson IG, Horton JL, Gackstetter GD, Smith B, Hooper TI, Amoroso PJ, Smith TC, for the Millennium Cohort Study Team. Assessing nonresponse bias at follow-up in a large prospective cohort of relatively young and mobile military service members. BMC Medical Research Methodology. 2010 Oct;10(1):99.

Rosenbaum, P. R., & Rubin, D. B. (1983). The central role of the propensity score in observational studies for causal effects. Biometrika, 70(1), 41-55.

Rizzo, L., Kalton, G., and Brick, M. (1996). A comparison of some weighting adjustment methods for panel nonresponse. Survey Methodology, 22, 43-53.

Ryan MA, Smith TC, Smith B, Amoroso P, Boyko EJ, Gray GC, Gackstetter GD, Riddle JR, Wells TS, Gumbs G, Corbeil TE, Hooper TI, for the Millennium Cohort Study Team. Millennium Cohort: enrollment begins a 21-year contribution to understanding the impact of military service. Journal of Clinical Epidemiology. 2007 Feb;60(2):181-91.

Smith, P. J., Rao, J. N. K., Battaglia, M. P., Ezzati-Rice, T. M., Daniels, D., & Khare, M. (2001). Compensating for provider nonresponse using response propensity to form adjustment cells: The National Immunization Survey. Vital and Health Statistics, Series 2, No. 133.

Stapleton, L. M., Harring, J. R., & Lee, D. Y. (in press). Sampling weight considerations for multilevel modeling of panel data. In J. R. Harring, L. M. Stapleton, & S. N. Beretvas (Eds.), Advances in multilevel modeling for educational research: Addressing practical issues found in real-world applications. Charlotte, NC: Information Age Publishing, Inc.

Wells TS, Jacobson IG, Smith TC, Spooner CN, Smith B, Reed RJ, Amoroso PJ, Ryan MAK, for the Millennium Cohort Study Team. Prior health care utilization as a determinant to enrollment in a 22-year prospective study, the Millennium Cohort Study. European Journal Of Epidemiology. 2008 Feb;23(2):79-87.




4.  Tests of Procedures


Following preliminary focus group evaluations of the draft Millennium Cohort Study questionnaire, conducted in late 1999 with military enlisted and officer groups of fewer than 10 people, a pilot study was conducted with a 1% sample of military personnel in the spring of 2000 as a means of testing the utility of the instrument. Following this pilot study, corrections were made to produce the final Millennium Cohort Study survey instrument.


Along with the non-response testing described earlier in this report, we will conduct focus group testing. The purpose of the focus group testing is to determine effective strategies for maximizing participation rates in populations with demographics similar to Millennium Cohort Study participants; to ensure the clarity of our contact materials, including the text and overall visual format of emails, postcards, and survey packets; and to obtain feedback on the cost-saving initiatives currently offered.


5.  Statistical Consultation and Information Analysis


**************************

For the Millennium Cohort Study:


Rudy Rull, Ph.D.

Principal Investigator and Directorate Head

Military Population Health

Naval Health Research Center

[email protected]

(619) 553-9267

***************************

For the Family Study:


Valerie Stander, Ph.D.

Principal Investigator

Deployment Health Research Department

Naval Health Research Center

[email protected]

(619) 553-7174

**************************


The Research Team at the Department of Defense Center for Deployment Health Research, Naval Health Research Center, San Diego, CA 92106-3521

[email protected]

(619) 553-7465





