Prospective Studies of US Military Forces and Their Families: The Millennium Cohort Program

OMB: 0703-0064



SUPPORTING STATEMENT – PART B

B.  COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

If the collection of information employs statistical methods, the following information should be provided in this Supporting Statement:

1.  Description of the Activity

As shown in the table below, the responder universe for the Millennium Cohort Study consists of a probability-based sample of active-duty, Reserve, and National Guard members of the US military, identified through service rosters as of October 1, 2000 (Panel 1), October 1, 2003 (Panel 2), October 1, 2006 (Panel 3), October 1, 2010 (Panel 4), and June 1, 2020 (Panel 5). Individuals invited to participate are not chosen based on location. While most invited individuals reside in the United States, service members can be stationed or deployed to almost any area of the world. Invitations are mailed to the current postal address of the service member, regardless of city, state, or country.


The responder universe for Panel 1 of the Millennium Cohort Family Study consists of the spouses of married personnel who enrolled in Panel 4 of the Millennium Cohort Study. Panel 2 of the Family Study consists of the spouses of married personnel who were invited to enroll in Panel 5 of the Millennium Cohort Study.

| Panel | Dates Enrolled | Years of Service at Enrollment | Oversampled Groups | Number Invited | Total Enrolled (% of contacted) | Total Members of the Public (% of participants) |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | Jul 2001-Jun 2003 | All durations (cross-section of military population) | Females, National Guard/Reserves, and prior deployers | 213,949 | 77,047 (36%) | 69,233 (90%) |
| 2 | Jun 2004-Feb 2006 | 1-2 years | Females and Marine Corps | 122,410 | 31,110 (25%) | 22,172 (71%) |
| 3 | Jun 2007-Dec 2008 | 1-3 years | Females and Marine Corps | 153,649 | 43,439 (28%) | 27,793 (64%) |
| 4 | Jun 2011-Apr 2013 | 2-5 years | Females and married | 246,230 | 50,052 (20%) | 27,638 (55%) |
| 5 | Sep 2020-Aug 2021 | 1-5 years | Females and married | 492,041 | 43,043 (9%)* | 0 (0%) |
| Family Panel 1 | Jun 2011-Jul 2013 | N/A | Males | 22,417 | 9,872 (44%) | 9,872 (100%) |
| Family Panel 2 | Jan 2021-Aug 2021 | N/A | Males | 194,000* | 21,841 (11%)* | 21,841 (100%)* |

*As of this submission, data collection is ongoing. Therefore, the enrolled numbers for MCS Panel 5 and FCS Panel 2 are not final.













Selected Baseline Characteristics of Millennium Cohort Study Responders Compared with Invited Sample

| Characteristic | Panel 1 Responders (N=77,047) | Panel 1 Invited (N=256,248) | Panel 2 Responders (N=31,110) | Panel 2 Invited (N=150,000) | Panel 3 Responders (N=43,440) | Panel 3 Invited (N=200,000) | Panel 4 Responders (N=50,052) | Panel 4 Invited (N=250,000) | Panel 5 Responders (N=43,034) | Panel 5 Invited (N=492,041) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Age (years); mean ± sd | 33.8 ± 9.2 | 31.0 ± 9.0 | 23.9 ± 5.2 | 23.1 ± 4.4 | 24.0 ± 4.2 | 23.2 ± 3.6 | 26.4 ± 4.9 | 25.2 ± 4.1 | 26.4 ± 4.7 | 25.4 ± 4.0 |
| Years of Service; mean ± sd | 12.1 ± 8.0 | 10.1 ± 7.7 | 1.4 ± 0.5 | 1.4 ± 0.5 | 2.5 ± 1.2 | 2.4 ± 1.1 | 3.7 ± 1.5 | 3.6 ± 1.3 | 2.8 ± 1.4 | 2.8 ± 1.4 |
| Sex; N (%) | | | | | | | | | | |
| Male | 56,415 (73.2) | 194,749 (76.0) | 19,167 (61.6) | 112,139 (74.8) | 27,941 (64.3) | 152,264 (76.1) | 36,331 (72.6) | 200,000 (80.0) | 30,299 (70.4) | 388,013 (78.9) |
| Female | 20,632 (26.8) | 61,499 (24.0) | 11,943 (38.4) | 37,861 (25.2) | 15,499 (35.7) | 47,736 (23.9) | 13,721 (27.4) | 50,000 (20.0) | 12,735 (29.6) | 104,028 (21.1) |
| Service Branch; N (%) | | | | | | | | | | |
| Army | 36,481 (47.4) | 112,573 (43.9) | 14,995 (48.2) | 61,588 (41.1) | 15,798 (36.4) | 81,168 (40.6) | 22,657 (45.3) | 127,663 (51.1) | 14,840 (34.5) | 217,872 (44.3) |
| Navy | 13,435 (17.4) | 50,114 (19.6) | 4,941 (15.9) | 24,982 (16.7) | 6,746 (15.5) | 32,464 (16.2) | 7,165 (14.3) | 38,437 (15.4) | 4,802 (11.2) | 76,766 (15.6) |
| Marine Corps | 3,941 (5.1) | 18,446 (7.2) | 2,576 (8.3) | 30,000 (20.0) | 6,802 (15.7) | 50,000 (25.0) | 4,742 (9.5) | 35,837 (14.3) | 4,626 (10.8) | 61,118 (12.4) |
| Air Force | 22,357 (29.0) | 72,110 (28.1) | 8,276 (26.6) | 30,201 (20.1) | 12,918 (29.7) | 32,879 (16.4) | 14,260 (28.5) | 43,846 (17.5) | 17,447 (40.5) | 127,069 (25.8) |
| Coast Guard | 833 (1.1) | 3,005 (1.2) | 322 (1.0) | 3,229 (2.2) | 1,176 (2.7) | 3,489 (1.7) | 1,228 (2.5) | 4,217 (1.7) | 1,319 (3.1) | 9,216 (1.9) |
| Paygrade; N (%)a | | | | | | | | | | |
| Junior Enlisted | 20,996 (27.3) | 109,721 (42.8) | 25,971 (83.5) | 133,033 (88.7) | 35,437 (81.6) | 176,603 (88.3) | 32,128 (64.2) | 188,729 (75.5) | 25,784 (59.9) | 346,752 (70.5) |
| Senior Enlisted | 38,300 (49.7) | 107,031 (41.8) | 1,511 (4.9) | 7,256 (4.8) | 3,016 (6.9) | 11,641 (5.8) | 10,041 (20.1) | 41,301 (16.5) | 8,032 (18.7) | 91,998 (18.7) |
| Officer | 17,723 (23.0) | 39,496 (15.4) | 3,628 (11.7) | 9,711 (6.5) | 4,985 (11.5) | 11,756 (5.9) | 7,883 (15.8) | 19,970 (8.0) | 9,218 (21.4) | 53,291 (10.8) |

Note: Panel 5 figures are through July 14, 2021.
a Junior Enlisted personnel include those with a paygrade of E00 through E04; Senior Enlisted personnel include those with a paygrade of E05 through E09; Officers include all commissioned and warrant officers.

As seen in the table above, the responder universe for the Millennium Cohort Study is generally representative of the service members invited to join at the time of recruitment. As is typical in survey research, older individuals and female service members were more likely to enroll in the Cohort. Because the invited sample for Panel 1 comprised a cross-section of service members who were on active rosters at the time of recruitment, these individuals had more years of service than the subsequently enrolled Panels 2, 3, 4, and 5. In addition, Panel 1 responders had approximately 2 more years of service than the invited sample, which is consistent with the pattern of older personnel being more likely to respond. Air Force members were more likely to respond across all five panels, while Marine Corps personnel were less likely to respond at each recruitment; this likely reflects the younger age of service members in the Marine Corps. Also consistent with survey research dynamics, higher-ranking individuals are more likely to respond, which is reflected in the Millennium Cohort enrollment, most notably among service members who are officers.


Selected Baseline Characteristics of Family Study Responders Compared with Invited Sample (Panel 1)*

| Characteristic | Responders (N=9,921) | Invited (N=28,603) |
| --- | --- | --- |
| Age (years); mean ± sd | 28.5 ± 5.8 | 27.6 ± 5.1 |
| Years of Service; mean ± sd | 3.8 ± 1.7 | 3.8 ± 1.6 |
| Sex; N (%) | | |
| Male | 1,301 (13.1) | 21,984 (76.9) |
| Female | 8,620 (86.9) | 6,619 (23.1) |
| Service Branch; N (%) | | |
| Army | 4,581 (46.2) | 12,949 (45.3) |
| Navy/Coast Guard | 1,689 (17.0) | 4,698 (16.4) |
| Marine Corps | 936 (9.4) | 2,649 (9.3) |
| Air Force | 2,715 (27.4) | 8,307 (29.0) |
| Education; N (%) | | |
| Less than high school diploma | 133 (1.3) | 51 (0.2) |
| High school diploma | 1,151 (11.6) | 5,477 (19.1) |
| Some college | 4,610 (46.5) | 15,469 (54.1) |
| Bachelor's degree | 2,854 (28.8) | 5,414 (18.9) |
| Master's/PhD degree | 1,173 (11.8) | 2,192 (7.7) |

*Panel 2 enrollment is still underway. This table will be updated once the current survey cycle closes.




As seen in the table above, responders are generally representative of the invited spouse sample. Despite oversampling of male spouses, female spouses were more than twice as likely to respond. Additionally, spouses with some college education were more likely to respond than those with less or more education.




Follow-up Response Rates: Total Number of Responders per Cycle (% follow-up rate)

| Panel (# in Panel) | 2004-2006 | 2007-2008 | 2011-2013 | 2014-2016 | 2019-2021 |
| --- | --- | --- | --- | --- | --- |
| Panel 1 (n=77,047) | 55,021 (71%) Wave 2 | 54,790 (71%) Wave 3 | 51,678 (67%) Wave 4 | 51,146 (66%) Wave 5 | 38,356 (52%)* Wave 6 |
| Panel 2 (n=31,110) | Enrollment of Panel 2 | 17,152 (55%) Wave 2 | 15,149 (49%) Wave 3 | 14,793 (48%) Wave 4 | 10,122 (33%)* Wave 5 |
| Panel 3 (n=43,439) | | Enrollment of Panel 3 | 22,071 (51%) Wave 2 | 19,991 (46%) Wave 3 | 13,402 (31%)* Wave 4 |
| Panel 4 (n=50,052) | | | Enrollment of Panel 4 | 27,233 (54%) Wave 2 | 17,119 (34%)* Wave 3 |
| Panel 5 (n=43,034)* | | | | | Enrollment of Panel 5 |
| Family Panel 1 (n=9,872) | | | Enrollment of Panel 1 | 6,618 (67%) Wave 2 | 3,966 (40%)* Wave 3 |
| Family Panel 2 (n=21,841)* | | | | | Enrollment of Panel 2 |

*As of this submission, data collection is ongoing. Therefore, the response rates and enrollment for the 2019-2021 survey cycle are not final.



Panel 1 of the Millennium Cohort Study targeted 256,400 service members of whom 213,949 were determined to have valid addresses allowing for study contact attempt. Of those, 77,047 (36%) submitted a Panel 1 baseline questionnaire. Among the 77,047 Panel 1 participants, 55,021 (71%) submitted a first follow-up survey, 54,790 (71%) submitted a second follow-up survey, 51,678 (67%) submitted a third follow-up survey, 51,146 (66%) submitted a fourth follow-up survey, and 38,356 (52%) submitted a fifth follow-up survey. A total of 68,510 (89%) of Panel 1 participants have submitted at least one follow-up survey.


Panel 2 of the Millennium Cohort Study targeted 150,000 service members of whom 122,410 were determined to have valid addresses allowing for study contact attempt. Of those, 31,110 (25%) submitted a Panel 2 baseline questionnaire. Among the 31,110 Panel 2 participants, 17,152 (55%) submitted a first follow-up survey, 15,149 (49%) submitted a second follow-up survey, 14,793 (48%) submitted a third follow-up survey, and 10,122 (33%) submitted a fourth follow-up survey. A total of 22,801 (73%) of Panel 2 participants have submitted at least one follow-up survey.


Panel 3 of the Millennium Cohort Study targeted 200,000 service members of whom 153,649 were determined to have valid addresses allowing for study contact attempt. Of those, 43,439 (28%) submitted a Panel 3 baseline questionnaire. Among the 43,439 Panel 3 participants, 22,071 (51%) submitted a first follow-up survey, 19,991 (46%) submitted a second follow-up survey, and 13,402 (31%) submitted a third follow-up survey. A total of 27,397 (63%) of Panel 3 participants have submitted at least one follow-up survey.


Panel 4 of the Millennium Cohort Study targeted 250,000 service members of whom 246,230 were determined to have valid addresses allowing for study contact attempt. Of those, 50,052 (20%) submitted a Panel 4 baseline questionnaire. Among the 50,052 Panel 4 participants, 27,233 (54%) submitted a first follow-up survey and 17,119 (34%) submitted a second follow-up survey. A total of 29,574 (59%) of Panel 4 participants have submitted at least one follow-up survey.


Panel 5 of the Millennium Cohort Study targeted 533,125 service members of whom 492,041 were determined to have valid addresses allowing for study contact attempt. Of those, 43,043 (9%) submitted a Panel 5 baseline questionnaire.


Panel 1 of the Millennium Cohort Family Study targeted 22,417 spouses of Panel 4 service members who submitted a baseline Millennium Cohort Study questionnaire. Of those, 9,872 (44%) submitted a Family Study Panel 1 baseline questionnaire; 6,618 (67%) submitted a first follow-up survey, and 3,966 (40%) have submitted a second follow-up survey to date. A total of 7,083 (72%) of Family Study Panel 1 participants have submitted at least one follow-up survey.


Panel 2 of the Millennium Cohort Family Study targeted 194,000 spouses of service members invited to Millennium Cohort Study Panel 5. Of those, 21,841 (11%) have submitted a Family Study Panel 2 baseline questionnaire to date.


To date, a total of 2,517 deaths have occurred within the MCS Panel 1 responder group, 413 deaths within the MCS Panel 2 responder group, 475 deaths within the MCS Panel 3 responder group, and 417 deaths within the MCS Panel 4 responder group. To date, a total of 13 deaths have occurred within the FCS Panel 1 responder group and 2 deaths within the FCS Panel 2 responder group.


We estimate that approximately 214,168 of the participants from Panels 1-5 who respond to the 2023-2024 Millennium Cohort Study survey will no longer be military service members. Of the spouses who are enrolled in the Family Study, it is estimated that 18,089 will complete a Millennium Cohort Family Study follow-up survey between 2023 and 2024 and will be members of the public. Therefore, for both the Millennium Cohort Study and the Millennium Cohort Family Study it is estimated that a total of 232,257 participants who complete a survey between 2023 and 2024 will be members of the public.


Since our first OMB approval in September 2003 and throughout the course of the study, the proportion of military versus public participants has shifted, and will continue to shift, in favor of members of the public as service members separate from the military.



2.  Procedures for the Collection of Information


The Millennium Cohort Study consists of service members randomly selected from a large, representative military sample obtained from the Defense Manpower Data Center. A probability-based random sampling process is employed, with oversampling of certain subgroups to ensure sufficient statistical power to address small subgroups of the population in a population-based setting.


In Panel 1, military personnel who had served in Southwest Asia, Bosnia, and Kosovo after 1997 were over-sampled. Additionally, Reserve, National Guard, and female service personnel were over-sampled to assure sufficient statistical power to investigate hypotheses in these smaller subgroups of the military population. In Panels 2 and 3, Marines and female service members were oversampled to assure sufficient statistical power to investigate hypotheses in these smaller subgroups. In Panels 4 and 5, women and married service members were oversampled to support the enrollment of the concurrent Family Study.
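To illustrate the mechanics of probability-based sampling with oversampling, the following minimal sketch draws a stratified random sample from a hypothetical roster. The strata, roster fields, and sampling fractions are illustrative assumptions only; they are not the variables or fractions used in any Millennium Cohort panel.

```python
# Minimal sketch of stratified sampling with oversampling of selected strata.
# All strata, weights, and sampling fractions below are illustrative only.
import random

rng = random.Random(0)

# Hypothetical roster: each record carries a stratum label derived from
# administrative records (e.g., sex and marital status for Panels 4 and 5).
strata = ["married_female", "unmarried_female", "married_male", "unmarried_male"]
roster = [{"id": i, "stratum": rng.choices(strata, weights=[10, 10, 40, 40])[0]}
          for i in range(100_000)]

# Illustrative sampling fractions that oversample women and married members.
sampling_fractions = {"married_female": 0.60, "unmarried_female": 0.50,
                      "married_male": 0.30, "unmarried_male": 0.15}

def draw_stratified_sample(roster, fractions):
    """Simple random sample within each stratum at that stratum's own rate."""
    sample = []
    for stratum, fraction in fractions.items():
        members = [p for p in roster if p["stratum"] == stratum]
        sample.extend(rng.sample(members, round(len(members) * fraction)))
    return sample

invited = draw_stratified_sample(roster, sampling_fractions)
print(f"Invited {len(invited):,} of {len(roster):,} roster members")

# Each invitee's base sampling (design) weight is the inverse of the
# sampling fraction for his or her stratum.
design_weights = {s: 1 / f for s, f in sampling_fractions.items()}
```

Under such a design, the inverse sampling fractions become the base design weights that later non-response adjustments (Section 3) build on.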


The Millennium Cohort Study and Family Study participants are asked to complete a questionnaire every 3 to 5 years, allowing longitudinal information to be acquired without burdening participants with annual questionnaires.


3.  Maximization of Response Rates, Non-response, and Reliability

Much effort is focused on retaining study participants and maximizing follow-up survey response. Response rates for both the Millennium Cohort Study and Family Study are maximized principally through modified Dillman mail and electronic survey methods (Dillman, 1978). Recruitment and marketing materials are reviewed at each survey cycle, and new strategies are implemented as appropriate and necessary.


The Millennium Cohort Study and Family Study teams have undertaken numerous efforts to ensure maximum response rates to surveys. The teams consult with survey experts, and new approaches are implemented and evaluated during each survey cycle. Each participant is sent paper questionnaires and/or directions to complete the questionnaire online, in addition to email requests to participate, depending on whether they responded to the last questionnaire or email request sent. Each questionnaire mailing or email request to participate is followed approximately two weeks later by a reminder postcard or reminder email. Participants are contacted over an extended period (approximately 18 months) in case they are deployed or have recently moved, allowing time for their current address to be updated.
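As a rough illustration of this contact strategy, the sketch below generates a mailing/email schedule with reminders approximately two weeks after each contact over an extended cycle. The number of contacts, their spacing, and the start date are hypothetical and do not reflect the study's operational schedule.

```python
# Minimal sketch of a modified Dillman-style contact schedule; the number of
# contacts, their spacing, and the start date are illustrative assumptions.
from datetime import date, timedelta

def build_contact_schedule(cycle_start, n_contacts=6, weeks_between_contacts=12,
                           reminder_lag_weeks=2):
    """Return (date, description) pairs spanning an extended survey cycle."""
    schedule = []
    for i in range(n_contacts):
        contact_date = cycle_start + timedelta(weeks=i * weeks_between_contacts)
        schedule.append((contact_date, f"questionnaire mailing / email request #{i + 1}"))
        schedule.append((contact_date + timedelta(weeks=reminder_lag_weeks),
                         f"reminder postcard / reminder email #{i + 1}"))
    return schedule

# Six contacts spaced 12 weeks apart span roughly 17 months, consistent with
# an approximately 18-month contact window.
for when, what in build_contact_schedule(date(2023, 1, 2)):
    print(when.isoformat(), what)
```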


If we receive notice that the address at which we attempt to reach a participant or newly invited individual is not valid, and we do not receive updated address information before the end of the survey cycle, we classify these participants/individuals as non-responders. At the conclusion of the survey cycle, if a participant or newly invited individual has not responded, we infer that the questionnaire/email request was received and that they have chosen not to participate; they are then classified as non-responders. Any individuals being contacted for enrollment into a new panel who are classified as non-responders at the end of the survey cycle are removed from future contact attempts.

New or "good" addresses are sought from the Internal Revenue Service through a contract with the National Institute for Occupational Safety and Health (NIOSH), from the Defense Manpower Data Center (DMDC), and from self-reported respondent updates. We also utilize the National Change of Address Service through the US Postal Service when a participant's postal item is returned to the study team with a new or "good" address.


Response and retention rates are of utmost importance to Millennium Cohort Study and Family Study investigators. Much effort has been focused on investigating the type and use of incentives, the wording of invitations, email contacts, and twice-yearly postcard and email contacts on Veterans Day and Memorial Day (Millennium Cohort Study) or the Month of the Military Child and National Military Family Month (Family Study). Past incentives included specially designed t-shirts with study logos and phone cards. For the 2011-2013 survey cycle, incentives included a $5 gift card (Starbucks, Subway, Amazon.com, Walmart) or a Millennium Cohort hat or coin. The Family Study offered $10 survey completion incentives from Starbucks, Subway, or Shutterfly. For the 2014-2016 cycle, a combination of pre-incentives and completion post-incentives was used for both the Millennium Cohort Study and the Family Study. During the 2014-2016 cycle, the Millennium Cohort Study conducted an experiment to determine the most effective type of incentive for increasing response rates within each panel. Findings from this experiment will be used to inform future data collection cycles and are being finalized as a peer-reviewed manuscript.


Invitations and email contacts are specifically designed based on service branch, separation status, and other demographic factors, and are vetted thoroughly by the study team. Twice-yearly contacts on Veterans Day and Memorial Day (Millennium Cohort Study) or the Month of the Military Child and National Military Family Month (Family Study) serve to keep study participants engaged and to solicit updated contact information. Additionally, the 2007-2008 enrollment cycle employed a "Welcome to the Cohort" campaign that sent welcome cards to newly enrolled cohort members describing the scope and length of the study.

Telephone surveys of non-responders were first performed after the 2000 Millennium Cohort Pilot Study and focused on survey content and reasons for non-response. In 2005, telephone surveys of non-responders were conducted for the 2004-2006 survey cycle. The submitted report, "MilCohort Nonresponse Study Final Report," describes the Millennium Cohort Telephone Study of 3,000 non-responders conducted by the Research Triangle Institute (RTI). This sub-study consisted of telephone calls to Panel 1 participants who had not completed a 2004-2006 questionnaire. In addition to asking questions regarding reasons for non-response, the phone survey asked about incentives and participant contacts, and collected information on health status. For those with bad phone numbers (e.g., disconnected, wrong number), RTI completed a thorough investigation to obtain up-to-date phone numbers. This additional contact information was given to the Millennium Cohort Study team after the completion of the RTI survey and report.

Chapter 3 of RTI's "MilCohort Nonresponse Study Final Report" discussed telephone questionnaire results and presented recommendations for improving response rates. Overall, the recommendations covered six main areas: study materials, panel maintenance, tracking sample members, incentives, telephone prompting, and future non-response studies. Many of these suggestions continue to be incorporated into the ongoing survey efforts. For example, study materials encourage online participation but mention the option to complete surveys on paper, and contacts are tailored to be service branch specific and based on current military status (i.e., separated/retired or still serving). Furthermore, the continued use of bi-annual postcards and the utilization of the National Change of Address Service through the US Postal Service have enhanced panel maintenance and the tracking of study participants.

Additionally, during the 2014-2016 survey cycle, the Family Study developed a telephone-based outreach method to retain participants. A maximum of three phone calls were made to non-responders who provided a phone number on the baseline survey. If the participant was not reached by the third call, a voicemail message was left. The main objectives of the phone calls were to remind the participants about the follow-up survey and gather updated contact information. Of the 746 Family Study participants called, 167 (22%) completed their follow-up survey, and an additional 41 (5%) initiated their follow-up survey.

Near the end of the 2023-2024 survey cycle, the Millennium Cohort Study will conduct a participant feedback survey among Panel 1-5 responders and non-responders designed to assess a variety of factors, including those that have motivated and/or discouraged Millennium Cohort participants from staying connected with the study. These data will be used in the design of future surveys and survey operations to maximize retention and increase participation from previous non-responders. The surveys were developed based on preliminary 2019-2021 MCS survey response data and the Hispanic Community Health Study Participant Feedback Survey (OMB#: 0925-0584). The survey has been submitted as part of this review.

The previous terms of clearance required that the Millennium Cohort Program “continually examine potential non-response and attrition bias in each wave of this study, particularly for key health-related variables and outcomes. DoD will continue this research and will provide updated results from these ongoing bias analyses to OMB with future submissions”.

Longitudinal studies, such as the Millennium Cohort Study and Family Study, provide the capability to prospectively analyze relationships between exposures and outcomes. A cohort study of military service members and Veterans over an extended period of time provides the unique opportunity to examine the temporal relationships between service-related experiences, including deployment, and subsequent health and behavioral outcomes, which is not possible using time-series or cross-sectional samples. However, in longitudinal studies, non-response and non-random attrition are potential sources of bias. Analyses based on multi-wave panel studies can be heavily compromised by non-random sample attrition. While Millennium Cohort Study and Family Study participants are given the opportunity to complete follow-up questionnaires regardless of whether they completed the previous follow-up questionnaire, the number of responders who do not participate in each subsequent wave of data collection (wave non-response) will most likely accumulate over time, which may undermine the precision of any research undertaken using such samples. Unless non-response is random, attrition may lead to bias, as there are important factors that influence response propensity. Attrition is often correlated with observable characteristics such as age, education, health, and economic well-being, as well as other unknown or unobserved factors. This non-random attrition can result in samples that include only a selected group of individuals over time, which can bias estimates, since non-response is often associated with the variables of interest. However, non-random attrition does not necessarily lead to attrition bias. Attrition bias is model-specific and, as previous studies have shown, biases might be absent even if attrition rates are high.


The Millennium Cohort Study team has previously examined potential bias from initial enrollment and attrition. Demographic data on all invited personnel have been examined to determine differences in distributions between responders and non-responders. While overall these investigations have demonstrated responders to be demographically representative of the invited sample (Ryan et al. 2007), some factors, including certain demographic and military characteristics, have been found to be associated with a greater likelihood of enrollment. While some of these differences are significant, most are quite small and are similar to patterns found in other surveys of military and non-military populations, such as MIDUS and NHANES. These investigations have informed our statistical weighting techniques. Similar to other national studies that use probability sampling and weighting to the Current Population Survey, the Millennium Cohort Study weights its sample to the entire US military population using DMDC records. In addition to creating sampling weights, principled missing data techniques will be used to mitigate bias that may have emerged from attrition. Details on these imputation procedures are described below in sections A and B. Further examination found few health differences between Millennium Cohort Study responders and non-responders when comparing healthcare utilization preceding study invitation (Wells et al. 2008; Horton et al. 2013). In addition, non-response to the follow-up questionnaires has not resulted in any appreciable biases, as reflected by comparing measures of association for selected outcomes, including PTSD, depression, and eating disorders, using complete-case and inverse probability weighted methods (Littman et al. 2010). Also, a recent Millennium Cohort Study publication examined the efficiency and feasibility of multiple imputation (MI) to recover data from a question completely missing at a Millennium Cohort follow-up survey (which occurs as items are added or removed over time) and found similar associations between imputed and self-reported predictors and related constructs, confirming that MI allows for the inclusion of an otherwise missing item as a covariate in statistical models (Kolaja et al. 2021). Please see Tables 1-3 below for summaries of these investigations.
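The complete-case versus inverse probability weighting (IPW) comparison described above can be sketched on simulated data as follows. The covariates, effect sizes, and model below are hypothetical and are not the published analyses; the statsmodels package is assumed to be available.

```python
# Minimal sketch comparing complete-case and inverse-probability-weighted (IPW)
# estimates under non-random follow-up attrition, using simulated data.
# Variable names, effect sizes, and the model are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 20_000

# Simulated baseline covariate, exposure, and outcome.
age = rng.normal(30, 8, n)
deployed = rng.binomial(1, 0.4, n)
outcome = rng.binomial(1, 1 / (1 + np.exp(-(-2.0 + 0.5 * deployed + 0.02 * (age - 30)))))

# Follow-up response depends on age and the outcome (non-random attrition).
responded = rng.binomial(1, 1 / (1 + np.exp(-(0.2 + 0.03 * (age - 30) - 0.4 * outcome))))

# Response-propensity model and inverse-probability weights.
Xr = sm.add_constant(np.column_stack([age, outcome]))
propensity = sm.Logit(responded, Xr).fit(disp=False).predict(Xr)
ipw = 1 / propensity

def exposure_outcome_or(mask, weights=None):
    """Odds ratio relating deployment to the outcome among the masked records."""
    Xe = sm.add_constant(deployed[mask])
    fit = sm.GLM(outcome[mask], Xe, family=sm.families.Binomial(),
                 freq_weights=None if weights is None else weights[mask]).fit()
    return float(np.exp(fit.params[1]))

resp = responded == 1
everyone = np.ones(n, dtype=bool)
print("Complete-case OR:", round(exposure_outcome_or(resp), 2))
print("IPW OR:", round(exposure_outcome_or(resp, ipw), 2))
print("Full-sample OR (benchmark):", round(exposure_outcome_or(everyone), 2))
```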


Table 1: Selected Demographic Characteristics from Ryan et al. 2007; Millennium Cohort: Enrollment Begins a 21-Year Contribution to Understanding the Impact of Military Service

| Characteristics | Responders* (n = 77,047), N (%) | Invited Cohort* (N = 256,400), % |
| --- | --- | --- |
| Sex | | |
| Male | 56,415 (73.2) | 76.0 |
| Female | 20,632 (26.8) | 24.0 |
| Age (years) | | |
| 17-24 | 14,559 (18.9) | 30.8 |
| 25-34 | 27,083 (35.2) | 35.4 |
| 35-44 | 25,400 (33.0) | 25.1 |
| >44 | 9,975 (13.0) | 8.6 |
| Service Branch | | |
| Army | 36,481 (47.4) | 44.0 |
| Navy & Coast Guard | 14,268 (18.5) | 20.8 |
| Marine Corps | 3,941 (5.1) | 7.2 |
| Air Force | 22,357 (29.0) | 28.1 |
| Education | | |
| Less than HS | 4,722 (6.1) | 7.6 |
| HS diploma or less | 32,957 (42.8) | 50.4 |
| Some college | 19,655 (25.5) | 23.6 |
| Bachelor's degree | 12,722 (16.5) | 11.6 |
| Graduate school | 6,986 (9.1) | 5.4 |
| Paygrade | | |
| Enlisted | 59,318 (77.0) | 84.6 |
| Officer | 17,729 (23.0) | 15.4 |

*Responders and Invited Cohort members were from Panel 1 of the Millennium Cohort Study's baseline enrollment survey (2001-2003).


Table 2: Selected Demographic Characteristics from Wells et al. 2008; Prior Health Care Utilization as a Potential Determinant of Enrollment in a 21-year Prospective Study

| Characteristics | Responders* (n = 21,067), N (%) | Non-responders* (N = 47,036), N (%) |
| --- | --- | --- |
| Sex | | |
| Male | 15,143 (71.9) | 35,304 (75.1) |
| Female | 5,924 (28.1) | 11,732 (24.9) |
| Age (years) | | |
| 17-24 | 3,534 (16.8) | 15,520 (33.0) |
| 25-34 | 4,721 (22.4) | 12,988 (27.6) |
| 35-44 | 7,034 (33.4) | 11,067 (23.5) |
| ≥45 | 5,778 (27.4) | 7,461 (15.9) |
| Service Branch | | |
| Army | 7,794 (37.0) | 15,115 (32.1) |
| Navy & Coast Guard | 5,230 (24.8) | 12,770 (27.2) |
| Marine Corps | 1,620 (7.7) | 5,091 (10.8) |
| Air Force | 6,423 (30.5) | 14,060 (29.9) |
| Education | | |
| HS diploma or less | 10,974 (52.1) | 29,459 (62.6) |
| Some college | 5,842 (27.7) | 12,745 (27.1) |
| Bachelor's degree | 2,134 (10.1) | 2,842 (6.0) |
| Graduate school | 2,117 (10.1) | 1,990 (4.2) |
| Paygrade | | |
| Enlisted | 17,500 (83.1) | 43,328 (92.1) |
| Officer | 3,567 (16.9) | 3,708 (7.9) |

*Response and non-response were evaluated among active duty personnel who were not deployed to a combat area during the year prior to enrollment invitation for the first panel of the Millennium Cohort Study (2001-2003).















Table 3: Selected Demographic Characteristics from Horton et al. 2013; The Impact of Deployment Experience and Prior Healthcare Utilization on Enrollment in a Large Military Cohort Study

| Characteristics | Panel 2 (2004-2006) Responders (n = 31,110), N (%) | Panel 2 (2004-2006) Non-responders (N = 118,393), N (%) | Panel 3 (2007-2008) Responders (n = 43,440), N (%) | Panel 3 (2007-2008) Non-responders (N = 156,231), N (%) |
| --- | --- | --- | --- | --- |
| Sex | | | | |
| Male | 19,167 (61.6) | 92,614 (78.2) | 27,941 (64.3) | 152,029 (76.1) |
| Female | 11,943 (38.4) | 25,779 (21.8) | 15,499 (35.7) | 47,642 (23.9) |
| Age (years) | | | | |
| 17-20 | 9,317 (30.0) | 40,055 (33.8) | 6,005 (13.8) | 29,762 (19.0) |
| 21-22 | 6,693 (21.5) | 28,973 (24.5) | 13,916 (32.0) | 60,243 (38.6) |
| 23-24 | 5,248 (16.9) | 19,688 (16.6) | 9,108 (21.0) | 32,441 (20.8) |
| >24 | 9,852 (31.7) | 29,677 (25.1) | 14,411 (33.2) | 33,785 (21.6) |
| Service Branch | | | | |
| Army | 14,995 (48.2) | 46,447 (39.2) | 15,798 (36.4) | 65,269 (41.8) |
| Navy & Coast Guard | 5,263 (16.9) | 22,847 (19.3) | 7,922 (18.2) | 27,971 (17.9) |
| Marine Corps | 2,576 (8.3) | 27,363 (23.1) | 6,802 (15.7) | 43,140 (27.6) |
| Air Force | 8,276 (26.6) | 21,736 (18.4) | 12,918 (29.7) | 19,851 (12.7) |
| Education | | | | |
| Some college or less | 25,752 (82.8) | 106,183 (89.7) | 36,287 (83.5) | 143,632 (91.9) |
| Bachelor's or higher | 4,291 (13.8) | 7,056 (6.0) | 5,856 (13.5) | 8,488 (5.4) |
| Unknown | 1,067 (3.4) | 5,154 (4.4) | 1,297 (3.0) | 4,111 (2.6) |
| Paygrade | | | | |
| Enlisted | 27,482 (88.3) | 112,366 (94.9) | 38,455 (88.5) | 149,509 (95.7) |
| Officer | 3,628 (11.7) | 6,027 (5.1) | 4,985 (11.5) | 6,722 (4.3) |



Although previous work has indicated that minimal bias has been introduced into the Millennium Cohort Study due to initial and follow-up non-response, continued investigation of factors leading to non-random attrition is key for adjusting for non-response and producing unbiased estimates. Given the evolving complexity of the Millennium Cohort Study methodology, we consulted with methodologists experienced in modern techniques for handling missing data. Based on their recommendations, principled missing data statistical solutions have been introduced to minimize non-response bias in descriptive analyses and the creation of population estimates. In particular, principal component auxiliary (PCAUX) variables (Howard, Rhemtulla, & Little, 2015), a recent advance in principled missing data treatments, are being employed. Our handling of non-response bias will employ a combination of weighting schemes as well as principled missing data treatments such as multiple imputation (MI) and full-information maximum likelihood (FIML).


Principled missing data techniques offer several advantages over panel weighting schemes. They are able to address item non-response as well as wave non-response. Panel weights can be created to adjust for attrition (or for respondents who otherwise miss an entire wave); in contrast, techniques such as multiple imputation can also address item non-response by respondents who are present in a wave but fail to answer some items. Weighting in multivariable analyses is also debatable because only a few variables, usually demographic, can be selected to create the weights, and it is plausible that the ideal variables for creating weights would vary across outcomes. For example, in longitudinal analyses, information on an outcome such as illness in a prior wave is plausibly much more informative than demographic variables from the baseline. Principled missing data treatments avoid this limitation.


PCAUX methodology solves issues of model complexity and computational limitations that have previously limited the use of principled missing data treatments in large/longitudinal datasets. The ideal missing data model is inclusive of all variables in the dataset that might be informative about missing data patterns. Variables included in the missing data model that are not of interest to the analytic model are called auxiliary variables, and can greatly reduce bias and increase power (Collins, Schafer, & Kam, 2001; Graham, 2009). However, there is a tension between fully-inclusive missing data models and practical limitations on computation. By extracting a limited number of principal components (typically 10 to 13) to employ as auxiliary variables, inclusive information is provided with a parsimonious number of auxiliary variables. In addition to serving as informative auxiliary variables in FIML analyses, PCAUX variables can also be used as the predictor variables when generating multiple imputations.
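A minimal sketch of the PCAUX idea is shown below: a large block of auxiliary variables is summarized by a small number of principal component scores, which are then carried into the imputation model as predictors. The data, column counts, number of components, and imputer settings are illustrative assumptions (scikit-learn is assumed to be available); this is not the study's production procedure.

```python
# Minimal sketch of the PCAUX approach: principal component scores of a large
# auxiliary block serve as predictors in the imputation model.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer, SimpleImputer

rng = np.random.default_rng(2)
n = 5_000

# Hypothetical data: a few analysis variables (with missingness) and a large
# block of auxiliary variables from earlier waves and administrative records.
auxiliary = rng.normal(size=(n, 200))
analysis = auxiliary[:, :3] @ rng.normal(size=(3, 2)) + rng.normal(scale=0.5, size=(n, 2))
analysis[rng.random((n, 2)) < 0.3] = np.nan          # ~30% item non-response

# Step 1: reduce the auxiliary block to ~10 principal component scores.
# (Auxiliary variables with their own missingness are first mean-imputed here.)
aux_complete = SimpleImputer(strategy="mean").fit_transform(auxiliary)
pcaux = PCA(n_components=10).fit_transform(aux_complete)

# Step 2: impute the analysis variables using the PCAUX scores as predictors.
stacked = np.column_stack([analysis, pcaux])
imputed = IterativeImputer(sample_posterior=True, random_state=0).fit_transform(stacked)
analysis_imputed = imputed[:, :analysis.shape[1]]
print("Remaining missing values:", int(np.isnan(analysis_imputed).sum()))
```

In practice, the imputation step would be repeated with different random seeds to produce multiple imputed datasets, with estimates pooled across them using Rubin's rules; the same PCAUX scores can also serve as auxiliary variables in FIML models.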


The Millennium Cohort Study team has previously calculated response rates, created sample and design weights, and conducted initial response bias investigations for each panel, as well as a few select longitudinal response bias investigations. One ongoing project aims to examine follow-up survey response longitudinally among all panels and waves and will utilize novel approaches such as generalized estimating equation models, machine learning, and spatial hot-spot analyses. Preliminary results are shown in Table 4, which presents, for Panels 1-4, the response rate at each eligible follow-up wave (up to Wave 5 for Panel 1), both overall and for each panel separately.







Table 4: Response to follow-up surveys among eligible (i.e., not deceased or withdrawn) study participants, overall and by panel

| All Panels | Wave 2 (n = 199,969) | Wave 3 (n = 149,286) | Wave 4 (n = 105,489) | Wave 5 (n = 74,257) |
| --- | --- | --- | --- | --- |
| Responded | 121,041 (61%) | 89,673 (60%) | 66,267 (63%) | 50,859 (68%) |
| Did Not Respond | 78,928 (39%) | 59,613 (40%) | 39,222 (37%) | 23,398 (32%) |

| Panel 1 | Wave 2 (n = 76,148) | Wave 3 (n = 75,656) | Wave 4 (n = 74,957) | Wave 5 (n = 74,257) |
| --- | --- | --- | --- | --- |
| Responded | 54,877 (72%) | 54,697 (72%) | 51,558 (69%) | 50,859 (68%) |
| Did Not Respond | 21,271 (28%) | 20,959 (28%) | 23,399 (31%) | 23,398 (32%) |

| Panel 2 | Wave 2 (n = 30,888) | Wave 3 (n = 30,692) | Wave 4 (n = 30,532) |
| --- | --- | --- | --- |
| Responded | 17,134 (55%) | 15,562 (51%) | 15,823 (52%) |
| Did Not Respond | 13,754 (45%) | 15,130 (49%) | 14,709 (48%) |

| Panel 3 | Wave 2 (n = 43,145) | Wave 3 (n = 42,938) |
| --- | --- | --- |
| Responded | 22,040 (51%) | 19,846 (46%) |
| Did Not Respond | 21,105 (49%) | 23,092 (54%) |

| Panel 4 | Wave 2 (n = 49,788) |
| --- | --- |
| Responded | 26,990 (54%) |
| Did Not Respond | 22,798 (46%) |

Moving forward, the Millennium Cohort Study team will continue to obtain PCAUX variables for each cohort, as well as multiple imputations of each cohort, to ensure unbiased estimates of population parameters. These procedures will be reapplied systematically to each cohort after each additional wave of data collection is completed, so that newly provided data can better inform previous waves' missingness and address item and survey non-response in the newly collected data.


A. Procedures to Investigate Non-response for Millennium Cohort Study Panels at Baseline:

1. Response rates will be calculated using standard formulas from the OMB Standards and Guidelines for Statistical Surveys (2006). For example:

  • Unweighted unit response rates (RRU) as the proportion of those who were eligible for the survey at baseline and who responded.

2. Non-response (declined participation) bias will be estimated and described by comparing responders to non-responders based on variables available from electronic personnel files maintained for all service members by the Defense Manpower Data Center. These variables will include factors such as demographic characteristics, deployment histories, and medical/healthcare data. Methods will include, but will not be limited to, multivariable logistic regression to describe the propensity to respond or not respond.

3. Baseline weights will be calculated using the propensity score for response in order to minimize response bias. These combined sampling and non-participation weights will be used in future studies, where appropriate. (A minimal illustrative sketch of these steps follows this list.)
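As a concrete illustration of steps 1-3, the sketch below computes an unweighted unit response rate and propensity-based non-response weights from a simulated invitation file. The covariates, coefficients, model specification, and design weights are illustrative assumptions, not the study's actual weighting procedure; the statsmodels package is assumed to be available.

```python
# Minimal sketch of steps 1-3 on a simulated baseline invitation file.
# Covariates, coefficients, and design weights are illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_invited = 50_000

# Hypothetical administrative covariates for all invited individuals.
age = rng.normal(25, 4, n_invited)
female = rng.binomial(1, 0.3, n_invited)
officer = rng.binomial(1, 0.15, n_invited)

# Simulated baseline response (older members, women, and officers respond more).
logit = -1.6 + 0.05 * (age - 25) + 0.4 * female + 0.6 * officer
responded = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Step 1: unweighted unit response rate (RRU) among eligible invitees.
print(f"RRU = {responded.mean():.1%}")

# Step 2: multivariable logistic regression for the propensity to respond.
X = sm.add_constant(np.column_stack([age, female, officer]))
propensity = sm.Logit(responded, X).fit(disp=False).predict(X)

# Step 3: combined weight for responders = sampling (design) weight divided by
# the estimated response propensity.
design_weight = np.where(female == 1, 2.0, 4.0)   # illustrative design weights
combined_weight = design_weight / propensity
print("Mean combined weight among responders:",
      round(float(combined_weight[responded == 1].mean()), 2))
```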


B. Procedures to Investigate Non-response and Attrition for Millennium Cohort Study Panels at Follow-Up:

1. Response rates will be calculated using standard formulas from the OMB Standards and Guidelines for Statistical Surveys (2006). For example:

  • Longitudinal response rates as the proportion of responders at wave 1 (baseline) who responded at a specific subsequent wave.

2. Non-response and attrition bias will be examined and described by comparing responders to non-responders based on previous survey data. These variables will include factors obtained at the baseline survey and beyond, such as behavioral, mental, and physical health factors, as well as demographic characteristics, deployment histories, medical/healthcare data, and, in certain cases, response to previous waves. Methods will include, but will not be limited to, multivariable multinomial logistic regression to calculate the propensity for response, non-response, or death (a minimal illustrative sketch follows this list). We will separately model attrition due to death (or attrition from other causes, when appropriate) versus non-response, since determinants will likely differ between these groups and may be differentially associated with outcomes of interest.

3. PCAUX variables and multiple imputations will be obtained for each cohort after each wave of data collection. PCAUX variables can be used in FIML modeling whenever appropriate, and models may also include baseline weights. As an alternative, analyses can be conducted across multiply imputed datasets.
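To make step 2 concrete, the sketch below fits a multinomial model distinguishing follow-up response, non-response, and death from baseline covariates, and extracts the predicted probability of responding for use in inverse-probability-of-response weighting. The variables, category coding, and coefficients are simulated assumptions, not study data.

```python
# Minimal sketch of a multinomial model for follow-up status
# (0 = responded, 1 = did not respond, 2 = died), on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 30_000

age = rng.normal(30, 8, n)
poor_health = rng.binomial(1, 0.2, n)

# Simulated status probabilities: non-response and death both depend on
# age and baseline health, but with different coefficients.
linpred = np.column_stack([
    np.zeros(n),                                            # responded (reference)
    -0.3 - 0.02 * (age - 30) + 0.5 * poor_health,           # did not respond
    -4.0 + 0.05 * (age - 30) + 1.0 * poor_health])          # died
probs = np.exp(linpred) / np.exp(linpred).sum(axis=1, keepdims=True)
status = np.array([rng.choice(3, p=p) for p in probs])

# Multinomial logistic regression of status on baseline covariates.
X = sm.add_constant(np.column_stack([age, poor_health]))
fit = sm.MNLogit(status, X).fit(disp=False)

# Predicted probability of responding, the denominator of an
# inverse-probability-of-response weight for follow-up analyses.
p_respond = fit.predict(X)[:, 0]
print("Mean predicted response propensity:", round(float(p_respond.mean()), 3))
```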


The Family Study team partnered with Abt Associates to similarly create design and non-response weights for the Family Study. The use of these weights is critical for generating representative and unbiased statistical estimates for the target population. Because Panels 4 and 5 of the Millennium Cohort Study oversampled married and female service members, sample design weights were generated first. As with the Millennium Cohort Study, non-response adjustments to the Family Study weights were then performed to reduce non-response bias.


Propensity weighting, a straightforward extension of the propensity score theory of Rosenbaum and Rubin (1983) that was incorporated into survey non-response problems by David et al. (1983), was used to adjust for non-response to the Family Study. The availability of military records and survey data for the spouse-paired Millennium Cohort Study Panel 4 service member provided a unique opportunity to examine potential bias associated with non-response in unusual detail. Given the large amount of information available on respondents and non-respondents from the spouse-paired Millennium Cohort Study Panel 4 service member that could be included in the Family Study response propensity model, we applied an empirical process to guide variable selection (Rizzo, Kalton, and Brick 1996; Smith et al. 2001).
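One common way to implement such a propensity-based non-response adjustment, in the spirit of the adjustment-cell approach cited above, is sketched below on simulated data. The covariates, design weights, and use of propensity quintiles are illustrative assumptions rather than the Family Study's exact procedure.

```python
# Minimal sketch of non-response adjustment cells formed from estimated
# response propensities; covariates, design weights, and cell definitions
# are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 20_000

spouse_age = rng.normal(29, 6, n)
member_deployed = rng.binomial(1, 0.5, n)
design_weight = rng.choice([1.5, 3.0], size=n)          # illustrative design weights
responded = rng.binomial(
    1, 1 / (1 + np.exp(-(-0.4 + 0.04 * (spouse_age - 29) - 0.3 * member_deployed))))

# Response-propensity model using spouse and service member covariates.
X = sm.add_constant(np.column_stack([spouse_age, member_deployed]))
propensity = sm.Logit(responded, X).fit(disp=False).predict(X)

# Group the invited sample into propensity quintiles ("adjustment cells").
cells = np.digitize(propensity, np.quantile(propensity, [0.2, 0.4, 0.6, 0.8]))

# Within each cell, inflate responders' design weights so that they represent
# the cell's full design-weighted invited total.
adjusted_weight = design_weight.astype(float).copy()
for c in np.unique(cells):
    in_cell = cells == c
    factor = design_weight[in_cell].sum() / design_weight[in_cell & (responded == 1)].sum()
    adjusted_weight[in_cell & (responded == 1)] *= factor
adjusted_weight[responded == 0] = 0.0

print("Design-weighted invited total:    ", round(float(design_weight.sum())))
print("Adjusted responder weighted total:", round(float(adjusted_weight.sum())))
```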


References


Collins, L. M., Schafer, J. L., & Kam, C. M. (2001). A comparison of inclusive and restrictive strategies in modern missing data procedures. Psychological Methods, 6(4), 330-351. doi:10.1037//1082-989x.6.4.330

Dong, Y. R., & Peng, C. Y. J. (2013). Principled missing data methods for researchers. Springerplus, 2, 17. doi:10.1186/2193-1801-2-222

David, M., Little, R. J. A., Samuhel, M. E., and Triest, R. K. (1983). Nonrandom nonresponse models based on the propensity to respond, Proceedings of the Business and Economic Statistics Section, American Statistical Association, 168-173.

Dillman DA. Mail and Telephone Surveys: The Total Design Method. New York: Wiley; 1978. xvi, 325.

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The Tailored Design Method.

Enders, C. K. (2010). Applied missing data analysis: New York : Guilford Press, [2010].

Graham, J. W. (2003). Adding missing-data-relevant variables to FIML-based structural equation models. Structural Equation Modeling, 10(1), 80-100. doi:10.1207/s15328007sem1001_4

Graham, J. W. (2009). Missing Data Analysis: Making It Work in the Real World Annual Review of Psychology (Vol. 60, pp. 549-576). Palo Alto: Annual Reviews.

Groves, R. M. (2006). Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly, 70 (5), 646–675.

Horton JL, Jacobson IJ, Littman AJ, Alcaraz JE, Smith B, and Crum-Cianflone NF. The impact of deployment experience and prior healthcare utilization on enrollment in a large military cohort study. BMC Medical Research Methodology. 2013 Jul 11;13:90.

Howard, W. J., Rhemtulla, M., & Little, T. D. (2015). Using Principal Components as Auxiliary Variables in Missing Data Estimation. Multivariate Behavioral Research. doi:10.1080/00273171.2014.999267

Kolaja CA, Porter B, Powell TM, Rull RP; Millennium Cohort Study Team. Multiple imputation validation study: addressing unmeasured survey data in a longitudinal design. BMC Med Res Methodol. 2021 Jan 6;21(1):5. doi: 10.1186/s12874-020-01158-w. PMID: 33407168; PMCID: PMC7789687.

Littman AJ, Boyko EJ, Jacobson IG, Horton JL, Gackstetter GD, Smith B, Hooper TI, Amoroso PJ, Smith TC, for the Millennium Cohort Study Team. Assessing nonresponse bias at follow-up in a large prospective cohort of relatively young and mobile military service members. BMC Medical Research Methodology. 2010 Oct;10(1):99.

Rosenbaum, Paul R. & Rubin, Donald B. (1983). The central role of the propensity score in observational studies for causal effects. Biometrika 70 (1): 41–55.

Rizzo, L., Kalton, G., and Brick, M. (1996). A comparison of some weighting adjustment methods for panel nonresponse. Survey Methodology, 22, 43-53.

Ryan MA, Smith TC, Smith B, Amoroso P, Boyko EJ, Gray GC, Gackstetter GD, Riddle JR, Wells TS, Gumbs G, Corbeil TE, Hooper TI, for the Millennium Cohort Study Team. Millennium Cohort: enrollment begins a 21-year contribution to understanding the impact of military service. Journal of Clinical Epidemiology. 2007 Feb;60(2):181-91.

Smith, P.J., Rao, J.N.K., Battaglia, M.P., Ezzati-Rice, T.M., Daniels, D., & Khare, M. (2001). Compensating for provider nonresponse using response propensity to form adjustment cells: the National Immunization Survey. Vital and Health Statistics, 2(133).

Stapleton, L. M., Harring, J. R., & Lee, D. Y. (in press). Sampling weight considerations for multilevel modeling of panel data. In J. R. Harring, L. M. Stapleton, & S. N. Beretvas (Eds.), Advances in multilevel modeling for educational research: Addressing practical issues found in real-world applications. Charlotte, NC: Information Age Publishing, Inc.

Wells TS, Jacobson IG, Smith TC, Spooner CN, Smith B, Reed RJ, Amoroso PJ, Ryan MAK, for the Millennium Cohort Study Team. Prior health care utilization as a determinant to enrollment in a 22-year prospective study, the Millennium Cohort Study. European Journal Of Epidemiology. 2008 Feb;23(2):79-87.


4.  Tests of Procedures


Following preliminary focus group evaluations of the draft Millennium Cohort Study questionnaire, conducted in late 1999 with groups of fewer than 10 enlisted members and officers, a pilot study was conducted with a 1% sample of military personnel in the spring of 2000 to test the utility of the instrument. Following this pilot study, corrections were made to produce the final Millennium Cohort Study survey instrument.


Participant Feedback Survey: The purpose of the included participant feedback questionnaire is to gather information about participant recruitment and study retention, such as reasons for non-response, correlates of non-response, motivations to participate, acceptability of study communication methods, and recommendations for improvement.


Focus Group Testing: Along with the participant feedback survey described above, we will also conduct focus group testing. The purpose of the focus group testing is to determine effective communication and study marketing strategies to maximize participation rates in populations with demographics similar to Millennium Cohort Study participants; to ensure the clarity of our contact materials, including the text and overall visual format of emails, postcards, and survey packets; and to obtain feedback on the cost-saving initiatives currently offered.


Beta Testing: The Millennium Cohort Family Study is currently conducting a beta test of some of the items we are considering adding to the 2023 follow-up survey. All beta testing procedures have been reviewed and approved by the Naval Health Research Center Institutional Review Board. The study team will analyze the results from this small-group beta testing to determine the validity, utility and usability of the newly suggested survey items.


5.  Statistical Consultation and Information Analysis


For the Millennium Cohort Study:


Rudy Rull, Ph.D.

Principal Investigator

Deployment Health Research Department

Naval Health Research Center

[email protected]

(619) 553-9267



For the Family Study:


Valerie Stander, Ph.D.

Principal Investigator

Deployment Health Research Department

Naval Health Research Center

[email protected]

(619) 553-7174



The Research Team at the Department of Defense Center for Deployment Health Research, Naval Health Research Center, San Diego CA 92106-3521

[email protected]

(619) 553-7465


