
Armed Forces Workplace Equal Opportunity Survey

OMB: 0704-0631


SUPPORTING STATEMENT – PART B



  1. Description of the Activity

Describe the potential respondent universe and any sampling or other method used to select respondents.  Data on the number of entities covered in the collection should be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample.  Indicate the expected response rates for the collection as a whole, as well as the actual response rates achieved during the last collection, if previously conducted.

The purpose of the Armed Forces Workplace and Equal Opportunity (WEO) survey is to assess the attitudes and opinions of Active Duty and Reserve Component military members on racial/ethnic relations in the military, including their experiences of racial/ethnic harassment and discrimination in their military workplace, the climate surrounding those behaviors, and the reporting of them (IAW Title 10 USC §481). The DoD is committed to eliminating unlawful racial/ethnic discrimination and harassment within the Armed Forces (Department of Defense, 2020) and seeks to estimate the prevalence of these experiences among members as part of this effort.


The 2022 WEO will be administered to a sample of Active Duty and Reserve component members in order to meet this statutory requirement for the DoD. Additionally, the 2022 WEO will include a census of the Active and Reserve components of the United States Coast Guard.1 The WEO transitioned from a quadrennial to a biennial fielding for each population IAW SecDef’s (2020) immediate actions to improve diversity and inclusion in the military (the previous Reserve component surveys were administered in 2007, 2011, 2015, and 2019, while the previous Active Duty component surveys were administered in 2009, 2013, and 2017). For the DoD sample, the 2022 WEO survey will use single-stage non-proportional stratified random sampling to identify eligible participants in order to achieve precise estimates for important reporting categories (e.g., race/ethnicity, Service, pay grade).2 OPA uses a sampling tool developed by the Research Triangle Institute (RTI) to determine the sample size needed to achieve 95% confidence and an associated precision of 5% or less on each reporting domain. We select a single-stage, non-proportional stratified random sample to ensure a statistically adequate expected number of responses for the reporting categories (i.e., domains). OPA uses Service/Component, race/ethnicity, paygrade, and age as the stratification variables.
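As a point of reference for these constraints, the number of completed surveys a single domain requires can be approximated with the standard margin-of-error formula. This is a simplification that assumes simple random sampling within the domain and the most conservative prevalence (p = 0.5); the sampling tool itself accounts for the full stratified design:

```latex
n = \frac{z^2 \, p (1 - p)}{e^2}
  = \frac{1.96^2 \times 0.5 \times 0.5}{0.05^2}
  \approx 385 \text{ completed surveys per domain}
```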

Given anticipated eligibility and response rates, an optimization algorithm was used to determine the minimum-cost allocation that simultaneously satisfies the domain precision requirements for the DoD sample. Anticipated eligibility and response rates for the Active component are based on the 2017 WEO survey of Active Duty Members (Table 1), which had a weighted response rate of 15.5%. The total Active component sample size will be increased to approximately 252,000 in order to improve the precision of estimates for racial/ethnic minority respondents in this iteration, in light of the new exposure-to-extremism metric. This includes a sample of approximately 211,000 DoD Active Duty members and a census of approximately 41,000 Active Duty Coast Guard members. The total sample size for the Reserve component will be increased to 162,000 as compared to the 2015 WEO survey of Reserve Component Members. This includes a sample of approximately 150,000 DoD Reserve component members and a census of approximately 12,000 Coast Guard Reserve component members.3 This estimate is based on precision requirements for key reporting domains for this population.4 Anticipated eligibility and response rates for the Reserve component are largely based on the 2015 WEO survey of Reserve Component Members (Table 2), which had a weighted response rate of 18.8%.
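To illustrate how anticipated eligibility and response rates inflate the number of members who must be selected, consider the following minimal sketch. The function and its inputs are illustrative (combining the roughly 385 completes per domain implied above, the 15.5% weighted response rate cited here, and an assumed eligibility rate); they are not the actual Sample Planning Tool inputs:

```python
import math

def selected_sample_size(required_completes, eligibility_rate, response_rate):
    """Inflate the completed surveys needed for a domain to the number
    of members that must be selected, given anticipated rates."""
    return math.ceil(required_completes / (eligibility_rate * response_rate))

# Illustrative only: ~385 completes per domain at the 2017 Active Duty
# weighted response rate (15.5%) and an assumed 95% eligibility rate.
print(selected_sample_size(385, eligibility_rate=0.95, response_rate=0.155))
# -> 2615 members selected for that domain
```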

Because precise estimates on the percentage of reported racial/ethnic harassment/discrimination rates among even the smallest domains (e.g., Marine Corps racial/ethnic minorities) are required, a sizable sample is necessary.


Table 1.

2017 Active Component Sample Size by Stratification Variables


Table 2.

2015 Reserve Component Sample Size by Stratification Variables

  2. Procedures for the Collection of Information

Describe any of the following if they are used in the collection of information:

    a. Statistical methodologies for stratification and sample selection;

As described above, OPA uses a sampling tool developed by RTI to determine the sample size needed to achieve 95% confidence and an associated precision of 5% or less on each reporting domain. We use a single-stage, non-proportional stratified random sample to ensure a statistically adequate expected number of responses for the reporting domains. For the 2022 WEO, OPA will use Service/Component, race/ethnicity, paygrade, and age to define the initial strata for DoD Service members. Within each stratum, individuals will be selected with equal probability and without replacement. However, because the allocation of the sample will not be proportional to the size of the strata, selection probabilities will vary among strata, and individuals will not be selected with equal probability overall. Non-proportional allocation will be used to achieve adequate sample sizes for small subpopulations of analytic interest for the survey reporting domains. These domains will include subpopulations defined by the stratification characteristics, as well as others.
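As a concrete illustration of this selection step, the sketch below draws an equal-probability, without-replacement sample within each stratum under a non-proportional allocation. It assumes a hypothetical pandas DataFrame `frame` with a precomputed `stratum` label (the cross of Service/Component, race/ethnicity, paygrade, and age) and a hypothetical per-stratum allocation `allocation`; it is not OPA's production code:

```python
import pandas as pd

def stratified_sample(frame: pd.DataFrame, allocation: dict,
                      seed: int = 0) -> pd.DataFrame:
    """Single-stage stratified random sample: equal probability and
    without replacement within each stratum. Because the allocation is
    non-proportional, selection probabilities differ across strata."""
    pieces = []
    for stratum, n_h in allocation.items():
        members = frame[frame["stratum"] == stratum]
        pieces.append(members.sample(n=n_h, replace=False, random_state=seed))
    return pd.concat(pieces, ignore_index=True)
```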

The OPA Sample Planning Tool, Version 2.1 (Dever and Mason, 2003) will be used to accomplish the allocation. This application is based on the method originally developed by J. R. Chromy (1987) and described in Mason, Wheeless, George, Dever, Riemer, and Elig (1995). The Sample Planning Tool defines domain variance equations in terms of unknown stratum sample sizes and user-specified precision constraints. A cost function is defined in terms of the unknown stratum sample sizes and the per-unit costs of data collection, editing, and processing. The variance equations are solved simultaneously, subject to the constraints imposed, for the sample sizes that minimize the cost function. Eligibility rates modify the estimated prevalence rates that are components of the variance equations, thus affecting the allocation; response rates inflate the allocation, thus affecting the final sample size.
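In stylized form, the allocation can be expressed as a constrained minimization: choose stratum sample sizes that minimize total cost subject to the variance constraint on each domain estimate. The following is a minimal sketch with illustrative inputs for three strata and a single overall domain; it is not the Chromy algorithm as implemented in the Sample Planning Tool, and all values shown are assumptions:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative inputs (not actual WEO values):
W = np.array([0.5, 0.3, 0.2])     # stratum population shares
p = np.array([0.20, 0.25, 0.30])  # anticipated prevalence per stratum
c = np.array([1.0, 1.0, 1.2])     # per-unit cost of collection/processing
target_se = 0.05 / 1.96           # 5% precision at 95% confidence

def cost(n):
    return np.sum(c * n)

def variance_slack(n):
    # Stratified variance of the overall proportion (no finite population
    # correction); the constraint requires this slack to be non-negative.
    return target_se**2 - np.sum(W**2 * p * (1 - p) / n)

result = minimize(cost, x0=np.full(3, 200.0), method="SLSQP",
                  bounds=[(10, None)] * 3,
                  constraints=[{"type": "ineq", "fun": variance_slack}])
n_h = np.ceil(result.x)  # completes per stratum; these would then be
# inflated by anticipated eligibility and response rates.
```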

    b. Estimation procedures;

The eligible respondents will be weighted in order to make inferences about the entire Active Duty and Reserve component populations, separately for the DoD and the Coast Guard. We will utilize the industry-standard three-stage process to weight the data. In the first stage, each sampled member will be assigned a base weight equal to the reciprocal of their selection probability.5 In the second stage, the base weights will be adjusted for nonresponse, accounting for both eligibility and completion of the survey. In the third stage, the weights will be post-stratified to known population totals to reduce bias in the estimates. Variance strata will then be created so precision measures can be associated with each estimate. Estimates will be produced for reporting categories using 95% confidence intervals, with the goal of achieving a precision of 5% or less.
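The three weighting stages might look like the following minimal sketch. The DataFrame columns (`selection_prob`, `eligible`, `completed`, `weighting_class`, `post_stratum`) and the dictionary of known population totals are hypothetical names for illustration, not OPA's actual variables or procedures:

```python
import pandas as pd

def weight_respondents(df: pd.DataFrame, post_totals: dict) -> pd.DataFrame:
    # Stage 1: base weight is the reciprocal of the selection probability
    # (1.0 for a census, e.g., the Coast Guard).
    df = df.copy()
    df["weight"] = 1.0 / df["selection_prob"]

    # Stage 2: nonresponse adjustment. Within each weighting class,
    # shift the weight of eligible nonrespondents onto eligible respondents.
    for _, cls in df.groupby("weighting_class"):
        eligible = cls[cls["eligible"]]
        respondents = eligible[eligible["completed"]]
        factor = eligible["weight"].sum() / respondents["weight"].sum()
        df.loc[respondents.index, "weight"] *= factor

    # Stage 3: post-stratify respondent weights to known population totals.
    is_resp = df["eligible"] & df["completed"]
    for stratum, total in post_totals.items():
        mask = is_resp & (df["post_stratum"] == stratum)
        df.loc[mask, "weight"] *= total / df.loc[mask, "weight"].sum()
    return df[is_resp]
```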

    c. Degree of accuracy needed for the Purpose discussed in the justification;

OPA allocated the DoD sample to achieve reliable precision on estimates for outcomes associated with reporting racial/ethnic harassment/discrimination and other measures that are asked of only a very small subset of members, especially racial/ethnic minorities. Given estimated variable survey costs and anticipated eligibility and response rates, OPA used an optimization algorithm to determine the minimum-cost allocation that simultaneously satisfies the domain precision requirements. Response rates from previous surveys were used to estimate eligibility and response rates for all strata.

The allocation precision constraints will be imposed only on those domains of primary interest. Generally, the precision requirement is a 95% confidence interval with an associated precision of 5% or less on each reporting category. Constraints were adjusted to produce an allocation that will achieve satisfactory precision for the domains of interest at the target sample sizes of approximately 211,000 members for the Active component and approximately 150,000 members for the Reserve component in the DoD sample.

    d. Unusual problems requiring specialized sampling procedures; and

None.

    e. Use of periodic or cyclical data collections to reduce respondent burden.

The Secretary of Defense was given the responsibility of conducting two biennial surveys, as outlined in Title 10 USC §481, to identify and assess racial and ethnic issues and discrimination. The WEO surveys occur on a biennial basis (i.e., every other year) in accordance with the congressional mandate.





  3. Maximization of Response Rates, Non-response, and Reliability

Discuss methods used to maximize response rates and to deal with instances of non-response.  Describe any techniques used to ensure the accuracy and reliability of responses is adequate for intended purposes.  Additionally, if the collection is based on sampling, ensure that the data can be generalized to the universe under study.  If not, provide special justification.

To reduce respondent burden, web-based surveys use “smart skip” technology to ensure respondents only answer questions that are applicable to them. To maximize response rates, OPA offers the survey via the web and uses reminder emails to encourage participation throughout the survey’s fielding window. Email reminders will be sent to sample members until they respond or indicate that they no longer wish to be contacted, and a postal notification will be sent early in the fielding period (i.e., mailed 2-3 weeks after the start of fielding). The outreach communications include text highlighting the importance of the surveys and signatures from senior DoD leadership.

The sample sizes were determined based on prior response rates from similar Active Duty and Reserve component surveys. Given the anticipated response rates and the sampling procedures employed, OPA predicts enough responses will be received within all important reporting categories to make estimates that meet confidence and precision goals. To deal with instances of nonresponse, OPA adjusts for nonresponse in the weighting methodology. To ensure the accuracy and reliability of responses, OPA conducts a nonresponse bias (NRB) analysis every third survey cycle and will conduct one in 2022. Historically, OPA has found little evidence of significant NRB in these studies; however, OPA statisticians consider the risk of NRB high and regard it as likely the largest source of error in OPA surveys. OPA uses probability sampling and appropriate weighting to ensure the survey data can be generalized to the universe under study.



  4. Tests of Procedures

Describe any tests of procedures or methods to be undertaken.  Testing of potential respondents (9 or fewer) is encouraged as a means of refining proposed collections to reduce respondent burden, as well as to improve the collection instrument utility.  These tests check for internal consistency and the effectiveness of previous similar collection activities.

In accordance with the DoD Survey Burden Action Plan, we went to great lengths to develop a survey instrument that collects only the information required to meet the congressional mandate and to support policy and program development and/or assessments. To accomplish this, we reviewed data from the 2017 WEO survey of the Active component and the 2019 WEO survey of the Reserve component to identify items that did not perform well, for which very little usable data were collected, or that did not support information requirements. This process allowed us to cut approximately 54 questions from the most recent WEO survey fielded by the Department (the 2019 WEOR; from 153 questions down to 99). We also collaborated with leaders from each of the relevant policy offices (i.e., the Office for Diversity, Equity, and Inclusion [ODEI] and the Office of Force Resiliency [OFR]) to identify their critical information needs and research questions. This led us to identify additional items to remove from the survey, as well as new items to address important gaps in knowledge regarding responses to racial/ethnic harassment and discrimination, including workplace climate factors that contribute to these experiences for military members.

As noted above, the WEO survey involves complex skip patterns that substantially reduce the number of questions respondents will be asked based on their answers to previous questions. Because of these skips, we anticipate that the vast majority of respondents will NOT see all of the follow-up questions that are included in several sections of the survey (e.g., questions asking respondents to describe the characteristics of their negative experience). The current survey is anticipated to take 20-30 minutes for most participants to complete.

In addition, we will conduct a methodological communications experiment assessing the impact of sender display name, email content, and the timing of specific emails on response rates throughout the fielding period. The test will include two groups (a control group and an experimental group). Based on findings from an experiment conducted on the 2019 WEOR survey communications, in which less specific communications yielded higher response rates, email content for the control group will provide generic information regarding the survey content and will not reference the name of the survey. In the experimental group, email communications will vary both the content of the emails and the sender displayed in recipients’ inboxes to determine the combination of sender name and email content with the greatest impact on response rates; research with other populations suggests that starting with generic content and becoming more specific throughout fielding may be helpful, as may using known policy officials to promote surveys later in fielding. Results from this experiment will allow us to tailor communications on future OPA surveys to maximize response rates.



  5. Statistical Consultation and Information Analysis

    a. Provide names and telephone number of individual(s) consulted on statistical aspects of the design.

    b. Provide name and organization of person(s) who will actually collect and analyze the collected information.

Data will be collected by Data Recognition Corporation, OPA’s operations contractor. Contact information is listed below.

  • Ms. Valerie Waller, Senior Managing Director, Data Recognition Corporation, [email protected]

Data will be analyzed by OPA social scientists and analysts. Contact information is listed below.

  • Dr. Samantha Daniel, Chief of Diversity & Inclusion Research, Health & Resilience Research Division – OPA, [email protected]

  • Dr. Adon Neria, Senior Research Psychologist, Health & Resilience Research Division – OPA, [email protected]

  • Dr. Ashlea Klahr, Director, Health & Resilience Research Division – OPA, [email protected]

  • Ms. Lisa Davis, Deputy Director, Health & Resilience Research Division – OPA, [email protected]

  • Ms. Jess Tercha, Senior Researcher – Fors Marsh Group, [email protected]

  • Ms. Ariel Hill, Operations Analyst – Fors Marsh Group, [email protected]

  • Ms. Alycia White, Operations Analyst – Fors Marsh Group, [email protected]

  • Mr. DaCota Hollar, Operations Analyst – Fors Marsh Group, [email protected]

  • Ms. Amanda Barry, Director, Military Workplace Climate Research – Fors Marsh Group, [email protected]

  • Ms. Margaret H. Coffey – Fors Marsh Group, [email protected]



1 At the time of this submission, inclusion of personnel from the Active and Reserve Components of the U.S. Coast Guard is tentative pending approvals and funding from the DHS.

2 We do not anticipate needing to stratify the Coast Guard because it will be a census due to the small number of military personnel within DHS.

3 Coast Guard Reserve component members were not included in the 2015 WEOR.

4 Although the 2019 survey of Reserve members was most recently administered, our sampling approach for the 2022 survey is most similar to the survey administered in 2015. In 2019, a census of DoD Reserve component members was conducted, though members were assigned through random stratified sample procedures to receive either the 2019 Status of Forces, Workplace and Gender Relations, or WEO survey. Thus, the 2019 WEO Reserve component sample was larger than precision estimates warrant for the 2022 WEO due to the need to assign every member to take one survey.

5 Note that for the Coast Guard, each respondent will be assigned a base weight of 1, given that a census will be conducted.


