
SURVEY (PL 114-315 Section 414) OF INDIVIDUALS USING THEIR ENTITLEMENT TO EDUCATIONAL ASSISTANCE UNDER THE EDUCATIONAL ASSISTANCE PROGRAMS ADMINISTERED BY THE SECRETARY OF VETERANS AFFAIRS

OMB: 2900-0887


Survey of Individuals Using Their Entitlement to Educational Assistance under the Educational Assistance Programs Administered by the Secretary of Veterans Affairs

OMB 2900-0887


SUPPORTING STATEMENT B


Collection of Information Employing Statistical Methods



  1. Provide a numerical estimate of the potential respondent universe and describe any sampling or other respondent selection method to be used. Data on the number of entities (e.g., households or persons) in the universe and the corresponding sample are to be provided in tabular format for the universe as a whole and for each stratum. Indicate expected response rates. If this collection has been conducted previously, include the actual response rates achieved.


Target Population:

The target population of the Educational Assistance Program Feedback Survey is defined as any beneficiary who used benefits under the four covered programs (Chapter 33, Chapter 30, Chapter 32, or Chapter 35) since 2014.


The sample frame is prepared by extracting population information from the VBA Annual Report (see Table 1).


Table 1. Number of Educational Assistance Beneficiaries per Year, 2014–2019

| Education Program | 2014 | 2015 | 2016 | 2017 | 2018 | 2019 | Total |
|---|---|---|---|---|---|---|---|
| Post-9/11 GI Bill (Chapter 33) | 790,408 | 790,507 | 790,090 | 755,476 | 708,069 | 714,346 | 4,548,896 |
| Montgomery GI Bill – Active Duty (Chapter 30) | 77,389 | 61,403 | 47,307 | 34,582 | 26,441 | 22,166 | 269,288 |
| VEAP (Chapter 32) | 8 | 4 | 4 | 1 | 69 | 78 | 164 |
| DEA (Chapter 35) | 90,789 | 91,755 | 96,762 | 100,275 | 109,760 | 128,075 | 617,416 |
| Total | 958,594 | 943,669 | 934,163 | 890,334 | 844,339 | 864,665 | 5,435,764 |


Frame and Stratification:

The education beneficiary is the primary sampling unit and is randomly selected from the population according to a stratified design with a fixed allocation. The strata are defined by education benefit program type, as listed in Table 2. To ensure demographic representation, sampling within each stratum is also proportional with regard to age group and gender.


  2. Describe the procedures for the collection of information, including:


  • Statistical methodology for stratification and sample selection

  • Estimation procedure

  • Degree of accuracy needed

  • Unusual problems requiring specialized sampling procedures

  • Any use of less frequent than annual data collection to reduce burden




Sample Size Determination:

To achieve a given level of reliability, the required sample size is calculated as follows (Lohr, 1999).

For a large population, the equation below yields a representative sample for proportions:

n_0 = \frac{z^2 pq}{e^2}

where

  • z = the critical Z score, which is 1.96 under the normal distribution when using a 95% confidence level (α = 0.05).

  • p = the estimated proportion of an attribute that is present in the population, with q = 1 − p.

  • Note that pq attains its maximum at p = 0.5, or 50%. This value is typically assumed in surveys where multiple measures are of interest; for measures closer to 100% or 0%, a smaller sample achieves the same margin of error.

  • e = the desired level of precision, or margin of error. For example, for the Post-9/11 GI Bill survey the targeted margin of error is e = 0.03, or ±3%.

For a relatively small population, the finite population correction is applied to yield a representative sample for proportions:

n = \frac{n_0}{1 + n_0 / N}

where

  • n_0 = the representative sample for proportions when the population is large.

  • N = population size.

The margin of error surrounding the baseline proportion is calculated as:

e = z \sqrt{\frac{pq}{n}} \sqrt{\frac{N - n}{N - 1}}

where

  • z = 1.96, the critical Z score under the normal distribution when using a 95% confidence level (α = 0.05).

  • N = population size.

  • n = representative sample size.

  • p = the estimated proportion of an attribute that is present in the population, with q = 1 − p.
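To make these calculations concrete, the following is a minimal Python sketch of the three formulas above. The function names are illustrative, and p = 0.5 is the conservative default described earlier.

```python
import math

def sample_size_large(z=1.96, p=0.5, e=0.03):
    """Representative sample for proportions, large population: n0 = z^2*p*q/e^2."""
    q = 1 - p
    return (z ** 2) * p * q / (e ** 2)

def sample_size_fpc(n0, N):
    """Finite population correction: n = n0 / (1 + n0/N)."""
    return n0 / (1 + n0 / N)

def margin_of_error(n, N, z=1.96, p=0.5):
    """Margin of error around a proportion, with finite population correction."""
    q = 1 - p
    return z * math.sqrt(p * q / n) * math.sqrt((N - n) / (N - 1))

# Worked example with the survey's parameters (z = 1.96, p = 0.5, e = 0.03):
n0 = sample_size_large()               # ~1,067 completed responses required
n = sample_size_fpc(n0, N=5_435_764)   # FPC is negligible for a population this large
e = margin_of_error(1_080, 5_435_764)  # ~0.0298, within the +/-3% target
```

This reproduces the basis for the survey's target: roughly 1,067 completed responses are needed for a ±3% margin of error at 95% confidence, which the 1,080-respondent target comfortably covers.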



Table 1 depicts the population figures for the education benefit population from 2014 through 2019. The sample size was calculated to ensure a margin of error of no more than ±3% at a 95% confidence level, a standard for reliability widely used in the survey industry (Lohr, 1999). The Educational Assistance Program Feedback Survey aims to collect data from approximately 1,080 respondents.


For this study, we assume a fairly low response rate of 15%, based on past experience surveying this relatively young population for the VSignals Customer Experience Education Call Center (ECC) Survey. To offset non-response, VA will initiate contact with 7,200 beneficiaries (1,080 / 0.15 = 7,200) to attain the target sample size.


Stratification is used to ensure that the sample matches the population, to the extent possible, across sub-populations. The sample is stratified by educational benefit program type (Chapters 33, 30, 32, and 35; see Table 2).


Table 2. Sample Targets by Education Program Type

| Educational Benefit Program Type | Population | Survey Sample | Email/Mail Invitations Sent | Expected Response Rate |
|---|---|---|---|---|
| Chapter 33 | 4,548,896 | 900 | 6,000 | 15% |
| Chapters 30, 32, 35 | 886,868 | 180 | 1,200 | 15% |
| Total | 5,435,764 | 1,080 | 7,200 | 15% |



The sample will be drawn using a systematic sampling methodology. This statistically valid approach allows the team to balance the sample across several variables, such as age and gender. These balancing variables are often referred to as implicit strata. The survey will leverage this capability because, though the effect on the margin of error is difficult to measure, this methodology has been shown to improve the accuracy of estimates, stabilize weights, and reduce the variability that makes trends difficult to interpret.
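As an illustration only, the Python sketch below shows one common form of systematic selection with implicit stratification: the frame is sorted on the balancing variables and every k-th record is taken from a random start. The field names age_group and gender are hypothetical.

```python
import random

def systematic_sample(frame, n, sort_keys=("age_group", "gender")):
    """Systematic selection with implicit strata: sort the frame on the
    balancing variables, then take every k-th record from a random start."""
    ordered = sorted(frame, key=lambda rec: tuple(rec[k] for k in sort_keys))
    k = len(ordered) / n          # fractional sampling interval
    start = random.uniform(0, k)  # random start within the first interval
    return [ordered[int(start + i * k)] for i in range(n)]
```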

Each email address encountered is validated in several ways (a code sketch follows this list):

  • Validation that the email address has a valid structure

  • Comparison with a database of bad domains

  • Correction of common domain spellings

  • Comparison with a database of bad emails including:

    • Opt outs

    • Email addresses held by multiple beneficiaries

  • Comparison to a database of valid TLDs (top-level domains, e.g., “.com”, “.edu”)
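A minimal Python sketch of these checks appears below; the reference lists (bad domains, bad emails, domain corrections, valid TLDs) are illustrative stand-ins for the databases described above.

```python
import re

# Illustrative reference lists standing in for the production databases.
BAD_DOMAINS = {"bounce.example"}
BAD_EMAILS = {"optout@example.com"}  # opt-outs, shared addresses, etc.
DOMAIN_FIXES = {"gmial.com": "gmail.com", "yaho.com": "yahoo.com"}
VALID_TLDS = {"com", "edu", "gov", "mil", "net", "org"}

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_email(address):
    """Apply the checks listed above; return (is_valid, cleaned_address)."""
    address = address.strip().lower()
    if not EMAIL_RE.match(address):                  # structural validation
        return False, address
    local, domain = address.rsplit("@", 1)
    domain = DOMAIN_FIXES.get(domain, domain)        # fix common domain misspellings
    address = f"{local}@{domain}"
    if domain in BAD_DOMAINS or address in BAD_EMAILS:
        return False, address                        # known-bad domain or address
    if domain.rsplit(".", 1)[-1] not in VALID_TLDS:  # TLD check
        return False, address
    return True, address
```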

Weighting is commonly applied in surveys to adjust for nonresponse bias and/or coverage bias. Nonresponse is defined as the failure of selected persons in the sample to provide responses. This occurs in virtually all surveys, as some groups are more or less likely to complete the survey; as a result, some groups may be over- or under-represented. Coverage bias is another common survey problem, in which certain groups of interest in the population are not included in the sampling frame. These beneficiaries may not participate because they cannot be contacted (i.e., no email address is available). In both cases, the exclusion of these beneficiaries from the survey contributes to measurement error. The extent to which the final survey estimates are skewed depends on the nature of the data collection processes within an individual line of business and the potential alignment between beneficiary sentiment and likelihood to respond.

Survey practitioners recommend the use of sample weighting to improve inference on the population, so that the final respondent sample more closely resembles the true population. Differential response rates will likely be observed across age and gender groups. Weighting can adjust for demographic representation by assigning larger weights to under-represented groups and smaller weights to over-represented groups. Stratification can also be used to adjust for nonresponse by oversampling the subgroups with lower response rates. With both adjustments, weighting may result in substantial corrections to the final survey estimates, compared to direct estimates, in the presence of non-negligible sampling error.
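The sketch below illustrates the basic adjustment described here, using hypothetical demographic shares: each cell's weight factor is its population share divided by its respondent share.

```python
def nonresponse_adjustment(pop_share, resp_share):
    """Weight factor per demographic cell: population share / respondent share.
    Under-represented cells get factors > 1, over-represented cells < 1."""
    return {cell: pop_share[cell] / resp_share[cell] for cell in pop_share}

# Hypothetical example: a group that is 30% of the population but only 20%
# of respondents is up-weighted by 1.5; the other group is down-weighted.
factors = nonresponse_adjustment({"18-24": 0.30, "25+": 0.70},
                                 {"18-24": 0.20, "25+": 0.80})
# factors == {"18-24": 1.5, "25+": 0.875}
```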

The Educational Assistance Program Feedback Survey will also rely on what are often referred to as design weights—weights that correct for disproportional sampling where respondents have different probabilities of selection. Therefore, the weights are applied to make the explicit strata (Benefit Type) proportional to the number of beneficiaries.

If we let w_ij denote the sample weight for the ith person in group j (j = 1, 2, and 3), then the weighting formula is:

w_{ij} = \frac{N_j}{n_j}

where N_j is the population count of group j and n_j is the number of respondents in group j.
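Under the formula above, a design weight is simply the stratum's population count divided by its respondent count. Below is a short sketch using Table 2's figures, treating the sample targets as respondent counts; the labels are illustrative.

```python
def design_weights(population, respondents):
    """Design weight per stratum j: w_j = N_j / n_j, so that weighted
    respondents sum to each stratum's population count."""
    return {j: population[j] / respondents[j] for j in population}

weights = design_weights(
    {"Chapter 33": 4_548_896, "Chapters 30/32/35": 886_868},
    {"Chapter 33": 900, "Chapters 30/32/35": 180},
)
# Each Chapter 33 respondent represents ~5,054 beneficiaries;
# each Chapters 30/32/35 respondent represents ~4,927.
```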

As part of the weighting validation process, the weights of persons in each age and gender group are summed and verified to match the universe estimates (i.e., population proportions). Additionally, we calculate the unequal weighting effect, or UWE (see Kish, 1992; Liu et al., 2002). This statistic indicates the amount of variation that may be expected due to the inclusion of weighting. The UWE estimates the percent increase in the variance of the final estimate due to the presence of weights and is calculated as:

\mathrm{UWE} = 1 + cv^2

where

  • cv = s / \bar{w}, the coefficient of variation of the weights.

  • s = sample standard deviation of the weights.

  • \bar{w} = sample mean of the weights w_ij.
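A minimal Python sketch of the UWE computation, assuming the formula above:

```python
import statistics

def unequal_weighting_effect(weights):
    """UWE = 1 + cv^2, where cv = s / w_bar (Kish, 1992)."""
    w_bar = statistics.mean(weights)
    s = statistics.stdev(weights)  # sample standard deviation
    return 1 + (s / w_bar) ** 2

# Equal weights give UWE = 1.0 (no variance inflation from weighting);
# more variable weights push the UWE, and hence the variance, upward.
```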


  3. Describe methods to maximize response rate and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


Recruitment emails and letters will be sent to 7,200 beneficiaries who used benefits under the four covered programs (Chapter 33, Chapter 30, Chapter 32, or Chapter 35) since 2014 (n = 6,000 Chapter 33; n = 1,200 Chapters 30, 32, and 35). The survey will use data extracted directly from the Corporate Data Warehouse (CDW). Beneficiaries will have two weeks to complete the survey. A reminder email will be sent after one week to non-respondents, reminding them that the survey is available for another week. The questionnaire was designed to minimize respondent burden through its length (approximately 10 minutes to complete) and the proper use of branching and skip patterns. If the target sample size of 1,080 responses is not attained, additional survey invitations will be sent via email and regular mail.


Beneficiaries are selected to participate in the survey via an invitation email or mailed letter. A link is enclosed in either the email or the letter so that all participants may complete the survey using an online interface. Beneficiaries will also have the option of returning the survey by mail.


Finally, the contractor will send surveys to 7,200 veterans in anticipation of a 15% response rate. Additional survey invitations will be sent should the targeted sample size not be attained from the initial invitation.

See also the attached Survey Sample Plan.


  4. Describe any tests of procedures or methods to be undertaken. Testing is an effective means of refining collections to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions of 10 or more individuals.


After the survey is programmed in a survey software platform, pilot tests will be conducted to improve utility.


  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

VA:

Mr. Don Ortega

MITRE:

Dr. Micah Roediger, 703-937-4251

Mr. Jason Carley, 571-308-9048

Dr. Juli Simon-Thomas, 703-983-1251

Dr. Amber Sprenger, 703-983-4717

Mr. Zach Mastrich, 609-477-2312






