
Supporting Statement - B for Paperwork Reduction Act (PRA)


Veteran Rapid Retraining Assistance Program (VRRAP) 30, 60, 90, 180-Day Experience Survey, and VRRAP Experience Survey After Employment

2900-XXXX



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


Data collection methods and procedures will vary; however, the primary purpose of this collection is internal management, with results shared with the Department of Labor. There are no plans to publish or otherwise release this information to others.


  1. Provide a numerical estimate of the potential respondent universe and describe any sampling or other respondent selection method to be used. Data on the number of entities (e.g., households or persons) in the universe and the corresponding sample are to be provided in tabular format for the universe as a whole and for each stratification. Indicate expected response rates. If this has been conducted previously, include the actual response rates achieved.


For the activities under this ICR submission, Table 2 depicts the estimated number of unique VRRAP customers within a month. Preliminary analysis of this population indicates that nearly all VRRAP participants have provided an email address. Because VBA EDU is congressionally mandated to reach out to every VRRAP participant, we are proposing to conduct a census for each of these surveys.

With current estimates, this would result in around 560 completed surveys from 3,096 invitations per year (approximately an 18% response rate). Because the VEO must also reach participants who enrolled during the first year of VRRAP, before this survey effort was underway (i.e., mid-March 2021 to mid/late May 2022), up to an additional 360 completes could be obtained from approximately 2,000 additional invitations during the first year of this survey effort.

To account for outreach to participants who preceded the launch of the survey, potential estimation errors, improvements in email collection, and changes in business volume, we are requesting approval for a maximum of 1,400 completes annually across the five surveys.

Table 2. Monthly Population and Survey Figures

| Surveys                                 | Estimated Total Population | Approximate Email Population | Available Population¹ | Expected Response Rate | Estimated Number of Respondents |
|-----------------------------------------|----------------------------|------------------------------|-----------------------|------------------------|---------------------------------|
| 30 Day Experience                       | 94                         | 94                           | 85                    | 18%                    | 15                              |
| 60 Day Experience                       | 80                         | 80                           | 72                    | 18%                    | 13                              |
| 90 Day Experience                       | 60                         | 60                           | 54                    | 18%                    | 10                              |
| 180 Day Experience                      | 12                         | 12                           | 11                    | 18%                    | 2                               |
| Experience After Meaningful Employment  | 12                         | 12                           | 11                    | 18%                    | 2                               |

¹ Excluding estimated duplicates and quarantined records (10% loss)
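As an illustrative check of the annualized figures cited above, the short Python sketch below multiplies the monthly population totals from Table 2 by twelve and applies the 18% expected response rate; the counts and rate are taken from the table, and the result matches the "around 560 completed surveys from 3,096 invitations" stated earlier. This is only a sketch of the arithmetic, not part of the production process.

```python
# Illustrative check of the annual invitation and completion estimates,
# using the monthly figures from Table 2 (a sketch, not production code).

monthly_total_population = {
    "30 Day Experience": 94,
    "60 Day Experience": 80,
    "90 Day Experience": 60,
    "180 Day Experience": 12,
    "Experience After Meaningful Employment": 12,
}

expected_response_rate = 0.18  # from Table 2

annual_invitations = 12 * sum(monthly_total_population.values())  # 12 * 258 = 3,096
annual_completes = annual_invitations * expected_response_rate    # ~557, i.e. "around 560"

print(f"Annual invitations: {annual_invitations}")
print(f"Expected annual completes: {annual_completes:.0f}")
```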

The target population of the VRRAP surveys is all VBA customers who reach any of five potential experience triggers:

  • 30 Day Experience: Experience 30 days after graduation/early termination in VRRAP

  • 60 Day Experience: Experience 60 days after graduation/early termination in VRRAP

  • 90 Day Experience: Experience 90 days after graduation/early termination in VRRAP

  • 180 Day Experience: Experience 180 days after graduation/early termination in VRRAP

  • Experience After Meaningful Employment: Experience after attaining meaningful employment in a high-demand field post-VRRAP.

The population data will be accessed directly from the VRRAP Salesforce application.


  2. Describe the procedures for the collection of information, including:


  • Statistical methodology for stratification and sample selection


Stratification is used to ensure that the sample matches the population, to the extent possible, across sub-populations. Since we are proposing a census, stratification is not required. The population for each survey will be drawn from the Salesforce application VBA EDU uses for this program. VEO data analysts will download the required fields from records that reached one of the five milestones within a two-week period. Any record with a valid email address will be included in the survey, and email invitations will be delivered to all selected customers. Selected respondents will be contacted within 15 days of their interaction and will have 14 days to complete the survey. Estimates will be accessible to data users immediately on the VSignals platform.
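The following Python sketch illustrates the selection step described above under stated assumptions: it assumes a record extract has already been pulled from the Salesforce application into a list of dictionaries, and the field names (record_id, email, milestone, milestone_date) are hypothetical placeholders rather than the actual Salesforce schema.

```python
# A minimal sketch of the biweekly selection step described above.
# Field names (record_id, email, milestone, milestone_date) are hypothetical
# placeholders; the actual Salesforce schema may differ.

from datetime import date, timedelta

def select_invitees(records, window_start, window_end):
    """Return records that hit a milestone in the two-week window
    and have a non-empty, plausibly formed email address."""
    selected = []
    for rec in records:
        in_window = window_start <= rec["milestone_date"] <= window_end
        has_email = bool(rec.get("email")) and "@" in rec["email"]
        if in_window and has_email:
            # Invitations go out within 15 days of the interaction; close_date
            # approximates the survey close, 14 days after the invitation deadline.
            invite_by = rec["milestone_date"] + timedelta(days=15)
            close_date = invite_by + timedelta(days=14)
            selected.append({**rec, "invite_by": invite_by, "close_date": close_date})
    return selected

# Example usage with made-up records
sample_records = [
    {"record_id": 1, "email": "vet@example.com", "milestone": "30 Day",
     "milestone_date": date(2022, 6, 3)},
    {"record_id": 2, "email": "", "milestone": "60 Day",
     "milestone_date": date(2022, 6, 5)},
]
print(select_invitees(sample_records, date(2022, 6, 1), date(2022, 6, 14)))
```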


  • Estimation procedure


For a given margin of error and confidence level, the sample size is calculated as follows (Lohr, 1999). For a large population, the equation below yields a representative sample for proportions:

    n0 = (z^2 × p × q) / e^2

where

  • z = 1.96, which is the critical Z score value under the normal distribution when using a 95% confidence level (α = 0.05).

  • p = the estimated proportion of an attribute that is present in the population, with q = 1 - p.

  • Note that pq attains its maximum when p = 0.5, and this value is often used for a conservative sample size (i.e., large enough for any proportion).

  • e = the desired level of precision; in the current case, the margin of error e = 0.03, or 3%. Also referred to as the MOE.

For a population that is relatively small, the finite population correction is used to yield a representative sample for proportions:

    n = n0 / (1 + n0 / N)

where

  • n0 = the representative sample for proportions when the population is large (from the equation above).

  • N = Population size.



The margin of error surrounding the baseline proportion is calculated as (a worked example follows the definitions below):

    MOE = z × sqrt((p × q) / n) × sqrt((N - n) / (N - 1))

where

  • z = 1.96, which is the critical Z score value under the normal distribution when using a 95% confidence level (α = 0.05).

  • N = Population size.

  • n = Representative sample.

  • p = the estimated proportion of an attribute that is present in the population, with q = 1 - p.
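As an illustration of these formulas, the sketch below computes the conservative sample size and the resulting margin of error using the stated values (z = 1.96, p = 0.5, e = 0.03) together with, purely as an example, the 30 Day Experience row of Table 2 (an available population of 85 and an estimated 15 respondents); the choice of that row is illustrative only.

```python
# Worked example of the sample-size and margin-of-error formulas above,
# using z = 1.96, p = 0.5, e = 0.03 and, as an illustration only,
# the 30 Day Experience row of Table 2 (N = 85, n = 15).

import math

z, p, e = 1.96, 0.5, 0.03
q = 1 - p

# Representative sample for a large population
n0 = (z**2 * p * q) / e**2                    # ≈ 1067

# Finite population correction for a small population (N = 85)
N = 85
n_required = n0 / (1 + n0 / N)                # ≈ 79

# Margin of error around a proportion given the expected respondents (n = 15)
n = 15
moe = z * math.sqrt(p * q / n) * math.sqrt((N - n) / (N - 1))  # ≈ 0.23, about ±23%

print(f"n0 = {n0:.0f}, n with FPC = {n_required:.0f}, MOE = {moe:.2%}")
```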



As described in the response to question 1 above, a census will be conducted for each of these surveys; with current estimates, this is expected to yield around 560 completed surveys from 3,096 invitations per year, with a requested maximum of 1,400 completes annually across the five surveys (see Table 2).


  • Degree of accuracy needed


Standard alpha and beta levels will be used; as noted above, analyses assume a 95% confidence level (α = 0.05).


  • Unusual problems requiring specialized sampling procedures


None known. However, to prevent errors and inconsistencies in the data and the analysis, quality control procedures will be instituted at several steps of the survey process. Records will undergo cleaning during population file creation. The quality control steps are as follows; an illustrative sketch of these checks is shown after the list.

  1. Records will be reviewed for missing data. When records with missing data are discovered, they will be either excluded from the population file when required or coded as missing.

  2. Any duplicate records will be removed from the population file to both maintain the probabilities of selection and prevent the double sampling of the same customer.

  3. Invalid emails will be removed.
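The following Python sketch illustrates the three cleaning steps above on a hypothetical population file; the field names and the simplified email-validity check are assumptions for the example, not the production cleaning rules.

```python
# Sketch of the population-file cleaning steps listed above:
# (1) handle missing data, (2) drop duplicates, (3) remove invalid emails.
# Field names are hypothetical; the real extract may differ.

import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # simplified validity check

def clean_population_file(records, required_fields=("record_id", "email", "milestone_date")):
    cleaned, seen_ids = [], set()
    for rec in records:
        # Step 1: exclude records missing required fields; other gaps stay coded as missing.
        if any(not rec.get(field) for field in required_fields):
            continue
        # Step 2: remove duplicates so no customer is sampled twice.
        if rec["record_id"] in seen_ids:
            continue
        # Step 3: remove records whose email address is not valid.
        if not EMAIL_RE.match(rec["email"]):
            continue
        seen_ids.add(rec["record_id"])
        cleaned.append(rec)
    return cleaned
```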

The survey sample loading and administration processes will have quality control measures built into them.

  1. The extracted sample will be reviewed for representativeness. A secondary review will be applied to the final respondent sample.

  2. The survey load process will be rigorously tested prior to the launch of the survey to ensure that sampled customers are not inadvertently dropped or sent multiple emails.

  3. The email delivery process will be monitored to ensure that bounce-back records do not hold up delivery.


  • Any use of less frequent than annual data collection to reduce burden


These surveys will be conducted four times per year, i.e., once per quarter for each respondent.


  3. Describe methods to maximize response rate and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


A final respondent sample should closely resemble the true population in terms of demographic distributions (e.g., age groups). One problem that arises in the survey collection process is nonresponse, defined as the failure of selected persons in the sample to provide responses. This occurs to varying degrees in all surveys, but the resulting estimates can be distorted when some groups are more or less prone to complete the survey; in many applications, younger people are less likely to participate than older persons. Another problem is under-coverage, which occurs when certain groups of interest in the population are not included in the sampling frame and therefore cannot be contacted; here, those without an email address will be excluded from the sampling frame. These two phenomena may cause some groups to be over- or under-represented. When the respondent population does not match the true population, conclusions drawn from the survey data may not be reliable and are said to be biased.



While we are not currently planning to weight the data, survey practitioners recommend the use of sample weighting to improve inference on the population. Weighting would be introduced into the survey process as a tool to help the respondent sample more closely represent the overall population; weighting adjustments are commonly applied in surveys to correct for nonresponse bias and coverage bias. Because a business rule will be implemented requiring participants to provide an email address, coverage bias for this survey is expected to decrease. In many surveys, however, differential response rates are observed across age groups. If some age groups are over-represented in the final respondent sample, the weighting application will yield somewhat smaller weights for those groups; conversely, age groups that are under-represented will receive larger weights. This adjustment is termed non-response bias correction for a single variable. Strictly speaking, we can never know how non-respondents would have answered, but the adjustment calibrates the sample to resemble the full population from the perspective of demographics. This may result in a substantial correction in the weighted survey estimates, compared to direct estimates, when non-response bias is non-negligible.



As reported earlier, the email population comprises 100% of the VRRAP population. This coverage is notably high considering that 88% of US veterans utilize email (National Telecommunications and Information Administration, 2020). It is assumed that the level of customer satisfaction is not directly related to email status (i.e., Missing at Random).



When implemented, weighting will utilize cell weights computed in real time with each query on the VSignals platform: each respondent's weight is calculated by dividing the target for a cell by the number of respondents in the cell. The weighting scheme will include, where possible, all the variables used for explicit stratification. However, cells will be collapsed if the proportion of the population is insufficient to reliably achieve a minimum of 3 completes per month. As a result, weights may be more detailed for larger population segments; for instance, because women are a smaller proportion of the VA population, women will have more collapsed cells than men.
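The sketch below illustrates this cell-weighting rule on a hypothetical set of age-by-gender cells. The rule (weight = cell target divided by respondents, with collapsing below 3 completes) comes from the description above, while the specific cell labels, targets, and counts are invented for the example.

```python
# Sketch of real-time cell weighting: weight = cell target / respondents in cell,
# collapsing cells that cannot reliably reach 3 completes per month.
# Cell labels, targets, and counts below are illustrative only.

MIN_COMPLETES = 3

def cell_weights(cells):
    """cells: dict mapping cell label -> {"target": population target,
    "respondents": completed surveys}. Returns a weight per retained cell."""
    keep, collapse = {}, {"target": 0, "respondents": 0}
    for label, c in cells.items():
        if c["respondents"] < MIN_COMPLETES:
            # Fold under-sized cells into a single combined cell.
            collapse["target"] += c["target"]
            collapse["respondents"] += c["respondents"]
        else:
            keep[label] = c
    if collapse["respondents"]:
        keep["collapsed"] = collapse
    return {label: c["target"] / c["respondents"] for label, c in keep.items()}

example = {
    "male_18_34": {"target": 40, "respondents": 7},
    "male_35_plus": {"target": 30, "respondents": 6},
    "female_18_34": {"target": 10, "respondents": 2},   # below threshold -> collapsed
    "female_35_plus": {"target": 5, "respondents": 2},  # below threshold -> collapsed
}
print(cell_weights(example))
```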



As part of the weighting validation process, the weights of persons in the age and gender groups are summed and verified to match the universe estimates (i.e., population totals). Additionally, we calculate the unequal weighting effect, or UWE (see Kish, 1992; Liu et al., 2002). This statistic indicates the amount of additional variation that may be expected due to the inclusion of weighting. The unequal weighting effect estimates the increase in the variance of the final estimate due to the presence of weights (a UWE of 1 indicates no increase) and is calculated as:

    UWE = 1 + cv^2

where

  • cv = s / w̄, the coefficient of variation of the weights.

  • s = sample standard deviation of the weights.

  • w̄ = sample mean of the weights.
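As a brief illustration, the sketch below computes the UWE for a small set of hypothetical weights; the weight values are invented for the example.

```python
# Compute the unequal weighting effect, UWE = 1 + cv^2, where cv = s / mean(w).
# The weights below are hypothetical example values.

import statistics

def unequal_weighting_effect(weights):
    cv = statistics.stdev(weights) / statistics.mean(weights)  # coefficient of variation
    return 1 + cv**2

example_weights = [3.75, 3.75, 5.71, 5.0]  # e.g., cell weights from the previous sketch
print(f"UWE = {unequal_weighting_effect(example_weights):.3f}")
```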


  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions of 10 or more individuals.

No tests have been completed for this item.


  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Melissa Mitchell, 202-437-6730, Statistician and Plan Creator, Veteran Experience Office (VEO)


Juan Jackson, 202-603-4374, Portfolio Management Directorate, Veteran Experience Office


Michael Napper, 202-632-9104, Education Service Product Owner, Executive Management Office


Schnell Carraway, 202-461-9362, Education Service, Chief, Strategic Initiatives and VRRAP Point of Contact


Bill Spruce, 202-461-9839, Team Lead, VRRAP Program Coordinator, and EDU Operations


The surveys will be received and reviewed by Education Service staff at the GS-13/05 level and by VEO contractors.


