Post Separation TAP Assessment Survey

OMB: 2900-0864


Post Separation Transition Assistance Program (PSTAP) Assessment


Supporting Statement B:

Collections of Information Employing Statistical Methods



  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate the expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.


The design for this study is to conduct a census of all Veterans at three points after separation from service, making eligible all Veterans who meet the time-since-separation criteria. After the initial data collection from these three cohorts establishes baseline (cross-sectional) estimates from the PSTAP Assessment, the respondents who opt in to future waves of data collection shall become the Veterans participating in this longitudinal study, to be contacted no more than once per year.


To create these three cohorts, at least six (6) months of separation records shall be requested in the month prior to fielding the survey for each cohort. Although not all of these Veterans shall be included in the study, the possibility of printer vendor delays (external to VA and the Contractor) necessitates requesting extra records, as an overall efficiency, so that sufficient records are available to be restricted prior to mailing.


Three cohorts have been identified by VA as the Veterans of interest in this study. The reference points after separating from service for the three cohorts shall approximate six (6) months, one (1) year, and three (3) years, as outlined below:

Cohort 1:

Veterans who separated 5-6 months prior to fielding

(2 months of annual population)


Cohort 2:

Veterans who separated 11-12 months prior to fielding

(2 months of annual population)


Cohort 3:

Veterans who separated 35-36 months prior to fielding

(2 months of annual population)





The population size for these three cohorts was estimated using the most recent separation numbers from the Department of Defense. In 2016, there were 211,668 Servicemembers who became Veterans. Given the low response rates and inexact mailing dates, two months of data are required. This equates to about 35,000 for each of the three eligible cohorts (i.e., approximately 210,000 annual Servicemembers who become Veterans equates to 35,000 every two months). These Veterans will be sent an advance letter signed by VA Senior Leadership inviting them to participate in the PSTAP Assessment through a secure portal on the VA.gov website. Due to the heightened political attention to this topic, potential respondents are more likely to reply if the person signing the advance letter is a known individual.
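For reference, the per-cohort population arithmetic works out as follows:

\[
\frac{211{,}668 \text{ separations per year}}{12 \text{ months}} \times 2 \text{ months} \approx 35{,}278 \approx 35{,}000 \text{ Veterans per cohort}
\]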


The table below shows the estimated response rates at the three stages of data collection: (1) Web invitation with advance letter; (2) mailed paper survey package; and (3) a postcard as the third and final reminder to participate electronically via the Web portal on VA.gov. These response rates are based on the FY17 response rates for electronic and paper returns for the Compensation Enrollment survey from the VOV data. The response rate for the final reminder is estimated at half that of the initial invitation. Although these estimated response rates are lower than those achieved in the TVMI study, it should be noted that the TVMI study is not federal and uses incentives ranging from $5 to $50, which drive higher response rates.


Response Rate Estimates

| Cohort   | Web invitation mailed | Web returns (5% response rate) | Paper surveys mailed (less Web returns) | Paper returns (10% response rate) | Postcard Web reminders mailed (less Web and paper returns) | Reminder Web returns (2.5% response rate) |
|----------|-----------------------|--------------------------------|------------------------------------------|-----------------------------------|-------------------------------------------------------------|--------------------------------------------|
| Cohort 1 | 35,000                | 1,750                          | 33,250                                   | 3,325                             | 29,925                                                      | 748                                        |
| Cohort 2 | 35,000                | 1,750                          | 33,250                                   | 3,325                             | 29,925                                                      | 748                                        |
| Cohort 3 | 35,000                | 1,750                          | 33,250                                   | 3,325                             | 29,925                                                      | 748                                        |
| Total    | 105,000               | 5,250                          | 99,750                                   | 9,975                             | 89,775                                                      | 2,244                                      |
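The stage-by-stage figures above follow mechanically from the stated response rates. The short sketch below (illustrative only, not the official computation; truncating fractional returns is an assumption made to match the published figures) reproduces the cascade:

```python
# Illustrative sketch: reproduce the Response Rate Estimates table above
# from the stated per-stage response rates.
COHORT_SIZE = 35_000
STAGES = [
    ("Web invitation with advance letter", 0.05),
    ("Paper survey (less Web returns)", 0.10),
    ("Postcard Web reminder (less Web and paper returns)", 0.025),
]

def cascade(start: int) -> list[tuple[str, int, int]]:
    """Return (stage, contacted, returns) rows for one cohort."""
    rows, n = [], start
    for stage, rate in STAGES:
        returns = int(n * rate)      # truncate, matching the table (748 from 29,925)
        rows.append((stage, n, returns))
        n -= returns                 # only nonrespondents receive the next contact
    return rows

per_cohort = cascade(COHORT_SIZE)
for stage, contacted, returns in per_cohort:
    print(f"{stage}: {contacted:,} contacted, {returns:,} returns")

total_year1 = 3 * sum(returns for _, _, returns in per_cohort)
print(f"Year 1 responses across three cohorts: {total_year1:,}")  # 17,469
```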



The PSTAP Assessment will be fielded no more than once in a 12-month period. For the purposes of this ICR approval of a new collection, the three-year average annual burden calculation is shown in the table below. Attrition is a concern in all longitudinal studies; it can often be mitigated by techniques such as incentives, personal interviews, and telephone reminders, but these carry an additional monetary burden. Therefore, the second and third waves of data collection (i.e., Year 2 and Year 3) are estimated using data from a longitudinal VA survey of Veterans.


Average Annual Hourly Burden Calculation

|                | Retention Rate* | Number of Responses | Minutes | Hourly Burden |
|----------------|-----------------|---------------------|---------|---------------|
| Year 1         | n/a             | 17,469              | 18.5    | 5,386         |
| Year 2         | 74.4%           | 12,997              | 18.5    | 4,007         |
| Year 3         | 60.1%           | 10,499              | 18.5    | 3,237         |
| Average Annual |                 |                     |         | 4,210         |


* Retention rates are derived from the attrition rates reported in the VR&E longitudinal study for follow-up at one and two years. The upper bound of the 95% confidence interval was used to estimate the maximum burden to the public for this ICR.
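As a check on the table above, each year's hourly burden follows from the response counts, the retention rates, and the 18.5-minute completion time:

\[
\begin{aligned}
\text{Year 1: } & 17{,}469 \times 18.5 \div 60 \approx 5{,}386 \text{ hours} \\
\text{Year 2: } & 17{,}469 \times 0.744 \approx 12{,}997 \text{ responses} \times 18.5 \div 60 \approx 4{,}007 \text{ hours} \\
\text{Year 3: } & 17{,}469 \times 0.601 \approx 10{,}499 \text{ responses} \times 18.5 \div 60 \approx 3{,}237 \text{ hours} \\
\text{Average annual: } & (5{,}386 + 4{,}007 + 3{,}237) \div 3 \approx 4{,}210 \text{ hours}
\end{aligned}
\]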



  2. Describe the procedures for the collection of information including:

    • Statistical methodology for stratification and sample selection;

    • Estimation procedure;

    • Degree of accuracy needed for the purpose described in the justification;

    • Unusual problems requiring specialized sampling procedures; and

    • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


As discussed in the previous question (Item 1), all eligible Veterans at six (6) months, one (1) year, and three (3) years post-separation will be invited to participate in the study resulting in a census. As a result, sampling or stratification procedures will not be utilized. When findings from the survey are available, differences shall undergo significance testing at the 95% confidence level.
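As an illustration of the planned significance testing, the sketch below applies a standard two-proportion z-test at the 95% confidence level; the counts are illustrative placeholders, not survey results:

```python
# Hedged sketch: two-proportion z-test at the 95% confidence level.
# The example counts are illustrative only, not actual survey results.
from math import sqrt

def two_prop_z(x1: int, n1: int, x2: int, n2: int, z_crit: float = 1.96):
    """Return (z, significant) for H0: p1 == p2 at alpha = 0.05."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)            # pooled proportion under H0
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, abs(z) > z_crit

# e.g., comparing an estimate between two cohorts of ~5,823 respondents each
z, significant = two_prop_z(1_200, 5_823, 1_050, 5_823)
print(f"z = {z:.2f}, significant at 95%: {significant}")
```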


In the event response rates are too low to produce sufficient sample sizes from which to draw estimates, post-stratification weights derived from the population file shall be used. The Contractor shall discuss these instances with VA to determine if such analyses are necessary. For example, given that the longest reference point is 3 years post-separation, it is unlikely that Veterans will be older than age 50. To the extent that estimates are needed for small cell sizes like these and are not collapsed into a larger category (such as Veterans age 45 or older), weights will be derived from the population file to adjust the estimates accordingly.
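A minimal sketch of how post-stratification weights could be derived from a population file; the age-group cells and all counts below are hypothetical:

```python
# Hypothetical post-stratification sketch: a cell's weight is its population
# share divided by its respondent share, so the weighted respondents
# reproduce the population distribution. All cells and counts are illustrative.
population = {"18-29": 120_000, "30-44": 75_000, "45+": 15_000}   # population file
respondents = {"18-29": 9_000, "30-44": 7_000, "45+": 1_469}      # survey returns

pop_total = sum(population.values())
resp_total = sum(respondents.values())

weights = {cell: (population[cell] / pop_total) / (respondents[cell] / resp_total)
           for cell in population}

for cell, w in weights.items():
    print(f"{cell}: weight = {w:.3f}")
```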


This PSTAP Assessment is not planned to be conducted more frequently than once in a 12-month period. At the end of the instrument, Veterans are asked to opt into future waves of data collection and are informed in writing that they shall not be contacted more than once in a 12-month period.



  3. Describe methods to maximize response rates and to deal with issues of nonresponse. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data to be generalized to the universe studied.


Nonresponse is mitigated for the PSTAP Assessment in four primary ways: (1) offering two modes; (2) multiple contact attempts; (3) minimizing the instrument length; and (4) increasing the ease of completing a paper questionnaire by optimizing the visual layout.


Survey response rates have been dropping over the last few decades. One way to boost response rates is to offer multiple modes. While some survey modes, such as those involving personal interviews, come at significant cost, a combination of Web and paper modes is both cost-effective and assures anonymity for the respondent. The Web option for the PSTAP Assessment is offered first to reduce the number of paper surveys that need to be mailed to Veterans who do not respond after the first attempt.


The final instrument is the culmination of cognitive interviews as well as close coordination between VA, the Interagency Partners, and the Contractor. As with any survey development, there is often a desire to include more questions than is feasible without jeopardizing response rates. At all junctures, there was close coordination to ensure that both the concepts being measured and the number of questions were kept at a minimum to decrease respondent burden.


Visual layout can reduce the effort required by a respondent to encode their responses to a question and mark the right category. Such techniques as using grids and alternate shading can decrease this burden.


Despite this multi-pronged strategy, an 80% response rate has been difficult to achieve for even the largest federal surveys. Consequently, nonresponse bias analyses will be conducted to ensure that these data produce valid estimates. Despite response rates to VOV surveys as low as single digits, statistical procedures such as survey raking (also used for OPM’s Federal Employee Viewpoint Survey) have consistently shown the data to be valid.
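For reference, survey raking adjusts weights iteratively until the weighted sample matches known population margins. The sketch below implements the basic iterative proportional fitting loop; the categories, target shares, and counts are illustrative assumptions, not PSTAP data:

```python
# Illustrative raking (iterative proportional fitting) on two margins.
# All categories, target shares, and counts are assumptions for this sketch.
rows = ["male", "female"]            # margin 1: gender
cols = ["enlisted", "officer"]       # margin 2: rank group
counts = {("male", "enlisted"): 400, ("male", "officer"): 100,
          ("female", "enlisted"): 60, ("female", "officer"): 40}
row_targets = {"male": 0.85, "female": 0.15}       # known population shares
col_targets = {"enlisted": 0.94, "officer": 0.06}

n = sum(counts.values())
weights = {cell: 1.0 for cell in counts}
for _ in range(50):                                 # iterate until margins converge
    for r in rows:                                  # rescale to the row margins
        share = sum(weights[c] * counts[c] for c in counts if c[0] == r) / n
        for c in counts:
            if c[0] == r:
                weights[c] *= row_targets[r] / share
    for k in cols:                                  # rescale to the column margins
        share = sum(weights[c] * counts[c] for c in counts if c[1] == k) / n
        for c in counts:
            if c[1] == k:
                weights[c] *= col_targets[k] / share

for cell, w in weights.items():
    print(cell, round(w, 3))
```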




  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


The PSTAP Assessment was pretested using cognitive interviews employing the “think aloud” method. Referrals for the pretesting subjects were obtained through the Interagency Partners, VA program staff, an email sent to Veteran Service Organizations (VSOs) by VA’s TAP subject matter expert (SME), and an email sent by a vice president at J.D. Power to over 500 of its United States-based employees for family or acquaintance referrals.


Cognitive interviews were conducted with six (6) members of the public, consistent with OMB regulations stating that testing shall not exceed nine (9) members of the public without applying for a generic clearance. These interviews were conducted December 1, 2017 through January 17, 2018. To supplement these interviews, an additional six (6) interviews were conducted with VA employees, who as federal employees are permissible subjects for testing purposes.


The pretesting subjects ranged in age from 28 to 57 years, two (2) were female Veterans, and the branches of service spanned the Air Force, Army, Marine Corps, and Navy. Four (4) members of the public completed their TAP training during the target reference period of three (3) years, while the remaining eight (8) subjects exceeded this reference period; however, all pretesting subjects had sufficient knowledge of the TAP curriculum to provide applicable feedback. Of the 12 subjects, one (from the public) considered himself Hispanic or Latino, six (6) self-identified as African American, and the remainder were Caucasian. All ethnicity and race questions used to establish demographics were compliant with OMB guidelines, whereby an ethnicity question is asked first, followed by a multiple-response racial category question. With the exception of one respondent who had a “side hobby/entrepreneurial activity” (not considered self-employment by the respondent), the other 11 respondents were employed.


Cognitive interviews lasted one hour, on average, with a minimum of 30 minutes for one interview (a VA employee who had to leave for an unscheduled meeting), and the longest interview was 93 minutes, which occurred early in the testing phase when the instrument was longer. Three iterations of the survey instrument were used during pretesting.


All interviews were conducted by phone and participants were informed about the interview being voluntary, the Privacy Act, and assured that their information would not be attributed to them even to other researchers on the team or program staff. When findings were discussed, all respondents were referred to either by gender and branch of service or by their date of interview. At no point have their names been provided to anyone on the team in conjunction with their responses.


The types of changes that were made during pretesting before finalizing the instrument draft are described below. It is important to note that, as with any research, there are limitations in the findings. For this testing, the primary threat to validity is that the pretesting pool was not representative of the overall Veteran population. Specifically, the majority of the pretesting pool had advanced degrees, and many of the subjects had been officers, which is disproportionate to the 6% of Veterans overall who had been officers.[1] These disparate proportions are offset by the fact that recent Servicemembers are required to have a high school education or equivalent, so literacy and English fluency are expected to be higher among Servicemembers than in the general population.


Another limitation is that only three (3) people had attended TAP training within the 3-year reference period targeted for this study, and the current TAP curriculum differs from earlier versions. There is also a risk of recall bias, since much of the content was delivered well in the past. However, all test subjects had sufficient recall of the TAP curriculum that the feedback they provided was applicable.


The overarching concern from all respondents was the length of the survey itself. Although the subjects were informed that cognitive interviews by nature are longer than the time it takes to complete a self-administered survey, a frequent comment was that there were too many topics being covered in the survey. As a result, questions were cut from certain sections.


Another technique used to minimize cognitive burden was to add language making the transition clearer when the instrument switched from questions focused on TAP to the outcomes portion. The final transition now reads:


“The transition process is much more than just what you learned in the classroom. VA is not only interested in what you learned but more importantly, how the information you received is impacting your life as a civilian. Our goal is to make sure that we provide you the necessary information and support to make a successful transition from a military member to part of the civilian population.


To help us determine how we can better serve Veterans and transitioning Servicemembers, these next sections will be asking about some key life areas, such as employment, education, and training after separation, retirement, or release from active duty service as well as some health, financial, and social relationship questions.”


Other areas of the survey that were modified include:


  • Utilizing more cues to help respondent recall. For example, official TAP modules were listed in the instrument, but parenthetical examples were added such as “Boots to Business” to help the Entrepreneurship track resonate for respondents. Similarly, “GI Bill” was added as an example in the section on the instrument measuring knowledge and utilization of VA benefits.

  • Career training has the formal name of the Career Credentialing and Apprenticeship track, but it has changed names several times. As a result, the parenthetical for this item now includes all titles elicited from pretesting subjects. The examples now include: “CT3, previously called CTT or sometimes ‘career training track.’”

  • “VA Advisor” was ultimately changed to “VA Benefits Advisor (VA Rep).” The original title was generally not understood by most respondents, but inserting “benefits” before “advisor” helped clarify the person whom most respondents called a “VA Rep” when this item was included as a probe for all testing subjects.

  • When asking about the Veteran’s “job,” initial consideration was given to whether this should be the “first” job or the “current” job. When this was left as simply “job,” it was not clear whether this could include what one respondent described as follows: “It didn’t take me long to find odd jobs to pay for school, but I now have what I consider to be my career after completing college and graduate school after leaving service.” The final phrasing of this question specifies the “current” job.

  • Response options for reasons for looking for a new job were revised from “closer to home” to “shorter commute,” since respondents explained that travelling even a short distance in a metropolitan area can still take a very long time.

  • Transition to civilian “employment” was changed to transition to civilian “life,” which helped the flow of the instrument since it is not limited to employment.

  • Likert scale questions recommended by the PMWG were included after being shortened and modified through testing.



  5. Provide the name and telephone numbers of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who actually collect and/or analyze the information for the agency.


The PSTAP Assessment is the culmination of significant federal planning and interagency coordination. In May 2017, a contract was awarded to the global survey research firm J.D. Power to refine the instrument, conduct pretesting, and prepare for submission to OMB. The cost estimates in this clearance are based on optional tasks in a current contract to J.D. Power that expires on April 30, 2018. Key personnel involved in the final stages of the design of the survey instrument and statistical planning of the study include:


Veterans Benefits Administration (202-530-9053)

Meredith Bedenbaugh-Thomas, Assistant Director Transition

Kenyonna Power, Contracting Officer’s Representative

William Brinley, Lead Program Analyst


Interagency Performance Management Working Group (PMWG)

Lynne Kelley, Ph.D., PMWG Chair (703-614-8676)

More than 40 representatives from the 7 federal agencies in this group.


J.D. Power (805-418-8000)

Greg Truex, Senior Director

Katrina B. Stone, Ph.D., Government Program Director

Tara Porter, Government Program Deputy Director

Judy Gottlieb, Global Research Operations Director


[1] https://www.va.gov/vetdata/veteran_population.asp
