Responses to OMB Questions for EDU Survey

Post-9/11 GI Bill Longitudinal Study

OMB: 2900-0794

  1. In general, please expand the written explanations in Part A to explain why this study, in this form, is necessary and what specifically VA hopes to learn from it. We provided comments in the attachment to give guidance on where additional information is needed.

Westat, the contractor that will implement the Chapter 33 Education Longitudinal Study, has assisted VA in expanding all written explanations in Part A. Please refer to the Part A written explanations enclosed with this response.

  2. Will VA use a screener? If not, please just confirm that for us. If so, please provide it.

No, VA does not intend to use a screener. Everyone included in the sample frame that VA will provide to Westat will be eligible to be recruited for the survey.

  3. We would appreciate the following information being added to Part B:

    a. What is the frame for the respondents? Is VA confident in the accuracy/completeness of that frame and contact information (mail/phone/email)?

We have proposed a 12-week data collection period that will begin immediately after OMB approval is obtained. Please see the written explanation to item (16) for more specific details on our proposed data collection contact protocol.

    b. Please explain the rationale behind the assumption about the initial recruitment response rate, which is set higher than that in the two other studies of veterans cited in footnotes.

Since we submitted the EDU OMB package, we have completed data collection for the 2010 and 2012 VR&E cohorts. Based on our VR&E experience, we have revised the recruitment rate downward from 70% to 35%. Like the proposed study, the VR&E Study aimed to recruit 3,500 Veterans for a 2010 cohort and 3,500 Veterans for a 2012 cohort, and VBA succeeded in doing so. The participation rate for the VR&E 2010 cohort was 34 percent, and that for the 2012 cohort was 26 percent.

    c. Please explain the rationale behind the attrition rates.

The reasons for attrition in the number of participants include deaths, inability to contact, and unwillingness to participate. An analysis based on SSA’s life tables predicts that the annual attrition rate due to deaths of participants will be approximately 0.3 percent. It is difficult to predict the annual attrition rates due to other causes.
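As a rough illustration of how these attrition assumptions compound over a multi-year panel, the sketch below projects expected cohort retention. The 0.3 percent annual mortality rate comes from the SSA life-table analysis cited above; the 10 percent annual rate for non-death attrition is a purely hypothetical placeholder, since the text notes such rates are difficult to predict.

```python
def project_retention(initial_n, years, mortality=0.003, other_attrition=0.10):
    """Return the expected number of active participants after each year.

    mortality       -- annual death rate (0.3% per the SSA life-table analysis)
    other_attrition -- annual loss to non-contact/refusal (hypothetical value)
    """
    counts = []
    n = initial_n
    for _ in range(years):
        # Apply both sources of attrition independently each year.
        n *= (1 - mortality) * (1 - other_attrition)
        counts.append(round(n))
    return counts

# Example: a cohort of 3,500 recruited Veterans followed for 5 years.
print(project_retention(3500, 5))
```

Under these assumed rates, mortality is a minor contributor; the hypothetical non-death attrition dominates the projected loss.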

    d. Please say more about the periodic reminders to be used to maintain up-to-date contact information. Does VA have mock-ups of these reminders? If so, we would appreciate seeing them.

Survey participants will be contacted at regular, periodic intervals in between annual survey field periods to maintain up-to-date contact information on the survey sample. A copy of the reminder letter has been included with this revised justification package.

  4. Is there a revised version of the survey? We are not confident that we have the most up-to-date version because some of the fairly non-controversial changes (e.g., typos) suggested by the public commenters were not reflected in the version in ROCIS.

Yes, the survey was revised, and the version accessed in ROCIS was not the latest version for OMB review. Public comments on the survey content were received from the National Association of Veterans Program Administrators, the American Council on Education, and the American Association of State Colleges and Universities. VBA also consulted with the Department of Education, the Department of Justice, and the Consumer Financial Protection Bureau on the survey content. In light of the feedback received from the public comments and the consultations with other government agencies, the contractor for this study, Westat, also conducted an expert appraisal of the survey content and reviewed the content of pre-existing valid and reliable surveys of Veteran populations to further refine the survey items. The correct survey has been included with this revised justification package.

  5. In general, we would appreciate a better understanding of why VA chose not to accept the suggestions in some of the public comments. If it’s easiest to walk through these on the phone, we can set up a call to do that.

The vast majority of the public comments received were incorporated into the revised survey. The wording of specific survey items was revised to improve clarity and precision, so that it is clear when we are asking about respondents’ experience with their educational program of study versus their experience with the Chapter 33 program. We also revised the time period for which we are collecting education outcomes during the initial survey administration to improve the precision of this measure: by asking about any educational outcomes attained since respondents started using benefits, we minimize under- or over-reporting of educational outcomes. We also improved the wording of various items, such as the items on income and current work status, to reflect the way these measures are collected in widely used, validated surveys such as the American Community Survey. A few comments recommended adding items to the survey; we did not add these items because the information is available through VBA administrative data files, and omitting them minimizes respondent burden and avoids duplication.

Finally, it is important to recognize that the EDU Longitudinal Study is not a study of the effectiveness of the program. The study is designed to examine the long-term education, employment, and standard of living outcomes of three cohorts of Veterans (and their eligible dependents) who began using Chapter 33 benefits in FY 2010, FY 2012, and FY 2014. The survey has been designed to collect only data that are not available through VBA administrative data files or any other existing data source. VBA administrative data will be used to obtain demographic information on the characteristics of cohort members, as well as additional information on the education and training programs being pursued under Chapter 33 (e.g., length of time in the various programs of study, type of institution).

  6. What are the measures to determine effectiveness of this program?

The EDU Longitudinal Study is not a study of the effectiveness of the program. The study is designed to examine the long-term education, employment, and standard of living outcomes of three cohorts of Veterans (and their eligible dependents) who began using Chapter 33 benefits in FY 2010, FY 2012, and FY 2014. Table 1 in Part A of the justification package provides a list of the measures to be analyzed and the data source that will be used to gather that information.

  7. How do the questions in the survey map back to these measures, and for questions that do not map back to the measures, why are they being asked?

Table 1 in Part A of the justification package provides a list of the measures to be analyzed and the data source that will be used to gather that information. Whenever the survey is identified as the data source, the table explicitly lists the specific items that will be used for the respective measures.

  8. Does VA have plans to include a comparison group for the cohorts in this study, i.e., to show the effects for people in this program compared to those who are not in this program?

There are no current plans to include a comparison group.

Author: Jarnee Riley
File Created: 2021-01-30
