Evaluation of the IT Professionals in Health Care Workforce Program: University-Based Training

OMB: 0990-0380


B. Statistical Methods

B.1. Respondent Universe and Sampling Methods

ONC’s contractor, the National Opinion Research Center (NORC), will collect information for the Evaluation of the Information Technology Professionals in Health Care Workforce Program on ONC’s behalf. The contractor is responsible for the design and administration of the surveys that will be used to collect information about university students associated with the Workforce program.

As discussed in Part A, the data collection is part of a mixed-methods (surveys and focus groups) evaluation design that addresses the complex nature of the Workforce program and will ensure that all aspects of the program are captured. The design therefore requires multiple contacts with students at various points during the evaluation, which is necessary to capture the different types of information and to inform subsequent stages of the assessment. However, the primary data collection for which OMB approval is being sought as part of this submission is the baseline and follow-up data collection, which includes the following activity:

Web-Based Survey of Students in University-Based Programs

The universe consists of students enrolled in the university-based Workforce program, and the sampling frame will encompass most of the universe, excluding students who are expected to graduate after the survey period and those who do not have email addresses. Approximately 1,682 students are estimated to enroll in and/or graduate from the UBT programs by August 2013. Because of the survey schedule, only students expected to graduate in 2011 or 2012 will be eligible to participate in the study and therefore included in the sampling frame. For the baseline survey in the first year, we expect 520 students to graduate from the university-based program in 2011. In the second year, we will add 634 expected 2012 graduates to the sample, for a frame of 1,154 students.

All students with sufficient locating information (i.e., an email address, phone number, or postal address) will be invited to complete the survey. The census approach also provides a nominal buffer should either the locating rate or the response rate be lower than anticipated; we expect an 80% response rate in each round. The universities have agreed to provide lists of students graduating from their programs during the evaluation period, from which the sample will be drawn. We offer more detail on our sampling plan in section B.2.3.

Student Survey Sample

                            Sample     Baseline          Follow-Up
                                       2011     2012     2012     2013
  University Graduates
    Cohort 1                   413      413       --      413       --
    Cohort 2                   107      107       --      107       --
    Cohort 3                   634       --      634       --      634
  Total Respondents          1,154      520      634      520      634


B.2. Information Collection Procedures

ONC will coordinate with the staff of universities participating in the grant program to obtain lists of students enrolled in the Workforce program. To obtain baseline information, a self-administered Web-based survey will be made available to respondents. Respondents will be contacted via email (or, if necessary, postal mail) using addresses provided by the universities and asked to complete the 20-minute survey.

B.2.1. Statistical Methodology for Stratification and Sample Selection

The purpose of the Web-based survey of students is to collect information on students’ experiences in the program, including their attitudes toward and satisfaction with the learning environment (e.g., with faculty and courses), perceptions about work/skill readiness, motivations for entering the program and the health IT profession, areas for program improvement, and employment outcomes.

To minimize burden and cost to the government, ONC will collect data from university students via two Web surveys. The Web surveys will include skip patterns, range checks, and other quality control measures, and will be hosted on a secure server. Students’ email addresses and other contact information will be provided by the universities. Email addresses, or postal addresses where the former are not available, will be used to correspond with members of the sample. This correspondence will include each member’s unique personal identification number (PIN) and password, as well as the URL of the survey. If no email or postal address is available for a student, the student will be contacted by telephone and asked for his or her email address.
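
To illustrate the procedure, the sketch below shows one way the credential generation and contact-mode fallback could be implemented. The field names and formats are hypothetical illustrations, not a description of the contractor’s actual system.

    import secrets

    def make_credentials():
        """Generate a random PIN and password for one sampled student
        (in practice, uniqueness would be enforced against the sample file)."""
        pin = f"{secrets.randbelow(10**8):08d}"   # 8-digit numeric PIN (hypothetical format)
        password = secrets.token_urlsafe(8)       # short random password
        return pin, password

    def contact_mode(student):
        """Pick the invitation channel per the procedure above: email first,
        postal mail second, then a phone call to request an email address."""
        if student.get("email"):
            return "email"
        if student.get("postal_address"):
            return "postal"
        if student.get("phone"):
            return "phone_request_email"
        return "unlocatable"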

The table below provides an estimated timeline of data collection activities.

  Activity                                              Estimated Start Date    Estimated End Date
  Web-based survey of students – cohort 1, baseline     July 2011               September 2011
  Web-based survey of students – cohort 2, baseline     December 2011           February 2012
  Web-based survey of students – cohort 3, baseline     July 2012               September 2012
  Web-based survey of students – cohort 1, follow-up    January 2012            March 2012
  Web-based survey of students – cohort 2, follow-up    June 2012               August 2012
  Web-based survey of students – cohort 3, follow-up    January 2013            March 2013

B.2.2. Estimation Procedure

To produce population-based estimates of totals, percentages, and means, each respondent will be assigned a weight that adjusts for non-response. Every effort will be made to minimize non-sampling errors in the estimates by maximizing response rates and taking steps to reduce response errors.
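
As an illustration, the sketch below shows a standard weighting-class non-response adjustment under the census design (base weight of 1), in which each respondent’s weight is the inverse of the response rate within its adjustment cell. The cell definition shown is a hypothetical example; the actual cells would be specified in the analysis plan.

    from collections import Counter

    def nonresponse_weights(frame, respondents, cell_key):
        """Weighting-class adjustment: a respondent's weight equals the number
        of frame members divided by the number of respondents in its cell, so
        respondents stand in for non-respondents with similar characteristics."""
        frame_counts = Counter(cell_key(s) for s in frame)
        resp_counts = Counter(cell_key(s) for s in respondents)
        return {s["id"]: frame_counts[cell_key(s)] / resp_counts[cell_key(s)]
                for s in respondents}

    # Hypothetical cells: cohort crossed with program type (Type I vs. Type II).
    cell = lambda s: (s["cohort"], s["program_type"])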

Quantitative analysis will primarily be descriptive to answer research questions on program satisfaction and student employment outcomes. We will provide aggregate descriptions of the following information:

  • Students’ satisfaction with the learning environment (e.g., with faculty/courses/resources/curriculum materials);

  • Students’ perceptions about work/skill readiness; and

  • Employment outcomes of students who complete the educational programs.


In addition to producing descriptive statistics for the measures of student satisfaction and employment outcomes and conducting longitudinal analyses of how key measures vary over time, we will analyze the data by key subgroups, such as student race/ethnicity, gender, enrollment status, and curriculum focus.
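
A minimal sketch of such a subgroup tabulation appears below, assuming hypothetical file and variable names for the survey data and the non-response weights described in section B.2.2; the actual variables come from the instrument.

    import pandas as pd

    # Hypothetical survey extract; real column names come from the instrument.
    df = pd.read_csv("student_survey.csv")

    def weighted_mean(group, value="satisfaction", weight="weight"):
        """Weighted mean of a satisfaction measure within one subgroup."""
        return (group[value] * group[weight]).sum() / group[weight].sum()

    for subgroup in ["race_ethnicity", "gender", "enrollment_status", "curriculum_focus"]:
        print(df.groupby(subgroup).apply(weighted_mean))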

The reports will also include a discussion of the qualitative information gleaned through the focus groups (included in a previous OMB package) and open-response survey questions. This will allow us to answer key research questions on:

  • Students’ motivations for entering the program and the health IT profession; and

  • Areas for program improvement.


B.2.3. Degree of Accuracy Needed for the Purpose Described in the Justification

For the Web-based survey of UBT students, a census of all eligible students graduating in 2011 and 2012 (N = 1,154) was selected to support the planned analysis. The census will be used to generate frequencies and means by important characteristics.

The frame of 1,154 eligible students is small enough that a census can be conducted across both Type I and Type II roles within the UBT program while still allowing for non-location and non-response. We estimate that 1,108 students will graduate from Type I programs and 46 from Type II programs in 2011 and 2012. Based on prior experience collecting data from program grantees, we assume that we will not receive usable contact information for approximately 25% of students. We therefore expect that only about 866 eligible students will have sufficient contact information to be invited to participate in the survey. Assuming an 80% response rate, this number is further reduced to approximately 693 completed surveys. However, the programs are working to improve the recording of student contact information, so we expect the number of cases with sufficient contact information to increase.
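
The expected-yield arithmetic can be reproduced directly from the assumptions stated above (a sketch; the 25% non-location and 80% response rates are the text’s assumptions, not observed values):

    frame = 1154                          # eligible 2011-2012 graduates (census)
    invited = round(frame * 0.75)         # assume ~25% lack usable contact info -> 866
    completes = round(invited * 0.80)     # assume an 80% response rate -> 693
    print(invited, completes)             # 866 693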

Sample size needs were determined assuming a 95% confidence level and a margin of error of 3.0 percentage points. Under these assumptions, a simple random sample would require 542 completed surveys, a figure that includes the finite population correction (FPC) for an estimated enrollment of 1,154 students at the time of sample selection. Assuming an 80% response rate, this corresponds to a sample of 680 students. The targeted 693 completed student responses will therefore provide a sufficient sample for analysis.
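
For reference, the standard FPC sample-size calculation is sketched below. With z = 1.96, p = 0.5, e = 0.03, and N = 1,154 it yields roughly 555 completes, so the 542 cited above presumably reflects slightly different input assumptions or rounding.

    import math

    def srs_sample_size(N, e=0.03, z=1.96, p=0.5):
        """Completed interviews needed for a proportion at margin of error e,
        applying the finite population correction for a frame of size N."""
        n0 = (z ** 2) * p * (1 - p) / e ** 2        # infinite-population size
        return math.ceil(n0 / (1 + (n0 - 1) / N))   # FPC-adjusted size

    needed = srs_sample_size(1154)          # ~555 under these exact inputs
    to_invite = math.ceil(needed / 0.80)    # inflate for an assumed 80% response rate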

We assume a response rate of 80% for each survey. If, however, the achieved response rate is lower than expected, we will conduct non-response bias analyses to determine whether the lower rate introduced bias. If these tests provide evidence of bias, we will adjust our results through weight adjustments and/or response imputation.

B.2.4. Unusual Problems Requiring Specialized Sampling Procedures

There are no unusual problems requiring specialized sampling procedures.

B.2.5. Use of Periodic (Less Frequent Than Annual) Data Collection Cycles

The baseline survey is a one-time data collection necessary to establish a baseline from which to measure outcomes for program participants. The follow-up survey is a one-time data collection necessary for identifying employment outcomes, assessing students’ perceptions of the program six months after baseline, and developing a better understanding of any changes in program graduates’ circumstances.

B.3. Methods to Maximize Response Rates

During the study, contact with students will be maintained between survey rounds in order to track them successfully through subsequent follow-ups. A dedicated study website will be made available to students, along with a dedicated toll-free (800) line and email address. The research team may also explore additional tools for tracking students, such as online social networking sites (e.g., Facebook and MySpace). Overall, the survey questions were developed not only to meet analytic goals but also to encourage the interest and investment of program students.

B.4. Tests of Procedures

The survey instruments have been drafted and have undergone two reviews: (1) an internal review by NORC’s Institutional Review Board and (2) a pre-test of the baseline survey with up to nine university students. Based on the NORC Institutional Review Board’s review, minor changes were made to the consent language at the beginning of the survey to clarify how survey findings will be used.

To accurately determine the burden placed on respondents and further test the clarity of the survey questions, a pre-test was conducted in which six recent graduates of a CCC program, from diverse backgrounds, completed the baseline survey so that the reliability of the instrument could be assessed. Based on comments received from both reviews, minor revisions were made to the wording of a small number of questions to improve comprehension. Pre-test participants made no recommendations to change the length, content, or overall structure of the surveys.

For both the baseline and follow-up surveys, not all questions will be asked of each respondent. Different sets of questions will be administered depending on the respondent’s background and level of work experience in health IT. Each data collection instrument includes flow charts showing which questions will be asked of each type of respondent.
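
As an illustration only, the sketch below shows the kind of branching such flow charts describe; the module names and respondent flags are hypothetical, not the instrument’s actual routing.

    def modules_for(respondent):
        """Route a respondent through instrument modules based on background and
        health IT work experience, mirroring flow-chart logic (hypothetical)."""
        modules = ["core_items"]
        if respondent.get("prior_health_it_experience"):
            modules.append("experienced_worker_items")
        else:
            modules.append("career_changer_items")
        if respondent.get("currently_employed"):
            modules.append("employment_outcome_items")
        return modules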

B.5. Statistical Consultants

The information for this study is being collected by NORC, a research and consulting firm, on behalf of ONC. With ONC oversight, NORC is responsible for the study design, instrument development, data collection, analysis, and report preparation.

The instrument for this study and the plans for statistical analyses were developed by NORC. The staff team is composed of Dr. Kristina Lowell, Project Director; Dr. Carrie Markovitz, Senior Research Scientist; and a team of senior-level staff including Karen Grigorian. Contact information for these individuals is provided below.

  Name                  Number
  Kristina Lowell       301-634-9488
  Carrie Markovitz      301-634-9388
  Karen Grigorian       312-759-4025




