Evaluation of the IT Professionals in Health Care

OMB: 0990-0372


B. Statistical Methods

B.1. Respondent Universe and Sampling Methods

ONC’s contractor, the National Opinion Research Center (NORC), will collect information for the Evaluation of the Information Technology Professionals in Health Care Program. The contractor is responsible for the design and administration of the survey that will be used to collect information about students, faculty, and exam takers associated with the Workforce program.

As discussed in Part A, the data collection is part of a mixed-method (surveys and focus groups) evaluation design that addresses the complex nature of the Workforce program and ensures that all aspects of the program are captured. The design will therefore require multiple contacts with students and other program stakeholders at various times during the evaluation. This is necessary to capture the different types of information and to inform subsequent stages of the assessment.

However, the primary data collection for which OMB approval is being sought is the baseline collection, which includes the following activities:

Web-Based Survey of Students in Community College Programs

To allow for stratification and subgroup analyses of key variables, such as gender, race, and professional roles, we will build a growing panel of community college students that will total approximately 1,850 unique sample members by the end of the program’s third year, drawn from five Community College Consortia made up of 84 colleges. The universe consists of students enrolled in the Workforce program, and the sampling frame will encompass most of the universe, excluding a small number of students without email addresses. As outlined in the table below, for the baseline survey in the first year we will sample 616 enrollees in the community college program. For each subsequent cohort, we will add 617 community college students to the panel. This will allow us to conduct robust analyses of specific subgroups as well as meaningful cohort comparisons over time. We expect an 80% response rate in each round. The Community College Consortia have agreed to provide lists of students enrolled in their programs during the evaluation, from which we will draw the sample. We offer more detail on our sampling plan in section B.2.3.

Student Survey Sample

                                        Baseline            Follow-Up
                              Sample    2011      2012      2011      2012
Community College Enrollees
  Cohort 1                    616       X         --        X         --
  Cohort 2                    617       X         --        --        X
  Cohort 3                    617       --        X         --        X
Total Respondents             1,850     1,233     617       616       1,234

Web-Based Survey of Faculty from Community College Programs. All of the estimated 300 faculty instructors participating in the Workforce program (a census) will be invited via email to participate in the Web-based faculty survey. The Community College Consortia will provide the email addresses of faculty in their programs.

Course Evaluation Forms. The community colleges will be required to collect student evaluations of Workforce courses. All students in the Workforce program will receive an email invitation to complete a course evaluation form. This will not add any additional burden, as evaluations are already part of normal class procedures. We will provide a set of core questions to which each college may add its own specific questions.

Focus Groups with Students Discussion Guide. During unstructured site visits to the programs, we plan to hold up to four focus groups with eight students at each site (approximately eight sites per year). Participation in the focus groups will be voluntary, and students will be selected from individuals who respond to an email invitation to participate.

Focus Groups with Faculty Discussion Guide. Once per year for the three years of the project, we plan to hold up to five focus groups with ten faculty members selected from across multiple community colleges within the same Community College Consortium. These Web-based focus groups will likely include faculty from more than one school, and participation will be voluntary. Faculty will be invited to participate via email.

Focus Groups with Exam Takers Discussion Guide. We plan to conduct four Web-based focus groups per year with up to eight exam takers. Focus group participants will be selected from individuals who respond to a question on the competency exam inviting them to participate in a focus group on health IT certification and careers.

B.2. Information Collection Procedures

ONC will coordinate with the staff of the Community College Consortia participating in the grant program to obtain lists of students enrolled in Workforce course(s). To obtain baseline information, a self-administered Web-based survey will be made available to respondents. Respondents will be contacted via email, using addresses provided by the Community College Consortia, and asked to complete the 20-minute survey. In addition, evaluation forms will be distributed to students enrolled in Workforce courses, and a subset of students, as well as faculty and exam takers, will be asked to participate in small, in-depth focus group discussions.

B.2.1. Statistical Methodology for Stratification and Sample Selection

Web-based Survey

The purpose of the Web-based survey of students is to collect information on students’ experiences in the program, including their attitudes toward and satisfaction with the learning environment (e.g., faculty, courses, resources, curriculum materials), perceptions about work/skill readiness, motivations for entering the program and the health IT profession, level of engagement with faculty, satisfaction with support systems available within and outside the college environment, areas for program improvement, employment outcomes, and experiences with the competency exam.

To minimize burden and cost to the government, ONC will collect data from community college students via a Web survey. The Web survey will include skip patterns, range checks, and other quality control measures; and will be hosted on a secure server. Students’ email addresses and other contact information will be provided by the community colleges. Email addresses, or postal addresses where the former are not available, will be used to correspond with members of the sample. This communication will include sending members their unique personal identification numbers (PINs) and passwords, as well as the URL of the survey. If no email or postal address is available for a student, the student will be contacted via telephone and asked for his or her email address.

Course Evaluation Forms

The purpose of the course evaluation forms is to collect students’ opinions and attitudes about the materials in individual courses. The Community College Consortia programs will distribute these course evaluation forms via email to all Workforce students at the end of each course. The programs will be responsible for processing the course evaluation forms and delivering the data to ONC.

Focus Groups

The purpose of the focus groups is to gain insights from students, faculty members, and exam takers. Our discussions with students will explore their perceptions of the quality of instruction, motivations for enrolling in the program, and plans for taking (or not taking) the competency exam. With faculty members, we will discuss how they are using the curricula developed as part of this Program and whether they have identified any gaps in the available materials. We will also conduct focus groups with individuals who took the exam but did not attend one of the ONC-funded training programs, in order to understand their motivations.

To minimize burden and cost to the government, we will conduct focus groups with students during site visits, and with faculty and exam takers using an online forum. Students will be recruited via email invitations sent to all students, and faculty via email invitations sent to all instructors in the program. An experienced researcher will facilitate each focus group using a discussion guide that ensures key topics are covered while allowing sufficient flexibility to follow the natural flow of the conversation. For the focus groups with exam takers, we will schedule some of the discussions shortly after the individuals have sat for the exam, in order to discuss the clarity, validity, and relevance of the questions while they are still fresh in participants’ minds. A question at the end of the exam will ask whether the exam taker is interested in being contacted in the future for focus groups, and we will draw our sample from those individuals who respond “yes.”

The table below provides an estimated timeline of data collection activities.

Activity                                             Estimated Start Date              Estimated End Date
Course evaluation forms                              February 2011                     December 2012
Web-based faculty survey                             January 2011                      February 2012
Focus groups with students                           1 month following OMB approval    October 2012
Focus groups with faculty                            1 month following OMB approval    October 2012
Web-based survey of students – cohort 1, baseline    January 2011                      April 2011
Web-based survey of students – cohort 2, baseline    May 2011                          August 2011
Focus groups with exam takers                        May 2011                          October 2012
Web-based survey of students – cohort 1, follow-up   July 2011                         December 2011
Web-based survey of students – cohort 2, follow-up   November 2011                     April 2012
Web-based survey of students – cohort 3, baseline    January 2012                      April 2012
Web-based survey of students – cohort 3, follow-up   July 2012                         December 2012

B.2.2. Estimation Procedure

To produce population-based estimates of totals, percentages, and means, each respondent will be assigned a sampling weight. This weight combines a base weight, which is the inverse of the member’s probability of selection, with an adjustment for non-response. The adjustment accounts for members who are in the sample but do not respond to the follow-up surveys. Every effort will be made to minimize non-sampling errors in the estimates by maximizing response rates and taking steps to reduce response errors.
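As an illustration only, the weight construction described above can be sketched in a few lines of code; the selection probability and response rate below are hypothetical values, not program figures.

    # Minimal sketch of the weighting described above (hypothetical values).
    # Final weight = base weight (inverse of selection probability)
    #              x non-response adjustment (inverse of the response rate
    #                in the respondent's adjustment class).

    def final_weight(selection_prob: float, class_response_rate: float) -> float:
        base_weight = 1.0 / selection_prob
        nonresponse_adjustment = 1.0 / class_response_rate
        return base_weight * nonresponse_adjustment

    # Example: a student selected with probability 0.10, in a class where
    # 80% of sampled members responded, carries a weight of 12.5.
    print(final_weight(0.10, 0.80))  # 12.5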

Quantitative analysis will primarily be descriptive to answer research questions on program satisfaction and student employment outcomes. We will provide aggregate descriptions of the following information:

  • Students’ satisfaction with the learning environment (e.g., with faculty/courses/resources/curriculum materials);

  • Students’ perceptions about work/skill readiness;

  • Satisfaction with support systems available within and outside of the college environment;

  • Students’ level of engagement with faculty; and

  • Employment outcomes of students who complete the educational programs.


In addition to producing descriptive statistics of the measures associated with student satisfaction and employment outcomes and conducting longitudinal analyses of how key measures vary over time, we will analyze the data by key subgroups, such as student race/ethnicity, gender, enrollment status, and curriculum focus.

The reports will also include a discussion of the qualitative information gleaned through the focus groups and open-response survey questions. This will allow us to answer key research questions on:

  • Students’ motivations for entering the program and the health IT profession;

  • Faculty members’ perceptions of student engagement and availability of appropriate curricula; and

  • Areas for program improvement.


B.2.3. Degree of Accuracy Needed for the Purpose Described in the Justification

For the Web-based survey of students, a sample size of 1,850 was selected to support the planned analyses. Selection will be done in two stages: (1) first, a sample of community colleges with the Workforce program will be chosen; (2) second, a sample of students within each of the selected community college programs will be selected. The goal of our sampling plan is to select a sample that is representative of the population of students currently or previously enrolled in a Workforce program. The sample will then be used to generate frequencies and means by important characteristics.

Sample Size Needs. A sample size of 1,850 students across three cohorts is sufficient to be representative of the population of students enrolled in the Workforce program across the five consortia. In determining the sample size, we assumed a 95% confidence level and a 3.5% margin of error. Based on these assumptions, a simple random sample would require 755 completed surveys, after applying the finite population correction (fpc) for an estimated enrollment of 20,000 students at the time of sample selection; the 20,000 figure is an estimate of the universe of students. The sampling frame for each cohort will contain the students currently or previously enrolled in the program at the time of sample selection.
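A minimal arithmetic check, using the standard normal-approximation sample size formula with the conventional p = 0.5 (an assumption; the supporting statement does not state p), reproduces the 755 figure:

    import math

    # Simple-random-sample size at a 95% confidence level (z = 1.96),
    # 3.5% margin of error, and the conservative p = 0.5, with a finite
    # population correction (fpc) for N = 20,000 enrolled students.
    z, e, p, N = 1.96, 0.035, 0.5, 20_000

    n0 = (z**2 * p * (1 - p)) / e**2   # infinite-population size, about 784
    n = n0 / (1 + (n0 - 1) / N)        # apply the fpc
    print(math.ceil(n))                # 755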

Since the proposed sample design is a cluster design, the number of effective completes will be less than the number of completed surveys (i.e., the design effect will be greater than one). The design effect can be estimated from a standard formula using an assumed intraclass correlation coefficient (ICC). For this analysis, we assume an ICC of 0.05, which equates to a design effect of 1.5. Note that this is an estimate; the actual intraclass correlation and design effect will not be known until after the survey is complete. With a baseline sample of 1,850 students and an 80% response rate, we would obtain 1,480 completed surveys, for an effective sample size of 986. Since the minimum required sample size is 755, this will be sufficient to be representative of the population while also providing additional sample for subgroup analysis by gender, race, and role (i.e., program type).
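The arithmetic behind these figures can be checked as follows; the implied average cluster size of 11 is derived from the stated ICC and design effect, not a figure given in the plan.

    # Design effect via the standard formula deff = 1 + (m - 1) * ICC,
    # where m is the average cluster size. An ICC of 0.05 and a design
    # effect of 1.5 together imply m = 11.
    icc, deff = 0.05, 1.5
    m = (deff - 1) / icc + 1
    print(m)  # 11.0

    # Effective sample size = completed surveys / design effect.
    completes = int(1_850 * 0.80)  # 80% response rate -> 1,480 completes
    effective_n = completes / deff
    print(int(effective_n))        # 986 (986.67 truncated, as cited above)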

Stratification of Community Colleges. The colleges will be stratified by consortium, and three community colleges within each consortium will be selected using Probability Proportional to Size (PPS) sampling. Therefore, the three colleges chosen will be those with the largest number of students enrolled in the Workforce program at the time of selection for each cohort. The table below contains the estimated number of community colleges within each consortium expected to have a Workforce program; these numbers could change slightly at the time of sample selection for each cohort.

Consortium    Community Colleges with Workforce Program
Bellevue      8
Los Rios      15
Cuyahoga      17
Pitt          21
Tidewater     23


Sampling of Students. The allocation of the 1,850 students across the community colleges will be based on the size of each community college program (i.e., the number of students enrolled) as well as the enrollment within each consortium. The allocation will proceed in two stages. First, the sample will be allocated across consortia in proportion to each consortium’s share of total Workforce program enrollment. Second, each consortium’s allocation will be distributed across its three sampled community colleges in proportion to their Workforce program enrollment at the time of sample selection. This sampling method will be applied for each cohort, so the colleges chosen and the allocation calculated for cohort 1 may not be the same as for cohort 2. This will allow for accurate estimates based on current enrollment levels during each cohort. If, at any time of sample selection, there are fewer enrolled students than the sample size requires, a census of all students will be taken.
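As a hedged sketch of this two-stage proportional allocation, the following code uses invented consortium and college enrollment figures purely for illustration:

    # Hypothetical sketch of the two-stage allocation described above.
    # All names and enrollment counts are invented placeholders.

    def allocate_proportional(total_sample, sizes):
        """Allocate a sample across units in proportion to enrollment.
        Simple rounding may leave the total off by one; a largest-remainder
        step would correct this in practice."""
        total_enrollment = sum(sizes.values())
        return {unit: round(total_sample * n / total_enrollment)
                for unit, n in sizes.items()}

    # Stage 1: allocate one cohort's sample (616) across five consortia.
    consortium_enrollment = {"A": 5000, "B": 4000, "C": 4500,
                             "D": 3500, "E": 3000}
    by_consortium = allocate_proportional(616, consortium_enrollment)

    # Stage 2: split consortium A's allocation across its three sampled
    # colleges by their Workforce program enrollment.
    college_enrollment = {"College 1": 900, "College 2": 600, "College 3": 500}
    by_college = allocate_proportional(by_consortium["A"], college_enrollment)
    print(by_consortium)
    print(by_college)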

B.2.4. Unusual Problems Requiring Specialized Sampling Procedures

There are no unusual problems requiring specialized sampling procedures.

B.2.5. Use of Periodic (Less Frequent Than Annual) Data Collection Cycles

The baseline survey is a one-time data collection necessary to establish a baseline from which to measure outcomes for program participants. The course evaluation forms and focus groups will occur up to three times over the 24-month data collection period.

B.3. Methods to Maximize Response Rates

During the study, contact with students will be maintained between survey rounds in order to track them successfully over the period of subsequent follow-ups. A dedicated study Website will be made available to students, along with a dedicated toll-free 800 line and email address. The research team may also explore additional tools to track students, such as online networking sites (e.g., Facebook and MySpace). ONC also plans to offer a free one-year membership in a professional organization to students participating in the Web survey, and a $50 gift certificate to a retail store to students, faculty, and exam takers participating in focus groups, to offset the burden of study participation. Overall, the questions in the survey and focus groups were developed not only to meet analytic goals but also to encourage the interest and investment of program students.

B.4. Tests of Procedures

The survey instrument has been drafted and has undergone two reviews: (1) an internal review conducted by NORC’s Institutional Review Board and (2) a pre-test in which seven students from diverse backgrounds completed the survey to accurately determine the burden placed on respondents and to test the clarity and reliability of the survey questions. Slight revisions were made to the order and wording of a small number of questions based on comments received from both of these reviews.

Modifications to the length, content, and structure of the survey were made based on the results of the pre-test interviews. Respondents provided generally positive feedback, indicating that they could readily answer the questions and that the time to complete the survey (approximately 20 minutes) was not onerous. The respondents also responded enthusiastically to the idea of a professional membership to offset the burden of participating in the survey and indicated that they would be likely to respond to the baseline and subsequent student surveys if presented with this offer.

B.5. Statistical Consultants

The information for this study is being collected by NORC, a research and consulting firm, on behalf of ONC. With ONC oversight, NORC is responsible for the study design, instrument development, data collection, analysis, and report preparation.

The instrument for this study and the plans for statistical analyses were developed by NORC. The staff team is composed of Dr. Kristina Lowell, Project Director; Dr. Carrie Markovitz, Senior Research Scientist; and a team of senior-level staff including Karen Grigorian. Contact information for these individuals is provided below.

Name                 Number
Kristina Lowell      301-634-9488
Carrie Markovitz     301-634-9388
Karen Grigorian      312-759-4025


