
Evaluation of Evidence-Based Practices in Online Learning

Response to OMB comments

OMB: 1875-0251


To: Bernadette Adams Yates

From: Marianne Bakia and Karla Jones, SRI International

Date: 1/15/09


In this document, we provide written responses to OMB questions sent via email on January 9, 2009, by Bridget Dooling regarding the Evidence-based Practices in Online Learning submission. Each question from OMB is numbered and italicized. The response to each question immediately follows the question in indented text.


1. Under what statute(s) does ED propose to offer confidentiality assurances?


We have revised Supporting Statement Part A to include reference to the Privacy Act of 1974, P.L. 93-579, 5 U.S.C. 552a; the "Buckley Amendment," the Family Educational Rights and Privacy Act of 1974, 20 U.S.C. 1232g; the Freedom of Information Act, 5 U.S.C. 552; and related regulations, including but not limited to 41 CFR Part 1-1 and 45 CFR Part 5b and, as appropriate, the Federal common rule or the Department's final regulations on the protection of human research participants.



2. In the various confidentiality pledges, please change all references to "for statistical purposes" to "for study/research purposes" or something similar. We cannot approve a pledge that sounds too similar to that authorized under the Confidential Information Protection and Statistical Efficiency Act (http://www.whitehouse.gov/omb/fedreg/2007/061507_cipsea_guidance.pdf).


This information has been changed as requested in Supporting Statement Part A, the consent forms, and the notification letters. The revised Supporting Statement Part A is provided as a separate attachment to the email that transmitted this document, along with the other revised forms.



3. Are the focus groups with students proposed to occur during the school day?


Focus groups with students are currently planned to occur during the school day.



4. How were the burden estimates developed for each type of interview? Please provide the results of pretesting.


The burden estimates for each type of interview were based on our pretesting of the instruments. The results of our pretesting are included in Appendix A of this document.



5. Please provide additional detail than is currently provided by SS A16 regarding the specific types of tabulations, if any, the study will produce.


As described in the response to question 7 below, the contractor is conducting quantitative analysis in preparation for qualitative data collection during site visits. Qualitative data will be collected through interviews during site visits with course providers in order to document the online interventions included in the quantitative analysis. Data will be used to describe key characteristics of each intervention (as provided during developer interviews) and to document implementation fidelity across sites visited. The case studies will provide a much more in-depth understanding of how school staff support online learning, as well as of the policies and practices associated with local implementations. Similarities and differences will be analyzed across sites, and the contractor will report the percentages of schools exhibiting key features or characteristics.


Our approach to analyzing the qualitative data will be iterative, beginning before each site visit, continuing while on-site, and proceeding through the drafting of the internal case study reports and cross-site analysis. Before we conduct site visits, we will collect and review relevant documents (e.g., information about district performance and policies as well as school achievement and demographics served). During site visits, researchers will gather data and compare findings in the context of the study's conceptual framework. Two researchers will conduct each site visit, and throughout the visit the team will informally discuss their initial impressions about key features of how online learning is used at the school. More formally, the site visitors will meet each day of the visit to go through the case study debriefing form and formulate preliminary responses. Once each site visit is complete, site visitors will draft their case study reports. Drafting these reports requires the researchers to reduce their field notes to descriptive prose within the structure of a debriefing form associated with particular sections and questions on the interview protocol. Within each section, or major topic area, researchers will code for information on specific subtopics (e.g., local support for online students).


Because researchers will draw on information reported by a variety of respondents (e.g., both teachers and students), they will use the case study report to synthesize findings and note apparent contradictions. These reports are meant to facilitate cross-site analysis. Once the individual reports are completed, formal cross-site analysis will begin. The goal of the analysis is to compare, contrast, and synthesize findings and propositions from the single cases in order to make statements about the sample or segments of the sample.


This information will be important for exploring reasons for potentially different results across cohorts, subjects, and courses. In addition, these data will be useful for the research and development community so that findings from this study can be placed in context with future studies that may have similar or different results. Failure to include sufficient descriptive information has hindered attempts to aggregate studies through meta-analysis and other techniques.1


6. Please provide the meta-analysis report.


The meta-analysis report is provided as a separate attachment to the email that transmitted this document.


7. Please clarify how the study will determine sites that are "leading users of online learning,...through the analysis of student outcome data." Is the study attributing student outcomes to online learning in sites without experimental designs? What methods are being employed to make this determination?


The study has received extant data from the Florida Virtual School and is in the process of obtaining similar data from a district in Virginia. Analysis will compare course passing rates across sites that serve at least 20 students, looking for sites whose students achieve above-average outcomes relative to other sites serving online students. Logistic regression will be used to study course completion. Ordinary least squares regression will be used to examine grades and AP exam scores for those students who passed the course. The models will include covariates, including student demographics, a measure of prior student achievement, school characteristics, and course characteristics. Additional information about the quantitative analysis planned for these data sets is included in Appendix B.
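
To make the planned site-level comparison concrete, the following is a minimal illustrative sketch in Python (pandas). The file name and column names (site_id, student_id, passed) are hypothetical placeholders rather than the actual data elements, which are still being negotiated with the providers.

import pandas as pd

# Illustrative sketch only; the file name and column names (site_id,
# student_id, passed) are hypothetical placeholders.
df = pd.read_csv("virtual_school_enrollments.csv")

# Compare course passing rates across sites serving at least 20 online students,
# flagging sites with above-average passing rates. This unadjusted comparison is
# supplemented by the covariate-adjusted regressions described in Appendix B.
site_rates = (
    df.groupby("site_id")
      .agg(n_students=("student_id", "nunique"),
           passing_rate=("passed", "mean"))
      .query("n_students >= 20")
)
overall_rate = df["passed"].mean()
above_average_sites = site_rates[site_rates["passing_rate"] > overall_rate]
print(above_average_sites.sort_values("passing_rate", ascending=False))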


8. Please clarify what is meant in SS B1 by "statistical procedures for estimating the effects of particular characteristics associated with online learning across particular interventions." Is this referring to a method for determining which sites to select for case studies because they are deemed successful, or is this referring to a method to be used with data collected from the case studies?


We apologize, but we cannot find this phrase in either SS A or SS B. We believe that it refers to the meta-analysis work. No analysis of the effectiveness of particular characteristics will be performed, either for case study selection or with site visit data.




Appendix A: Results of Instrument Pretesting

In order to refine the protocols developed for site visits associated with the Evidence-based Practices in Online Learning Project, SRI pilot tested the interview questions with volunteers who have experience with K-12 online education. SRI conducted seven pilot testing interviews with developers, administrators, and teachers. The pilot test subjects included:

  • Michelle Roper. Federation of American Scientists. Developer Protocol.

  • Patricia Schank. SRI International. Sr. Cognitive Scientist. Developer Protocol.

  • Kenneth Kuczynski. Excel High School. Principal/Director. Administrator Protocol.

  • Harold Vietti. eScholar Academy. Principal/Executive Director. Administrator Protocol.

  • Mark Cruthers. Homeschool-teachers.net. Instructor Protocol.

  • Dave Harmeyer. Azusa Pacific University. Associate Professor, Librarian. Instructor Protocol.

  • Luanne Schanse. eScholar Academy. Instructor Protocol.

The pilot testing interviews lasted approximately 60 to 90 minutes. During each interview, the interviewer timed each section in order to approximate the length of time necessary to complete the interview. The pilot testers answered the interview questions, provided feedback on the structure and wording of the questions, and suggested additional topics and questions that would be potentially useful to address during the interview.

Results

The interviewers were unable to complete the Developer pilot testing interviews within 90 minutes. Ms. Schank suggested that the probes in Technology: Question 1 should be formalized, as they were slightly confusing when read aloud. An additional probe could be added to Technology: Question 2, to ask if instructors have the ability to supervise online activity while using the application. Ms. Roper indicated that we should revise our language slightly and use less formal terminology.

We found that the Administrator interview could be completed within 90 minutes, but not within 60 minutes. Mr. Vietti and Mr. Kuczynski both had suggestions for question revisions and additions. Mr. Kuczynski suggested that in Business/Staffing/Funding: Question 2, we should add a probe to ask about the course management system that the implementation site uses, if any. He also suggested that in the Enrollment section, it may be useful to probe for whether or not the implementation site has a process for creating specific learning plans for each student, and what this process entails. Mr. Vietti felt that in Supports and Barriers: Question 2, the term "support services" needs to be clarified and that we should provide examples of these services. Mr. Vietti also thought it might be interesting to ask how the implementation site tracks attendance and how the staff is composed (e.g., whether the staff includes a majority of retired teachers).

One Instructor Protocol interview was completed within 45 minutes; however, the other two interviews could not be completed within the allotted hour. Each instructor provided suggestions for additional information to probe for during our interviews. Mr. Cruthers suggested that we ask how hands-on demonstrations and labs are completed online. Mr. Harmeyer felt that we should ask how teachers deal with "problem students" (i.e., students who do not participate, students who participate too much, etc.). Ms. Schanse noticed that the protocol does not ask teachers whether they prefer online or offline courses, and that this might be useful to know. Mr. Harmeyer also felt that in the Supports/Barriers section, the term "academic supports" needed to be refined. He confused it with "technical support," and he indicated that examples would be helpful.

An overall finding is that the term "online application" was troublesome at times. For example, instructors affiliated with a virtual school do not use an online "application"; they teach an online course. The language of the protocol will be specifically tailored to the respondent before each interview begins. Some of the terminology was confusing, especially the phrase "support services"; this phrase will be clarified, and we will provide specific examples within the question. Finally, we will slightly modify the protocols so that some of the language is less formal and more colloquial.



Appendix B: Quantitative Analysis of Extant Data in Preparation for Case Studies

The contractor plans to analyze two cohorts of data, one for school year 2006-07 and another for school year 2007-08, for each of the two anticipated datasets. Two evaluation questions will be addressed. As described in the body of this document, the data will also be used to select sites at which to gather qualitative data regarding online learning as applied in each location.

Question 1: How do the characteristics of students taking online courses compare with students in the state as a whole? Are student characteristics different for a given course?

The first research question will be addressed using descriptive statistics. We will test the null hypothesis that populations are the same across platforms, regardless of subject or course. A profile of the virtual school students in the state will be presented next to that of students in equivalent face-to-face courses during the same time period (Exhibit 4). These profiles will be examined by platform, subject, and course. Statistical significance of differences across platforms will be assessed using a between-group t-test. If the contrast between virtual school students and face-to-face students is constant across courses within each subject area, then profiles will be presented by subject rather than by individual course, for a minimum of two different tables and a maximum of 11.
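
As an illustration of the significance testing described above, a minimal sketch follows (Python, using pandas and scipy). The file and column names are hypothetical placeholders, and the use of Welch's unequal-variance t-test is an assumption rather than a stated design decision.

import pandas as pd
from scipy import stats

# Illustrative sketch only; the file and column names are hypothetical placeholders.
df = pd.read_csv("student_profiles.csv")

online = df[df["platform"] == "online"]
face_to_face = df[df["platform"] == "face_to_face"]

# Between-group t-test on mean prior test scores across platforms
# (Welch's unequal-variance variant, an assumption rather than a stated choice).
t_stat, p_value = stats.ttest_ind(
    online["prior_test_score"].dropna(),
    face_to_face["prior_test_score"].dropna(),
    equal_var=False,
)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")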



Exhibit 4: Summary of Profiles to be Presented, by Subject/Course

2006-07: English (4 courses); AP English (2 courses); Math (4 courses); AP Math (1 course)

2007-08: English (4 courses); AP English (2 courses); Math (4 courses); AP Math (1 course)

While the specific data elements that FL will be able to provide are still being negotiated, the table shell below shows the broad categories that will be included in these profiles (Exhibit 5).



Exhibit 5. Table Shell: Characteristics of Students enrolled in Virtual School and Face-to-Face Courses (by Subject)


Columns: 2006-07 (Online, Face-to-face), 2007-08 (Online, Face-to-face), Total (Online, Face-to-face)

Rows:

  • Student Achievement (examples: mean grade level; mean test scores, prior year)

  • Student Demographics (examples: percent English language learners; percent female)

  • School Characteristics (examples: percent of students from charter schools; mean percent of students designated English language learners at source school)

Question 2: How do student outcomes such as course grades and course completion compare for online and conventional courses on the same topic, after controlling for student characteristics?


Comparison Group Definition: The student profiles developed under Question 1 will be used to guide the definition of the comparison group used to examine course outcomes for students enrolled in equivalent online and face-to-face courses. If the characteristics of students taking online courses are similar to those of students taking face-to-face courses, particularly in terms of prior achievement, then the comparison group will be defined as all students enrolled in equivalent face-to-face courses in the state during the same time period, allowing the analysis to be completed with the largest number of cases available. If not, then a comparison group will be created that is a subset of students enrolled in equivalent face-to-face courses in the state during the same time period with prior achievement similar to that of online course enrollees. The rationale behind defining the comparison group as a matched sample with similar prior achievement is that the relationship between student course performance and online format may vary by student ability level.
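
If a restricted comparison group proves necessary, one simple way to construct it is to keep only face-to-face enrollees whose prior achievement falls within the range observed among online enrollees. The sketch below (Python, pandas) illustrates that idea; the file name, column names, and the use of the 5th-95th percentile range as the matching window are hypothetical placeholders, since the actual matching rule has not been finalized.

import pandas as pd

# Illustrative sketch only; the actual matching rule has not been determined.
df = pd.read_csv("course_enrollments.csv")

online = df[df["online"] == 1]
face_to_face = df[df["online"] == 0]

# Keep face-to-face students whose prior achievement falls within the central
# (5th-95th percentile) range observed among online enrollees.
low, high = online["prior_achievement"].quantile([0.05, 0.95])
comparison_group = face_to_face[face_to_face["prior_achievement"].between(low, high)]
print(len(online), "online students;", len(comparison_group), "comparison students")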

Outcomes: The analysis will examine two primary outcomes, successful course completion (earning a passing grade) and course grades. For Advanced Placement courses in Florida, AP scores will also be studied for the subset of students who took the corresponding AP test.

Methods: Ordinary least squares regression will be used to examine grades and AP exam scores for those students who passed the course. Logistic regression will be used to study course completion.

Covariates: The same set of covariates will be used in both analyses. Several broad categories have been requested, though the specific variables that each state can provide have not yet been confirmed. Covariates in the models will include student demographics, such as gender; a measure of prior student achievement, such as GPA;2 characteristics of a student's source school, such as a measure of school performance; and course characteristics, such as honors designation.



Analysis for Mathematics Courses

Separate analyses will be performed for each of the five mathematics courses (four non-AP courses and AP Calculus AB) by cohort year. Two outcomes will be examined for each course (grades and course completion), and the additional outcome of AP exam score will be examined for AP Calculus AB, for a total of 22 analyses.

For the linear regressions, the functional form of the model will be:

Y = b0 + b1(ONLINE) + b2(HONORS) + … + bn(Xn) + e

where Y is the outcome of interest (course grade or AP exam score), ONLINE is a dummy variable coded 1 for online courses and 0 for face-to-face courses, and HONORS is a dummy variable coded 1 for honors courses and 0 for standard courses. Additional covariates will include student demographics, school characteristics, and a measure of prior student achievement. A unique teacher identifier linked to student course records is not available, so no teacher characteristics can be examined.



For logistic regression, the functional form is the same but the model estimates the log of the odds of successfully completing the course:

P(Y = 1 | X) = P, where X denotes the full set of predictors

log(P / (1 - P)) = b0 + b1(ONLINE) + b2(HONORS) + … + bn(Xn)

The coefficient of interest for the comparison is that of the variable indicating online course enrollment, with the following null hypothesis:

H0: Mean predicted achievement in online and face-to-face courses is equivalent, b1 = 0.

The null hypothesis will be tested using a t-test for linear regression and a Wald test for logistic regression.
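
To illustrate how these tests map onto standard regression output, the sketch below (Python, statsmodels) fits illustrative linear and logistic models and reads off the coefficient on the online indicator along with its t statistic (linear model) or Wald z statistic (logistic model). The file name, variable names, and abbreviated covariate list are hypothetical placeholders.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative sketch only; the file name, variable names, and abbreviated
# covariate list are hypothetical placeholders.
df = pd.read_csv("math_course_enrollments.csv")

# Linear model for course grade: the t statistic reported for `online`
# tests H0: b1 = 0.
ols_fit = smf.ols("grade ~ online + honors + prior_gpa", data=df).fit()
print("OLS:  b1 =", round(ols_fit.params["online"], 3),
      " t =", round(ols_fit.tvalues["online"], 2),
      " p =", round(ols_fit.pvalues["online"], 3))

# Logistic model for course completion: the z statistic reported for `online`
# is the Wald test of H0: b1 = 0, and exp(b1) is the corresponding odds ratio.
logit_fit = smf.logit("completed ~ online + honors + prior_gpa", data=df).fit()
print("Logit: OR =", round(float(np.exp(logit_fit.params["online"])), 2),
      " z =", round(logit_fit.tvalues["online"], 2),
      " p =", round(logit_fit.pvalues["online"], 3))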

The results of the regression will be presented in the form shown in Exhibit 6, below. If space permits, the results from the two years will be presented side-by-side for ease of comparison.



Exhibit 6. Table Shell: Regression Results for Florida High School Mathematics Courses (by Outcome and Course)

Columns: 2006-07 and 2007-08. For linear regressions, each year's columns report the beta coefficient, t statistic, and p value; for logistic regressions, the odds ratio and z statistic.

Rows (variables*):

  • Constant

  • Course characteristics: Online; Honors

  • Prior student achievement: Previous year GPA

  • Student demographics: Female; Highest level of parental education

  • School characteristics: Charter school; Private school; Performance index

*Specific covariates available not yet determined.



Analysis for English Courses

Separate analyses will be performed for each of the four non-AP English courses in Florida, by year. If enrollments in the two online AP English courses are small (fewer than 100 students each), the two courses will be grouped together rather than analyzed separately, and a dummy variable indicating course will be included. Two outcomes will be examined for each course (grades and course completion), and the additional outcome of AP exam score will be examined for the two AP English courses, for a total of 22 or 28 analyses, depending on whether the two AP courses are examined together or separately.

The functional forms of the models, the null hypotheses, and the tests used for English courses are identical to those described in the mathematics section.


List of Data Elements for Secondary Data Analysis

Evidence-Based Practices for Online Learning


Course Characteristics

Format (Online, face-to-face (FTF), blended)*

Subject (with enough detail to determine comparable courses)*

Credits*

Designation (honors, AP, remedial, college credit)*

Class Size (instructor:student ratio)

Instructional time (time on task)


Teacher Qualities

Instructor unique identifier

Instructor certification

Instructor training in online course delivery


Course Enrollment Information

Dates of enrollment (online: start and end; FTF: year and term)*

Course completion/withdrawal/failure*

Course grade*


Student Achievement

Grades in previous courses or Standardized test scores*

Previous course failure

Other assessments (must be same for online and FTF students; validity and reliability evidence must be available)


Student Characteristics

Gender*

Grade level*

Age*

Race/Ethnicity

Free/reduced lunch

English language learner/English as a Second Language/bilingual*

Special Ed/Individualized Education Program*

Absenteeism

Parental Income

Parental Education

Single-parent household


School Context

School type (public, private, homeschool, public charter or unique NCES identifier)*

% of students receiving free or reduced lunch

% of students English language learners

School code (unique identifier)*

School size


Online Practice Characteristics

Feedback (frequency of contact)

Type of computer-mediated communication (synchronous v. asynchronous with peers, instructor)

Learning experience type (expository, active, interactive)


Other Outcomes

AP scores / SAT II or ACT scores (date, subject and score)

College enrollment

Enrollment in subsequent higher-level courses



1 See, for example, Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., et al. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439, and CTL's recent meta-analysis submitted under a different task on the OLL project.

2 Because middle school grades (including GPA) are not available in FL, GPA cannot be used as a measure of prior achievement for English 1. The FCAT is taken in reading and math in grades 6, 7, 8, 9, and 10; writing is tested in grades 8 and 10 only. The most consistent measure of prior achievement available for all the focal courses is FCAT reading for English courses and FCAT math for math courses. Alternatively, we could look only at courses taken in the sophomore year and beyond so that we could use GPA.
