Supporting Statement Part A

Study of Education Data Systems and Decision Making

OMB: 1875-0241


A. Justification

1. Circumstances Making Collection of Information Necessary

Section 2404(b)(2) of Title II, Part D, of the Elementary and Secondary Education Act provides for the support of national technology activities, including technical assistance and dissemination of effective practices to state and local education agencies (SEAs and LEAs) that receive funds to support the use of technology to enhance student achievement. In addition, sections 2415(5) and 2416(b)(9) further call on states and districts to develop performance measurement systems as a tool for determining the effectiveness of uses of educational technology supported under Title IID. The Study of Education Data Systems and Decision Making will address both of these provisions by providing SEAs and LEAs with critical information on the effective use of technology to generate data that support instructional decision making.

New standards and accountability requirements have led states to reexamine the appropriate uses of technology to support learning and achievement. For some years now, researchers and developers have promoted technology systems that give students access to high-quality content and carefully designed instructional materials (Roschelle, Pea, Hoadley, Gordin, and Means, 2001). More recently, both technology developers and education policymakers have seen great promise in the use of technology for delivering assessments that can be used in classrooms (Means, Roschelle, Penuel, Sabelli, and Haertel, 2004; Means, 2006) and for making student data available to support instructional decision making (Cromey, 2000; Feldman and Tung, 2001; Light, Wexler, and Heinze, 2004; Mason, 2002; U.S. Department of Education, 2004). These strategies are not only allowable uses of ESEA Title IID funds but also of great interest to educational technology policymakers because of their potential to link technology use with changes in practice that can raise student achievement—thus helping to fulfill the primary goal of both Title IID (Enhancing Education Through Technology) and ESEA as a whole. The purpose of the present study is to provide further information exploring these potential links between the use of data housed in technology systems and school-level practices that support better teaching and learning.

While data are being collected on the range of database systems used to collect, manage, and analyze student data (Means, 2005; Wayman, Stringfield, and Yakimowski, 2004), much less is known about whether these systems are actually used in ways that support better teaching and learning at the school and classroom levels. The wide variety of system architectures and components, and the multiple ways in which educators have been encouraged to use them, suggest that a broad range of practices and supports is in place in different settings. However, survey questions used in the past have been at too large a grain size to capture important distinctions. Moreover, the field has limited knowledge about how educators can best use data from different kinds of assessments (e.g., annual state tests versus classroom-based assessments) to inform instruction. Further study is needed to determine how educators use data from these systems to inform their decision making and what kinds of supports they receive from their school districts.

Stringfield, Wayman, and Yakimowski-Srebnick (2005) have been studying these data systems for some time and believe that school-level data-driven decision making using such systems has great potential for improving student achievement but is not yet widely implemented in schools:

Formal research and our own observations indicate that being data-driven is a phenomenon that generally does not occur in the basic structure of the typical school. We believe that the absence of data-informed decision making is not due to educators’ aversion to being informed. Rather, the wealth of data potentially available in schools is often stored in ways that make it virtually inaccessible to teachers and principals and generally far from analysis-friendly. (p. 150)

Our conceptual framework suggests that in addition to making usable, system-wide data available through technology systems, schools and districts need to put in place processes for preparing educators to use these systems and to cultivate practices of inquiring about data within schools. Far more detailed, school-level studies of use of these systems are needed if we are to extract principles for designing systems and practices that support frequent and effective use (the six supports identified in our conceptual framework). This is the goal of the Study of Education Data Systems and Decision Making.

2. Use of Information

The data collected for this study will be used to inform educational policy and practice. More specifically, the data can be used:

  • By SEAs and LEAs to identify and develop policies that support promising practices in data-driven decision making, to develop and implement performance measurement systems that draw on high-quality education data, and to identify implementation practices associated with greater teacher use of student data in planning and executing instruction and greater teacher understanding of basic data interpretation concepts.

  • By ED staff to design outreach efforts to stimulate and enhance the use of data systems to improve instruction.

  • By researchers, who may use the data to inform future studies of data-driven decision making to improve instructional practice.

3. Use of Information Technology

The contractors will use a variety of advanced information technologies to maximize the efficiency and completeness of the information gathered for this evaluation and to minimize the burden the evaluation places on respondents at the district and school levels. For example, members of the study team will collect demographic and other descriptive data by accessing the websites and databases of case study districts and schools. This practice will significantly reduce the amount of information that will need to be gathered through interviews.

During the data collection period, an e-mail address will be available to permit respondents to contact the contractor with questions or requests for assistance. The e-mail address will be printed on all data collection instruments, along with the name and phone number of a member of the data collection team. (Additional uses of information technology, such as an online survey option, will be described in the subsequent OMB package.)

4. Efforts to Identify Duplication

The contractor is working with the U.S. Department of Education to minimize burden by excluding potential case study schools that are also part of other Department educational technology evaluations. Instrumentation will be coordinated across Department studies to prevent unnecessary duplication (e.g., no repeating of questions for which sufficient data are already available). In addition, teacher survey data collected through other Department-sponsored evaluation activities will be used to assess national trends in some aspects of data-driven decision making, and secondary sources have been identified for data on state-supported data systems, eliminating the need for state-level data collection. Additional information on these secondary sources is provided in the discussion of data collection procedures.

5. Methods to Minimize Burden on Small Entities

No small businesses or entities will be involved as respondents.

6. Consequences If Information Is Not Collected or Is Collected Less Frequently

Currently the Department is working to improve the quality of K-12 performance data provided by SEAs to assess the effectiveness of federally supported programs. EDFacts is an initiative of the Department, SEAs, and other members of the education community to use state-reported performance data to support planning, policy, and program management at the federal, state, and local levels. Since much of the data that states gather on program effectiveness comes from the local level, it is important that local performance measurement systems collect high-quality data. No Child Left Behind also cites the importance of using data to improve the instruction that students receive, but many of the efforts to date have focused on improving state data systems without comparable attention to making the data in those systems available and meaningful to the school staff who make instructional decisions. Improved use of data at the local level will not only improve the data that the Department receives from states but can also improve teaching and learning by giving teachers the information they need to guide their instructional practices.

To address the need for high-quality, actionable data at the local level, the Study of Education Data Systems and Decision Making will provide in-depth information in several key issue areas: the availability of education data systems, their characteristics, the prevalence and nature of their use in districts and schools, and the conditions and practices associated with effective data usage. The study will also provide more detailed evidence-based information on implementation of data-driven decision making than has previously been available at the local level.

If the data proposed for this study are not collected, there will be no current national data collection on the availability of education data systems at the local level or on the nature of their use in districts and schools. There will also be only limited information on classroom and other school-level use of data. (Several investigations are exploring data use in individual schools, but no other study is producing cross-case analyses.) The result will be a continued gap in information on how schools use data to inform instructional decision making, the barriers to effective data use, and the supports staff need to employ data in their teaching in ways that can positively affect student achievement.

7. Special Circumstances

None of the special circumstances listed apply to this data collection.

8. Federal Register Comments and Persons Consulted Outside the Agency

A notice about the study will be published in the Federal Register when this package is submitted to provide the opportunity for public comment. In addition, throughout the course of this study, the study contractor, SRI International (SRI), will draw on the experience and expertise of a technical working group (TWG) that provides a diverse range of experience and perspectives, including representatives from the district and state levels, as well as researchers with expertise in relevant methodological and content areas. The members of this group and their affiliations are listed in Exhibit 3. The first meeting of the technical working group was held on January 26, 2006.

Exhibit 3
Technical Working Group Membership

Member                   Affiliation
Marty Daybell            Washington State Education Authority
Dr. Jeff Fouts           Fouts & Associates
Aimee Guidera            National Center for Educational Accountability
Jackie Lain              Standard and Poor’s School Evaluation Services
Dr. Glynn Ligon          ESP Solutions
Dr. Ellen Mandinach      Center for Children and Technology
Dr. Jim Pellegrino       University of Illinois-Chicago
Dr. Arie van der Ploeg   North Central Regional Education Laboratory (NCREL)/Learning Points
Dr. Jeff Wayman          University of Texas, Austin
Katherine Conoly         Executive Director of Instructional Support, Corpus Christi Independent School District



9. Respondent Payments or Gifts

No payments or gifts will be provided to respondents, but schools will be reimbursed for the cost of substitute teachers if substitutes are required to enable teachers to participate in a focus group or individual interview.

10. Assurances of Confidentiality

SRI is dedicated to maintaining the confidentiality of participant information and the protection of human subjects. SRI recognizes the following minimum rights of every subject in the study: (1) the right to an accurate representation of the right to privacy, (2) the right to informed consent, and (3) the right to refuse participation at any point during the study. Because much of the Policy Division’s education research involves collecting data about children or students, we are very familiar with the Department’s regulation on protection of human subjects of research. In addition, SRI maintains its own Institutional Review Board. All proposals for studies in which human subjects might be used are reviewed by SRI’s Human Subjects Committee, appointed by the President and Chief Executive Officer. For consideration by the reviewing committee, proposals must include information on the nature of the research and its purpose; anticipated results; the subjects involved and any risks to subjects, including sources of substantial stress or discomfort; and the safeguards to be taken against any risks described.

SRI project staff have extensive experience collecting information and maintaining the confidentiality, security, and integrity of interview and survey data. In accordance with SRI’s institutional policies, privacy and data protection procedures will be in place. These standards and procedures for case study data are summarized below.

  • Project team members will be educated about the privacy assurances given to respondents and about the sensitive nature of the materials and data to be handled. Each person assigned to the study will be cautioned not to discuss confidential data.

  • Respondents’ names and addresses will be disassociated from the data as they are entered into the database and will be used for data collection purposes only. As information is gathered on individuals or sites, each will be assigned a unique identification number, which will be used on printout listings that display the data and in analysis files. The unique identification number also will be used for data linkage. (A brief illustration of this identification-number step appears after this list.)

  • Participants will be informed of the purposes of the data collection and the uses that may be made of the data collected. All case study respondents will be asked to sign an informed consent form (see draft in Appendix A).

  • Access to the database and case study notes will be limited to authorized project members only; no others will be authorized such access. Multilevel user codes will be used, and entry passwords will be changed frequently.

  • All identifiable data (e.g., interview notes) will be shredded as soon as the need for this hard copy no longer exists.

  • Reports to the Department or any employee of the Department concerning school data-driven decision making (DDDM) activities will contain no individual, school, or district identifiers. The reports prepared for this study will summarize findings across the samples and will not associate responses with a specific district, school, or individual. Information that identifies a school or district will not be provided to anyone outside the study team, except as required by law. Participating schools will be acknowledged in the final report for their cooperation, but they will not be identified in the text of any report unless particular practices are highlighted, in which case permission will be obtained from the school principal before the information is included in reporting.
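
The identification-number step described in the list above can be sketched briefly. The following is a minimal illustration only, not the study’s actual data-management code; the record layout, field names, and ID format are hypothetical.

```python
# Minimal sketch of the de-identification step described above
# (hypothetical field names; not the study's actual data-management code).
import itertools


def assign_study_ids(respondents):
    """Replace names/addresses with sequential study IDs, returning
    (analysis_records, linkage_table) so identifiers are stored separately."""
    counter = itertools.count(start=1)
    linkage = {}           # study_id -> identifying info, kept under restricted access
    analysis_records = []  # de-identified records used for printouts and analysis files
    for person in respondents:
        study_id = f"R{next(counter):04d}"
        linkage[study_id] = {"name": person["name"], "address": person["address"]}
        analysis_records.append({
            "study_id": study_id,
            "role": person["role"],
            "site": person["site_code"],
        })
    return analysis_records, linkage


if __name__ == "__main__":
    sample = [
        {"name": "Jane Doe", "address": "123 Main St", "role": "teacher", "site_code": "S01"},
        {"name": "John Roe", "address": "456 Oak Ave", "role": "principal", "site_code": "S01"},
    ]
    records, linkage = assign_study_ids(sample)
    print(records)  # contains only study IDs, roles, and site codes
```

Keeping the linkage table separate from the analysis records, under restricted access, is what allows names and addresses to be disassociated from the data while still permitting data linkage.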

The additional steps taken to ensure the privacy of survey data will be included in the subsequent OMB package for the district survey.

All case study participants will be assured of privacy to the extent possible in the initial invitation to participate in the study (see drafts of notification letters in Appendix B), and this assurance will be reiterated at the time data collection begins (i.e., when each respondent is presented with an informed consent form). While most of the information in the final report will be reported in aggregate form, as noted above, there may be instances where specific examples from the case study data are used to illustrate implementation practices related to education data systems and decision making. In these instances, additional permission will be obtained from the principal, and the specific report text will be reviewed by school staff prior to publication. This approach is frequently used in developing the Department’s technical assistance materials.

11. Questions of a Sensitive Nature

No questions of a sensitive nature will be included in the site visit protocols.

12. Estimate of Hour Burden

The estimates in Exhibit 4 reflect the burden for school selection and notification of study participants, as well as the case study data collection activity.

  • District personnel: time associated with reviewing study information and, if required, reviewing study proposals submitted to the district research committee; preparing a list of schools that are active data users (e.g., marking up an existing list of schools); and asking questions about the study and answering questions about the district’s use of data systems.

  • School personnel: time associated with asking questions about the study and answering interview questions.

Exhibit 4
Estimated Burden for Site Selection and Notification

Group                Participants                            Total No.   No. of Hours per Participant   Total No. of Hours   Estimated Burden
District Personnel   Superintendent (notification)                  10                            0.5                    5               $200
                     District staff (interviews)                    30                            1.0                   30             $1,200
School Personnel     School principal (notification)                30                            0.5                   15               $600
                     School principal (interview)                   30                            1.0                   30             $1,200
                     Teachers (interviews & focus groups)          360                            1.0                  360            $14,400
Total                                                              460                                                 440            $17,600


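The totals in Exhibit 4 follow from multiplying the number of participants in each row by the hours per participant, with respondent time valued at an implied rate of $40 per hour (e.g., $200 for 5 hours). A minimal sketch of that arithmetic is shown below; the rows are copied from the exhibit, and the code is illustrative only.

```python
# Sketch verifying the Exhibit 4 totals (rows copied from the exhibit;
# the $40/hour rate is implied by each row's burden divided by its hours).
HOURLY_RATE = 40.0

rows = [
    # (participants, count, hours per participant)
    ("Superintendent (notification)",        10, 0.5),
    ("District staff (interviews)",           30, 1.0),
    ("School principal (notification)",       30, 0.5),
    ("School principal (interview)",          30, 1.0),
    ("Teachers (interviews & focus groups)", 360, 1.0),
]

total_respondents = sum(count for _, count, _ in rows)
total_hours = sum(count * hours for _, count, hours in rows)
total_burden = total_hours * HOURLY_RATE

print(total_respondents)         # 460
print(total_hours)               # 440.0
print(f"${total_burden:,.0f}")   # $17,600
```
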

13. Estimate of Cost Burden to Respondents

There are no additional respondent costs associated with this data collection other than the burden estimated in item A12.



14. Estimate of Annual Costs to the Federal Government

The annual costs to the federal government for this study, as specified in the contract, are:

Fiscal year 2005      $272,698
Fiscal year 2006      $849,532
Fiscal year 2007      $242,696
Total               $1,364,926



15. Change in Annual Reporting Burden

This request is for a new information collection.

16. Plans for Tabulation and Publication of Results

During the summer of 2006, the data collection team will work with the secondary analysis team to analyze data from the National Educational Technology Trends Study (NETTS) 2005-06 teacher survey. Descriptive statistics will be calculated to show the proportion of teachers nationwide who are aware of their access to state student data systems and the support teachers have had for DDDM. Although an evaluation report is not required for the first year of the study, a policy brief will be prepared on the secondary analyses of the NETTS teacher survey data.
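
As an illustration of the descriptive tabulation described above, the sketch below computes a weighted proportion from teacher survey records. The variable names and weight column are hypothetical; they do not represent the actual NETTS data structure.

```python
# Illustrative sketch of a weighted proportion from teacher survey records
# (hypothetical column names; not the actual NETTS file layout).
import pandas as pd


def weighted_proportion(df, indicator_col, weight_col):
    """Share of the weighted teacher population with indicator == 1."""
    valid = df.dropna(subset=[indicator_col, weight_col])
    return (valid[indicator_col] * valid[weight_col]).sum() / valid[weight_col].sum()


if __name__ == "__main__":
    teachers = pd.DataFrame({
        "aware_of_state_data_system": [1, 0, 1, 1, 0],   # hypothetical survey item
        "sampling_weight": [120.5, 98.0, 110.2, 150.7, 87.3],
    })
    share = weighted_proportion(teachers, "aware_of_state_data_system", "sampling_weight")
    print(f"{share:.1%} of teachers (weighted) report access to a state data system")
```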

In 2007, analysis activities will focus on the case study data. These data will provide a more in-depth look at the kinds of systems that are available to support district and school data-driven decision making and the supports for school use of data systems to inform instruction. We will analyze teacher focus group and interview data on these topics. A separate analysis will examine responses to the portion of the teacher case study interviews in which teachers respond to hypothetical data sets. We will develop and apply rubrics for judging the quality of the inferences teachers make from data (a brief illustration of such rubric scoring follows the list of questions below). The school-level and teacher-level data will be reported in an interim report to be drafted by October 2007 to answer the following evaluation questions:

  • To what extent do teachers use data systems in planning and implementing instruction?

  • How are school staff using data systems?

  • How does school staff’s use of data systems influence instruction?
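
The sketch below gives a minimal illustration of the rubric-scoring approach mentioned above for teacher responses to hypothetical data sets. The rubric levels and coding criteria shown are placeholders, since the study’s actual rubrics will be developed during analysis.

```python
# Hypothetical rubric for scoring teacher inferences from the data scenarios
# (placeholder criteria; the study's actual rubrics will be developed later).
RUBRIC = {
    0: "No inference, or inference contradicts the data shown",
    1: "Inference restates a single data point without interpretation",
    2: "Inference is supported by the data but ignores relevant comparisons",
    3: "Inference is supported by the data and considers relevant comparisons",
}


def score_response(features):
    """Map coded features of a teacher's response to a rubric level."""
    if not features.get("makes_inference") or not features.get("consistent_with_data"):
        return 0
    if not features.get("interprets_beyond_single_value"):
        return 1
    if features.get("uses_relevant_comparisons"):
        return 3
    return 2


example = {
    "makes_inference": True,
    "consistent_with_data": True,
    "interprets_beyond_single_value": True,
    "uses_relevant_comparisons": False,
}
level = score_response(example)
print(level, "-", RUBRIC[level])  # 2 - Inference is supported by the data but ignores relevant comparisons
```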

In 2008, analysis activities will focus on data from a district survey (a separate OMB package to be submitted at a later date). These analyses, along with prior analyses of data collected at the state, district, and teacher levels, will be combined in a final report to be drafted by April 2008. The final report will focus on the following additional evaluation questions:

  • What kinds of systems are available to support district and school data-driven decision making (prevalence of system access, the student information the systems contain, and the functions they provide)?

  • Within these systems, how prevalent are tools for generating and acting on data?

  • How prevalent are state and district supports for school use of data systems to inform instruction?

Exhibit 5 outlines the report dissemination schedule.


Exhibit 5
Schedule for Dissemination of Study Results

Activity/Deliverable                      Due Date
Interim Report
   Outline of interim report              8/07
   First draft of interim report          10/07
   Second draft of interim report         12/07
   Third draft of interim report          2/08
   Final version of interim report        3/08
Final Report
   Outline of final report                2/08
   First draft of final report            4/08
   Second draft of final report           6/08
   Third draft of final report            7/08
   Final version of final report          9/08



Data Analysis

The Study of Education Data Systems and Decision Making will analyze quantitative data gathered through surveys and qualitative data gathered through case studies. Survey data will include a 2006-07 district survey of a nationally representative sample of about 500 districts and data from the 2005-06 NETTS teacher survey. Qualitative data will be collected through case studies of 15 schools conducted during the 2006-07 school year.

The evaluation will integrate findings derived from the quantitative and qualitative data collection activities. Results from the case studies may help to explain survey findings, and the survey data might be used to test hypotheses derived from the case studies. In this section, we present our approaches to analysis of the qualitative data; a description of our approach to analysis of quantitative data will be presented with the OMB package for the district survey.

Case Study Analyses

School site visits will be conducted in 30 schools in the 2006-07 school year. In addition to informing the development of the district survey, the case studies will provide a much more in-depth understanding of how school staff use data systems and of the influence that their interpretations of the data are having on instruction. The case studies will provide detailed descriptions of the practices of schools, teacher teams, and individual teachers at a level of detail that would not be possible through survey data. Our approach to analyzing the qualitative data is iterative—one that begins before each site visit, continues while on-site, and proceeds through the drafting of internal case study reports to cross-site analysis.

Before we conduct the site visits, we will collect and review relevant documents (e.g., information on state and district data management systems and any commercial software used). The analytic process will begin as we use these documents in conjunction with our conceptual framework to catalogue the major data-driven decision making activities at each site. We will review available documents to capture specific information—for example, the relationship between the state and district data management systems, the length of time data-driven decision making has been in place, and the tools used for generating decision-relevant data.

Analysis will continue during the site visits as researchers gather data and compare findings with the study’s conceptual framework. Two researchers will conduct each site visit, and throughout the visit, the team will informally discuss their initial impressions about key features of the data-driven decision-making process and the degree to which the emerging story matches or contradicts study hypotheses. More formally, the site visitors will meet each day of the visit to go through the case study debriefing form and formulate preliminary responses. For example, the debriefing form might call for site visitors to characterize the nature of school data-informed decision making activity. In attempting to report on this topic, site visitors will discuss what they learned in their interviews and, if necessary, fill in any gaps and examine initial hypotheses in subsequent interviews. Site visitors also will discuss any emerging themes that had not been anticipated when data collection protocols were developed. Engaging in this analytic process while on-site serves to tailor and refine data collection to capture the most important features of the data-driven decision-making process. It also allows researchers to generate and test hypotheses while still in the field.

Once each visit is completed, site visitors will draft their case study reports. Drafting such reports requires the researchers to reduce their field notes to descriptive prose within the structure of a formal debriefing form. This translation of field notes to a case study report involves sorting all the data collected in each site (interviews, observations, and document reviews) by the topic areas that define the sections of the debriefing form (e.g., data system use, assistance provided to the school). Within each section—or major topic area—the researchers will code for information on specific subtopics (e.g., support for data analysis). The researchers then will use the sorted data to draft each section of the case study report. Because the researchers will draw on information reported from a variety of respondents (e.g., both teacher and principal perspectives will be presented), they will use the case study report to synthesize their findings and note apparent contradictions. As they translate their field notes into the case study report, researchers will use specific examples and quotes as evidence for their assertions. Case reports will contain a level of detail adequate to permit analysts to classify types of support and decision-making activities at a later time. Distilling field notes into a case study report in this way serves three purposes. First, it reduces significantly the amount of data we must manage for further analysis. Second, it establishes a consistent within-case analytic process across sites. Third, it anticipates the cross-site analysis by ensuring that each pair of researchers addresses the topics we expect to focus on as we look across sites.

The case study reports, structured by the debriefing form, are meant to facilitate cross-site analysis. Once the individual reports are completed, formal cross-site analysis will begin. The goal of the analysis is to compare, contrast, and synthesize findings and propositions from the single cases to make statements about the sample or segments of the sample (e.g., small rural districts and schools). We will begin the cross-site analysis process with a series of debriefing meetings. Debriefings of this type are an efficient means of developing themes for cross-site analyses. Individual researchers, assigned to specific topics, then will conduct more fine-grained analyses and report back to the larger group before we begin the process of integrating findings across data sources.
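
The sorting of coded case study material described above can be illustrated with a short sketch that groups excerpts by debriefing-form topic and subtopic to support cross-site comparison. The topic labels, site names, and excerpt structure are hypothetical.

```python
# Illustrative sketch of sorting coded case study excerpts by debriefing-form
# topic and subtopic for cross-site comparison (hypothetical codes and data).
from collections import defaultdict


def group_excerpts(excerpts):
    """Nest excerpts as {topic: {subtopic: [(site, text), ...]}}."""
    grouped = defaultdict(lambda: defaultdict(list))
    for e in excerpts:
        grouped[e["topic"]][e["subtopic"]].append((e["site"], e["text"]))
    return grouped


excerpts = [
    {"site": "School A", "topic": "data system use",
     "subtopic": "support for data analysis",
     "text": "Coach meets weekly with grade-level teams to review benchmark results."},
    {"site": "School B", "topic": "data system use",
     "subtopic": "support for data analysis",
     "text": "Teachers run their own reports but have no structured time to discuss them."},
]

for topic, subtopics in group_excerpts(excerpts).items():
    print(topic)
    for subtopic, items in subtopics.items():
        print(" ", subtopic)
        for site, text in items:
            print("   -", site + ":", text)
```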

17. OMB Expiration Date

All data collection instruments will include the OMB expiration date.

18. Exceptions to Certification Statement

No exceptions are requested.

