International Experiences with Technology in Education

Response to OMB Comments

OMB: 1875-0257


TO: Bridget Dooling

THROUGH: Stephanie Valentine

FROM: Marianne Bakia and Bob Murphy

SUBJECT: Response to OMB Questions

DATE: 2/3/21

The purpose of this memo is to provide responses to OMB questions and comments received on the OMB package for the International Experiences with Technology in Education (IETE) project. In the text that follows, each OMB question appears in bold text, followed by the response from project staff.

  1. Frequency of Collection. The Supporting Statement suggests that this is a one-time collection, but it also refers to an annual compendium (p. 4). Please clarify.

The Department seeks permission for a one-time collection. This survey instrument will be used only for a one-time information collection. However, one of the purposes of this project is to explore the feasibility and utility of a future annual compendium. Some of the data to be collected will address each country’s interest in participating in a future annual compendium as well as what data countries might have readily available for inclusion in such a compendium.

  2. Regarding this research question: “What national policies and systems are being used to guide effective ICT investments?” Please clarify how the Department will determine the effectiveness of ICT investments.

The Department recognizes the challenges associated with determining “effective” investments and has revised the question to read “What national policies and systems are being used to guide systematic ICT investments?” “Systematic investment” is defined as investment that is aligned with policy goals and supported by explicit planning processes. We will also look for investments that are sustained over time and for examples of evidence of both need and progress being used to guide them.

  3. Burden table (p. 13). Please check the math in the burden table, interview line, and amend the total as needed. The ROCIS burden table will also have to be updated before we conclude review on this ICR.

The burden table has been corrected.

  4. Could the Department clarify the rationale for the first two research questions (p. 4)?

The first two research questions in the OMB package are:

What international ICT indicators are currently being collected on an ongoing basis? What are the limitations of these data?

What national policies and systems are being used to guide effective ICT investments?



The purpose of the first question is to reduce redundancy and burden on potential respondents. By exploring the landscape of available data regarding international educational technology activities, the Department can focus on data that are not already being collected. With specific regard to the phrase “limitations of these data,” the Department is interested in learning whether comparable data are available across countries of interest and whether the data address U.S. educational priorities.

As described in the response to question 2 above, the Department has replaced the word “effective” with the word “systematic.” The Department is interested in the planning processes associated with systematic ICT investments for education. Examples of additional activities of interest represented by this question include inter-agency collaboration and public-private partnerships that leverage the expertise of a diverse set of stakeholders.

a) Specifically, how will the answers have direct relevance to Title II Part D or federal education technology policy? 

To paraphrase language already included in the package, the Department needs a clear understanding of how educational technologies are being used internationally in order to inform the domestic policy agenda. This collection will give the U.S. valuable information about where other countries stand in terms of U.S. EdTech priorities, thus enhancing the Department’s ability to measure international competition in this area. Specifically, the Department hopes to identify international policies and practices that inform the adoption of ICT in education relative to U.S. priorities in order to improve policies and programs domestically, including guidance for and implementation of the Title II Part D program.

b) The third research question seems particularly important, but also complex.  Did the Department consider narrowing the focus of the survey onto this question (and its subparts)? 

Question 3 in the OMB package: What set of ICT indicators will be most informative for U.S. policy and feasible to collect on an ongoing basis?

Yes, the Department narrowed this question to three main areas of interest. The three areas are:

  • Policy Priority 1. Improving Student Learning Through Enhanced Instruction

  • Policy Priority 2. Increasing Teacher Capacity to Teach

  • Policy Priority 3. Improving Schools Through the Use of Data Systems



These policy priorities are described more fully in Appendix A.

c) If not, is the Department concerned about the wide scope of this survey?

As described above in response to question 4b, the Department narrowed the scope of the survey. We are aware that the survey still covers a considerable breadth of material. We will use follow-up respondent interviews to focus on topics of particular importance.

  5. How involved was the education technology program staff at the Department in the development of the research questions?

Representatives from the Office of Educational Technology (OET) have been closely involved in both the initial conceptualization of the project and its ongoing implementation. For example, OET representatives attended the project kickoff and quarterly meetings as well as each of the two Technical Working Group (TWG) meetings held to date. In addition, they have provided comments on key project deliverables. Based on OMB’s comment, the newly appointed Director of the Office of Educational Technology reviewed the survey instrument and provided suggestions for revision. These suggestions, as well as other improvements, have been incorporated into the revised survey included with this response.

  6. Sample.

It is unclear to what question 6 refers.

a) On page 3, the Supporting Statement indicates that 30 countries will participate, but the rest of the document says 25.  Are the extra five countries for piloting the survey instrument?  Please clarify. 

As initially planned, 25 countries would participate in the survey, and an additional 5 countries would be highlighted in more in-depth case studies. The additional 5 countries were to be identified by the TWG as exhibiting exemplary or innovative practices. However, the TWG ended up recommending countries already included in the set of 25. Therefore, only 25 countries will be included in the survey.

b) What is the likelihood that the sample size will be large enough for conducting statistical analyses and generating meaningful results?

Sample size will not affect the planned analyses. Only descriptive statistics will be computed on the survey and interview data; the only statistical analyses planned are ranges, means, and perhaps modes. Country rankings are also planned.
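
To make the planned analyses concrete, the following is a minimal sketch in Python, using invented placeholder values (no survey data accompany this memo), of the kinds of summaries described above: range, mean, mode, and a simple country ranking.

    from statistics import mean, multimode

    # Hypothetical survey item: each country's count of national
    # online-learning initiatives (placeholder values for illustration only).
    initiatives = {
        "Australia": 4,
        "Chile": 2,
        "Denmark": 3,
        "Singapore": 5,
        "Sweden": 3,
    }

    values = list(initiatives.values())

    # Descriptive statistics named in the memo: range, mean, and mode(s).
    print("Range:", min(values), "to", max(values))
    print("Mean:", round(mean(values), 2))
    print("Mode(s):", multimode(values))

    # Country ranking (1 = most initiatives), also planned.
    ranked = sorted(initiatives.items(), key=lambda kv: kv[1], reverse=True)
    for rank, (country, count) in enumerate(ranked, start=1):
        print(f"{rank}. {country}: {count}")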

c) How did the Department determine the number of countries to be sampled?  Did the Department consider limiting the sample to the most relevant countries?  For example, the Conceptual Framework section notes that outcomes are “driven by the stage of reform that each country’s programs represent.”   Did the Department consider identifying the countries at a similar stage of reform as the U.S.?

The number of countries to be included in the sample was determined through an iterative process with guidance from TWG members who have direct knowledge of ongoing technology efforts within the countries selected. The Department limited the sample to the most relevant countries. Additional details about country selection are provided in Appendix B.

  7. Please clarify the survey development and research methodology (i.e., the interview and data collection process).

a) Will the project team use one data collection instrument or will it be customized for each country?

One survey will be used for all countries. It includes both closed- and open-ended items. Researchers will follow up with a customized interview protocol based on each country’s survey responses.

b) Is the purpose of the interviews to verify information and seek clarification, or will other questions be asked?

The purpose of the interview is to verify information and seek clarification. In addition, follow-up questions will be asked to gather additional data. For example, if a country indicates that it has a program that provides online courses, we will ask for information regarding the scope and scale of that initiative.

c) What is the plan for the five countries that will be selected for in-depth qualitative examination?

The TWG nominated the following 9 countries in relation to the three priorities identified by ED (see Appendix A for the policy priorities):

  • Technology-Enhanced Instruction: Australia, Singapore, United Kingdom (Scotland)

  • Increasing Teacher Capacity through Technology: Chile, Denmark, Portugal, Sweden

  • Use of Data to Support Continuous Improvement: South Korea, United Kingdom

Data for these case studies will be gathered during the interview process for selected countries and through additional literature searches.

d) How will these five countries be selected?

The key criteria for inclusion were (1) a reputation for innovativeness in ICT in education and (2) relevance to one of the three policy priorities. Researchers will review the literature in order to select country practices that are not already well documented and that appear to be truly exemplary or innovative.



  8. What is the likelihood that surveys will be completed in two weeks? What steps will be taken if surveys aren’t received within that deadline?

We are optimistic about countries’ willingness to respond to the survey in a timely manner. Our contacts were identified with extensive involvement of the TWG, many of whose members have professional relationships with the identified contacts. After the two-week window has passed, the research team will follow up with reminder emails and phone calls to the appropriate member of the Ministry. If a country is still nonresponsive, the research team will ask the Department and members of the TWG with contacts in the Ministry for assistance in discerning the reason for the Ministry’s lack of response.


Appendix A: U.S. Department of Education ICT in Education Policy Priorities

To help guide the selection of countries and the selection and development of indicators for the IETE project, the project team worked with the Department to develop a statement of the Department’s policy priorities for the use of ICT in K-12 education. The policy priorities identified by the Department include:

Policy Priority 1. Improving Student Learning Through Enhanced Instruction

Policy Priority 2. Increasing Teacher Capacity to Teach

Policy Priority 3. Improving Schools Through the Use of Data Systems

These are discussed more fully below.

Policy Priority 1. Improving Student Learning Through Enhanced Instruction: The focus of this priority is on policies, programs, or outcomes related to school or classroom use of technology to improve student access to high-quality instruction and to address individual differences in students’ ability to learn subject matter content. Research questions supporting this priority include:

  • How are technologies being used to improve student access to high-quality instruction (e.g., online learning, simulations, and other content for in-class use)?

  • How are ICT resources being used to address individual differences in abilities, learning styles, and learning difficulties among students?

  • How are technologies being used to prepare students for the competitive, highly productive workforce of the 21st century?

Indicators associated with these questions could include:

  • technology-supported student assessments (both innovative assessments, national and teacher-developed, and those used to diagnose student difficulties or learning styles)

  • content management systems that help guide teachers to particular instructional or content resources related to individual student needs

  • technology-supported content, including simulations and other content for in-class use

  • online or distance learning initiatives, enrollments, and delivery systems

  • open courseware initiatives


Policy Priority 2. Increasing Teacher Capacity to Teach: The focus of this priority is on policies, programs, and outcomes related to the use of technology to facilitate the professional development of teachers. The primary research question supporting this priority is:

How are technologies being used to increase teacher quality?

The focus is on the use of technology as a venue for professional development, not on the preparation of teachers to use technology per se. The Department’s interest is in the delivery of teacher professional development via technology, as opposed to face-to-face instruction on how to use ICTs or how to integrate technology into the classroom.


Policy Priority 3. Improving Schools Through the Use of Data Systems: The focus of this priority is on policies, programs, and outcomes related to the use of technology to support accountability, evaluation, and “continuous improvement” systems. The primary research question supporting this priority is:

How is technology being used by the education systems of other nations to track student progress and to create accountability systems for schools and teachers?

This theme is related to the first policy priority to the degree that both include student assessments and student data. The primary distinction is that policy priority 1 focuses on teacher-student interactions and classroom-based practices, while policy priority 3 focuses on administrative systems for accountability, evaluation, and “continuous improvement.”





Appendix B: Country Selection Process

The process to select countries for participation in the study involved input from both the Department and members of the project’s Technical Working Group (TWG). After a series of discussions with the Department and the TWG regarding preliminary criteria for country selection, consensus was reached that the countries selected should be economic competitors of the U.S. and have significant existing technology infrastructure to support the implementation of ICT in education. Geographical representation was not considered in the country selection process. An initial set of 24 countries was identified by the project team based on the top 30 rankings on both the Network Readiness Index and an index of labor productivity (2008 GDP per person employed).¹ This list was shared with the Department and the TWG prior to the second TWG meeting. TWG members were asked to review the list and approve, remove, or add countries based on their knowledge of a country’s ICT investments related to the three policy priorities.
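
The initial selection rule described above, membership in the top 30 on both indices, amounts to a simple set intersection. The following is a minimal sketch in Python, using hypothetical rank values rather than the full index data (the actual ranks appear in Exhibit 2 below):

    # Hypothetical rank data (1 = best); see Exhibit 2 for the actual values.
    nri_rank = {"Denmark": 1, "Sweden": 2, "Australia": 14, "Chile": 39, "Portugal": 30}
    lpi_rank = {"Denmark": 15, "Sweden": 12, "Australia": 11, "Chile": 38, "Portugal": 40}

    # Initial list: countries ranked in the top 30 on BOTH indices.
    initial_list = sorted(
        country for country in nri_rank
        if nri_rank[country] <= 30 and lpi_rank.get(country, 999) <= 30
    )
    print(initial_list)  # ['Australia', 'Denmark', 'Sweden']

Note that this rule produced only the starting list; as described below, the TWG then removed Luxembourg and added Chile and Portugal, both of which fall outside the top 30 on at least one index.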


All countries on the initial list were nominated for the final list by the TWG with the exception of Luxembourg, which was considered too small to be relevant for U.S. policy purposes. Luxembourg was subsequently dropped from the final list. In addition to the remaining 23 countries on the list, the TWG recommended adding Chile and Portugal. Chile was nominated due to its long history of ongoing investments in ICT in education, including its much-studied Enlaces network, a two-decade effort to integrate ICT throughout its educational system. A case was also made to include Portugal because of more recent investments in ICT, including the Magellan Project,² an ongoing government initiative to provide 500,000 laptops to students. (See Exhibit 2 below for the final country selection and rankings on key indicator data.)

Exhibit 2. Final Selection of Countries

LPI = Labor Productivity Index rank (1 = high); NRI = Network Readiness Index rank (1 = high). Math, Science, and Reading are PISA 2006 scores.

Country            LPI Rank   NRI Rank   Math   Science   Reading
Australia          11         14         520    527       513
Austria            16         16         505    511       490
Belgium            6          24         520    510       501
Canada             13         10         527    534       527
Chile              38         39         411    438       442
Denmark            15         1          513    496       494
Estonia            23         18         515    531       501
Finland            9          6          548    563       547
France             7          19         496    495       488
Germany            24         20         504    516       495
Hong Kong          2          12         547    542       536
Iceland            21         7          506    491       484
Ireland            5          23         501    508       517
Israel             22         25         442    454       439
Japan              19         17         523    531       498
Netherlands        18         9          531    525       507
New Zealand        30         22         522    530       521
Norway             8          8          490    487       484
Portugal           40         30         466    474       472
Singapore          17         4          ***    ***       ***
South Korea        26         11         547    522       556
Sweden             12         2          502    503       507
Switzerland        25         5          530    512       499
Taiwan             14         13         549    532       496
United Kingdom     10         15         495    515       495

*** Country did not participate in PISA 2006.





1 Network Readiness Index (NRI). The NRI, developed by the International Technologies Group at Harvard University (http://cyber.law.harvard.edu/itg/) and currently administered by the World Economic Forum, is a composite of three components: the environment for ICT offered by a given country or community, the readiness of the community’s key stakeholders (individuals, businesses, and governments) to use ICT, and the usage of ICT among these stakeholders. Data are derived from publicly available sources and from an executive opinion survey conducted by leading research institutes and business organizations within the countries under analysis. In total, data on 27 indicators, including utility patents, mobile phone use, and available bandwidth, are combined with 41 survey indicators to produce the overall Network Readiness Index score.

Labor Productivity Index (LPI). The LPI used is based on a country’s 2008 GDP per person employed, in 2008 U.S. dollars. The international data reported were taken from the Total Economy Database on Output and Labor Productivity maintained by The Conference Board (http://www.conference-board.org/economics/database.cfm#6).
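
As a rough illustration of the arithmetic behind these two measures, the following is a sketch in Python; the equal component weights are an assumption for illustration only, since the actual NRI weighting scheme is not reproduced in this memo.

    # Illustrative composite, NOT the actual NRI methodology: the real index
    # combines 27 hard-data indicators and 41 survey indicators. Equal weights
    # across the three component scores are assumed here for illustration.
    def composite_index(environment: float, readiness: float, usage: float) -> float:
        return (environment + readiness + usage) / 3.0

    # LPI arithmetic as described above: 2008 GDP per person employed.
    def labor_productivity(gdp_2008_usd: float, persons_employed: float) -> float:
        return gdp_2008_usd / persons_employed

    print(composite_index(5.2, 4.8, 5.0))      # hypothetical component scores
    print(labor_productivity(1.05e12, 4.5e6))  # hypothetical GDP and employment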



