60D Notice Comment Response Summary


International Resource Information System (IRIS)


OMB: 1840-0759




Sheet 1: Sheet1

Columns: Comment | Commenter | Status | Resolution
There should be no technological barriers to allowing CIBER centers to continuously update reports with major events throughout the year, rather than forcing everyone to wait until report time to enter all the data. If data must still be updated only at set times due to regulatory restrictions, making it possible to preload data into a spreadsheet that matches a set template so that data doesn't have to be entered cell by cell would make reporting easier and faster for awardees and would significantly reduce transcription errors. James Hoadley Resolved. No change at this time. Thank you for your comments. According to the IRIS database administrator, there is no technological barrier to allowing any grantee user to enter and continuously update performance data throughout the performance period. There is an option to “save” without actually submitting the report. A user can return as many times as necessary before the report is submitted to update or add to what has already been completed. The information ultimately submitted must reflect the budget period covered by the report, but there are no restrictions on when the information may be entered or how many times it may be updated before submittal. If a user encounters technical difficulties while trying to enter and/or update performance data, that user is encouraged to contact the IRIS Help Desk. Your comment regarding the use of spreadsheet templates to upload performance data is one of many submitted on this subject. Currently, NRC grantees use such templates to submit course lists in IRIS. IFLE is exploring the possibility of allowing CSV uploads of spreadsheets for some additional programs when reporting certain data in IRIS. Due to time constraints, this process cannot be completed before this OMB clearance is approved, but IFLE hopes to make these new templates available to users within 12 to 18 months.
Having been involved in the CIBER program I have seen firsthand the significant impact that these programs have had on the thoughts and actions of our constituencies. I think careful measurement is an important tool in any program and regular review of the measurement categories and the collection tools is critical. I want to encourage the review committee to consider the challenges of certain "hard" measures in this type of programming environment. For small businesses, changing the attitude and then changing behavior comes long before the change in export performance. The same can be said for our work with Minority-Serving Institutions and Community Colleges. Sometimes a small change in the present allows a business, person or institution to take a very different path with extraordinary results in its future. LaVonne Schlegel Resolved. Thank you for your comments. Many individuals and offices within the U.S. Department of Education have been concerned about and involved in efforts to effectively evaluate IFLE programs. The development and revision of new GPRA measures for many IFLE programs took over a year, and the measures were finally approved by OMB in August 2012. In addition, IFLE is currently administering a contract to suggest new and innovative ways to assess the performance of IFLE programs. IFLE is sensitive to the nuances involved in trying to appropriately gauge and report on the success of a project or a program. However, one of the chief criticisms aimed at previous IFLE measures and reporting techniques was the lack of “hard”, quantifiable data to demonstrate the effectiveness of IFLE programs. In this latest information collection, efforts have been made to collect more performance data in quantifiable formats to address this issue. In response to several comments received for this information collection, a text box has been added to the PMF so that grantees can add narrative to explain or supplement the numerical data.
For the Undergraduate International Studies and Foreign Language Program (UISFL), I am surprised not to see any request for information about student impact, i.e., the enrollments in courses added or revised with Title VI funding. Curiously, information about participants is requested in the screen for study abroad, which is not as much a priority activity as home-campus course development, in such a domestic funding program. Ann Imlah Schneider Resolved. Thank you for your feedback. We have experienced difficulties in collecting student impact data, since data of this kind is not easily obtained during the required reporting periods. For the purposes of more easily demonstrating the effectiveness of our programs, we are attempting to collect as much data as possible via data elements, which are more easily extracted and quantified. However, we recognize the need of UISFL grantees to submit additional student impact data, and we have added a comment box to the Performance Measure Form to provide this opportunity.
The following two comments are from a group of six international educators with a combined experience of over 35 years working in Title VI National Resource Centers at the University of Pittsburgh. In the section for FLAS Language Instructors, a question was added to collect an ILR-equivalent score for each fellow. As some FLAS instructors will have experience with the ILR and/or ACTFL OPI rating scale and others will not, we suggest there be a link from this screen to the descriptions of ILR levels so that FLAS instructors can understand each level's appropriate skills to better describe each fellow's ability in the language. Jennifer Creamer, University of Pittsburgh Resolved. Thank you for your comments. We agree that this would be helpful to users. We will insert a link on the FLAS Language Instructor screen to direct users to the ILR website for the descriptions of the ILR Speaking Proficiency Levels.
In the performance measure reporting section, there is only space in the form to enter quantitative data to describe program success. We suggest adding a comment box so that program administrators can add some narrative which will offer some qualitative explanation of the data in addition to the numerical data. Jennifer Creamer, University of Pittsburgh Resolved. Thank you for your comments. We agree that this addition would be useful, and we have added a comment box to the Performance Measure Form to provide this opportunity.
The former president of the American Association of Teachers of Japanese, Dr. Patricia Wetzel at Portland State University, wrote a paper on OPI and Japanese. In the paper, she says "students should progress a maximum of one [OPI] level per "year" [Wetzel's emphasis] in Japanese study." She is talking about 10 OPI levels, which include Distinguished. Because students can progress maximally only one OPI level a year, she says even OPI is not that useful for evaluation at midpoints in an academic year. As the ILR scale is less granular than OPI, the logical conclusion is that we won't be able to properly gauge the progress of languages, such as Japanese, if we switch to ILR. Etsuyo Yuasa, Director of the East Asian Studies Center at The Ohio State University Resolved. No change. Thank you for your comments. We are aware that using the ILR as a measurement of a student’s language aptitude or proficiency based on the contact hours of instruction is inconsistent across FLAS languages. The challenges that you have brought to our attention, however, do not materially affect the information IFLE is collecting via IRIS. IRIS has its own language proficiency questionnaire that instructors use to evaluate fellows’ language proficiency pre- and post-fellowship. It is still our intention to use the scores from this questionnaire to evaluate fellows and to report data for GPRA measures. The addition of the “ILR equivalent” score was due to concern that fellows would not have a universally understood and accepted language proficiency score after their fellowships. We collect the “ILR equivalent” scores for this purpose, but they are not used to measure a fellow’s progress in relation to the grant award. I hope this addresses your concerns.
In a study done by the Foreign Service Institute, reproduced in Omaggio-Hadley's methods textbook (2001), it takes 240 contact hours for average-aptitude students in group 3 languages (Russian) to progress from Intermediate-Low/Intermediate-Mid to Advanced-Low/Advanced-Mid. Our academic year offers 120 contact hours in the third year and 90 for the fourth year. The OPI has the option to record half-levels of advancement, which is the result of 120 hours, while the ILR does not. The number of contact hours from Advanced-Low to Advanced-High is even larger (600). This illustrates that the ILR measurement is an ineffective tool to assess the progress of students in the Intermediate-High to Advanced-High levels in Russian. Yana Hashamova, Director of the Center for Slavic and East European Studies at The Ohio State University Resolved. No change. Thank you for your comments. We are aware that using the ILR as a measurement of a student’s language aptitude or proficiency based on the contact hours of instruction is inconsistent across FLAS languages. The challenges that you have brought to our attention, however, do not materially affect the information IFLE is collecting via IRIS. IRIS has its own language proficiency questionnaire that instructors use to evaluate fellows’ language proficiency pre- and post-fellowship. It is still our intention to use the scores from this questionnaire to evaluate fellows and to report data for GPRA measures. The addition of the “ILR equivalent” score was due to concern among some in the Department that fellows would not have a universally understood and accepted language proficiency score after their fellowships. We collect the “ILR equivalent” scores for this purpose, but they are not used to measure a fellow’s progress in relation to the grant award. I hope this addresses your concerns.
For the National Resource Centers (NRCs), the addition of a question about whether a major, minor, or certificate program is new could be taken as a distraction from the more important questions about what is continued and the extent of student participation. In this era of consolidation and cutbacks, maintaining and strengthening existing degree programs (and related course offerings) seem crucial for a program whose strengths are its continuity and breadth of offerings. I am glad to see continuation of the screens for reporting degrees awarded by discipline, and for career plan information. Ann Imlah Schneider Resolved. No change. Thank you for your comments. Maintaining and strengthening existing degree programs and related course offerings are certainly crucial for NRCs. However, of the NRC GPRA measures recently cleared by OMB, two speak directly to increasing both courses and degree programs: 1. Percentage of NRCs that increased the number of intermediate or advanced level language courses in the priority and/or LCTLs during the course of the grant period (long-term measure). 2. Percentage of NRCs that increased the number of certificate, minor, or major degree programs in the priority and/or LCTLs, area studies, or international studies during the course of the 4-year grant period. The addition of the question relating to new major, minor and certificate programs is meant to collect data that will respond to the second GPRA measure above. Some in the Department are concerned that grantees may be using federal funding only to maintain capacity, and not to develop new initiatives and expand programs. This concern resulted in the development of the two GPRA measures cited above, and in the questions in IRIS that collect the related data, to dispel the perception that NRC program grantees aspire to be “timeless” rather than responsive to the global needs of the current time.
We do not believe that the proposed revisions will result in any reduction of the reporting burden for National Resource Centers (NRCs) and CIBEs. The estimate of 13 hours per response is considerably less than the actual time required to fill in the report data and considerably less than the time it takes to gather and evaluate the data before filling in the forms. Our Centers have to start working on reports weeks in advance in order to compile, analyze and enter data. Name: Anonymous Submitter's Representative: Ann Biersteker Organization: Michigan State University NRC and FLAS Centers Resolved. We agree with the commenter that 13 hours does not accurately reflect the time it takes for NRC and CIBE grantees to complete the online performance report, and therefore, we have increased the estimated burden to 100 hours.  The revised estimated burden includes the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. It does not take into consideration the time it takes for respondents to evaluate or analyze their data, as these tasks are excluded from the Paperwork Burden Statement requirements for information collections.
An example of a completed “Performance Measure” form would assist us in evaluating and understanding how we will complete this form. Name: Anonymous Submitter's Representative: Ann Biersteker Organization: Michigan State University NRC and FLAS Centers Resolved. No change. Thank you for your suggestion. We agree that this would be helpful to users. While time constraints prevent us from developing a sample form before this collection is cleared by OMB, we plan to make one available to users within the next 12 to 18 months.
We have had difficulty uploading the spreadsheet templates for language courses and for international and area studies courses. We now enter each course separately. It would be easier if we could upload Excel spreadsheets in which we have entered the information provided as stated in the template. Name: Anonymous Submitter's Representative: Ann Biersteker Organization: Michigan State University NRC and FLAS Centers Resolved. No change. Currently, IRIS allows the upload of the completed spreadsheet template in CSV format. If you encounter technical difficulties when attempting to upload your CSV template, please contact the IRIS Help Desk.
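As a practical aside for grantees preparing such uploads, the sketch below illustrates one way a course list maintained in Excel could be converted into a CSV file that mirrors a fixed template before being uploaded. This is an illustration only, not an official IRIS or IFLE tool: the column headers, file names, and the use of the openpyxl package are assumptions, and the required headers and column order should be taken from the actual IRIS template and grantee guide.

    # Illustrative sketch only -- not an official IRIS/IFLE utility.
    # The column names below are hypothetical placeholders; substitute the
    # headers and order defined by the actual IRIS course-list template.
    import csv

    import openpyxl  # third-party package: pip install openpyxl

    TEMPLATE_COLUMNS = ["Course Title", "Language", "Level", "Enrollment"]  # assumed headers

    def excel_to_template_csv(xlsx_path, csv_path):
        """Copy rows from the first worksheet of an Excel course list into a template-shaped CSV."""
        sheet = openpyxl.load_workbook(xlsx_path, read_only=True).active
        with open(csv_path, "w", newline="", encoding="utf-8") as out:
            writer = csv.writer(out)
            writer.writerow(TEMPLATE_COLUMNS)
            # Skip the workbook's own header row and keep only as many
            # columns as the template defines.
            for row in sheet.iter_rows(min_row=2, values_only=True):
                writer.writerow(row[:len(TEMPLATE_COLUMNS)])

    if __name__ == "__main__":
        excel_to_template_csv("course_list.xlsx", "course_list.csv")

If an upload is rejected, checking the exported file against the template headers, and contacting the IRIS Help Desk as noted above, is the suggested first step.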
It would be useful if we could upload an Excel spreadsheet detailing the required information on Outreach activities. This is not currently available. Manually entering individual entries takes hours for those of us with extensive outreach programs. Name: Anonymous Submitter's Representative: Ann Biersteker Organization: Michigan State University NRC and FLAS Centers Resolved. No change at this time. Your comment regarding the use of spreadsheet templates to upload performance data is one of many submitted on this subject. Currently, NRC grantees use such templates to submit course lists in IRIS. IFLE is exploring the possibility of allowing CSV uploads of spreadsheets for some additional programs when reporting certain data in IRIS. Due to time constraints, this process cannot be completed before this OMB clearance is approved, but IFLE hopes to make these new templates available to users within 12 to 18 months.
Under “Project Information” for NRCs, we should be allowed to enter more than 15 languages, countries, and subject areas. Many NRCs offer more than 15 languages, cover more than 15 countries, and are relevant to more than 15 subject areas. Name: Anonymous Submitter's Representative: Ann Biersteker Organization: Michigan State University NRC and FLAS Centers Resolved. We limited the number to 15 because in comparing funded applications with the project information data in IRIS, we observed that many grantees indicated languages, countries, etc. in IRIS that were not necessarily reflective of their grant applications. We will increase the number to 40 to allow centers more flexibility to specify their research and instructional scope as well as their country and regional coverage.
It is difficult to know how we will complete the “Diverse Perspectives and Areas of Need” sections of the Foreign Language and Area Studies Fellowship (FLAS) Director’s report without repeating what we have said in the “Priorities” section and/or in “Diverse Perspectives and Areas of Need” sections of the NRC report. Name: Anonymous Submitter's Representative: Ann Biersteker Organization: Michigan State University NRC and FLAS Centers Resolved. No change. If the narrative information submitted in response to the “Priorities” section is identical to the information required to respond to both the NRC and FLAS “Diverse Perspectives and Areas of Need” screens, users are allowed to enter the same data in both sections of their reports.
We are concerned that identifying the “Hiring Institutions for Doctoral Degree Higher Education Placements” in many cases identifies the individuals who have been placed. It would be preferable if we simply indicated the number of people who went on to work in Higher Education without naming institutions. Name: Anonymous Submitter's Representative: Ann Biersteker Organization: Michigan State University NRC and FLAS Centers Resolved. No change. We appreciate the commenter’s concern, but in requesting the names of the IHEs that employ those who have graduated from the Center institutions, the intention is to demonstrate the range of institutions that end up benefiting from the expertise that the Centers produce. Providing the names of IHEs does not translate into compromising the individual’s privacy.
I have one further comment regarding the screen labeled "Outreach Activities": After the questions "Is this a teacher training activity?" and "Is this specifically for heritage learners?", each with appropriate sub-data fields, I propose that a third broad question be added: "Is this a technology-based activity?" Appropriate sub-data fields might include information regarding:
(1) Technology Type. Required field with options such as (A) online document (mainly info and links); (B) web application; (C) mobile app; (D) webinar, online workshop, training session, etc.; (E) online course, training over extended period.
(2) Scope of Outreach. Optional fields, since some but not all may apply, with options such as total number of hits, increase in hits since last period, number of individual users, number of countries with hits, session length (time on webpage or in app), session interval (time between visits/uses), etc.
(3) Other fields of possible tech interest to the US Department of Education and Title VI.
Dave Baer (CeLCAR, Indiana University) Resolved. Thank you for your comments. We agree that your proposed additional questions are relevant and useful. We have changed the LRC Outreach Activities screen accordingly.
When asked to select a specific language for a product/project/activity, please add an option that says, “Applicable for all languages.” These dropdown menus are problematic for the LRCs that are not targeted at specific languages/regions of the world. The work at language-general LRCs is purposefully designed to meet the needs of language educators of ANY language. This is particularly important for the LRCs that have resources that really do apply to all languages. Currently, we have to artificially choose up to 15 that are “most relevant” when all languages are relevant.
Similarly, for fields entitled “Countries,” we recommend the inclusion of a selection that says “Applicable for all countries.”
We recommend removing the choice of “Not applicable” in the language and country dropdowns, as this choice tends to have a negative connotation.
Note: These fields appear in two other places (Projects Conducted and Outreach Activities).
Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. No change. Thank you for your comments. Adding a dropdown menu option that reads "Applicable to all languages/countries" might cause confusion, since two of the screens to which you refer (Project Information and Outreach Activities) are "shared" screens that grantees from other programs use as well. While we have left the dropdown menus unchanged, we have increased the number of languages that may be selected from 15 to 40, which will provide more flexibility. The term "not applicable" has no negative connotation for the program officers, who review the performance reports and are well aware that LRCs are engaged in activities related to any and all languages.
This [performance measure form] is a new evaluation feature for this cycle. It is not clear from the instructions what needs to be added to this section. The instructions say “Add Data/Indicators for all Activities, and Baseline and Target units of measure for each Performance Measure,” but grantees were instructed to leave these sections blank at the time of proposal submission and told that IFLE would give further instruction to those who received awards. It is not clear what these items should be and how they will be added to this section. Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. Thank you for your feedback. Definitions for the terms used in the PMFs, i.e., indicators, baseline and target data, were included in the FY 2014 application guidelines, along with an example of a completed PMF. We will incorporate that guidance into the IRIS grantee guides and the screens, as appropriate, to assist grantees in completing the forms to be submitted in IRIS. In addition, we provided a technical assistance session on completing the PMF in IRIS at the Title VI Project Directors' Meeting in March 2015. The presentation is available at: http://iflemeetings.com/presentations/. If you require further assistance, please contact your program officer.
What does the word “Actual” mean in this statement: When completing the Annual Performance Report, provide the "Actual" units of measure at the time of reporting. Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. No change. The “actual” number would be the quantitative unit of measure that reflects what was actually achieved during the reporting period, as opposed to the “target” number, which reflects the project’s goal at the start of the grant.
Activity type is limited to Teacher Training, Outreach, Curriculum Development and Study Abroad. It would be helpful to have definitions to differentiate between some of the terms. For example, would a large conference be considered training or outreach? Would Research-to-Practice briefs for teachers be considered “outreach” or “curriculum development”? What would assessment development be listed under? It doesn't appear that anything would apply. Similarly, where would research activities be listed? Research on language learning does not fit under any of the four categories. Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. Thank you for your comments. We have developed a more comprehensive dropdown list of possible project activities. Since the PMF must be used by all programs, the dropdown list must be as broad and all-inclusive as possible. We did not include "Other" because having activities described as "Other" does not serve data collection purposes. We would not be able to effectively search the database or draw down data that was collected based on "Other." We suggest that you choose one to three activity types from the list provided, and clarify any inaccuracies in the comment box.
Report Schedule Screen:
The link to specific reports only becomes active after the previous report has been submitted. It would be helpful to be able to enter outreach activities on a regular basis rather than simply within a current reporting period.
Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. No change. Thank you for your comments. According to the IRIS database administrator, there is no technological barrier to allowing any grantee user to enter and continuously update performance data throughout the performance period. There is an option to “save” without actually submitting the report. A user can return as many times as necessary before the report is submitted to update or add to what has already been completed. The information ultimately submitted must reflect the budget period covered by the report, but there are no restrictions on when the information may be entered or how many times it may be updated before submittal. If a user encounters technical difficulties while trying to enter and/or update performance data, that user is encouraged to contact the IRIS Help Desk.
We would recommend exploring the opportunity to link a spreadsheet to outreach activities within the report for accurate documentation of activities in “real time.” Having the ability to upload a spreadsheet would be of great help in reporting, as many (or most) centers keep running records as events occur and this would eliminate the need to duplicate the information in IRIS. Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. No change at this time. Several commenters suggested that the IRIS screen provide a link to spreadsheet templates to upload outreach activities. Currently, National Resource Center grantees use such templates to submit course lists in IRIS, and we agree that this would be useful for other programs that require outreach activities. IFLE is exploring the possibility of allowing CSV uploads of spreadsheets for additional programs. Due to time constraints for obtaining OMB approval of the IRIS reporting system, however, this suggestion cannot be incorporated into the reporting screens at this time. IFLE hopes to add these new templates to additional programs’ reporting screens within 12 to 18 months.
Objectives and Accomplishments and Exemplary Activities Screens:
We are discouraged to see that there is a proposal to eliminate these two screens. The documentation states that they will be “Removed as unnecessary (data will be collected via Performance Measure Form).”
LRCs have used these screens in the past as areas where we were able to enter a narrative for those activities which don’t fit neatly into IRIS categories. It seems that with the elimination of these screens, there is now no place in IRIS to write about the projects in a cohesive, contextualized manner.
Suggestion: Give grantees the opportunity to add additional information under “Other Comments,” a category that does not currently exist. The current format of the IRIS reports does not allow us to include anecdotal information, quotes, or other evidence of our impact that would be valuable for the granting agency to know.
Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. Thank you for your feedback. We have experienced difficulties in collecting data in a narrative format, since data of this kind cannot be easily compiled or analyzed, and are often anecdotal in nature. For the purposes of more easily demonstrating the effectiveness of our programs, we are attempting to collect as much data as possible via data elements, which are more easily extracted and quantified. However, we recognize the need of grantees to submit some data in narrative form, and we have added a comment box to the Performance Measure Form to provide this opportunity.
Fall Budget Screen:
It would be helpful to specify which versions and formats of Excel are readable when uploaded, or whether other types of documents (such as PDFs) can be used.
Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. Thank you for your comment. The Database Administrator confirms that the system will accept documents in any version of Excel. The instructions provided on the upload screen have been revised to reflect this.
Define what is meant by “Total Other.” There appears to be a clickable link for information, but since we have access only to a screen shot, we are not able to see if the definition appears. Some centers have many other grants/activities, and others function mostly on LRC funding. What does USDE wish to learn from the report of funding under “Other”? Why should other funding from all other sources be included in the report on Title VI funding? Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. Thank you for your feedback. The information link includes the text: “Other funds allocated to these line items from other internal and external sources.” The reason that a column entitled “Total Other” is on the budget screens is that some Title VI programs require that the grantee provide matching funds. Since the budget reporting screens are “shared screens”, all Title VI grantees will use the same budget screens in their reporting, and the column must be available to those grantees who are required to demonstrate matching. We have revised the column’s title heading to read “Total Other Funding Sources.”
Projects Conducted Screen:
How will this section differ from the new Performance Measure section that essentially takes the projects and breaks them down into smaller activities? It seems that the new section would take in all of this information in a more detailed fashion.
Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. In the Performance Measure Activities section, grantees must limit the number of activities to no more than three. The Projects Conducted screen gives grantees the flexibility to report all activities conducted during the reporting period. This clarification will be added to the Projects Conducted screen. We have also added a question on “Project Deliverables Used or Institutionalized.” It includes a dropdown menu of materials that the LRC developed during the reporting period and that have been used/institutionalized by project beneficiaries.
If this section remains, in the field “Type of Project” there should be an option to select more than one type. For example, a center could have a project that includes professional development along with a focus on assessment, research, and/or material development. Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. No change. Thank you for your comments. Changing a dropdown menu from a "single select" to a "multi-select" causes technical as well as data collection difficulties. We recommend that users select the type that best describes the project, and clarify any ambiguities in the "Description of project" narrative box at the end of the screen.
For “Research Basis of Materials” there are only two possible answers: “research supported by other Title VI project” or “research supported by this grant.” For many of our projects, the faculty and staff draw from research that has been funded by sources other than Title VI. This forced choice seems to suggest that all language-related research has been funded by Title VI, which is not true. It is important to offer a choice, for example, that indicates that a materials development project is based on research done without Title VI funding.

Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. We agree that the list to show the leveraged funds needs to be expanded to include “Non-Title VI Resources.” We have also revised the heading for the data element to read “Sources that Supported the Research Activities” to better convey what grantees should report.
Adoption of Outcomes screen: This is a confusing section and undoubtedly grantees complete these fields in many different ways, making it less useful for any kind of overall reporting for USDE.
The first question in this section asks grantees to quantify use of materials/products/assessment instruments/research outcomes produced by the grant. “Use” can be ambiguous. Does it mean a web “hit”? A sale of a product? The use of an assessment? The number of people at a conference? While there is a space for the grantee to explain how “use” is being defined, it is limited to 100 characters, making it inadequate given the many possible items that could be included. Also, the separate field for “institutions” versus “organizations” is not clear.
Similarly, the question “How many individuals, institutions, or organizations have judged the items to be successful?” must be explained by USDE. How is this to be measured? How is this information different from the data in the new section on Performance Measures? Many LRC products are available free online, or otherwise disseminated in ways that are not easily trackable, so it is nearly impossible to connect with individual users and inquire about their judgment of the success of a specific product.

This second question regarding “becoming involved” is also ambiguous. Does this simply mean attending a summer institute or conference, or is the question meant to indicate more active involvement such as helping to lead an activity or formally piloting and reporting on a new material?
Recommendation: To improve the accuracy of reporting, grantees need to know how USDE interprets using materials, and what is meant by becoming involved. It is also necessary to have a clear definition of how LRCs are to determine if a user found an item successful.
Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. We agree with the commenter that this screen is too ambiguous and subjective to be effective for collecting meaningful or useful data. “Successful” as used in this screen is unquantifiable. We will delete the "Adoption of Outcomes" screen, and revise the “Projects Conducted” screen to collect similar data. We will also use the "Outreach Activities" screen, which collects information about individual, institutional, and organizational participation in LRC activities. LRC grantees will assess each outreach activity (using questionnaires, surveys, interviews, etc.) to determine what was “successful”. The PMF also collects attendees (targets) and activity outcome assessments using measurable outcomes, which would determine whether a project is successful. This information was previously reported under outreach activities in narrative form, but using the PMF it will be possible to collect these data in quantifiable form.
Publications and Research Presentations Screen:
The Publications area also requires clarification. There are a number of new items proposed (e.g., “Presentations - Non-conference”) that seem like they might already have been counted in another section, like Outreach Activities. What is the difference between a Presentation - Non-conference and an Outreach Activity? If something is counted in one area, is it double-dipping to report it again, or is it showing breadth? Similarly, a new item is “Webinars” but at least one LRC has its webinars listed as projects, not publications. “Workshops” has also been added to publications, but what then happens to workshops that otherwise would have been listed in outreach?
It appears that some information buttons were added, but they do not provide clarification. For example, when the call-out is opened for webinars, it only says “Online and/or in person.” This does not give information about whether this kind of activity should be counted as a publication or an outreach activity.
Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. There is no differentiation between a “Presentation-non-conference” and an outreach activity, relative to reporting outreach activities in IRIS. A non-conference presentation is a form of outreach because information is being disseminated. An activity of this nature should only be included in one category, not both. In addition, we have revised the webinar information button to read “online and/or in person outreach activity/presentation” to clarify that a webinar is not a publication.
Outreach Activities Screen:
The LRCs conduct many outreach activities and need to generate many entries. It would be more efficient to be able to add this information to a spreadsheet to be uploaded to the IRIS system. If this is not possible, it would be helpful to have a way to duplicate an outreach activity without re-typing all of the information. For example, in a professional development series where the same topic is offered at several different times to meet different people's scheduling needs, it would save time to be able to duplicate an event and then simply update the date and number of attendees for each record.
Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. Your comment regarding the use of spreadsheet templates to upload performance data is one of many submitted on this subject. Currently, NRC grantees use such templates to submit course lists in IRIS. IFLE is exploring the possibility of allowing CSV uploads of spreadsheets for some additional programs when reporting certain data in IRIS. Due to time constraints, this process cannot be completed before this OMB clearance is approved, but IFLE hopes to make these new templates available to users within 12 to 18 months.
Presenter(s): This list includes the selection “faculty of other institution” twice.
Partnership(s): The proposed revised text, “Select the type of partnership(s) that were utilized for this activity,” is grammatically incorrect.
Project type: As in the Projects Conducted section, there should be an option to select more than one type. Many centers’ projects include professional development along with a focus on assessment, research, or material development. This section adds the choice of “workshop” in addition to “professional development,” and the difference between the two is unclear. This field only allows one choice, which makes it difficult to adequately label some of the activities (e.g., a professional development series of workshops on assessments and material development).
(Also see comment above on languages and countries.)
Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. Thank you for your comments. We have addressed these errors.
Comment on Estimated Burden Hours:
Another related document under review states that it takes seven (7) hours to complete each report. The LRCs spend far more time than this on each report. We believe that seven hours is insufficient and should be raised to at least 35 hours.
Karin Larson (CARLA, University of Minnesota)
Joy Campbell (CLEAR, Michigan State University)
Margaret Malone (AELRC, Center for Applied Linguistics/ Georgetown University)
Resolved. We agree with the commenter that 7 hours does not accurately reflect the time it takes for LRC grantees to complete the online performance report, and therefore, we have increased the estimated burden to 100 hours.  The revised estimated burden includes the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. It does not take into consideration the time it takes for respondents to evaluate or analyze data, as these tasks are excluded from the Paperwork Burden Statement requirements for information collections.
What about the important International Research and Studies (IRS) program, in the event that funding can be made available for it? Ann Imlah Schneider Resolved. No change. At this time, we have not revised the screens or the GPRA measures for the IRS program, as there is no indication that the IRS program will be funded in the foreseeable future.
The proposed changes to IRIS may well facilitate clearer demonstration of the important program outcomes that support U.S. global economic competitiveness, national security, and more diverse program participation. However, I also want to respond to the NPRM request for comments about the need for the IRIS data collection and about how to enhance the quality, utility, and clarity of the information to be collected. My major concern is that the table on page 10 of the Supporting Statement does not include any staff time for compilation, analysis, and distribution of the data collected, an activity that is necessary to assist grantees, to provide an information base for administrators and planners in the Department of Education, and to inform interested members of Congress and the wider community. Indeed, this is an activity that was strongly recommended by the GAO in its 1978 review of Title VI programs, and one that we carried out for many years for the NRC and FLAS programs, but it has been dropped in recent decades. Staff should be made available to facilitate accessibility to the data, in meaningful form, for reference and use by all in the international education community, and beyond. Ann Imlah Schneider Resolved. Thank you for your comments. Data submitted by grantees in IRIS is routinely used to assist grantees, to provide an information base for administrators and planners in the Department, and to inform interested members of Congress and the wider community. Some of the time and cost used to compile, analyze, and distribute data collected in IRIS is reflected in the “Contractor Support” line item of the table, since much of the data compilation, analysis and distribution of IRIS data is performed by government contractors. However, Department staff spends time on these tasks as well, and that is not reflected in the table you reference. The table has been modified to reflect more accurate contractor costs and to include an additional line item titled “Compilation, Analysis and Distribution of IRIS Data.”
Changes to IRIS reporting:
1) It is never clear exactly who the audience for IRIS is or how the report is actually used. We came to the conclusion that it was randomly looked at...
Robert Spich UCLA Resolved. No change. Thank you for your comments. All data collected in IRIS serves the purposes of demonstrating substantial progress of funded projects in annual, interim and final performance reports, and also demonstrating program effectiveness via GPRA measures. All of the data submitted by grantees is reviewed by program staff for one or both of these reasons. In addition, data is regularly extracted to respond to internal and external inquiries, to assist grantees, and to distribute to the public via the IFLE Newsletter ( https://public.govdelivery.com/accounts/USED/subscriber/new?topic_id=USED_61) and the IFLE Twitter account (https://twitter.com/EDPostsecondary).

Sheet 2: Sheet2

Activity Type (PMF)
Area studies instruction
Business language instruction
Curriculum and/or materials development
Dissemination
Distance education
Evaluation
Faculty training/professional development
Faculty/staff salaries and stipends
Graduate courses in international business
Graduate programs in international business
Interdisciplinary international education programs
Internationalization of curricula at graduate and/or professional schools
Internationalization of curricula at MSIs and/or community colleges
Language instruction (including support of LCTLs instructors)
Language testing/assessment
Linkages and/or partnerships
Information resources development, maintenance, access
Outreach
Research
Student internships in international business
Study abroad
Summer institutes
Teacher training (K-12)
Technology-related activities
Travel
Undergraduate courses in international business
Undergraduate courses in international education
Undergraduate programs in international business
Undergraduate programs in international education
Visiting foreign faculty and scholars
Other
