Survey of Post-Graduate Outcomes for International Education Fellowship Recipients

OMB: 1840-0829


SUPPORTING STATEMENT PART B

FOR PAPERWORK REDUCTION ACT SUBMISSION


Survey of Post-Graduate Outcomes for International Education Fellowship Recipients

U.S. Department of Education, Office of Postsecondary Education,

International and Foreign Language Education

EDICS Tracking Number:


B. Collection of Information Employing Statistical Methods


The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. The following documentation should be provided with the Supporting Statement Part A to the extent that it applies to the methods proposed. For further information, please consult OMB’s Standards and Guidelines for Statistical Surveys.


  1. Describe the potential respondent universe (including a numerical estimate) and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, state and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The survey of post-graduate outcomes for international education fellowship recipients, conducted by International and Foreign Language Education (IFLE), will survey fellows from two grant programs: Foreign Language and Area Studies (FLAS, CFDA 84.015B) and the Institute for International Public Policy (IIPP, CFDA 84.269A). The survey will be expanded to other Title VI and Fulbright-Hays grantees in three years, when this information collection package is submitted again.


The target population for this data collection is fellowship recipients from the FLAS and IIPP programs who have completed the degree program in which they were enrolled when they received their fellowship, estimated to be 4,468 individuals.


There will be no sampling of this population; the survey will be a census of all graduated fellows. The first survey will cover two cohorts of graduates (FY10 and FY11) and is estimated to total 1,569 individuals. The second survey will cover all four cohorts of graduates (FY10–FY13) and is estimated to total 4,468 individuals. Cohorts 1 and 2 will be surveyed twice; cohorts 3 and 4 only once.


Response rates for fellows are estimated to be 90% for the first survey and 85% for the second survey two years later, due to possible attrition. The response rates are estimated to be high because of our experience with fellows submitting performance reports on an annual basis: nearly all fellows submit these reports during the course of their fellowship, with only a small number (2–3%) not submitting. FLAS and IIPP fellowship coordinators will follow up with non-respondents who have not opted out of the survey to inquire about obstacles to completion that can be removed. Should the response rate not be on target to meet 90% for the first survey and 85% for the second survey, IFLE staff will work with FLAS and IIPP fellowship coordinators to identify additional strategies to encourage participation. Because all invited participants have a history with the programs, high levels of participation are expected. If the response rate falls below 85%, we will conduct an analysis of non-respondents.
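
For illustration, meeting these targets would correspond to roughly 1,412 completed responses in the first survey (90% of the estimated 1,569 graduates) and roughly 3,798 in the second survey (85% of the estimated 4,468 graduates).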


  2. Describe the procedures for the collection of information, including:


  • Statistical methodology for stratification and sample selection.


  • Estimation procedure.


  • Degree of accuracy needed for the purpose described in the justification.


  • Unusual problems requiring specialized sampling procedures, and


  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The information collection will survey the entire universe of graduated fellows; no sampling methodology will be used. The grantees for the FLAS and IIPP programs are institutions of higher education that maintain records on their students, including graduation dates, which the Department of Education does not have. Because the grantees have records of when fellows graduated, they will be responsible for determining which fellows should be surveyed in a given year, and they will administer the survey.


This survey will be conducted on a biennial basis, in accordance with the statute.


  3. Describe methods to maximize response and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


The Department of Education will train grantees on how to conduct the survey.


The grantees administering the survey will maximize response rates in the following ways:


  • Informing potential fellows that they will be sent a survey on a biennial basis for eight years following their graduation from their degree program.

  • Making an effort to maintain accurate and up-to-date email addresses for all fellows at the time of graduation and thereafter, using channels such as social media (e.g., Facebook and LinkedIn groups, where contact information is automatically updated).

  • Sponsorship - Placing the grantee’s institution name and logo on the survey so that it is recognizable and familiar to the fellow.

  • Conducting the survey electronically to minimize burden and maximize response.

  • Implementing efficient skip patterns in the survey to make it easier and quicker to complete.

  • Communication - Following up with reminders and paper mailings if necessary, and offering to share the results.

  • The second survey will contain some information that is pre-populated from the initial survey, which will reduce burden and maximize response rates. The pre-populated items are questions 1–5 (basic award profile), 28–31 (demographics), and 32–34 (language background); the answers to these questions should not change from year to year.


If the response rate falls below 85%, the Department will conduct an analysis of non-respondents.


  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Multiple questions are standard items drawn from other surveys and have proven to be valid and reliable (questions 4f, 5, 8, 13, 14, 16, 17, 19, and 20).1 In addition, a limited pre-test was completed.


A pre-test of the survey questions was conducted with fewer than ten respondents. The respondents were FLAS fellows who had graduated from their degree programs. Their feedback was positive. We asked respondents the following questions during the pre-test:


After completing the survey, please respond to these questions to help us improve it. Please review a copy of the survey while answering these questions.


  1. What do you think was the objective of the survey?

  2. Did you feel comfortable answering the questions? Which items produce confusion, embarrassment, or irritation?

  3. Is the wording of the questions clear?

  4. Are the answer choices compatible with your experience with the FLAS program?

  5. Do any of the items require you to think too long or hard before responding? If so, which ones?

  6. Do any of the questions generate response bias (meaning that the wording of a question seems to suggest that you give a certain answer)? If so, which ones?

  7. Is the survey too long?

  8. Have any other important issues been overlooked? Any other feedback?


The most substantial feedback concerned the ordering of the questions and the survey's ability to skip questions that were not relevant. We adjusted the survey so that respondents answer only the questions that apply to them, thus minimizing burden.



  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other persons who will actually collect and/or analyze the information for the agency.


Responsible for collecting and analyzing the information for the Department of Education:


KimOanh Nguyen-Lam, PhD, International and Foreign Language Education (IFLE), 202-219-7020.

Amy Wilson, Senior Program Officer, IFLE, 202-502-7689.


Consulted on the statistical aspects of the design:


Department of Education staff

Name | Title | Office | Telephone Number
Stuart Kerachsky | Director | Policy and Program Studies Service, Office of Planning, Evaluation and Policy Development | 202-401-1270
Loveen Bains | Program Officer | IFLE, OPE | 202-502-7709
Cheryl Gibbs | Senior Program Officer | IFLE, OPE | 202-502-7634
Beth MacRae | Program Officer | IFLE, OPE | Unknown – no longer works at the Department
Jessica Barrett Simpson | Senior Program Officer | IFLE, OPE | 202-377-4090
Susan Lehmann | Education Research Analyst | Higher Education Programs, OPE | 202-502-7516
John Clement | Director | Institutional Service, OPE | 202-502-7520


Grantees

Name | Title | Institution | Telephone Number
Nicholas Bassey | Director of Institute for International Public Policy | United Negro College Fund Special Programs Corporation | 703-677-3400
Michael Reynolds | Deputy Director for Research and Senior Research Scientist | National Opinion Research Center, University of Chicago | 773-256-6053
Joshua Beck | Associate Director | Center for Latin American Studies, University of Chicago | 773-702-8420
Anne-Maree Ruddy | Research Associate | Center for Education and Evaluation Policy, Indiana University | 812-855-4438
Denise Gardner | Title VI Coordinator | College of Arts and Sciences, Indiana University | 812-855-2608
Bayta Maring | Assistant Director | Office of Educational Assessment, University of Washington | 206-543-5190
Marta Mikkelsen | Associate Director | Ellison Center for Russian, East European and Central Asian Studies, University of Washington | 206-543-4852
Nancy J. Loncto | Associate Director for Administration | Southeast Asia Center, Cornell University | 607-255-8902
Brad Washington | Program Evaluation Manager and Associate Academic Specialist | International and Area Studies, University of California, Berkeley | 510-642-2472


1 Questions are sourced from the National Science Foundation’s Survey of Earned Doctorates and Survey of Doctorate Recipients, as well as the American Community Survey and the National Survey of College Graduates.


