
SUPPORTING STATEMENT FOR
PAPERWORK REDUCTION ACT SUBMISSION


Bureau of Educational and Cultural Affairs

Office of Policy and Evaluation

Evaluation Division:

Gilman Evaluation Survey


OMB Control Number: 1405-XXXX

SV-2012-0008

A. Justification

  1. The Department of State’s Bureau of Educational and Cultural Affairs (ECA), Office of Policy and Evaluation, Evaluation Division (ECA/P/V) is requesting approval of a new information collection: an ECA evaluation survey. ECA evaluations to date have provided significant evidence of the effect of ECA programs on the personal and professional achievement of participants, and have identified critical changes in the institutions where participants have worked and the communities with which they have engaged.


This new descriptive evaluation will investigate the outcomes of study abroad for recipients of the Benjamin A. Gilman International Scholarship. The Gilman Scholarship, as it is more commonly known, offers grants to U.S. citizen undergraduate students of limited financial means to pursue academic study abroad, thus providing opportunities for study abroad that would not otherwise have been possible. The Gilman Scholarship aims to support students who have traditionally been underrepresented in study abroad, including, but not limited to, students with high financial need, community college students, students in underrepresented fields such as the sciences and engineering, students from diverse ethnic backgrounds, and students with disabilities.


This survey will review the experiences of grant recipients while abroad; probe ways in which they shared what they learned with family, peers, and other community members upon returning to the United States; and investigate whether the international experience factored into their subsequent educational and professional choices.


This information collection will be conducted once and will consist of one survey sent to all grant recipients who studied abroad during the nine-year period spanning the 2002/2003 through 2010/2011 academic years.


The data captured will help the Department and the ECA Bureau successfully meet organizational performance and accountability goals established through the mandates contained in the following authorities:


  • Mutual Educational and Cultural Exchange Act of 1961, as amended (also known as the Fulbright-Hays Act) (22 U.S.C. 2451 et seq.)

http://www2.ed.gov/about/offices/list/ope/iegps/fulbrighthaysact.pdf


  • Government Performance and Results Act of 1993 (GPRA)

http://www.whitehouse.gov/omb/mgmt-gpra/gplaw2m.html


  • Government Performance and Results Modernization Act of 2010

http://www.whitehouse.gov/omb/performance/gprm-act


  • OMB Memo M-10-01, Increased Emphasis on Program Evaluations

http://www.whitehouse.gov/sites/default/files/omb/assets/memoranda_2010/m10-01.pdf

As stated in the memo, “OMB will work with agencies to make information readily available online about all Federal evaluations focused on program impacts that are planned or already underway” as part of a three-pronged effort to strengthen government-wide program evaluation efforts. The guidance noted that public availability of program evaluation information will promote transparency, since agency program evaluations will be made public regardless of the results.


  2. The primary purpose of this information collection is to provide ECA/P/V with the ability to assess the Gilman Scholarship program in accordance with GPRA, OMB guidance, and Executive Orders. The data collected will inform the program offices on program management, future design issues or adjustments, program planning, results reporting, information dissemination, and outreach initiatives.


This study will examine the international experiences of Gilman Scholarship recipients, the influence of study abroad on their personal lives, and their contributions to their communities in the United States after studying abroad. It will provide State Department leadership, ECA senior management, and program officers with data they currently do not have, and with analyses that can potentially be used to design new programs, improve existing programs, and shed light on ongoing and future activities.


This study will assess the achievement of program goals only to the extent that they are reflected in the major research questions below. Table 1 lists the major research questions developed for this evaluation and the outcome measures that may be assessed, providing contextual information for understanding the effects of this exchange program. The data source used to answer all of the major research questions will be the online survey questionnaire.


Table 1

Major Research Questions and Outcome Measures


Research Question: What were the Gilman Scholars’ experiences during their study abroad? What types of activities did they participate in? Did they participate in service learning activities?
Outcome Measure: Answers to questions regarding Scholars’ study abroad experiences, including language study, internships, service activities, and extracurricular experiences.


Research Question: What educational choices have Scholars made after returning home, and did the Gilman Program play a role in these choices?
Outcome Measure: Answers to questions pertaining to educational choices at the undergraduate and graduate levels, including whether Gilman Scholars began taking international subjects or area studies courses, took additional language courses, undertook international research projects, and/or participated in subsequent study abroad programs.


Research Question: What are Gilman Scholars’ professional and career plans, and has participation in the program influenced the evolution of these plans?
Outcome Measure: Answers to questions regarding the professional and career paths Gilman Scholars have followed to date, in terms of professional fields chosen and areas of work, and whether their careers or career plans reflect an international emphasis or focus.


Research Question: Has program participation influenced the Gilman Scholars’ personal lives and perspectives?
Outcome Measure: Answers to questions regarding Scholars’ continued engagement with individuals in their host country and their interactions with family, friends, peers, and other community members about their international experience.


Research Question: Has participation in the program influenced Gilman Scholars’ engagement in the international arena?
Outcome Measure: Answers to questions regarding Scholars’ participation in activities with an international focus, such as joining internationally oriented community and religious groups, volunteering or donating on behalf of international causes, and following international media coverage.


Research Question: How have Gilman Scholars shared their experiences abroad with family, friends, and communities?
Outcome Measure: The survey asks respondents to identify the ways they have interacted with family and friends about their international experience, including sharing stories, media, and cuisine from their host countries; encouraging others to participate in study abroad programs; and making presentations about their experiences abroad. It also captures answers to questions regarding each respondent’s engagement in international activities and continuing education about host-country or other international issues, and asks respondents to indicate whether they are aware of any family members or peers pursuing other international education opportunities as a direct result of conversations about their study abroad experience.


Research Question: What were the nature and range of the Follow-on Projects across the nine cohorts, and how were the projects implemented?
Outcome Measure: On the survey, Scholars will be asked to identify the type of Follow-on Project they implemented, as well as its subject.


Analysis of all collected data will include descriptive statistics and frequencies, providing percentages and counts for each response category of the survey questions relating to each research question. Cross-tabular analysis of survey responses will be conducted to assess variation in outcomes across different participant or program characteristics (a minimal illustrative sketch follows this list). Examples may include comparing program-level findings by:

  • Participant demographic characteristics (race, ethnicity, gender, socio-economic background, disability status, age).

  • Country/region of the study-abroad program.

  • Discipline studied, field of work, and whether or not the Gilman Fellow received a Critical Needs Language Supplement.

  • Type of home educational institution (e.g., 2-year as compared with 4-year, public as compared with private, minority-serving as compared to non-minority-serving, small as compared to large institutions).

  • Cohort (differences in program outcomes relative to the length of time since program completion).

To protect respondent confidentiality and to ensure we are not reporting invalid results, we will not report any finding when “n” is less than or equal to 5.
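To illustrate, the following is a minimal sketch of this kind of analysis in Python with pandas. The column names (“cohort”, “q1”) and the data are hypothetical; this is not the contractor’s actual analysis code.

    import pandas as pd

    def frequency_table(responses: pd.Series) -> pd.DataFrame:
        # Counts and percentages for each response category of one survey question.
        counts = responses.value_counts()
        percents = responses.value_counts(normalize=True).mul(100).round(1)
        return pd.DataFrame({"count": counts, "percent": percents})

    def suppressed_crosstab(df: pd.DataFrame, row: str, col: str, n_max: int = 5) -> pd.DataFrame:
        # Cross-tabulate two variables, blanking any cell where n <= 5,
        # per the confidentiality reporting rule stated above.
        table = pd.crosstab(df[row], df[col])
        return table.mask(table <= n_max)

    # Example with hypothetical data:
    df = pd.DataFrame({"cohort": ["2002/2003", "2010/2011"] * 10,
                       "q1": ["Yes", "No", "Yes", "Yes"] * 5})
    print(frequency_table(df["q1"]))
    print(suppressed_crosstab(df, "cohort", "q1"))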


  3. The information collection’s survey will be entirely web-based to minimize burden on participants. The survey will be distributed using the survey application Vovici. Participants will be informed of the survey via e-mails and e-mail reminders that provide instructions for accessing the survey electronically.


  4. No duplicative information currently exists; there has been no other information collection covering these cohorts and research questions. There is no other reliable method for ECA to collect the information needed to fulfill the requirements of the Department’s annual strategic planning and reporting process and the annual Congressional budget process under the GPRA mandates.


  5. This collection will have no impact on small businesses or other small entities.


  6. If the information is not collected, ECA will be unable to complete this study or to gather data requested by ECA senior leadership to assess and report on this study-abroad program, the only one that focuses solely on underserved populations. Moreover, the Department will be unable to comply fully with its congressional and Department executive mandates, including the GPRA Modernization Act of 2010, which requires the Department to evaluate and report the results of its exchange programs.


  7. There are no special circumstances.


  8. ECA/P/V solicited public comments on this collection via a 60-day Notice published in the Federal Register on December 19, 2012 (77 FR 75251). One comment was received. Upon reviewing the comment, ECA/P/V determined that it was unrelated to the information collection and instead addressed broader Department-wide policy and budget matters regarding the program. ECA/P/V has consulted with an external contractor, Research Solutions International (RSI), about the survey’s design, methodology, analysis, and data collection approach.


  9. No gifts or payments will be made to the respondents.


  10. No promises of confidentiality will be made to respondents. The Department of State intends to keep the information private to the extent permitted by law.


  11. No questions of a sensitive nature are asked in the survey.


  12. It is estimated that the total annual hour burden will be 1,031 hours for the 6,184 respondents who make up the census population. (As explained in Section B2, the estimated survey response rate is 40%.) The annual hour burden was calculated with the expectation that 40% of the population will complete the survey, at an average of 25 minutes per response. Because this survey will be conducted only once, the three-year total is the same as the annual total. The burden estimate takes into account the total number of questions, the number of open-ended questions, and experience from previously conducted evaluations.


Table 2

Respondent Burden

ITEM                               ANNUAL TOTAL    3-YEAR TOTAL
Estimated Number of Respondents    6,184           6,184
Average Time per Response          25 minutes      25 minutes
Estimated Number of Responses      2,474           2,474
Estimated Hours for Responses      1,031           1,031
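As a cross-check, the figures in Table 2 can be reproduced with a few lines of arithmetic. The following Python sketch uses only numbers stated above; the rounding conventions are assumptions inferred from the reported totals.

    population = 6184            # census of all Gilman Scholars, 2002/2003 through 2010/2011
    response_rate = 0.40         # anticipated response rate (Section B2)
    minutes_per_response = 25    # average time per survey response

    responses = round(population * response_rate)                # 2,474
    burden_hours = round(responses * minutes_per_response / 60)  # 1,031
    print(responses, burden_hours)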


To determine the estimated income per hour, the Bureau of Labor Statistics (BLS) “Table 1 Summary: mean hourly earnings and weekly hours for selected worker and establishment characteristics” was reviewed (http://www.bls.gov/ncs/ncswage2010.htm#Overview); the specific data table is located at http://www.bls.gov/ncs/ocs/sp/nctb1475.pdf. Mean hourly earnings are $21.29 for all civilian workers, $20.47 for private industry workers, and $26.08 for state and local government workers; the minimum wage (per BLS) is $7.25. Averaging these four figures yields $18.77, which was weighted by a factor of 1.4 to $26.28 and rounded to $27 (the arithmetic is sketched after Table 3).


Table 3

Annualized Cost to Respondents for Hour Burden

Description of the Collection Activity    Estimated Total Annual Burden on Respondents (Hours)    Estimated Average Income per Hour    Estimated Cost to Respondents
Web Survey                                1,031                                                    $27                                  $27,837
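The hourly rate derivation and the Table 3 total can be verified the same way. In this sketch, the 1.4 multiplier is an inference from the stated figures ($18.77 weighted to $26.28); the source does not name the factor explicitly.

    rates = [21.29, 20.47, 26.08, 7.25]             # civilian, private industry, state/local government, minimum wage
    mean_rate = round(sum(rates) / len(rates), 2)   # 18.77
    loaded_rate = round(mean_rate * 1.4, 2)         # 26.28 (assumed 1.4 weighting factor)
    hourly_rate = 27                                # rounded rate used in Table 3
    cost_to_respondents = 1031 * hourly_rate        # 27,837
    print(mean_rate, loaded_rate, cost_to_respondents)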


  13. There are no costs incurred by respondents beyond the hour burden described above.


  14. The estimated annualized cost to the Federal Government for this collection is $192,033. This figure was calculated from the contractor’s labor for the associated tasks, plus the salary of the ECA/P/V staff who manage the contractor, broken down as follows:


  • The data collection budget for this evaluation survey is approximately $78,000. This includes contractor labor for three persons to draft and finalize the survey instrument, program the survey in the surveying system, administer the survey (including sending reminders and producing regular response rate reports), and participate in status meetings with ECA/P/V, as well as fees for software/server expenditures.


  • The contractor’s budget for analyzing and reporting the data collected through this collection is approximately $104,000 and includes contractor labor for four people to conduct the analysis, write the report and related materials, and deliver briefings.


  • The cost of ECA employee time is $10,033, calculated as two GS-13 employees with a loaded (or weighted) average hourly wage of $59.72, each spending approximately 5 percent of their time over about 10 months (roughly 84 hours each) providing oversight and guidance, including reviews of the survey, data administration, data analysis, and the report, as well as participating in status meetings with the contractor. (The arithmetic is sketched below.)
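A short Python sketch of the arithmetic behind the $192,033 total, using only the figures in the bullets above:

    data_collection = 78_000        # contractor data collection budget
    analysis_reporting = 104_000    # contractor analysis and reporting budget
    staff_rate = 59.72              # loaded average hourly wage, GS-13
    staff_hours_each = 84           # about 5 percent time over about 10 months, per employee

    eca_staff_cost = round(2 * staff_hours_each * staff_rate)      # 10,033
    total = data_collection + analysis_reporting + eca_staff_cost  # 192,033
    print(eca_staff_cost, total)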


  15. This is a new collection.


  16. Survey data collection is estimated to begin immediately after OMB approval. The data collection period is estimated to take at least six weeks. Following the data collection period, the external contracting firm (RSI) will conduct basic descriptive analysis (such as frequencies) and cross-tabular analysis as needed, as explained in Section A2. The contractor will then develop a report for review and approval by ECA.


Once approved by the ECA Assistant Secretary, the evaluation report will be posted for public release on the Evaluation Division site at http://exchanges.state.gov/programevaluations/completed.html. Additionally, a distribution list will be developed of key stakeholders and other organizations and individuals that may be interested in the evaluation results; they will receive notification of the report’s release via e-mail. The contracted evaluators are also required to present the evaluation’s results to key stakeholder groups, as requested by ECA, for a period of time following the evaluation’s completion. Evaluation activities are estimated to conclude about nine months after the data collection period ends.


  17. ECA/P/V will display the OMB expiration date.


  18. There are no exceptions requested for this collection.




B. Collections of Information Employing Statistical Methods



  1. This information collection will consist of one electronic survey, conducted only one time as part of the Gilman Evaluation.


  2. The potential respondent universe for the survey will be all 6,184 recipients of the Gilman Scholarship who studied abroad during the nine-year period spanning the 2002/2003 through 2010/2011 academic years. The anticipated overall response rate for this collection is 40%. This estimate is based on average response rates for previously conducted evaluation surveys, together with support from the program office and the grantee organization.


While sampling is a useful and effective statistical tool, it would not be appropriate for this data collection effort. As stated above, the anticipated response rate for this survey is 40%. Given the need to obtain sufficient responses to address diverse program characteristics, such as the type of home institution, which has changed over the years, the evaluation team has concluded that sampling, as well as excluding the earlier academic years (prior to 2005), would yield insufficient data. Additionally, recent evaluation surveys for programs reaching as far back as 2006 support including the earlier years, as those years achieved similar response rates. It is understood that including the three program years that ended as far back as 2003 will be more of a challenge; however, they must be included because the evaluation’s scope encompasses changes over time in the program’s long-term outcomes. As a result, the survey will be administered using a census approach to ensure that the full span of the program is accounted for.


Furthermore, the research team is confident that the potential respondent universe is a willing and capable target group. Even though the universe includes individuals who participated in the program roughly ten years ago, it comprises Internet-savvy, digitally connected young professionals. All are American citizens, most of whom now reside in the United States. Thanks to the efforts of ECA’s alumni office programming, which uses digital platforms to remain in touch with program participants, we have excellent contact information for these individuals and evidence that they are responsive to electronic communication. None of the usual concerns about reaching older program participants applies here.



  3. All ECA/P/V data collection methods are tailored to fit the prevailing political, cultural, safety, security, and accessibility conditions in each country in which participants are located. The goals of survey administration are to successfully contact participants and to achieve the highest possible response rates. Our methods will include:


  • Customized Intro E-mail: A customized introductory e-mail will be sent at the start of survey administration to encourage respondent cooperation. This e-mail will inform respondents about the evaluation and provide ways to contact the evaluation’s contractor with any concerns or questions.


  • Participant Contact Information Verification: Extensive contact lists for the program were requested from the respective administering grantee organizations and the State Department program office to establish baseline participation over the 2002–2011 period and to obtain an initial set of contact data. In addition, ECA/P/V queried the State Department’s alumni databases and alumni network to obtain additional or updated contact information, ensuring that the contact lists are as accurate as possible.


  • Informing the Grantee Organizations: Many program participants remain in communication with the grantee organization that administered their exchange program long after the program has ended. Informing the grantee organizations in advance of the start of the evaluation’s data collection period will allow the grantees to vouch for the survey requests sent out by the contractor, in the event any participant contacts the grantee with doubts about the legitimacy of the initial intro e-mail sent by Research Solutions International. No other information about the participants themselves will be provided to the grantee.


  • Survey Reminders: In addition to the initial intro e-mail, three follow-up reminders will be sent to non-respondents over the course of the administration period to encourage them to respond, including a final reminder, sent as the survey comes to a close, that conveys urgency. Response rates and survey user feedback will be monitored and recorded at each biweekly reminder to ensure a satisfactory response. Based on response rate status throughout the administration period, ECA/P/V will also be ready to extend the administration period or send an additional reminder, as warranted.


  • Pre-testing the Survey: Pre-testing the survey was extremely useful for clarifying instructions and questions; refining the response categories; and ensuring clarity, brevity, relevance, user-friendliness, understandability, and sensitivity to respondents’ cultures and the political climates in which they live. This in turn allowed the survey’s questions to be designed to minimize respondent burden and encourage completion.


In our previous experience, these methods have improved response rates.


The data collected will be representative only of the evaluation’s respondents, and all analyses of results and future reports will be clearly linked to the universe that was surveyed. We will monitor the potential for non-response bias, including by tracking response rates by cohort over the collection period and by reviewing both respondent and non-respondent demographics (a minimal illustrative sketch follows). These factors will be taken into account in our analysis and reporting of results, especially when disaggregating the data according to key demographics for which the number of respondents may be less than ideal.
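A minimal sketch of this monitoring in Python with pandas, assuming a hypothetical sample frame with one row per scholar, a boolean “responded” flag, and demographic columns; illustrative only, not the contractor’s actual code.

    import pandas as pd

    def response_rates_by_cohort(frame: pd.DataFrame) -> pd.Series:
        # Percent responding per academic-year cohort, tracked over the collection period.
        return frame.groupby("cohort")["responded"].mean().mul(100).round(1)

    def demographic_comparison(frame: pd.DataFrame, demo: str) -> pd.DataFrame:
        # Distribution of one demographic among respondents vs. non-respondents,
        # used to flag potential non-response bias before reporting.
        return (frame.groupby("responded")[demo]
                     .value_counts(normalize=True)
                     .mul(100).round(1)
                     .unstack(fill_value=0))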


  4. To enhance the questionnaire design, a small number of formative interviews were conducted: five (5) former program participants were interviewed prior to the survey development phase. These interviews increased the questionnaire designers’ understanding of program participants’ experiences, particularly in identifying the full range of activities, interactions, roles, and outcomes associated with program participation. In addition, upon completion of the questionnaire design phase, a small number of cognitive/pre-test interviews (6 interviews, comprising different questions from the formative interviews) were conducted. As part of these interviews, a small number of past program participants completed a test version of the online survey and were later debriefed through telephone interviews or via e-mail to identify any needed modifications to the instrument prior to OMB submission. The debriefing interviews focused on determining whether question wording was clear, conveyed its intended meaning, contained realistic and mutually exclusive response options, and presented scaling of magnitude, agreement/disagreement, etc., that is relevant and understandable to the respondents.


  5. The ECA/P/V individual managing this evaluation’s external contractor, Research Solutions International, which will collect the data and analyze the information, is Eulynn Shiu, 202-632-6321.


