Gilman Supporting Statement Part B (7-2013)



SUPPORTING STATEMENT FOR
PAPERWORK REDUCTION ACT SUBMISSION


Bureau of Educational and Cultural Affairs

Office of Policy and Evaluation

Evaluation Division:

Gilman Evaluation Survey


OMB Control Number: 1405-XXXX

SV-2012-0008




  B. Collections of Information Employing Statistical Methods



  1. This information collection will consist of one electronic survey, administered only once as part of the Gilman Evaluation.


  2. The potential respondent universe for the survey will be all 6,184 recipients of the Gilman Scholarship who studied abroad during the nine-year period spanning the 2002/03 through 2010/11 academic years. The anticipated overall response rate for this collection is 40%. This estimate is based on triangulating several sources: average response rates for previously conducted evaluation surveys, and the outreach support expected from the program office and the grantee organization.


While sampling is a useful and effective statistical tool, it would not be appropriate for this data collection effort. As stated above, the anticipated response rate for this survey is 40%. Given the need to obtain enough responses to address program characteristics that have changed over the years, such as the type of home institution, the evaluation team has concluded that sampling, or excluding the earlier academic years (prior to 2005), would yield insufficient data. In addition, some of our more recent evaluation surveys of programs reaching back as far as 2006 achieved similar response rates for those earlier years, which supports including the earlier cohorts here. Including the three program years that ended as far back as 2003 will be more of a challenge; however, they must be included because the scope of the evaluation encompasses looking for changes over time in the program's long-term outcomes. As a result, the survey will be administered using a census approach to ensure that the full span of the program is accounted for.
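For illustration only, the short sketch below works through the response-yield arithmetic behind the census decision, using the figures stated above (a universe of 6,184 recipients and an anticipated 40% response rate); the 50% sample fraction is a hypothetical comparison point, not a proposed design.

```python
# Illustrative arithmetic only; uses the figures stated in this document
# (6,184 recipients, anticipated 40% response rate). The 50% sample
# fraction below is a hypothetical comparison, not a proposed design.

UNIVERSE = 6_184
ANTICIPATED_RESPONSE_RATE = 0.40

# Expected completed surveys under the census approach.
census_yield = UNIVERSE * ANTICIPATED_RESPONSE_RATE
print(f"Census approach: ~{census_yield:,.0f} expected responses")   # ~2,474

# Expected yield if only half the universe were sampled: too few
# responses to disaggregate by cohort, home institution type, etc.
HYPOTHETICAL_SAMPLE_FRACTION = 0.50
sample_yield = UNIVERSE * HYPOTHETICAL_SAMPLE_FRACTION * ANTICIPATED_RESPONSE_RATE
print(f"50% sample:      ~{sample_yield:,.0f} expected responses")   # ~1,237
```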


Furthermore, the research team is confident that the potential respondent universe is a willing and capable target group. Even though the universe includes individuals who participated in the program roughly ten years ago, they comprise Internet-savvy, digitally connected young professionals. All are American citizens, most of whom now reside in the United States. Thanks to the efforts of ECA's alumni office programming, which uses digital platforms to remain in touch with program participants, we have excellent contact information for these individuals and evidence that they are responsive to electronic communication. None of the usual difficulties of trying to reach older program participants applies here.



  3. All ECA/P/V data collection methods are tailored to fit the prevailing political, cultural, safety, security, and accessibility conditions in each country in which participants are located. The goals of survey administration are to contact respondents successfully and to achieve the highest possible response rate. Our current methods will include:


  • Customized Intro E-mail: A customized intro e-mail will be sent at the start of survey administration to encourage respondent cooperation. This e-mail will inform respondents about the evaluation and provide ways to contact the evaluation's contractor with any questions or concerns about the evaluation.


  • Participant Contact Information Verification: Extensive contact lists for the program were requested from the administering grantee organizations and the State Department program office to establish baseline participation in the program over the 2002–2011 period and to obtain an initial set of contact data. In addition, ECA/P/V queried the State Department's alumni databases and alumni network for additional or updated contact information to ensure that the contact lists are as accurate as possible.


  • Informing the Grantee Organizations: Many program participants remain in communication with the grantee organization that administered their exchange program long after the program has ended. Informing the grantee organizations before the start of the evaluation's data collection period will allow the grantees to vouch for the survey requests sent out by the contractor. This notification serves only that purpose: if any participants contact the grantee with doubts about the legitimacy of the initial intro e-mail sent by Research Solutions International, the grantee can confirm it. No other information about the participants themselves will be provided to the grantee.


  • Survey Reminders: In addition to the initial intro e-mail, three follow-up reminders will be sent to non-respondents over the course of the administration period, including a final reminder, as the survey comes to a close, that conveys the urgency of responding. Response rates and survey user feedback will be monitored and recorded at each biweekly reminder to ensure a satisfactory response. Based on the response rate throughout the administration period, ECA/P/V will also be prepared to extend the administration period or to send an additional reminder as warranted.


  • Pre-testing the Survey: Pre-testing the survey was extremely useful for clarifying instructions and questions, refining response categories, and ensuring clarity, brevity, relevance, user-friendliness, and sensitivity to respondents' culture and the political climate in which they live. This in turn allowed the survey questions to be designed to minimize the burden on respondents and encourage them to complete the survey.


In our previous experience, using such methods has improved response rates.


The data collected are representative only of the evaluation's respondents, and all analyses of results and future reports will be clearly linked only to the universe that was surveyed. We will monitor the potential for non-response bias, including tracking response rates by cohort over the collection period and reviewing both respondent and non-respondent demographics. These factors will be taken into account in our analysis and reporting of results, especially when disaggregating the data by key demographics for which the number of respondents may be small.
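As a rough sketch of how this monitoring could be implemented (the table layout and column names such as "cohort" and "responded" are assumptions for illustration, not the actual survey administration system):

```python
# Minimal sketch of non-response bias monitoring. Column names ("cohort",
# "responded", and the demographic fields) are illustrative assumptions,
# not the actual fields used for the Gilman survey.
import pandas as pd

def response_rates_by_cohort(frame: pd.DataFrame) -> pd.DataFrame:
    """Response rate for each academic-year cohort in the invitation list."""
    return (
        frame.groupby("cohort")["responded"]
        .agg(invited="size", completed="sum")
        .assign(response_rate=lambda t: t["completed"] / t["invited"])
    )

def demographic_comparison(frame: pd.DataFrame, field: str) -> pd.DataFrame:
    """Distribution of one demographic field for respondents vs. non-respondents,
    used to flag potential non-response bias before disaggregating results."""
    return (
        frame.groupby("responded")[field]
        .value_counts(normalize=True)
        .unstack(fill_value=0)
    )
```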


  4. To enhance the questionnaire design, a small number of formative interviews were conducted: five (5) former program participants were interviewed prior to the survey development phase. These interviews deepened the questionnaire designers' understanding of program participants' experiences, particularly in identifying the full range of activities, interactions, roles, and outcomes associated with program participation. In addition to the formative interviews conducted before questionnaire design, a small number of cognitive/pre-test interviews (6 interviews, using different questions from the formative interviews) were conducted upon completion of the questionnaire design phase. As part of these interviews, a small number of past program participants completed a test version of the online survey and were later debriefed by telephone or e-mail to identify any needed modifications to the instrument prior to OMB submission. The debriefing interviews focused on determining whether question wording was clear, conveyed its intended meaning, contained realistic and mutually exclusive response options, and presented scales of magnitude, agreement/disagreement, etc., that are relevant and understandable to respondents.


  5. The ECA/P/V individual managing the evaluation's external contractor, Research Solutions International, which will collect and analyze the data, is Eulynn Shiu, 202-632-6321.


