YTYA Supporting Statement Part B (5-2013)


SUPPORTING STATEMENT FOR
PAPERWORK REDUCTION ACT SUBMISSION


Bureau of Educational and Cultural Affairs

Office of Policy and Evaluation

Evaluation Division:

Young Turkey/ Young America Evaluation (YTYA) Survey


OMB Control Number: 1405-0213

SV2013-0001



  B. Collections of Information Employing Statistical Methods


    1. There is no sampling for this information collection; the potential respondent universe is all 235 participants in the Young Turkey/Young America (YTYA) Program. YTYA participants are Turkish and American citizens who took part in the program both in their home country and in their host country and were funded under FY 2009 to FY 2011 grants. The participants in this program have never been surveyed by State/ECA. The anticipated response rate for this collection is 65%. This estimate is based on experience with previous completed State Department studies and on the fact that most respondents participated in the program fairly recently.


    2. This information collection will consist of one electronic survey. Given the small size of the program population, sampling would likely yield an insufficient number of responses; we are therefore surveying the entire population rather than a sample. This collection will be conducted only once, as part of the Young Turkey/Young America Evaluation.


    3. All ECA/P/V data collection methods are tailored to fit the prevailing political, cultural, safety, security, and accessibility conditions in the United States and in Turkey. Successfully contacting participants and achieving the highest possible response rate are the goals of survey administration. Our methods will include:


  • Customized Notification Emails and Letters: Pre-notification emails and/or letters will be sent to all program participants approximately three weeks prior to the survey launch to encourage respondent cooperation. These emails/letters will inform program participants of the upcoming online survey and will contain endorsements from, and mentions of, the State Department or the grantee organizations. They will also explain the evaluation and provide ways for respondents to contact the firm managing the evaluation on behalf of State/ECA, should they have any questions or concerns about the evaluation or wish to provide an alternative email address at which to receive the online survey. As needed, the contractor will send follow-up emails to individuals to resolve any questions or discrepancies in participants’ names, email addresses, or dates of program participation. A second customized introductory/invitation email containing a link to the online survey will be sent at the start of survey administration.


  • Participant Contact Information Verification: Extensive contact lists for the program were requested from the respective administering grantee organizations and the State Department program office to establish baseline participation in the program and to obtain an initial set of contact data. In addition, ECA/P/V queried available State Department self-registered (i.e., provided by participants) alumni databases for additional or updated contact information to ensure that the contact lists are as accurate as possible.


  • Informing the Grantee Organizations: Many program participants remain in communication with the grantee organization that administered their YTYA exchange program. Grantees have been notified of the upcoming survey in the event that former participants contact them about it.


  • Survey Reminders: In addition to the initial introductory/invitation email at survey launch, up to three follow-up reminders will be sent to non-respondents over the course of the administration period to encourage them to respond, including a final reminder, as the survey comes to a close, that conveys the urgency of responding. Response rates and survey user feedback will be closely monitored and recorded throughout the administration period to ensure a satisfactory response. Based on response rate status during the administration period, ECA/P/V will also be prepared to extend the administration period and/or send an additional final reminder.


  • Pre-testing the Survey: Pre-testing the survey with four (4) participants was extremely useful for clarifying instructions and questions, refining the response categories, and ensuring clarity, brevity, relevance, user-friendliness, understandability, and sensitivity to respondents’ culture and the political climate in which they live. This in turn allowed the survey questions to be designed and refined in a way that minimizes the burden on respondents and encourages them to complete the survey.


In our previous evaluations, the data collection methods described above have ensured clarity about, and transparency in, the survey process and have helped stimulate higher response rates.


The data collected are representative only of the evaluation’s respondents, and all analyses of results and future reports will be clearly linked to the universe that was surveyed. We will monitor the potential for non-response bias, including by tracking response rates by cohort over the collection period and by reviewing the demographics of both respondents and non-respondents. These factors will be taken into account in our analysis and reporting of results, especially when disaggregating the data by key demographics for which the number of respondents may be small.


    4. To enhance the questionnaire’s design, the contractor conducted formative interviews with eight (8) former program participants during the survey pre-development phase. The formative interviews helped EurekaFacts to better understand program participants’ experiences, including the full range of activities, interactions, roles, and outcomes associated with program participation, and informed the development of the draft survey.


After the initial draft was developed, four (4) past program participants completed a test version of the online survey. Three of these pre-test participants indicated no issues; the one participant who suggested minor changes took part in a follow-up interview. Minor revisions were made to the survey instrument based on that interview and on a review of the data from the four test surveys, which was conducted to ensure that the wording was clear and conveyed its intended meaning, that the response options were realistic and mutually exclusive, and that the scales of magnitude, agreement/disagreement, etc., were relevant and understandable to respondents.


    5. The ECA/P/V individual who can answer questions regarding this evaluation is Robin Silver.



