
Monitoring Data for ECA (MODE) Framework

OMB: 1405-0240




SUPPORTING STATEMENT FOR
PAPERWORK REDUCTION ACT SUBMISSION

Monitoring Data for ECA Framework
OMB Number 1405-XXXX


A. JUSTIFICATION

  1. Why is this collection necessary and what are the legal statutes that allow this?

The Department of State’s Bureau of Educational and Cultural Affairs (ECA) regularly monitors and evaluates its programs by collecting data about program accomplishments, enabling program staff to assess program results, identify where improvements may be necessary, and plan or modify future programs. To assess the efficacy and impact of ECA-funded programs more systematically, and to address the requirements of the Foreign Aid Transparency and Accountability Act (FATAA) and the Department of State’s updated monitoring and evaluation guidance (18 FAM 300), ECA’s Evaluation Division has created a robust performance monitoring framework that is responsive to these directives, measures programmatic goals and objectives, and provides a comprehensive view of overall Bureau activities. The Monitoring Data for ECA (MODE) Framework (https://eca.state.gov/impact/eca-evaluation-division/monitoring-data-eca-mode-framework) includes a results framework with indicators1 designed to track program performance and the direction, pace, and magnitude of change of ECA programs, leading to strengthened feedback mechanisms and, in turn, more effective programs. Each indicator has corresponding data collection questions defined so that data will be collected uniformly whether by the program office, the Evaluation Division, or an award recipient. Implementation of the MODE Framework will enable ECA to standardize and utilize its data in the following ways:

  • Assess data and performance metrics to enhance program performance

  • Inform strategic planning activities at the Bureau, division, and individual exchange program levels

  • Supplement the information ECA program officers receive from their award recipients and exchange participants to provide a comprehensive view of programmatic activities

  • Respond quickly and reliably to ad-hoc requests from Congress, the Office of Management and Budget (OMB), and internal Department of State stakeholders

In order to collect data for the MODE Framework, the ECA Evaluation Division intends to conduct ongoing surveys of program participants, alumni, and participant host and home communities to monitor program performance, assess impact, and identify issues for further evaluation. Specifically, ECA will coordinate with award recipients to provide standard survey questions for both foreign national and U.S. citizen exchange participants immediately after they complete the exchange (the “Participant Post-Program Survey”). ECA’s Evaluation Division also intends to administer standard surveys to foreign national and U.S. citizen exchange alumni roughly one year, three years, and five years after they complete their exchange experience. Conducting post-program surveys, particularly after three and five years, will provide information on the impact of ECA programs and insight into the achievements of participants. To examine exchange multiplier effects on the foreign and U.S. communities and institutions that sponsor, support, or provide services to exchange programs, ECA intends to administer standard surveys to foreign and U.S. host community members (individuals or institutions).

The Department seeks OMB multi-year clearance for all of these exchange survey questions to facilitate ongoing data collection from all exchange participants, foreign nationals and U.S. citizens alike, who are involved in ECA exchange programming.

Legal authorities and administrative requirements that necessitate the collection of these data can be found below:

  1. Government Performance and Results Act of 1993 (GPRA)

  2. Government Performance and Results Modernization Act of 2010 (GPRAMA)

  3. Mutual Educational and Cultural Exchange Act of 1961, as amended, 22 U.S.C. 2451 et seq (also known as the Fulbright-Hays Act)

  2. What business purpose is the information gathered going to be used for?

The primary users of the collected data will be ECA’s Evaluation Division, program staff and leadership, as well as ECA’s award recipients. The information collected through the MODE Framework will enable ECA to standardize and utilize its data in the following ways:

  • Assess data and performance metrics to enhance program performance

  • Inform strategic planning activities at the Bureau, division, and individual exchange program levels

  • Supplement the information ECA program officers receive from their award recipients and exchange participants to provide a comprehensive view of programmatic activities

  • Respond quickly and reliably to ad-hoc requests from Congress, the Office of Management and Budget (OMB), and internal Department of State stakeholders

  3. Is this collection able to be completed electronically (e.g., through a website or application)?

Participant Post-Program surveys will be administered and managed by ECA award recipients on their electronic survey platforms.

Alumni and Host Communities surveys will be managed by the ECA Evaluation Division on a single survey platform, Qualtrics. Qualtrics provides functionality that will minimize time required to both administer and respond to the survey, including:

  • Strong validation measures to reduce errors and generate cleaner raw data;

  • Quick access dashboards for initial results; and

  • Security certifications, including ISO 27001 certification and FedRAMP authorization.

Qualtrics allows survey design to be tested on mobile platforms to ensure correct display of questions.

We expect that all survey submissions, including Participant Post-Program surveys, will be collected electronically.

  4. Does this collection duplicate any other collection of information?

This will not be a duplication of effort. The purpose of the data collection, and therefore the focus of the questions asked, is to understand the short-, mid-, and longer-term impacts on personal, academic, and career development, as well as contributions to U.S. global competitiveness. Currently, ECA award recipients administer post-program surveys to their participants as part of their internal program monitoring data collection approach. ECA intends to leverage this ongoing survey process by providing program awardees with standard indicators (we estimate 10-15 for each award) and corresponding data collection questions, depending on the program orientation. In many instances, these standard indicators and questions will supplant comparable awardee-defined indicators and questions with ECA-defined uniform data requirements. This will ensure the data ECA gathers are valid and reliable across the range of exchange programs.

In previous years, the ECA Evaluation Division surveyed foreign alumni from a sample of 10 ECA programs. The MODE Framework data collections represent an expansion to include American participants and standardization of the data collection tools. Additionally, ECA has not collected these data in a systematic manner from U.S. and foreign host community members in the past.

  5. Describe any impacts on small business.

Representatives from small businesses may be surveyed as part of the Host Community Survey, but we expect the time burden to be minimal.

  6. What are the consequences if this collection is not conducted?

Approval is being sought for multiple information collections. Historically, the Evaluation Division has collected reliable performance monitoring data for only 10 ECA exchange programs. Absent these data collections, ECA will have limited ability to rigorously monitor and evaluate the outcomes and impacts of its programming.

  7. Are there any special collection circumstances?

These data collections involve no special circumstances.

  8. Document publication (or intent to publish) a request for public comments in the Federal Register.

The 60-day Federal Register Notice was published on June 4, 2020 (85 FR 34481). One comment was received but did not provide feedback on the evaluation data collection tools and was not deemed relevant. The Department will publish a notice in the Federal Register soliciting public comments for a period of 30 days.

  9. Are payments or gifts given to the respondents?

No payments or gifts are proposed for respondents.

  10. Describe assurances of privacy/confidentiality.

ECA and its award recipients follow all procedures and policies stipulated under the Privacy Act of 1974 to guarantee the privacy of the respondents.

Post-Program Participant surveys administered by award recipients include consent statements, which meet the following guidelines:

  • State how long the data collection is expected to take

  • Explain the purpose of the data collection

  • Explain how data will be used, shared, and stored

  • Specify that participation in data collection is voluntary and will have no consequences to the respondent

  • Provide contact information for questions or technical support

In addition, Alumni and Host Community surveys administered by the ECA Evaluation Division will include the following Privacy Act Statement:



The Evaluation Division of the U.S. State Department’s Bureau of Educational and Cultural Affairs (ECA) would like to ask you a few questions about your knowledge, attitudes, and future plans following your program experience. [SPECIFIC PROGRAM LANGUAGE TO BE TAILORED AS NEEDED]

Privacy Act Statement

AUTHORITY: The information on this form is requested under the authority of 22 U.S.C. 2451 et seq (Mutual Educational and Cultural Exchange Act of 1961), P.L. 103-62 (Government Performance and Results Act of 1993), and P.L. 111-352 (Government Performance and Results Modernization Act of 2010).

PURPOSE: The purpose of gathering this information is to track program performance and the direction, pace, and magnitude of change for ECA programs.



ROUTINE USES: The information on this form may be shared with members of Congress, and the Office of Management and Budget (OMB). De-identified data files may be shared (without Personally Identifiable Information such as names or contact information) with ECA implementing partners and external researchers who are assisting ECA in measuring its impact. More information on the Routine Uses for the system can be found in the System of Records Notice State-08, Educational and Cultural Exchange Program Records.

DISCLOSURE: Responding to this survey is voluntary. The answers you provide on the survey will have no bearing on your participation in alumni activities or any future applications you may submit for U.S. State Department programs.

By clicking the “Consent and Enter Survey” button below, you are consenting to the above terms.

You can read ECA’s full data sharing and storage policy here: [LINK TO DATA POLICY]

Please complete this survey by [DEADLINE]. The survey has [NUMBER OF QUESTIONS] questions and will take approximately [TIME ESTIMATE] minutes to complete.

Thank you in advance for your time and participation!

[CONSENT AND ENTER SURVEY BUTTON]

In line with ECA policy, individual names will not be reported, but responses will be used in aggregated analysis and may be disaggregated by variables of interest, such as country of study, age group (youth vs. non-youth), program, sex, etc. As noted above, however, the consent statement for these data collections includes broad consent, which will allow ECA to reanalyze the data at a later time if deemed useful. Award recipients will report response rates as well as share raw data files with ECA for this purpose as part of their reporting requirements.

  11. Are any questions of a sensitive nature asked?

There are two questions that have been deemed to be of a sensitive nature on the post-program and alumni surveys: the sex and race/ethnicity of the respondents. Both questions have been included because previous evaluations of a sample of ECA programs indicated that there may be important differences in the experience of non-white participants and, in some countries, female participants. ECA therefore needs to be able to examine the differences in experiences and outcomes between white and non-white participants and between male and female participants, as well as any recommendations they may have for improving ECA programs.

There are two questions of a sensitive nature on the Host Community survey: sex and race/ethnicity. There may be important differences in perceptions among host community members by sex and race/ethnicity, and ECA would like to be able to analyze those differences to strengthen communication with, and program components for, those stakeholders in the future.

  12. Describe the hour time burden and the hour cost burden on the respondent needed to complete this collection.

The total estimated hour burden for this data collection is 9,937 hours, broken down as follows in Table 1.

Table 1. Annual Hour Time Burden for Respondents to MODE Surveys

Respondent Instrument           | Estimated Number of Respondents | Average Time per Response | Total Estimated Burden Time
--------------------------------|---------------------------------|---------------------------|----------------------------
Post-Program Survey(s)          | 50,532                          | 8 min                     | 6,738 hours
Alumni Survey(s)                | 6,063                           | 30 min                    | 3,032 hours
Host Community Member Survey(s) | 500²                            | 20 min                    | 167 hours
Total                           | 57,095                          |                           | 9,937 hours

Note: Estimated respondent counts reflect ECA's historic average response rates of 75.77% for post-program surveys and 44.61% for alumni surveys.

The average times were calculated based on estimated completion times for survey questions of similar length.3
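The burden figures in Table 1 follow from a simple respondents × minutes calculation. A minimal sketch of that arithmetic (the respondent counts and per-response times are taken from the table; rounding to whole hours is an assumption about how the totals were derived):

```python
# Reproduce the Table 1 burden-hour arithmetic: respondents x minutes / 60.
# Figures come from the table above; rounding to whole hours is assumed.
instruments = {
    "Post-Program Survey(s)": (50_532, 8),
    "Alumni Survey(s)": (6_063, 30),
    "Host Community Member Survey(s)": (500, 20),
}

total_hours = 0
for name, (respondents, minutes) in instruments.items():
    hours = round(respondents * minutes / 60)
    total_hours += hours
    print(f"{name}: {hours:,} hours")

print(f"Total: {total_hours:,} hours")  # matches the 9,937-hour total
```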

Time Cost to Respondents

Most respondents (alumni and host community members) will respond as individuals, and the cost to them is the opportunity cost of the paid work they could undertake in the time spent responding to surveys. Therefore, the cost estimate for these groups is based on national wages from the Bureau of Labor Statistics May 2019 National Occupational Employment and Wage Estimates4 (as alumni and host community members are spread across the country), with participants and/or alumni (students or entry-level professionals in most cases, and therefore using the “Life, Physical, and Social Science Technicians” category) at a rate of $38.40 (the 2019 mean hourly wage of $37.28 inflated by 3% for 2020 and 2021).5 Given the diverse sectors and career levels within the host community member respondent group, a variety of representative occupations were selected from the Bureau of Labor Statistics’ data as proxies for potential respondent careers and sectors. Compensation estimates for the proxies were averaged to an hourly wage of $50.38 and inflated by 3% to $51.89. Table 2 presents the estimate of overall respondent hour and cost burden by group. Table 3 lists the occupation proxies selected for this analysis.

Table 2. Estimate of Annual Respondent Hour and Cost Burden

Respondent Group          | Total Estimated Hours | Hourly Cost Rate | Total Cost Burden
--------------------------|-----------------------|------------------|------------------
Post-Program Participants | 6,738                 | $38.40           | $258,722.67
Alumni                    | 3,032                 | $38.40           | $116,428.80
Host Community Members    | 167                   | $51.89           | $8,648.91
Total                     | 9,937                 |                  | $383,800.38



Table 3. Wage Estimate Proxies

Respondent Group: U.S. Community Members

Occupation Proxy                                                                        | Hourly Compensation (wages, salaries, and benefits)
----------------------------------------------------------------------------------------|----------------------------------------------------
Employer Cost: Junior colleges, colleges, and universities (state and local government)  | $58.83
Employer Cost: Professional and business services (private industry)                     | $42.21
Employer Cost: Management, business and financial (private industry)                     | $69.26
Employer Cost: Educational services (private industry)                                   | $46.30
Employer Cost: Health care and social assistance                                         | $35.31
Total Average Hourly Compensation                                                        | $50.38
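The host community rate can be reproduced from Table 3: the five proxy compensation figures are averaged, then the 3% inflation adjustment described in the text is applied. A minimal sketch (rounding to cents is an assumption):

```python
# Average the five BLS occupation-proxy compensation figures from Table 3,
# then apply the 3% inflation adjustment described in the text.
proxies = [58.83, 42.21, 69.26, 46.30, 35.31]

average = sum(proxies) / len(proxies)  # 50.382, reported as $50.38
inflated = round(average * 1.03, 2)    # $51.89

print(f"Average hourly compensation: ${average:.2f}")
print(f"Inflated hourly rate:        ${inflated:.2f}")
```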

  13. Describe the monetary burden to respondents (out-of-pocket costs) needed to complete this collection.

There are no costs incurred by respondents that will not be covered.

  14. Describe the cost incurred by the Federal Government to complete this collection.

The estimated annual cost to the U.S. Government for the Monitoring Data for ECA (MODE) Framework as related to this collection is $416,373.60. This estimate includes all direct and indirect costs of the design, data collection, and analysis activities. The wage rates of Federal employees at the Department of State were estimated using Step 1 for Grades 13 ($49.19/hour at 480 hours) and 14 ($58.13/hour at 120 hours) of the General Schedule in the Washington-Baltimore-Arlington, DC-MD-VA-WV-PA locality area.6 The Department multiplied the hourly wage rate by 2 to account for a fringe benefits rate of 69 percent and an overhead rate of 31 percent.
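The Federal staff labor portion of this estimate can be reconstructed from the rate and hour figures above; a minimal sketch (the balance of the $416,373.60 total, covering other direct and indirect costs of design, collection, and analysis, is not itemized in the text):

```python
# Federal staff labor portion of the Government cost estimate: GS hourly rates
# times hours, doubled to cover fringe (69%) plus overhead (31%) per the text.
gs_hours = [
    (49.19, 480),  # $49.19/hour at 480 hours
    (58.13, 120),  # $58.13/hour at 120 hours
]

labor = sum(rate * hours for rate, hours in gs_hours)
loaded = labor * 2  # 100% load: 69% fringe + 31% overhead

print(f"Loaded staff labor cost: ${loaded:,.2f}")
```

The remainder of the stated annual total would cover non-labor and contracted costs, which the document does not break out.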

  15. Explain any changes/adjustments to this collection since the previous submission.

This is a new collection.

  16. Specify if the data gathered by this collection will be published.

Once data have been collected and analyzed, the Evaluation Division will produce an annual report for internal use. Aggregate figures may be used in media and in public-facing reports to demonstrate the impact of ECA programs to stakeholders. However, neither the raw data nor individual responses will be published with attribution in any way.

  17. If applicable, explain the reason(s) for seeking approval to not display the OMB expiration date. Otherwise, write “The Department will display the OMB expiration date.”

The Department will display the OMB expiration date.

  18. Explain any exceptions to the OMB certification statement below. If there are no exceptions, write “The Department is not seeking exceptions to the certification statement.”

The Department is not seeking exceptions to the certification statement.

B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

This collection will employ statistical methods for the alumni surveys and the host community surveys. Additional information is provided below.

1. Potential Respondent Universe and Respondent Selection Methods

The respondent universe for this collection includes the approximately 66,000⁷ unique individuals who participate in ECA-sponsored exchange programs annually, as well as U.S. host community members who have interacted with ECA program participants as home stay hosts, peer collaborators/professional mentors, and site visit and/or community service organizations. Each of these groups is discussed in its own section below.

Current ECA Participants (Post-Program Surveys)

The potential respondent universe of program participants includes all individuals who have participated in ECA-sponsored exchange programs. While ECA’s historic average response rate for post-program surveys is 75.77%, response rates have ranged from 31.8% to 100%, varying by program and year. As a result, it was determined that a census is the most effective way to achieve a sample size sufficient to allow analyses by program, (host) country, and other disaggregation categories, such as gender and age, that will help ECA understand the impacts of its programs.

ECA Alumni (1-year, 3-year and 5-year surveys)

The potential respondent universe of alumni includes all U.S. citizens who have participated in ECA-sponsored exchange programs. While ECA’s historic average response rate for alumni surveys is 44.61%, response rates have ranged from 22.22% to 70.37%, varying by program and year. As a result, it was determined that a census is the most effective way to achieve a sample size sufficient to allow analyses by program, (host) country, and other disaggregation categories, such as gender and age, that will help ECA understand the impacts of its programs.

U.S. Host Community Members Surveys

U.S. host community members include individuals who have interacted with ECA program participants as home stay hosts, peer collaborators/professional mentors, and site visit and/or community service organizations. We propose a census of U.S. host community members because the response rate is anticipated to be low, as host community members may have less investment in the program than alumni. This expectation is also based on two recent program evaluations, which incorporated surveys of U.S. community members and experienced low response rates.

2. Procedures for the Collection of Information

For post-program surveys, ECA survey questions will be administered by award recipients, incorporated into their larger feedback surveys, and thus administered as a census. To assist with standardized data collection, each indicator comes with a performance indicator reference sheet, which defines the indicator and is key to ensuring indicator data quality and consistency. In addition, ECA programs have a wide range of participant sizes; even programs with large numbers of participants typically run in fairly small cohorts of 15 to 30, and experiences will have differed by cohort. Thus, to obtain information reflective of the range of experiences, a census is the most practical approach.

For the alumni surveys, a census will be taken, as we anticipate that a significant portion of the existing contact data may be out of date and it may not be possible to update it prior to data collection. In addition, ECA programs have a wide range of participant sizes; even programs with large numbers of participants typically run in fairly small cohorts of 15 to 30, and experiences will have differed by cohort. Thus, to obtain information reflective of the range of experiences, a census is the most practical approach.

For the U.S. host/home community surveys, the data collection will rely on a snowball sampling approach: surveys will be shared with award recipient staff, with a request that they share them directly with their points of contact in the communities where U.S. host community members are located, as well as via any relevant social media platforms. Thus, it is likely that the distribution of responses will be uneven.

Online Surveys

The ECA Evaluation Division will use the Qualtrics survey tool for data collection. The evaluation team will customize electronic surveys for each key stakeholder group. Wherever possible, questions and scales were designed for simplicity and parsimony. As much as possible, response categories were limited to multiple choice and no more than five-point scale options to reduce the time required to respond to the questions.

The invitation to complete the survey will include an estimated completion time, which should encourage participation.

The surveys will be developed in Qualtrics, a sophisticated electronic survey tool. Each target key group will have its own survey structure that follows a logic based on the key group and sub-group. Each survey will be open for a period of 4-6 weeks. The ECA Evaluation Division will track diagnostics and response rates through Qualtrics to confirm adequate survey coverage of key groups (e.g., at least a 30% response rate for each survey stakeholder group). After the initial invitation, anyone who has not completed the survey will receive email reminders at two weeks, one week, four days, and one day before the survey closes to encourage additional responses. Those who have completed the survey will not receive reminders.
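The reminder cadence described above (two weeks, one week, four days, and one day before the survey closes) amounts to a simple date calculation. A minimal sketch; the function name and the example close date are illustrative, not part of the collection plan:

```python
from datetime import date, timedelta

# Compute reminder dates at 14, 7, 4, and 1 days before the survey close date,
# matching the cadence described in the text. Names and dates are illustrative.
REMINDER_OFFSETS_DAYS = (14, 7, 4, 1)

def reminder_dates(close_date: date) -> list[date]:
    return [close_date - timedelta(days=d) for d in REMINDER_OFFSETS_DAYS]

close = date(2021, 6, 30)  # hypothetical close date for a 4-6 week window
for d in reminder_dates(close):
    print(d.isoformat())
```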

3. Maximization of Response Rates and Nonresponse

Expected response rates and nonresponse will vary by data collection instrument. Each is discussed in a separate section below.

Post-Program Participant Survey

ECA considers nonresponse to be a risk. For post-program surveys, ECA survey questions will be administered by award recipients as part of larger feedback surveys. Reminders will similarly be administered by award recipients and will depend on each award recipient's practices.

Alumni Survey

ECA considers nonresponse to be a risk. As noted earlier, the surveys will be designed to be viewed on mobile devices, which may increase willingness to complete them. Wherever possible, questions and scales were selected for parsimony. As much as possible, response categories were limited to yes/no and no more than four-point scale options to reduce the time required to respond to the questions.

Although the alumni surveys will be sent as a census, it is possible that responses will be most forthcoming from the most recent alumni (e.g., the one-year surveys) and may vary by program and country. For all alumni surveys, ECA will conduct a nonresponse analysis during the data analysis phase to determine the extent to which the respondents represent the universe of ECA program participants. If necessary, responses may be weighted in the data analysis phase to adjust for nonresponse, and any adjustments to the analysis will be discussed in the methodology section of the internal report.
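Nonresponse adjustments of the kind described here are commonly implemented as post-stratification-style weights, where each respondent is weighted by the ratio of their stratum's population share to its respondent share. A minimal sketch under that assumption; the stratum names and counts are invented for illustration, and the document does not specify a weighting method:

```python
# Illustrative nonresponse weights: weight = population share / respondent share,
# computed per stratum (e.g., by program). All counts are invented examples.
population = {"Program A": 600, "Program B": 400}   # universe counts
respondents = {"Program A": 150, "Program B": 250}  # survey completes

pop_total = sum(population.values())
resp_total = sum(respondents.values())

weights = {
    group: (population[group] / pop_total) / (respondents[group] / resp_total)
    for group in population
}

for group, w in weights.items():
    print(f"{group}: weight = {w:.2f}")
```

Applied to the respondent counts, these weights restore each stratum's share of the weighted sample to its share of the universe.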

U.S. Host Community Member Survey

ECA considers nonresponse to be a risk, as ECA has not historically surveyed this stakeholder group. The survey of U.S. host community members will rely on a snowball sampling approach: surveys will be shared with award recipient staff, with a request that they share them directly with their points of contact in the communities where U.S. host community members are located, as well as via any relevant social media platforms. Thus, it is likely that the distribution of responses will be uneven. The response rate is anticipated to be low, as U.S. host community members may have less investment in the program than alumni; past experience with two evaluations that collected data from this group supports this expectation.

4. Data Analysis and Reporting

Once data have been collected and analyzed, the Evaluation Division will produce an annual report for internal use. Quantitative data will undergo fairly simple analyses to generate sums or averages. Survey results will primarily be presented in the aggregate, with survey questions tabulated and reported under their corresponding indicators. For example, the survey question: “Do you consider yourself an alumni of a U.S. Department of State program?” will be reported under the corresponding indicator Percent of participants who identify as a Department of State program alumni. Indicator analysis may include disaggregation by variables of interest, such as country of study, program, age group (youth vs. non-youth), sex, etc.

Although both the post-program and alumni surveys will be sent as a census, it is possible that responses will vary by program and country. For both surveys, ECA will conduct a nonresponse analysis during the data analysis phase to determine the extent to which the respondents represent the universe of ECA program participants and alumni. Award recipients will report response rates as well as share raw data files with ECA for this purpose as part of their reporting requirements. If necessary, responses may be weighted in the data analysis phase to adjust for nonresponse, and any adjustments to the analysis will be discussed in the methodology section of the internal report.

5. Relevant Contacts

Several ECA staff reviewed and approved the proposed methodology, including Natalie Donahue, Chief of Evaluation ([email protected]). ECA’s Evaluation Division will collect and analyze the data on behalf of ECA.



1 An indicator is a marker of accomplishment or progress: a specific, observable, and measurable accomplishment or change that shows progress toward achieving a specific output or outcome. MODE Framework indicators will track the performance and the direction, pace, and magnitude of change of ECA programs. Common examples of indicators include participation rates and changes in attitudes and individual behaviors.

2 Note: This is the first time ECA will be surveying this group. Ad-hoc collections that were part of evaluations relied predominantly on snowball sampling and historically yielded very low response rates.

3 Research has shown that respondents can complete approximately five close-ended questions per minute and two open-ended questions per minute. URL: https://www.surveygizmo.com/resources/blog/survey-response-rates/

4 USDOL Bureau of Labor Statistics, May 2019 National Occupational Employment and Wage Estimates, https://www.bls.gov/oes/current/oes_nat.htm#25-0000, last updated March 31, 2020, accessed April 9, 2020.

5 These rates likely overstate the actual cost, as many alumni are full-time students and may not be employed, but they seemed reasonable proxies.

6 Source: Office of Personnel Management, “2020 General Schedule (GS) Locality Pay Tables,” https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/2020/general-schedule/

7 66,691 represents the 2016 number of ECA exchange participants.


Author: Canfield, Danielle P
File Created: 2021-01-13
