
SUPPORTING STATEMENT FOR
PAPERWORK REDUCTION ACT SUBMISSION

Evaluation of the Professional Fellows Program
OMB Number 1405-XXXX





A. JUSTIFICATION

  1. Why is this collection necessary and what are the legal statutes that allow this?

The Department of State’s Bureau of Educational and Cultural Affairs (ECA) regularly monitors and evaluates its programs through the collection of data about program accomplishments in order to enable program staff to assess the impact of its programs, identify where improvements may be necessary, and modify or plan future programs. ECA is currently conducting an evaluation of the Professional Fellows Program (PFP). The PFP is a two-way, global exchange program for mid-level emerging leaders from select foreign countries and is managed by the Professional Fellows Division of ECA. Foreign fellows come to the United States for a five- to six-week fellowship, including a minimum four-week tailored placement in a relevant professional organization (NGO, business, government, etc.) and an end-of-program conference in Washington, D.C. While in the United States, the foreign fellows volunteer in their local communities, stay with local families, and create follow-on project plans to implement back in their home countries. A select number of U.S. counterparts travel overseas on an outbound program, approximately two weeks in length, to directly support foreign fellows’ follow-on projects.



Beginning with the FY2012 grant cycle (program implementation in 2013), the Professional Exchanges Division began instituting program standards across the programmatic themes and program implementers. The evaluation covers the period of this standardization – from 2013 through 2018 – and seeks to determine the effect of these changes on program outcomes and the extent to which the program is achieving its long-term goal of developing lasting professional collaborative relationships between alumni and their U.S. counterparts to address common issues.



To conduct the evaluation, ECA contracted General Dynamics Information Technology (GDIT) to carry out in-depth interviews and surveys of the following stakeholder groups: foreign fellows, U.S. reciprocal fellows, the U.S. professionals with whom they worked, and U.S. homestay hosts.



Legal authorities and administrative requirements that necessitate the collection of these data can be found below:

  1. Government Performance and Results Act of 1993 (GPRA)

  2. Government Performance and Results Modernization Act of 2010 (GPRAMA)

  3. Department of State’s Program and Project Design, Monitoring, and Evaluation Policy

  4. Mutual Educational and Cultural Exchange Act of 1961, as amended, 22 U.S.C. 2451 et seq. (also known as the Fulbright-Hays Act)


  2. What business purpose is the information gathered going to be used for?

The data will primarily be used by ECA’s Evaluation Division, the PFP program staff, and the PFP implementing partners to improve program design, effectiveness, and impact. A high-level version of the final report will also be made available to the public as part of ECA’s responsibility to be accountable for the use of its funds and the performance of its mission. The ECA Evaluation Division, in partnership with GDIT, is conducting the evaluation and will be responsible for collecting and analyzing the data. Note that these data collection tools are not designed for foreign participants in the PFP, as ECA has already collected data from this stakeholder group. We believe this new information collection will help supplement our understanding of the program by surveying the groups of participants from whom ECA has not previously solicited feedback.



  3. Is this collection able to be completed electronically (e.g. through a website or application)?

The surveys will be administered using Survey Monkey. To protect the anonymity of the respondents, a general survey link (as opposed to individualized survey links) will be sent to the entire respondent base for each respective survey using updated emails supplied by the implementing partners. Survey Monkey allows for:

  • easy programming and testing before launch,

  • real-time monitoring of results,

  • real-time visual presentation of data, and

  • easy data exports to Microsoft Excel or IBM SPSS software for further data analysis.

Periodic reminders will be sent to the entire respondent pool throughout the survey window in order to improve the response rate.
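For illustration only, the export step noted in the list above (easy data exports to Microsoft Excel or IBM SPSS) might look like the minimal Python sketch below; the file name, column name, and use of the pandas library are assumptions made for illustration and are not part of the contractor’s specified workflow.

```python
# Illustrative only: load a Survey Monkey CSV export for further analysis.
# The file name and column name below are assumptions, not actual exports.
import pandas as pd

responses = pd.read_csv("pfp_survey_export.csv")

# Example check: response counts for one closed-ended item (column assumed).
print(responses["q1_overall_experience"].value_counts(dropna=False))

# Hand off a cleaned copy for analysis in Excel; SPSS users could import
# the same CSV directly.
responses.to_excel("pfp_survey_export_clean.xlsx", index=False)
```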

Due to the nature of in-depth interviews, they will be administered in person, by phone, or via a virtual meeting platform such as Skype.



  4. Does this collection duplicate any other collection of information?

This data collection does not represent a duplication of effort. The questions are specifically designed to understand the impact of the PFP on the foreign professionals who participated in the program, the U.S. reciprocal fellows, the U.S. professionals with whom they worked, and their homestay hosts. ECA has not collected these data from these particular stakeholders in the past.

  5. Describe any impacts on small business.

Representatives from small businesses may be interviewed and/or surveyed for the evaluation, but we expect the time burden to be minimal.

  6. What are the consequences if this collection is not done?

Approval is being sought for a one-time data collection. Since 2013, the Professional Fellows Division has taken steps to standardize the program across thematic areas, program components, and program implementers. Since that time, the PFP has not been evaluated, and ECA deems it critical to conduct an independent evaluation to capture the impact of the program and to determine the extent to which current program requirements are conducive to meeting its long-term goals. Absent this data collection, ECA cannot fully answer questions about the long-term benefits of the program (or the lack thereof).



  7. Are there any special collection circumstances?

This data collection does not involve any special circumstances. It is a one-time data collection and does not require submission of any information that is not OMB-approved. The participants will be notified of how the data will be used, and the Department will obtain their consent. No proprietary information will be collected.

  8. Document publication (or intent to publish) a request for public comments in the Federal Register



The 60-day Federal Register Notice was published on November 13, 2019 (84 FR 61673). No comments were received during that period. The Department will also publish a notice in the Federal Register soliciting public comments, directed to OMB, for a period of 30 days.



  9. Are payments or gifts given to the respondents?

No payments or gifts will be given to survey respondents or interviewees.

  10. Describe assurances of privacy/confidentiality

ECA and its external contractors follow all procedures and policies stipulated under the Privacy Act of 1974 to guarantee the privacy of the respondents. Beyond that, there are no assurances of confidentiality.

Each survey will include the following language:

Please note that your participation in this survey is completely voluntary, and you are free to end the survey at any time. By clicking the “Consent and Enter Survey” button below, you are consenting to the following terms: 


  • Any response you provide may be reported in the final report as part of the aggregated quantitative analysis or the de-identified qualitative analysis from open-ended responses.  

  • Responses may be reported by specific demographic category, program year, or program site. The only identifying information used will be the demographic information provided in the final section of the survey. 

  • De-identified data files will be submitted to ECA at the completion of the evaluation (without names or any contact information).

  • The data you provide may be reanalyzed at a later date for a follow-up study or other purpose as approved by ECA.



If you have any questions about this survey or the PFP Evaluation more generally, please feel free to contact the GDIT Evaluation Team at [email protected].



CONSENT TO PARTICIPATE

By clicking the button to enter the survey below, you are giving your consent to participate in this evaluation. If you do not wish to participate, please click the exit survey link below.

Consent and Enter Survey  |  Refuse and Exit Survey



Each Interview will begin with the following language:

Thank you for agreeing to meet with us. Please know that your participation in this interview is completely voluntary and that you may end the interview at any time. Your answers and opinions will be kept strictly confidential. Any response you provide may be included in the final report in the summary of the qualitative data or as an example, but no responses will ever be attributed to you personally. The de-identified data, that is, data without names or contact information, will be submitted to ECA at the completion of the evaluation. The information you share may be re-analyzed at a later date for a follow-up study or another purpose approved by ECA. If you are in agreement, we would like to record this conversation so that we can focus on the discussion rather than on taking notes. Do we have your consent to continue and begin the interview?

  11. Are any questions of a sensitive nature asked?

Demographic information will be requested in the surveys to better understand whether the program is reaching the desired audiences.

  12. Describe the hour time burden and the hour cost burden on the respondent needed to complete this collection

The hour time burden for this data collection is presented in the table below. Please note that only the data collection from U.S. stakeholders is included here.

Table 1. Hour Time Burden for U.S. Stakeholders in the PFP

Respondent Instrument | Estimated Number of Responses | Average Time per Response | Total Estimated Burden Time
Professional Contact Survey | 300 | 20 minutes | 100 hours
Professional Contact Interview Guide | 40 | 40 minutes | 26.7 hours
Homestay Host Survey | 86 | 15 minutes | 21.5 hours
Homestay Host Interview Guide | 40 | 30 minutes | 20 hours
Total Estimated Burden Time | | | 168.2 annual hours



Time burden estimates are based on actual surveys and interviews carried out with similar audiences in foreign countries under other projects.

Time Cost to Respondents

The cost to respondents is the opportunity cost of the paid work they could have undertaken during the time spent responding to a survey or participating in an interview. Therefore, the cost estimate is based on the Bureau of Labor Statistics May 2018 National Occupational Employment and Wage Estimates.1 Both respondent groups are likely to hold senior positions in their careers, but they represent a variety of fields and are spread across the country. Therefore, the average hourly wage of $60.19 (the 2018 hourly wage of $58.44 inflated by 3 percent for 2019) for all Management Occupations (Major Group) will be used as a proxy for both groups.
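For illustration only, the arithmetic behind Table 1 above and Table 2 below can be restated as the short Python sketch that follows; all inputs are taken directly from the tables and surrounding text, and the script simply recomputes the hour and cost burden from them.

```python
# Sketch of the burden arithmetic behind Tables 1 and 2; all inputs come from
# the tables and text above, and only the arithmetic is illustrated here.

HOURLY_WAGE = round(58.44 * 1.03, 2)  # 2018 BLS hourly wage inflated 3% -> $60.19

instruments = {
    # instrument: (estimated responses, minutes per response, respondent group)
    "Professional Contact Survey":          (300, 20, "Professional Contacts"),
    "Professional Contact Interview Guide": (40,  40, "Professional Contacts"),
    "Homestay Host Survey":                 (86,  15, "Homestay Hosts"),
    "Homestay Host Interview Guide":        (40,  30, "Homestay Hosts"),
}

group_hours = {}
for name, (responses, minutes, group) in instruments.items():
    hours = round(responses * minutes / 60, 1)   # Table 1 burden hours
    group_hours[group] = group_hours.get(group, 0) + hours
    print(f"{name}: {hours:.1f} hours")

total_hours = sum(group_hours.values())          # 168.2 annual hours
for group, hours in group_hours.items():
    print(f"{group}: {hours:.1f} hours x ${HOURLY_WAGE:.2f}/hour = ${hours * HOURLY_WAGE:,.2f}")
print(f"Total: {total_hours:.1f} hours, ${total_hours * HOURLY_WAGE:,.2f}")
```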

Table 2. Estimate of Respondent Hour and Cost Burden

Response Group | Total Estimated Hours | Hourly Cost Rate | Total Cost Burden
Professional Contacts | 126.7 | $60.19 | $7,626.07
Homestay Hosts | 41.5 | $60.19 | $2,497.89
Total | 168.2 | | $10,123.96



  13. Describe the monetary burden to respondents (out of pocket costs) needed to complete this collection.

Respondents will not incur any costs by participating in this data collection effort.

  14. Describe the cost incurred by the Federal Government to complete this collection.

The estimated cost to the USG for the PFP Evaluation as related to this collection is $95,386.80. This estimate includes all direct and indirect costs of the design, data collection, and analysis activities. In Table 3 below, Personnel and Fringe Benefit costs are for the contractor (GDIT) personnel who manage the evaluation. The wage rates of Federal employees at DOS were estimated using Step 1 for Grades 13 and 14 of the General Schedule in the Washington-Baltimore-Arlington, DC-MD-VA-WV-PA locality area.2 The Department multiplied the hourly wage rate by 2 to account for a fringe benefits rate of 69 percent3 and an overhead rate of 31 percent.4
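As an illustration of the wage loading described above, the following minimal Python sketch recomputes the fully loaded Federal hourly rates from the base rates quoted in Table 3; the rates and percentages come from the text and footnotes, and the script itself is not part of the evaluation methodology.

```python
# Sketch of the fully loaded Federal hourly rates used for Table 3.
FRINGE_RATE = 0.69    # benefits rate derived from CBO data (footnote 3)
OVERHEAD_RATE = 0.31  # HHS 100% default minus the 69% fringe rate (footnote 4)
LOAD_FACTOR = 1 + FRINGE_RATE + OVERHEAD_RATE  # yields a factor of 2

base_hourly_rates = {"GS-14, Step 1": 56.15, "GS-13, Step 1": 47.52}
for grade, rate in base_hourly_rates.items():
    print(f"{grade}: ${rate:.2f}/hour base -> ${rate * LOAD_FACTOR:.2f}/hour fully loaded")
```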


Table 3. Total Cost to Federal Government

Cost Item | Total
Federal Staff Costs (GS-14, Step 1 equivalent: $56.15/hour @ an estimated 40 hours; GS-13, Step 1 equivalent: $47.52/hour @ an estimated 120 hours) | $13,650.80
Personnel | $64,904
Fringe Benefits | NA
Travel (see footnote 5) | $16,832
Equipment | NA
Supplies | NA
Total Direct Costs | $81,736
Indirect Charges (Overhead, Fee, and G&A) | NA
Total | $95,386.80



  15. Explain any changes/adjustments to this collection since the previous submission

This is a new data collection.

  16. Specify if the data gathered by this collection will be published.

Once data have been collected and analyzed, the evaluation contractor will produce a final report for publication and, to supplement it, a summary briefing and an infographic for ECA’s use. The ECA Evaluation Division will publish the external version of the report and the infographic on its website (https://eca.state.gov/impact/eca-evaluation-division). However, neither the raw data nor the responses of individual respondents will be published with attribution.



  17. If applicable, explain the reason(s) for seeking approval to not display the OMB expiration date. Otherwise, write “The Department will display the OMB expiration date.”

The Department will display the OMB expiration date.

  18. Explain any exceptions to the OMB certification statement below. If there are no exceptions, write “The Department is not seeking exceptions to the certification statement”.

The Department is not seeking exceptions to the certification statement.



B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

This collection will employ statistical methods for each survey and interview.

Respondent Universe: The respondent universe for this survey includes 171 host institution supervisors (44 of whom participated in a reciprocal exchange to visit the foreign fellow) and 1,373 host institution points of contact (487 of whom participated in a reciprocal exchange) who interacted with a PFP fellow between 2013 and 2018.

Given that there is no additional cost or effort to sending emails to the entire universe, and given the desirability of having as large a sample as possible with which to demonstrate the impact of the PFP on American stakeholders, we recommend a census approach rather than a sampling approach for this survey. Based on past experience implementing web surveys with respondent groups for whom contact information is not regularly updated, we anticipate an overall response rate of 20%.

The contractor will employ the same census approach for the Homestay Host Survey as for the survey discussed above. Approximately 855 individuals who hosted foreign fellows participating in the PFP between 2013 and 2018 will be invited to participate in the survey. We anticipate an overall response rate of approximately 10%.
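For illustration only, the following Python sketch shows how the expected response counts relate to the respondent universes and anticipated response rates described above; the universe sizes and rates come from this section, and the resulting figures are consistent with the roughly 300 and 86 survey responses estimated in Table 1.

```python
# Sketch: expected survey responses under the census approach described above.
# Universe sizes and anticipated response rates are taken from the text.
universes = {
    "Professional Contact Survey": {"invited": 171 + 1373, "response_rate": 0.20},
    "Homestay Host Survey":        {"invited": 855,        "response_rate": 0.10},
}

for survey, info in universes.items():
    expected = info["invited"] * info["response_rate"]
    print(f"{survey}: {info['invited']} invited x {info['response_rate']:.0%} "
          f"= ~{expected:.0f} expected responses")
```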

Data collection methodology and maximizing response rates: To protect privacy (and to facilitate sending repeat reminders), the contractor will utilize Survey Monkey for data collection. The survey will be structured so that it follows a logic based on each respondent group, excluding any non-relevant questions and maximizing question applicability to each group. As soon as the contractor has entered the survey into Survey Monkey, we will conduct a pre-test by sending the link to internal contractor staff, to the Evaluation Division, and to Program staff to check for spelling and correct skip patterns. Any deficiencies will then be remedied.

The contractor will send a general survey link to the census of respondents for whom contact information is available. The invitation to complete the survey will include an estimated completion time, which should encourage participation. Once the survey has been launched (via an introductory email including the survey link), the contractor will send periodic reminders to boost the response rate. Based on prior experience, we expect the survey to be open for six to eight weeks, requiring two to three reminders. If response rates remain low despite reminders, the contractor will ask implementing partners to assist with promotion of the survey, since those organizations have established relationships with many of the respondents. The contractor will track emails that are not deliverable and will not send reminders to those individuals.

For both surveys, analysis will be conducted to determine whether there are any significant differences between survey respondents and non-respondents with respect to program year, program theme, program implementer, and respondent type. As the data presented in the final report will be descriptive of the program as a whole, any such non-response differences (biases) will be duly noted in the methodology section of the report. The goal is to demonstrate the extent to which the respondents are representative of the stakeholder population and to set the context for the results.
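One way the respondent/non-respondent comparison described above could be carried out is sketched below in Python; the data file, column names, and the use of a chi-square test of independence are assumptions made for illustration, not the contractor’s specified procedure.

```python
# Hypothetical sketch of a non-response bias check: for each characteristic,
# test whether respondents and non-respondents differ significantly.
# The data file and column names are illustrative assumptions.
import pandas as pd
from scipy.stats import chi2_contingency

frame = pd.read_csv("pfp_survey_frame.csv")  # one row per invited individual
# Assumed columns: responded (0/1), program_year, program_theme,
# program_implementer, respondent_type.

for characteristic in ["program_year", "program_theme",
                       "program_implementer", "respondent_type"]:
    table = pd.crosstab(frame[characteristic], frame["responded"])
    chi2, p_value, dof, _ = chi2_contingency(table)
    print(f"{characteristic}: chi2={chi2:.2f}, df={dof}, p={p_value:.3f}")
```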

Relevant Contacts: This evaluation was contracted through a competitive process. General Dynamics Information Technology (GDIT) was selected to carry out the evaluation. GDIT’s technical evaluation experts (Dr. Marta Muco and Dr. Karen Aschaffenburg) developed the original design in response to the solicitation. A number of ECA staff reviewed and then approved the proposed methodology: Natalie Donahue, Chief of Evaluation (202-632-6193), Elizabeth Botkin (202-632-6423) and Linnéa Allison (202-632-6060). GDIT’s evaluation team will collect and analyze the data on behalf of ECA.



1 Source: Bureau of Labor Statistics May 2018 National Occupational Employment and Wage Estimates, Management Occupations, Major Group, https://www.bls.gov/oes/current/oes110000.htm


2 Source: Office of Personnel Management, “2019 General Schedule (GS) Locality Pay Tables,” https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/2019/general-schedule/

3 Source: Congressional Budget Office, “Comparing the Compensation of Federal and Private-Sector Employees, 2011 to 2015” (April 2017), https://www.cbo.gov/publication/52637. The wages of Federal workers averaged $38.30 per hour over the study period, while the benefits averaged $26.50 per hour, which is a benefits rate of 69 percent.

4 Source: U.S. Department of Health and Human Services, “Guidelines for Regulatory Impact Analysis” (2016), https://aspe.hhs.gov/system/files/pdf/242926/HHS_RIAGuidance.pdf. On page 30, HHS states, “As an interim default, while HHS conducts more research, analysts should assume overhead costs (including benefits) are equal to 100 percent of pretax wages….” To isolate the overhead rate, the Department subtracted the benefits rate of 69 percent from the recommended rate of 100 percent.

5 Based on travel to 5 locations, with 4 or 5 days per site, for 2-person evaluation teams.
