SUPPORTING STATEMENT FOR
PAPERWORK REDUCTION ACT SUBMISSION

Evaluation of the Mandela Washington Fellowship for Young African Leaders
OMB Control Number 1405-0235



A. JUSTIFICATION

  1. Why is this collection necessary and what are the legal statutes that allow this?

The Department of State's Bureau of Educational and Cultural Affairs (ECA) regularly monitors and evaluates its programs through the collection of data about program accomplishments in order to enable program staff to assess the impact of its programs, identify where improvements may be necessary, and modify or plan future programs. ECA is currently conducting an evaluation of the Mandela Washington Fellowship for Young African Leaders (Fellowship). The Mandela Washington Fellowship is the flagship program of the Young African Leaders Initiative (YALI), a signature effort to invest in the next generation of African leaders. The Fellowship includes a six-week Academic and Leadership Institute at a U.S. college or university; a three-day networking Summit in Washington, D.C.; a competitively selected U.S.-based professional development experience with U.S. non-governmental organizations, private companies, and governmental agencies; a competitively selected Reciprocal Exchange Component for American professionals to travel to sub-Saharan African countries to build on strategic partnerships and professional connections developed during the Fellowship; and Africa-based support following the conclusion of the Institutes. The U.S.-based components of the Fellowship (Academic and Leadership Institutes, the Summit, professional development experiences, and the Reciprocal Exchange Component) are managed by ECA and are the focus of this evaluation.



As the Fellowship has been implemented for five years, ECA is conducting this evaluation to determine the extent to which the program is meeting its stated goals, as well as the program's impact on advancing Department of State strategic policy priorities. To do so, ECA has contracted Guidehouse LLP (Guidehouse) to conduct surveys, interviews, and focus groups with Fellowship Alumni, Academic and Leadership Institute staff, Professional Development Experience representatives, Reciprocal Exchange Alumni, and other U.S. community members with whom Fellows interacted during their time in the United States.



Legal authorities and administrative requirements that necessitate the collection of these data can be found below:

  1. Government Performance and Results Act of 1993 (GPRA)

  2. Government Performance and Results Modernization Act of 2010 (GPRAMA)

  3. Foreign Aid Transparency and Accountability Act of 2016, Public Law 114-191

  4. Foundations for Evidence-Based Policymaking Act of 2017

  5. Department of State’s Program and Project Design, Monitoring, and Evaluation Policy

  6. Mutual Educational and Cultural Exchange Act of 1961, as amended, 22 U.S.C. 2451 et seq (also known as the Fulbright-Hays Act)

  2. What business purpose is the information gathered going to be used for?

The primary users of the collected data will be ECA's evaluation and program staff and its implementing partner. Additional stakeholders may include members of the ECA Evaluation Division, Program Offices in ECA, ECA senior leadership, staff at U.S. Government agencies implementing components of YALI, and relevant U.S. Embassy personnel. The information collected will be used to inform any beneficial program adjustments to strengthen the utility and cost-effectiveness of the program.

A high-level version of the final report will also be made available to the public as part of ECA's responsibility to be accountable for the use of its funds and the performance of its mission. The ECA Evaluation Division, in partnership with Guidehouse (which is conducting the evaluation), will be responsible for collecting and analyzing the data.

  3. Is this collection able to be completed electronically (e.g., through a website or application)?

Surveys will be managed by Guidehouse on a single survey platform, Qualtrics. Qualtrics provides functionality that will minimize the time required to both administer and respond to the survey, including:

  • Strong validation measures to reduce errors and generate cleaner raw data;

  • Quick access dashboards for initial results; and

  • Security certifications, including ISO 27001 certification and FedRAMP authorization.

Qualtrics allows survey design to be tested on mobile platforms to ensure correct display of questions. We expect that all survey submissions will be collected electronically.

Due to the nature of in-depth interviews, these will need to be administered in person, by phone, or via a virtual meeting platform such as GoToMeeting.

  4. Does this collection duplicate any other collection of information?

This will not be a duplication of effort. The purpose of the data collection and the focus of the questions asked are to understand the Fellowship's impacts on personal and professional development, its impacts on U.S. communities, and its contributions to U.S. foreign policy goals and objectives. ECA has not collected these data from these U.S. stakeholders in the past.

  5. Describe any impacts on small business.

Representatives from small businesses may be interviewed and/or surveyed for the evaluation, but we expect the time burden to be minimal.

  6. What are the consequences if this collection is not done?

Approval is being sought for a one-time data collection. As the Mandela Washington Fellowship has been operating formally since 2014, ECA deems it critical to conduct an independent evaluation to capture the impacts of the program and to determine whether or not any adjustments to the program are necessary to ensure that it is meeting its long-term goals. Absent this data collection, ECA cannot fully answer questions about the long-term benefits of the program.

  7. Are there any special collection circumstances?

This data collection involves no special circumstances, as it is a one-time data collection and does not require submission of any information that is not OMB-approved. Consent procedures include providing the notices outlined in question 10 prior to collection of any data.

  8. Document publication (or intent to publish) a request for public comments in the Federal Register

The 60-day Federal Register Notice was published on July 26, 2019 (84 FR 36153). One comment was received, but it did not address the evaluation data collection tools and was not deemed relevant. The Department will publish a notice in the Federal Register soliciting public comments for a period of 30 days.

  9. Are payments or gifts given to the respondents?

No payments or gifts are proposed for respondents.

  10. Describe assurances of privacy/confidentiality

ECA and its external contractors follow all procedures and policies stipulated under the Privacy Act of 1974 to protect the privacy of respondents.

Each survey will include the following language:


Your participation in this survey is voluntary. You may opt to withdraw from the survey at any time, choose not to answer select questions, or choose to not submit your survey responses.


By selecting “I consent to participate in this survey” below, you are consenting to the following:

  • Aggregated responses or de-identified qualitative insights from open-ended questions may be included in the final report or publications resulting from the evaluation.

  • De-identified data files will be submitted to the Bureau of Educational and Cultural Affairs (ECA) at the U.S. Department of State upon completion of the evaluation (without names or any contact information).

  • The data you provide may be reanalyzed at a later date for a follow-up study or other purpose approved by ECA.

 

Your contributions are confidential and no individual identities will be used in any reports or publications resulting from the evaluation unless the individual provides consent to the Evaluation Team. 


If you have any questions or concerns about this survey or the Mandela Washington Fellowship evaluation more broadly, please reach out to the Evaluation Team at [email protected].


Please acknowledge if you consent to participate in this survey:


Answer Options

  • I consent to participate in this survey

  • I do not consent to participate in this survey [note: if ‘I do not consent’ is selected, the survey will automatically close]

Each interview will begin with the following language:


Your participation in this interview is voluntary. You may opt to withdraw from the interview at any time or choose not to answer select questions. For your awareness:

  • Aggregated responses or qualitative insights (without names) from open-ended questions may be included in the final report or publications resulting from the evaluation.

  • Qualitative data files will be submitted to ECA at the completion of the evaluation (without names or any contact information).

  • The information you provide may be re-analyzed at a later date for a follow-up study or other purpose approved by ECA.



Your contributions are confidential and no individual names will be used in any reports or publications resulting from the evaluation unless the individual provides consent to the Evaluation Team. If you have any questions or concerns about this interview or the evaluation more broadly, please reach out to the Evaluation Team at [email protected].


CONSENT TO PARTICIPATE

Do you consent to participate in this interview? [Verbal response will be documented]

  • I consent to participate in this interview

  • I do not consent to participate in this interview [note: if selected, the interview will end]



In line with ECA policy, individual names will not be reported in the evaluation report, but responses will be used in the aggregated analysis and may be disaggregated by variables of interest, such as country of study, program year, and sex. As noted, the consent statement for this evaluation includes these notices. De-identified data files will be shared with ECA for this purpose at the end of the evaluation.

  11. Are any questions of a sensitive nature asked?

There is one question of a sensitive nature on the surveys targeting the Reciprocal Exchange Alumni, Professional Development Experience Hosts, Academic and Leadership Institute staff, and U.S. Community members: sex. Although pre-testing did not reveal any differences in answers by sex (likely due to the very small number of respondents), there may be important differences in perceptions among Reciprocal Exchange Alumni, Professional Development Experience Hosts, Academic and Leadership Institute staff, and U.S. Community members by sex. The team would like to be able to analyze those differences to strengthen communication with, and program components for, those stakeholders in the future.

  12. Describe the hour time burden and the hour cost burden on the respondent needed to complete this collection

The total estimated hour burden for this data collection is 220.75 hours, broken down as follows in Table 1. The estimated number of respondents is approximately 30% of the respondent pool.

Table 1. Hour Time Burden for Mandela Washington Fellowship Evaluation Respondents

Respondent Instrument | Estimated Number of Responses | Average Time per Response (minutes) | Total Estimated Burden Time (hours)
Academic and Leadership Institute Survey | 40 | 30 | 20
Academic and Leadership Institute Interview Questions | 15 | 60 | 15
Professional Development Experience Host Organization Survey | 122 | 30 | 61
Professional Development Experience Host Organization Interview Questions | 15 | 60 | 15
Reciprocal Exchange Alumni Survey | 52 | 30 | 26
Reciprocal Exchange Alumni Interview Questions | 15 | 60 | 15
U.S. Community Member Survey | 15 | 25 | 6.25
U.S. Community Member Focus Group Discussion Questions | 15 | 90 | 22.5
Fellowship Experience Maps | 40 | 60 | 40
Total | 329 | 49.44 (mean of the per-instrument averages) | 220.75
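The arithmetic behind Table 1 is simple: each instrument's burden equals its estimated responses multiplied by the average minutes per response, divided by 60, and the 49.44 figure is the simple mean of the per-instrument average times. The following minimal Python sketch (illustrative only, not part of the collection itself; labels abbreviated) reproduces the totals:

```python
# Illustrative sketch reproducing the burden arithmetic in Table 1.
# Each entry maps an instrument to (responses, average minutes per response).
instruments = {
    "Institute Survey": (40, 30),
    "Institute Interview": (15, 60),
    "PDE Host Survey": (122, 30),
    "PDE Host Interview": (15, 60),
    "Reciprocal Exchange Alumni Survey": (52, 30),
    "Reciprocal Exchange Alumni Interview": (15, 60),
    "U.S. Community Member Survey": (15, 25),
    "U.S. Community Member Focus Group": (15, 90),
    "Fellowship Experience Maps": (40, 60),
}

total_responses = sum(n for n, _ in instruments.values())
total_hours = sum(n * minutes / 60 for n, minutes in instruments.values())
mean_minutes = sum(m for _, m in instruments.values()) / len(instruments)

print(total_responses)         # 329
print(round(total_hours, 2))   # 220.75
print(round(mean_minutes, 2))  # 49.44 (simple mean of the average times)
```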

The average times were calculated based on the average pilot test times and estimated times from similar survey and interview question lengths. Each survey and key informant interview instrument was pilot tested with a set of respondents, and the mean time is reported above. The range of response times is shown in the table below.

Table 2. Pilot test response time ranges by data collection instrument

Data Collection Instrument | Number of Pilot Tests | Shortest Response Time (minutes) | Longest Response Time (minutes)
Academic and Leadership Institute Survey | 6 | 15 | 50
Professional Development Experience Host Organization Survey | 3 | 15 | 40
Reciprocal Exchange Alumni Survey | 3 | 15 | 40
U.S. Community Member Survey | 3 | 15 | 15

Although the questions in the focus group guide and interview guides were tested, we were unable to simulate full focus groups and interviews during the pilot testing period. The time estimates for the focus groups and interviews are therefore based on the number and complexity of questions to be asked and the anticipated number of respondents per group, informed by the evaluation contractor's past experience conducting interviews and focus groups. The interviewer or focus group moderator also has some control over how long the conversation lasts and can steer the interviewee or group to the next question to ensure that the discussion does not run too long.

Time Cost to Respondents

To estimate the burdened labor rate for individuals participating in the data collection, the Department used total compensation rates obtained from the Bureau of Labor Statistics March 2019 Employer Costs for Employee Compensation Summary,1 which includes costs for wages, salaries, and benefits (as Fellowship stakeholders are spread across the country). Given the diverse sectors and career levels within each of the respondent groups, a variety of representative occupations were selected from the Bureau of Labor Statistics' data as proxies to represent potential respondent careers and sectors. Compensation estimates for the proxies were then averaged for each respondent group. For example, Academic and Leadership Institute staff are typically employed by U.S. universities, so the cost burden for this group was estimated using the state and local government "Junior colleges, colleges, and universities" category, with a total hourly compensation of $58.86.2 Table 3 presents the estimated hour and cost burden for each respondent group. Table 4 lists the occupation proxies selected for this analysis.

Table 3. Estimate of Respondent Hour and Cost Burden

Respondent Group | Total Estimated Hours | Hourly Cost Rate | Total Cost Burden
Academic and Leadership Institute Staff | 35 | $58.86 | $2,060.10
Professional Development Experience Representatives | 76 | $49.69 | $3,776.44
Reciprocal Exchange Alumni | 41 | $50.17 | $2,056.97
U.S. Community Members | 28.75 | $51.17 | $1,471.27
Fellowship Experience Map Respondents | 40 | $50.68 | $2,027.00
Total | 220.75 | N/A | $11,391.78


Table 4. Wage Estimate Proxies

Respondent Group | Occupation Proxy (BLS Employer Cost category) | Hourly Compensation (wages, salaries, and benefits) | Total Average Hourly Compensation
Academic and Leadership Institute Staff | Junior colleges, colleges, and universities (state and local government) | $58.86 | $58.86
Professional Development Experience Representatives | Management, business, and financial (private industry) | $66.61 | $49.69
 | Educational services (private industry) | $46.71 |
 | Health care and social assistance | $35.75 |
Reciprocal Exchange Alumni | Junior colleges, colleges, and universities (state and local government) | $58.86 | $50.17
 | Professional and business services (private industry) | $41.48 |
U.S. Community Members | Junior colleges, colleges, and universities (state and local government) | $58.86 | $51.17
 | Professional and business services (private industry) | $41.48 |
 | Management, business, and financial (private industry) | $68.61 |
 | Health care and social assistance | $35.75 |
Fellowship Experience Map Respondents | Junior colleges, colleges, and universities (state and local government) | $58.86 | $50.68
 | Professional and business services (private industry) | $41.48 |
 | Management, business, and financial (private industry) | $66.61 |
 | Health care and social assistance | $35.75 |
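As an illustration of the proxy-averaging approach described above, the sketch below reproduces the Professional Development Experience Representatives row: the three BLS proxy rates from Table 4 are averaged, and the result is multiplied by the group's estimated hours from Table 3. This is a reconstruction for clarity, not the contractor's actual computation:

```python
# Illustrative reconstruction of the proxy-averaging arithmetic. The rates
# are the BLS total-compensation figures listed in Table 4; the hours are
# from Table 3.
pde_proxy_rates = [66.61, 46.71, 35.75]  # management/business, educational services, health care
pde_hours = 76                           # total estimated hours for PDE Representatives

hourly = sum(pde_proxy_rates) / len(pde_proxy_rates)  # average across occupation proxies
burden = hourly * pde_hours                           # hours x averaged hourly rate

print(round(hourly, 2))  # 49.69, matching Table 4
print(round(burden, 2))  # 3776.44, matching Table 3
```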

  13. Describe the monetary burden to respondents (out-of-pocket costs) needed to complete this collection.

There are no costs incurred by respondents.

  14. Describe the cost incurred by the Federal Government to complete this collection.

The estimated cost to the U.S. Government for the Evaluation of the Mandela Washington Fellowship as related to this collection is $113,259.60. This estimate includes all direct and indirect costs of the design, data collection, and analysis activities. In Table 5 below, Personnel and Fringe Benefit costs are for the contractor (Guidehouse) personnel who manage the evaluation. The wage rates of Federal employees at the Department were estimated using Step 1 of Grades 13 and 14 of the General Schedule in the Washington-Baltimore-Arlington, DC-MD-VA-WV-PA locality area.3 The Department multiplied the hourly wage rate by 2 to account for a fringe benefits rate of 69 percent4 and an overhead rate of 31 percent.5

Table 5. Total Cost to Federal Government

Cost Item | Total
Federal Staff Costs (GS-14, Step 1 equivalent, $56.15/hour × 2, estimated 40 hours; GS-13, Step 1 equivalent, $47.52/hour × 2, estimated 120 hours) | $13,650.80
Personnel | $86,838.80
Fringe Benefits | NA
Travel | $12,770.00
Equipment | NA
Supplies | NA
Total Direct Costs | $12,770.00
Indirect Charges (Overhead, Fee, and G&A) | NA
Total | $113,259.60
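A brief worked check of the arithmetic in item 14, under the stated assumptions: doubling each General Schedule hourly wage yields the fully loaded rate (69 percent fringe benefits plus 31 percent overhead), and the grand total is the sum of the line items in Table 5:

```python
# Illustrative check of the item 14 arithmetic. Multiplying each GS hourly
# wage by 2 accounts for the 69% fringe-benefits rate plus the 31% overhead
# rate cited in the text.
gs14_loaded = 56.15 * 2  # $112.30 fully loaded hourly rate
gs13_loaded = 47.52 * 2  # $95.04 fully loaded hourly rate

# The grand total is the sum of the line items reported in Table 5.
total = 13650.80 + 86838.80 + 12770.00
print(gs14_loaded, gs13_loaded, round(total, 2))  # 112.3 95.04 113259.6
```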

  15. Explain any changes/adjustments to this collection since the previous submission

This is a new collection.

  16. Specify if the data gathered by this collection will be published.

Once data have been collected and analyzed, the evaluation contractor will produce a final report for internal and external publication and, to supplement it, a summary briefing and infographic for ECA's use. The ECA Evaluation Division will publish the external version of the report and the infographic on its website (https://eca.state.gov/impact/eca-evaluation-division). Neither the raw data nor the responses of individual respondents will be published with attribution.

The results of this evaluation are specific to the Mandela Washington Fellowship program. As such, the data, findings, and conclusions should be considered specific to the context of this program rather than suitable for comparison with other programs operated by ECA or the Department.

  17. If applicable, explain the reason(s) for seeking approval to not display the OMB expiration date. Otherwise, write “The Department will display the OMB expiration date.”

The Department will display the OMB expiration date.

  18. Explain any exceptions to the OMB certification statement below. If there are no exceptions, write “The Department is not seeking exceptions to the certification statement”.

The Department is not seeking exceptions to the certification statement.

B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

This collection will employ statistical methods for each survey and interview.

For the U.S. Reciprocal Exchange Alumni, Professional Development Experience host organizations, and representatives from the 50 Academic and Leadership Institutes, a census will be taken, as we anticipate that a portion of the existing contact data may be out of date and it may not be possible to update it prior to data collection. The respondent population for this evaluation includes 172 U.S. Reciprocal Exchange Alumni, representatives from 50 Academic and Leadership Institutes, and 407 Professional Development Experience host organizations. Surveys will be sent to the full respondent population, while interviews will sample a subset of these stakeholders. The evaluation team will also survey and interview U.S. community members who interacted with Fellows as home-stay hosts, peer collaborators/professional mentors, and site visit and/or community service organizations. The specific numbers will depend on Academic and Leadership Institute recommendations. Additional information on methods and techniques is discussed below.


Data Collection Methods and Techniques

A variety of information gathering methods will be used to collect multiple lines of evidence which include:

  • Document and records review

  • Online surveys with key actors and stakeholders

  • One-on-one interviews (remote and in-person)

  • Focus group discussions (in-person), and

  • Fellowship Experience Maps


Domestic data collection will leverage both online and remote techniques. Domestic fieldwork will include site visits to collect data from various stakeholders through interviews and focus groups. Both qualitative and quantitative (survey) data will be de-identified before being provided to ECA following completion of the evaluation.


Online Surveys

The evaluation team will use surveys as a means of data collection, administered through the Qualtrics survey tool. The evaluation team will customize electronic surveys for each key stakeholder group. Wherever possible, questions and scales were designed for simplicity and parsimony. As much as possible, response categories were limited to multiple choice and no more than five-point scale options to reduce the time required to respond to the questions.


The invitation to complete the survey will include an estimated completion time, which should encourage participation.


The surveys will be developed in Qualtrics. Each target key group will have its own survey structure that follows a logic based on the key group and sub-group. Each survey will be open for a period of 4-6 weeks.


The evaluation team will track diagnostics through the Qualtrics tool to confirm adequate survey coverage of key groups (e.g., at least a 30% response rate for each surveyed stakeholder group). The study population requires sufficient coverage from each of the Alumni cohorts and the key groups. The evaluation team will use the Qualtrics diagnostics to track response rates and develop customized, targeted outreach efforts to facilitate sufficient survey participation.
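As a hypothetical illustration of this tracking step (the completed-response counts below are placeholders, not Qualtrics output), response rates can be computed against the known population sizes, with groups below the 30% target flagged for additional outreach:

```python
# Hypothetical sketch of response-rate tracking. Population sizes come from
# Part B above; the completed counts are invented placeholders.
targets = {"Institute staff": 50, "PDE hosts": 407, "Reciprocal Exchange Alumni": 172}
completed = {"Institute staff": 20, "PDE hosts": 100, "Reciprocal Exchange Alumni": 60}

for group, n in targets.items():
    rate = completed[group] / n
    flag = "" if rate >= 0.30 else "  <- below 30% target, escalate outreach"
    print(f"{group}: {rate:.0%}{flag}")
```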


The surveys will be open for 4-6 weeks. After the initial invitation, anyone who has not completed the survey will receive email reminders at two weeks, one week, four days, and one day before the survey closes to encourage additional responses. Those who have completed the surveys will not receive the reminders. The surveys will yield quantitative and some qualitative data, including insights into relationships formed, skills developed, knowledge gained, professional and business opportunities for Alumni and U.S. participants, benefits to communities, and which program components did or did not work.

The survey will be sent to the available contacts for the following stakeholder groups: Academic and Leadership Institute staff, Reciprocal Exchange Awardees/Alumni, and Professional Development Experience hosts. Because only one point of contact is available for each Academic and Leadership Institute and Professional Development Experience host, the invitation to participate will include a request to distribute the survey to peers. To preserve anonymity and confidentiality, the survey requests that participants provide their name and contact information only if they would like to participate in a follow-up interview. Because of this, the evaluation team will have limited ability to validate respondents against known persons associated with the Fellowship.

For U.S. community members, the survey will be shared with Academic and Leadership Institute staff, with a request that it be forwarded to their points of contact within the community in which the Institute is located. Thus, the distribution of responses is likely to be uneven. The evaluation contractor will conduct a nonresponse analysis during the data analysis phase to determine the extent to which respondents represent the universe of each stakeholder group from 2014 to 2018. If necessary, responses may be weighted in the data analysis phase to adjust for nonresponse, and any adjustments to the analysis will be discussed in the methodology section of the report.
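Should weighting prove necessary, one common adjustment (an assumption here; the evaluation design does not prescribe a specific method) is to weight each respondent by the inverse of their group's response rate, so that underrepresented cohorts contribute proportionally to aggregate estimates:

```python
# Hypothetical sketch of inverse-response-rate weighting. The population and
# respondent counts below are placeholders, not evaluation data.
population = {"2014": 100, "2015": 120, "2016": 90}   # eligible contacts per cohort
respondents = {"2014": 40, "2015": 30, "2016": 45}    # completed surveys per cohort

# Weight each respondent so cohorts contribute in proportion to the population.
weights = {cohort: population[cohort] / respondents[cohort]
           for cohort in population}
print(weights)  # {'2014': 2.5, '2015': 4.0, '2016': 2.0}
```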

4. Tests of Procedures or Methods to be Undertaken

To ensure maximum clarity and accuracy in data collection, the evaluation team is currently scheduling pilot testing for each of the instruments with a small number of respondents representing various categories of respondents (i.e., Academic and Leadership Institute staff, Reciprocal Exchange Awardees/Alumni, Professional Development Hosts, U.S. community members).


Survey pilot test respondents will utilize electronic drafts of the instruments in Qualtrics. The Qualtrics platform automatically records the time taken to complete the survey. The team will then review each section with the respondent, identifying any concepts or questions that were misunderstood or unclear; additional guidance or response parameters that should be included in the instructions to assist with clarity and recall; how burdensome the questions were; and any additional response categories needed. Feedback from these interviews will be used to revise the survey instruments.


Focus Group Discussions (in-person)

The focus group discussions will occur in person to provide or verify background information, help generate new ideas, and explore innovations for improving future programs. The evaluation team will develop customized structured and semi-structured questions for each key group in consultation with ECA, drawing on several sources, including the document review, survey results, and the team's expertise and understanding of the program. Focus groups will be used in contexts where group discussion will yield more robust data, such as with U.S. community members. Each group will be run by two members of the evaluation team: one will moderate while the other takes detailed notes. After a brief introduction, the moderator will explain the purpose of the meeting and stress the informal format so participants can express their views candidly.


Sampling/Selection:

Surveys to the different stakeholder groups include a question that allows participants to ‘opt in’ to the U.S.-based interviews and focus group discussions. Focus group participants will be selected from the pool of survey respondents who have opted in. In cases where there are not enough individuals from a particular stakeholder group for a focus group (at least 3) in a given location, the interviewer will instead conduct individual semi-structured interviews using the same instrument.


Interviews (remote and in-person)

U.S.-based interviews will be conducted in person and by telephone. Such interviews and virtual meetings can give the team critical insight into the features perceived to contribute most effectively to program goals. The evaluation team will visit up to five sites to conduct in-person interviews. These may be supplemented by virtual interviews, depending on the availability of key stakeholders.


Sampling/Selection:

Surveys to the different stakeholder groups include a question that allows participants to ‘opt in’ to the U.S.-based interviews and focus group discussions. Interviewees will be sampled from the pool of survey respondents who have opted in. For stakeholder groups whose contacts are not directly available to the evaluation team and may therefore yield a low response rate (such as U.S. community members), the sample may be expanded using a snowball approach: interviewees will be asked whether they can refer the evaluation team to others within the community who have interacted with the Fellows.

Fellowship Experience Maps

To understand the variety of impacts of the Mandela Washington Fellowship program, it is important to balance the broad macro-level impacts of the program as a whole with deeper micro-level analysis that illuminates the experience of individual Fellows and U.S. participants such as Professional Development Experience hosts, Academic and Leadership Institute staff, and Reciprocal Exchange Awardees. To better understand the complex system of stakeholders and dynamics that influence the decisions made by Fellowship Alumni and their resulting outcomes, the evaluation team will prepare six detailed Fellowship Experience Maps: at least four for select Mandela Washington Fellowship Alumni, one for a Reciprocal Exchange Awardee, and potentially one for an American community member (such as a host family member), depending on survey results and available information. The evaluation team and ECA will review the list of potential candidates for Fellowship Experience Maps before making final decisions. The maps will identify the individuals who played key roles in the lives of Alumni or community members before, during, and after the Mandela Washington Fellowship experience, revealing their personal narrative or “Fellowship Experience.”

5. Relevant Contacts

This evaluation was contracted through a competitive process. A number of ECA staff reviewed and approved the proposed methodology, including Natalie Donahue, Chief of Evaluation (202-632-6193), and Marie-Ellen Ehounou (202-632-2847). Guidehouse LLP is the contractor selected to carry out the evaluation. Guidehouse's technical evaluation experts developed the original design in response to the solicitation, and ECA's Evaluation Division reviewed and refined the proposed design upon receipt of the project files. Guidehouse's evaluation team will collect and analyze the data on behalf of ECA.

1 USDOL Bureau of Labor Statistics, Employer Costs for Employee Compensation, https://www.bls.gov/news.release/ecec.toc.htm, last modified March 19, 2019, accessed June 28, 2019.

2 USDOL Bureau of Labor Statistics, Employer Costs for Employee Compensation, Table 4. State and local government, by occupational and industry group, https://www.bls.gov/news.release/ecec.t04.htm, last modified March 19, 2019, accessed June 28, 2019.

3 Source: Office of Personnel Management, “2019 General Schedule (GS) Locality Pay Tables,” https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/2019/general-schedule/

4 Source: Congressional Budget Office, “Comparing the Compensation of Federal and Private-Sector Employees, 2011 to 2015” (April 2017), https://www.cbo.gov/publication/52637. The wages of Federal workers averaged $38.30 per hour over the study period, while the benefits averaged $26.50 per hour, which is a benefits rate of 69 percent.

5 Source: U.S. Department of Health and Human Services, “Guidelines for Regulatory Impact Analysis” (2016), https://aspe.hhs.gov/system/files/pdf/242926/HHS_RIAGuidance.pdf. On page 30, HHS states, “As an interim default, while HHS conducts more research, analysts should assume overhead costs (including benefits) are equal to 100 percent of pretax wages….” To isolate the overhead rate, the Department subtracted the benefits rate of 69 percent from the recommended rate of 100 percent.

