


SUPPORTING STATEMENT FOR
PAPERWORK REDUCTION ACT SUBMISSION

National Security Language Initiative for Youth Evaluation
OMB Number 1405-0233



A. JUSTIFICATION

  1. Why is this collection necessary and what are the legal statutes that allow this?

The Department of State’s Bureau of Educational and Cultural Affairs (ECA) regularly monitors and evaluates its programs through the collection of data about program accomplishments, enabling program staff to assess the impact of its programs, identify where improvements may be necessary, and modify or plan future programs. ECA is currently conducting an evaluation of the National Security Language Initiative for Youth (NSLI-Y) program. NSLI-Y is a scholarship program that enables U.S. students aged 15-18 to study less commonly taught languages (Arabic, Chinese, Hindi, Indonesian, Korean, Persian, Russian, and Turkish) in summer or academic-year-long programs in a variety of countries. In addition to increased language proficiency, participants gain understanding of their host country and its culture. As the NSLI-Y program has been implemented for more than 10 years, ECA is conducting this evaluation to determine the extent to which the program is achieving its long-term goals. To do so, ECA has contracted Dexis Consulting Group to conduct surveys and focus groups with alumni and their parents, and in-depth interviews with local program coordinators/resident directors and a sample of U.S. high school teachers and administrators.



Legal authorities and administrative requirements that necessitate the collection of these data can be found below:

  1. Government Performance and Results Act of 1993 (GPRA)

  2. Government Performance and Results Modernization Act of 2010 (GPRAMA)

  3. Foreign Aid Transparency and Accountability Act of 2016, Public Law No. 114-191

  4. Foundations for Evidence-Based Policymaking Act of 2017

  5. Department of State’s Program and Project Design, Monitoring, and Evaluation Policy

  6. Mutual Educational and Cultural Exchange Act of 1961, as amended, 22 U.S.C. 2451 et seq. (also known as the Fulbright-Hays Act)


  2. What business purpose is the information gathered going to be used for?

The primary users of the collected data will be ECA’s evaluation and program staff and its implementing partners. The information collected will be used to inform any beneficial program adjustments to strengthen the utility and cost-effectiveness of the program. The final report will also be made available to the public as part of ECA’s responsibility to be accountable for the use of its funds and performance of its mission. The ECA Evaluation Division, in partnership with Dexis Consulting Group (Dexis), which is conducting the evaluation, will be responsible for collecting and analyzing the data.

  3. Is this collection able to be completed electronically (e.g., through a website or application)?

The survey will be managed by Dexis on a single survey platform, SurveyGizmo. SurveyGizmo provides functionality that will minimize the time required to both administer and respond to the survey, including:

  • Individualized links to the survey for easy tracking of response rate by key categories;

  • Automated reminders to only those who have yet to complete the survey;

  • Strong validation measures to reduce errors and generate cleaner raw data;

  • Quick access dashboards for initial results; and

  • Options to configure it to be HIPAA and NIST compliant.

Survey design will also be tested on mobile platforms (both Apple and Android) to ensure that all questions display properly whether viewed on a computer or a handheld device. We expect that all submissions will be collected electronically.

  4. Does this collection duplicate any other collection of information?

This will not be a duplication of effort. The purpose of the data collection, and therefore the focus of the questions asked, is to understand the program’s longer-term impacts on participants’ personal, academic, and career development, as well as its contributions to U.S. global competitiveness. ECA has not collected these data from these stakeholders in the past.

  5. Describe any impacts on small business.

We do not expect there to be any impacts on small businesses.

  6. What are the consequences if this collection is not done?

Approval is being sought for a one-time data collection. As the NSLI-Y program has been operating since 2006, ECA deems it critical to conduct an independent evaluation to capture the impacts of the program and to determine whether or not any adjustments to the program are necessary to ensure that it is meeting its long-term goals. Absent this data collection, ECA cannot fully answer questions about the long-term benefits of the program (or the lack thereof).

  7. Are there any special collection circumstances?

This data collection involves no special circumstances, as it is a one-time data collection and does not require submission of any information that is not OMB-approved. Consent procedures include obtaining broad consent for use of the data collected, and no proprietary information will be collected.

  8. Document publication (or intent to publish) a request for public comments in the Federal Register.

The 60-day Federal Register Notice was published on May 1, 2019 (84 FR 18627). No comments were received. Concurrent with this submission to OMB, the Department will publish a notice in the Federal Register soliciting public comments for a period of 30 days.

  9. Are payments or gifts given to the respondents?

No payments or gifts are proposed for respondents. However, as focus groups will require respondents to travel to a selected site and will likely incur some cost in addition to their time (parking, taxi, train/bus fare, etc.), the contractor will reimburse respondents for the costs associated with their travel to the focus group location. Reimbursement will be up to $10, depending on the metropolitan area and method of transport available. The amount is based on the GSA rate of $0.58 per mile and the assumption that participants will travel 10 to 15 miles roundtrip to the discussion location. As the focus groups are expected to last approximately 90 minutes, refreshments will also be provided for focus group respondents.
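For illustration, the reimbursement ceiling can be checked with a short calculation (a minimal sketch in Python; the mileage rate and roundtrip distances are those cited above):

```python
# Check of the travel reimbursement ceiling using the GSA mileage rate
# and the assumed roundtrip distances cited above.
GSA_RATE_PER_MILE = 0.58            # dollars per mile

for roundtrip_miles in (10, 15):    # assumed roundtrip distance range
    cost = roundtrip_miles * GSA_RATE_PER_MILE
    print(f"{roundtrip_miles} miles roundtrip -> ${cost:.2f}")
# 10 miles -> $5.80; 15 miles -> $8.70; both fall under the $10 cap.
```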

  10. Describe assurances of privacy/confidentiality.

ECA and its external contractors follow all procedures and policies stipulated under the Privacy Act of 1974 to guarantee the privacy of the respondents. In addition, each survey includes the following language:

Please note that your participation in this survey is voluntary, and you are free to end the survey at any time. By clicking the “Consent and enter survey” button below, you are consenting to the following terms: 

  • Any response you provide may be reported in the final report as part of the aggregated quantitative analysis or de-identified qualitative analysis of open-ended responses.  

  • Responses may be reported by specific demographic category, program year, or program site. The only identifying information used will be the demographic information provided in the final section of the survey. 

  • De-identified data files will be submitted to ECA at the completion of the evaluation (without names or any contact information).

  • The data you provide may be reanalyzed at a later date for a follow-up study or other purpose as approved by ECA.  

If you have any questions about this survey or the NSLI-Y evaluation more broadly, you can contact the Dexis evaluation team at [email protected].  

Please answer the questions to the best of your ability and use the comment boxes to provide fuller answers and more insight on your experiences with NSLI-Y. Thank you in advance for your time and input.

CONSENT TO PARTICIPATE - By clicking the button to enter the survey below, you are giving your consent to participate in this evaluation. If you do not wish to participate, please click the exit survey link below.

In line with ECA policy, individual names will not be reported, but responses will be used in the aggregated analysis, and may be disaggregated by variables of interest, such as country of study, program year, gender, etc. As noted above, however, the consent statement for this evaluation includes broad consent, which will allow ECA to reanalyze the data at a later time if deemed useful. De-identified data files will be shared with ECA for this purpose at the end of the evaluation.

  11. Are any questions of a sensitive nature asked?

There are two questions that have been deemed to be of a sensitive nature on the alumni survey: the sex and the race/ethnicity of the respondents. Both of these questions have been included because the document review of quarterly and annual reports indicates that non-white participants and, in some countries, female participants have encountered more difficulty with host families’ preconceived notions or stereotypes than other participants. ECA therefore needs to be able to examine the differences in experiences and outcomes between white and non-white participants and between male and female participants, as well as any recommendations they may have for improving the program.

There is one question of a sensitive nature on the parent survey: sex. Although the pre-testing did not reveal any differences in answers by sex (likely given the very small number of respondents), there may be important differences in perceptions among parents by sex, and the team would like to be able to analyze for those differences to strengthen communication with parents in the future.



  12. Describe the hour time burden and the hour cost burden on the respondent needed to complete this collection.

The total estimated hour burden for this data collection is 853 hours, broken down as follows in Table 1.

Table 1. Hour Time Burden for NSLI-Y Evaluation Respondents

Respondent Instrument | Estimated Number of Respondents | Average Time per Response | Total Estimated Burden Time
Alumni Survey | 1,797 (33% response rate) | 11.3 minutes | 338.4 hours
Parent Survey | 701 (6.5% response rate) | 8.6 minutes | 100.5 hours
Alumni Focus Group | 135 | 90 minutes | 202.5 hours
Parent Focus Group | 108 | 90 minutes | 162 hours
Local Coordinator/Resident Director Key Informant Interview | 35 | 60 minutes | 35 hours
High School Teacher/Administrator Key Informant Interview | 25 | 35 minutes | 14.6 hours
Total Estimated Burden Time | | | 853 annual hours
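As a cross-check, the burden-hour arithmetic in Table 1 can be reproduced with a short sketch (Python; the respondent counts and per-response times are taken from the table above):

```python
# Total burden per instrument = respondents x average minutes per response / 60.
instruments = {
    # instrument: (estimated respondents, average minutes per response)
    "Alumni Survey": (1797, 11.3),
    "Parent Survey": (701, 8.6),
    "Alumni Focus Group": (135, 90),
    "Parent Focus Group": (108, 90),
    "Local Coordinator/Resident Director KII": (35, 60),
    "High School Teacher/Administrator KII": (25, 35),
}

total_hours = 0.0
for name, (respondents, minutes) in instruments.items():
    hours = respondents * minutes / 60
    total_hours += hours
    print(f"{name}: {hours:.1f} hours")
print(f"Total: {total_hours:.0f} hours")  # ~853 hours, matching Table 1
```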

The average times were calculated based on the average pilot test times. Each survey and key informant interview instrument was pilot tested with a set of respondents, and the mean time is reported above. The range of response times is shown in the table below.

Table 2. Pilot test response time ranges by data collection instrument

Data Collection Instrument | Number of Pilot Tests | Shortest Response Time (minutes) | Longest Response Time (minutes)
Alumni Survey | 8 | 9.68 | 20.73
Parent Survey | 9 | 7.95 | 9.22
Local Coordinator/Resident Director Interview Guide | 4 | 48 | 72
High School Teacher/Administrator Interview Guide | 4 | 28 | 41

The greatest variance in the pilot testing was in the alumni survey, due to the survey’s skip patterns. For example, alumni who received two NSLI-Y scholarships answered the award-specific questions for each award, whereas those who received one answered them only once. In addition, respondents who were still in college typically skipped the section on employment and career, as they are not yet at that stage in their lives. Additional variance resulted from the length of the open-ended answers provided in several sections; those with longer open-ended answers had longer response times.

Similarly, the variation in the parent survey was largely due to the length of the open-ended answers provided.

Although the questions in the focus group guides were tested, we were unable to simulate multiple focus groups in the pilot testing period. The time estimate for the focus groups, therefore, is based on the number and complexity of the questions to be asked and on the anticipated number of respondents per group, drawing on the evaluation contractor’s past experience conducting focus groups. The focus group moderator also has some control over how long the conversation lasts and can steer the group to the next question to ensure that the session does not run too long.

Time Cost to Respondents

Most respondents (alumni and parents) will respond as individuals, and the cost to them is the opportunity cost of paid work for the time they spend responding to surveys or participating in focus groups. The cost estimate for these groups is therefore based on national wages from the Bureau of Labor Statistics’ May 2017 National Occupational Employment and Wage Estimates [1] (as alumni and parents are spread across the country). For alumni (students or entry-level professionals in most cases, proxied by the “Life, Physical, and Social Science Technicians” category), we used a rate of $25.48 (2017 mean hourly wage of $24.02 inflated by 3% for 2018 and 2019); for parents (likely senior in their careers but across a range of professions, proxied by the “General and Operations Managers” category), we used a rate of $62.96 (2017 mean hourly wage of $59.35 inflated by 3% for 2018 and 2019). [2]

We anticipate that the local coordinators/resident directors and high school teachers/administrators may respond as part of their work duties; as such, the cost burden for these groups was estimated using average total compensation rates from the Bureau of Labor Statistics’ September 2018 Employer Costs for Employee Compensation Summary. [3] For local coordinators/resident directors, we used the private industry “Education and Health Services” category, with a total hourly compensation of $38.42 (2018 rate of $37.30 inflated 3% for 2019). For high school teachers/administrators, we used the “Elementary and Secondary Schools” category, with a total hourly compensation of $53.92 (2018 rate of $52.35 inflated 3% for 2019).
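The hourly rates above all follow one pattern: a BLS base wage compounded forward at 3% per year. A minimal sketch of that derivation (Python; the base wages are those cited in this section):

```python
def inflate(base_wage, years, annual_rate=0.03):
    """Compound a base hourly wage forward by `years` at `annual_rate`."""
    return round(base_wage * (1 + annual_rate) ** years, 2)

print(inflate(24.02, 2))  # alumni, 2017 -> 2019: 25.48
print(inflate(59.35, 2))  # parents, 2017 -> 2019: 62.96
print(inflate(37.30, 1))  # local coordinators/resident directors, 2018 -> 2019: 38.42
print(inflate(52.35, 1))  # high school teachers/administrators, 2018 -> 2019: 53.92
```

Multiplying each rate by the corresponding hours in Table 3 yields the cost-burden column shown below.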

Table 3. Estimate of Respondent Hour and Cost Burden

Respondent Group | Total Estimated Hours | Hourly Cost Rate | Total Cost Burden
Alumni | 540.9 | $25.48 | $13,782.13
Parents | 262.5 | $62.96 | $16,527.00
Local Coordinators/Resident Directors | 35 | $38.42 | $1,344.70
High School Teachers/Administrators | 14.6 | $53.92 | $787.23
Total | 853 | | $32,441.06

  13. Describe the monetary burden to respondents (out-of-pocket costs) needed to complete this collection.

There are no costs incurred by respondents that will not be covered (see Question 9).

  14. Describe the cost incurred by the Federal Government to complete this collection.

The estimated cost to the USG for the NSLI-Y Evaluation as related to this collection is $656,264.91. This estimate includes all direct and indirect costs of the design, data collection, and analysis activities. In the table below, Personnel and Fringe Benefit costs are for the contractor (Dexis Consulting Group) personnel who manage the evaluation.


Table 4. Total Cost to Federal Government

Cost Item | Total
Federal Staff Costs (GS-14, Step 1 equivalent at $54.88/hour for an estimated 40 hours; contractor at $72.22/hour for an estimated 120 hours) | $10,861.60
Estimate of Respondent Hour Cost and Burden (Question 12) | $32,441.06
Personnel | $560,302.00
Fringe Benefits | NA
Travel | $37,699.69
Equipment | NA
Supplies | $29,596.00
Total Direct Costs | $67,295.69
Indirect Charges (Overhead, Fee, and G&A) | $28,667.22
Total | $656,264.91



  15. Explain any changes/adjustments to this collection since the previous submission.

This is a new collection.

  16. Specify if the data gathered by this collection will be published.

Once data have been collected and analyzed, the evaluation contractor will produce a final report for publication, and to supplement that, a summary briefing and infographic for ECA’s use. However, the raw data or those of individual respondents will not be published in any way with attribution.

  17. If applicable, explain the reason(s) for seeking approval to not display the OMB expiration date. Otherwise, write “The Department will display the OMB expiration date.”

The Department will display the OMB expiration date.

  18. Explain any exceptions to the OMB certification statement below. If there are no exceptions, write “The Department is not seeking exceptions to the certification statement.”

The Department is not seeking exceptions to the certification statement.

B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

This collection will employ statistical methods for the alumni survey and the parent survey. Additional information is provided below.

1. Potential Respondent Universe and Respondent Selection Methods

The respondent universe for this evaluation includes the 5,143 unique individuals who participated in NSLI-Y programs between 2008 and 2017 and their approximately 10,000 parents. Each of these groups is discussed in its own section below.

Alumni

The potential respondent universe of alumni includes all individuals who received and accepted NSLI-Y scholarships between 2008 and 2017. As we anticipate an overall response rate of 33% (based on past experience with programs targeting this age group and the rapidity with which their contact information tends to change), it was determined that a census was the most effective way to achieve a reasonable sample size to allow the analyses by language group (and, in some cases, by country) that will help ECA understand the impacts of its program.

Exhibit 1. 2008-2017 alumni by language and host country [4]

NSLI-Y Language | Country | Alumni by Host Country | Alumni by Language | Expected Response Rate | Total Expected Respondents
Chinese | China | 2,058 | 2,134 | 33.3% | 711
Chinese | Taiwan | 76 | | |
Arabic | Egypt | 140 | 1,041 | 33.3% | 347
Arabic | Jordan | 200 | | |
Arabic | Morocco | 647 | | |
Arabic | Oman | 54 | | |
Hindi | India | 274 | 274 | 33.3% | 91
Korean | Korea | 624 | 624 | 33.3% | 208
Russian | Russia | 717 | 957 | 33.3% | 319
Russian | Latvia | 40 | | |
Russian | Moldova | 142 | | |
Russian | Estonia | 58 | | |
Persian | Tajikistan | 106 | 106 | 33.3% | 35
Turkish | Turkey | 254 | 254 | 33.3% | 85
Total Alumni | | 5,390 | 5,390 | 33.3% | 1,797 [5]

The second group to be surveyed is the parents of the participants, as most will have clear perspectives on the changes in their children as a result of the program; this will allow us to triangulate some of the data received from the alumni and identify additional program areas for fine-tuning. Each alumnus/alumna is presumed to have two parents, and the information in the table below is based on that assumption. We propose a census of parents as well, as we anticipate a much lower response rate among parents (6.5%, in line with typical response rates to online surveys), since parents may have less investment in the program than the alumni.

Exhibit 2. 2008-2017 parents by language and host country

NSLI-Y Language | Country | Parents by Host Country | Parents by Language | Expected Response Rate | Total Expected Respondents
Chinese | China | 4,116 | 4,268 | 6.5% | 277
Chinese | Taiwan | 152 | | |
Arabic | Egypt | 280 | 2,082 | 6.5% | 135
Arabic | Jordan | 400 | | |
Arabic | Morocco | 1,294 | | |
Arabic | Oman | 108 | | |
Hindi | India | 548 | 548 | 6.5% | 36
Korean | Korea | 1,248 | 1,248 | 6.5% | 81
Russian | Russia | 1,434 | 1,914 | 6.5% | 124
Russian | Latvia | 80 | | |
Russian | Moldova | 284 | | |
Russian | Estonia | 116 | | |
Persian | Tajikistan | 212 | 212 | 6.5% | 14
Turkish | Turkey | 508 | 508 | 6.5% | 33
Total Parents | | 10,780 | 10,780 | 6.5% | 701 [6]
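The expected-respondent columns in Exhibits 1 and 2, and the rounding differences noted in footnotes 5 and 6, can be reproduced with a short sketch (Python; the counts are taken from the exhibits, and the 33.3% rate is assumed to be applied as one-third):

```python
alumni_by_language = {"Chinese": 2134, "Arabic": 1041, "Hindi": 274,
                      "Korean": 624, "Russian": 957, "Persian": 106,
                      "Turkish": 254}
parents_by_language = {k: 2 * v for k, v in alumni_by_language.items()}

for population, rate in ((alumni_by_language, 1 / 3),
                         (parents_by_language, 0.065)):
    per_row_sum = sum(round(n * rate) for n in population.values())
    on_grand_total = round(sum(population.values()) * rate)
    print(per_row_sum, on_grand_total)
# Alumni: 1796 vs. 1797; parents: 700 vs. 701. Rounding each row and then
# summing differs by one from applying the rate to the grand total, which
# is the discrepancy footnotes 5 and 6 describe.
```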

2. Procedures for the Collection of Information

For the alumni and parent surveys, a census will be taken, as we anticipate that a significant portion of the existing contact data may be out of date, and it may not be possible to update it prior to data collection. In addition, while some countries have had large numbers of participants, the participant groups are fairly small – most participants were in cohorts of 15 to 30 students, and experiences will have differed by cohort. Thus, to obtain information reflective of the range of experiences, a census is the most practical approach.

The expected alumni response rates will be sufficient to estimate overall program impacts at a 95 percent confidence level with a ±3-point confidence interval and 80% power. For analyses by language of instruction, four languages (Chinese, Russian, Arabic, and Korean) should have sufficient responses to estimate program impacts at a 95 percent confidence level with a ±4-point confidence interval and 80% power. The remaining three (Hindi, Persian, and Turkish), however, have smaller numbers of participants, and low response rates among those alumni may result in samples too small to analyze by language. (See Section 3 below on Maximization of Response Rates.)

The expected parent response rates are sufficient to estimate overall program impacts at a 95 percent confidence level with a ±3-point confidence interval and 80% power. Parent responses will only be analyzed by country or other factors if the response rate is higher than expected.
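For reference, the sample size implied by such targets can be approximated with a standard formula, reading “confidence interval of 3” as a ±3-percentage-point margin of error. The sketch below (Python) uses conventional assumptions (z = 1.96 for 95% confidence, worst-case proportion p = 0.5) with a finite population correction; the 80% power claims would additionally depend on effect-size assumptions not modeled here.

```python
import math

def required_n(population, margin, z=1.96, p=0.5):
    """Sample size for a +/-`margin` interval around a proportion."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite population correction

print(required_n(5390, 0.03))  # ~891 responses needed from the alumni
                               # universe; 1,797 are expected
```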

Responses to surveys will be collected via online/mobile survey platforms (SurveyGizmo). For more information about contact strategies, see Section 3 below.

3. Maximization of Response Rates and Nonresponse

Expected response rates and nonresponse will vary by data collection instrument. Each is discussed in a separate section below.

Alumni Survey

ECA considers nonresponse to be a risk to this evaluation for the surveys. As much of the alumni contact information may be out of date, one of the evaluation contractor’s early tasks is to update the alumni contact lists. This will be done primarily through the closed social media groups that are administered by the NSLI-Y implementing organizations, and each of the implementing partners will notify their formal and informal alumni lists of the upcoming evaluation. These initial messages will focus on the value of the alumni’s perspectives in refining the program to be better in the future and helping ECA understand how they perceive the program’s value in their lives.

A link to a site where alumni can update their information will be posted on each of the social media sites, and alumni will be encouraged to share it with others in their cohorts with whom they are still in touch. More time will be focused on tracking down earlier alumni (2008-2014), whose contact information is more likely to be out of date; academic-year participants; and those who participated in the Hindi, Persian, and Turkish programs, as those programs had smaller numbers overall.

Using updated information where available, and the most recent information where no updated information exists, the evaluation team will send individualized survey links to the alumni for their completion. The invitation to complete the survey will include an estimated completion time, which should encourage participation, as it is short. As noted earlier, the surveys will be designed to be viewed on mobile devices, which may also increase willingness to complete the survey.

Wherever possible, questions and scales were selected for parsimony. As much as possible, response categories were limited to yes/no and scales of no more than four points to reduce the time required to respond to the questions. The survey was pilot tested (see Section 4 below) to ensure a high degree of clarity and consistency in the questions.

The survey will be open for four weeks, and after the initial invitation, anyone who has not completed the survey will receive email reminders at two weeks, one week, four days, and one day prior to the survey closing to encourage additional responses. Those who have completed the surveys will not receive the reminders.

If the response rate is below 30% two weeks after the survey is released, the evaluation contractor will post additional reminders on the alumni social media sites and will send a thank you message to those who have completed the survey, asking them to encourage their cohort members and friends to complete it as well. The social media site postings will include instructions on what to do if an alumnus/alumna did not receive a survey, which will enable the evaluation contractor to undertake another round of contact updates and send new links to any new contact information provided at that time.
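The reminder logic described above can be expressed compactly. The sketch below (Python) is illustrative only: SurveyGizmo automates this scheduling, and the closing date shown is a hypothetical placeholder.

```python
from datetime import date, timedelta

CLOSE_DATE = date(2019, 9, 30)    # hypothetical survey closing date
REMINDER_OFFSETS = (14, 7, 4, 1)  # days before close at which reminders go out

def reminder_due(today, has_completed):
    """True if a reminder should be sent today to this invitee."""
    days_until_close = (CLOSE_DATE - today).days
    return days_until_close in REMINDER_OFFSETS and not has_completed

print(reminder_due(date(2019, 9, 16), has_completed=False))  # True: 14 days out
print(reminder_due(date(2019, 9, 16), has_completed=True))   # False: already done
```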

Although the survey will be sent as a census, it is likely that responses will be most forthcoming from the most recent alumni, and earlier alumni may be under-represented. The evaluation contractor will conduct a nonresponse analysis during the data analysis phase to determine the extent to which the respondents represent the universe of program participants from 2008 to 2017. If necessary, responses may be weighted in the data analysis phase to adjust for nonresponse, and any adjustments to the analysis will be discussed in the methodology section of the report.
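One common form of the weighting mentioned above is post-stratification, in which each stratum’s weight is its population share divided by its respondent share. A minimal sketch (Python) with illustrative, assumed cohort counts:

```python
# Assumed strata and counts for illustration only; the actual strata and
# weighting method will be decided during the nonresponse analysis.
population = {"2008-2014 cohorts": 3000, "2015-2017 cohorts": 2390}
respondents = {"2008-2014 cohorts": 500, "2015-2017 cohorts": 1297}

pop_total = sum(population.values())
resp_total = sum(respondents.values())

weights = {stratum: (population[stratum] / pop_total)
                    / (respondents[stratum] / resp_total)
           for stratum in population}
print(weights)  # under-represented earlier cohorts get weights > 1
```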

Parent Survey

ECA anticipates that parent contact information may be more out of date than the alumni contact information, as it has not been updated since the alumni completed their programs. Parent contact information originally provided on program applications will be mined, and parents who wish to participate will opt in to the survey. The notifications that are sent to the alumni to inform them that the evaluation will be conducted will also include a request that they alert their parents to the fact that they should receive a similar notice. Alternatively, alumni may send their parents a link where the parents can provide their contact information directly if they wish to participate.

Using updated information where available, and the original application information where no updated information exists, the evaluation team will send individualized survey links to the parents for their completion. The invitation to complete the survey will include an estimated completion time, which should encourage participation, as it is short. As noted earlier, the surveys will be designed to be viewed on mobile devices, which may also increase willingness to complete the survey.

Wherever possible, questions and scales were selected for parsimony. As much as possible, response categories were limited to yes/no and scales of no more than four points to reduce the time required to respond to the questions. The survey was pilot tested (see Section 4 below) to ensure a high degree of clarity and consistency in the questions.

The survey will be open for four weeks, and after the initial invitation, anyone who has not completed the survey will receive email reminders at two weeks, one week, four days, and one day prior to the survey closing to encourage additional responses. Those who have completed the surveys will not receive the reminders.

If the response rate is below 5% two weeks after the survey is released, the evaluation contractor will post additional reminders on the alumni social media sites and will send a thank-you message to those who have completed the survey, asking them to share the link where parents can provide their contact information directly. This will enable the evaluation contractor to undertake another round of contact updates and send new links to any new contact information provided at that time.

Although the survey will be sent as a census, it is likely that the distribution of responses will be uneven. The evaluation contractor will conduct a nonresponse analysis during the data analysis phase to determine the extent to which the respondents represent the universe of parents from 2008 to 2017. If necessary, responses may be weighted in the data analysis phase to adjust for nonresponse, and any adjustments to the analysis will be discussed in the methodology section of the report.

4. Tests of Procedures or Methods to be Undertaken

To ensure maximum clarity and accuracy in data collection, the evaluation team pilot tested each of the instruments with a small number of respondents representing various categories of respondents (i.e., alumni representing each language of instruction and parents of alumni across a range of countries of study). The table below shows the number of pilot test respondents for each instrument.

Exhibit 3. Pilot Tests of NSLI-Y Data Collection Instruments

Instrument | Number of Pilot Test Respondents | Maximum Response Time (minutes) | Minimum Response Time (minutes)
Alumni Survey | 8 | 20.73 | 9.68
Parent Survey | 9 | 9.22 | 7.95

Survey pilot test respondents used paper drafts of the instruments to enable the team to make changes quickly (for example, to the instructions) between pilot tests. The respondents completed each section of the survey while the evaluation team timed their responses. The team then reviewed each section with the respondent using a cognitive interviewing approach, identifying any concepts or questions that were misunderstood or unclear; any additional guidance or response parameters that should be included in the response instructions to assist with clarity and recall; how burdensome the questions were; and any additional response categories needed. Most of the questions on the draft instruments were understood clearly, but in several cases revisions were made to the instructions, and in a few cases additional response categories were identified and included in the final versions of the instruments to capture the range of responses.

Two concepts stood out as needing greater clarification – resilience and home community. Rather than discussing resilience directly, the evaluation team included a thoroughly tested resiliency scale developed by scholars at The Ohio State University, which eliminated the confusion and yielded better information. The evaluation is also intended to capture effects (spread effects or ripple effects) on the alumni’s home community, but it was unclear to many whether the questions were intended to focus on their community prior to program participation, their current or university community (as many completed the program and then proceeded a few weeks thereafter to start their undergraduate studies), or the networks of family, friends, and acquaintances with whom they shared information about their program experiences. After consultation with ECA, it was determined that the wider network was the most appropriate way to capture the desired impacts of the program, and the instruments were revised to reflect this operationalization of “home community.”

5. Relevant Contacts

This evaluation was contracted through a competitive process. A number of ECA staff reviewed and approved the proposed methodology: Natalie Donahue, Chief of Evaluation (202-632-6193); Christopher Schwab (202-632-6350); and Michele Peters (202-632-6426). Dexis Consulting Group (Dexis) is the contractor selected to carry out the evaluation. Dexis’ technical evaluation experts developed the original design in response to the solicitation; Senior Director Natasha Hsi (202-625-9444, ext. 89), Technical Advisor Amun Nadeem (202-625-9444, ext. 85), and evaluation Team Leader Christine Allison (202-368-7541) reviewed and refined the proposed design upon receipt of the project files. Dexis’ evaluation team will collect and analyze the data on behalf of ECA.



[1] USDOL Bureau of Labor Statistics, May 2017 National Occupational Employment and Wage Estimates, https://www.bls.gov/oes/current/oes_nat.htm#25-0000, last updated March 30, 2018, accessed January 2, 2019.

[2] These rates likely overstate the actual cost, as many of the alumni are full-time students and may not be employed, and many of the parents may have lower pay rates than our estimate, but they seemed reasonable proxies.

[3] USDOL Bureau of Labor Statistics, Employer Costs for Employee Compensation, https://www.bls.gov/news.release/ecec.toc.htm, last modified December 14, 2018, accessed January 2, 2019.

[4] The discrepancy between the number of unique individual participants (5,143) and the totals in Exhibit 1 (5,390) is the result of 247 individuals receiving two scholarship awards.

[5] The difference between the summed total of respondents and the calculated percentage is due to rounding.

[6] The difference between the summed total of respondents and the calculated percentage is due to rounding.

