SUPPORTING STATEMENT FOR
PAPERWORK REDUCTION ACT SUBMISSION

TechGirls Evaluation
OMB Number 1405-XXXX



A. JUSTIFICATION

  1. Why is this collection necessary and what are the legal statutes that allow this?

The Department of State’s Bureau of Educational and Cultural Affairs (ECA) regularly monitors and evaluates its programs through the collection of data about program accomplishments, which enables program staff to assess the impact of its programs, identify where improvements may be necessary, and modify or plan future programs. ECA is currently conducting an evaluation of the TechGirls program. TechGirls enables students aged 15 to 17 to gain exposure to a range of careers in science, technology, engineering, and mathematics (STEM) through a month-long summer scholarship program in the United States. The program includes a computer programming bootcamp, leadership skills development, job shadows with American women in STEM fields, and home stays with U.S. families. In addition to exposure to careers in STEM and related educational pathways, participants gain an understanding of the United States and its culture and create a network of STEM-focused alumnae. As the TechGirls program has been implemented for almost 10 years, ECA is conducting this evaluation to determine the extent to which the program is achieving its long-term goals. To do so, ECA has contracted Dexis Consulting Group to conduct surveys with alumnae, implementing organizations’ program staff, host families, and job shadow hosts.



Legal authorities and administrative requirements that necessitate the collection of these data can be found below:

  1. Government Performance and Results Act of 1993 (GPRA)

  2. Government Performance and Results Modernization Act of 2010 (GPRAMA)

  3. Mutual Educational and Cultural Exchange Act of 1961, as amended, 22 U.S.C. 2451 et seq (also known as the Fulbright-Hays Act)

  2. What business purpose is the information gathered going to be used for?

The primary users of the collected data will be ECA’s evaluation and program staff and its implementing partners. The information collected will be used to inform any beneficial program adjustments to strengthen the utility and cost-effectiveness of the program. The final report will also be made available to the public as part of ECA’s responsibility to be accountable for the use of its funds and performance of its mission. The ECA Evaluation Division, in partnership with Dexis Consulting Group (Dexis) who is conducting the evaluation, will be responsible for collecting and analyzing the data.

  3. Is this collection able to be completed electronically (e.g. through a website or application)?

Yes, data collection will take place electronically. The surveys will be managed by Dexis on a single survey platform, SmartSurvey. SmartSurvey provides functionality that will minimize the time required both to administer and to respond to the survey, including:

  • Individualized links to the survey for easy tracking of response rate by key categories;

  • Automated reminders to only those who have yet to complete the survey;

  • Strong validation measures to reduce errors and generate cleaner raw data;

  • Quick access dashboards for initial results; and

  • Options to configure it to be HIPAA and NIST compliant.

The survey was tested on mobile platforms (both iOS and Android) to ensure that all questions display properly whether viewed on a computer or a handheld device. We expect that all submissions will be collected electronically.
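SmartSurvey provides the individualized links, completion tracking, and targeted reminders described above as built-in features. Purely as an illustration of the underlying bookkeeping (and not of the SmartSurvey API), the minimal Python sketch below shows how unique links might be assigned and response rates tracked by respondent category; the respondent records and base URL are hypothetical.

```python
import secrets
from collections import defaultdict

# Hypothetical respondent records; in practice these come from the verified contact lists.
respondents = [
    {"email": "alumna1@example.org", "category": "Alumnae"},
    {"email": "host1@example.org", "category": "Host Family"},
    {"email": "shadow1@example.org", "category": "Job Shadow Host"},
]

BASE_URL = "https://survey.example.org/techgirls"  # placeholder, not the actual survey URL


def assign_links(people):
    """Attach a unique, hard-to-guess token and survey link to each respondent."""
    for person in people:
        person["token"] = secrets.token_urlsafe(8)
        person["link"] = f"{BASE_URL}?t={person['token']}"
        person["completed"] = False
    return people


def response_rate_by_category(people):
    """Summarize the completion rate for each respondent category."""
    totals, completed = defaultdict(int), defaultdict(int)
    for person in people:
        totals[person["category"]] += 1
        completed[person["category"]] += int(person["completed"])
    return {cat: completed[cat] / totals[cat] for cat in totals}


def pending_reminders(people):
    """Return the addresses of respondents who have not yet completed the survey."""
    return [p["email"] for p in people if not p["completed"]]


if __name__ == "__main__":
    roster = assign_links(respondents)
    roster[0]["completed"] = True  # simulate one completed response
    print(response_rate_by_category(roster))
    print(pending_reminders(roster))
```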

  4. Does this collection duplicate any other collection of information?

This will not be a duplication of effort. The purpose of the data collection, and therefore the focus of the questions asked, is to understand the program’s longer-term outcomes on participants’ academic and career development, as well as the effectiveness of the networks formed by alumnae upon their return home. ECA has not collected these data from these stakeholders in the past.

  5. Describe any impacts on small business.

We do not expect there to be any impacts on small businesses.

  6. What are the consequences if this collection is not done?

Approval is being sought for a one-time data collection. As the TechGirls program has been operating since 2012, ECA deems it critical to conduct an independent evaluation to capture the impacts of the program and to determine whether or not any adjustments to the program are necessary to ensure that it is meeting its long-term goals. Absent this data collection, ECA cannot fully answer questions about the long-term benefits of the program (or the lack thereof).

  7. Are there any special collection circumstances?

This data collection involves no special circumstances, as it is a one-time data collection and does not require submission of any information that is not OMB-approved. Consent procedures include obtaining consent for use of the data collected, and no proprietary information will be collected.

  8. Document publication (or intent to publish) a request for public comments in the Federal Register

The 60-day Federal Register Notice was published on January 4, 2021 (86 FR 178). No comments were received during that period. The Department will publish a notice in the Federal Register soliciting public comment for a period of 30 days, with comments directed to OMB.

  9. Are payments or gifts given to the respondents?

No payments or gifts are proposed for respondents.

  10. Describe assurances of privacy/confidentiality

ECA and its external contractors follow all procedures and policies stipulated under the Privacy Act of 1974, as amended. In addition, each survey includes the following language:

Please note that your participation in this survey is voluntary, and you are free to end the survey at any time. By clicking the “Consent and enter survey” button below, you are consenting to the following terms: 

  • Your participation in this evaluation is voluntary. We do not anticipate that participating in this evaluation will result in any risks or direct benefit to you. However, your inputs may lead to recommendations that benefit the TechGirls program—and, thereby, the general public. You may skip any questions you are not comfortable answering.

  • The information that you provide in the survey will be used to write a report. This report will be shared with the U.S. Department of State and other stakeholders for comment and will eventually be made public. Any responses you provide may be reported in the final report as part of the anonymized aggregated quantitative analysis or the de-identified qualitative analysis from open-ended responses. 

  • The U.S. government and its contractors will take reasonable measures to protect privacy data, personally identifiable information, and other sensitive data obtained from the survey.

  • Data about your post-program connections with TechGirls will be used to generate maps of the linkages within and across countries.

  • The data you provide may be reanalyzed at a later date for a follow-up study or other purpose as approved by the U.S. Department of State.  The data may be made available to third parties as required by law.

  • You may withdraw your consent at any time by contacting [email protected].



If you have any questions about this survey or the TechGirls evaluation more broadly, you can contact the Dexis evaluation team at [email protected].  

CONSENT TO PARTICIPATE

By clicking the button to enter the survey below, you are giving your consent to participate in this evaluation. If you do not wish to participate, please click the exit survey link below.

In line with ECA policy, individual names will not be reported, but responses will be used in the aggregated analysis and the production of sociograms. As noted above, however, the consent statement for this evaluation includes consent for future use of the data, which will allow ECA to reanalyze the data at a later time if deemed useful. De-identified data files will be shared by Dexis with ECA for this purpose at the end of the evaluation.
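The sociograms and linkage maps referenced above will be produced by the evaluation team from the de-identified connection data. As an illustrative sketch only (this statement does not specify the tooling or data fields), the reported connections could be represented as a graph, for example with the networkx library in Python; the respondent IDs and countries below are invented for the example.

```python
import networkx as nx

# Hypothetical de-identified connection records: (respondent ID, contact ID, (country, country)).
connections = [
    ("A01", "A07", ("Jordan", "Tunisia")),
    ("A01", "A12", ("Jordan", "Morocco")),
    ("A07", "A12", ("Tunisia", "Morocco")),
]

G = nx.Graph()
for src, dst, (src_country, dst_country) in connections:
    G.add_node(src, country=src_country)
    G.add_node(dst, country=dst_country)
    G.add_edge(src, dst)

# Simple summaries that feed a sociogram: who is most connected, and how many
# ties cross country lines rather than staying within one country.
degree = dict(G.degree())
cross_country = sum(
    1 for u, v in G.edges() if G.nodes[u]["country"] != G.nodes[v]["country"]
)
print(degree)
print(f"Cross-country ties: {cross_country} of {G.number_of_edges()}")
```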

  11. Are any questions of a sensitive nature asked?

There are no questions that have been deemed to be of a sensitive nature on this survey.

  12. Describe the hour time burden and the hour cost burden on the respondent needed to complete this collection

The total estimated hour burden for this data collection is 148 hours, broken down as follows in Table 1.

Table 1. Hour Time Burden for TechGirls Evaluation Respondents

Respondent Instrument | Estimated Number of Respondents | Average Time per Response | Total Estimated Burden Time
Alumnae Survey | 160 (75% response rate) | 46 minutes | 122.6 hours
Host Family Survey | 30 (50% response rate) | 29 minutes | 14.5 hours
Job Shadow Host Survey | 21 (50% response rate) | 16 minutes | 5.6 hours
Implementing Partner Staff Survey | 20 (50% response rate) | 16 minutes | 5.3 hours
Total Estimated Burden Time | | | 148 annual hours
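The burden hours in Table 1 are the products of the respondent counts and average response times, converted to hours. The short sketch below reproduces that arithmetic; the table's figures appear to be truncated to one decimal place, which the sketch mirrors.

```python
# Respondent counts and average response times (minutes) from Table 1.
instruments = {
    "Alumnae Survey": (160, 46),
    "Host Family Survey": (30, 29),
    "Job Shadow Host Survey": (21, 16),
    "Implementing Partner Staff Survey": (20, 16),
}

total_hours = 0.0
for name, (count, minutes_each) in instruments.items():
    total_minutes = count * minutes_each
    # Convert to hours, keeping one (truncated) decimal place, as in the table.
    hours = (total_minutes * 10 // 60) / 10
    total_hours += hours
    print(f"{name}: {hours} hours")

print(f"Total estimated burden: {total_hours:.0f} hours")  # 148 hours
```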

The average times were calculated from the pilot test times. Each survey instrument was pilot tested with a set of respondents, and the mean time is reported above. The range of response times is shown in the table below.

Table 2. Pilot test response time ranges by data collection instrument

Data Collection Instrument | Number of Pilot Tests | Shortest Response Time (minutes) | Longest Response Time (minutes)
Alumnae Survey | 9 | 37 | 147
Host Family Survey | 5 | 9 | 45
Job Shadow Host Survey | 5 | 10 | 32
Implementing Partner Staff Survey | 0 | -- | --

Given the very small numbers of participants in each category, only a few pilot tests were conducted for each instrument. The Alumnae Survey was tested with only nine participants so as not to reduce the pool of respondents for the main survey. The greatest variance in the pilot testing was in the Alumnae Survey, which was due to the length of time it took some participants to retrieve information about their ongoing contacts with peers and mentors. Additionally, the survey allows participants to save their responses and return later, and those with the longest response times may have done so. Additional variance resulted from the length of the open-ended answers provided in several sections: those with longer open-ended answers had longer response times.

The Host Family and Job Shadow Host Surveys were both tested with five respondents, some over the phone and some self-administered online, to test both modalities. The responses collected over the phone also allowed the team to ask respondents whether any questions were unclear or missing response categories. No questions were unclear, but in one case additional response categories were identified. There was significant variance in response times for these surveys as well, a portion of which reflected the modality (those completed online had shorter response times than those completed over the phone) and the number of times the respondents had hosted program participants: those who hosted in more years reported on interactions with more participants. In addition, some hosts were more engaged with the program than others, and those who seemed more engaged provided longer open-ended responses than those whose connections to the program seemed more tenuous.

The Implementing Partner Staff Survey was not pilot tested with a separate group, as it is identical to the Job Shadow Host Survey with the exception of two questions that were excluded for the staff. The time estimates and burden reported are based on the responses to the pilot testing of the Job Shadow Host Survey.

Time Cost to Respondents

Most respondents (alumnae, host families, and job shadow hosts) will respond as individuals, and the cost to them is the opportunity cost of the time spent responding to the survey rather than undertaking paid work. Therefore, the cost estimate for these groups is based on national wages from the Bureau of Labor Statistics’ May 2019 National Occupational Employment and Wage Estimates1 (as alumnae are spread across the country). As alumnae are students or entry-level professionals, we used the “Life, Physical, and Social Science Technicians” category at a rate of $25.93 (2019 mean hourly wage of $25.17 inflated by 3% for 2020). For host families, who likely work across a range of fields, we used the “General and Operations Managers” category as a proxy, at a rate of $60.92 (2019 mean hourly wage of $59.15 inflated by 3% for 2020).2

We anticipate that the job shadow hosts are senior in their STEM careers and, using the “Computer and Information Research Scientists” category as a proxy, calculated their hourly rate to be $63.09 (2019 mean hourly wage of $61.28 inflated by 3% for 2020).


We also expect that the implementing partner program staff may respond as part of their work duties; as such, the cost burden for this group was estimated using average total compensation rates obtained from the Bureau of Labor Statistics’ June 2020 Employer Costs for Employee Compensation Summary.3 For program staff, we used the private industry “Education and Health Services” group category, with a total hourly compensation of $42.80.

Table 3. Estimate of Respondent Hour and Cost Burden

Respondent Group | Total Estimated Hours | Hourly Cost Rate | Total Cost Burden
Alumnae | 122.6 | $25.93 | $3,179.02
Host Families | 14.5 | $60.92 | $883.34
Job Shadow Hosts | 5.6 | $63.09 | $353.30
Implementing Partner Program Staff | 5.3 | $42.80 | $226.84
Total | 148 | | $4,642.50
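The cost figures in Table 3 are the burden hours from Table 1 multiplied by the hourly rates described above; the following sketch is a minimal cross-check of that arithmetic.

```python
# Burden hours and hourly cost rates from Table 3.
groups = {
    "Alumnae": (122.6, 25.93),
    "Host Families": (14.5, 60.92),
    "Job Shadow Hosts": (5.6, 63.09),
    "Implementing Partner Program Staff": (5.3, 42.80),
}

total_cost = 0.0
for name, (hours, rate) in groups.items():
    cost = round(hours * rate, 2)
    total_cost += cost
    print(f"{name}: ${cost:,.2f}")

print(f"Total cost burden: ${total_cost:,.2f}")  # $4,642.50
```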

  13. Describe the monetary burden to respondents (out of pocket costs) needed to complete this collection.

There are no costs incurred by respondents.

  14. Describe the cost incurred by the Federal Government to complete this collection.

The estimated cost to the USG for the TechGirls Evaluation as related to this collection is $78,816.96, as shown in Table 4 below. This estimate includes all direct and indirect costs of the design, data collection, and analysis activities. The Personnel line in Table 4 reflects the fully burdened rates of the contractor (Dexis) personnel who conduct the data collection.

The wage rates of Federal employees at DOS were estimated using Step 1 rates for Grade 13 ($49.19/hour at 40 hours) and Grade 14 ($58.13/hour at 120 hours) of the General Schedule in the Washington-Baltimore-Arlington, DC-MD-VA-WV-PA locality area. The Department multiplied the hourly wage rate by 2 to account for a fringe benefits rate of 69 percent and an overhead rate of 31 percent.


Table 4. Total Cost to Federal Government

Cost Item | Total
Federal staff costs: GS-14, Step 1 equivalent – $58.13/hour @ estimated 40 hours; Contractor – $72.94/hour @ estimated 120 hours | $11,078.00
Estimate of Respondent Hour Cost and Burden (Question 12) | $4,642.50
Personnel4 | $59,136.58
Travel5 | $1,460.00
Equipment/Supplies6 | $1,500.00
Total Direct Costs | $77,817.08
Indirect Charges7 | $999.88
Total | $78,816.96

  15. Explain any changes/adjustments to this collection since the previous submission

This is a new collection.

  16. Specify if the data gathered by this collection will be published.

Once data have been collected and analyzed, the evaluation contractor will produce a final report for publication and, to supplement it, a summary briefing and infographic for ECA’s use, which will be published on the ECA Evaluation Division’s website: https://eca.state.gov/impact/eca-evaluation-division. However, neither the raw data nor individual respondents’ data will be published with attribution.

  17. If applicable, explain the reason(s) for seeking approval to not display the OMB expiration date. Otherwise, write “The Department will display the OMB expiration date.”

The Department will display the OMB expiration date.

  18. Explain any exceptions to the OMB certification statement below. If there are no exceptions, write “The Department is not seeking exceptions to the certification statement”.

The Department is not seeking exceptions to the certification statement.

B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

This collection will employ statistical methods for the alumnae, host family, job shadow host, and implementing partner staff surveys. Additional information is provided below.

1. Potential Respondent Universe and Respondent Selection Methods

The respondent universe for this evaluation includes the following: the 214 participants in the TechGirls program between 2012 and 2019, the 39 implementing partner program staff who had daily contact with the participants, the 60 host families who hosted 219 overseas TechGirls participants, and the 41 job shadow hosts who hosted both U.S. and overseas participants.

The potential respondent universe of alumnae includes 214 individuals who received and accepted TechGirls scholarships between 2012 and 2019. Because we would like to ensure a minimum response rate of 75%, a census was determined to be the most effective way to achieve a sample size that will help ECA understand the impacts of its program.

The second group to be surveyed is implementing partner program staff (39) who had significant ongoing contact with the participants during the course of the program. We are proposing a census of these staff, and we expect a response rate of 50% as many of them are no longer with the implementing partner that employed them when they worked on the program.

The third group to be surveyed is the 60 host families who hosted overseas TechGirls participants. Because the participants may have shared their experiences with those families, surveying them will allow us to triangulate some of the data received from the alumnae and identify additional program areas for fine-tuning. We propose a census of host families, as we anticipate a lower response rate among this group (50%).

The final group to be surveyed is the 41 job shadow hosts who provided career awareness-raising and, in some cases, guidance for the participants. Some of the alumnae remained in contact with their job shadow hosts after the end of the program, and the hosts will have helpful perspectives on the value of that program component. We propose a census of job shadow hosts as well, as we anticipate a response rate around 50%.

2. Procedures for the Collection of Information

For the surveys, a census will be taken, as we anticipate that some of the contact data may be out of date, and it may not be possible to update it prior to data collection. In addition, the numbers of participants are fairly small, and in order to obtain any significant results, almost all participants in each stakeholder group will need to respond. Thus, a census is the most practical approach for each survey.

The expected alumnae response rate will be sufficient to estimate overall program impacts at a 90 percent confidence level with a confidence interval of 10 percentage points and 80% power. The target set for the host family response rate will also be sufficient to estimate program impacts at a 90 percent confidence level with a confidence interval of 10 percentage points and 80% power. The expected response rates for the program staff and job shadow hosts, however, will only enable us to estimate program impacts at a 90 percent confidence level with a confidence interval of 14 percentage points and 80% power. Low response rates among these groups may result in samples too small to produce significant results. (See Section 3 below on Maximization of Response Rates.)
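For context on these precision targets, the margin of error for a proportion estimated from a census with partial response can be approximated with the standard formula that includes a finite population correction. The sketch below illustrates that calculation for the alumnae survey using a conservative 50/50 proportion; it is an illustration of the general formula only, not a reproduction of the evaluation team's power calculations.

```python
import math


def margin_of_error(population, respondents, z=1.645, p=0.5):
    """Approximate margin of error for a proportion, with finite population correction.

    z = 1.645 corresponds to a 90 percent confidence level, and p = 0.5 is the
    most conservative assumption about the underlying proportion.
    """
    sampling_error = z * math.sqrt(p * (1 - p) / respondents)
    fpc = math.sqrt((population - respondents) / (population - 1))
    return sampling_error * fpc


# Alumnae survey: universe of 214, expected 75% response (about 160 completes).
# Roughly ±3.3 percentage points, comfortably within the 10-point target above.
print(f"{margin_of_error(214, 160):.3f}")
```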

Responses to surveys will be collected via online/mobile survey platforms (SmartSurvey). For more information about contact strategies, see Section 3 below.

3. Maximization of Response Rates and Nonresponse

Expected response rates and nonresponse will vary by data collection instrument. Each is discussed in a separate section below.

Alumnae Survey

ECA considers nonresponse to be a risk to this evaluation for the surveys. As some of the alumnae contact information may be out of date, one of the evaluation contractor’s early tasks is to verify and update the alumnae contact list. This will be done primarily through direct phone calls to the alumnae’s homes. Using updated information where available, and the most recent information where no updated information exists, the evaluation team will send individualized survey links to the alumnae for their completion. The invitation to complete the survey will include the estimated completion time, which is short and should encourage participation. As noted earlier, the surveys will be designed to be viewed on mobile devices, which may also increase willingness to complete the survey.

As much as possible, response categories were limited to yes/no and no more than four-point scale options to reduce the time required to respond to the questions. The survey was pilot tested (see Section 4, below) to ensure a high degree of clarity and consistency in the questions.

The survey will be open for four weeks, and after the initial invitation, anyone who has not completed the survey will receive email reminders at two weeks, one week, four days, and one day prior to the survey closing to encourage additional responses. Those who have completed the surveys will not receive the reminders.

If the response rate is below 60% two weeks after the survey is released, the evaluation contractor will contact alumnae by phone and inquire whether the alumnae have had any difficulty accessing or completing the survey. If so, the evaluation contractor will administer the survey over the phone.
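The reminder and follow-up schedule described above reduces to simple date arithmetic relative to the survey close date. The sketch below, using a hypothetical close date, shows the offsets at which non-respondents would be reminded and the two-week response-rate check that triggers phone follow-up; the same logic applies to the other surveys, with a 35% threshold.

```python
from datetime import date, timedelta

SURVEY_CLOSES = date(2021, 6, 30)  # hypothetical close date, for illustration only
SURVEY_OPENS = SURVEY_CLOSES - timedelta(weeks=4)  # surveys are open for four weeks

# Email reminders go to non-respondents at two weeks, one week, four days,
# and one day before the survey closes.
REMINDER_OFFSETS = [timedelta(weeks=2), timedelta(weeks=1),
                    timedelta(days=4), timedelta(days=1)]
reminder_dates = [SURVEY_CLOSES - offset for offset in REMINDER_OFFSETS]
print("Reminder dates:", [d.isoformat() for d in reminder_dates])


def needs_phone_follow_up(completed, invited, today, threshold=0.60):
    """Two weeks after release, phone follow-up begins if the response rate is below the threshold."""
    two_weeks_in = SURVEY_OPENS + timedelta(weeks=2)
    return today >= two_weeks_in and (completed / invited) < threshold


print(needs_phone_follow_up(completed=80, invited=160, today=date(2021, 6, 17)))
```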

Implementing Partner Staff Survey

ECA anticipates that current implementing partner staff will be relatively easy to contact. Contact information for former staff, however, may be out of date, and the evaluation contractor will attempt to verify and update contact information using LinkedIn and other public sources. Implementing partner staff will receive an email (or, where contact information does not include an email address, a phone call) notifying them of the evaluation and asking them to participate in it. Using updated information where available, and the originally provided information where no updated information exists, the evaluation team will send individualized survey links to the implementing partner staff for their completion. The invitation to complete the survey will include the estimated completion time, which is short and should encourage participation. As noted earlier, the surveys will be designed to be viewed on mobile devices, which may also increase willingness to complete the survey.

As much as possible, response categories were limited to yes/no and no more than four-point scale options to reduce the time required to respond to the questions. This survey was not pilot tested separately, as it is identical to the Job Shadow Host Survey, minus two questions that were not relevant to the staff.

The survey will be open for four weeks, and after the initial invitation, anyone who has not completed the survey will receive email reminders at two weeks, one week, four days, and one day prior to the survey closing to encourage additional responses. Those who have completed the surveys will not receive the reminders.

If the response rate is below 35% two weeks after the survey is released, the evaluation contractor will contact implementing partner staff by phone and inquire whether they have had any difficulty accessing or completing the survey. If so, the evaluation contractor will administer the survey over the phone.

Host Family Survey

ECA anticipates that host family contact information may be more out of date than the alumnae contact information, as some families have had no contact with the program since its earliest years. Host families will receive an email (or, where contact information does not include an email address, a phone call) notifying them of the evaluation and asking them to participate in it. Using updated information where available, and the originally provided information where no updated information exists, the evaluation team will send individualized survey links to the host families for their completion. The invitation to complete the survey will include the estimated completion time, which is short and should encourage participation. As noted earlier, the surveys will be designed to be viewed on mobile devices, which may also increase willingness to complete the survey.

As much as possible, response categories were limited to yes/no and no more than four-point scale options to reduce the time required to respond to the questions. The survey was pilot tested (see Section 4, below) to ensure a high degree of clarity and consistency in the questions.

The survey will be open for four weeks, and after the initial invitation, anyone who has not completed the survey will receive email reminders at two weeks, one week, four days, and one day prior to the survey closing to encourage additional responses. Those who have completed the surveys will not receive the reminders.

If the response rate is below 35% two weeks after the survey is released, the evaluation contractor will contact host families by phone and inquire whether they have had any difficulty accessing or completing the survey. If so, the evaluation contractor will administer the survey over the phone.

Job Shadow Host Survey

ECA anticipates that job shadow host contact information may be out of date, and the evaluation contractor will attempt to verify and update contact information using LinkedIn and other public sources. Job shadow hosts will receive an email (or, where contact information does not include an email address, a phone call) notifying them of the evaluation and asking them to participate in it. Using updated information where available, and the originally provided information where no updated information exists, the evaluation team will send individualized survey links to the job shadow hosts for their completion. The invitation to complete the survey will include the estimated completion time, which is short and should encourage participation. As noted earlier, the surveys will be designed to be viewed on mobile devices, which may also increase willingness to complete the survey.

As much as possible, response categories were limited to yes/no and no more than four-point scale options to reduce the time required to respond to the questions. The survey was pilot tested (see Section 4, below) to ensure a high degree of clarity and consistency in the questions.

The survey will be open for four weeks, and after the initial invitation, anyone who has not completed the survey will receive email reminders at two weeks, one week, four days, and one day prior to the survey closing to encourage additional responses. Those who have completed the surveys will not receive the reminders.

If the response rate is below 35% two weeks after the survey is released, the evaluation contractor will contact job shadow hosts by phone and inquire whether they have had any difficulty accessing or completing the survey. If so, the evaluation contractor will administer the survey over the phone.

4. Tests of Procedures or Methods to be Undertaken

To ensure maximum clarity and accuracy in data collection, the evaluation team pilot tested each of the instruments with a small number of respondents representing various categories of respondents. The table below shows the number of pilot test respondents for each instrument.

Table 5. Pilot Tests of TechGirls Data Collection Instruments

Instrument | Number of Pilot Test Respondents | Minimum Response Time (minutes) | Maximum Response Time (minutes)
Alumnae Survey | 9 | 37 | 147
Host Family Survey | 5 | 9 | 45
Job Shadow Host Survey | 5 | 10 | 32
Implementing Partner Staff Survey | 0 | -- | --

Survey pilot test respondents completed the online survey while on the phone with one of the evaluation team members, enabling the team to capture real-time feedback and make changes quickly (for example, to the instructions) between pilot tests. The respondents completed each section of the survey while the evaluation team timed their responses. The team then reviewed each section with the respondent using a cognitive interviewing approach, identifying any concepts or questions that were misunderstood or unclear, any additional guidance or response parameters that should be included in the instructions to assist with clarity and recall, how burdensome the questions were, and any additional response categories needed. All of the questions on the draft instruments were understood clearly, but in one case additional response categories were identified and included in the final versions of the instruments to capture the range of responses.

5. Relevant Contacts

This evaluation was contracted through a competitive process. ECA staff, including Natalie Donahue, Chief of Evaluation (202-632-6193), and Sarah Shields (202-632-9261), reviewed and approved the proposed methodology. Dexis Consulting Group (Dexis) is the contractor selected to carry out the evaluation. Dexis’ evaluation team will collect and analyze the data on behalf of ECA.



1 USDOL Bureau of Labor Statistics, May 2019 National Occupational Employment and Wage Estimates, https://www.bls.gov/oes/current/oes_nat.htm#19-0000, last updated March 31, 2020, accessed September 8, 2020.

2 These rates likely overstate the actual cost, as many of the alumnae are full-time students and may not be employed, and many of the host families may have lower pay rates than our estimate, but these seemed reasonable proxies.

3 USDOL Bureau of Labor Statistics, Employer Costs for Employee Compensation Summary, June 2020, https://www.bls.gov/news.release/pdf/ecec.pdf.

4 Per the approved project budget with firm-fixed price labor, this is the fully burdened personnel rate.

5 Should travel be permitted, evaluation team personnel may travel to conduct data collection activities in-person. This amount is based on a 3-day trip to Blacksburg, VA.

6 This includes a license for the SmartSurvey platform and STATA data analysis software.

7 Indirect charges applied only to travel and equipment/supplies lines, as personnel line is fully burdened.


