Request for Approval Under Generic Clearance for CDC Fellowship Programs Assessments (OMB Control Number: 0920-1163)

TITLE OF INFORMATION COLLECTION: Division of Scientific Education and Professional Development (DSEPD) Pan-Fellowship Stakeholder Survey


Instruction: This form should be completed by the primary project representative at the CIO sponsoring the genIC, after consultation with the Center, Institute, or Office (CIO) PRA contact. An FTE is required to serve as the primary investigator for all information collection requests. The completed form should be routed from the PRA contact to the DSEPD Information Collection Request Liaison, Fátima Coronado, at [email protected].

Instruction: Please provide no more than two sentences for each item in this box.

Goal of the study: The purpose of this data collection is to inform DSEPD fellowship service improvement and ongoing program management activities.


Intended use of resulting data: Results will allow DSEPD to efficiently identify 1) areas for fellowship program improvement to help programs achieve intended outcomes and 2) program stakeholder public health training needs that fellowship programs can address. Specifically, DSEPD staff will use the results to inform recruitment of fellows and host sites; update fellowship curricula, fellowship competencies, and required activities to address priority training needs; ensure fellows’ service benefits host sites; and improve the overall management and delivery of DSEPD-managed fellowship programs.


Methods to be used to collect data: Web-based survey with quantitative and qualitative items.


Subpopulation to be studied: Up to 1,541 alumni who completed DSEPD-managed fellowship programs within the previous 10 years, 249 supervisors of fellows who completed DSEPD-managed fellowship programs within the previous 5 years, and 91 individuals who were both fellows and supervisors during the aforementioned time periods. Respondents might work in governmental agencies, nongovernmental organizations, or other settings.


How data will be analyzed: Descriptive analysis will be used for quantitative items and content analysis for qualitative items. Results will be reported in aggregate and stratified by fellowship program and type of respondent (i.e., alumni, supervisor, or both).


CIO or Division PRA Contact
Name: Isabella Hardwick
Email: [email protected]
Phone: 404.498.0241


Project Representative

Instruction: Complete the fields below with information about the project lead.

Name: Sally Honeycutt

Title: Health Scientist

Affiliation (CIO/Division): Center for Surveillance, Epidemiology and Laboratory Services/ Division of Scientific Education and Professional Development

Email: [email protected]

Phone: 404.498.1917

Abbreviated Supporting Statement A


DETERMINE IF YOUR INVESTIGATION IS APPROPRIATE FOR THIS GENERIC CLEARANCE MECHANISM

Instruction: Before completing and submitting this form, first determine if the proposed investigation is appropriate for the Data Collection for CDC Fellowship Programs Generic ICR mechanism. Complete the checklist below. If you select “yes” to all criteria in Column A, the Data Collection for CDC Fellowship Programs Generic ICR mechanism can be used. If you select “yes” to any criterion in Column B, the Data Collection for CDC Fellowship Programs Generic ICR mechanism cannot be used.


Column A

  • Information gathered is intended for CDC fellowship service improvement and program management purposes. [X] Yes [ ] No

  • Data collection will be completed in 90 days or less. [X] Yes [ ] No

  • No incentive (e.g., money, reimbursement of expenses, token of appreciation) will be provided to participants. [X] Yes [ ] No

Column B

  • The investigation is conducted to contribute to generalizable knowledge. [ ] Yes [X] No

  • Data collection is expected to require greater than 90 days. [ ] Yes [X] No

  • An incentive (e.g., money, reimbursement of expenses, token of appreciation) will be provided to participants. [ ] Yes [X] No


Did you select “yes” to all criteria in Column A? YES


If so, the Data Collection for CDC Fellowship Programs Generic ICR might be appropriate for your investigation. You may proceed with this form.


Did you select “yes” to any criterion in Column B? NO


If so, the Data Collection for CDC Fellowship Programs Generic ICR is not appropriate for your investigation. Stop completing this form now and consult your PRA contact about alternatives.



PURPOSE

Instruction: Provide a brief description of the collection purpose and how it will be used. If this is part of a larger study or effort, please include this in your explanation.


The Division of Scientific Education and Professional Development (DSEPD) is responsible for 13 fellowship programs.* Five of these programs are managed by partner organizations working under cooperative agreements from DSEPD. DSEPD manages the following seven fellowship programs that are included in the proposed data collection:

  1. CDC Steven M. Teutsch Prevention Effectiveness Fellowship (PE)

  2. Epidemic Intelligence Service (EIS)

  3. Epidemiology Elective Program (EEP)

  4. Hubert Global Health Fellowship (Hubert)

  5. Presidential Management Fellowship (PMF) Program at CDC

  6. Preventive Medicine Residency and Fellowship (PMR/F)

  7. Public Health Informatics Fellowship Program (PHIFP)


These fellowships are experiential service programs that provide robust hands-on learning and enhance skills in leadership, management, policy, and public health sciences (e.g., epidemiology, surveillance, informatics, and prevention effectiveness). Programs are competency-based and require fellows to complete activities designed to prepare them for essential work performed by public health professionals. Fellows are assigned to a host site, where they work under the direction of a designated supervisor. Host sites include federal, state, and local public health departments and other agencies or organizations that conduct public health programs. Hubert and EEP are short-term programs (6–12 weeks) for medical and veterinary students to introduce public health as a career choice. The other five programs are one- or two-year career fellowships for post-graduate degree professionals establishing public health careers. The fellowship programs are designed to contribute to the DSEPD mission to improve health outcomes through a competent, sustainable, and empowered public health workforce.


The proposed data collection is a one-time collection focused on gathering information from alumni who completed fellowship programs in the previous 10 years (i.e., graduated from the program 2007–2016) and supervisors of alumni who completed fellowship programs in the previous 5 years (i.e., graduated 2012–2016). Because of the short-term nature of EEP and Hubert, alumni who completed only these fellowship programs will receive a subset of questions, and their supervisors are excluded from this data collection. Therefore, the set of questions for supervisors includes only five fellowship programs (PE, EIS, PMF, PMR/F, and PHIFP).


Fellowship alumni and host site supervisors are important program stakeholders, and it is imperative for DSEPD to understand the extent to which their needs are met. This data collection focuses on stakeholder assessment of the training DSEPD fellowship programs should provide and of intermediate program outcomes (i.e., how well fellowship programs prepare participants for careers in public health and benefit host sites). These outcomes cannot be assessed by participants and supervisors during their fellowship programs because they take time to occur. The proposed data collection is designed to answer the following three assessment questions:


  1. What are the public health workforce training needs our stakeholders see for DSEPD fellowship programs?

  2. To what extent do our fellowship programs (i.e., alumni and service provided) meet public health agency needs?

  3. How well did our fellowship programs prepare alumni for jobs and career progression in public health?


Results will allow DSEPD to efficiently identify 1) areas for fellowship program improvement and 2) training needs that DSEPD fellowship programs can address. Specifically, DSEPD staff will use the results to inform recruitment of fellows and host sites; update fellowship curricula, fellowship competencies, and required activities to address priority training needs; ensure fellows’ service provides benefit to host sites; and improve the overall management and delivery of DSEPD-managed fellowship programs.


This quick, low-burden assessment is instrumental in helping DSEPD learn about stakeholder perspectives and will yield immediate results that can quickly be used by program staff in multiple fellowship programs. This information is not available from any other source.


DESCRIPTION OF RESPONDENTS

Instruction: Provide a brief description of the group(s) targeted for this information collection. These groups must have experience with the program.

Check all that apply.

[ ] Potential applicants or applicants

[ ] Current fellows (nonfederal employees)

[X] Alumni

[X] Mentors or supervisors

[X] Alumni and mentors or supervisors

[ ] Employers of alumni

[ ] Other (describe): ____________________



TYPE OF COLLECTION

Instruction: Check all that apply.

[ ] Focus group

[ ] Face-to-face interview

[ ] Telephone interview

[ ] Self-administered hard copy questionnaire

[X] Self-administered Internet questionnaire

[ ] Self-administered electronic questionnaire (e.g., fillable form)

[ ] Other (describe): ____________________



CERTIFICATION

Instruction: Please read the certification carefully. If you incorrectly certify, the collection will be returned as improperly submitted or it will be disapproved.


I certify the following to be true:

  1. The collection is voluntary.

  2. The collection is low burden for respondents and low cost for the Federal Government.

  3. The collection is noncontroversial and does not raise issues of concern to other Federal agencies.

  4. Information gathered will be used primarily to inform programs of efficiency and effectiveness of fellowship programs and will not be used for the purpose of substantially informing influential policy decisions.

  5. The collection is targeted to the solicitation of opinions from respondents who have experience with the program or may have experience with the program in the future.

  6. With the exception of information needed to contact participants, personally identifiable information (PII) is collected only to the extent necessary and is not retained.

  7. If this genIC requires collections of race and ethnicity data, the questions are consistent with HHS policy and standard OMB classifications.

  8. A copy of the IRB approval or exemption determination with description of participation consent and secure collection, storage, and management of participant data and information is attached.

  9. A currently valid OMB control number and expiration date is displayed in the upper-right corner at the beginning of the data collection instrument.

  10. The following statement is displayed at the bottom of the first page of the data collection instrument or will be read to the participant prior to data collection: “Public reporting burden of this collection of information is estimated to average 8 minutes per response for most respondents and 16 minutes per response for respondents who have both completed and supervised a fellowship program, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. An agency may not conduct or sponsor, and a person is not required to respond to a collection of information unless it displays a currently valid OMB control number. Send comments regarding this burden estimate or any other aspect of this collection of information including suggestions for reducing this burden to CDC/ATSDR Reports Clearance Officer; 1600 Clifton Road NE, MS D-74 Atlanta, Georgia 30333; ATTN: PRA (0920-1163).”

    1. If the Privacy Act applies, the following statement is also included: “The Privacy Act applies to this information collection. The requested information is used toward assessment and continuous quality improvement of CDC fellowship activities and services. CDC will treat data/information in a secure manner and will not disclose, unless otherwise compelled by law.”

  11. A Part II Worksheet is included in this submission.


Certified by CDC Sponsoring Program Division or CIO PRA Oversight Official:


Name: Kate Glynn

Date of Certification: 04/21/2017

Email: [email protected]
Phone: 404.498.6169


Personally Identifiable Information

  1. Is personally identifiable information (PII) collected? [ ] Yes [ X ] No

  2. If Yes:

    1. Is the information that will be collected included in records that are subject to the Privacy Act of 1974?
      [ ] Yes [ ] No

    2. Please provide justification for collecting PII: _____________________

    3. Please describe efforts to use existing PII to avoid duplication (e.g., information from the Fellowship Management System [OMB No. 0920-0765], FedScope): ________________

    4. In advance of any data collection, the following statement will be provided directly to the participant (e.g., in a written statement on a survey tool prior to beginning a questionnaire, read to participant prior to interview): “The Privacy Act applies to this information collection. The requested information is used toward assessment and continuous quality improvement of CDC fellowship activities and services. CDC will treat data/information in a secure manner and will not disclose, unless otherwise compelled by law.”


Sensitive Questions

Instruction: If sensitive questions will be asked, provide justification and specific use.

There will be no sensitive questions included in this data collection.


BURDEN HOURS

Instruction: Complete Table 1 using the following column headings to calculate the burden hours for respondents.


  • Category of Respondents: Identify who you expect the respondents to be in terms of the following categories: (1) Potential applicants/applicants, (2) Current fellows (nonfederal employees), (3) Alumni, (4) Mentors or supervisors, (5) Employers of alumni, (6) Other (please describe).


  • Form Name: Include the type of data collection (e.g., “Electronic survey of fellowship applicants,” “Telephone interview of recent graduates”).


  • No. of Respondents: Provide an estimate of the number of respondents.


  • No. of Responses per Respondent: Provide the number of times the same respondent will be contacted for data/information collection.


  • Average Burden per Respondent (in hours): Provide an estimate of the amount of time required for a respondent to participate (e.g., time required to fill out a survey or participate in a focus group).


  • Total Burden Hours: Provide the total burden hours by multiplying as follows:
    ([No. of Respondents] x [No. of Responses per Respondent] x [Average Burden per Respondent]) in each row. Then total the rows.


We estimate approximately 862 nonfederal Alumni and 75 nonfederal Supervisors (937 total) will complete the Electronic Survey of DSEPD Fellowship Alumni and Supervisors. The estimated burden per response is 8 minutes and the total burden is 125 hours.


Approximately 10 nonfederal individuals qualify as both Alumni and Supervisors. The estimated burden per response is 16 minutes, and the total burden is 3 hours.


All information is collected electronically. The total number of nonfederal responses is 947, and the total estimated annualized burden is 128 hours.
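These totals follow directly from the respondent counts and per-response times above:

\[
937 \times 1 \times \tfrac{8}{60}\ \text{hours} \approx 125\ \text{hours}, \qquad
10 \times 1 \times \tfrac{16}{60}\ \text{hours} \approx 3\ \text{hours}, \qquad
125 + 3 = 128\ \text{hours}.
\]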




Table 1. Estimated Burden

Category of Respondent: Alumni or Supervisor (i.e., completed a DSEPD-managed fellowship within the previous 10 years or supervised a fellow who completed a DSEPD-managed fellowship within the previous 5 years)
Form Name: Electronic Survey of DSEPD Fellowship Alumni and Supervisors
No. of Respondents: 937
No. of Responses per Respondent: 1
Average Burden per Respondent (in hours): 8/60
Total Burden Hours: 125

Category of Respondent: Individuals who qualify as both Alumni and Supervisors
Form Name: Electronic Survey of DSEPD Fellowship Alumni and Supervisors
No. of Respondents: 10
No. of Responses per Respondent: 1
Average Burden per Respondent (in hours): 16/60
Total Burden Hours: 3

Totals: 947 respondents; 128 total burden hours



FEDERAL COST


Table 2. Estimated Cost to the Government

Staff or Contractor: GS-15 FTE (project oversight, technical assistance on data collection, analysis, reporting)
Average Hours: 10
Average Hourly Rate: $59.96
Total Cost: $599.60

Staff or Contractor: GS-13 FTE (data collection, analysis, reporting)
Average Hours: 40
Average Hourly Rate: $43.14
Total Cost: $1,725.60

Staff or Contractor: ORISE Evaluation Fellow (GS-9, step 1 equivalent; data collection, analysis, reporting)
Average Hours: 80
Average Hourly Rate: $25.01
Total Cost: $2,000.80

Total: 130 hours; $4,326.00

Link to U.S. Office of Personnel Management Pay Tables: https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/2016/general-schedule/.

Note: numbers reflect the estimated annual cost to the government.
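The total cost is the sum of estimated hours multiplied by the hourly rate for each staff category:

\[
(10 \times \$59.96) + (40 \times \$43.14) + (80 \times \$25.01)
= \$599.60 + \$1{,}725.60 + \$2{,}000.80 = \$4{,}326.00.
\]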


PROJECT SCHEDULE

Instruction: Provide an estimated schedule indicating start dates, allowing sufficient time for delays and unforeseen circumstances. Sample activities and time schedules are provided; please modify as needed.


Project Time Schedule

Activity: Determine whether collection of information in identifiable form (IIF) is needed
Time Schedule: At least 6 months prior to data collection to allow time to plan and collect IIF

Activity: Design methods and data collection instruments
Time Schedule: At least 5 months prior to data collection

Activity: IRB determination
Time Schedule: At least 4-5 months prior to data collection

Activity: Pilot test instrument
Time Schedule: At least 4 months prior to data collection

Activity: Develop genIC request
Time Schedule: At least 3-4 months prior to data collection

Activity: Submit genIC to ICRO (then ICRO into ROCIS)
Time Schedule: 3 months prior to data collection

Activity: Receive OMB approval for genIC
Time Schedule: At least 1 month prior to data collection

Activity: Implement data recruitment and collection
Time Schedule: As soon as genIC is approved or as indicated by the genIC data collection plan

Activity: Analyze data as planned
Time Schedule: Approximately within 3 months of close of data collection

Activity: Produce technical report and lay audience fact sheets
Time Schedule: Approximately within 6 months of close of data collection; communicate results and recommendations for improvement or action to leadership, program staff, and stakeholders

Activity: Submit findings for scientific publications, manuscript, or presentation (TBD)
Time Schedule: The project team will determine whether this step is appropriate based on data analysis. If appropriate, findings will be submitted 6 months or more from close of data collection.



Abbreviated Supporting Statement B


Selection of targeted respondents

Instruction: Please provide a description of how you plan to identify your potential group of respondents and how you will select them.


Respondents will consist of alumni who completed DSEPD-managed fellowship programs in the 10 years prior to the survey (i.e., graduated 2007–2016) and primary host site supervisors who supervised at least one fellow who completed a DSEPD-managed fellowship program in the 5 years prior to the survey (i.e., graduated 2012–2016). Some potential respondents are in both categories and will be asked about their experiences as alumni and as supervisors. Including both supervisors and alumni in one data collection reduces overall participant burden for those respondents in both groups, as certain questions will be asked of all respondents. Approximately half of our respondents (47%) are EIS program alumni (n=652), supervisors (n=158), or both (n=77).


Based on program records (including fellowship exit surveys, applicant contact information, and alumni contact lists), we estimate a maximum of 1,881 respondents, of whom 934 are expected to be federal employees and 947 nonfederal. Most respondents will be employees of CDC, other government agencies, or nongovernmental organizations (including academia and healthcare). No sampling will be employed.
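These counts are consistent with the subpopulation figures cited earlier in this request:

\[
1{,}541 + 249 + 91 = 1{,}881 = 934 + 947, \qquad
\frac{652 + 158 + 77}{1{,}881} = \frac{887}{1{,}881} \approx 47\%.
\]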


Administration of the instrument

Instruction: Identify how the information will be collected.

  1. How will you collect the information? (Check all that apply)

[X] Electronic (see Attachment A: Fellowship Survey Word Version and Attachment B: Fellowship Survey Web Version)

[ ] Telephone

[ ] In-person

[ ] Hard copy

[ ] Other, explain: ____________________


The following procedural steps will be followed to conduct the data collection:

  1. Fellowship program staff will provide contact lists of alumni and supervisors who meet the specified criteria for selection of targeted respondents. Contact lists will include name, email address, fellowship program, and type of respondent (i.e., alumni, supervisors, or both).

  2. The DSEPD Science Office will compile the fellowship program contact lists into a combined list and remove duplicate names (i.e., individuals who are alumni or supervisors of more than one fellowship program or who are both alumni and supervisor).

  3. One week before data collection begins, fellowship program staff will email their respective alumni and supervisors to inform them about the survey and to encourage participation (see Attachment C: Fellowship Survey Announcement).

  4. The DSEPD Science Office will send all selected respondents an email invitation to complete the web-based survey (developed in SurveyMonkey) (see Attachment D: Fellowship Survey Email Invitation).

  5. Respondents will have 15 business days to respond to the web-based survey.

  6. The DSEPD Science Office will send up to three reminder emails to non-responders encouraging participation prior to the survey close date (see Attachments E-G: Fellowship Survey Reminder Emails).

  7. The DSEPD Science Office will close the survey no more than 20 business days after initial administration. This request is for a single data collection that will end when the survey is closed.


  2. Will trained interviewers or facilitators be used? [ ] Yes [ ] No [X] N/A


Methods to maximize response

Instruction: Provide a brief description of the procedures planned to maximize response rates.


Although participation in this information collection is voluntary, the DSEPD Science Office will make every effort to maximize the response rate. The Science Office will collect data via a web-based survey instrument, which will allow respondents to complete and submit their responses electronically. This method was chosen to reduce the overall burden on respondents and allow respondents to complete the assessment at their own convenience. Importantly, the web-based survey allows the Science Office to use extensive skip patterns so that respondents will skip items that are not relevant or applicable. The Science Office designed the survey instrument to collect the minimum information necessary. The survey consists of a total of 36 items; however, most respondents will receive a subset of these items. Alumni will receive about 28 questions, and supervisors will receive about 14 questions. The actual number will vary by respondent (e.g., some items only apply to alumni of specific fellowship programs, or may be skipped based on responses to previous questions). Skip patterns are programmed into the electronic survey in SurveyMonkey (see Attachment B: Fellowship Survey Web Version) and explained in text in the Word version of the survey (see Attachment A: Fellowship Survey Word Version). Including both supervisors and alumni in one data collection reduces overall participant burden. The Science Office also conducted extensive pretesting to ensure that the survey is user-friendly and easy for respondents to understand and complete (see pilot testing section below).


Before the Science Office sends the email invitation to complete the survey, fellowship program staff will send an announcement to all targeted respondents from their fellowship program. This announcement will ensure that respondents first learn about the survey from a known program contact with a familiar email address. The announcement will encourage potential respondents to participate and inform them about the importance of the survey and how findings will be put into action to improve the fellowship program. Fellowship program staff will also informally promote the survey during routine communications with potential respondents (e.g., at regularly scheduled conferences or during alumni association meetings, see Attachment H: Fellowship Survey Sample Script). The survey will include an introduction that informs potential respondents of what the project is asking, why it is being asked, who will have access to the data, how the results will be used, and how the findings will be put into action.

SurveyMonkey offers the ability to collect anonymous responses (i.e., no names, email addresses, IP addresses, or other contact information attached to results) and still track email invitations. This is possible because tracking information is tied to the email invitation, not the survey results. Non-responders will receive up to three reminders to encourage completion of the survey and maximize response rate.


The Science Office will use a dedicated email address ([email protected]) for respondents to confirm legitimacy of the data collection, ask questions, voice concerns, or seek technical assistance. The survey invitation and reminder emails will include an option for respondents to opt out of receiving additional emails.


Analysis plan

Instruction: Provide a brief description of the analysis plan, including quality control procedures and estimation procedures.

Data will be downloaded from SurveyMonkey into Microsoft Excel, SPSS, or MaxQDA for analysis. The DSEPD Science Office will conduct descriptive analyses for quantitative items, including frequency distributions, measures of central tendency, and correlations (e.g., correlation between time since completing the fellowship and achievement of intended intermediate outcomes). The team will conduct content analysis for qualitative items to identify major themes or patterns in the data. All data will be kept secure in password protected files on a secure network drive that is only accessible to the DSEPD Science Office. All results will be reported in the aggregate and stratified by fellowship program and type of respondent (i.e., alumni, supervisors, or both).
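The following is a minimal sketch of this descriptive analysis in Python with pandas, assuming the SurveyMonkey export has been saved as a CSV file. The file name and column names (fellowship_program, respondent_type, years_since_completion, outcome_rating) are hypothetical placeholders rather than the actual survey variables, and the analysis may equally be performed in Excel, SPSS, or MaxQDA as described above.

import pandas as pd

# Hypothetical export file and column names; see note above.
df = pd.read_csv("fellowship_survey_export.csv")

# Frequency distributions for categorical items
print(df["respondent_type"].value_counts(dropna=False))
print(df["fellowship_program"].value_counts(dropna=False))

# Measures of central tendency for a numeric item
print(df["outcome_rating"].describe())

# Correlation between years since completing the fellowship and an
# intermediate-outcome rating
print(df[["years_since_completion", "outcome_rating"]].corr())

# Aggregate results stratified by fellowship program and respondent type
summary = (
    df.groupby(["fellowship_program", "respondent_type"])["outcome_rating"]
      .agg(["count", "mean", "median"])
)
print(summary)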


The Science Office will prepare a summary report and share it with stakeholders including fellowship program staff and DSEPD leadership. The team will also engage stakeholders in a collaborative process to interpret findings and generate practical, immediate recommendations for program improvement. If appropriate, DSEPD may disseminate the results more broadly (e.g., to other CDC fellowship programs or in a manuscript submitted for publication in a scientific journal). If results are disseminated outside of DSEPD, authors will clearly describe the scope of the data collection, types of respondents, and lack of direct generalizability to fellowship programs external to DSEPD.


Additionally, each fellowship program will have the option to receive a data set limited to their program’s respondents. In order to maintain respondents’ anonymity, the Science Office will clean the data to remove responses that could be used to identify a specific individual (e.g., in an open-ended field, listing one’s job title as Director of a specific program or office). Providing data to fellowship programs will allow them to conduct additional analyses that might be useful for their programs.


Pilot testing

Instruction: Provide a brief description of pilot-test efforts.

The DSEPD Science Office conducted two phases of pilot testing. In the first phase, six public health professionals representing the different perspectives of the expected respondents completed cognitive interviews about the survey instrument. Cognitive interviews lasted up to 90 minutes and were conducted via telephone with two members of the Science Office (one interviewer and one note-taker). Participants had a Word version of the survey instrument and were instructed to “think aloud” as they responded to each survey item. Interviewers also asked probing questions for additional information on specific topics that did not emerge naturally during the think aloud process. The purpose of cognitive interviews was to ensure participants from different professional backgrounds, fellowship programs, and settings understood the items as intended. On the basis of cognitive interviews, the Science Office identified and revised several items that participants did not understand. Cognitive interviews and survey instrument revisions were an iterative process, such that major survey revisions were included in later interviews to confirm that the revisions successfully addressed the problems identified in earlier interviews.


In the second phase, nine CDC employees, contractors, and current fellows pilot tested the web-based survey. To ensure that the pilot test would provide an accurate time estimate for respondents with different skip patterns, the pilot test included responses as alumni only, supervisor only, and both alumni and supervisor. The average time and range for each group, including time for reviewing instructions and completing the survey, is as follows:

  • Alumni only: average 7 minutes (range 5–9 minutes)

  • Supervisor only: average 8 minutes (range 4–10 minutes)

  • Both alumni and supervisor: average 16 minutes (range 13–19 minutes)

Given the similarity between alumni-only and supervisor-only times, these two groups are listed as one respondent category in the burden table. The estimate for burden hours is based on average times from the pilot test, using the higher of the two averages for the alumni-only or supervisor-only category (8 minutes).


Instruction: Describe efforts to improve or refine the instruments based on the pilot-test findings and feedback.

[ ] No changes necessary, based on pilot-test findings and feedback.

[X] Changes (please describe): The Science Office revised several questions that cognitive interview participants found confusing or did not understand as intended. For example, item #29 asks about benefits that host sites experienced after a fellow’s service ended. In the initial cognitive interviews, participants reported on activities that took place during the fellow’s service. The Science Office revised this item and confirmed that later cognitive interview participants understood it correctly. After feedback from cognitive interviews was incorporated, no additional changes were necessary.


Consultation on statistical aspects

Were outside agencies, partners, or organizations consulted on statistical aspects of the design?

[ ] Yes

[X] No


If yes, list the following information of all persons consulted.


Name: __________________

Agency/organization (e.g., companies, state or local governments): __________________

Title: __________________

Telephone number: __________________

Email address: __________________


Please ensure that all instruments, instructions, and scripts are submitted with this request.


DATE SUBMITTED TO DSEPD INFORMATION COLLECTION REQUEST LIAISON (ICRL)

Instruction: Please indicate the date (MM/DD/YYYY) the request is submitted to the ICRL.

04/20/2017


Email the completed form to the DSEPD Information Collection Request Liaison, Fátima Coronado, at [email protected].

* Division of Scientific Education and Professional Development Fellowships and Student Programs. https://www.cdc.gov/ophss/csels/dsepd/fellowships.html. Accessed March 2017.
