Information Assurance Scholarship Program (IASP) Surveys

OMB Control Number: 0704-0508


SUPPORTING STATEMENT – PART B



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


1. The purpose of the IASP project assessment surveys is to measure the performance of the IASP from different stakeholder viewpoints. For the purposes of this collection requirement, the two stakeholder groups are active recruitment scholars and their faculty advisors, known as Principal Investigators (PIs). Ideally, the surveys will take place annually, and all active stakeholders from both groups will be asked to participate.


Because the two groups have different goals and expectations for the program, student participants complete a different survey than PIs. The questions in each survey are broad and comprehensive in scope, because the DoD IASP Executive Administrator and the IASP Steering Committee want to factor in the full range of respondents' experiences and satisfaction levels. Survey responses are compiled into key themes (e.g., satisfaction with the application process, suggestions for improving the program, the performance of the Executive Administrator) and used to identify key performance gaps that require immediate attention, as well as enhancements or changes that could improve the IASP experience for all personnel involved. The information gathered is used to determine program strengths and weaknesses and to drive improvements in program processes, strategic planning, human resources management, and communications at all levels and to all audiences of the program.


The potential number of responses varies with the number of recruitment students funded each year, the number of academic institutions with enrolled scholarship recipients, and the length of each student's funded scholarship period of study. Typically, the scholarship recipient population ranges from 50 to 100 students, and the number of PIs (faculty members overseeing each participating institution's IASP students) ranges from one to three individuals per Center of Academic Excellence (CAE) across 168 CAEs. There are currently 93 students and 400 PIs in the program.


2. The scope of our investigation is too narrow to be stratified beyond the two parameters identified in Supporting Statement Part A. We will employ a process that identifies respondents within those two parameters to create a snapshot of the program from different perspectives. Because we have not previously distributed any program surveys, we have no historical information on which to base an expected response rate. While our ultimate goal is a 100 percent response rate, we do not expect to achieve it; we anticipate a response rate between 20 and 80 percent. The low figure of 20 percent is based on the typical response rate for DoD surveys. We expect this population to respond at a much higher rate, however, because it is a small population engaged in the subject matter and most of the participants are at more senior levels in their careers (i.e., professors). Even at the low end, a 20 percent response rate on the current population of 93 students and 400 PIs would yield roughly 19 student and 80 PI responses. Obtaining a broad range of opinions for the purposes of our assessment will provide sufficient information to make informed decisions about any changes or improvements to the program.


Maintaining an annual collection schedule allows for yearly snapshots of how the program is perceived by its most important stakeholders and provides consistent opportunities to improve the program.


The steps in the assessment process are as follows:

  1. Identify the individuals to be surveyed.

  2. Send an e-mail to each individual with the appropriate survey attached.

  3. Individuals open the attachment and complete the assessment.

  4. Send reminder e-mails to non-responders.

  5. Individuals e-mail the completed assessment to a generic e-mail box created specifically for assessment responses.

  6. Assessment data is collected.

  7. Quantitative data is tabulated.

  8. Qualitative data is reviewed to identify common themes.

  9. An aggregate report of qualitative and quantitative data is generated for the IASP Steering Committee and the DoD Office of the Chief Information Officer.


3. If a sufficient number of responses is not received, the original assessment participation e-mail will be followed by a general reminder e-mail and a phone call to all participants. The expectation is that the recruitment students and PIs will have a sense of ownership of the program and a degree of empowerment in the process, so that the response rate will match that of previous government participant assessments. The assessment serves as a mechanism for managing the program and is the only means of obtaining essential feedback from the students and PIs.


4. The assessment was developed with input from key stakeholders. A pre-test of the surveys was distributed to six Federal employees, who completed a total of eight surveys among them (four PI surveys and four recruitment student surveys). Feedback was obtained on the format and content of each survey, and the surveys were refined accordingly into a tight, cohesive set of questions designed to capture information central to improving the IASP experience for all personnel. This method of continuous improvement through collaborative development serves as an effective testing mechanism.


5. Individual consulted on statistical aspects of the assessment design:

Paul Rosenfeld

Defense Manpower Data Center

571-372-0987


Point of contact for data collection and analysis:

Leah Loeffert

Business Analyst

Office of the DoD Chief Information Officer

571-372-4487


