Generic Template

Clearance Documentation - MSURSD School Risk Survey.doc

Generic Clearance for Federal Student Aid Customer Satisfaction Surveys and Focus Groups Master Plan


OMB: 1845-0045



Clearance Documentation

Documentation for the Generic Clearance of Customer Service Satisfaction Collections

TITLE OF INFORMATION COLLECTION: Data Collection for School Risk Confirmation and Management Tools

[ x ] SURVEY [ ] FOCUS GROUP [ ] SOFTWARE USABILITY TESTING


DESCRIPTION OF THIS SPECIFIC COLLECTION


BACKGROUND


Under Title IV of the Higher Education Act of 1965, which covers the administration of U.S. federal student financial aid programs, Federal Student Aid (FSA) administers the following programs: Pell Grants, Stafford Loans, PLUS Loans, and the "campus-based" programs, including Federal Work-Study, Perkins Loans, and Federal Supplemental Educational Opportunity Grants. Federal regulations require schools to have written policies and procedures for the administration of the Title IV student assistance programs. To ensure institutional regulatory compliance, FSA provides training and technical assistance for financial aid administrators, institutional leaders, and other institutional support staff nationwide via its Information for Financial Aid Professionals (IFAP) website and the School Experience Group (SEG). The SEG's mission is to identify the unique service needs of all post-secondary education institutions and provide them with tailored resources to meet those needs. Within SEG, the Minority Serving and Under Resourced Schools Division (MSURSD) is responsible for providing support, assessment, and training specifically targeted to Minority Serving Institutions (MSIs).


MSURSD is currently working to enhance the services and resources it provides to MSIs, with the dual goals of increasing MSI compliance rates and assisting MSIs in improving student performance outcomes such as increased student retention and graduation rates. Windwalker previously mapped MSURSD's business processes and used the results to formulate a Performance Enhancement Pilot (PEP) program. In addition, a segmentation analysis was conducted, followed by a benchmarking study comparing variables across different segments of MSIs; this helped identify the unique challenges faced by MSIs in each segment. Following these studies, Windwalker was tasked with assessing the impact of policy changes on institutions identified as at-risk. The specific goals of the study are to:


  • Identify how policy changes affect operations, policies, procedures, and student performance;

  • Assess which tools schools would find most helpful; and

  • Categorize the best audiences for the tools.





INTENDED PURPOSE AND NEED FOR THE COLLECTION


As mentioned in the previous section, MSURSD has contracted Windwalker Corporation to assist in recommending management tools that will best serve at-risk schools. Windwalker has already conducted an extensive operational review and has completed detailed maps of MSURSD's major processes. Windwalker has also conducted previous research to identify best practices at low-risk schools and to assess the needs of different MSIs; that research will help guide the present study.


The current phase of the research is a survey that will be sent to 163 institutions. These institutions have been designated by MSURSD as TTAC Level III and Level IV schools. The survey findings will assist MSURSD in understanding how FSA policy changes impact TTAC Level III and IV institutions.


The survey contains 51 questions and should take no more than 20 minutes to complete. Specifically, the survey will ask respondents questions on the following themes:


  • Identification of the impact of policy changes on operations, policies, procedures, and student performance

  • Identification of management tools that would assist institutions

  • Identification of the content of management tools

  • Identification of best audience and mode of delivery to maximize effectiveness

  • Identification of practices employed to address findings


The results from the survey will be presented in a report that includes recommendations on the ways in which MSURSD can communicate with and engage schools while they plan for policy changes.



COLLECTION PROCEDURES


The web-based survey will be developed using Survey Monkey, a survey platform and data analysis tool. This platform allows a computer-based survey of varied question types (e.g., Likert-scale and open-ended questions) to be administered to respondents via a web link delivered in an email message. Both closed- and open-ended items are included in the survey to obtain quantitative and qualitative data. Specifically, thirteen items are closed-ended, nineteen items are open-ended, and an additional nineteen items have both closed-ended and open-ended components. The survey is expected to take respondents approximately 18 minutes to complete.
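
(Together, the three item types account for the full instrument: 13 + 19 + 19 = 51 items.)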

Once the survey is finalized and ready to deploy, FSA will provide Windwalker with a list of names, email addresses, and demographic variables of interest (e.g., institution name and participant's position title) for staff from the TTAC Level III and IV institutions to which the survey link will be sent. Each participant will receive an email from an FSA point of contact via surveymonkey.com with an invitation to complete the survey. The email invitation will include details on the purpose of the survey, the anticipated length of time to complete it, and the survey deadline date. The survey will remain open for two weeks, and two reminder emails will be sent after the launch email to participants who have not yet completed the survey. The first reminder will be sent five business days after the survey launch, and a second reminder will be sent shortly before the survey closes.
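
The schedule above can be illustrated with a short calculation. The sketch below (Python, using NumPy's business-day utilities) computes candidate launch, reminder, and close dates, assuming a January 11, 2016 launch, a ten-business-day field period, and a two-business-day offset for the second reminder; these assumptions are illustrative and are not fixed by the plan.

    import numpy as np

    # Illustrative dates only: the launch date, field period, and second-reminder
    # offset are assumptions, not requirements from the collection plan.
    launch = np.datetime64("2016-01-11")            # assumed survey launch date
    close = np.busday_offset(launch, 10)            # two-week (10 business day) field period
    first_reminder = np.busday_offset(launch, 5)    # five business days after launch
    second_reminder = np.busday_offset(close, -2)   # assumed: two business days before close

    print("Launch:         ", launch)
    print("First reminder: ", first_reminder)
    print("Second reminder:", second_reminder)
    print("Survey closes:  ", close)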

Once the survey has closed, Windwalker will begin data analysis. Data will be captured by the Survey Monkey tool and downloaded into IBM SPSS Statistics, a software package used for statistical analysis. Survey Monkey's text analysis software will be used to analyze the open-ended responses. Where applicable, survey data will be compared to the pre- and post-seminar evaluation survey data previously collected by FSA. Open-ended responses will be coded by theme, and frequency counts for comments by theme will be provided in the final report (see the "Planned Use of Data" section). Should the number of comments for any given open-ended question exceed the text analysis software's minimum threshold of 20 comments, a word cloud will be created for each such question, providing a visual representation of the frequency of the most popular terms. All coding rubrics will be provided to FSA.
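
As a rough illustration of the open-ended coding step, the sketch below (Python) tallies comments against a keyword-based coding rubric and checks the 20-comment word-cloud threshold. The file name, question column, and rubric keywords are hypothetical placeholders; the actual analysis will be performed with Survey Monkey's text analysis tools and SPSS as described above.

    from collections import Counter
    import pandas as pd

    # Hypothetical export file and question column; the real export and rubric
    # will come from Survey Monkey and the analysts' coding scheme.
    responses = pd.read_csv("risk_survey_export.csv")
    question = "Q17_policy_impact_comments"

    comments = responses[question].dropna().astype(str).str.lower()

    # Placeholder coding rubric: theme -> keywords that signal the theme.
    rubric = {
        "staffing": ["staff", "turnover", "training"],
        "verification": ["verification", "documentation"],
        "student outcomes": ["retention", "graduation", "enrollment"],
    }

    theme_counts = Counter()
    for text in comments:
        for theme, keywords in rubric.items():
            if any(keyword in text for keyword in keywords):
                theme_counts[theme] += 1

    # Frequency counts by theme, as they would appear in the final report.
    for theme, count in theme_counts.most_common():
        print(theme, count)

    # Word clouds are only generated when a question draws at least 20 comments.
    if len(comments) >= 20:
        print(f"{question}: {len(comments)} comments -- eligible for a word cloud")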


DATES, LOCATIONS, AND PARTICIPANTS


The proposed timeframe for the survey launch is between January 11, 2016 and January 29, 2016, and the survey will be conducted entirely online. No focus groups are planned, and no payments, stipends, or incentives are proposed. Windwalker will administer the survey to a sample of financial aid directors whose institutions are considered at-risk. This timeline meets the contractual requirement that the survey be administered and the final report made available to MSURSD by March 31, 2016. The survey will also be shared with MSURSD/FSA leadership for approval.



PLANNED USE OF DATA


As previously mentioned, the survey will provide MSURSD insight into the impacts of policy changes on TTAC Level III and IV institutions. This will provide the groundwork for Windwalker to recommend management tools that MSURSD can use to better serve MSIs. The survey will yield both quantitative and qualitative responses, allowing us to target specific policies and tools while letting respondents contribute their own wherever it is logical. A report delivered to FSA by the end of March will include a summary of findings from the Risk Management Survey. The goals of the report are to identify policy impacts and suggest specific tools to help reduce those impacts. Frequency and word counts will be used to identify common themes in the open-ended questions. In addition, piping logic will be used so that tools can be identified for the policies that had the greatest impact on institutions.





AMOUNT OF ANY PROPOSED STIPEND OR INCENTIVE


Not applicable.


BURDEN HOUR COMPUTATION (number of responses × estimated response or participation time in minutes ÷ 60 = annual burden hours):


Category of Respondent       No. of Respondents    Participation Time    Burden
Financial aid directors      160                   20 minutes            53 hours
Totals                       160                   20 minutes            53 hours
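
Worked computation: 160 responses × 20 minutes ÷ 60 ≈ 53 annual burden hours.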



STATISTICAL INFORMATION


We expect a 60% response rate, which is typical for surveys disseminated online. The quantitative data will be analyzed statistically using descriptive statistics to identify trends in responses. The qualitative data will be analyzed by counting the frequencies of themes that emerge and by identifying tools suggested by respondents that were previously considered.
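
As an illustration of the descriptive analysis, the sketch below (Python) computes summary statistics and response-option frequencies for the closed-ended items; the file name and item names are hypothetical, and the actual analysis will be run in IBM SPSS Statistics as noted above.

    import pandas as pd

    # Hypothetical export file and Likert item names.
    responses = pd.read_csv("risk_survey_export.csv")
    likert_items = ["Q3_policy_awareness", "Q8_tool_usefulness"]

    # Summary statistics (count, mean, spread, quartiles) for each item.
    print(responses[likert_items].describe())

    # Frequency of each response option, as counts and shares.
    for item in likert_items:
        counts = responses[item].value_counts(dropna=False).sort_index()
        shares = (counts / counts.sum()).round(2)
        print(f"\n{item}")
        print(pd.concat([counts, shares], axis=1, keys=["count", "share"]))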


REQUESTED APPROVAL DATE: 10 business days past submission date


NAME OF CONTACT PERSON: Chris Lemmie


TELEPHONE NUMBER: (202) 377-3225


MAILING LOCATION: 830 First St. N.E., Washington, DC 20202


ED DEPARTMENT, OFFICE, DIVISION, BRANCH: Chief Customer Experience Office,

Office of Federal Student Aid, U.S. Department of Education





File Type: application/msword
Author: Alison Curtis
Last Modified By: Kate Mullan
File Modified: 2016-01-14
File Created: 2016-01-14
