Aviation Security Customer Satisfaction Performance Measurement Passenger Survey

OMB: 1652-0013



  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information. (Annotate the CFR parts/sections affected.)


TSA is committed to being attentive and responsive to the experiences of its customers, particularly the flying public. Over the past few years, Congress has agreed with TSA on the importance of assessing customer satisfaction. In support of this effort, TSA plans to conduct a passenger survey at airports nationwide. The survey will be administered using an intercept methodology, in which TSA personnel who are not in uniform hand deliver paper survey forms to passengers immediately following the passenger's experience with TSA's checkpoint security functions. Passengers are invited, though not required, to complete and return the survey via an online portal, or by responding in writing to the survey questions on the customer satisfaction card and depositing the card in a drop-box at the airport or using U.S. mail. TSA personnel decide the method by which passengers will be asked to complete and return the survey. The intercept methodology randomly selects times and checkpoints at which to approach passengers in an effort to gain survey data representative of all passenger demographics.


TSA wants to conduct these surveys to further our mission to ensure the security of our Nation's commercial aviation system under the authority granted to TSA under the Aviation and Transportation Security Act, Pub. L. 107-71, 115 Stat. 597 (November 19, 2001). TSA issued a required report to Congress on May 19, 2002 entitled "Performance Targets and Action Plan: 180 Day Report to Congress" in which TSA committed to "collecting information to baseline customer satisfaction as well as perceptions of the quality and courteousness of our security operations." Congress has demonstrated a keen interest in the security operations that affect the flying public. For example, in the Conference Report associated with H.R. 4775, "Making Supplemental Appropriations for Further Recovery from and Response to Terrorist Attacks on the United States for the Fiscal Year Ending September 30, 2002, and for Other Purposes" (H. Rept. 107-593, July 19, 2002), Congress directed TSA to measure both the "average wait time at passenger screening checkpoint[s]" and the "number of complaints per 1,000 passengers" for airports at which security is federalized. Further, in the Government Accountability Office (GAO) report entitled "Transportation Security Administration: Actions and Plans to Build a Results-Oriented Culture," GAO-03-190, January 17, 2003, GAO praised TSA's customer-focused performance measurement programs, including the airport survey. The report recommended that TSA "[c]ontinue to develop and implement mechanisms, such as the customer satisfaction index, to gauge customer satisfaction and improve customer service." In furtherance of this effort, TSA wants to continue to use surveys to measure customer satisfaction and confidence with TSA's aviation security procedures.



  2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


This airport survey represents an important part of TSA’s efforts to collect data on customer satisfaction. We propose to continue conducting airport surveys to gauge customer satisfaction and confidence with TSA’s aviation security procedures. The objective is to capture individuals’ experiences with the passenger security checkpoint.


Previous Survey Efforts


TSA has conducted airport surveys at select airports nationwide on an annual basis since September 2003 (OMB No. 2110-0011, and later 1652-0013). The results have achieved statistical validity in measuring customer satisfaction and confidence in TSA through the Customer Satisfaction Index – Aviation (CSI-A), provided key performance data used to support TSA's goal of providing world-class customer service, and validated the methodology discussed in this document. In 2002 and 2003, TSA conducted focus groups with the public and used a statistical analysis of passenger survey results to validate the findings of those focus groups, thereby gaining a better understanding of how different aspects of the customer experience influence satisfaction and confidence. The CSI-A survey proved too costly and was subsequently discontinued. TSA then relied on the Bureau of Transportation Statistics (BTS) Omnibus Household Survey to acquire travel-related data; however, the BTS survey did not provide data at the level of granularity needed to properly assess customer satisfaction. Therefore, TSA proposes the airport-level survey to acquire customer satisfaction data in an effort to more efficiently manage its airport security screening performance. A discussion of the survey methodology and statistical analysis is contained in the Supporting Statement at Part B of this application. Using lessons learned from these previous survey efforts, TSA developed the current customer surveys as outlined below.


Current Survey Content


TSA developed a list of survey questions that evaluate key performance elements of TSA's mission delivery, provide managers with tangible prescriptions for performance improvement, and address the areas of service perceived as most relevant to passengers.


TSA is seeking OMB approval for 82 questions, 81 previously approved and 1 new. TSA intends to ask 10 to 15 questions on each survey. The first 10 questions will be standard on every survey at every airport where TSA is conducting the survey. The TSA Federal Security Director (FSD) at each participating airport has the option to select up to five additional questions from the OMB-approved question list; these would be questions 11 through 15. The five additional questions that an FSD can add to the standard 10-question survey are meant to provide individual airports with the information most relevant to their environment. For example, relevant information may concern wait times at airports with historically long wait times, or checked baggage screening at airports where the checked baggage screening process involves customer interaction. All questions on the survey are designed to elicit responses that TSA can use to identify areas in need of improvement. FSDs may seek guidance from Headquarters about sampling and survey distribution, and are given limits on the individual and cumulative burden on passengers that they are allowed to impose each year.


TSA will also conduct statistical analysis of the results to determine how different areas correlate with overall satisfaction and confidence. TSA will only compare airports based on results from the 10 standard survey questions. Responses to the five FSD-chosen questions will be for airport information only and will not be used to draw statistical conclusions.
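For illustration only, the following is a minimal sketch of the kind of correlation analysis described above. The column names, the 1-to-5 response scale, and the use of Pearson correlation are assumptions made for this example; they are not part of the approved survey instrument or of TSA's documented analysis plan, which is described in Supporting Statement Part B.

```python
# Illustrative sketch only: relating scores on the 10 standard questions to an
# overall-satisfaction item. All column names and the 1-5 scale are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_respondents = 384  # assumed sample size per airport, per the burden estimate

# Hypothetical responses: one row per respondent, 10 standard questions plus an
# overall-satisfaction item, each scored 1 (low) to 5 (high).
data = pd.DataFrame(
    rng.integers(1, 6, size=(n_respondents, 11)),
    columns=[f"q{i}" for i in range(1, 11)] + ["overall_satisfaction"],
)

# Pearson correlation of each standard question with overall satisfaction;
# higher values indicate areas that track most closely with overall ratings.
correlations = (
    data.drop(columns="overall_satisfaction")
    .corrwith(data["overall_satisfaction"])
    .sort_values(ascending=False)
)
print(correlations)
```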


Use of Survey Results


TSA uses the survey results to measure performance and gauge customer satisfaction and confidence with TSA's aviation security procedures. TSA uses survey results as a basis for changes or improvements to current policies and operations as well as for personnel-related issues, including: (1) to support industrial-engineering studies (such as by conducting a survey with several questions about passenger wait and service times to evaluate a change in the checkpoint configuration); or (2) to evaluate process changes (such as to evaluate the response to a localized media campaign or other public-relations effort, or a reduction in staffing at an airport). More detailed examples of these uses are outlined below.


TSA uses the results to assess its performance with various components related to customer satisfaction and confidence. In particular, it measures passenger perceptions of the courtesy and professionalism of Transportation Security Officers, wait times, thoroughness of screening, and overall satisfaction. The results of these aspects are examined at the aggregate and airport levels, and examined across time. TSA identifies factors and best practices contributing to higher scores and assesses ways to implement those into operational policies and procedures.


TSA uses the survey results to improve TSO training by incorporating new or updated customer service-related themes. The survey results are also used to evaluate the effect of policy and procedural changes as they relate to customer satisfaction and confidence. In December 2005, TSA implemented more thorough procedures for secondary screenings. TSA used the survey results along with other data to evaluate customers' perceptions of these procedural changes and to modify the implementation of these procedures to increase customer satisfaction while maintaining security. Further, TSA uses the survey results to measure the effectiveness of specific programs at reaching the customer base, such as how well TSA communicates with customers using airport signage. After reviewing survey results, TSA determined that airport signage is more widely recognized as an effective means of communicating critical information to passengers than other means, such as public service announcements or printed informational pamphlets.


TSA uses the survey results to assess the impacts of organizational changes. Survey results were also used to help TSA compare the effectiveness of the Security Screening Partnership Pilot Program (PP5) airports with that of airports with a federalized screener workforce. Each of the PP5 airports was surveyed to generate customer satisfaction and confidence data for those airports. These results were then used as the basis for the examination of customer service, one of the three major areas used in the evaluation. The results demonstrated the success of the PP5 airports in providing customer service comparable to that of airports with federalized screeners. This finding, as well as similar findings in the areas of security and efficiency, demonstrated the program's success and allowed the PP5 program to be transformed into the Screening Partnership Program (SPP). All airports using TSA security can now apply to implement a privatized screener workforce under TSA management through the SPP. Each year these airports are included in the survey program to measure their customer service performance and ensure that they meet customer satisfaction and confidence standards.


The TSA Office of Strategic Communications and Public Affairs issued press releases to communicate the results of the program to the public. On March 3, 2005, the Department of Homeland Security and TSA issued a press release entitled "Air Travelers Continue to Express High Confidence and Satisfaction In TSA Security and Customer Service." The accompanying article described the program, methodology, results, and insights gained, and provided question-by-question and airport-level scores. In addition, several FSDs participating in the program were able to issue their own local press releases describing the program and the results seen at their airports.


  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden. [Effective 03/22/01, your response must SPECIFICALLY reference the Government Paperwork Elimination Act (GPEA), which addresses electronic filing and recordkeeping, and what you are doing to adhere to it. You must explain how you will provide a fully electronic reporting option by October 2003, or an explanation of why this is not practicable.]


Passengers are invited, though not required, to view and complete the survey via an online portal, or by responding in writing to the survey questions on the customer satisfaction card and depositing the card in a drop-box at the airport or using U.S. mail. TSA personnel decide the method by which passengers will be asked to complete and return the survey. Responses will be stored and available for possible reporting measures. The online portal for retaining responses and providing reports supports the initiatives of the Government Paperwork Elimination Act (GPEA).


  4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purpose(s) described in Item 2 above.


The BTS survey acquires the following types of information: wait time and screening time satisfaction, passenger satisfaction, passenger knowledge of procedures, sources of passenger information, passenger level of confidence in layers of security, and public transportation. While these types of categories are similar to the proposed approach, the BTS survey does not provide the level of granularity needed to provide a detailed assessment of customer satisfaction at the airport-specific level.


Some airport administrations (either local government or private entities) may conduct their own customer surveys at airports. TSA does not consider these efforts duplicative because they do not specifically address the aspects of TSA performance that the TSA survey will include. Through the efforts of individual FSDs, TSA shares data with airport operators conducting their own surveys to the fullest extent possible and seeks to include questions on their instruments to reduce the overall public burden.


These collections differ from the TSA customer comment card, which is designed to give individual airports frequent customer-initiated feedback. The TSA customer comment card is a vehicle for gathering daily feedback at individual airports from passengers who approach TSA personnel at airports to initiate complaints and compliments.


  5. If the collection of information has a significant impact on a substantial number of small businesses or other small entities (Item 5 of the Paperwork Reduction Act submission form), describe the methods used to minimize burden.


The proposed survey has no impact on small businesses.


  6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


The collection of data via surveys is essential for TSA to understand its impact on the flying public and to respond to that impact by improving service, reducing burden on travelers, and improving communication. Given the Congressional mandates to collect these data, the collection is crucial to TSA's mission to secure the commercial aviation system while maintaining the highest customer service standards. Moreover, GAO and OMB have concurred with TSA on the importance of this element of our performance measurement system. The results from the survey may be used for annual performance measurement at the surveyed airports, as well as system-wide. The results from the 10 standard, OMB-approved questions on the survey are statistically significant and can be used to draw conclusions about the traveling population as a whole, such as customer satisfaction and confidence levels.


  7. Explain any special circumstances that require the collection to be conducted in a manner inconsistent with the general information collection guidelines in 5 CFR 1320.5(d)(2).


There are no circumstances that require the information to be collected in a manner inconsistent with the general information collection guidelines in 5 CFR 1320.5(d)(2).


  8. Describe efforts to consult persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


TSA published a 60-day notice in the Federal Register (78 FR 32416; May 30, 2013) and a 30-day notice (78 FR 46594; August 1, 2013), announcing its intent to conduct these surveys. To TSA's knowledge, no public comments have been received in response to either notice.


TSA collaborated with experts familiar with statistical intercept survey techniques to develop the methodology for the formal survey. TSA has also engaged a contractor to support its performance measurement efforts since the agency's inception; the contractor helped TSA define the survey. Details about and the rationale for our sampling and survey distribution methodology are provided in Supporting Statement Part B.


  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


TSA will not provide any payment or gift to survey respondents.


  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


Survey forms will be anonymous; TSA will not solicit specific identifying information. Thus, by design, the survey will ensure confidentiality through anonymity.


  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private.


We propose to ask no such questions.


  12. Provide estimates of hour burden of the collection of information.


All FSDs at airports have the capability to conduct this survey. Based on prior survey data and research, obtaining a sample of 384 responses requires distributing approximately 1,000 surveys; TSA therefore assumes 384 respondents per 1,000 surveys distributed. We estimate that each respondent takes approximately five minutes to complete the survey, so the total burden at one airport is 384 respondents x 5 minutes = 1,920 minutes, or 32 hours. We estimate that 25 airports will conduct the survey each year, for a total of 384 respondents x 25 airports = 9,600 respondents a year. At 5 minutes per respondent, the total annual burden is 9,600 respondents x 5 minutes = 48,000 minutes, or 800 hours.
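As an illustrative plausibility check of the arithmetic above, the following sketch reproduces the burden figures from the stated assumptions (384 respondents per airport, 5 minutes per response, 25 airports per year). The sample-size formula shown is the standard one for estimating a proportion at a 95 percent confidence level with a ±5 percent margin of error; it is included only as a likely basis for the 384-respondent figure, since the formal derivation appears in Supporting Statement Part B.

```python
# Illustrative check of the burden arithmetic in this item. The inputs come
# from the text above; the sample-size formula is a standard one and is shown
# only as a plausibility check, not as TSA's documented derivation.

# Standard sample size for a proportion: n = z^2 * p * (1 - p) / e^2,
# with z = 1.96 (95% confidence), p = 0.5, and margin of error e = 0.05.
z, p, e = 1.96, 0.5, 0.05
n = z**2 * p * (1 - p) / e**2
print(f"sample size per airport: {n:.1f} (rounded to 384 in the text)")

respondents_per_airport = 384
minutes_per_response = 5
airports = 25

minutes_per_airport = respondents_per_airport * minutes_per_response
print(f"per airport: {minutes_per_airport} minutes = {minutes_per_airport // 60} hours")

total_respondents = respondents_per_airport * airports
total_minutes = total_respondents * minutes_per_response
print(f"annual: {total_respondents} respondents, "
      f"{total_minutes} minutes = {total_minutes // 60} hours")
```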


  13. Provide an estimate of the total annual cost burden to respondents or recordkeepers resulting from the collection of information.


Respondents will incur no direct cost resulting from this data collection.


  14. Provide estimates of annualized cost to the Federal Government. Also, provide a description of the method used to estimate cost, and other expenses that would not have been incurred without this collection of information.


We estimate the Federal Government cost for this data collection to be approximately $400,000 annually. These costs include all direct costs of the survey, costs for research and development, and costs for contractor and technology support to manage the data collection and to produce and analyze the survey data.



  15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.


TSA is seeking approval of 82 questions: 81 have previously been approved by OMB and 1 is a new survey question requiring first-time approval from OMB.



  16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Public and governmental interest in TSA’s performance in providing excellent customer service is high, and the results of these collections will be of great interest to many parties. There is the potential for TSA to share the survey results externally.


  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


TSA is not seeking approval to not display the expiration date.


  18. Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act Submissions," of OMB Form 83-I.


TSA seeks no exceptions to the certification statement.



