X-59 Flight Community Response Supporting Statement A

OMB: 2700-0191

Supporting Statement for Information Collection Requirements

X-59 Quiet SuperSonic Community Response Survey Preparation





Part A. Justification

1. Need for the Information Collection: Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.

Supersonic passenger flight over land is currently restricted in the U.S. and many countries because sonic booms have been known to disturb people on the ground. There is a potential for a change in federal and international regulations if supersonic flight can occur at acceptably low noise levels. NASA is preparing a series of Community Response Surveys coupled with research flights to gather data on the public acceptability of low noise supersonic flight.

Prior to the Community Response Surveys, NASA will conduct a check of the overall survey process without accompanying flights (the Community Response Survey Preparation). This check is necessary to minimize the risk of problems or errors in the actual Community Response Surveys, which must be coordinated with the preparation and scheduling of X-59 flights.

NASA has supported two prior field tests to evaluate data collection methods for community response to low noise supersonic flight: one at Edwards Air Force Base, California in 2011, and the Quiet Supersonic Flights 2018 (QSF18) study in Galveston, Texas. These tests were not intended to gather data supporting regulatory changes; rather, they provided lessons learned for the survey methodology that will be employed in this study.

After the Community Response Survey Preparation, NASA plans to conduct up to five Community Response Surveys (CRSs) in different areas of the contiguous U.S. Each CRS will have a maximum of 113 responses (“activities”) per respondent, spread across a 30-day period. Some responses are collected up to six times per day, while other responses are collected once per day.



2. Use of this Information: Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

NASA will use the outcomes from the Community Response Survey Preparation to refine the design of the later X-59 Community Response Surveys (CRSs), including the survey processes and materials.

The Community Response Survey Preparation is aimed at providing answers to the following questions:

  • What is the response rate for the study overall, and to each of the different surveys?

  • What proportion of respondents chooses to use the app versus the web?

  • What proportion turns off app notifications, and does that impact response rate?

  • How many respondents work outside of the target area?

  • Are respondents able to clearly distinguish between the single event and daily surveys?

  • Do methods for locating respondents when not at home or work provide the needed data?

  • How useful are questions on building construction? Do respondents have the knowledge to answer these questions?

  • How well do operations and the instrument work?

  • How well do the automated survey data processing methods work?

  • How are categorical attributes such as vibration, rattle, and startle related to the annoyance response?

3. Use of Information Technology: Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

The Community Response Survey Preparation will use web-based surveys to collect respondent data. The web surveys will be compatible with mobile devices and will also be offered through a customized app. Invitations and reminders to complete the web surveys will be sent by in-app notification, email, and text message (depending on the respondent’s preferences and permissions). The web survey also includes a mapping application that lets respondents indicate their location on a visual map.

Web-based surveys and electronic communications accommodate the need to send survey invitations and begin collecting data soon after designated times (simulating future X-59 supersonic passes), so that annoyance levels are captured as soon as possible after each event. The use of technology also reduces respondent burden: participants can respond at their convenience on their own readily available devices through user-friendly online instruments (see Appendices A-E).

4. Efforts to Identify Duplication: Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.

While other surveys [e.g., the Federal Aviation Administration’s recently published Neighborhood Environmental Survey (NES)] have measured annoyance in response to overall aircraft noise, they did not use the specific survey design of the Community Response Survey Preparation and thus offer very little to inform the design of the X-59 Community Response Surveys (CRSs).

The Community Response Survey Preparation provides information that also cannot be found from prior NASA field tests. For example, because the prior tests were much shorter in duration, they cannot be used to estimate response rates for the X-59 CRSs. Also, the target areas in prior tests were much smaller and cannot be used to estimate how many respondents work outside of the target area. Finally, the recruitment methods, survey interfaces, and survey data processing methods differed in past tests.

The Community Response Survey Preparation will ask participants about general aircraft noise rather than sonic booms because the X-59 is not yet operational. Other than this wording change and the absence of X-59 overflights, all other aspects of the survey will be identical to what is currently planned for the X-59 CRSs.

5. Burden on Small Business: If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.

Collection of this information does not have a significant impact on small businesses.

6. Consequences of Not Collecting the Information: Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

This information is not scheduled to be collected by any other agency or program. Failure to collect data from the Community Response Survey Preparation will introduce substantial risks that later X-59 Community Response Surveys will not collect sufficient data.

7. Special Circumstances: Explain any special circumstances that would cause an information collection to be conducted in a manner:

REQUIRING RESPONDENTS TO REPORT INFORMATION TO THE AGENCY MORE OFTEN THAN QUARTERLY;

No participant will be asked to provide information more often than quarterly. All data from respondents will be collected within the span of one quarter.

REQUIRING RESPONDENTS TO PREPARE A WRITTEN RESPONSE TO A COLLECTION OF INFORMATION IN FEWER THAN 30 DAYS AFTER RECEIPT OF IT;

No participant will be asked to prepare written responses to the collection of information.

REQUIRING RESPONDENTS TO SUBMIT MORE THAN AN ORIGINAL AND TWO COPIES OF ANY DOCUMENT;

No participant will be asked to submit more than the original copy of any document.

REQUIRING RESPONDENTS TO RETAIN RECORDS, OTHER THAN HEALTH, MEDICAL, GOVERNMENT CONTRACT, GRANT-IN-AID, OR TAX RECORDS FOR MORE THAN THREE YEARS;

No participant will be asked to retain records for more than three years.

IN CONNECTION WITH A STATISTICAL SURVEY, THAT IS NOT DESIGNED TO PRODUCE VALID AND RELIABLE RESULTS THAT CAN BE GENERALIZED TO THE UNIVERSE OF STUDY;

No invalid statistical survey is anticipated.

REQUIRING THE USE OF A STATISTICAL DATA CLASSIFICATION THAT HAS NOT BEEN REVIEWED AND APPROVED BY OMB;

No unapproved data classification activities are anticipated.

THAT INCLUDES A PLEDGE OF CONFIDENTIALITY THAT IS NOT SUPPORTED BY AUTHORITY ESTABLISHED IN STATUTE OR REGULATION, THAT IS NOT SUPPORTED BY DISCLOSURE AND DATA SECURITY POLICIES THAT ARE CONSISTENT WITH THE PLEDGE, OR WHICH UNNECESSARILY IMPEDES SHARING OF DATA WITH OTHER AGENCIES FOR COMPATIBLE CONFIDENTIAL USE; OR

All pledges are supported by the authority established in statute or regulation.

REQUIRING RESPONDENTS TO SUBMIT PROPRIETARY TRADE SECRET, OR OTHER CONFIDENTIAL INFORMATION UNLESS THE AGENCY CAN DEMONSTRATE THAT IT HAS INSTITUTED PROCEDURES TO PROTECT THE INFORMATION'S CONFIDENTIALITY TO THE EXTENT PERMITTED BY LAW.

No trade secrets or items of similar confidential information will be requested.

8. Consultation and Public Comments: If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.

60-day FRN: Federal Register Volume 87, Number 064, on 4/4/2022. No comments were received.

30-day FRN: Federal Register Volume 87, Number 130, on 7/8/2022.

9. Payments to Respondents: Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

The Recruitment Mailing includes an invitation letter with a $2 token incentive sent to all potential recruits. A prepaid token is recommended by the Tailored Design Method (Dillman, Smyth, and Christian, 2009), whose recruiting strategy this study follows using a targeted Address-Based Sampling (ABS) approach. A small $2 pre-incentive can increase response rates by 10 to 15 percent.

Once recruited, participants will be offered additional incentives to complete the single event and daily summary surveys. Because each recruited respondent is valuable and irreplaceable for the study and because maximizing the number of survey responses from each participant will enhance the value and utility of the resulting data, this study will use escalating incentives to encourage participants to remain in the study through its completion and to respond to a high percentage of requested surveys. The incentive structure is shown in the table below:

Week or Item          50% Completion   75% Completion   100% Completion
Week 1                $25              $35              $45
Week 2                $30              $40              $50
Week 3                $35              $45              $55
Week 4                $40              $50              $60
Subtotal              $130             $170             $210
Prepaid invitation    $2               $2               $2
Background Survey     $10              $10              $10
End of Study Survey   $20              $20              $20
Grand Total           $162             $202             $242
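The subtotals and grand totals follow directly from the weekly amounts plus the three fixed payments. As a quick arithmetic check (a minimal sketch, not part of the collection instrument; variable names are illustrative only):

```python
# Weekly survey-completion incentives by completion tier (from the table above).
weekly = {
    "50%":  [25, 30, 35, 40],
    "75%":  [35, 40, 45, 50],
    "100%": [45, 50, 55, 60],
}
# Payments made regardless of weekly completion tier:
# prepaid invitation ($2), background survey ($10), end-of-study survey ($20).
fixed = {"prepaid_invitation": 2, "background_survey": 10, "end_of_study_survey": 20}

for tier, amounts in weekly.items():
    subtotal = sum(amounts)                     # weekly incentives only
    grand_total = subtotal + sum(fixed.values())  # plus fixed payments
    print(f"{tier} completion: subtotal ${subtotal}, grand total ${grand_total}")
```

Running the check reproduces the table's subtotals ($130/$170/$210) and grand totals ($162/$202/$242).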

10. Assurance of Confidentiality: Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

The survey will conform to practices approved by the Institutional Review Board at Westat (NASA’s data collection contractor). Each respondent will be told that participation is voluntary and that their identity will not be associated with their responses; responses are therefore treated as confidential. Each participant will be assigned a unique identification number that will be associated with their survey responses.

All survey responses will be merged into a single data set that will allow for detailed analysis. All personally identifiable information will be removed from the data and will only be linked by case ID. The crosswalk linking the ID numbers with PII will be kept secure at Westat and not shared with other organizations. Westat will delete the crosswalk with all PII shortly after the conclusion of the study and distribution of all incentives.
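The de-identification scheme described above can be sketched as follows. This is a hypothetical illustration of the general approach (the function name, case-ID format, and record layout are assumptions, not Westat's actual implementation):

```python
import secrets

def assign_case_ids(respondents):
    """Assign each respondent a random case ID.

    Returns (deidentified_records, crosswalk). The analysis data set carries
    only the case ID; the crosswalk (case ID -> PII) is kept separately and
    deleted after the study concludes and incentives are distributed.
    """
    crosswalk = {}
    deidentified = []
    for person in respondents:
        case_id = secrets.token_hex(8)       # random, non-identifying ID
        crosswalk[case_id] = person["name"]  # PII lives only in the crosswalk
        deidentified.append({"case_id": case_id, "responses": person["responses"]})
    return deidentified, crosswalk

# After the study ends and incentives are paid, the crosswalk is destroyed,
# leaving no link between case IDs and PII:
# crosswalk.clear()
```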

11. Sensitive Questions: Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

There are no questions of a sensitive nature in any of the information collection protocols.

12. Respondent Burden Hours and Labor Costs: Provide estimates of the hour burden of the collection of information.

Number of Respondents   Maximum responses per respondent*   Maximum total responses   Average time per response**   Maximum total burden hours
500                     113                                 56,500                    2 minutes                     1,883 hours


* 1-7 times per day over a 30-day period, with a maximum of 113 responses per month.

** Some portions of the survey, such as the initial screening, will take longer than 2 minutes to complete. However, those sections will be completed only once per respondent, and all other portions of the survey, which make up the vast majority of the 113 responses, will take 2 minutes or less to complete. As such, 2 minutes per response is an average over the “life” of the respondent’s participation.

Using the Federal minimum wage of $7.25/hour, the cost per response is $0.24, the total cost per respondent (assuming the maximum number of responses) is $27.31, and the total cost burden for all respondents is $13,654.17.
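The burden-hour and labor-cost figures follow from the table's inputs. As an arithmetic check (a minimal sketch; variable names are illustrative only):

```python
# Respondent burden and labor-cost estimate (inputs from the table above).
respondents = 500
max_responses_each = 113       # maximum "activities" per respondent over 30 days
minutes_per_response = 2       # average over the life of participation
min_wage = 7.25                # Federal minimum wage, $/hour

total_responses = respondents * max_responses_each             # 56,500
total_hours = total_responses * minutes_per_response / 60      # ~1,883 hours
cost_per_response = min_wage * minutes_per_response / 60       # ~$0.24
cost_per_respondent = cost_per_response * max_responses_each   # ~$27.31
total_cost = total_hours * min_wage                            # ~$13,654.17

print(f"{total_responses:,} responses, {total_hours:,.0f} hours, ${total_cost:,.2f}")
```

This reproduces the 56,500 maximum responses, roughly 1,883 burden hours, and $13,654.17 total cost burden stated above.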

13. Estimates of Cost Burden to the Respondent for Collection of Information: Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).

No additional cost burden will be imposed on respondents aside from the labor cost of the burden hours shown above.

14. Cost to the Federal Government: Provide estimates of annualized costs to the Federal government.

The estimated cost to the Federal government for the contractor team for the Community Response Survey Preparation is approximately $629,000.

15. Changes in Burden: Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.

This is a new information collection. No change in the burden is anticipated.

16. Publication of Results: For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

Findings of the Community Response Survey Preparation will be disseminated in appropriate professional conferences and refereed journals.

17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

Not applicable. This research will display the expiration date for OMB approval of this information collection on the background survey/consent document.

18. Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act Submissions," of OMB Form 83-I.

Not applicable. There are no exceptions to the certification statement.



References

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, Mail and Mixed-Mode Surveys: The Tailored Design Method, 3rd edition. Hoboken, NJ: John Wiley.

Fidell, S., et al. (2012). “Pilot Test of a Novel Method for Assessing Community Response to Low-Amplitude Sonic Booms.” NASA/CR-2012-217767.

Fidell, Horonjeff, Tabachnick, & Clark (2020). “Independent Analyses of Galveston QSF18 Social Survey.” NASA/CR-2020-500547. https://ntrs.nasa.gov/citations/20205005471

Miller, N., Czech, J., Hellauer, K., Nicholas, B., Lohr, S., Jodts, E., Broene, P., Morganstein, D., Kali, J., Zhu, X., Cantor, D., Hudnall, J., & Melia, K. (2021). Analysis of the Neighborhood Environmental Survey. For the Federal Aviation Administration. https://www.airporttech.tc.faa.gov/Products/Airport-Safety-Papers-Publications/Airport-Safety-Detail/ArtMID/3682/ArticleID/2845/Analysis-of-NES

Page, J. A., & Downs, R. (2017). “Sonic boom weather analysis of the F-18 low boom dive maneuver.” J. Acoust. Soc. Am., Paper 2pNSb8, 173rd Meeting of the Acoustical Society of America, June.

Page, J., Hodgdon, K., Hunte, R., Davis, D., Gaugler, T., Downs, R., Cowart, R., Maglieri, D., Hobbs, C., Baker, G., Collmar, M., Bradley, K., Sonak, B., Crom, D., & Cutler, C. (2020). Quiet Supersonic Flights 2018 (QSF18) Test: Galveston, Texas Risk Reduction for Future Community Testing with a Low-Boom Flight Demonstration Vehicle (NASA/CR-2020-220589). National Aeronautics and Space Administration. https://ntrs.nasa.gov/citations/20200003223. Appendices/Volume 2 available at https://ntrs.nasa.gov/citations/20200003224

Page, J. A., Hobbs, C. M., & Plotkin, K. J. (2013). “Waveforms and Sonic Boom Perception and Response (WSPR) Program: Low Amplitude Sonic Booms over Small and Large Communities.” Proceedings of Meetings on Acoustics, Vol. 19, 040042.

Page, J. A., Hodgdon, K. K., Hobbs, C., Wilmer, C., Krecker, P., Cowart, R., Gaugler, T., Shumway, D., Rosenberger, J., & Phillips, D. (2012). “Waveforms and Sonic Boom Perception and Response (WSPR) Program Final Report, Low-Boom Community Response Program Pilot Test Design, Execution and Analysis.” Wyle Research Report WR 12-15; NASA/CR-2014-218180, March 2014.

Rizzo, L., Brick, J., & Park, I. (2004). “A Minimally Intrusive Method for Sampling Persons in Random Digit Dial Surveys.” Public Opinion Quarterly, 68(2), 267-274.






