OMB: 2700-0191




Supporting Statement for Information Collection Requirements

X-59 Quiet SuperSonic Community Response Survey Preparation


Part B. Collections of Information Employing Statistical Methods

1. Potential Respondent Universe

The NASA project team has selected an area in and near Nashville, TN, in which to conduct the Community Response Survey Preparation. The size of the sampling area was defined by the planned sound profile of the X-59: an area of approximately 30 by 20 nautical miles within which the supersonic signature would likely be audible if a supersonic test flight occurred. The location of the sampling area was chosen so that the public could hear (subsonic) aircraft noise from aircraft operating to and from Nashville International Airport, yet is sufficiently far from the airport that aircraft noise would not be the primary noise source. While the potential universe would include all adults present in this area during the scheduled times, sampling practicalities led to restricting the frame to people living in households within the defined area.

2. Procedures for the Collection of Information

  • Statistical methodologies for stratification and sample selection

The research plan is to sample from the population using a targeted address-based sampling (ABS) approach, with the goal of reaching 500 respondents who complete the background survey and commit to responding to the subsequent surveys. The recruiting strategy uses a Tailored Design Method approach (Dillman, Smyth, and Christian, 2009) to reach approximately 5,000 randomly sampled household addresses in the targeted area. For households with multiple adults, the web survey will apply the Rizzo-Brick-Park method (Rizzo et al., 2004) to randomly select a specific adult from the household as the respondent (see Appendix A).
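For illustration only, the within-household selection step might look like the following minimal Python sketch. The function and variable names are hypothetical; the production web survey implements the published Rizzo-Brick-Park logic, under which each adult in a k-adult household has selection probability 1/k and a full roster is needed only when the screener informant is not selected.

```python
import random

def select_adult(num_adults: int, informant: str, other_adults: list[str]) -> str:
    """One-adult-per-household selection in the spirit of
    Rizzo, Brick, and Park (2004): the screener informant is
    selected with probability 1/k; otherwise one of the
    remaining adults is chosen at random, so every adult has
    an equal 1/k chance of selection."""
    if num_adults == 1:
        return informant
    if random.random() < 1.0 / num_adults:
        return informant                    # informant selected; no roster needed
    return random.choice(other_adults)      # roster the remaining adults

# Example: a three-adult household
print(select_adult(3, "informant", ["adult_2", "adult_3"]))
```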

  • Sample size

As mentioned above, this study targets 500 adults. This sample size should be sufficient to answer the key research questions below. Questions related to response rates will be based on all sampled households, not just those responding, so these estimates will have a level of precision based on the approximately 5,000 initially selected addresses. For a sample of 5,000 households, Table 1 provides the standard errors and half-width 95% confidence intervals for response rates ranging from 10 percent to 25 percent. All the confidence intervals have half-widths of about 1 percentage point, which gives adequate precision for estimating the response rate for the study.

Table 1. Standard Errors and Half-Width 95% Confidence Intervals for Response Rates, for a Sample of 5,000 Households.

Response Rate    Standard Error    Half-Width 95% Confidence Interval
10%              0.4%              0.8%
15%              0.5%              1.0%
20%              0.6%              1.1%
25%              0.6%              1.2%
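The entries in Table 1 follow from the standard binomial formula for the variance of a proportion. A minimal Python check (assuming simple random sampling with no design effect) reproduces the table:

```python
import math

def se_and_halfwidth(p: float, n: int) -> tuple[float, float]:
    """Standard error of a proportion p estimated from a simple
    random sample of size n, and the 95% CI half-width."""
    se = math.sqrt(p * (1 - p) / n)
    return se, 1.96 * se

# Reproduce Table 1 for n = 5,000 sampled households
for p in (0.10, 0.15, 0.20, 0.25):
    se, hw = se_and_halfwidth(p, 5000)
    print(f"{p:.0%}: SE = {se:.1%}, half-width = {hw:.1%}")
```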

For questions that address respondent behavior (choosing to use the app, turning off app notifications, working outside the target area), the level of precision offered by a sample of 500 respondents will suffice to determine the protocols for the later Community Response Surveys. Table 2 provides the standard errors and half-width confidence intervals for proportions ranging from 10% to 50%, assuming 500 respondents and a design effect of 1.2. For example, if the proportion of persons who work outside the target area is estimated to be around 20%, the estimate will have a 95% confidence interval of ±3.8 percentage points. This is adequate precision for planning the larger Community Response Surveys. Similarly, the precision of the other estimates is adequate for planning purposes (the full set of questions to be assessed from the Community Response Survey Preparation appears below).

Table 2. Standard Errors and Half-Width 95% Confidence Intervals for Survey Estimates, for a Sample of 500 Respondents.

Survey Estimate    Standard Error    Half-Width 95% Confidence Interval
10%                1.5%              2.9%
20%                2.0%              3.8%
30%                2.2%              4.4%
40%                2.4%              4.7%
50%                2.4%              4.8%
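Table 2 uses the same formula with the variance inflated by the stated design effect of 1.2; a short sketch of that calculation:

```python
import math

def se_with_deff(p: float, n: int, deff: float = 1.2) -> tuple[float, float]:
    """Standard error and 95% CI half-width for a proportion p
    from n respondents, inflated by a design effect deff."""
    se = math.sqrt(deff * p * (1 - p) / n)
    return se, 1.96 * se

# Reproduce Table 2 for n = 500 respondents, deff = 1.2
for p in (0.10, 0.20, 0.30, 0.40, 0.50):
    se, hw = se_with_deff(p, 500)
    print(f"{p:.0%}: SE = {se:.1%}, half-width = {hw:.1%}")
```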


Research questions about operations require only a small sample to confirm, and Westat knows from many previous surveys that scaling from a sample of 500 to 1,000 will not be an issue. The remaining questions are about respondents’ perceptions and understanding of the instruments and specific questions. These questions are more qualitative; identifying potential issues will prompt careful review regardless of how frequently they occur.

Response Rates

  • What is the response rate for the study overall, and for each of the different surveys?

Respondent Behavior and Perceptions

  • What proportion of respondents chooses to use the app versus the web?

  • What proportion turns off app notifications, and does that impact the response rate?

  • How many respondents work outside of the target area?

  • How are categorical attributes such as vibration, rattle, and startle related to the annoyance response?

Respondent Understanding

  • Are respondents able to clearly distinguish between the single event and daily surveys?

  • Do methods for locating respondents when not at home or work provide the needed data?

  • How useful are the questions on building construction? Do respondents have the knowledge to answer these questions?

Survey Operations

  • How well do operations and the instrument work?

  • How well do the automated survey data processing methods work?


A larger sample size will likely be necessary for the actual X-59 Community Response Surveys to yield sufficiently precise dose-response modeling. Current plans call for samples of 1,000 adults from each of five selected communities.
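For context, dose-response modeling of this kind is commonly done with a logistic regression of an annoyance indicator on the received noise level. The sketch below is illustrative only; the data file and column names are hypothetical, and this is not the study's specified analysis plan.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical single-event data: received boom level (dB) and a 0/1
# indicator of whether the respondent reported high annoyance.
df = pd.read_csv("single_event_responses.csv")

X = sm.add_constant(df["boom_level_db"])        # intercept + dose
model = sm.Logit(df["highly_annoyed"], X).fit()
print(model.summary())                          # fitted dose-response curve
```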


  • Estimation Procedures and Analysis Model

For the research questions about response rates and respondent behavior, estimates will come from weighted frequencies and associated confidence intervals. The research team will analyze these frequencies and adjust the sampling plan for the Community Response Surveys as warranted.
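As a sketch of the estimation step, a weighted proportion with an approximate confidence interval can be computed as below. Production estimates would typically use survey-design-aware software (e.g., replication weights); the Kish approximation here is a simplification.

```python
import numpy as np

def weighted_proportion(values: np.ndarray, weights: np.ndarray):
    """Weighted proportion of a 0/1 indicator, with an approximate
    95% CI using the Kish design-effect adjustment for unequal
    weights: deff = n * sum(w^2) / (sum(w))^2."""
    p = np.average(values, weights=weights)
    n = len(values)
    deff = n * np.sum(weights**2) / np.sum(weights)**2
    se = np.sqrt(deff * p * (1 - p) / n)
    return p, (p - 1.96 * se, p + 1.96 * se)
```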

For questions related to respondent understanding of the instruments and process, the research team will analyze survey responses to look for missing data and for questions with a disproportionate number of “don’t know” responses. Westat will also keep a log of respondents who contact the Help Desk by phone or email, documenting the reason for each contact. Westat will review these logs for any mention of respondents having trouble answering questions or reporting confusion about the survey process.
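A simple screening pass of this kind might look like the following sketch; the “don’t know” codes and the 10% review threshold are illustrative assumptions, not project specifications.

```python
import pandas as pd

DONT_KNOW = {"don't know", "dk"}   # hypothetical response codes

def flag_problem_items(responses: pd.DataFrame, threshold: float = 0.10) -> dict:
    """Return questions whose combined missing / 'don't know' rate
    exceeds the review threshold."""
    flagged = {}
    for col in responses.columns:
        answers = responses[col].astype(str).str.strip().str.lower()
        bad = responses[col].isna() | answers.isin(DONT_KNOW)
        rate = bad.mean()
        if rate > threshold:
            flagged[col] = rate
    return flagged
```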

For questions related to survey operations, the research team will debrief programmers, operations managers, and data managers to identify problems, risks, and areas for improvement. The team will also analyze Help Desk logs for any technical issues reported by respondents.


  • Degree of accuracy needed for the purpose described in the justification

As shown above, the proposed sample sizes will yield statistical estimates with 95% confidence intervals of at most ±5 percentage points, depending on the size of the estimate. This should be adequate for evaluating the Community Response Survey Preparation and making decisions for the larger Community Response Surveys.


  • Unusual problems requiring specialized sampling procedures

This Community Response Survey Preparation supports later surveys that will use a noise dose-response design. To simulate the later Community Response Surveys, the households eligible for sampling will be those within the estimated boom footprint area across the community, rather than the community at large.


  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

Because the project ultimately needs to understand community annoyance levels over a protracted period, the X-59 Community Response Surveys are planned with many flights occurring over a 30-day period. Because survey attrition is an important factor to estimate, the Community Response Survey Preparation will also use this 30-day period. The surveys that occur multiple times per day are kept as short and simple as possible (see Appendix C) to minimize burden.


3. Maximization of Response Rates, Non-response, and Reliability

To maximize the response rate, the survey instruments are accessible by web or smartphone to facilitate ease of access and to be more respondent friendly.

The initial Background survey mailing contains an introductory letter with information on how to complete the survey online, along with a prepaid $2 incentive. Additionally, the full survey protocol uses an incentive structure in which the amount respondents receive increases as they complete a higher proportion of the requested surveys. Incentives for all levels of participation also increase each week to encourage respondent engagement throughout the study. The incentive structure is shown in the table below.


Week or Item           50% Completion    75% Completion    100% Completion
Week 1                 $25               $35               $45
Week 2                 $30               $40               $50
Week 3                 $35               $45               $55
Week 4                 $40               $50               $60
Subtotal               $130              $170              $210
Prepaid invitation     $2                $2                $2
Background Survey      $10               $10               $10
End of Study Survey    $20               $20               $20
Grand Total            $162              $202              $242
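For clarity, the payout arithmetic implied by the table can be expressed as a short sketch. One assumption is made explicit in the code: the table does not state what a respondent earns below the 50% tier, so the sketch pays nothing in that case.

```python
# Weekly incentive schedule from the table above: week -> {completion tier: $}
WEEKLY = {
    1: {0.50: 25, 0.75: 35, 1.00: 45},
    2: {0.50: 30, 0.75: 40, 1.00: 50},
    3: {0.50: 35, 0.75: 45, 1.00: 55},
    4: {0.50: 40, 0.75: 50, 1.00: 60},
}
FIXED = 2 + 10 + 20   # prepaid invitation + Background + End of Study surveys

def weekly_payment(week: int, completion: float) -> int:
    """Highest tier amount the completion fraction qualifies for;
    below 50% earns nothing (an assumption, not stated in the table)."""
    return max((amt for tier, amt in WEEKLY[week].items() if completion >= tier),
               default=0)

# A respondent completing 100% of requested surveys every week:
print(sum(weekly_payment(w, 1.0) for w in WEEKLY) + FIXED)   # 242
```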

In addition to initial invitations sent by text, email, and/or app notification, non-responders will receive reminders. For the initial Background survey, an intensive mailing protocol will maximize the response rate for survey recruitment. This protocol is as follows:

  • Week 1 – Mail initial invitation letters (see Appendix F)

  • Week 2 – Mail reminder postcards to sampled addresses (see Appendix G)

  • Week 3 – Mail express invitation letters (see Appendix H), mail initial invitation letters to alternate selected adults (see Appendix I)

  • Week 4 – Mail final invitation letters (see Appendix H), mail postcards to alternate selected adults (see Appendix J)

  • Week 5 – Mail express letters to alternate selected adults (see Appendix K)

  • Week 6 – Mail final invitation letters to alternate selected adults (see Appendix K)

The alternate selected adult provision applies when someone other than the adult who initially starts the Background survey is selected as the respondent. If this person does not complete the survey soon after being selected, they will receive additional mailings targeted specifically at them.

4. Tests of Procedures or Methods

Almost all aspects of the Community Response Survey Preparation are designed to test procedures and methods. The general model of conducting Single Event surveys for each sonic boom, plus Daily Summary surveys and End of Study surveys to measure aggregate annoyance, comes from previous NASA tests. However, the Community Response Survey Preparation will use a longer field period and some additional questions, such as those about housing construction and being startled by the noise.

The general approach of using ABS with mail invitations to complete a web survey is standard industry practice. However, the plan to engage respondents for a 30-day period and to push real-time survey invitations in response to a noise stimulus will require careful testing of communication and incentive strategies. The Community Response Survey Preparation will test all of these components along with the elements of the technical infrastructure, such as web survey programming, secure data transfer, server capacity and reliability, immediate electronic notifications to respondents, and displaying survey data in real time through a user interface.
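As an illustration of the real-time invitation flow, the sketch below shows the general shape of an event-triggered push. The interfaces (`respondents`, `notifier`) are hypothetical and stand in for the project's actual notification infrastructure.

```python
import datetime as dt

def on_noise_event(event_time: dt.datetime, respondents, notifier) -> None:
    """When a sonic boom (noise stimulus) is logged, push a Single
    Event survey invitation to each enrolled respondent who has
    notifications enabled; others can still reach the survey on the web."""
    for r in respondents:
        if r.notifications_enabled:
            notifier.send(
                to=r.contact,
                message=f"A brief survey is available for the "
                        f"{event_time:%I:%M %p} event.",
            )
```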

5. Statistical Consultation and Information Analysis


The analysis of statistical data and the results of methodological and protocol testing will be conducted by experts at Westat (one of NASA’s subcontractors for the Community Response Survey Preparation).


Methodology:

David Cantor, Ph.D.

Vice President, Associate Director

Westat

301-294-2080

[email protected]


Statistical Analysis:

Jean Opsomer, Ph.D.

Vice President, Associate Director

Westat

301-738-3577

[email protected]



References

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method (3rd ed.). Hoboken, NJ: John Wiley.

Fidell, S., et al. (2012). Pilot Test of a Novel Method for Assessing Community Response to Low-Amplitude Sonic Booms. NASA/CR-2012-217767.

Fidell, Horonjeff, Tabachnick, & Clark (2020). Independent Analyses of Galveston QSF18 Social Survey. NASA/CR-2020-500547. https://ntrs.nasa.gov/citations/20205005471

Hodgdon, K. K., & Page, J. (2013). Low amplitude sonic boom noise exposure and social survey design. The Journal of the Acoustical Society of America, 133(5), 3368. DOI: 10.1121/1.4805768.

Miller, N., Czech, J., Hellauer, K., Nicholas, B., Lohr, S., Jodts, E., Broene, P., Morganstein, D., Kali, J., Zhu, X., Cantor, D., Hudnall, J., & Melia, K. (2021). Analysis of the Neighborhood Environmental Survey. For the Federal Aviation Administration. https://www.airporttech.tc.faa.gov/Products/Airport-Safety-Papers-Publications/Airport-Safety-Detail/ArtMID/3682/ArticleID/2845/Analysis-of-NES

Page, J. A., & Downs, R. (2017). Sonic boom weather analysis of the F-18 low boom dive maneuver. J. Acoust. Soc. Am., Paper 2pNSb8, 173rd Meeting of the Acoustical Society of America, June.

Page, J., Hodgdon, K., Hunte, R., Davis, D., Gaugler, T., Downs, R., Cowart, R., Maglieri, D., Hobbs, C., Baker, G., Collmar, M., Bradley, K., Sonak, B., Crom, D., & Cutler, C. (2020). Quiet Supersonic Flights 2018 (QSF18) Test: Galveston, Texas. Risk Reduction for Future Community Testing with a Low-Boom Flight Demonstration Vehicle (NASA Contractor Report CR-2020-220589). https://ntrs.nasa.gov/citations/20200003223; appendices (Volume 2) at https://ntrs.nasa.gov/citations/20200003224

Page, J. A., Hobbs, C. M., & Plotkin, K. J. (2013). Waveform and Sonic Boom Perception and Response (WSPR) Program: Low Amplitude Sonic Booms over Small and Large Communities. Proceedings of Meetings on Acoustics, 19, 040042.

Page, J. A., Hodgdon, K. K., Hobbs, C., Wilmer, C., Krecker, P., Cowart, R., Gaugler, T., Shumway, D., Rosenberger, J., & Phillips, D. (2012). Waveforms and Sonic Boom Perception and Response (WSPR) Program Final Report: Low-Boom Community Response Program Pilot Test Design, Execution and Analysis. Wyle Research Report WR 12-15; NASA Contractor Report CR-2014-218180, March 2014.

Rizzo, L., Brick, J. M., & Park, I. (2004). A minimally intrusive method for sampling persons in random digit dial surveys. Public Opinion Quarterly, 68(2), 267–274.


