
Survey on Usage and Functionality of Smoke Alarms and Carbon Monoxide Alarms (SCOA) in US Households

Renewal

OMB: 3041-0180

INFORMATION COLLECTION REQUEST (ICR):
OMB supporting statement and
privacy impact assessment for:
Survey on Usage and Functionality of Smoke
Alarms and Carbon Monoxide Alarms (SCOA)
in US Households.
Renewal

June 19, 2021

Table of Contents
OMB Supporting Statement Part A and Part B
A. JUSTIFICATION
B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
Appendix A: Door Hanger Literature to Distribute to Households
Appendix B: In-Home Informed Consent Form
Appendix C: Waiver, Release and Hold Harmless Agreement
Appendix D: Thank You Email For Participating in In-Home Interview
Appendix E: Extended Explanation of Study
Appendix F: Frequently Asked Questions (FAQs)
Appendix G: In-Home Survey – Consumer Product Safety Commission (CPSC) Survey on Usage and Functionality of Smoke Alarms and Carbon Monoxide Alarms in Households
Appendix H: Abbreviated Survey for Participants without Detectors or with Detectors Connected to a Security Alarm System
Appendix I: Newspaper Advertisement/Newsletter (long and short form ads)
Appendix J: EurekaFacts Press Release/EF Website and LinkedIn
Appendix K: Social Media Post/Google Ad/Craigslist Ad
Appendix L: EurekaFacts Website – Landing Page Text

Prepared in compliance with ISO 20252 International Quality Standard for Market, Public
Opinion and Social Research

EurekaFacts, LLC


OMB Supporting Statement Part A and Part B
A. JUSTIFICATION
A.1. Circumstances Making the Collection of Information Necessary
This is a request for the implementation of a national in-home survey to estimate usage, user
hazard perception, and functionality of smoke and carbon monoxide (CO) alarms in US
households. This would be accomplished through the administration of the Survey on Usage and
Functionality of Smoke Alarms and Carbon Monoxide Alarms in US Households, hereafter referred
to as the SCOA survey. This data collection effort will provide an updated national estimate of
operability of smoke alarms and carbon monoxide alarms based on direct observation. This data
will allow for better targeting of policy, messaging, and interventions to improve the operability
rate of smoke and CO alarms, as well as inform the Consumer Product Safety Commission
(CPSC) of recommendations to state/local jurisdictions related to codes, standards, and/or
regulations of smoke and CO alarms.
In 1992, the Consumer Product Safety Commission (CPSC) sponsored a national in-home survey
to collect information on the number of residential smoke alarms in actual use in homes and to
evaluate the operability of the sampled alarms. The results were published in the 1994 report,
Consumer Product Safety Commission Smoke Detector Operability Survey Report On
Findings 1, which turned 25 years old in 2017. Although the survey results were instrumental for
many years in developing codes and standards related to smoke alarms, subsequent changes in
technology, installation codes, and state/local ordinances have rendered the information outdated
and less effective, and therefore less applicable. Given the changes in technology and state/local
regulations, the increased use of CO alarms, and the value of the past study, CPSC seeks to
collect new data related to smoke and CO alarm use and operability.
Two organizations, the National Fire Protection Association (NFPA) and Vision 20/20, have
expressed the need for, and benefits of, repeating the CPSC 1992 survey. The NFPA publishes a
periodic report, Smoke Alarms in U.S. Home Fires 2, which provides the latest information
about smoke alarms in home fires. The report recognizes the importance of the 1992 study. The
report states, “This study is the gold standard for smoke alarm research. The most complete
study of smoke alarm presence and operational status in the general population was done by the
U.S. Consumer Product Safety Commission's (CPSC’s) National Smoke Detector Project in
1992.” The report points out the key difference between the CPSC study and other recent studies:
“This [CPSC] project surveyed the general population, not just high-risk groups or people who
had fires.” More recent studies by other groups have usually been combined with smoke alarm
installation programs and typically target high-risk groups, rather than the general population.
The NFPA still sees the importance of the survey even though the information may be outdated.
1. Charles L. Smith, Smoke Detector Operability Survey – Report on Findings (Bethesda, MD: U.S. CPSC, November 1993).
2. Marty Ahrens, Smoke Alarms in U.S. Home Fires (Quincy, MA: NFPA, September 2015).

The Institution of Fire Engineers US Branch has established a steering committee, Vision 20/20,
composed of noted fire service and related agency leaders, to guide a national strategic planning
process for fire loss prevention that results in a national plan that will coordinate activities
and fire prevention efforts. In March 2015, Vision 20/20 hosted a one-day Smoke Alarm Summit
at Johns Hopkins Bloomberg School of Public Health, where representatives from
different stakeholder groups, such as the fire service, academia, government, non-profit, and
private sector organizations, convened to develop consensus recommendations on:
1. Evidence-based and evidence-informed policy and practice interventions that will increase the
installation and maintenance of smoke alarms in all homes in the United States
2. High priority research gaps that need to be addressed
3. Next steps to ensure that the findings from this meeting inform policy and practice
The findings from the report, Evidence Informing Action: Consensus Priorities to Increase the
use of Smoke Alarms in U.S. Homes, 3 identified the next steps and priorities for a national effort
to increase the installation and maintenance of smoke alarms that were obtained from experts
who presented at the Summit and respondents who provided feedback during and after the
Summit. The number one priority was, “1. Conduct a national census (or representative sample
in-home survey) on the prevalence and characteristics of smoke alarms.” The experts at the
summit all agreed that an updated national survey needs to be conducted to develop a national
effort to increase the installation and maintenance of smoke alarms in the US.
A.2. Purpose and Use of the Information Collection
The purpose of the SCOA survey is to collect data that will assist CPSC with better estimation of
the number and types of smoke and CO alarms installed in US households, the proportion of
working smoke and CO alarms, the characteristics of residences and residents where the smoke
and CO alarms are not operational, perceptions of residents related to the cause of “false” alarms
or causes of faulty alarms, consumer hazard awareness, and consumer behavior related to alarm
use and smoke and CO hazards.
The information collected from this survey will allow CPSC to provide an updated national
estimate of operability of smoke alarms and CO alarms based on direct observation. It will also
allow us to create a demographic profile of groups that do not have operable smoke alarms
and/or CO alarms. This includes measures from the perspective of household members lacking
operable alarms as to why they lack functional alarms. This will allow for better targeting of
policy, messaging and interventions to improve the operability rate of these alarms. It will also
provide insights as to the kinds of alarms that are present to determine whether one variety or
another is more likely to be inoperable as well as provide some measure as to the age of alarms
in households. Results of the survey will inform CPSC of recommendations to state/local
jurisdictions related to codes, standards, and/or regulations of smoke and CO alarms. The
information can help improve the voluntary standard for carbon monoxide alarms, UL 2034 4, and
guide state and local jurisdictions for the use and installation of CO alarms. While the installation
codes for the two products, especially as required by states or local jurisdictions, are different, it
was determined that the information collection regarding these two products could be combined
in one survey as a means of optimizing resources and reducing burden.

3. Johns Hopkins Center for Injury Research and Policy, Evidence Informing Action: Consensus Priorities to Increase
the use of Smoke Alarms in U.S. Homes (Warrenton, VA: National Smoke Alarm Summit, 2015).
4. Underwriters Laboratories, “Standard for Single and Multiple Station Carbon Monoxide Alarms,” Standard 2034,
Edition 4, March 31, 2017. https://www.shopulstandards.com/ProductDetail.aspx?UniqueKey=32610
A.2.1. Description of Survey
The SCOA survey seeks to collect information from 1,185 households ∗ within the United States.
The survey will be conducted only through face-to-face in-home interviews. Since previous
research showed that self-reporting surveys on use and functionality of smoke alarms provided
overestimated results of smoke alarms operability, CPSC identified the need to conduct in-home
direct testing and examination of smoke alarms, in addition to conducting data collection through
traditional survey questions.
Households will be recruited to participate at their front door. If the head of household is
interested in participating, they will be immediately screened. In accordance with CDC
guidelines, the interviewer will ask a series of questions to ensure that no one in the household
has COVID-19, has symptoms of COVID-19, or is currently quarantining because of COVID-19. If
respondents clear all questions, the rest of the screening questions will be asked. This ensures
a safe environment for the research team and the members of the household.
During the screening process, if the respondent indicates they have a smoke alarm that is not
connected to a central or security alarm, thus allowing direct testing of the alarms, the
respondent will be eligible for the full-length in-home interview. However, if the smoke alarm
cannot be tested because the household does not have an alarm installed or if the alarms are
connected to a central alarm system that will notify the police or fire department, then the
respondent will only be eligible for a shortened version of the survey. This shortened version
consists of a subset of survey questions about safety attitudes and demographics. CPSC’s
Contractor—EurekaFacts, a market and social sciences research company—will conduct all the
tasks related to design, administration, fielding, analysis and reporting of the survey.
This survey will allow CPSC to better assess the next steps and priorities to increase installation
and maintenance of smoke and CO alarms for the general population by understanding their level
of awareness, perceptions, and demographics. The survey items will also help inform CPSC of
recommendations to provide state/local jurisdictions related to codes and standards.
The SCOA survey will provide the only source of data available to answer the following
research questions:
• What proportion and number of households have smoke and/or carbon monoxide (CO)
alarms installed in their home? Of these households with alarms, what proportion and
number have an operational alarm?
• What proportion and number of respondents perceive their home as safe? Does the
availability of smoke or CO alarms influence their sense of safety? For what reasons do
respondents not have alarms installed?

• Do the characteristics of a respondent’s residence affect the availability or operability
of smoke or CO alarms? Do the characteristics of the residence affect fire and CO risks?
• What proportion and number of respondents are aware of how to maintain and test their
fire and/or CO alarms? Of these respondents, what methods, if any, do they use to
maintain and test their alarms?
• Are there behaviors or activities, if any, that impact respondents either having alarms in
their home and/or having functioning alarms in their home?
• What proportion of respondents seek out information about fire and CO safety? Of these
respondents, what resources do they use to seek out information about fire and CO
safety?
• What, if any, demographics demonstrate a relationship between respondents’ possession
of fire or CO alarms and their risk of fire and/or CO incidents?

∗ 1,185 in-home surveys include 1,055 in the main survey and 130 in the pilot survey.

The table below shows how survey items will aid in answering the research questions and what
type of information they will provide. 5

Table 1. Question Mapping of Survey Instrument to Research Purpose

Research Question: What proportion and number of respondents have smoke and/or carbon monoxide
(CO) alarms installed in their home? Of these respondents with alarms, what proportion and number
have an operational alarm?
Corresponding Survey Item(s): 4a-4c, 5a-5c, 11a-11d, 14a-14d, 15a, 15b, 19a-19d, 20, 22a-22b, 25,
26-1a-26-1aa, 30, 32
Purpose of Collected Information: The results will provide insight into the prevalence of alarms in
respondents’ homes, identify the types of alarms installed, and determine how many, if any, alarms
are operational. Conversely, these items will also aid in revealing the proportion of the residents
who do not have alarms in their home and help uncover the reasons why.

Research Question: What proportion and number of respondents perceive their home as safe? Does the
availability of smoke or CO alarms influence their sense of safety? For what reasons do respondents
not have alarms installed?
Corresponding Survey Item(s): 4d, 5d, 20, 29, 30, 31, 32
Purpose of Collected Information: This information will help understand how respondents personally
define “safety” and how this perception influences whether or not they have alarms installed within
their homes.

Research Question: Do the characteristics of a respondent’s residence affect the availability or
operability of smoke or CO alarms? Do the characteristics of the residence affect fire and CO risks?
Corresponding Survey Item(s): 1a, 4a-4c, 5a-5c, 6, 7, 8, 9a-9c, 25, 27, 28
Purpose of Collected Information: The results will provide insight into whether the resident owns or
rents the home, the duration of residency, and the age of the household. These items will shed light
on whether there is a relationship between the characteristics of a respondent’s home and their
status of having alarms, such as having an attached garage unit if they live in a single family
detached house.

Research Question: What proportion and number of respondents are aware of how to maintain and test
their smoke and/or CO alarms? Of these respondents, what methods, if any, do they use to maintain
and test their alarms?
Corresponding Survey Item(s): 10a-10c, 11a-11d, 12, 13, 18a-18b, 19a-19d, 21, 23
Purpose of Collected Information: These questions help understand whether or not people have the
knowledge and ability to test and maintain their smoke and/or CO alarms and the types of methods
used. This can inform CPSC of the type of information that needs to be dispersed.

Research Question: Are there behaviors or activities, if any, that impact respondents either having
alarms in their home and/or having functioning alarms in their home?
Corresponding Survey Item(s): 33a-33d, 35
Purpose of Collected Information: This information is important as it will help understand the
relationship between how respondents behave and what activities they engage in that may influence
the likelihood of having alarms in their home, such as their cooking behaviors of using a stove or
oven.

Research Question: What proportion of respondents seek out information about fire and CO safety? Of
these respondents, what resources do they use to seek out information about fire and CO safety?
Corresponding Survey Item(s): 34a-34c
Purpose of Collected Information: This information will assist CPSC with addressing the best types
of resources to disperse information about fire and CO safety.

Research Question: What, if any, demographics demonstrate a relationship between respondents’
possession of fire or CO alarms and their risk of fire and/or CO incidents?
Corresponding Survey Item(s): 36-44
Purpose of Collected Information: This will help provide insight into the relationship between
respondent demographics and their risk of fire or CO incidents. This will also shed light as to
their status of whether or not they have a smoke or CO alarm(s).

5. The terminology “smoke alarms” and “CO alarms” is used in technical codes and standards to describe devices
that incorporate a sensing component (detector) and an audible component (alarm). It was determined through
cognitive testing that “smoke detector” and “CO detector” have higher consumer understandability for smoke
alarms and CO alarms. The instrument incorporates the terminology “smoke detector” and “CO detector,” but in this
document the terminology smoke alarm, CO alarm, or alarms (both units) will be used.

A.2.2. Survey Administration Procedures
Originally, randomly selected households within the randomly selected tracts were contacted in
advance via a mailed pre-notification letter. Households were then called to be screened to
determine their eligibility for either an in-home or telephone interview and scheduled for a
relevant type of administration mode. The initially approved OMB methodology yielded a
response rate of less than one quarter of 1% (only 0.23%) during recruitment efforts in two
metropolitan areas. OMB-approved revisions were made to the screening instrument to raise the
appeal, urgency, and information on the public benefit of the study, along with streamlining of
language for greater efficiency in screening potential participants. Following these revisions to
the recruitment effort and their implementation, the response rate was unchanged and
remained inadequate to meet the schedule and the current contract with CPSC.
In the fall of 2019, EurekaFacts submitted, and OMB approved, a redesign of the
recruitment effort as a random walk door-to-door knocking sample methodology. To maintain
the structure of the original recruitment procedure, field teams will first distribute door hangers
as a pre-notification that researchers will be knocking on doors asking for participation in a
survey. This provides households a distinctive piece of literature with vital information about the
study and sources to seek out more information. A map of the tract will be marked where the
door hangers were left, so field interviewers can follow the same path to recruit from those
households a few days later.
The recruitment, screening, and in-home survey will be conducted by a qualified two-member
team (this may consist of fire inspectors, fire educators, firefighters from a local fire department,
survey research professionals, or other qualified individuals with either fire safety or research
experience from the local area). The field teams will be made up of local partners who
understand and can gain the trust of the local community. Both members will present their
government-issued IDs and their official badges (either representing the company they work for
or badges designed by EurekaFacts for the purpose of the study) to confirm their identity and
legitimacy. The team will carry with them a letter printed on official letterhead with
endorsements from the local fire department and CPSC, should they be needed. If the home is in
an apartment building or condominium, prior permission will be obtained from the property
manager to proceed with the in-home survey administration. A consent form will be provided to
the participant to explain the purpose, the statement of confidentiality, and the benefits and
potential risks of the study.
After entering the home, the survey professional will begin to administer the questions based on
the respondent’s residence type and smoke and CO alarm availability and functionality. Once
the survey professional finishes asking questions about the smoke and CO alarms, the survey
team will move on to examine the smoke and CO alarms in the residence. The fire alarm
inspector will then identify, test, and examine the alarms to determine variables such as
their operability, energy source, type, and age. After examining each alarm, the survey
team and resident will repeat the testing procedure on another alarm (if applicable). Due to the
time constraint of the survey, not all alarms in a home can usually be inspected. The survey team
will coordinate with the participant to test a reasonable number of alarms in as varied of
locations as possible within the time constraint of the survey.
If the alarm or alarms are found to be faulty, the resident will have the option of either receiving
a new alarm, receiving new batteries, or having no action taken at all if the respondent chooses
not to have the alarm fixed or replaced. In all cases, respondents will sign a waiver indicating
whether they refused the offered remedy or which other course of action was taken during the in-home administration.
Once the administration is complete and the final set of demographic questions is administered,
the survey professional will offer the participant the monetary incentive for their completion of
the survey.
EurekaFacts will work with on-the-ground partners to take all necessary COVID-19 precautions
and procedures in accordance with local and federal guidelines throughout the duration of the
survey. This includes working with partners to be sure all guidelines are being implemented,
including wearing masks, using hand sanitizer, maintaining social distancing and regularly
checking the health and wellness of all those involved in the study. EurekaFacts will coordinate
training of field workers to apply these principles in the field and provide the needed personal
protective equipment (PPE).
EurekaFacts will provide masks for all field workers and extra masks to give to participants that
do not have one on hand. Field teams will be instructed to maintain a 6-foot distance when
screening heads of households at the door and when interviewing them in their house. Field
teams will each be given hand sanitizer to use periodically throughout the day as well as
disinfecting wipes for tablet surfaces.
A.2.3 Audiences of Data and Results
The designated CPSC Contracting Officer’s Representative (COR) and assigned CPSC staff will
be the primary audience of the data and results. A summary report of aggregated results will be
presented that encompasses all phases and methods employed in the study and will present a
comprehensive description to help inform the agency of the number and types of smoke alarms
and CO alarms installed in households, the characteristics of residences and residents where the
smoke and CO alarms are not working, perceptions related to the cause of “false” alarms or
causes of faulty alarms, and resident alarm maintenance habits. In addition to the summary
report, a PowerPoint presentation, raw data, a table of univariate results, and various data
analysis documentation will also be delivered electronically to the primary audience identified
above.
A.2.4 Methods of Dissemination

The contractor’s final report will be made available to the public after the draft report has been
reviewed and approved by the CPSC’s COR and assigned CPSC staff.
The final report will be released by the Commission by disseminating the report on the agency’s
website and presentations at meetings and conferences related to the subject matter. The
procedures to disseminate the information by the Commission, its staff, agents and
representatives will be in accordance with the law and Commission policy to ensure the
information is accurate and not misleading.
In order to encourage dissemination of the findings, the report will be freely accessible on
cpsc.gov. The work was prepared in the course of the author's official contracting duties with
CPSC, thus Title 17 U.S.C. Section 105 provides that there can be no copyright in a United
States government publication.
A.3. Use of Improved Information Technology (IT) and Burden Reduction
In order to minimize respondent burden, respondents who do not have smoke alarms installed
or who have a central alarm system, and thus are not eligible for the full-length interview that
includes alarm inspection and testing, will participate in the shorter version of the survey and
answer only a portion of the questions. All data from the in-home interviews, both full-length and
short, will be collected using a tablet computer. Both versions of the survey instrument will be
programmed into a single Qualtrics survey and will be
administered via tablet, with the interviewer reading the questions to the participant. Qualtrics is
programmed with the appropriate question skipping patterns to ensure that interviewers only ask
each respondent survey items appropriate for the respondent’s residence type, and smoke and CO
alarms availability and functionality.
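
As an illustration of the branching described above, the following is a minimal Python sketch of
how screener responses route a household to the full-length or abbreviated instrument. It is not
the actual Qualtrics survey flow; the function and variable names are hypothetical.

    # Minimal illustrative sketch of the screener routing logic described above.
    # This is NOT the actual Qualtrics survey flow; names are hypothetical.

    def route_household(passes_covid_screen: bool,
                        has_alarm_installed: bool,
                        alarm_connected_to_central_system: bool) -> str:
        """Return which instrument (if any) the household is eligible for."""
        if not passes_covid_screen:
            return "ineligible"            # screening ends per COVID-19 protocol
        if has_alarm_installed and not alarm_connected_to_central_system:
            return "full_length_survey"    # alarms can be directly tested (about one hour)
        return "abbreviated_survey"        # no alarm, or alarm tied to a security system (about 20 minutes)

    # Example: a household with a stand-alone smoke alarm gets the full-length survey.
    print(route_household(True, True, False))   # -> "full_length_survey"
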
The instrument was first pre-tested through in-depth cognitive interviews with a sample of 18
respondents (OMB Control Number 3041-0136) to certify that the survey items are clear and
easy to understand when the survey is administered on a wider scale, reducing any potential
burden for respondents.
Aligned with the original approach, EurekaFacts sought to identify and adjust any recruitment or
data collection procedures or aspects to the instruments during the initial launch of the study. In
the original methodology, EurekaFacts found that the mailing and multiple attempts to call
participants yielded a very low response rate (less than one-quarter of 1%). EurekaFacts initially
sought to correct this issue by purchasing more sample and focusing on calling households first,
then mailing interested residents. When this did not change the response rate, EurekaFacts changed
methodology entirely to a door-to-door random walk recruitment. After the first round of
recruitment and data collection, EurekaFacts found no major issues and continued with the data
collection effort. After the first 50 completes were collected, a brief analysis of selected
questions was conducted to ensure data quality and instrument functionality; no changes were
needed. Additionally, an internal debrief was conducted and lessons learned from those initial
interviews were incorporated into the rest of the data collection effort and highlighted in the pilot
report.

EurekaFacts plans to continue fielding the study to collect 1,055 total completes. The
information will be summarized into a final report, which will be electronically submitted to the
CPSC Contracting Officer’s Representative (COR).
A.4. Efforts to Identify Duplication and Use of Similar Information
The intent of this data collection is to obtain information that is not readily available elsewhere.
The last time this type of data was collected was 25 years ago, by CPSC. Other recent
studies were focused on targeting high-risk groups or people who had fires; however,
estimates for the general population are not available; thus, CPSC specifically chose to focus this
survey on the general population. This data collection will help CPSC develop a national effort
to increase installation and maintenance of smoke alarms in the U.S.
The need for the proposed data collection and the design of this national survey were based on
several consultative efforts with, and feedback from, experts and stakeholder groups such as fire
service agencies, academia, government, non-profit, and private sector organizations 6 7. The
collected input from experts and stakeholders ensured that the present survey does not duplicate
the information available elsewhere.
A.5. Impact on Small Businesses or Other Small Entities
The information will not be collected from small businesses or other small entities.
A.6. Consequences to Federal program or policy activities if collection is not conducted or is
conducted less frequently
The 1992 national in-home survey, sponsored by CPSC, helped collect information on the
number of residential smoke alarms in actual use in homes and evaluated the operability of the
sampled alarms. The 1992 CPSC survey had the greatest impact on the installation code, NFPA 72 8,
for smoke alarms. The 1992 CPSC survey set the foundation for many installation and give-away
programs to target specific groups that do not have smoke alarms, thus increasing the presence
of smoke alarms in US households. The presence of a smoke alarm in the household considerably
increases the chances of the occupants escaping a home fire.
However, the 1992 survey turned 25 years old in 2017. In order to ensure that the collected
information being referenced remains current and that changes in technology and installation
codes are reflected, the collection of information must be conducted again. By implementing the
new nation-wide SCOA survey, the codes and standards will be current so that fire prevention
organizations and agencies will have all the up-to-date information needed to efficiently and
effectively target the areas for improving life safety and saving lives.
6. Johns Hopkins Center for Injury Research and Policy, Evidence Informing Action: Consensus Priorities to Increase
the use of Smoke Alarms in U.S. Homes.
7. Amanda Kimball, P.E., Workshop for Survey on Usage and Functionality of Smoke Alarms and CO Alarms in
Households (Quincy, MA: NFPA, 2017).
8. NFPA 72 – National Fire Alarm and Signaling Code (Quincy, MA: NFPA, 2016).

A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances. This information collection is consistent with the guidelines
prescribed in 5 CFR 1320.5.
A.8. Consultation and Public Comments
Part A. PUBLIC NOTICE
A 60-Day Federal Register Notice for the collection was published on July 23, 2021. The 60-Day FR
citation is 86 FR 39006. The CPSC received one comment during the 60-Day Comment Period.
The commenter stated that although survey email may produce some results, door-to-door
solicitation should not be conducted because people do not want strangers coming to their front
door.
Staff agrees that current public perceptions regarding an in-person survey are significantly
different than when the smoke alarm survey was last conducted in 1992. However, the initial
rollout of the survey in 2019, which solicited randomly selected households via a mailed pre-notification
letter and subsequently screened them for an in-home or telephone interview,
resulted in an extremely low response rate. To increase the response rate, the SCOA survey
recruitment effort was redesigned as a door-to-door walk-recruitment methodology. Field teams
distribute door hangers at randomly selected households to provide pre-notification that
researchers will be knocking on doors asking for participation in a survey. A pilot survey
conducted in the Washington metro area showed significant improvement in the response rate.
Accordingly, to obtain the best information available, the SCOA survey data collection will
continue to use this door-to-door recruitment methodology, recognizing that home visits by
trained data collectors with inspection and testing provide much better-quality data compared to
telephone or Internet surveys.
A 30-Day Federal Register Notice for the collection was published on October 26, 2021. The 30-Day
FR citation is 86 FR 59152.
Part B. CONSULTATION
CPSC consulted with various stakeholder groups in planning the survey. Stakeholders that
participated included representatives from the fire service, enforcers/authority having
jurisdictions (AHJs), public educators, researchers, equipment manufacturers, standards
developers, and others.
To gauge interest in the need for this data, CPSC hosted or participated in the following industry
events:
• SCOA survey planning workshop, hosted by CPSC on February 16, 2017.
• Vision 20/20 workshop on smoke alarms in March 2015. CPSC received input on a
representative, in-home survey on the prevalence and characteristics of smoke alarms.
• International Conference & Workshop Current Practices in Emergency Response: Carbon
Monoxide Poisoning on September 26, 2018. CPSC received input from representatives
from the fire service, enforcers/AHJs, public educators, researchers, equipment
manufacturers, standards developers, and others on CO poisoning and CO alarms.


EurekaFacts, the National Fire Protection Association (NFPA), Vision 20/20, and the United States Fire
Administration (USFA) were consulted on the availability of accurate smoke and CO alarm
operability data for consumer homes. All four confirmed that information on in-home
operability of smoke and CO alarms has not been available since the last time the survey was
conducted by CPSC in 1992, and that current phone surveys of inoperable smoke alarms in the
US are less reliable.
CPSC staff consulted with EurekaFacts in developing and executing the survey. EurekaFacts is
compliant with quality standards for a research organization. 9 EurekaFacts was consulted
on the frequency of collection and the total number of responses required to provide estimates of
the operability of smoke and CO alarms in the US.
CPSC staff consulted with EurekaFacts in developing the survey questionnaire and to ensure the
understandability and clarity of the questions being asked.
A.9. Explanation of any Payment or Gift
Contractor will provide a monetary incentive to respondents through the form of a gift card from
a major credit card company. Based on their eligibility, as determined through the screening
process, respondents will receive one of two incentive amounts at the completion of the survey.
If respondents qualify for the in-home survey administration, respondents will receive a $50 gift
card from a major credit card company in appreciation for their completion of the survey.
However, if respondents qualify for the shorter survey administration, at the completion of the
survey, respondents will receive a $25 gift card from a major credit card company. The variation
of monetary value is due to the amount of time and effort involved in the in-home full survey and
alarm testing administration compared to the shorter survey administration.
A.10. Assurance of Confidentiality Provided to Respondents
Participation in the survey is voluntary and respondents will be so informed before the screening
and at the beginning of the survey. Subjects are informed of the measures taken to protect their
confidentiality in the introductory language read to sampled persons. Information collected from
respondents will be kept confidential and only used for research purposes.
Survey respondents will be assigned a Random ID number not linked to any personal
identifying information. Respondents’ contact information (name, address, phone number, e-mail
address) along with the Random ID number will be maintained in one secure database
(“Database 1”). The survey responses and respondents/household demographic information will
be maintained in a second secure database (“Database 2”) where potential survey participants are
identified by Random ID Number only. Database 2 will not contain participants’ names,
addresses, phone numbers, e-mail addresses, or other personally identifying information (PII).

9. EurekaFacts holds a certification for the ISO 20252: Market, Opinion, and Social Research International Quality
Standard.


Analysis will be conducted on data sets that include only respondent ID numbers; they will not
contain any identifying data. The software that EurekaFacts will use to collect survey data,
Qualtrics, is a secure platform endorsed by the federal government. Qualtrics has FedRAMP
authorization, ISO 27001 certification, and FISMA compliance, ensuring data security. All
collected data will be secured by EurekaFacts and will be kept on the password protected
computers and secure server and locked file cabinets (as applicable), accessible only to project
staff.
Access to the facilities and server where data will be stored is restricted only to authorized
individuals. Access restrictions are defined for each individual based on his/her role. Access to
data requires the entry of a valid account username and password. Project staff receive data
security training and sign an assurance of confidentiality of survey data. All project staff
complete required annual privacy and security training and sign a document pledging to maintain
confidentiality and privacy in accordance with the Health Insurance Portability and
Accountability Act (HIPAA). The training includes information and data security factors, using
information sources responsibly, employee responsibilities, and how to report instances where
violation of data security is suspected.
Any administrative data and PII collected from respondents may be destroyed within 365 days after
the end of the study. However, to ensure the possibility of potential replication of the study in
the future, any non-administrative data may be kept by CPSC indefinitely.
A.11. Justification for Sensitive Questions
A majority of questions asked in the survey are not typically considered sensitive in nature.
Potentially sensitive questions include the demographic questions that ask about the respondent’s
ethnicity/race, ages of those living in the household, disabilities, and combined annual household
income. Both the trained interviewer and the communication materials will reassure respondents that
participation is voluntary, that they may choose not to answer some questions, and that responses
are confidential. The instructions presented in the survey are designed to make respondents feel as
comfortable as possible in answering these questions.
In addition, each respondent will be informed that a unique ID will be assigned to them that does
not link to any personal identifying information. Data analysis will be conducted on data sets that
include only respondent ID numbers; they will not contain any identifying data.
A.12. Estimate of hour burden to respondents
Upon launch of the 2019 survey fielding in two metro areas, the response rate and
cooperation were very low, as outlined above, impeding the success of the study within the contract
timeline, budget, and respondent burden level. Revised sampling methods and corresponding
response rates were submitted to and approved by OMB in the interim between the initial approval and
the renewal of the project. Completing 1,185 interviews (the total burden for the study, including
the Washington Metro Area pilot and the 24 metro areas that constitute the random sample of
primary sampling units) will require 1,552 burden hours on the public. Several factors may lead
to lower respondent burden. The revised methodology requires fewer interactions
per household, which may ultimately reduce the total respondent burden when compared against
the original address-based sampling (mail to phone to household) methodology. (Please see
section A.15, for further explanation of methodology and response rate change).
The time for screening an individual and starting the interview is also reduced. Multiple phone
calls for screening, scheduling, and confirmation are replaced with interviewers at the door,
immediately ready to screen and conduct interviews upon contact with potential participants.
The original methodology experienced high attrition between scheduling a session and
interviewers arriving at the door, but the revised methodology is expected to encounter hardly any
barriers to completing a confirmed interview (barring some extreme circumstance) since the
interview immediately proceeds after screening.


Below is a discussion of the burden hours.
Table 2. Total Burden Hours by Recruitment and Data Collection Task

Invitation – Recruitment appeal at door: 0.05 hours per respondent (3 minutes); 22,931 contacted
participants; 30% response rate; 6,879 respondents; 344 total hours.

Screener – Agree to screening and are screened and found eligible to participate: 0.075 hours per
respondent (4.5 minutes); 6,879 contacted participants; 17.4% response rate; 1,197 respondents;
90 total hours.

Survey – Full-length survey (one hour): 1 hour per respondent; 1,096 contacted participants;
99% response rate; 1,085 respondents; 1,085 total hours.

Survey – Shortened survey for no-alarm and security-alarm households (20 minutes): 0.33 hours per
respondent; 101 contacted participants; 99% response rate; 100 respondents; 33 total hours.

Total: 1,185 respondents; 1,552 total hours.

Total Burden Hours: 1,552 hours
According to the U.S. Bureau of Labor Statistics, the total compensation for civilian workers in
March 2021 was $39.01 per hour (Employer Cost for Employee Compensation, Table 2,
https://www.bls.gov/news.release/ecec.t02.htm). Therefore, CPSC estimates the cost burden for
respondents to be $60,544 ($39.01 per hour × 1,552 hours = $60,543.52).
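
As an illustrative check of the arithmetic above, the short Python sketch below recomputes the
total burden hours from the per-activity figures in Table 2 and applies the BLS hourly
compensation rate; all inputs are the numbers reported in this section.

    # Recomputes the respondent burden and cost estimates from Table 2 and Section A.12.
    activities = {
        "invitation":  {"hours_per_respondent": 0.05,  "respondents": 6879},
        "screener":    {"hours_per_respondent": 0.075, "respondents": 1197},
        "full_length": {"hours_per_respondent": 1.0,   "respondents": 1085},
        "abbreviated": {"hours_per_respondent": 0.33,  "respondents": 100},
    }

    total_hours = round(sum(a["hours_per_respondent"] * a["respondents"] for a in activities.values()))
    hourly_compensation = 39.01  # BLS ECEC, total compensation for civilian workers, March 2021 ($/hour)

    print(total_hours)                                  # 1,552 burden hours
    print(round(total_hours * hourly_compensation, 2))  # $60,543.52 estimated cost burden
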
A.13. Estimate of total annual cost burden to respondents
There are no costs to respondents to complete this collection other than the labor burden costs
addressed in Section 12 of this document, and there are no respondent recordkeeping
requirements associated with the SCOA survey. There are no operating, maintenance, or capital
costs for respondents associated with the collection.
A.14. Estimate of annualized costs to the Federal government
The contracts to design and conduct the Survey on Usage and Functionality of Smoke Alarms
and Carbon Monoxide Alarms in Households were issued to Eureka Facts LLC under contract
numbers F-16-0091 and F-17-0088 for $562,725 (this figure does not include the cognitive
testing phase that was approved through OMB Control Number 3041-0136).
Salary and benefits costs for government personnel assigned to this study are estimated using the
January 2021 pay scale for a GS-13, Step 5 employee in the Washington, D.C. area, of $117,516,
and the March 2021 Employer Costs for Employee Compensation (ECEC), published by the
U.S. Bureau of Labor Statistics (https://www.bls.gov/news.release/ecec.t02.htm). According to
table 2 of the ECEC, 68.8 percent of total compensation is paid in wages and the remaining 31.2
percent is benefits. Therefore, in 2021 the staff cost is $142,340, based on 10 staff months
(($117,516/.688)/12 months × 10 staff months). In 2022, the staff cost is $106,755, based on 7.5 staff
months (($117,516/.688)/12 months × 7.5 staff months). The total estimated cost to the federal
government is therefore $249,095 ($142,340 + $106,755) in government labor.
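
For clarity, the Python sketch below reproduces the staff-cost arithmetic, assuming, as in the
figures above, an annual GS-13 Step 5 salary and wages equal to 68.8 percent of total compensation.

    # Reproduces the federal staff-cost estimate in Section A.14.
    annual_salary = 117_516   # GS-13, Step 5, Washington, D.C. area, January 2021
    wage_share = 0.688        # wages as a share of total compensation (BLS ECEC, March 2021)

    monthly_total_compensation = annual_salary / wage_share / 12   # about $14,234 per staff month

    cost_2021 = monthly_total_compensation * 10    # 10 staff months  -> about $142,340
    cost_2022 = monthly_total_compensation * 7.5   # 7.5 staff months -> about $106,755
    print(round(cost_2021), round(cost_2022), round(cost_2021 + cost_2022))  # total about $249,095
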
A.15. Program changes or adjustments
Since the initial OMB application and approval in October 2018, EurekaFacts has submitted and
been approved to make several changes to the sampling and data collection process.
Changes that have been incorporated into the current process (from most to least significant)
include:
1. Modifying the third stage of the sampling approach (selection of occupied housing units
in tracts). Originally, houses were randomly selected through ABS (address-based
sampling) with follow-up phone call appointments to conduct the interviews, but the
extremely low response rate and logistical challenges on the part of both participants and
field teams resulted in only a few completed in-home interviews. To streamline the
process, the recruitment method was changed to a random walk door-to-door knocking
methodology. This allowed for direct recruitment and completion of the in-home
interview at one time.
2. Altering the pre-notification document from a mailed letter to a streamlined and eye-catching
door hanger to complement the modified sampling approach. This maintains the
process of pre-notifying residents about the study with a cost-efficient alternative that
raises both individual and community awareness. Distributing the door hangers provides
the field teams flexibility to pre-notify residents of a tract a few days before the intended
recruitment effort, thus maximizing the impact of the literature.
3. Increasing the incentive amount from $25 to $50 for completion of the full-length (60
minute) survey interview. This is an important recruitment tool to increase the
cooperation rate of contacted households and more closely parallels the monetary
incentive offered in 1992, once adjusted for inflation.
4. Implementation of COVID-19 screening questions and protocols. Because of the
coronavirus (COVID-19) pandemic, the project was paused in March 2020, pending
evaluation of the public health environment to determine when best to relaunch the study
in each chosen metro area. EurekaFacts is following the CDC recommendations to ensure
both interviewer and participant safety, including masking and social distancing. If heads
of households are interested in participating after hearing the introduction and purpose of
the study, the interviewer will ask a series of questions to ensure that no one in the
household has COVID-19, has symptoms of COVID-19, or is currently quarantining
because of COVID-19. If respondents clear all questions, the rest of the screening
questions will be asked.
5. Inclusion of refusal aversion language to persuade residents to participate. This additional
approved language provides field teams with additional information to recruit
participants.

6. Revising the expected response rate of the study. The original sampling design resulted in
a response rate of 0.23% (or less than one-quarter of 1%). Upon revision to the door-to-door
methodology, EurekaFacts garnered a response rate of 3.5% in the Washington D.C.
metro area during the pilot of the methodology (15x that of the original response rate).
Additionally, field teams during the pilot had face-to-face interaction with 20% of
households contacted. Once an interviewer initiated the recruitment process, there was a
17% chance of participant cooperation that resulted in a completed interview.
EurekaFacts is factoring in the recorded response rate and cooperation data of the pilot
location into calculating the efforts for other metro areas and the study overall. Because
of the overall lower response rate, more households need to be contacted for the initial
pitch of the study; however, the revised methodology reduces the number of contacts
made per residence, which reduces the overall burden per respondent.
7. The original sample selection of metro areas (AKA, primary sampling units) included
each of the following 24 metro locations proportionally drawn (based on concentration of
occupied households) from each of four U.S. Census regions. The Washington DC Metro
area was not randomly selected for this effort but was instead a purposively selected
metro area to test revisions to the sampling method mid-field for proof of concept. As
others may remember, following great challenges experienced with the original 2018
sampling design using an address-based sampling approach, a new door-to-door (D2D)
sampling method was proposed and approved by CPSC and OMB. To test feasibility of
D2D methods for the SCOA survey, the Washington DC Metro was proposed and
endorsed. The advantages of the Washington area included close and convenient data
collection for successful monitoring, the ability to judge and react quickly to challenges,
and cost containment measures, among other benefits.
The research design and budget contracts for this survey effort did not include a pilot
location for the testing of methods. A redesign was not anticipated. Only the 24 metro
locations identified above were selected for sampling to constitute the N=1,185
nationwide proportionally representative interviews as approved under the study design
and budget.
Ultimately, the decision was made by CPSC and EurekaFacts to not consider the
Washington metro as eligible for the SCOA survey (within the N=1,185 total completes)
and instead treat this location as a pilot study only. 10 In turn, the survey is being
completed in each of the originally selected 24 metro locations, while reducing the total
nationwide sample size to N=1,055 (i.e., N=1,185 completes minus the N=130 interviews
completed in the Washington Metro area). The number of expected completes has been
redistributed in proportion to occupied housing unit counts for each of the 24 metro
locations. These changes were made, in part, to complete the study in full within the
contracted periods established by CPSC and National Fire Protection Association
(NFPA), no later than fall 2022, and accounting for time lost for data collection
attributable to a work stoppage for face-to-face interviewing during the COVID-19
pandemic.

10. CPSC SCOA Survey – Washington, DC Door-to-Door Pilot (April 3, 2020), EurekaFacts, Rockville, MD:
CPSC-SurveyRevised-DiagnosticReport_11_18_20206b6.pdf

To adjust the total sample selection nationwide, among the 24 metros, each metro area’s
expected sample size was reduced in proportion to its share of occupied housing units in
the overall sample frame. For example, Los Angeles, CA (the largest metro area in the
sample) had an original sample size expectation of N=205 based on 1,185 interviews.
After recalibration, using a total nationwide sample size equal to 1,055 the new sample
size expectation for Los Angeles, CA metro area equals N=183 completes. Providence,
RI (one of the smallest metro areas) had an original sample size expectation of N=34
completes based on 1,185 total interviews. After recalibration, the new sample size
expectation for the Providence, RI metro area equals N=30 completes.

Table 3. Calculation for adjustment in metro area sample sizes.

Formula for metro area sample sizes (solve for SampleN):

    OHUs^ for each metro / Total OHUs all metros  =  SampleN / Total sample (N=1,055)

Los Angeles, CA Metro Area:  62,942 / 363,111  =  183 / 1,055
Providence, RI Metro Area:   10,350 / 363,111  =   30 / 1,055

^ Occupied housing units (OHU).
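
The recalibration above is a simple proportional allocation. As an illustration, the short Python
sketch below applies the same formula to the two example metro areas using the OHU counts shown
in Table 3.

    # Proportional reallocation of the nationwide sample across metro areas (Table 3).
    total_sample = 1_055
    total_ohus = 363_111          # occupied housing units across all sampled metro areas

    metro_ohus = {
        "Los Angeles, CA": 62_942,
        "Providence, RI": 10_350,
    }

    for metro, ohus in metro_ohus.items():
        sample_n = round(ohus / total_ohus * total_sample)
        print(metro, sample_n)    # Los Angeles -> 183, Providence -> 30
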

Note that the changes listed here, and the details of the resulting administrative, technological,
and sampling revisions, are incorporated throughout the text of the revised supporting statements
A and B.
A.16. Plans for tabulation and publication
A.16.1 Analysis Plan
Prior to data analysis, EurekaFacts will complete data cleaning and a non-response analysis. The
data cleaning process will include identification and removal or re-coding of inconsistent
responses before inclusion in the final data file, and elimination or recoding of
respondents’ choices that fall outside the ranges specified in the response categories. A non-response
analysis will follow the data cleaning. The objective is to identify differences between
respondents and non-respondents based on their demographics and other measurable
characteristics, to assess the representativeness of the sample as necessary to allow statistical
inferences from the survey results. Weights will be applied to correct over- or under-representation
of categories of the target audience in the final survey data.
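
As a simplified illustration of the weighting step, the Python sketch below computes basic
post-stratification weights (population share divided by sample share for each category); the
category labels and proportions are hypothetical and are not drawn from the survey.

    # Minimal sketch of post-stratification weighting to correct over/under-representation.
    # Category labels and proportions are hypothetical examples only.
    population_share = {"owner_occupied": 0.65, "renter_occupied": 0.35}
    sample_share     = {"owner_occupied": 0.75, "renter_occupied": 0.25}

    weights = {cat: population_share[cat] / sample_share[cat] for cat in population_share}
    print(weights)   # owner-occupied responses down-weighted, renter-occupied up-weighted
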
The analysis will provide estimates of operability of smoke alarms and CO alarms, estimates of
percentages of households as well as subgroups with installed smoke alarms and CO alarms,
estimates of the proportions of respondents demonstrating hazard awareness, and relevant
behavior related to alarm use and smoke and CO hazards. Analysis will include evaluation of
factors leading to inoperable alarms and types of housing relative to alarm operability conditions.
Analysis will identify demographic groups that do not have operable smoke alarms and/or CO
alarms, as well as demographic characteristics affecting alarms operability conditions.
The data analysis will include a tabulation of all survey questions, graphs, frequency
distributions, and two- or three-way cross-tabulations of meaningful parameters to show
similarities or differences among respondents. Analysis will be conducted using case-appropriate
statistical, data-mining, and database modeling procedures. Analysis deliverables will include a
final technical report describing the SCOA methodology and summarizing the results, findings,
and conclusions. The report will include American Association for Public Opinion Research
(AAPOR) indices for survey response rates, descriptive statistics on the demographic data,
summary lists of open responses, and frequency distributions. A table of survey interviews and
non-responses, in accordance with nationally recognized guidelines from AAPOR, will also be
delivered.
A.16.2 Publication Plan
The Contractor will develop a technical report that will present a description of the study design,
research methods, summary of results, findings, and conclusions.
The final technical report will be released by the Commission by disseminating the report on the
agency’s website and presentations at meetings and conferences related to the subject matter.
The procedures to disseminate the information by the Commission, its staff, agents and
representatives will be in accordance with the law and Commission policy to ensure the
information is accurate and not misleading. The agency will disseminate the findings when
appropriate, strictly following the agency’s “Guidelines for Ensuring the Quality of Information
Disseminated to the Public”.
In order to encourage dissemination of the findings, the report will be freely accessible on
cpsc.gov. The work was prepared in the course of the author's official contracting duties with
CPSC, thus Title 17 U.S.C. Section 105 provides that there can be no copyright in a United
States government publication.
A.17. Rationale for not displaying the expiration date for OMB approval
No such exception is sought. The OMB survey number and expiration date will be displayed on
the initial screener and informed consent forms to be used as a reference if needed.
A.18. Exception to the certification statement
No such exception is sought. These activities comply with the requirements in 5 CFR 1320.9.

B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
B.1. Respondent Universe and Sampling Methods
The proposed survey will include a nationally representative survey of households within the
United States. Eligible respondents must be 18 years of age or older, and be considered one of
the heads of the household. The questions asked in the survey require knowledge regarding
duration of residency, the age of the house and the equipment installed within the house. This
necessitates that the respondent be a person who makes major decisions within the household. A
head of household will be considered a person living or staying in the home in whose name the
house or apartment is owned, being bought, or rented. A probabilistic multistage sampling
approach will be utilized to select sample units for this survey. A probabilistic sampling method
will allow a random selection of units with a calculable probability of selection of each unit in
the target population.
In this three-stage sampling approach, the primary sampling units are metropolitan
areas, the secondary sampling units are US Census Tracts, and the elementary sampling units are
US Households. At the first stage, we considered US metropolitan areas as primary sampling
units (PSUs) and selected a random sample of 24 metropolitan areas among the 389
(https://www2.census.gov/programs-surveys/metro-micro/geographies/reference-files/2015/delineation-files/list1.xls, accessed on 12/04/2017 at 3:04PM). Then, we considered
US Census Tracts as secondary sampling units (SSUs). More precisely, we selected a random
sample of Census tracts in each of the 24 metro areas selected at the first stage, as well as a
random sample of Census tracts in non-metropolitan areas in the proximity of metro areas.
Lastly, we selected a random sample of occupied households through a random walk
methodology in each Census tract selected at the second stage for survey fielding.
The end-objective is to ensure that the ultimate survey sample includes a majority of completes
from housing units within metropolitan areas and a quantifiable minority from non-metro areas.
The respondents’ universe is, therefore, a random sample of US households selected from a
random sample of Census tracts selected from a random sample of US metropolitan areas.
B.1.1 Sampling Frame
The sampling frame will consist of occupied housing units within metropolitan and
non-metropolitan areas. Records from the latest housing surveys from the US Census Bureau will
be used to identify Metropolitan Areas and Census Tracts.
Table 3. Summary of Sampling Stages and Respective Sampling Frames

Sampling Stage   Sampling Unit            Population Size   Sample
#1               US Metropolitan Areas    388               24
#2               US Census Tracts         74,002            192
#3               US Housing Units         118,860,065       1,055

The original sampling methodology utilized an address-based sample (ABS) list of residential
addresses to determine its sample frame. The revised methodology employs a random walk door-to-door knocking strategy to pre-notify and recruit households in each Census tract. As such, any
occupied household within a chosen tract is eligible to participate in the study. Research teams
will select different parts of a tract to focus their recruitment efforts, ensuring a mix of
areas/neighborhoods, household types and demographic characteristics of respondents recruited
to participate.
B.1.2 Sampling Approach
The EurekaFacts team will adopt a proportional multistage sampling approach to select housing units
for the SCOA survey. To ensure that housing units in metropolitan areas as well as those not in
metropolitan areas are included in the survey, we consider the following steps:
1. At the first stage, a random sample of 24 metropolitan areas was selected among the 388
metropolitan areas as primary sampling units. The sample was stratified by Census Region and
then stratified by metro area population size (those with a population of 1 million or more
and those with less than 1 million), ensuring the number of Primary Sampling Units
(PSUs) selected for each region is proportionate to the number of occupied housing units
(OHUs).
2. At the second stage, a random sample of residential census tracts was selected in
proportion to the number of OHUs within each of the 24 metropolitan areas selected at
the first stage. Furthermore, a random sample of additional census tracts within non-metropolitan areas located within the same state as each PSU was selected at this stage.
3. At the third stage, a random walk door-to-door methodology is instituted in each Census
tract so field interviewers can recruit residents of occupied housing units for the in-home
survey (full length and shortened versions). Each tract has a target quota in relation to the
OHUs of other tracts randomly selected.
The end-objective is to ensure that our survey sample has a majority of completes from housing
units within metropolitan areas and a quantifiable minority from non-metro areas. Each tract has
a target quota in relation to the other tracts randomly selected. If needed, a fourth stage of sampling
would involve the selection of additional tracts to either replace or supplement tracts where
recruitment is far below the threshold of the tract’s quota. For example, if a tract has too many
empty units or residents with a language barrier, a new tract would be randomly selected from
tracts of a similar size.
In total, this approach makes it possible to calculate the probability of selection of every sample
unit at every stage, as well as to reliably calculate the design effect, sampling error, etc., and to
infer the findings to the housing units in the US with a calculable level of precision.
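As an illustration of the calculable selection probabilities described above, the minimal Python sketch below multiplies the stage-wise selection probabilities of the three-stage design. It assumes simple random selection at each stage, which simplifies the proportional allocation actually used, and all counts are hypothetical placeholders.

    # Minimal sketch: overall inclusion probability of a housing unit as the
    # product of stage-wise selection probabilities in the three-stage design.
    # Simple random selection is assumed at each stage (a simplification of the
    # proportional allocation described above); all counts are hypothetical.

    def inclusion_probability(metros_selected, metros_total,
                              tracts_selected_in_metro, tracts_in_metro,
                              households_selected_in_tract, ohus_in_tract):
        p_stage1 = metros_selected / metros_total                 # metro area drawn
        p_stage2 = tracts_selected_in_metro / tracts_in_metro     # tract drawn within the metro
        p_stage3 = households_selected_in_tract / ohus_in_tract   # household drawn within the tract
        return p_stage1 * p_stage2 * p_stage3

    pi = inclusion_probability(24, 388, 8, 250, 6, 1800)
    print(f"Inclusion probability: {pi:.6f}; design weight (1/probability): {1 / pi:,.0f}")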
B.1.3 Sample Size
CPSC has available funding for a sample size of 1,185 households. This sample size is large
enough to provide accurate estimates representing the diversity of housing and household types,
attitudes and state laws on smoke alarms and CO alarms across the country. This sample size is
large enough to yield a margin of error of +/-2.83% at the 95% level of confidence for estimates of
the survey results in a simple random sample.
The margins of error provided below are only indicative and based on assumptions of a statistical
power of 80% (the usual default value), a confidence interval of 95%, and a target population size of
116,900,000 units [11]. The actual margin of error that will be provided with the survey results may
differ slightly from the ones shown here, as the actual multistage sampling design will yield a
slightly different margin of error overall. The specific margin of error for multistage sampling
will be calculated when the actual sample is drawn.
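The simple-random-sample margin of error shown in Table 4 below can be reproduced with the following minimal Python sketch, which assumes the most conservative proportion (p = 0.5) and z = 1.96 for 95% confidence, and applies a finite population correction that is negligible at this population size.

    import math

    # Minimal sketch reproducing the simple-random-sample margin of error shown
    # in Table 4. Assumes the most conservative proportion p = 0.5 and z = 1.96
    # for 95% confidence; the finite population correction is negligible here.

    def srs_margin_of_error(n, population, p=0.5, z=1.96):
        fpc = math.sqrt((population - n) / (population - 1))  # finite population correction
        return z * math.sqrt(p * (1 - p) / n) * fpc

    moe = srs_margin_of_error(n=1_055, population=116_900_000)
    print(f"Margin of error (SRS): +/-{moe:.1%}")  # approximately +/-3.0%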
Table 4. Margin of Error by Sample Size

Sample Size (N)                  1,055
Statistical Power                80%
Confidence Interval              95%
Total Occupied Housing Units     116,900,000
Margin of Error (SRS*)           3.0%

*SRS: Simple Random Sample

For a total sample size of 1,055 households, the sample distribution will be:
• Metropolitan Areas: 24
• Census Tracts per Metropolitan Area: proportional to the number of OHUs within each of
the 24 metropolitan areas, for a total of 192 tracts
• Housing Units per Census Tract in Metropolitan Areas: minimum of 2; the quota per tract
will be proportionate to the combined OHUs of the other tracts to reach target completes (see the
illustrative allocation sketch following this list)
• Housing Units in non-Metropolitan Areas: minimum of 2; the quota will be proportionate to
the combined OHUs of the other non-metro areas to reach target completes
• Total Housing Units in Metropolitan Areas: 995
• Total Additional Housing Units in non-Metropolitan Areas: 190
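The illustrative Python sketch below shows one way the per-tract quotas described above could be allocated in proportion to occupied housing unit (OHU) counts, subject to the minimum of 2 completes per tract; the tract names and OHU counts are hypothetical placeholders.

    # Minimal sketch: allocate per-tract completion quotas in proportion to each
    # tract's occupied housing units (OHUs), with a floor of 2 completes per
    # tract. Tract names and OHU counts are hypothetical placeholders.

    def proportional_quotas(ohus_by_tract, total_completes, minimum=2):
        total_ohus = sum(ohus_by_tract.values())
        return {tract: max(minimum, round(total_completes * ohus / total_ohus))
                for tract, ohus in ohus_by_tract.items()}

    example_tracts = {"Tract A": 2400, "Tract B": 1100, "Tract C": 3200, "Tract D": 700}
    print(proportional_quotas(example_tracts, total_completes=40))
    # Rounding and the per-tract floor can shift totals slightly, so the final
    # quotas would be reconciled against the overall completion target.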
B.2. Procedures for the Collection of Information
The original survey methodology employed a mixed-mode, multi-stage approach to data
collection. The combination of mailed pre-notification letters, multiple calls to gauge interest and
screen for eligibility, additional calls to coordinate and confirm the interview date, and finally the
arrival of the survey team at a designated time proved to be too complex a recruitment
procedure. The drop-off in interested participants increased with each step, which hurt overall
response and cooperation rates.
The revised process condenses the timeline between initial contact and potential participation in
the study. Once a Census tract is randomly chosen, random areas of the tract are selected for
field teams to then randomly walk and place door hangers on residences. These door hangers act
as a pre-notification to the household that mentions the study’s purpose, incentive, and
website/phone number to learn more. Additional details from the original pre-notification letter
are also now available online on a publicly available EurekaFacts website. The posting contains
a detailed explanation of the study and its objectives, who the sponsors are, details on how
respondents’ answers and PII will be kept secure, an FAQ, and links to the Federal Register posting.

[11] U.S. Census Bureau, QuickFacts: Population Estimates (2016),
https://www.census.gov/quickfacts/table/PST045216/00

One to three days after respondents have received their pre-notification door hanger notifying
them of the study, field teams will follow the same walking route and knock on doors to recruit
for immediate participation in the study. Field teams will have the pre-notification letter, a copy
of the door hanger, and their official badges visible to provide credibility
to our survey effort and encourage respondents’ cooperation. Overall, this reduces the number of
total contacts made with the public and diminishes burden. This also allows field teams to have
more meaningful, efficient, and prompt interactions with the community.
If someone is interested in participating in the survey, the field team can immediately screen
them at their door. Depending on the status of smoke alarm installation or its type, eligible
respondents will be engaged to participate in either the full-length survey interview or shortened
survey version.
Residents who have a smoke alarm that is not connected to a central or security alarm that will
notify the police or fire department will be eligible for full-length interviews. If, however, they
do not have a smoke alarm, or if they do but it is connected to a central or security alarm, the
respondent will be eligible to participate in the shortened survey version. The survey instrument
will be programmed on Qualtrics and will be administered via in-home interviews using a
Computer Assisted Personal Interview (CAPI) format.
During the in-home interviews, a qualified two-member survey team will ask household
residents questions related to smoke and CO alarms. The full-length interview will include
testing of some (potentially all depending on the number) of their smoke and CO alarms. If any
of the alarms do not work, we will offer to provide a new one to them free of charge. If,
however, residents do not have a smoke alarm, they will receive a shorter version of the
questionnaire. The survey interview will take between 20 and 60 minutes, depending on whether the
shortened version of the survey is administered (about 20 minutes) or the full-length survey and
alarm-testing interview is conducted (no more than 60 minutes). The individual data will not be
identified to a specific person. Any data provided to the client or included in the report will be
delivered in aggregate form.
B.2.1 Statistical Methodology for Stratification and Sample Selection
The survey's sample selection and sampling methodology are discussed earlier in Question 1. A
probabilistic multistage sampling approach will be utilized to select sample units for this survey.
A probabilistic sampling method will allow a random selection of units with a calculable
probability of inclusion in the survey. Considering that there are more than 134 million housing
units in the nation, a simple random sample, a systematic sample, a stratified sample, or a cluster
sample would be too costly and less likely to capture the diversity of smoke alarm and CO alarm
adoption and operability, and of local jurisdictions' laws and regulations. The proposed
proportional multistage sampling approach will consider metropolitan areas as primary sampling
units, residential census tracts within metropolitan areas as secondary sampling units, and
housing units within those residential census tracts as the final sample units. At the first two
stages, simple random sampling will be applied to select units. The third stage utilizes a
random walk methodology as a more practical strategy for gaining completes, yet all households
in a tract are eligible to participate. This approach makes it possible to calculate the probability
of selection of every sample unit at every stage. As such, we will be able to reliably calculate
estimates and sampling error and infer the findings to the housing units in the US with a calculable
level of precision.
The American Community Survey data from the U.S. Census Bureau was used to identify and
quantify the number of U.S. Metropolitan Areas, the number of US Census Tracts overall and
within each metropolitan area, and finally, the number of housing units overall and within each
census tract. Those entities—Metropolitan Areas, Census Tracts and Housing Units—are
considered as defined by “U.S. Census Bureau, Population Division, based on Office of
Management and Budget, July 2015 delineations.”
B.2.2 Estimation Procedure
Estimates will be produced using standard survey estimation procedures. Survey estimates
include estimates of the operability of smoke alarms and CO alarms, estimates of the percentages
of households (as well as subgroups) with smoke alarms and CO alarms installed, and estimates
of the proportions of respondents demonstrating hazard awareness. EurekaFacts will consider and
compare different methods of variance estimation, such as Taylor series approximation or various
replication methods (Jackknife, Balanced Repeated Replication).
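As an illustration of the replication methods under consideration, the following minimal Python sketch computes a delete-one-PSU (JK1) jackknife variance estimate for a weighted proportion; the records shown are hypothetical placeholders, not survey data, and the production estimates would additionally account for the stratified multistage design.

    # Minimal sketch of a delete-one-PSU (JK1) jackknife variance estimate for a
    # weighted proportion. The records below are hypothetical placeholders, not
    # survey data; each tuple is (PSU id, design weight, alarm-operable flag).

    def weighted_proportion(records, exclude=None):
        num = sum(w * y for psu, w, y in records if psu != exclude)
        den = sum(w for psu, w, y in records if psu != exclude)
        return num / den

    def jackknife_variance(records):
        psus = sorted({psu for psu, _, _ in records})
        n = len(psus)
        replicates = [weighted_proportion(records, exclude=psu) for psu in psus]
        mean_rep = sum(replicates) / n
        return (n - 1) / n * sum((rep - mean_rep) ** 2 for rep in replicates)

    records = [(1, 1200, 1), (1, 1100, 0), (2, 900, 1), (2, 950, 1),
               (3, 1300, 0), (3, 1250, 1), (4, 1000, 1), (4, 1050, 1)]
    estimate = weighted_proportion(records)
    std_error = jackknife_variance(records) ** 0.5
    print(f"Weighted proportion: {estimate:.3f}; jackknife SE: {std_error:.3f}")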
B.2.3 Unusual Problems Requiring Specialized Sampling Procedures
We adopted an on-the-ground door-knocking sampling approach over the multi-stage mixed-mode approach originally designed. As previously described, the extremely low response rate
made the methodology impractical for completing the study in a timely and efficient manner.
The new design is ultimately more streamlined and provides a cost-effective, time-efficient, and
flexible strategy for field teams to recruit for in-person research that requires researchers to be in
the participant’s home. The differences in efficiency are demonstrated in the contrast between
the fielding attempts using the original methodology vs. the revised methodology.
Table 5. Comparison of Original and Revised Sampling Designs

ORIGINAL: Address-based sampling, multi-mode recruitment approach
  Fielding dates: Jan. 1 - May 30, 2019 (23 weeks)
  Response rate: 0.09%
  Cooperation rate: 3.0%
  Completes-to-quota ratio: 15% (9 completes / 59 quota)

REVISED: Door-to-door household random walk sampling approach
  Fielding dates: Dec. 21, 2019 - March 1, 2020 (11 weeks)
  Response rate: 3.5%
  Cooperation rate: 17.4%
  Completes-to-quota ratio: 101% (130 completes / 128 quota)

CHANGE
  Total fielding time: -12 weeks
  Response rate: +3.41%
  Cooperation rate: +14.4%
  Completes-to-quota ratio: +86% pts.

Additionally, vacant households are excluded from this survey. The U.S. Census Bureau
estimates that 12% of housing units in the US are vacant. Field teams will skip leaving door
hangers and knocking on residences that are clearly abandoned/vacant to maintain efficiency
during recruitment.
B.3. Methods to Maximize Response Rates and Deal with Non-response
To maximize response rates, EurekaFacts’ approach includes multiple compounding efforts.
First, a pre-notification door hanger will be left on the door of a sample of households in a
randomly selected area of a tract. The door hanger will briefly provide the purpose of the
research, sponsors of the research, a telephone number and a website providing more detailed
information and an FAQ. The door hanger is in public view of all the neighbors, so residents can
see that it is a neighborhood-wide canvass and not just their home that was selected. This allows
time for neighbors to discuss the legitimacy of the study, which builds confidence in residents
when field teams approach their household to recruit them. Dropping a unique piece of literature
followed by researchers at their doors just a few days later is designed to enhance response and
cooperation rates, and thus data quality. EurekaFacts will also create social media posts
advertising the study and target their appearance by ZIP code based on dates field teams will be
in that area.
The two-member survey team will take additional steps to persuade residents to participate in the
study. Both members will present their government-issued IDs and their official badges to
confirm their identity and legitimacy. They will also be wearing high-visibility safety vests, so
their presence is well announced to everyone in the neighborhood. The survey team interviewers
will be trained in refusal conversion techniques to reassure the respondents of the legitimacy of
the team’s presence and gain cooperation. The team will also carry with them a letter printed on
official letterhead with endorsements from the local fire department and/or CPSC, should they be
needed.
In the SCOA survey, two types of non-response may occur. The first type is unit non-response,
which occurs when data is not obtained for the sample unit (i.e., a respondent chooses not to
participate in the survey). The second type is item non-response, which occurs when a
respondent fails to answer one or more of the survey questions. For unit non-responses,
EurekaFacts anticipates a response rate below 80% and will therefore conduct a non-response bias
analysis. We will use household characteristics selected from the American Community
Survey data of the US Census Bureau to assess whether there exists a significant difference
between households that responded to the survey and those that did not. If the non-response
analysis reveals the necessity to weight the survey data, we plan to use a weighting scheme that
makes respondents representative of their respective metropolitan area, as opposed to an overall
weighting scheme. Information about households that do not participate in the study (time of
day, tract, reason for not participating, demographic information about person who answered
door – if applicable) will be captured on a screen-out capture form on the researcher’s tablet.
This will allow for tracking of the total number of residents spoken to, capturing screen-out data,
and analyzing non-response by demographics and reasons given not to participate.
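The metro-level weighting scheme described above can be illustrated with the minimal Python sketch below, in which each responding household in a metropolitan area receives a weight equal to that area's occupied housing units divided by the number of completed interviews there; all names and counts are hypothetical placeholders.

    # Minimal sketch of the metro-level weighting adjustment described above:
    # each responding household in a metropolitan area receives a weight equal
    # to that area's occupied housing units (OHUs) divided by the number of
    # completed interviews there. All counts are hypothetical placeholders.

    def metro_nonresponse_weights(ohus_by_metro, completes_by_metro):
        return {metro: ohus_by_metro[metro] / completes_by_metro[metro]
                for metro in completes_by_metro}

    ohus = {"Metro 1": 850_000, "Metro 2": 420_000, "Metro 3": 1_600_000}
    completes = {"Metro 1": 48, "Metro 2": 39, "Metro 3": 55}
    print(metro_nonresponse_weights(ohus, completes))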
Since the survey is administered by an interviewer, item non-response has a low likelihood.
Nonetheless, a survey entry will be considered complete only when at least 80% of the survey
questions have been answered, skip patterns excluded. Based on the initial non-response analysis
the appropriate approach to handling missing data will be identified, including whether simple or
multiple imputation is required.
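The 80% completeness rule can be illustrated with the minimal Python sketch below, which counts only the items a respondent was eligible to answer (skip patterns excluded); the record structure and question names are hypothetical placeholders.

    # Minimal sketch of the completeness rule described above: a case counts as
    # complete only when at least 80% of the questions the respondent was
    # eligible to answer (i.e., not skipped by design) have a response.
    # The record structure and question names below are hypothetical.

    SKIPPED = "SKIP"   # item not applicable because of a skip pattern
    MISSING = None     # item applicable but left unanswered

    def is_complete(responses, threshold=0.80):
        applicable = [value for value in responses.values() if value != SKIPPED]
        if not applicable:
            return False
        answered = sum(1 for value in applicable if value is not MISSING)
        return answered / len(applicable) >= threshold

    case = {"q1": "Yes", "q2": "No", "q3": SKIPPED, "q4": MISSING, "q5": "2 alarms"}
    print(is_complete(case))  # 3 of 4 applicable items answered (75%) -> False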
B.4. Test of Procedures or Methods to be Undertaken
The current survey content was developed by Vision 20/20 and Tridata, LLC, which specialize in
fire safety and engineering, and reviewed by specialists with expertise in fire and CO safety,
including staff from fire departments, Red Cross, subcontracted engineering firms, and CPSC
staff. EurekaFacts reviewed instruments to ensure adherence to survey design guidelines and
ensure an efficient and accurate data collection process which would minimize overall
administration length and reduce the burden placed both on the respondents and the survey team.
Three survey methodologists and two statisticians reviewed the survey and provided
recommendations for ensuring relevance, clarity, and minimal response burden.
The full survey instrument underwent testing via cognitive interviews with 18 household
residents (OMB Control Number 3041-0136). Cognitive testing was carried out to ensure that
any questions that were misunderstood by respondents or that were difficult to answer would be
improved prior to the survey fielding, and thus increase the overall quality of survey data and the
accuracy of the study results. The respondents for the cognitive interviews were recruited using a
slightly modified version of the survey screener (modified to meet the needs of recruiting for
cognitive interviews). Recruitment was conducted using a mixed-mode methodology that
combined posting on community websites such as Craigslist and telephone outreach to follow up on specific leads generated by the advertising. The interview sessions lasted for approximately
ninety minutes. The cognitive interviews were conducted with two groups of participants: 1)
individuals who report having a smoke alarm that is not connected to a central alarm that may
notify the police or fire department, and 2) individuals who do not have a smoke alarm installed,
or if they do, their smoke alarm is connected to a central or security alarm. Similarly, as planned
during the actual survey fielding, during the cognitive pretesting the first group (with smoke
alarm) was administered the full in-home version of the survey instrument and the second group
(no smoke alarm or central alarm) was administered the telephone version of the survey. The
cognitive interviews examined how well the questions performed and ensured that the material
was clear and easy to understand among potential survey respondents. As part of this effort, the
cognitive testing was designed to assess the question‐response process in terms of respondent’s
comprehension, information retrieval process, judgment as to providing requested information,
and perceived degree of ease or difficulty experienced in formulating accurate/correct responses
to each question posed.
Overall, the cognitive interviews demonstrated that the survey instrument did not pose any
considerable challenges to the respondents. A majority of the survey questions were clear and
easy to understand, and the response categories for multiple-choice questions were relevant. The
testing identified minor misunderstandings of, or different ways of interpreting, a few terms. Based on
the pretesting results, participants’ comments and suggested improvements to the question
wording were identified and implemented. Utilizing this method prior to survey fielding
helped to increase the overall quality of the survey data and the accuracy of the study results.
EurekaFacts continued to adjust and refine the instrument with CPSC to make it appropriate for
fielding. After collecting the first 9 completes using the original methodology, EurekaFacts
evaluated the overall process and proposed the change to the door-to-door random walk
methodology. A non-substantive change request to OMB was submitted by CPSC, and
EurekaFacts adopted the new method with OMB approval.
After the first round of recruitment and data collection using the new methodology, EurekaFacts
found no major issues and continued with the data collection effort. After the first 50 completes
were collected, a brief analysis of selected questions was conducted to ensure data quality and
instrument functionality; no changes were needed. Additionally, an internal debrief was
conducted and lessons learned from those initial interviews were incorporated into the rest of the
data collection effort and highlighted in the pilot report. All data collected will be incorporated
into the overall sample of 1,055 cases.
B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing
Data
The following individuals were consulted on statistical aspects of the design and will collect and/or analyze the data:
Jorge Restrepo (240) 403-1636
Robert Suls (240) 403-1641
Djass Mbangdadji (240) 403-1640
Bohdana Sherehiy, PhD (240) 403-1637
EurekaFacts, LLC
Arthur Lee (301) 987-2008
Matthew Brookman (301) 987-2467
Consumer Product Safety Commission

Appendices

Appendix A: Door Hanger Literature to Distribute to Households

Note that the door card information will slightly change based on the staffing and location
(e.g., partner company mentioned, phone number to contact, and image of interviewer
badges). Use of NFPA logo, mention of CPSC and NFPA as sponsors, and EurekaFacts
name and website will remain consistent.

Appendix B: In-Home Informed Consent Form
INFORMED CONSENT
Consumer Product Safety Commission (CPSC) Survey
on Usage and Functionality of Smoke Detectors and
Carbon Monoxide Detectors in Households
Thank you for your interest in participating in the research study. This study is conducted by EurekaFacts on behalf
of the U.S. Consumer Product Safety Commission (CPSC). We are conducting a nationwide survey on household
fire and carbon monoxide (CO) safety. The purpose of this study is to gather information about the functionality of
smoke detectors and CO detectors in U.S. households by asking a series of survey questions and testing your
household smoke and CO alarms. Findings from this research will help CPSC improve home safety.
The combined survey and testing of smoke alarms and CO alarms will take up to 60 minutes. Our trained
and qualified two-member survey team will ask you questions related to your smoke detectors and CO detectors,
and then inspect these devices in your home. You will receive a $50 gift card from a major card company as a
token of appreciation for completion of the study.
If the survey team finds any detectors to be non-functioning, new detectors and/or batteries can be offered, free
of charge, based on availability. If you are renting your home, the property manager will need to be contacted to
arrange installation of the new detector at a later date. With your permission, we would like to collect non-functioning smoke or CO detectors and send them to CPSC’s lab to find out why they do not work. In
addition, we may request your permission to take a photograph of your smoke and CO detector(s) to study
different alarm types and functionalities.
Information collected from this study will help CPSC to improve household fire and CO safety. Your input will assist
with developing standards and guidelines that will help protect property and human life. This research does not
involve any foreseeable risks.
Your participation in this research study is completely voluntary. If you do not want to continue with the study, you may
stop at any time by notifying a member of the survey team. Your responses will be kept confidential
and will be used for research purposes only. At no time will any identifiable information be linked to any of your
answers. All information collected through our research process is reported to CPSC anonymously.
We ask for your consent to participate in answering questions as part of the survey portion of this study.



I consent
I do not consent

We ask for your consent to participate in the smoke and CO alarm testing portion of this study in your home.



I consent
I do not consent

Your signature below means that you have freely agreed to participate in this research study. You should consent
only if you have read this document and you understand its contents.

Print name

Signature

Date

Appendix C: Waiver, Release and Hold Harmless Agreement

Home Address:

____________________________________________________________

In consideration of the voluntary performance of my participation in the U.S. Consumer Product Safety
Commission (CPSC) Survey on Usage and Functionality of Smoke Detectors and Carbon Monoxide
Detectors in Households, which is being conducted at my residence, located at
______________________, I, on behalf of myself and all members of my family, as well as my heirs, executors, administrators, or successors, hereby waive any claim or cause of action of any nature that I have, or in the future may have, against any and all individual or organizational participants in the CPSC Survey on Usage and Functionality of Smoke Detectors and Carbon Monoxide Detectors in Households, including but not limited to CPSC and EurekaFacts, LLC, its agents or employees, which claim or cause of action grows out of or results from increased levels of carbon monoxide, a fire, or other damage, following the testing and inspection of one or more of the smoke and/or carbon monoxide detectors, in addition to one or more of the following action(s):

1. Replaced batteries
2. Provided new CO detector(s)
3. Provided new smoke detector(s)
4. Collected faulty CO detector(s)
5. Collected faulty smoke detector(s)
6. Took no additional action
7. Obtained photograph of smoke/carbon monoxide detector(s) (device only)

I further hereby agree to release and hold harmless any and all organizational and individual participants, including the [Partner Name] and municipality of [MUNICIPALITY NAME], in the CPSC Survey on Usage and Functionality of Smoke Detectors and Carbon Monoxide Detectors in Households from and against all damages of any kind, to persons or property, growing out of or resulting from a fire or increased levels of carbon monoxide in my referenced home.

I acknowledge having read, understood, and agreed to the above waiver and release.

Print name                         Signature                         Date

Witness (Print name)               Signature                         Date

*This form generally indicates that the occupant or owner of the property agrees to waive his or her rights to sue any individual, any municipality, and any other organizations or individuals involved in the safety inspection of this home if a fire or increased levels of carbon monoxide occurs after the inspection. The purpose of the waiver is to protect the individual or any of the organizations involved against liability arising from the home fire inspection. This statement is intended for information only; the terms of the waiver themselves shall prevail if there are any questions. You should seek advice if you do not understand this waiver.

Appendix D: Thank You Email For Participating in In-Home Interview

<DATE>

IF SENT AS EMAIL - Subject: Thank you for your participation in the CPSC Research Study

Dear <NAME>,

Thank you for participating in the interview survey about smoke and carbon monoxide (CO) safety for the U.S. Consumer Product Safety Commission (CPSC) on <DATE> at