Table Saw Survey Supporting Statement Part B


OMB: 3041-0168


Table Saw Survey: Part B


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


In this study, the goal is to find owners of table saws that were sold with a modular blade guard. In November 2007, the seventh edition of UL 987 was published with requirements for a new modular blade guard that is designed to be adjustable for more cuts, to provide clear visibility of the cutting edge, and to be easily installed and removed without the use of tools. The majority of table saws sold in the United States since 2009 meet the UL 987 seventh edition requirements for the modular blade guard. Though some table saws with the modular blade guard were available before 2009, the proportion was small, and this group was eliminated from the population of interest. Thus, the population of interest can be defined as owners of table saws manufactured in 2009 or later, which ensures that these are owners of table saws that were originally sold with modular blade guards.


CPSC’s advance notice of proposed rulemaking (ANPR) provides the following data on estimated annual shipments of table saws, based on data obtained from the Power Tool Institute (PTI): “According to PTI, estimated annual shipments of table saws have fluctuated widely in recent years. In 2006 and 2007, estimated shipments were 800,000 to 850,000 units. However, estimated shipments declined to 650,000 in 2008, 589,000 in 2009, and 429,000 in 2010.” If each unit shipped corresponds to a household with a table saw of interest, and assuming shipments from 2011 to the present remained near these estimates, then the proportion of U.S. households with a table saw of interest is small and therefore difficult to find in large enough numbers with general population sampling techniques such as random digit dialing (RDD) or address-based sampling (ABS).
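
As a rough illustration of the incidence arithmetic behind this conclusion, the sketch below works through a back-of-envelope estimate. The household count and the assumption that shipments remained near the 2010 level after 2010 are illustrative figures supplied for this example, not data from the ANPR.

```python
# Back-of-envelope incidence estimate (illustrative assumptions, not ANPR data).
# Assumes shipments remained near the 2010 level (~429,000 units/year) from
# 2011 onward and that each unit shipped corresponds to one distinct household.

shipments_by_year = {2009: 589_000, 2010: 429_000}
for year in range(2011, 2017):          # hypothetical 2011-2016 shipments
    shipments_by_year[year] = 429_000

us_households = 118_000_000             # approximate U.S. household count (illustrative)

eligible_households = sum(shipments_by_year.values())
incidence = eligible_households / us_households

print(f"Estimated eligible households: {eligible_households:,}")
print(f"Approximate incidence: {incidence:.1%}")
# Roughly 3 percent: an RDD or ABS screening effort would need to contact on
# the order of 30 households to find one eligible owner, before accounting
# for nonresponse, which is why direct recruitment was chosen instead.
```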


Thus, respondents for this survey will be obtained by recruiting them through various forms of advertisement. This approach does not enable extrapolation to the population and limits conclusions from the analyses to the responding group.


2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection:


Not applicable. Participants will be recruited from a variety of sources where woodworkers can be found.


* Estimation procedure:


No estimation techniques will be applied to this data collection. Results will be tabulated; no extrapolations will be attempted. Any statistical methods applied to the data, such as factor analysis or non-parametric analysis, will be used only to identify patterns within the responding group.


* Degree of accuracy needed for the purpose described in the justification:


Not applicable to this study.


* Unusual problems requiring specialized sampling procedures:


A sampling frame does not exist for this group, nor does an efficient, resource-conscious method exist for creating a sample to which statistical methods could apply, such as ABS or RDD, because of the small sampling fraction within the population. Finding respondents in this small population by random digit dialing is likely to yield few results. As such, recruitment of respondents, rather than statistical selection of respondents, was chosen for this survey, with the knowledge that this limits the generalizability of the results and the statistical inferences that can be drawn from them. A detailed screening survey was created to ensure that the exact group of interest is the group that is surveyed, and a $50 incentive will be provided to those who qualify for and complete the full survey.


The contractor secured by CPSC for recruitment efforts is Westat. Recruitment efforts are broad in an attempt to reach a range of owners of table saws that came with a modular blade guard. The four main strategies are (a) advertising on woodworking websites and in publications, (b) e-mail blasts to targeted panels maintained by a firm such as EMI, (c) advertising in online communities such as Craigslist, and (d) advertising in “Woodworkers Club” stores or in store flyers and newsletters. “Woodworkers Clubs” are stores that have machines and classes for hobbyists. Each of these strategies is discussed below.


The options identified for advertising in woodworking websites and publications are:


Fine Woodworking magazine: This magazine reaches an audience of table saw users; the magazine indicates that its readership spans a range of user levels, including hobbyists and contractors. Discussion threads on the Fine Woodworking magazine website include table saw safety discussions. The magazine would want to review the ad, its purpose, and its sponsor before including it in the magazine, and materials would need to be approved in early August for an October publication.


Popular Woodworking and Wood Magazine are other magazine options: For Popular Woodworking, materials are needed in mid-August for November publication.


Woodworking Club Newsletters: Woodworking clubs will be contacted and asked whether they will post flyers in common areas; if they have newsletters, we will ask to buy advertising space in several of them.

The second recruitment strategy is the use of e-mails to targeted panels maintained by marketing firms. Recruiting and screening through vendor-maintained web panels will allow the contractor to use lists of e-mail addresses drawn from the marketing firms’ sources, which include warranty lists, subscription information, and club affiliations, among others. Payment is made to the marketing firm for each legitimate recruit generated.


The third area of the recruitment strategy is contact with woodworking clubs. The contractor plans to contact woodworking clubs and ask them to post flyers in common areas of the club. The contractor will start by calling club staff using a directory of woodworking clubs identified through the website finewoodworking.com. The contractor will work through the directory and attempt to gain cooperation from 10–15 clubs that agree to post the flyers. The phone call will begin with a discussion of the purpose of the recruitment effort and the study, what is being asked of the club (posting a flyer or flyers in common areas), and the details of the recruitment and interview process. If the club has a newsletter that accepts advertising, the contractor will ask for details about the timing of the next issue, readership, and the cost of advertising, and, if it fits in the schedule, the contractor will include advertising. If the contact agrees to post the flyer, the contractor will send the appropriate number of flyers via first-class mail. Experienced contractor interviewers will staff the contact effort.


The fourth area of the recruitment strategy is advertising in Craigslist community groups. The contractor will place ads in Craigslist communities. Craigslist ads will be placed strategically around the country, focusing on areas not reached by the Woodworkers Club newsletters or by woodworking clubs that have agreed either to include an ad in their newsletter or to post a flyer in their common area. Craigslist ads are organized by geographic community and by interest area (e.g., “Skilled Trade Services,” “For Sale: Tools,” and/or “Jobs: Skilled Trades/Craft”). Table saw users are likely to browse any of these categories, so including the ad in these sections will be useful.


An additional option is to contact people the contractor knows who are table saw users. The contractor plans to call, e-mail, or visit them face-to-face, asking whether they might be interested in participating or would be willing to provide names and contact information for anyone else who might be eligible and interested. While this may help reach 200 respondents for this effort if needed, the strategy is not scalable and would not work well for potential larger efforts in the future.


The contractor will monitor and report on the yield from each of the different strategies and will adjust the strategies, if necessary. The contractor will implement each of the strategies noted above in an order that will maximize the chances of securing the 200 respondents from a range of backgrounds, ages, user types, and levels of internet access, and from different regions around the country. The contractor will “bookend” the timing of the strategies, beginning with the print publications and ending with the Craigslist and other online methods. The strategies may yield more than 200 respondents, but it is not possible to know definitively the yield from each strategy before embarking on them.
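
As an illustration of this kind of yield monitoring, the sketch below tallies qualified recruits by strategy against the 200-respondent target. The strategy labels and counts are hypothetical placeholders supplied for the example, not projections of expected yield.

```python
# Illustrative tally of qualified recruits by strategy against the
# 200-respondent target (strategy labels and counts are hypothetical).
from collections import Counter

TARGET = 200

# Each entry records the strategy that produced one qualified recruit.
recruit_log = (
    ["print_publication"] * 48
    + ["email_panel"] * 62
    + ["woodworking_club"] * 35
    + ["craigslist"] * 41
)

yield_by_strategy = Counter(recruit_log)
total = sum(yield_by_strategy.values())

for strategy, count in yield_by_strategy.most_common():
    print(f"{strategy:20s} {count:4d}  ({count / total:.0%} of recruits)")

print(f"Total qualified recruits: {total} of {TARGET}")
if total < TARGET:
    print(f"Shortfall of {TARGET - total}; later strategies may be expanded.")
```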


* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


This is a one-time survey; data collection will end when 200 respondents have been successfully recruited.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


As discussed in the response to question 2, emphasis was placed on recruiting across geographic locations, age ranges, levels of technology use, and experience levels (to the extent that securing 200 qualified respondents will allow).


Recruitment is the biggest challenge in this survey, and various recruitment techniques will be used to reach the full range of the target population. A $50 incentive will be offered, and advertised, to those who qualify for and complete the full survey; this should pull in more of the target population. Because an incentive is offered, there is a need to weed out respondents who are not actually part of the target population. The advertisements give enough information to recruit the target population of table saw users but leave out certain details of how that group qualifies. The screener survey asks specific questions to ensure that only eligible respondents are recruited for the full survey.
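
A minimal sketch of the kind of eligibility logic such a screener implies is shown below. The field names are hypothetical, and the criteria are taken from the population definition in question 1 (an owner of a table saw manufactured in 2009 or later that came with a modular blade guard); the actual screener instrument may differ.

```python
# Minimal sketch of screener eligibility logic. Field names are hypothetical;
# the criteria come from the population definition in question 1 (a table saw
# manufactured in 2009 or later that was sold with a modular blade guard).

def is_eligible(response: dict) -> bool:
    """Return True if a screener response indicates a member of the target population."""
    owns_table_saw = response.get("owns_table_saw", False)
    year_manufactured = response.get("year_manufactured")       # may be unknown
    came_with_modular_guard = response.get("came_with_modular_guard", False)

    if not owns_table_saw or not came_with_modular_guard:
        return False
    # An unknown manufacture year is treated as ineligible in this sketch;
    # the fielded screener may probe further instead.
    return year_manufactured is not None and year_manufactured >= 2009

# Example: an owner of a 2012 saw that came with a modular blade guard qualifies.
print(is_eligible({"owns_table_saw": True,
                   "year_manufactured": 2012,
                   "came_with_modular_guard": True}))   # True
```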


The contractor secured by CPSC to complete the full survey is EurekaFacts, LLC. Computer-assisted telephone interviewing (CATI) will be used to administer the full survey. The contractor’s in-house call center provides robust capabilities, including a tracking system to manage productivity and compliance with key metrics such as answer time, hold time, call duration, and number of calls by agent, hour, project, etc. This tracking system will be used to monitor interviewers’ performance and to track survey response and completion data.


The contractor will provide professional survey administrators (e.g., interviewers) who will be trained by Home Innovation Research Labs staff in best methods and background of consumer research regarding carpentry and power tools. This training will include a discussion of the overall study objectives, a section-by-section review of the survey instrument, and mock interviews prior to contact with research participants, covering a variety of frequently asked questions and objections raised by individuals when contacted and invited to participate in similar research activities.


When placing calls, the interviewers will closely follow all IRB and OMB specified protocols. This includes a Verbal Consent script which provides respondents with a clear description of the research, confidentiality associated with their participation and response, and any potential risks that may be associated with their participation.


During the interviewing effort, the contractor’s field supervisor and/or field services manager will monitor all interviewing activities at regular intervals. Supervisory personnel will monitor phone interviewers to ensure that they adhere to the approved protocols throughout the entire interview effort. These measures will serve to both ensure protocol compliance, as well as methodological compliance throughout the survey deployment.


Each interviewer’s performance will be monitored as they conduct telephone interviews during their entire first hour of calling on this study and at least two to three times per calling shift thereafter. While listening to the phone interview, the supervisors will be required to fill out a monitoring report which covers all aspects of performance. Interviewers will be evaluated on the following dimensions:

  • The ability to provide participants with relevant information about the study

  • Accuracy

  • Voice quality

  • Timing and conversational quality

  • Consistent adherence to protocol

  • Non-response

Feedback to telephone interviewers will be provided on an ongoing basis; as soon as any issues or opportunities for improvement are detected, the supervisor will immediately provide feedback and take appropriate steps. All monitoring reports will be reviewed by a field services manager after each interviewing shift. In addition to “live” monitoring, each interviewer’s performance will be tracked, using the interviewer ID number assigned to them, from the time they log on to the CATI system until they end their shift. This allows the system to generate records that tabulate productivity indicators such as the number of “dials” per recruiter, the number of completes per interviewer, the average length in minutes of each call, soft and hard refusals, and the percentage of refusal conversions. This monitoring process, along with the computerized reports, will ensure that phone interviews are conducted according to study specifications.
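
As an illustration, a minimal sketch of how such productivity indicators might be tabulated from CATI call records follows. The record fields, outcome codes, and sample values are assumptions for this example, not a description of the contractor's actual system, and refusal conversions (which require linking re-contact attempts) are omitted.

```python
# Minimal sketch: per-interviewer productivity indicators tabulated from CATI
# call records. Field names, outcome codes, and sample values are illustrative
# assumptions; refusal conversions (which require linking re-contacts) are omitted.
from collections import defaultdict

# Hypothetical call records: (interviewer_id, outcome, call_minutes)
calls = [
    ("INT01", "complete", 22.5),
    ("INT01", "soft_refusal", 3.0),
    ("INT01", "no_answer", 0.5),
    ("INT02", "hard_refusal", 2.0),
    ("INT02", "complete", 25.0),
    ("INT02", "soft_refusal", 4.0),
]

stats = defaultdict(lambda: {"dials": 0, "completes": 0, "minutes": 0.0,
                             "soft_refusals": 0, "hard_refusals": 0})

for interviewer, outcome, minutes in calls:
    s = stats[interviewer]
    s["dials"] += 1
    s["minutes"] += minutes
    if outcome == "complete":
        s["completes"] += 1
    elif outcome == "soft_refusal":
        s["soft_refusals"] += 1
    elif outcome == "hard_refusal":
        s["hard_refusals"] += 1

for interviewer, s in sorted(stats.items()):
    avg_minutes = s["minutes"] / s["dials"]
    print(f"{interviewer}: dials={s['dials']}, completes={s['completes']}, "
          f"avg_call_min={avg_minutes:.1f}, soft_refusals={s['soft_refusals']}, "
          f"hard_refusals={s['hard_refusals']}")
```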


Survey response rates will be rigorously monitored throughout the field period. Disposition tables showing the outcomes of all contact and re-contact attempts will be prepared on a daily basis to ensure that the dialing procedure is optimal and that the survey is progressing according to schedule.


After the first set of survey interviews is completed (n=50), the results will be processed and analyzed to identify any issues with the survey instrument and data collection procedures. If any issues are identified, the contractor will propose changes to the Survey Work Plan in writing to CPSC. Recruitment efforts will be placed “on hold” to accommodate this effort. Demographics, among other variables, will be summarized to determine whether the recruitment efforts are drawing a range of respondents.


Results will be tabulated and used as anecdotal evidence, which will be added to product testing results, subject matter input analysis, and other study data. The data collected, from this survey and from all other sources, will be used by staff to develop a rule that will protect consumers from injury while using table saws. See the Justification section of Part A for further details on why these data are needed. Results from this survey will be used only as they reflect the use of the modular blade guard and the other information collected for those who respond. At no time will the data be used without the caveat of the limitations in what they can and do represent.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


The full survey underwent testing via cognitive interviews with nine respondents. Cognitive testing was carried out to ensure that any questions that were misunderstood by respondents, or that were difficult to answer, would be improved prior to fielding the survey, thereby increasing the overall quality of the survey data and the accuracy of the study results. In evaluating a question’s performance, the cognitive testing was designed to examine the question response process in terms of the respondent’s comprehension, information retrieval, judgment about providing the requested information, and perceived degree of ease or difficulty in formulating accurate responses to each question posed.


The respondents for the cognitive interviews were recruited using a slightly modified version of the recruitment survey (modified to meet the needs of recruiting for cognitive interviews). The interview sessions lasted ninety minutes. During the sessions, interviewers explained the think-aloud process, conducted a think-aloud practice exercise, and then asked a series of verbal questions. During the cognitive interviews, interviewers used a structured protocol to conduct a one-on-one interview using two methods: think-aloud interviewing and verbal probing. In think-aloud interviewing, respondents are explicitly instructed to “think aloud” (i.e., describe what they are thinking) as they answer the questions. Because the survey will ultimately be fielded over the telephone, the interviewer read the questions to the respondents to simulate the actual fielding experience as closely as possible. The probing questions were predefined and provided in the interview protocol. These probes were developed to test respondents’ understanding of the question intent, their understanding of specific terms used in the questions, and the relevance of the response categories for multiple-choice questions. At the end of the session, participants were briefly interviewed about their overall impressions of the survey items. The participants were thanked, remunerated, and asked to sign a receipt for their incentive payment.


The results of the cognitive interviews are as follows:

Overall, the survey instrument did not pose any considerable challenges to the respondents. The cognitive interviews showed that the majority of the survey questions were clear and easy to understand, and that the response categories for multiple-choice questions were relevant. In addition, the questions asking about socially undesirable behaviors that were considered potentially sensitive (e.g., removal of the blade guard) did not cause any discomfort among respondents; on the contrary, respondents considered them justifiable and important questions. However, the testing identified issues with respondents’ understanding of some of the terms used in the questions, which are discussed below.


Some terminology in the survey was new to the respondents, including the terms “cabinet saw” and “modular blade guard.” The respondents were still able to answer the questions concerning these terms accurately, though additional explanations of the terms were added to the questions based on the cognitive interview results.


Areas of difficulty identified in the cognitive interviews were corrected to improve understanding of the questions or of the answer choices presented. Modifications were made to the answer choices for the question concerning when the respondent replaces the blade guard after it has been removed. Clarifications were made to the question comparing the traditional blade guard experience to the modular blade guard experience. Wording changes to the question and answer choices were implemented for the question concerning general opinions about the modular blade guard’s performance in woodworking scenarios.


With a limited number of cognitive interviews, there is still a need to confirm that respondents understand the questions and can answer them accurately. As part of the data collection contract, the results will be analyzed after 50 full surveys are completed to ensure there are no problems with the survey. Adjustments will be made, if necessary, based on the results of this analysis.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Sarah Garland, Ph.D.

Sr. Mathematical Statistician

US CPSC

Directorate of Epidemiology

Division of Hazard Analysis

[email protected]

301.504.7331


Recruiting and Screening Survey:

Regina Yudd, PhD
Senior Research Analyst, Westat
1650 Research Blvd.
Rockville, MD 20850

[email protected]
301.738.3510


Full Survey and Analysis:

Bohdana Sherehiy

Human Factors Research Manager

EurekaFacts, LLC

451 Hungerford Drive, Suite 515

Rockville, MD 20850

[email protected]

240.403.4800, x202

