ICR-2494.01 Part B 07-02-14


Survey of the Public and Commercial Building Industry

OMB: 2070-0193


PART B OF THE SUPPORTING STATEMENT


1. QUESTIONNAIRE OBJECTIVES, KEY VARIABLES, AND OTHER PRELIMINARIES


1(a) Questionnaire Objectives


As part of its investigation into lead exposure resulting from RRP activities in P&CBs, EPA seeks to obtain a better understanding of the incidence of different types of RRP activities that disturb lead-based paint in P&CBs, the methods that are used to conduct these activities, the work practices that are used to contain and clean the resulting dust, and the characteristics of the buildings. EPA will use this information to estimate the resulting exposures to lead dust. If EPA determines that a regulation is needed, the Agency will use this data to assess the incremental benefits and costs of potential options to reduce such exposures.


EPA’s primary objectives are to use the survey data to:


  1. Gather information on building and activity patterns that may affect exposures to lead dust from RRP activities in P&CBs;

  2. Determine the number of firms that perform RRP activities in P&CBs;

  3. Determine the types and numbers of RRP activities that are performed;

  4. Determine the extent to which various work practices that disturb painted surfaces are currently being used in RRP jobs in P&CBs; and

  5. Determine the extent to which various work practices that help with the containment and cleanup of lead dust are currently being used in RRP jobs performed in P&CBs.


Secondary objectives of the survey include:


  1. Collecting information on how the proportion of jobs that use particular practices differs between different types of firms; and

  2. Collecting information on the percentage of contractors who already have training and certification under EPA’s residential RRP Program.


1(b) Key Variables


The key information to be collected includes the following:


  1. Respondent Identification


    • These questions will be used to identify the individual at the firm with the knowledge required to complete the survey.


  2. Screening and Scope


    • These questions will be used to determine whether a respondent performs potentially regulated RRP activities in P&CBs. Only those who perform RRP activities in P&CBs will be asked to complete the full survey. Responses from the screening and scope questions will be used to estimate the universe of firms conducting RRP activities in P&CBs.



  3. Firm Type and Size


    • Responses from these questions will be used to assign the appropriate sampling weights.


  4. Total Number of Activities


    • The survey includes 11 categories of RRP activities that disturb painted surfaces. For each of these 11 activity categories, the survey asks two questions: (1) the total number of jobs and (2) the total number of jobs where more than a small area of paint (equivalent to the minor maintenance exception in the RRP rule) is disturbed by the respondent firm’s employees.


    • In addition, contractor respondents who report painting activities will be asked about the number of jobs in the past year where they used one of the following paint removal techniques: (1) open flame or torch, (2) high-temperature heat gun, (3) abrasive blasting (exterior only), and (4) needle gun/scaler. In general, the survey asks detailed questions about activity types (including paint removal techniques) and work practices only for the most recent job a respondent has completed. Responding about the most recent job should be easier and less burdensome for respondents than being asked about all of the jobs they have conducted in the past year, given the difficulty of recalling the details of all of those jobs. Moreover, given the variation in the jobs a firm may conduct (including job type, job size, and building type), it might be difficult for a respondent to characterize their “typical” work practices. The four paint removal techniques described above are the exception to asking only about the most recent job because these techniques potentially generate high levels of lead dust and are believed to be uncommon enough that information about them might not be captured if the survey asked about them only in the context of the respondent’s most recent job.


  5. Recent Job: Job Size, Job Duration, LBP Testing, Access to Work Area


    • Each respondent will be randomly selected for a more detailed series of questions about a single RRP activity type that they perform (such as removing or replacing painted building components, cutting or making holes in painted surfaces, or surface preparation for painting). Once the activity type is selected, the respondent will be prompted to recall the most recent job of that type performed in a building where either a test indicated the presence of LBP or the surfaces disturbed might contain paint applied before 1978. If no recent jobs meet these criteria, the respondent will be asked about the most recent job of that type.


    • Respondents will be asked to characterize the size and type of building the work was performed in. This information will allow EPA to estimate how much paint might be disturbed and the number of people potentially exposed to lead dust generated by the RRP activity.



    • Respondents being asked about a recent painting job will be asked how long the paint preparation took, how many workers performed this work, the percentage of the painted surface disturbed, and the average number of layers of paint removed during the paint preparation. This information will be used to estimate how much paint was removed.


    • Respondents being asked about a recent window or exterior door job will be asked how many windows or doors were replaced. This will be used to estimate how much paint was disturbed.


    • All respondents will be asked about: (1) the total duration of the job, (2) the times of day the work was done, and (3) access to the work area by building occupants. Responses to these questions will be used in estimating the exposure to lead dust that building occupants might experience.


  6. Job Combinations


    • For the most recent RRP activity a respondent is asked about, he or she will be asked to report what other RRP activities were performed at the same time (such as replacing windows and performing surface preparation before repainting walls as part of the same job). The responses to these questions will be used to ensure that the estimates of the number of activities are not based on double counting, and to estimate the total lead dust generated from jobs composed of multiple RRP activities.


  7. Containment


    • There are seven questions related to dust containment practices. Responses to these questions will be used to estimate the extent to which firms are already using practices to contain dust from RRP activities.


  8. Cleanup


    • Responses to these questions will be used to estimate the extent to which firms are already using practices to clean up dust following RRP activities.


  9. Paint Preparation and Removal Techniques


    • Respondents being asked about a recent job involving paint preparation or removal will be asked about the specific techniques they used.


  10. Routine Cleaning


    • Respondents will be asked how often most areas of the building receive routine janitorial cleaning. This information will be used to estimate changes in exposures to lead dust over time following an RRP activity.





  11. Motivation for Lead-Safe Work Practices


    • Respondents will be asked what one or two factors usually influence the level of dust containment and cleaning for a given job in P&CBs.


  12. Baseline RRP Certification


    • Respondents will be asked whether they are already certified under the residential RRP rule. This is asked because being certified for residential work may affect firms’ baseline work practices in P&CBs, and to estimate the potential costs if uncertified firms are required to become certified in order to perform RRP work in P&CBs.


  13. Open Ended


    • Respondents will be given the option to provide additional comments or information.


1(c) Statistical Approach


This ICR will allow the Agency to produce statistically valid estimates for the population of RRP firms and jobs that disturb paint in P&CBs. Without the proposed information collection, these estimates would have to be derived from assumptions, anecdotal information, or convenience samples, rather than from parameters estimated using a probability sample designed to yield nationally representative estimates. Therefore, the Agency has chosen a statistical approach for this ICR.


The Agency will be assisted in this survey effort by Abt Associates, an independent contractor, who will be responsible for developing the survey response database; identifying the survey sample; overseeing the conduct of the survey and creating a cleaned survey data file of the survey results; weighting, tabulating and analyzing data; and reporting results. Abt Associates has extensive experience with sample design, survey methods, internet surveys, telephone data collection using CATI, data editing and cleaning, and calculation of sampling weights.


1(d) Feasibility


A knowledgeable person at the respondent firm should be able to complete the survey. To make respondent recall easier, the more detailed questions about the use of work practices are asked only about a recent job; reviewing or consulting company records is not necessary. EPA conducted a pretest of the contractor survey with six respondents and made some revisions to the questionnaire based on the insights gained from the pretest. EPA has planned for and allocated resources for the efficient and effective management of the information collection.







2. QUESTIONNAIRE DESIGN FOR THE STATISTICAL APPROACH


2(a) Target Population and Coverage


The target population consists of three groups of establishments described in Table B2.1.


Table B2.1: Target Population

                 | Group 1            | Group 2                            | Group 3
Substantive      | Contractors        | Nonresidential Lessors, Property   | Building Occupants
description      |                    | Managers and Facility Support      |
                 |                    | Services                           |
NAICS codes      | 2361, 2362, 2381,  | 53112, 53131, 5612                 | All other NAICS codes
                 | 2382, 2383, 2389   |                                    |
1-4 employees    | 1,328,230 (83.2%)  | 150,725 (89.9%)                    | 12,561,214 (75.6%)
5-9 employees    | 142,486 (8.9%)     | 8,969 (5.4%)                       | 1,577,814 (9.5%)
10-49 employees  | 106,645 (6.7%)     | 5,814 (3.5%)                       | 1,390,439 (8.4%)
50+ employees    | 13,432 (0.8%)      | 1,327 (0.8%)                       | 371,500 (2.2%)
Unknown size     | 4,802 (0.3%)       | 866 (0.5%)                         | 711,991 (4.3%)
Total            | 1,595,595 (100%)   | 167,701 (100%)                     | 16,612,958 (100%)

Note: frame counts provided by sample vendor (SSI) using a Dun & Bradstreet database.

The incidence of RRP work disturbing lead-based paint is expected to differ markedly among the three groups, hence the rationale for stratification. In addition, the larger companies, while representing fewer establishments in the universe, account for a disproportionately large share of employment, and hence of the number of jobs performed and of the total amount of lead-based paint they are likely to disturb (all other things being equal). Thus, the larger firms need to be oversampled relative to mid-size and small firms. The larger establishments in Group 3 are also more likely to have their own maintenance departments that could be conducting RRP jobs disturbing lead-based paint, while the smaller firms may be more likely to subcontract such jobs to establishments in Groups 1 and 2. While over 99% of the entities in the U.S. are small, small entities are expected to represent roughly 97% of the survey respondents.


From this population of establishments, EPA will select a representative sample large enough to support nationally representative estimates of the number of affected firms, number of affected employees, and the number of RRP jobs in which lead-based paint is disturbed.


2(b) Sample Design


At this stage, little information is available except for the frame counts to guide the sample design. The information that EPA needs to continue analyzing RRP activities in P&CBs refers to three different populations:

  1. The population of establishments, to estimate the total number of affected establishments, necessary to estimate the potential certification costs.

  2. The population of contractor, property management and building maintenance workers to estimate the total number of workers affected, necessary to estimate potential training costs.

  3. The population of RRP jobs, to estimate the total number of jobs disturbing lead-based paint, necessary to estimate potential work practice costs and public exposure to lead dust generated by disturbing lead-based paint.


For the first of the above population tasks, a simple random sample (SRS) of establishments is the optimal design for estimating the incidence of RRP work. This can be used to construct a ratio estimator of the total number of firms across the nation conducting such jobs. An SRS is conceptually easy to execute and does not require any additional information.
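For illustration only, the expansion/ratio estimate described above can be sketched as follows. This is not EPA's production estimator; the frame size is the Group 1 total from Table B2.1, and the screening counts are hypothetical (chosen to match the 20% assumed incidence in Table B2.2):

```python
# Illustrative sketch only (not EPA's production estimator): an SRS-based
# expansion estimate of the total number of firms conducting RRP work.

def estimate_total_firms(frame_size, n_screened, n_eligible, z=1.96):
    """Expansion estimate of eligible firms, with a normal-theory interval."""
    p_hat = n_eligible / n_screened              # estimated incidence of RRP work
    total_hat = frame_size * p_hat               # estimated total eligible firms
    se_p = (p_hat * (1 - p_hat) / n_screened) ** 0.5
    interval = (frame_size * (p_hat - z * se_p),
                frame_size * (p_hat + z * se_p))
    return total_hat, interval

# Hypothetical: 1,645 Group 1 screeners, 329 found eligible (20% incidence).
total, (lo, hi) = estimate_total_firms(1_595_595, 1_645, 329)
```

Because incidence is estimated from the screener and multiplied by the frame count, the precision of the national total is driven entirely by the number of establishments screened.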


For population task 2 above, the optimal design is a sample drawn proportional to the number of workers employed in RRP jobs. However, such a sample is not feasible to obtain because the number of RRP workers cannot be identified on the DMI frame. It can only be assumed that a sizeable proportion of the workers in Group 1 establishments (contractors) are likely participants in RRP projects. For Group 2, and especially Group 3, maintenance workers will constitute a small fraction (or none) of an establishment's total employment.


Finally, for population task 3, the optimal design would be one that is proportional to the number of jobs performed by a given establishment. Again, such a sample is not feasible because the number of RRP jobs is also not available on the DMI frame.


In order to proceed with the development of a possible sample design that provides some level of utility, assumptions have to be made regarding:

  1. the incidence of RRP jobs in the population or its segments; and

  2. the relation between the frame information (number of employees) and the target information (number of RRP workers; number of jobs performed by an establishment).

To devise an efficient sampling scheme, it is assumed that establishment size (number of workers) and the number of RRP jobs performed are positively related. The design assumptions made regarding the population parameters are shown in the top half of Table B2.2. The parameters of these distributions are obtained from U.S. Census data.


All of these assumptions regarding cooperation and completion will need to be verified in the field. After the first 120 cases are completed, the design assumption parameters will be evaluated and adjustments will be made as needed.


The bottom half of Table B2.2 shows the proposed design to generate the estimates for the three Groups listed above. Given the total sample size, the sample sizes per employee size category are driven by attempting to obtain the maximum precision within each size category.


Table B2.2: Design Assumptions and Proposed Allocations

                                          | Group 1      | Group 2        | Group 3
                                          | Contractors  | Lessors and    | Building
                                          |              | Property Mgrs. | Occupants
Design assumptions*                       |              |                |
  Incidence of RRP in P&CBs               | 20%          | 10%            | 5%
  Fraction of employees in RRP jobs       | 90%          | 40%            | 5%
  Avg. number of jobs/year                | 100          | 50             | 10
Proposed design assumptions and estimates |              |                |
  Allocation proportions for screeners    | 19.4%        | 24.0%          | 56.6%
    (Total = 100%)                        |              |                |
  Total no. of screeners (Total = 8,485)  | 1,645        | 2,040          | 4,800
  Screeners in 0-4 employees              | 745          | 935            | 1,513
  Screeners in 5-9 employees              | 369          | 374            | 860
  Screeners in 10-49 employees            | 317          | 565            | 1,686
  Screeners in 50+ employees              | 214          | 166            | 742
  Completes (Total = 402)                 | 254          | 68             | 80
  Margin of error (0.5 base)              | ± 6.1%       | ± 11.9%        | ± 11.0%

*Based on professional judgment

Unlike in many other surveys, where eligibility is at or close to 100%, most of the firms contacted are not expected to be eligible to participate in this survey. Many contractors work only in residences and do not work in P&CBs. Similarly, many lessors, property managers, and building occupants hire contractors to perform RRP activities instead of using their own staff. However, information on whether an establishment performs RRP activities in P&CBs is not available in the Dun & Bradstreet database or through any other readily accessible source. Therefore, it is not feasible to design a random sampling method for the survey without contacting a large number of establishments and screening them through the questionnaire. The questionnaire has therefore been designed to quickly screen out respondents who do not perform RRP activities in P&CBs, in order to reduce the burden of locating eligible respondents.


Adaptive sample design


It is realistic to expect that the assumptions used to obtain the allocations in Table B2.2 may differ substantially from what is observed in practice. For this reason, there will be an initial, smaller sample for the purpose of providing more informed parameter estimates for the rest of the study sample. Once a sample size of 10 full completes is achieved in each NAICS-group-by-establishment-size cell, the assumed parameters in Table B2.2 will be re-estimated. This initial sample would thus total 120 establishments (10 completes × 12 strata = 120) out of the total study sample size of 402 establishments; for Group 1 alone, it would be 40 establishments (10 completes × 4 strata = 40). Based on these initial findings, EPA can re-design the balance of the sample and adaptively optimize the sample allocations to ensure the best possible approximation of the study estimates. (If substantial efficiency gains can be expected from a different size stratification scheme, the establishment size categories can be redefined.)
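As a purely hypothetical illustration of the re-allocation step, the remaining interviews for one group could be re-distributed across its size strata using Neyman allocation, with incidence estimates observed in the initial completes standing in for the assumed parameters. The stratum counts below are the Group 1 frame counts from Table B2.1; the incidence values are invented for the example, and Neyman allocation is one possible optimization rule, not a commitment in the study plan:

```python
from math import sqrt

def neyman_allocation(strata, n_remaining):
    """Allocate n_remaining interviews across strata in proportion to
    N_h * sqrt(p_h * (1 - p_h)) -- the Neyman rule for estimating a proportion.

    strata maps a stratum label to (N_h, p_h): frame count and estimated
    incidence of RRP work in P&CBs (p_h values here are hypothetical)."""
    weights = {h: N * sqrt(p * (1 - p)) for h, (N, p) in strata.items()}
    total = sum(weights.values())
    return {h: round(n_remaining * w / total) for h, w in weights.items()}

# Group 1 frame counts (Table B2.1) with invented observed incidences.
strata = {
    "1-4":   (1_328_230, 0.15),
    "5-9":   (142_486, 0.25),
    "10-49": (106_645, 0.30),
    "50+":   (13_432, 0.40),
}
# 254 planned Group 1 completes minus the 40 initial completes leaves 214.
alloc = neyman_allocation(strata, 214)
```

In this sketch the allocation is dominated by the 1-4 employee stratum simply because it dwarfs the others on the frame; the actual re-design would also weigh cost and the job-count objectives, not incidence alone.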


Adaptive sample design has long been standard practice in many survey organizations and federal statistical agencies. Groves and Heeringa (2006) provide a review of current approaches to using process paradata to intervene in ongoing data collection with the aim of improving coverage or response rates in specific subgroups. The idea of “adaptive” or “responsive” sample design is actually much older: using incoming information on population parameters that were poorly known at the design stage has often been a practical, corrective necessity in well-managed and monitored data collection operations over past decades.


(i) Sampling Frame


The Dun & Bradstreet “Dun’s Market Identifiers” (DMI) file will serve as the sampling frame for this establishment survey. The DMI is considered to be the most comprehensive commercially available list of U.S. businesses. Critically, the DMI includes the industry and establishment size information necessary for stratifying the sample. Additionally, the sample provider, Survey Sampling, Inc. (SSI), can provide supplementary information, such as the name, email address, and phone number of a named executive for a specified managerial job title.


An establishment will be defined as the business located at a particular address or location. Eligibility for the screener will be determined by verifying that the establishment is still in business. Data will be collected with respect to this location, even if the firm has other locations. The unit of analysis is the worksite, defined as a “single physical location where business is conducted or where services or industrial operations are performed.” The headquarters and other purely administrative sites of the larger contractors, nonresidential lessors, property managers, and facility support services firms will be removed at the sampling stage using SSI corporate linkage information.

(ii) Sample Size


It is anticipated that 8,485 establishments will be screened in order to achieve a desired total number of 402 full completes. For Group 1, 1,645 establishments would be screened to identify and complete 254 interviews. This sample size was selected in order to achieve the precision targets shown in the bottom row of Table B2.2 (6.1% for Group 1 and between 11.0% and 11.9% for the other two Groups).

(iii) Stratification Variables

A key requirement of a successful establishment sample is that it produce reliable estimates by establishment size and industry grouping. The most efficient way to accomplish this goal is to stratify the DMI sampling frame by the cross-classification of establishment size and industry. The initial proposed stratification by industry, which reflects the extent to which companies in these NAICS groups are expected to be involved in RRP jobs disturbing lead-based paint and the extent to which incidence and the number of jobs are expected to vary with establishment size, is given in Table B2.2.


Establishments will be selected with equal probabilities within each of the 12 strata defined by the establishment size and industry classification groups. Larger establishments will be sampled at a higher rate to ensure that enough large establishments are available for the analysis. It is expected that these larger establishments will account for a disproportionately high fraction of the total RRP jobs. The employer survey weight will correct for the oversampling of large establishments by weighting by the inverse of the probability of selection. The weights will also correct for the expected non-response by using the standard non-response adjustment techniques (non-response cell adjustments or propensity score modeling). Finally, the weights will be normalized to agree with the distribution of the size of establishments as indicated by the Census distribution via the appropriate weight calibration (raking) procedures. This will yield approximately unbiased estimates of establishments in the DMI frame and, assuming it provides adequate coverage, in the United States.
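The base-weighting and non-response adjustment steps described above can be sketched as follows. This is a simplified illustration with hypothetical counts, not the production weighting procedure (which would also include the raking/calibration step):

```python
# Simplified weighting sketch; counts are hypothetical and the production
# procedure would also apply raking/calibration to Census size distributions.

def base_weight(N_h, n_h):
    """Inverse of the selection probability within a stratum."""
    return N_h / n_h

def nonresponse_adjust(weights, responded):
    """Within a weighting cell, scale respondent weights so that the
    respondents carry the full weight of all sampled cases in the cell."""
    total = sum(weights)
    resp_total = sum(w for w, r in zip(weights, responded) if r)
    factor = total / resp_total
    return [w * factor if r else 0.0 for w, r in zip(weights, responded)]

# Hypothetical stratum: 13,432 establishments on the frame, 214 sampled,
# 150 responding (counts chosen for illustration only).
w0 = base_weight(13_432, 214)
adjusted = nonresponse_adjust([w0] * 214, [True] * 150 + [False] * 64)
# The adjusted weights of the 150 respondents sum back to the frame count.
```

The adjustment preserves the weighted total within each cell, which is what makes the within-cell "respondents resemble non-respondents" assumption the operative one.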


(iv) Sampling Method

Establishments will be selected using the method of stratified random sampling without replacement.


2(c) Precision Requirements


(i) Precision Targets


EPA’s survey has been designed to obtain estimates that have margins of error of 6.1% for Group 1 and between 11.0% and 11.9% for the other two Groups. EPA feels that these precision rates will be adequate to characterize the number of regulated activities and the extent to which renovators currently use certain containment and cleaning practices.
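The stated margins of error follow directly from the planned numbers of completes under the conservative p = 0.5 base with a normal-approximation interval. A 95% confidence level (z = 1.96) is assumed in this check, since the supporting statement does not state the level explicitly:

```python
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a normal-approximation interval for a proportion."""
    return z * sqrt(p * (1 - p) / n)

moe_g1 = margin_of_error(254)   # Group 1: 254 completes
moe_g2 = margin_of_error(68)    # Group 2: 68 completes
moe_g3 = margin_of_error(80)    # Group 3: 80 completes
# These reproduce the 6.1%, 11.9%, and 11.0% figures in Table B2.2.
```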


(ii) Non-sampling Errors


An expected source of non-sampling error for this study is non-response bias, i.e., the possibility that non-respondents differ from respondents. Non-response is best addressed at the design stage of a survey rather than after the data have been collected, so the survey design itself incorporates features intended to minimize the incidence of non-response. Finally, the data collected in the survey and the follow-up data will be analyzed to determine the extent to which non-response may bias the results. This non-response plan summarizes the approach to dealing with two forms of non-response: unit non-response (i.e., when a survey questionnaire is not completed by a sampled establishment) and item non-response (i.e., when a survey questionnaire is finished, but some data elements are missing).


There are a number of reasons why selected respondents may not respond to the survey. The four major reasons for potential non-response are likely to be:


  1. Mistrust of regulatory agencies – Some individuals contacted may not respond because they have an inherent mistrust of regulatory agencies and a concern about the actions that they think EPA may take based on the data they provide in the survey.

  2. Sensitivity to disclosing technical data – Some individuals contacted may be concerned about disclosing their work practices and data, which may result in their failing to respond.

  3. Burden – Individuals contacted for the survey have limited time, so some non-response may occur because the contact person does not have, or will not make, the time to respond.

  4. Questions unclear – If the questions in the survey are unclear, then the individual receiving the survey may decline to respond.


The survey instruments have been designed to reduce the number of non-respondents while still gathering the information needed for the Agency’s analysis and decision making. Respondents have the option to either complete the survey online or over the phone. The requested information should be readily available to a knowledgeable individual at the firm, not requiring any additional research on their part. The survey asks detailed technical questions only about the respondent’s most recent job to make recall easier.


A multi-staged respondent contact process will be used to reduce the number of initial non-respondents, and to follow up with initial non-respondents in order to convert them to respondents.


Stage 1 – Notification Letter. EPA will send a letter on EPA letterhead notifying firms in the sample that the survey is taking place and telling them that EPA’s contractor will contact them by telephone to conduct the survey. The letter will be short and will describe the type of data that EPA is collecting, explain why EPA is collecting the data, note that the identity of responding firms will be kept anonymous, and express EPA’s appreciation for their participation in the survey. The letter will also include an address for the optional online survey.


Stage 2 – Technical Contact Identification Call. EPA’s contractor will call all businesses in the sample to determine the identity and contact information for a knowledgeable person who can complete the survey. During the call, the interviewer will reiterate the purpose of the survey.


Stage 3 – EPA contractor conducts interview on phone. This can be the same call as in Stage 2 or a subsequent appointment time. If the interview cannot be completed by telephone, an email will be sent to the key informant of the eligible businesses with a link to the online survey.


Stage 4 – Reminder Notices. EPA’s contractor will send up to two reminder emails (as needed) to recipients who have provided an email address but have not completed their online survey after the Stage 3 call. If needed, EPA’s contractor will also make a reminder phone call. If the reminder call reaches the contact person, an offer to complete the survey with a telephone interview will again be made.

All interviewers will be trained to identify and avert potential refusals and attempt to convert non-respondents who refused. Interviewers will record information about refusals, which may facilitate subsequent interview attempts if refusal conversion is deemed possible. The combination of a well-designed and well-executed survey with a highly capable staff of interviewers should minimize the magnitude of non-response and ensure the reliability of the survey results.


Despite efforts to design effective survey instruments, however, some level of non-response is expected. As a final stage in the non-response plan, the data will be analyzed for potential non-response biases. To assess the possibility of non-response bias, the characteristics of respondents and non-respondents will be examined in terms of size, geography, type of firm, etc., to determine whether there could be any significant differences in responses between the two groups. A common procedure in surveys to reduce the bias because of non-response is to adjust the sampling weights of respondents to account for non-respondents after forming weighting classes. The assumption is that respondents and non-respondents within a weighting class are similar. This is a more reasonable assumption than assuming that the total sample of respondents is similar to non-respondents.


For example, if response rates differ by size of firm and the percentages of interest (i.e., how respondents answered a question) also differ by size of firm, then size groups can be formed as weighting classes. Within each size group, the weights of respondents can be adjusted to account for non-respondents. There will be a reduction in the bias of the estimate if there is reason to believe that respondents and non-respondents are similar within a size group. It is worth noting, however, that the weighting approach may be only partially successful, since some of the responses may not be closely correlated with firm size or other available characteristics.
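The weighting-class logic can be made concrete with a small numeric sketch (all numbers hypothetical): when response rates and answers both differ by firm size, adjusting within size classes recovers the estimate implied by the full sampled counts, while pooling respondents without adjustment tilts the estimate toward the better-responding class.

```python
# Hypothetical weighting-class illustration.  Small firms respond at a
# higher rate than large firms, and the two classes answer differently.
classes = {
    # size class: (sampled, responded, share of respondents answering "yes")
    "small": (600, 420, 0.30),   # 70% response rate
    "large": (200, 80, 0.60),    # 40% response rate
}

def adjusted_estimate(classes):
    """Weight each class's respondents up to its sampled count."""
    num = sum(n_samp * p for n_samp, _, p in classes.values())
    den = sum(n_samp for n_samp, _, _ in classes.values())
    return num / den

def unadjusted_estimate(classes):
    """Pool respondents with no class adjustment."""
    num = sum(n_resp * p for _, n_resp, p in classes.values())
    den = sum(n_resp for _, n_resp, _ in classes.values())
    return num / den
```

Here the adjusted estimate is 0.375 while the pooled estimate is 0.348, because the pooled figure over-represents the small firms, which responded at a higher rate.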


A second potential source of non-sampling error is measurement error. If respondents have difficulty interpreting a question, they may provide inconsistent answers, leading to inaccurate responses. Information provided from memory may also be inaccurate. The survey includes instructions for phone interviewers to help them clarify any concepts that are challenging and the online version of the survey instrument will include text boxes that include these clarifications. In addition, the more technical survey questions are focused only on the respondents’ most recent job to aid in recall.

2(d) Questionnaire Design


Respondents will receive an initial notification letter encouraging them to follow the instructions provided for completing the survey using the online instrument. Those respondents who do not complete the survey online will be contacted by telephone using trained interviewers calling from a survey telephone center.


The telephone survey instrument will use a CATI screener questionnaire and detailed interview questionnaire. Respondents will also have the option to complete the survey online. Skip and branching logic will be programmed into the CATI system and the online survey instrument. This allows irrelevant questions to be skipped automatically (e.g., if a respondent reports that she only does exterior painting, she will not be asked about interior painting) and also allows questions asked later in the survey to be based on earlier questions (e.g., if a respondent indicates that 40 percent of his jobs are interior painting and 60 percent are exterior painting, the survey instrument will be programmed so that there is a 40 percent chance that the most recent job he will be asked about is an interior painting job, and a 60 percent chance that he will be asked about an exterior painting job). While this type of questionnaire design is well suited to a telephone or online survey, it is not feasible for a mail survey.
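The probabilistic branching in the parenthetical example above can be sketched as follows. The activity names and percentages are hypothetical, and the actual CATI/online instrument implements this selection internally; this is only an illustration of the rule that the "most recent job" topic is drawn in proportion to the respondent's reported job mix:

```python
import random

def select_recent_job_topic(job_mix, rng=random):
    """Pick the activity type for the detailed 'most recent job' questions,
    with probability proportional to the respondent's reported job mix.

    job_mix maps an activity name to its reported percentage of jobs."""
    activities = list(job_mix)
    weights = [job_mix[a] for a in activities]
    return rng.choices(activities, weights=weights, k=1)[0]

# Hypothetical respondent: 40% interior painting, 60% exterior painting,
# matching the example in the text.
mix = {"interior painting": 40, "exterior painting": 60}
topic = select_recent_job_topic(mix)
```

Over many respondents with this mix, roughly 40 percent would be routed to the interior-painting question series and 60 percent to the exterior series.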


The survey instrument includes questions that are multiple choice, numeric (generally asking for counts), and text (open-ended). The open-ended question simply provides respondents with the option to provide any additional information to EPA that they think might be useful.


3. PRETESTS AND PILOT TESTS


3(a) Pretest


The Agency conducted a pretest of the contractor survey instrument with six respondents. The respondents were asked to comment on the ease of answering the questions and about how they interpreted the questions. Several changes to the survey were made based on insights gained by conducting the pretest, including the following:


  • Pretest respondents reported that EPA is likely to have a difficult time convincing firms to participate in the study. In response to this feedback, EPA has added a $50 incentive for eligible contractor respondents who complete the entire survey.

  • Some respondents had difficulty recalling the numbers of projects where specific work practices were used (e.g., projects that involved paint preparation for interior painting). Pretest respondents told the interviewers that it would be easier to respond to these questions by reporting the number of jobs as a percentage of the total jobs that they performed. The questionnaire was revised so that respondents have the option to report the number of jobs as a count or as a percentage.

  • The questionnaire was initially worded so that respondents would be asked about a recent job in a pre-1978 building. The objective of limiting response to pre-1978 buildings was to limit responses to instances where LBP might have been disturbed. However, more than one case arose where the respondents knew that LBP was not present in the painted surfaces they disturbed even though it was an older building. The questionnaire has been revised to ask about recent jobs where LBP might have been disturbed either because the paint was tested for LBP or the respondent reported that the surfaces they disturbed might contain paint that was applied before 1978. Two pretest respondents completed the revised version of these questions and did not have any difficulty with them.

  • Respondents were asked whether they would be able to report the average number of layers of paint that were removed during paint preparation. They said that they could provide an estimate of this, so the question was added to the survey.


3(b) Pilot Survey


Given time and budget constraints, EPA does not plan to conduct a pilot test.



4. COLLECTION METHODS


EPA has chosen to conduct its survey by telephone and online interview for three reasons:


  • Skip and branching logic will be programmed into the online instrument and the CATI system. This interactive survey format is not feasible for a mail survey.

  • Having both an online and telephone option should maximize cooperation. Some respondents may prefer to respond via an online survey, which will allow them to respond to the survey whenever it is most convenient for them. Also, telephone surveys can maximize cooperation through direct contact with the respondents and provide an opportunity to clarify terms and aid recollection through probes.

  • Telephone surveys provide a rapid turnaround time.


For those respondents who decide not to complete the survey online, initial contacts and follow-ups will be made using the CATI system. Interviewers will receive training on the intent of the survey, the range of potential responses, and definitions of key terms used or addressed by the survey.


4(a) Collection Methods


The process to conduct this information collection is illustrated in Exhibit B4.1.









Exhibit B4.1: Survey process flow chart. The steps of the flow chart are reproduced below.

  1. Purchase sample of contractors, manager/lessors, and building occupants.

  2. Program and test the CATI instrument.

  3. Conduct interviewer training.

  4. Send notification letters.

  5. Conduct telephone interviews to complete 120 interviews across all three groups.

  6. Analyze data to determine if the sample allocation needs to be modified.

  7. Resume telephone data collection.

  8. For eligible firms that do not complete the telephone interview, send an email notification to the key informant with a link to the web survey.

  9. Contacted firms complete the screening and main interview in either mode; eligible firms that complete neither mode are included in the non-response bias analysis.

  10. Data from both modes are checked and processed to create a combined data file.

  11. Analysis is performed.

  12. Report is produced.




EPA will send a notification letter to all sampled addresses to inform them that they have been selected for the sample. The objective of this letter is to describe the study and invite participation. The letter will include a link to the online address where the surveys can be completed. The online survey will allow respondents to complete the screening as well as the main questionnaire. One week after this letter is mailed, the first phase of telephone interviews will begin. Businesses in the sample will be called to screen for eligibility and to identify the key informant. Once contact is made, interviewers will attempt to complete the screening as well as the main interview. EPA’s contractor will initially attempt to complete 120 interviews across the three study groups and will then analyze the data collection process to determine if the sample allocation needs to be modified for the rest of the study.


Telephone data collection will be conducted by interviewers employed by EPA’s contractor who are experienced in conducting large-scale national surveys of business populations. Prior to telephone data collection, interviewers will receive training about the study. The objective of the training will be to cover background information about the study, discuss any special characteristics or interviewing protocols, and thoroughly discuss the instrument. Given this is an establishment survey of firms of varying sizes and structures, EPA’s contractor will need to develop special calling protocols to increase cooperation and ensure quality. Telephone calls will be conducted on a schedule designed to facilitate successful contact with targeted businesses, which may require making calls during morning and evening hours when respondents are more likely to be available to participate in a phone interview.


When calling each establishment, the interviewer will use a systematic procedure to determine if the intended business has been reached, and whether the business is eligible for the study. The interviewer will also attempt to identify an appropriate key informant for the study and will then ask to speak to that person. If the contact person is not available to talk, the interviewer will probe for an appropriate callback time or set up an appointment to talk. The interviewer will also ask for the email address of the contact person so that they can email them a link to the online version of the survey. If the contact person still refuses to participate, the interviewer will record that information and the reasons given. When the contact person agrees to participate, the interviewer will conduct the interview over the phone.


4(b) Survey Response and Follow-up


Identifying and classifying non-responders


To gain some insight into non-responding sample units, the first step is to screen the sample being fielded. Non-responders will fall into two segments. The first consists of units that were successfully screened and found eligible but either chose not to participate or could not, for some reason, be interviewed; this group is referred to as Screened Eligible Non-Responders. The other segment consists of sample units that could not be successfully screened, so their eligibility status is unknown; these are Unscreened Unknown Eligibility Non-Responders.


In the Exhibit B4.2 schematic, the Screened Eligible non-responders are in segment A and the Unscreened Unknown Eligibility non-responders are in segment B, which is likely to be this study’s largest segment, due primarily to non-cooperation and inability to make contact. For segment B, some frame information will be known, but because eligibility status is unknown, it is impossible to determine which cases may be eligible. Thus, it is not possible to assess whether their exclusion contributes any bias to the study findings. The non-response bias analysis will report, using frame information, any differences that may exist between the screened and unscreened entities; however, it would remain unknown whether any such differences contribute bias to the study estimates. Segment A offers some opportunity to examine non-response based entirely on information provided in the frame. If it is assumed that non-participation is uncorrelated with work practices, then the non-response analysis can consider whether these non-responders differ from cooperative responders based exclusively on the frame information.
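Assuming frame information (e.g., Census region) is available for both responders and non-responders, the segment A comparison described above could be sketched as follows; the data and field names are hypothetical, not drawn from the actual frame.

```python
from collections import Counter

def frame_profile(units, key):
    """Share of sample units in each category of a frame variable
    (e.g., Census region) for one group of establishments."""
    counts = Counter(u[key] for u in units)
    total = sum(counts.values())
    return {category: n / total for category, n in counts.items()}

# Hypothetical frame records for screened eligible responders
# and screened eligible non-responders.
responders = [{"region": "South"}, {"region": "South"}, {"region": "West"}]
non_responders = [{"region": "South"}, {"region": "Northeast"}]

# Compare the two distributions category by category.
resp = frame_profile(responders, "region")
nonresp = frame_profile(non_responders, "region")
gaps = {r: abs(resp.get(r, 0) - nonresp.get(r, 0))
        for r in set(resp) | set(nonresp)}
```

Large gaps between the two profiles would flag frame characteristics on which non-responders differ from cooperative responders, though, as noted above, such differences cannot by themselves establish bias in the study estimates.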


Response Propensity Models


Analytically, propensity models can be developed to examine whether there are statistical differences between the screened eligible responders and screened eligible non-responders on their likelihood to complete the survey. Likewise, the analysis can examine the propensity to cooperate between all those who were screened (regardless of eligibility status) and those establishments who were not screened. The models would be developed for each of the three study groups for a total of six propensity models.


The Dun and Bradstreet frame offers a few variables that can be used in these analyses. These establishments, within study groups, can be defined by their location (four Census regions), the size based on number of employees (e.g., categorized: 0-4, 5-9, 10-19 …. 100+) and the possibility of NAICS subgroups within study groups, although these NAICS subgroups will have to be evaluated for feasibility and practicality.
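The propensity models themselves would likely be logistic regressions; as a simpler, standard-library-only illustration, the sketch below estimates response propensities within cells defined by the frame variables named above. All field names and data are hypothetical.

```python
from collections import defaultdict

def cell_propensities(units):
    """Estimate the response propensity within cells defined by
    frame variables (here, Census region x employee-size class)."""
    totals = defaultdict(int)
    responded = defaultdict(int)
    for u in units:
        cell = (u["region"], u["size_class"])
        totals[cell] += 1
        responded[cell] += u["responded"]
    return {cell: responded[cell] / totals[cell] for cell in totals}

# Hypothetical screened sample: responded = 1 if the establishment
# completed the interview, 0 otherwise.
sample = [
    {"region": "South", "size_class": "0-4", "responded": 1},
    {"region": "South", "size_class": "0-4", "responded": 0},
    {"region": "West",  "size_class": "5-9", "responded": 1},
]
props = cell_propensities(sample)
```

In the full analysis, a model of this kind would be fit separately for each of the three study groups and for each of the two comparisons, yielding the six propensity models described above.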


Exhibit B4.2: Responder, Non-responder schematic. The schematic partitions the screened and unscreened sample into four segments:

  • Screened Eligible Responders

  • Screened as Ineligible

  • Segment A: Screened Eligible Non-Responders

  • Segment B: Unscreened Unknown Eligibility Non-Responders






5. ANALYSIS AND REPORTING OF QUESTIONNAIRE RESULTS


5(a) Data Preparation


The interview data will be extracted into a database with all identifying respondent information removed.


A weight will be computed for each completed screener and long survey that adjusts for the differential probabilities of selection as well as nonresponse. Specifically, the weight will be computed with the following adjustments:


  1. a base weight reflecting the probability of selection of the establishment;

  2. an adjustment for nonresponse to the screener and extended interview; and

  3. a post-stratification adjustment to Census population controls for the number of establishments in each of the twelve strata defined by size and industry.

The sample design of the employer survey necessitates that appropriate statistical software be used to estimate the precision of the survey estimates. Abt SRBI (the contractor conducting the survey for EPA) shall compute the main weight, as well as the replicate weights for variance estimation that will ensure that the standard errors based on the RRP survey correctly reflect the sample design.
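The three adjustments combine multiplicatively. A minimal sketch is shown below; the function and the numeric values are illustrative only, not Abt SRBI’s actual weighting program.

```python
def final_weight(p_select, response_rate, control_total, weighted_total):
    """Combine the three weighting adjustments described above."""
    base = 1.0 / p_select                       # 1. inverse of selection probability
    nr_adjusted = base / response_rate          # 2. nonresponse adjustment
    ps_factor = control_total / weighted_total  # 3. ratio to Census control total
    return nr_adjusted * ps_factor

# Hypothetical establishment: sampled at 1-in-50, in a weighting class
# with a 40% response rate, in a stratum where Census counts 10,000
# establishments but the nonresponse-adjusted weights sum to 8,000.
w = final_weight(p_select=1 / 50, response_rate=0.4,
                 control_total=10_000, weighted_total=8_000)
# base 50 -> nonresponse-adjusted 125 -> post-stratified 156.25
```

Variance estimation would then use replicate weights (recomputing this chain for each replicate) so that standard errors reflect the sample design.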


5(b) Analysis


The data will be analyzed using a statistical software package. Data analysis will include both descriptive statistics (e.g., frequencies of survey variables) and relationship analyses (e.g., regression analysis).


5(c) Reporting Results


EPA will use the survey data in preparing exposure and economic analyses. Results of EPA’s analyses are usually reported publicly in three ways: (1) within Federal Register notices; (2) within development and supporting documents; and (3) within materials placed in the rulemaking record. All of these classes of documents would be made available by EPA on the Internet.


Note that neither EPA nor any other person or entity other than the contractor hired to perform the survey will have access to personal identifiers in the raw survey data. (These personal identifiers include the respondent’s name, the respondent’s phone number, and the name of the organization the respondent works for.) All personal identifiers will be stripped from the database before it is conveyed to EPA. The original survey database will remain under the control of the contractor hired to perform the survey.




ATTACHMENTS TO THE SUPPORTING STATEMENT



Attachment 1 - Toxic Substances Control Act Section 402; 15 U.S.C. 2681 et seq.


Attachment 2 - Public Consultations


Attachment 3 - Public Comments on Proposed ICR and EPA Responses to Public Comments


Attachment 4 - EPA Public and Commercial Building Contractor Survey Questionnaire


Attachment 5 - EPA Public and Commercial Building Manager/Lessor Survey Questionnaire


Attachment 6 - EPA Public and Commercial Building Occupant Survey
