Supporting Statement for Paperwork Reduction Act Generic Information Collection Submissions for “TSCA Existing Chemical Risk Evaluation and Management; Generic ICR for Surveys” (EPA ICR No. 2585.01; OMB Control No. 2070-NEW)


Part B.


1. QUESTIONNAIRE OBJECTIVES, KEY VARIABLES, AND OTHER PRELIMINARIES

1(a) Questionnaire Objectives

Under this generic ICR, EPA will survey chemical users, processors, distributors, manufacturers (including importers), recyclers, chemical waste handlers, consumers of chemical-containing products, employees who may be exposed to the chemical being evaluated, state and local regulators, non-governmental organizations, industry experts, and knowledgeable members of the public (including potentially exposed or susceptible subpopulations) to collect information for TSCA chemical risk evaluation and risk management. Information from surveys conducted under this generic ICR will be used to fill information gaps not addressed through other publicly available information sources.

EPA will use survey data to gain information on the conditions of use (such as manufacturing, import, and processing), consumption, markets, exposures, and substitutes for each chemical evaluated and, if warranted, regulated. The specific questions will depend on EPA's information needs with respect to risk evaluation and risk management of specific chemicals and conditions of use.

In general, for conditions of use and exposure, EPA's primary objectives are to use survey data to obtain information about:

  1. production volume and process information;

  2. conditions of use;

  3. site release and transfer information (including disposal);

  4. workplace exposure and practices in place to prevent or reduce worker exposure;

  5. consumer exposure and practices in place to prevent or reduce consumer exposure (if applicable);

  6. environmental exposure;

  7. potentially exposed or susceptible populations;

  8. chemical end uses, including products containing the chemical (if applicable); and

  9. other information needed to identify conditions of use or exposures for the subject chemical or chemical category.


For substitutes, EPA’s primary objectives are to use the survey data to obtain information about:

  1. chemical and/or process substitutes for risk management of a given condition of use of a chemical;

  2. efficacy and performance of substitutes;

  3. experience and use practices with use of any substitute;

  4. advantages and disadvantages of any substitute; and

  5. other information needed to calculate costs and benefits of various substitutes.


1(b) Key Variables

The key information to be collected includes the following:

  1. Respondent identification



    • These questions will be used to identify the individual at the firm with the knowledge required to complete the survey.



  2. Screening and scope



    • These questions will be used to determine whether a respondent uses and/or is exposed to the chemical in the conditions of use covered by the survey (e.g., exposure to tetrachloroethylene during dry cleaning at a dry cleaner). Only those who use the chemical in the targeted condition(s) of use will be asked to complete the full survey. Responses from the screening and scope questions may be used to estimate the number exposed to the chemical for a given condition of use.


  3. Firm type and size



    • Responses from these questions will be used to assign the appropriate sampling weights.


After the basic questions above, the questions will be different for surveys about conditions of use and surveys about alternatives. These survey questions will avoid duplication of information in the possession of the federal government.


For surveys about conditions of use, the questions will generally include:



  1. Production volume and process information (questions omitted if already provided through Chemical Data Reporting (CDR) or the Toxics Release Inventory (TRI))



    • How much of the chemical was manufactured and/or imported last year?

    • How much of the chemical or material was distributed off-site (as % of manufacture or import) and how was the chemical distributed?

    • Narrative description and process flow schematic of on-site activities, providing an understanding of the nature and extent of potential exposures to the subject chemical. The narrative and process flow schematic should cover major unit operations and chemical conversions for manufacturing and on-site uses, including on-site treatment and disposal, if applicable. The narrative should explain why and how the process causes releases, and the schematic should show the points of release of the subject chemical in the workplace and to the environment. If the subject chemical is used in many different processes, provide information on each major process rather than each individual process. Please provide mass balance information for various life cycle stages, if available.



  2. Site release and transfer information

    • (If reporting to TRI)

      • Please estimate the following information on days/year of releases and transfers to supplement your TRI report.

        • How many days/year do you have air releases? Fugitive (non-point) air releases? Stack (point) air releases?

        • How many days/year do you have water releases?

        • How many days/year do you have releases to publicly owned treatment works (POTW)?

      • What is your throughput volume of the chemical per batch and per day?

      • What methods are used to treat or dispose of the chemical off-site and on-site? Provide a flow diagram and narrative of treatment and disposal methods and information on the efficiency of those methods.

    • (If not reporting to TRI)

      • What are your estimated annual total fugitive (non-point) releases? What are your estimated total stack (point) releases? How many days/year do you have air releases? Fugitive (non-point) air releases? Stack (point) air releases?

      • What are your estimated annual total water releases (not to POTW)? To which water body(ies) do you release? How many days/year do you have water releases?

      • What are your estimated total releases to land, including landfills, land treatment/land amendment, surface impoundments, underground injection, and other releases to land?

      • What are your estimated annual total releases to publicly owned treatment works (POTW)? How many days/year do you have releases to POTW?

      • How much do you transfer to other off-site locations? Separate by type of location (incinerator, wastewater treatment, underground injection, hazardous waste (Subtitle C) landfill, other) and amount recycled.

      • What are your costs of disposal of the chemical (if applicable)?

      • What is your throughput volume of the chemical per batch and per day?

      • What methods are used to treat or dispose of the chemical off-site and on-site? Provide a flow diagram and narrative of treatment and disposal methods and information on the efficiency of those methods.



  3. Workplace exposure

    • How many sites manufacture, process, distribute, use, recycle, or dispose of the chemical? Where are these sites?

    • How is the chemical delivered? What container sizes are used?

    • How much of the chemical is stored on site and how is the chemical stored?

    • Are there any products or articles containing this chemical on site? If so, which products or articles?

    • How many workers are exposed to the chemical, and for how long (days/year and hours/day)? If you use a product containing the chemical, what product do you use and what is the concentration of the chemical in the product? Please provide safety data sheets for the products used.

    • What routine worker activities result in worker exposure and what type of exposure (dermal, inhalation, etc.)? For each activity, in what physical state and concentration is the chemical? What is the typical setting for use? Provide industrial hygiene monitoring data, if available, with a brief description of the sampling method and exposure scenario monitored.

    • What engineering controls are used to minimize exposure to this chemical and how effective are those controls?

    • What administrative controls are used to minimize exposure to this chemical and how effective are those controls?

    • What personal protective equipment (PPE) is regularly worn by workers to prevent exposure to this chemical?



  4. Consumer exposure (if applicable)

    • What consumer products use the chemical, and what is the concentration of the chemical in the product?

    • What consumer articles contain the chemical?

    • How many consumers are potentially exposed?

    • How do consumers use the product and what type of exposure results? In what physical state and concentration is the chemical when consumers are exposed?

    • Please provide all directions, labels, and warnings instructing consumers how to use (and not use) the product.

    • What are the properties of the product, typical setting for use, frequency of use, duration of use, amount consumed during use, and emission from the article (if applicable)?

    • Provide exposure monitoring data, if available, with a brief description of the sampling method and exposure scenario monitored.



  5. Chemical end uses (if applicable)

    • End use as an intermediate consumed to make other chemicals

      • What major product chemical classes consume the highest volume of the subject chemical on-site and off-site? List these chemical classes, separately for on-site and off-site. For each chemical class (or chemical if a small number of chemicals), detail the total percentage of subject chemical manufactured or imported.

    • End uses other than as a consumed intermediate

      • What are the function, application, and setting for each of the non-intermediate end uses of the chemical? For each of these uses, what is/are the (1) % of total manufactured or imported volume going to this use, (2) % weight if used in a mixture, and (3) all physical forms of the chemical in this use? Was the amount of chemical contained in the product measured experimentally or is it a nominal/reported value?

For surveys about alternatives, the questions will generally include:

  1. Chemical and/or process alternatives for risk management of a given condition of use of a chemical;

  2. Chemical and/or process alternatives and percent of market using these alternatives;

  3. Experience with use of any substitute chemical or alternative method;

  • Did you try to switch to another chemical, product, or process? Did you switch back? If so, what did you switch to? If you switched back, why did you switch back? What made you switch in the first place?

  • Do you sometimes use other chemicals or processes for certain situations or applications? If so, what are the circumstances in which an alternative works better?

  4. Advantages and disadvantages of any alternative, including in terms of efficacy, exposure, performance, cost, labor, training and hazard; and

  5. Other information needed to calculate costs and benefits of various alternatives.

      • Identification of alternative chemical/product/process

      • How much of the alternative product/chemical would be needed to perform the same activity

      • Capital costs of using the alternative chemical/process, including new equipment, retrofitting of old equipment, loss of use of existing equipment, and time required to identify and switch to an alternative

      • Number of workers required, amount of worker time required, training

      • Number of workers exposed

      • Process changes required (e.g., different preparation, additional/less time to complete task, additional steps, etc.)

      • Energy/water usage

      • Other operation and maintenance costs (e.g., filters, tank cleanings, etc.)

      • Changes in production or output of operation

      • Releases of alternative chemicals/products

      • Waste and disposal costs associated with alternative chemical/process

      • Changes in your product/service quality

      • Training, medical surveillance, or other employee-related costs

      • Recordkeeping burden/costs

      • Monitoring and testing costs

1(c) Statistical Approach

In general, surveys conducted under this ICR can allow the Agency to produce statistically valid estimates for the chemical and conditions of use surveyed. Without the proposed information collection, estimates might be derived from assumptions, anecdotal information, or convenience samples rather than from a survey designed to use a probability sample to produce nationally representative parameter estimates.

The Agency will be assisted in its surveys by an independent contractor who will be responsible for developing the survey response database; recordkeeping; identifying the survey sample; overseeing the conduct of the survey and creating a cleaned survey data file of the survey results; weighting, tabulating and analyzing data; and reporting results. The contractor will have extensive experience with sample design, survey methods, internet surveys, telephone data collection using computer-aided technology, data editing and cleaning, and calculation of sampling weights.

1(d) Feasibility

A knowledgeable person at the responding entity should be able to complete the survey. In most cases, EPA will pretest surveys and make revisions based on the insights gained from the pretest. EPA will plan for and allocate resources for the efficient and effective management of information collection.

2. QUESTIONNAIRE DESIGN FOR THE STATISTICAL APPROACH

2(a) Target Population and Coverage


The target population depends on the chemical and condition of use evaluated.

2(b) Sample Design

Sample design, including sampling frame, sample size, stratification variables, and sampling method will depend on the chemical and condition of use evaluated. EPA will consult with statisticians internal to EPA and statisticians working with EPA’s contractor to determine the most appropriate sampling method.
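
To make this concrete, the sketch below shows one allocation method EPA's statisticians might consider, Neyman allocation of a fixed total sample size across strata; the size strata, frame counts, and variability figures are hypothetical placeholders, not values from any planned survey.

```python
# Illustrative sketch only: Neyman allocation of a fixed total sample size
# across hypothetical facility-size strata. Actual stratification variables,
# frame counts, and allocation rules will be chosen survey by survey.

def neyman_allocation(strata, total_n):
    """Allocate total_n sample units in proportion to N_h * S_h for each stratum h."""
    shares = {h: info["N"] * info["S"] for h, info in strata.items()}
    denom = sum(shares.values())
    return {h: round(total_n * s / denom) for h, s in shares.items()}

# Hypothetical frame counts (N) and outcome standard deviations (S) by stratum
strata = {
    "small firms":  {"N": 4000, "S": 1.2},
    "medium firms": {"N": 1500, "S": 2.0},
    "large firms":  {"N": 500,  "S": 3.5},
}

print(neyman_allocation(strata, total_n=400))
# -> {'small firms': 201, 'medium firms': 126, 'large firms': 73}
```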

2(c) Precision Requirements

            1. Precision Targets

Before developing a survey, EPA will determine precision targets based on the needs of the risk evaluation or risk management. Precision targets will therefore be adequate to the chemical and condition of use evaluated.
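
As a hedged illustration of how a precision target could translate into a required sample size, the snippet below applies the standard formula for estimating a proportion within a chosen margin of error at roughly 95% confidence, with a finite population correction; the margin and frame size shown are hypothetical.

```python
import math

def sample_size_for_proportion(margin, N, p=0.5, z=1.96):
    """Sample size to estimate a proportion within +/- margin at ~95% confidence,
    applying a finite population correction for a frame of N units."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2     # infinite-population sample size
    return math.ceil(n0 / (1 + (n0 - 1) / N))     # finite population correction

# Hypothetical target: +/- 5 percentage points from a frame of 2,000 facilities
print(sample_size_for_proportion(margin=0.05, N=2000))  # -> 323
```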

            2. Nonsampling Errors

An expected source of non-sampling error for surveys will be non-response bias – i.e., that non-respondents may differ from respondents. Non-response is best handled at the design stage of a survey, rather than after the data have been collected. Therefore, surveys under this generic ICR will be designed to minimize the incidence of non-response. Finally, data collected in these surveys and follow-up data will be analyzed to determine the extent to which non-response may bias the results. This non-response plan summarizes the approach to dealing with two forms of non-response: unit non-response (i.e., when a survey questionnaire is not completed by a sampled establishment) and item non-response (i.e., when a survey questionnaire is finished, but some data elements are missing).

There are several reasons why selected respondents may not respond to a survey. The four major reasons for potential non-response are likely to be:

  1. Mistrust of regulatory agencies – Some individuals contacted may not respond because they have an inherent mistrust of regulatory agencies and a concern about the actions they think EPA may take based on the data they provide in the survey.

  2. Sensitivity to disclosing technical data – Some individuals contacted may be concerned about disclosing their work practices and data, which may result in their failing to respond.

  3. Burden – Individuals contacted for the survey have limited time and must set aside time to complete it. Some non-response may occur because the contact person does not have, or will not make, the time to respond.

  4. Questions unclear – If the questions in the survey are unclear, the individual receiving the survey may decline to respond.


Survey instruments under this generic ICR will be designed to reduce the number of non-respondents while still gathering the information needed for the Agency’s analysis and decision making.

For most surveys, a multi-staged respondent contact process will be used to reduce the number of initial non-respondents, and to follow up with initial non-respondents to convert them to respondents. A typical respondent contact process could include:

Stage 1 – Notification Letter. EPA will send a letter on EPA letterhead notifying firms in the sample that the survey is taking place and telling them that EPA’s contractor will contact them by telephone to conduct the survey. The letter will be short and will describe the type of data EPA is collecting, explain why EPA is collecting the data, note that the identities of participating firms will remain anonymous, and state EPA’s appreciation for their participation in the survey. The letter will also include a web address for the optional online survey.

Stage 2 – Technical Contact Identification Call. EPA’s contractor will call all chemical users in the sample to determine the identity and contact information for a knowledgeable person who can complete the survey. During the call, the interviewer will reiterate the purpose of the survey.

Stage 3 – Reminder Notices. EPA’s contractor will send up to two reminder emails (as needed) to recipients who have provided an email address but have not completed their online survey after the Stage 2 call. If needed, EPA’s contractor will also make a reminder phone call.

Despite efforts to design effective survey instruments, some level of non-response is expected. As a final stage in the non-response plan, the data will be analyzed for potential non-response biases. To assess the possibility of non-response bias, the characteristics of respondents and non-respondents will be examined in terms of size, geography, type of firm, etc., to determine whether there could be any significant differences in responses between the two groups. A common procedure for reducing non-response bias is to adjust the sampling weights of respondents to account for non-respondents after forming weighting classes. The assumption is that respondents and non-respondents within a weighting class are similar, which is more reasonable than assuming that the total sample of respondents is similar to non-respondents.

For example, if response rates differ by size of facility and the percentages of interest (i.e., how respondents answered a question) also differ by size of facility, then size groups can be formed as weighting classes. Within each size group, the weights of respondents can be adjusted to account for non-respondents. The bias in the estimate will be reduced if there is reason to believe that respondents and non-respondents are similar within a size group. The weighting approach may be only partially successful, however, since some responses may not be closely correlated with firm size or other available characteristics.
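
A minimal sketch of such a weighting-class adjustment is shown below, assuming facility-size classes and hypothetical base weights and response indicators; the actual classes and weights would be defined survey by survey.

```python
import pandas as pd

# Hypothetical sample file: facility-size class, base (design) weight,
# and an indicator of whether the sampled unit responded.
sample = pd.DataFrame({
    "size_class":  ["small"] * 4 + ["large"] * 4,
    "base_weight": [10, 10, 10, 10, 4, 4, 4, 4],
    "responded":   [1, 0, 1, 1, 1, 1, 0, 0],
})

# Adjustment factor per class: weighted count of all sampled units divided by
# the weighted count of respondents in that class.
all_wt = sample.groupby("size_class")["base_weight"].sum()
resp_wt = sample[sample["responded"] == 1].groupby("size_class")["base_weight"].sum()
adjustment = all_wt / resp_wt

respondents = sample[sample["responded"] == 1].copy()
respondents["final_weight"] = (
    respondents["base_weight"] * respondents["size_class"].map(adjustment)
)
print(respondents[["size_class", "base_weight", "final_weight"]])
```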

A second potential source of non-sampling error is measurement error. If respondents have difficulty interpreting a question, they may provide inconsistent answers, leading to inaccurate measurement of responses. Information provided from memory may also be inaccurate. Surveys will be designed to minimize measurement error through the use of testing groups and pretests as necessary.

2(d) Questionnaire Design

Questionnaire design will depend on the needs of the survey. In general, respondents will receive an initial notification letter (paper and/or electronic) encouraging them to follow the instructions provided for completing the survey. Respondents who do not complete the survey will be contacted by telephone as a reminder.

Survey instruments may include questions that are multiple choice, numeric (generally asking for counts), and text (open-ended).

3. PRETESTS AND PILOT TESTS

3(a) Pretest

In general, the Agency will use interviews and/or focus groups to pretest survey instruments. In a pretest, respondents will comment on the ease of answering the questions and about how they interpreted the questions. Changes to the survey will be made based on insights gained by conducting the pretest, as necessary.

3(b) Pilot Survey

EPA will rarely conduct pilot tests because of the tight schedules for risk evaluation and risk management under TSCA.

4. COLLECTION METHODS



In most cases, EPA will choose to conduct its surveys online because many of the questions (like a request for diagrams) do not lend themselves to interviews. EPA may choose to conduct interviews by telephone in cases where the questions lend themselves to such collection and respondents have already failed to reply online. Occasionally, depending on the sets of questions, EPA may choose to conduct interviews in person, either at separate sites or at one site (like at a conference).

4(a) Collection Methods

The process to conduct surveys under this generic ICR will vary. In general, EPA will send a notification letter (paper or electronic) and then contact prospective survey participants to encourage them to complete an online survey. EPA will then follow up if surveys are not completed.

4(b) Survey Response and Follow-up



Identifying and classifying non-responders

To gain some insight into non-responding sample units, EPA will first screen the sample being fielded. Non-responders will fall into two segments: those successfully screened and found eligible who either chose not to participate or with whom, for some reason, an interview could never be conducted; and those sample units that could not be successfully screened, whose eligibility status is therefore unknown. The former group is referred to as Screened Eligible Non-Responders (Segment A); the latter group is referred to as Unscreened Unknown Eligibility Non-Responders (Segment B).

Segment B is likely to be the largest segment, due primarily to non-cooperation and inability to make contact. Some frame information on these cases will be known, but because their eligibility status is unknown, it is impossible to determine which cases may be eligible. Thus, it is not possible to assess whether their exclusion contributes to any bias in the study findings. The non-response bias analysis will report, using frame information, any differences that may exist between the screened and unscreened entities; however, it would be unknown whether any such differences contribute to bias in the study estimates. Segment A offers some opportunity to examine non-response based entirely on information provided in the frame. If non-participation is uncorrelated with work practices, the non-response analysis could consider whether these non-responders are different from or similar to cooperative responders based exclusively on the frame information.

Response Propensity Models

EPA will develop propensity models to examine whether there are statistical differences between screened eligible responders and screened eligible non-responders in their likelihood of completing the survey. Likewise, the analysis can compare the propensity to cooperate between all those who were screened (regardless of eligibility status) and those establishments that were not screened.

For many surveys, the Dun and Bradstreet frame offers a few variables that can be used in these analyses. Establishments within study groups can be characterized by their location (the four Census regions), their size based on number of employees (e.g., categorized as 0-4, 5-9, 10-19, …, 100+), and possibly by NAICS subgroups within study groups, although NAICS subgroups will have to be evaluated for feasibility and practicality.
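
A minimal sketch of one such propensity analysis appears below, assuming a logistic regression of a response indicator on the Census-region and employee-size variables described above; the records are simulated placeholders rather than actual frame data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated, hypothetical frame: Census region and employee-size class for
# each sampled establishment, plus an indicator of whether it responded.
rng = np.random.default_rng(0)
n = 500
frame = pd.DataFrame({
    "region": rng.choice(["NE", "MW", "S", "W"], size=n),
    "size_class": rng.choice(["0-4", "5-9", "10-19", "20-99", "100+"], size=n),
})
response_prob = 0.5 + 0.1 * (frame["size_class"] == "100+")  # hypothetical pattern
frame["responded"] = rng.binomial(1, response_prob)

# Logistic response-propensity model using frame variables only
model = smf.logit("responded ~ C(region) + C(size_class)", data=frame).fit(disp=0)
print(model.summary())

# Predicted propensities can be compared across groups or used to form
# weighting classes for non-response adjustment.
frame["propensity"] = model.predict(frame)
```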

5. ANALYSIS AND REPORTING QUESTIONNAIRE RESULTS

5(a) Data Preparation

The survey data will be extracted into a database with all identifying respondent information removed.

A weight will be computed for each completed screener and long survey that adjusts for the differential probabilities of selection as well as nonresponse, and these weights will differ based on the survey.

Survey sample designs will generally necessitate that appropriate statistical software be used to estimate the precision of the survey estimates. In most cases, EPA’s contractor will compute these weights.
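
As one hedged illustration of design-based precision estimation, the sketch below computes a stratified-sample mean and its standard error under stratified simple random sampling; the stratum sizes and respondent values are hypothetical, and in practice the contractor's statistical software and the actual sample design would govern the calculation.

```python
import numpy as np
import pandas as pd

def stratified_mean_and_se(data, stratum_sizes):
    """Stratified-sample mean of column 'y' and its standard error under
    stratified simple random sampling. stratum_sizes maps stratum -> N_h."""
    N = sum(stratum_sizes.values())
    mean, var = 0.0, 0.0
    for h, grp in data.groupby("stratum"):
        N_h, n_h = stratum_sizes[h], len(grp)
        mean += (N_h / N) * grp["y"].mean()
        var += (N_h / N) ** 2 * (1 - n_h / N_h) * grp["y"].var(ddof=1) / n_h
    return mean, var ** 0.5

# Hypothetical respondent data from two strata of facilities
rng = np.random.default_rng(1)
data = pd.DataFrame({
    "stratum": ["small"] * 30 + ["large"] * 20,
    "y": np.concatenate([rng.normal(5, 1, 30), rng.normal(12, 3, 20)]),
})
print(stratified_mean_and_se(data, {"small": 1200, "large": 300}))
```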

5(b) Analysis

The data will be analyzed using a statistical software package. Data analysis will include both descriptive statistics (e.g., frequencies of survey variables) and relationship analyses (e.g., regression analysis).
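
The snippet below sketches both analysis types with hypothetical, weighted survey data: a weighted frequency for a categorical variable and a weighted least-squares regression; the variable names are placeholders only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical weighted survey file
rng = np.random.default_rng(2)
df = pd.DataFrame({
    "weight": rng.uniform(1, 10, 200),               # final survey weight
    "uses_controls": rng.choice(["yes", "no"], 200), # categorical survey variable
    "employees": rng.integers(1, 200, 200),          # numeric survey variable
})
df["releases"] = 0.5 * df["employees"] + rng.normal(0, 10, 200)

# Descriptive statistic: weighted frequency of a categorical variable
print(df.groupby("uses_controls")["weight"].sum() / df["weight"].sum())

# Relationship analysis: weighted least-squares regression
X = sm.add_constant(df["employees"])
print(sm.WLS(df["releases"], X, weights=df["weight"]).fit().summary())
```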

5(c) Reporting Results

EPA will use the survey data to prepare exposure analyses, economic analyses, or other components of the risk evaluation or risk management processes. Results of EPA’s analyses are usually reported publicly in three ways: (1) within Federal Register notices; (2) within supporting documents placed in public dockets for risk evaluation, risk management, or other actions; and (3) within materials placed in the rulemaking record. All of these classes of documents would be made available by EPA on the Internet and through the docket system.

Neither EPA nor any other person or entity other than the contractor hired to perform the survey will have access to personal identifiers in the raw survey data. These personal identifiers include the respondent’s name, the respondent’s phone number, and the name of the organization the respondent works for. All personal identifiers will be stripped from the database before it is conveyed to EPA. The original survey database will remain under the control of the contractor hired to perform the survey.



