Schools Chemical Cleanout Campaign (SC3)

OMB: 2050-0203


PART B OF SUPPORTING STATEMENT

1. Survey Objectives, Key Variables, And Other Preliminaries


The purpose of the SC3 Program is to promote chemical management programs that remove outdated, unknown, or unneeded amounts of dangerous or inappropriate chemicals from K-12 schools. SC3 also promotes the creation of policies and practices that prevent future accumulations of chemicals and encourages responsible management practices for chemicals used in schools. These efforts aim to minimize exposure to students and staff, thus improving the learning environment and reducing school days lost.


Under SC3, EPA partners with companies that agree to work directly with K-12 schools to remove their chemical wastes and otherwise improve chemical/waste management. EPA has developed two survey forms to learn about Partners’ experiences and needs under the Program: an Initial Survey form (to be completed by a Partner in its first year of participation in the SC3 Survey) and an Annual Update (to be completed in each subsequent year of its partnership). This is a census of Partners. There are currently 11 Partners.

1(a) Survey Objectives

The SC3 Survey has four main objectives:


        1. Collect information on the Partner’s reasons for joining the SC3 Program and its future plans.

        2. Identify the activities of Partners under the SC3 Program and how many schools, students, and staff are affected.

        3. Identify resources needed by Partners to accomplish SC3 goals.

        4. Collect lessons learned from Partners on what has worked and what has not worked under the Program, so this information can be shared with others.


Refer to Section 2(d) of Part B of this supporting statement for a description of how these survey objectives are achieved by the information collected by the survey.

1(b) Key Variables


The SC3 Survey is designed to collect information from Partners on their experiences and needs under SC3 during the past year or longer. A key variable, therefore, is the extent to which a Partner is able to remember and/or access records in order to report its activities performed during this time period. Partners are free to draw on any available information to do so (e.g., records kept as a standard business practice, such as invoices, and records kept in accordance with existing regulations, such as hazardous waste manifests). Each Partner’s memory and records will vary, e.g., in quality and detail. Based on the pilot test, EPA is confident that Partners will be able to complete the survey effectively, e.g., based on information that is kept as a standard business practice and/or in accordance with existing regulations.

1(c) Statistical Approach


This section is not applicable to the SC3 Survey because EPA does not intend to use any statistical methods in the collection or analysis of survey data. Refer to Section 5(b) of Part B of this supporting statement for additional information.

1(d) Feasibility


EPA intends to email the survey forms to Partners to complete and return. The surveys have been prepared in Microsoft Word, and Partners must open the file in Word to complete the survey electronically. The primary feasibility issue is therefore whether a Partner has access to the Internet and to the Word program; a Partner lacking either would not be able to complete the emailed Word file.


A Partner without Internet access and/or the Word program can request a hardcopy of the survey, which can be completed and returned by fax, regular mail, or special delivery.

2. Survey Design

2(a) Target Population and Coverage


The SC3 Survey is a census of Partners. There are currently 11 Partners. EPA expects this number to increase over the coming years as more organizations learn about the benefits of the SC3 Program and join.

2(b) Sample Design


This section is not applicable to the SC3 Survey because EPA will not perform any sampling.

2(c) Data Quality


In designing the SC3 Survey, EPA considered potential data quality issues that could be associated with collected data. These are discussed below.


(i) Response Rates


EPA has considered both unit (survey) and item (question) non-response. EPA estimates that the unit response rate will be at least 50% to 60% for the SC3 Survey. This estimate is based on EPA’s pilot test of the survey, in which 50% of participants completed and submitted a survey form. EPA expects, however, that the response rate for the full-scale survey will be higher. First, EPA intends to increasingly promote the survey to new and existing Partners (e.g., by discussing the importance of the survey in communications with Partners). In addition, EPA expects Partners’ participation to increase as they gain more experience with the survey and the SC3 Program generally and learn convenient ways to keep track of their activities and accomplishments. Finally, EPA will use the follow-up methods described in Section 4(b) to maximize response rates.


To minimize item non-response, EPA has carefully reviewed the survey questions to ensure that they are easy to understand and use familiar terms; are formatted in a logical sequence; and request data that are readily available to Partners. In this manner, EPA expects to minimize inaccurate or incomplete responses that can occur due to misinterpretations and the unintentional skipping of questions. Additionally, a cover letter will provide the name of a contact person, email address and phone number to assist Partners, if needed.


After receipt of the completed surveys, EPA will conduct follow-up with respondents as needed (e.g., to address missing data). Refer to Section 5(a) of Part B of this supporting statement for information on EPA’s data review procedures when completed surveys are received.


(ii) Data Entry Errors


EPA has designed the survey forms to be user friendly for Partners. The survey forms are protected Microsoft Word files, which means that Partners will be able to electronically enter data only into the specified fields of the forms. They will not be able to modify the forms in any other way. This will simplify their data entry and minimize errors.


In addition, the survey includes a number of tables with pull-down menus. This will simplify Partner responses and minimize the need to enter data.


Finally, the survey forms encourage Partners to respond to some questions by providing existing documentation instead of entering information into the forms, such as hazardous waste manifests or shipping papers. This will reduce burden and minimize data entry errors.


After receipt of the completed surveys, EPA will conduct follow-up with respondents as needed (e.g., to address errors). Refer to Section 5(a) of Part B of this supporting statement for information on EPA’s data review procedures when completed surveys are received.


(iii) Biased Responses


EPA has considered the possibility for biased responses to the survey, which could result from questions that are worded in such a way that a particular answer is favored over others. EPA has carefully phrased each question so that it does not lead to biased responses. For example, EPA conducted a pilot test of the survey instrument and contacted pilot test participants to discuss their responses. EPA examined whether they were providing the requested information without bias or misunderstanding. If any bias, misunderstanding, or other problem was detected, EPA revised the question as appropriate.

EPA notes that some Partners may elect not to submit a survey in a given year for a variety of reasons and that Partners that do submit a survey may be those with a more compelling reason to do so. For example, the more active Partners may be more inclined to complete the survey in order to demonstrate their achievements and the less active Partners may be less inclined to complete the survey because they have fewer achievements to demonstrate. Such factors could lead to a self-selection bias. However, EPA does not believe this is a concern. EPA is not performing extrapolations or other data modeling, so there is not a need for representative data. Rather, EPA will provide only a straightforward presentation of the information collected. In addition, EPA will attempt to maximize response rates by using the procedures described in Section 4(b) of Part B of this supporting statement.


(iv) False Information Provided by Respondents


EPA is in periodic, informal contact with Partners during the year and generally has a good idea of the level and types of each Partner’s activities and accomplishments. If EPA has questions about a Partner’s survey responses, EPA will contact the Partner for clarification.

2(d) Questionnaire Design



The Initial Survey and Annual Update were designed to be as unambiguous and straightforward as possible. Survey questions include simple instructions on how to provide a response. Five worksheets are included to help Partners provide the requested information. A cover letter provides an EPA contact person’s name, phone number, and email address in case assistance is needed.


Each survey form is organized into three parts. Each part addresses one or more of the survey objectives identified in Section 1(a) of Part B of this supporting statement. Following is a brief discussion of how these survey objectives are addressed by the information collected on the survey forms.


  • Part 1: General Information on Partner. This part of the survey addresses Objectives 1 and 2. It collects information on, among other things, the Partner’s reasons for joining the SC3 Program. It also collects information on the number of schools, students, and staff that were collectively affected by the Partner’s activities under SC3.


  • Part 2: Description of Services and Resources Provided. This part of the survey addresses Objectives 2 and 3. It collects information on each type of service and resource that a Partner has provided under SC3. It also collects information on the number of schools, students, and staff that were affected by each service and resource provided (e.g., chemical inventories, chemical cleanouts, etc.).


  • Part 3: Successes, Barriers, and Future Activities. This part of the survey addresses Objectives 1, 3, and 4. It collects information on a Partner’s future activities (e.g., its anticipated plans under SC3 over the coming years). It collects information on resources needed by Partners to accomplish SC3 goals (e.g., what types of incentives EPA can provide to encourage Partner accomplishments and what improvements can be made to resources at the SC3 web site). It also collects lessons learned from Partners on what has worked (e.g., success stories) and what has not worked under the Program (i.e., barriers encountered under the Program).

3. Pretests And Pilot Tests


In late June to mid-July of 2008, EPA conducted a pilot test of the draft SC3 Survey. EPA emailed the draft survey instrument and a Feedback Form for completion by eight Partners. The Feedback Form asked for Partners’ comments on the clarity and user-friendliness of the survey questions, how the questions could be improved, and their burden hours for completing the survey. EPA reviewed the completed surveys and Feedback Forms, followed up with participants to get additional feedback, and then revised the survey to address the participants’ and EPA’s suggestions and concerns.

Refer to Section 3(c) of Part A of this supporting statement for additional information on the pilot test.

4. Collection Methods And Follow-up

4(a) Collection Methods


Each year, EPA will email the survey forms to Partners well in advance of the due date for submittal. A cover letter describes the purpose of the survey, indicates the due date for submittal, and describes submittal methods, including email, fax, regular mail, and special delivery. It also includes an EPA contact person’s name, phone number, and email address in case assistance is needed.

4(b) Survey Response And Follow-up


After the surveys are emailed, EPA will perform the follow-up activities described below to increase response rates:

  • Email a first reminder a few weeks prior to the due date.

  • On the due date, email a second reminder to those who have not responded.

  • Within two or three weeks after the due date, email a third reminder to those who have not responded and/or call them directly.

5. Analyzing And Reporting Survey Results

5(a) Data Preparation


EPA will take the following steps to identify and resolve quality problems in the survey data:

  • When a survey is received, EPA will review it initially for completeness and quality. EPA will look for errors, including the following (see the illustrative sketch after this list):

-- Data entry errors. EPA will detect these errors by looking for 1) conflicting/inconsistent responses, 2) typographical errors, and 3) other noticeable errors.

-- Incomplete/missing data. EPA will detect these errors by looking for gaps in completed surveys (i.e., missing responses that logically should be completed based on other responses in the survey or what is otherwise known about the Partner).

-- Misinterpreted questions. EPA will detect these errors by looking for responses that do not respond logically to the survey question (e.g., non sequiturs).

  • If there are simple errors (e.g., typographical errors), EPA may resolve them on its own.

  • If there are errors or other data quality problems that EPA cannot resolve on its own, it will contact the respondent for resolution before processing the data.
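
For illustration, the checks for missing or inconsistent entries described in the first bullet above could be partly automated. The following is a minimal sketch only, assuming (hypothetically) that completed survey responses are transcribed into a CSV file with columns named "partner" and "schools_supported"; it is not part of the survey instrument itself.

    # Minimal illustrative sketch: flag missing or non-numeric entries
    # in transcribed survey responses for manual follow-up.
    import csv

    def review_responses(path):
        """Return a list of data quality problems found in the responses."""
        problems = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                partner = (row.get("partner") or "").strip()
                schools = (row.get("schools_supported") or "").strip()
                if not schools:
                    problems.append(f"{partner}: number of schools is missing")
                elif not schools.isdigit():
                    problems.append(f"{partner}: '{schools}' is not a whole number")
        return problems

    if __name__ == "__main__":
        for issue in review_responses("sc3_responses.csv"):
            print(issue)

Problems that EPA cannot resolve on its own would still be referred back to the respondent, as described above.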

5(b) Data Analysis and Reporting


After resolving the data quality problems identified above, EPA intends to use the information as follows. Refer to Section 2(a) and 2(b) of Part A of this supporting statement for additional information on uses of the survey responses.


(i) To Examine Survey Data Internally and Make Improvements to the SC3 Program


EPA may examine survey responses, for example, on how Partners became aware of the SC3 Partner Program (see Question 1.1 of the Initial Survey). Survey data may be entered into a spreadsheet or word processing program (e.g., MS Word), and reviewed for data entry errors. EPA may then examine and compare the different ways Partners learned about the Program (e.g., via SC3 web site, trade organizations, etc.) to identify the most and least prevalent ways. This information could be helpful in assessing the most and least effective ways to reach prospective organizations to increase participation under SC3. It might also be helpful in assessing existing communication methods that could be improved.
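
For illustration, the comparison of awareness channels described above could be tallied with a short script rather than manually. This is a minimal sketch only, assuming (hypothetically) that responses to Question 1.1 are transcribed into a CSV file with a column named "how_learned".

    # Minimal illustrative sketch: count how many Partners reported each
    # way of learning about the SC3 Program, most prevalent first.
    import csv
    from collections import Counter

    def tally_awareness(path):
        counts = Counter()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                channel = (row.get("how_learned") or "").strip() or "No response"
                counts[channel] += 1
        return counts

    if __name__ == "__main__":
        for channel, n in tally_awareness("sc3_responses.csv").most_common():
            print(f"{channel}: {n}")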


(ii) To Share Anecdotal/Qualitative Information with Others


For example, EPA may copy a Partner’s “success story” from its survey form into a word processing program (see Question 3.5 of Initial Survey). EPA will review its own work carefully to identify and correct data entry errors. EPA will evaluate the success story to determine if the public or others, such as current or prospective partners, would benefit by reading it. If so, EPA may share this information with the public (e.g., on the SC3 web site).


(iii) To Share Quantitative Information with Others


EPA may keep track, for example, of the number of schools that have received support from SC3 Partners. The survey forms include questions that enable EPA to keep track of the number of schools supported by a Partner since it joined the Program (e.g., see Question 1.4 of Initial Survey). These questions are designed to avoid the double-counting of schools and other data quality problems. EPA may enter the number of schools from Partners’ surveys into a spreadsheet and add them up to derive the total number. EPA may share this total with the public (e.g., on the SC3 web site). For example, EPA may use the following type of statement: “Based on the SC3 Survey, Partners have supported [ ] schools under SC3 since joining the Program.”
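
For illustration, the totaling step described above could also be performed with a short script instead of a spreadsheet. This is a minimal sketch only, assuming (hypothetically) that each Partner’s reported count is transcribed into a CSV file with a column named "schools_supported"; it performs only addition, consistent with the note below.

    # Minimal illustrative sketch: add up the number of schools reported
    # by each Partner to derive the program-wide total.
    import csv

    def total_schools(path):
        total = 0
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                value = (row.get("schools_supported") or "").strip()
                if value.isdigit():
                    total += int(value)
        return total

    if __name__ == "__main__":
        print(f"Partners have supported {total_schools('sc3_responses.csv')} schools under SC3.")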


Note: EPA will share qualitative and quantitative information with others only by providing a straightforward presentation of the information as reported by Partners. EPA will not manipulate the data in any way, except to summarize or add up data (e.g., to add up the total number of schools supported by Partners).


EPA will not apply any of the following statistical methods or analyses to the information collected:


  • Calculations of mean, median, or modal values.


  • Regression, extrapolation, imputations (e.g., to address missing data), or other data modeling.


  • Establishment of a cause-and-effect link between the SC3 Program and Partner activities and accomplishments.






