EPA's WasteWise Program (Renewal)
OMB: 2050-0139

Attachment B:
Evaluation of the WasteWise Program: Promoting
Environmental Results through Evaluation. EPA
Office of Policy, Economics, and Innovation, July
2010.

July 2010

Evaluation of the
WasteWise
Program

Promoting Environmental Results
Through Evaluation

Acknowledgements

This evaluation was performed by Industrial Economics, Incorporated
(IEc) for EPA’s Office of Policy, Economics, and Innovation (OPEI)
under Contract EP-W-07-028 between EPA and IEc. The IEc
evaluation team included Angela Helman, Cynthia Manson,
Christopher Leggett, Laurie Finne, Kelsey Rioux, and Katie Barnes.
Terry Grist and John Cross of EPA’s Office of Resource Conservation
and Recovery provided input over the course of the evaluation; Janice
Sims and Irene Dooely (formerly of the Office of Resource Conservation and
Recovery) provided background information. Terell Lasane of OPEI
was the technical advisor for this project.
This report was developed under the Program Evaluation Competition,
sponsored by OPEI. To access copies of this or other EPA program
evaluations, please go to EPA’s Evaluation Support Division’s website at
http://www.epa.gov/evaluate.

TABLE OF CONTENTS

EXECUTIVE SUMMARY

CHAPTER 1 | INTRODUCTION AND PURPOSE
Report Organization
Program Logic Model
Background Information on WasteWise Data Collection Efforts
Evaluation Questions

CHAPTER 2 | METHODS
Evaluation Design
Analysis of Existing Data
Website Data
WasteWise Conference Data
Helpline Data
Awards and Recognition
New Data Collection Efforts
Literature Review
Focus Groups
Selection of Focus Group Members
Recommended Focus Group Participants
Participant Selection
USPS Survey
Characterization of the USPS Universe in WasteWise
Survey Approach
Survey Instruments
Survey Mode
Overview of Respondents
USPS Interviews
Best Practices Review for Data Collection and Quality Control Practices
Synthesis of Data Collection and Quality Control Efforts
Quality Assurance Procedures
Strengths and Weaknesses of the Methodology

CHAPTER 3 | FINDINGS
Evaluation Question 1
Evaluation Question 2
Evaluation Question 3
Evaluation Question 4

CHAPTER 4 | RECOMMENDATIONS

APPENDIX A: FINAL EVALUATION METHODOLOGY
APPENDIX B: LITERATURE REVIEW
APPENDIX C: FOCUS GROUP PROTOCOL
APPENDIX D: FOCUS GROUP SUMMARY
APPENDIX E: USPS WASTEWISE FACILITY SURVEY
APPENDIX F: USPS SURVEY RESULTS
APPENDIX G: BEST PRACTICES REVIEW FOR DATA COLLECTION AND DATA QUALITY CONTROL
APPENDIX H: QUALITY ASSURANCE PLAN
APPENDIX I: OMB WHITE PAPER

EXECUTIVE SUMMARY

In January 1994, EPA launched WasteWise—a partnership program designed to help
businesses, government and non-profit organizations find practical methods for reducing
municipal solid waste (MSW). WasteWise currently has over 2,000 partners representing
over 50 sectors, who commit to reduce and recycle MSW and select industrial and
commercial wastes. Partners include large corporations, small and medium-sized
businesses, schools, colleges, universities, hospitals, state and local governments, tribes,
and other institutions. WasteWise uses a broad range of approaches to encourage
prevention, recycling, and reuse of waste. WasteWise program activities include various
forms of technical assistance and recognition.
EPA’s Office of Resource Conservation and Recovery (ORCR) and the Office of Policy’s
Evaluation Support Division (ESD) sponsored this program evaluation to: assess the
value that WasteWise provides to its partners, assess changes in waste management
behavior at partner organizations, and explore how to improve performance measurement
moving forward. Industrial Economics, Incorporated (IEc) conducted the evaluation
under contract to EPA.
The evaluation was guided by four key questions:
1. WasteWise uses a variety of approaches to influence the behavior of partners.
Which approaches—for example technical assistance, information, awards and
recognition—are most effective for which types of partners?
2. In addition to participation in WasteWise, what other factors may influence a
partner organization’s decisions to improve management of MSW (e.g., cost
savings, consumer pressure, other voluntary program opportunities)?
3. What can be determined about how WasteWise participation contributes to
partner behavior regarding MSW management (e.g., by effecting waste
management improvements sooner, better incorporating waste management as a
permanent feature of corporate culture, facilitating non-participant changes by
providing information)?
4. What can EPA do to encourage WasteWise partners to submit sufficient
environmental data for performance measurement and evaluation purposes?
As discussed in Chapter 2 of this report, IEc used several research methods to answer the
evaluation questions, including review of existing program data, and collection of new
data through a focus group, survey, and interviews. We surveyed the United States Postal
Service (USPS) WasteWise partners, and studied differences in facilities that joined
WasteWise early on versus those that joined later, hypothesizing that earlier joiners
would demonstrate greener waste management behaviors given longer exposure to


WasteWise services. We also conducted a review of best practices for data collection and
quality control to address evaluation question 4. Exhibit ES-1 provides an overview of
methods used to answer each evaluation question.
EXHIBIT ES-1: CROSSWALK OF EVALUATION QUESTIONS AND PRIMARY AND SECONDARY DATA COLLECTION METHODS

1. WasteWise uses a variety of approaches to influence the behavior of partners. Which
approaches—for example technical assistance, information, awards and recognition—are most
effective for which types of partners?
   Primary method(s): Focus Group; review of existing program data, including website
   statistics, award program data, and conference attendance data
   Secondary method(s): USPS Survey

2. In addition to participation in WasteWise, what other factors may influence a partner
organization’s decisions to improve management of MSW (e.g., cost savings, consumer
pressure, other voluntary program opportunities)?
   Primary method(s): Literature Review
   Secondary method(s): USPS Survey; USPS Interviews

3. What can be determined about how WasteWise participation contributes to partner behavior
regarding MSW management (e.g., by effecting waste management improvements sooner, better
incorporating waste management as a permanent feature of corporate culture, facilitating
non-participant changes by providing information)?
   Primary method(s): USPS Survey; Focus Group
   Secondary method(s): USPS Interviews; Literature Review

4. What can EPA do to encourage WasteWise partners to submit sufficient environmental data
for performance measurement and evaluation purposes?
   Primary method(s): Best Practices Review
   Secondary method(s): (None)

The report organizes findings by evaluation question in Chapter 3; we provide a short
summary below:
Evaluation Question 1: WasteWise uses a variety of approaches to influence the behavior of partners. Which approaches—for example technical assistance, information, awards and recognition—are most effective for which types of partners?

Findings:
• The focus group was the most helpful method to address this question.
• The WasteWise awards program reaches many participants and receives very
positive feedback.
• The WasteWise conference received generally positive feedback from focus group
participants, but conference data and survey data call the value of conferences into
question.

• WasteWise receives consistently positive feedback on technical tools offered to
partners, including greenhouse gas calculations, the Re-TRAC waste reporting
system, program website, and helpline.
• WasteWise partners are hungry for more communication from the program.
Evaluation Question 2: In addition to participation in WasteWise, what other factors may influence a partner organization’s decisions to improve management of MSW (e.g., cost savings, consumer pressure, other voluntary program opportunities)?

The literature review identified several factors that influence environmental decision-making. IEc grouped these factors as follows:
• External market forces, including production levels/market trends and firm size.
These factors can obscure the role of WasteWise in driving behavior change.
• Potentially complementary factors to WasteWise, including customer/supply
chain pressure, community pressure/public image, corporate environmental ethic,
and cost savings. These factors can be synergistic with WasteWise influence in
some contexts.
• Pre-existing requirements, which include regulatory and legally-binding
agreements. Where present, these factors take precedence over WasteWise
influence.
• Uncertain impacts, including public disclosure laws, threat of future regulation,
pressure from environmental groups, industry pressure, and internal industry
codes. The impact of these factors is context-specific.
Under Evaluation Questions 3 and 4, we refer to these literature review findings to
interpret data collected about WasteWise impacts and best practices for data collection
and quality control, respectively.
Evaluation Question 3: What can be determined about how WasteWise participation contributes to partner behavior regarding MSW management (e.g., by effecting waste management improvements sooner, better incorporating waste management as a permanent feature of corporate culture, facilitating non-participant changes by providing information)?

Findings:
• The survey results provide clear evidence that WasteWise contributes to better
waste management practices among USPS facilities. Early USPS WasteWise
joiners conduct more recycling activities than later joiners, and have higher
recycling frequencies for every material and a higher recycling frequency across
materials. Also, early USPS joiners have been recycling for a longer time than
later joiners, and are more aware of their recycling rates.
• Survey respondents cite several reasons for initiating recycling that are potential
proxies for WasteWise influence, or complementary to WasteWise factors.


• Self-selection bias is unlikely to explain the extent of difference found in the
survey between early and later WasteWise joiners.
• Focus group results and USPS interviews validate survey findings that WasteWise
contributes to changes in waste management.
• The inability to conduct the USPS district survey originally planned hindered
learning about some potential areas of WasteWise influence on USPS.
Evaluation Question 4: What can EPA do to encourage WasteWise partners to submit sufficient environmental data for performance measurement and evaluation purposes?

Findings:
• WasteWise is now collecting data necessary to establish a credible baseline.
• WasteWise has created a powerful incentive for program participation and
reporting by offering free access to Re-TRAC.
• WasteWise has taken steps to encourage participant adherence to the program’s
reporting standards, although EPA could do more to improve the first-time quality
of data submitted by partners.
• EPA takes steps to validate waste data reported to WasteWise, but could adopt
additional measures to bolster confidence in self-reported data.
• While many EPA partnership programs encourage or require partners to submit
normalized data, OMB has precluded WasteWise from doing so.
• WasteWise emulates other data quality best practices identified across partnership
programs.
The report provides recommendations for the WasteWise program moving forward in
Chapter 4; in summary, they include:
• Increase communications from EPA to WasteWise partners.
• Promote communications among WasteWise partners by providing an online
venue for networking.
• In the absence of additional program funding, consider recasting the conference as an
awards ceremony.
• Keep a focus on offering high-value technical tools to partners.
• Invest in enhancements to annual reporting to improve the efficiency of the
reporting review process, and collect information on the potential benefits of
WasteWise through the annual reporting process.
• As resources allow, conduct research into spillover effects.
• Develop high-level communications around the interplay of factors that encourage


CHAPTER 1 | INTRODUCTION AND PURPOSE

The U.S. generates approximately 250 million tons of municipal solid waste (MSW)
annually. Preventing and recycling these wastes conserves resources, reduces greenhouse
gas emissions, and improves human and ecological health. In January 1994, EPA
launched WasteWise—a partnership program designed to help businesses, government
and non-profit organizations find practical methods for reducing municipal solid waste.
The WasteWise program has over 2,000 partners representing over 50 sectors, who
commit to reduce and recycle MSW and select industrial and commercial wastes.
Partners include large corporations, small and medium-sized businesses, schools,
colleges, universities, hospitals, state and local governments, tribes, and other institutions.
In addition, WasteWise has approximately 200 endorsers, mainly membership-based
organizations, who recruit other organizations to become WasteWise partners and
provide partners with ongoing information about WasteWise tools and events.
WasteWise uses a broad range of approaches to encourage prevention, recycling, and
reuse of waste materials. WasteWise program activities include various forms of
technical assistance, public recognition and awards, and annual conferences.
EPA’s Office of Resource Conservation and Recovery (ORCR) and the Office of Policy’s
Evaluation Support Division (ESD) sponsored this program evaluation to assess several
areas of WasteWise program outcomes. The evaluation serves the following purposes:
• Identify WasteWise activities that are most useful for improving waste
management activities undertaken, and identify any differences among categories
of program partners. This information will help EPA direct program resources
toward activities with the greatest utility for different industry sectors.
• Better understand the extent to which partner behavior regarding MSW
management can be attributed to WasteWise participation. This involves first
identifying factors outside of WasteWise that influence partners' waste
management behavior, and then identifying and assessing changes in
organizational behavior that can be linked to utilization of WasteWise approaches.
• Identify potential methods for encouraging WasteWise partners to submit robust
and consistent waste management tracking data. EPA instituted a new data
collection protocol for WasteWise in 2009 that greatly improves the program’s
data collection system. As part of this evaluation, we document these changes and
identify potential additional enhancements.


• Explore EPA’s ability to meet OMB expectations for program evaluation for the
WasteWise program using the full suite of research methods readily available to
the Agency, barring an additional Information Collection Request (ICR) submittal; and
assess the feasibility and appropriateness of applying a randomized controlled trial
(RCT) or similar evaluation approach to the WasteWise program and similar
programs.
REPORT ORGANIZATION

The report is organized as follows:
• The remainder of Chapter 1 presents the WasteWise logic model and the
evaluation questions that guided this project.
• Chapter 2 presents the methodology used in this evaluation. IEc used several
methods to assess WasteWise outcomes, including: analysis of existing program
data; literature review; focus group; survey of USPS members; and a review of
data collection and quality control best practices across EPA partnership
programs. We also discuss the strengths and weaknesses of this combination of
methods to assess program outcomes.
• Chapter 3 presents the evaluation findings, organized by the four evaluation
questions.
• Chapter 4 presents recommendations for improving the WasteWise program,
including broader recommendations for improving EPA’s communications on the
appropriate use, contributions, and limitations of partnership programs.
We include all major program evaluation deliverables, including memos with interim
results from individual methods, in a series of appendices in a separate file. See the
Table of Contents for the list of appendices.
PROGRAM LOGIC MODEL

To illustrate the various components of the WasteWise Program and to inform
development of specific evaluation questions, EPA has developed a logic model (i.e., a
graphical representation of the relationships between program inputs, outputs, and
intended outcomes). As shown in Exhibit 1, the key components of the model include:
• Resources ⎯ the basic inputs of funds, staffing, and knowledge dedicated to the
program.
• Activities ⎯ the specific procedures or processes used to achieve program goals.
For example, WasteWise Program activities include technical assistance,
collaboration with external groups, and publicity efforts.
• Outputs ⎯ the immediate products that result from activities and are often used
to measure short-term progress. For example, EPA outputs include yearly
conferences, fact sheets and reports, and WasteWise website resources.


EXHIBIT 1: WASTEWISE PROGRAM LOGIC MODEL
(Flow diagram, "Logic Model for WasteWise Program," November 3, 2008; contents summarized below by component.)

Inputs:
• HQ funds and staff
• Regional offices funds and staff
• WW Hall of Fame members and Endorsers
• States with WasteWise programs (i.e., MA, PA, TN)

Activities:
• Collaborate with external groups (e.g., ASTSWMO, NRC)
• Solicit and track member waste and activities
• Publicize program and best management practices
• Peer mentoring, educational and technical assistance

Outputs:
• Exhibit and present at conferences, and share exhibit materials for regional use
• 4-6 HQ and Regions conference calls annually
• Web site resources: on-line tool kit, publications directory, welcome packet
• Monthly helpline calls and emails
• State WW technical assistance materials
• Yearly conference and awards ceremony with awards distributed and press releases
• Fact sheets and annual reports
• State WW conferences

Customers:
• Endorsers
• Hall of Fame members
• Potential partners and existing partners

Short-Term Outcomes:
• Increased number of partners and endorsers
• Consistent waste management data collection

Intermediate Outcomes:
• Strategic waste planning at partner facilities (e.g., baseline assessment, data collection infrastructure and procedures established)
• Increased waste prevention, recycling, reuse, and procurement of recycled products
• Reduction in waste generation

Intermediate/Long-Term Outcomes:
• Natural resource conservation
• Cost-savings from improved waste management
• Institutionalization of waste reduction behavior
• Reduced life-cycle impacts of waste generation (e.g., reduced GHG emissions, water use, etc.)

Long-Term Outcomes:
• Landfilling substituted with higher and better land uses
• Reduction in climate change
• Improved human health and ecological health

External Variables:
• Other drivers of firm waste reduction (e.g., existing internal environmental initiatives, other environmental leadership programs, public pressure)
• Program and Agency resource levels
• State disposal bans and mandatory programs
• Tipping fees
• Commodity prices for recyclables

Abbreviations: WW: WasteWise; GHG: greenhouse gases


• Customers ⎯ groups and individuals targeted by WasteWise Program activities
and outputs. For example, EPA provides technical assistance and recognition to
WasteWise partners and endorsers.
• Short-Term Outcomes ⎯ changes in awareness, attitudes, understanding,
knowledge, and skills resulting from program outputs that are causally linked to
the WasteWise Program. For example, EPA’s outreach and publicity efforts result
in recruitment of new partners and endorsers for the WasteWise program.
• Intermediate Outcomes ⎯ changes in behavior that are broader in scope than
short-term outcomes. Intermediate outcomes often build upon the progress
achieved in the short-term. For example, increased numbers of WasteWise
partners and endorsers result in increased waste prevention, reuse, recycling, and
procurement of recycled products.
• Long-Term Outcomes ⎯ the overarching goals of the program, which in this
case include natural resource conservation, better uses of land than as landfills,
reduction in climate change, and improvements in human and ecological health.
BACKGROUND INFORMATION ON WASTEWISE DATA COLLECTION EFFORTS

IEc reviewed historic waste data reported by WasteWise partners to determine if they
were of sufficient completeness and quality to use as a data source for this evaluation.
This section summarizes our findings.
To estimate the proportion of partners reporting waste data to the WasteWise program,
IEc first looked at the historic program partner universe. As of the end of 2008,
WasteWise had 2,197 partners, as communicated on the program website. However, the
WasteWise database had records for 11,835 current and former partners; if accurate, this
would mean that WasteWise has 9,638 former partners. This number seems very high,
and is likely a result of record keeping problems; however, it represents an upper bound
of the number of total WasteWise partners.
IEc then assessed the number of partners that reported waste data to WasteWise. EPA
made significant changes to WasteWise program rules in 2010 to require partners to
submit both baseline and annual waste data as a condition of membership. From 2004
through 2009, waste reporting was requested, but not required. Prior to 2004, EPA did
not collect these data from partners. As shown in Exhibit 2, partners reported limited
waste data to WasteWise from 2004-2008 compared to the program’s membership levels.
IEc reviewed these data in aggregate to determine if they were of sufficient completeness
to analyze as part of the evaluation process. A total of 663 partners, current and past, had
reported data to WasteWise through 2008, generating the 1,219 records shown in Exhibit 2. The
number of records exceeds the number of partners because many of the same partners
reported annually in multiple years. Of those 663 partners, 267 partners provided only
baseline data, and 234 provided only annual data. It is not clear whether these partners are
all current partners, or if some of these may have been partners that left the program.


EXHIBIT 2: WASTEWISE TOTAL REPORTING BY YEAR — BASELINE AND ANNUAL REPORTING RECORDS

YEAR  | BASELINE AND ANNUAL DATA RECORDS
2000  | 1
2003  | 1
2004  | 170
2005  | 249
2006  | 217
2007  | 443
2008  | 138
Total | 1,219

Given that we do not have complete information on the number of former WasteWise
members, or the current membership status of those who have reported, IEc estimated a
range of the proportion of partners that reported waste data, based on the number of
current partners and the total number of current and past partners. Results are presented in
Exhibit 3. Only 162 partners reported both baseline and annual data necessary for trend
analysis, which we estimate as between one and seven percent of the partner universe.
Even the high end of this range, seven percent, is too low to enable extrapolation of these
data to the entire WasteWise universe.
EXHIBIT 3: PROPORTION OF WASTEWISE PARTNERS REPORTING

REPORTING TYPE | NUMBER OF PARTNERS REPORTING | % OF 11,835 PAST AND PRESENT PARTNERS (LOW ESTIMATE) | % OF 2,197 CURRENT PARTNERS (HIGH ESTIMATE)
Baseline Only  | 267 | 2%  | 12%
Annual Only    | 234 | 2%  | 11%
Both           | 162 | 1%  | 7%
Total          | 663 | N/A | N/A
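
The reporting-rate range in Exhibit 3 follows directly from the partner counts cited above; a minimal sketch of the arithmetic (in Python, for illustration only) is:

# Sketch of the Exhibit 3 arithmetic: each reporting count is expressed as a share of the
# 11,835 past and present partners (low estimate) and of the 2,197 current partners (high
# estimate). All figures are taken from the text and Exhibit 3 above.
PAST_AND_PRESENT_PARTNERS = 11_835   # upper bound on the partner universe
CURRENT_PARTNERS = 2_197             # partners listed on the website at the end of 2008

reporting_counts = {
    "Baseline Only": 267,
    "Annual Only": 234,
    "Both": 162,
}

for reporting_type, count in reporting_counts.items():
    low = 100 * count / PAST_AND_PRESENT_PARTNERS   # % of past and present partners
    high = 100 * count / CURRENT_PARTNERS           # % of current partners
    print(f"{reporting_type}: {low:.0f}%-{high:.0f}% of partners")

# "Both" works out to roughly 1%-7%, the range cited for trend-capable reporters.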

Thus, analysis of historical WasteWise waste data is precluded by a low frequency of
reporting. As such, this program evaluation does not consider changes in quantified
environmental outcomes of WasteWise members. Instead, this evaluation
examines changes in behavior among WasteWise partners, and the program’s role in
those changes. This evaluation also explores changes that EPA could undertake to
improve WasteWise data collection and quality control to support future performance
measurement.


EVALUATION QUESTIONS

To develop and refine evaluation questions, IEc conducted an initial data and document
review, and engaged in several discussions with EPA regarding the implications of our
findings for scope of this evaluation. Subsequently, IEc and EPA finalized the evaluation
questions that EPA seeks to answer through this project:
1. WasteWise uses a variety of approaches to influence the behavior of partners.
Which approaches—for example technical assistance, information, awards and
recognition—are most effective for which types of partners?
2. In addition to participation in WasteWise, what other factors may influence a
partner organization’s decisions to improve management of MSW (e.g., cost
savings, consumer pressure, other voluntary program opportunities)?
3. What can be determined about how WasteWise participation contributes to
partner behavior regarding MSW management (e.g., by effecting waste
management improvements sooner, better incorporating waste management as a
permanent feature of corporate culture, facilitating non-participant changes by
providing information)?
4. What can EPA do to encourage WasteWise partners to submit sufficient
environmental data for performance measurement and evaluation purposes?


CHAPTER 2 | METHODS

This chapter summarizes the evaluation methodology employed to assess EPA’s
WasteWise program. First, we discuss methods for collecting and analyzing existing
data. We then review efforts to collect new data, including the literature review, focus
groups, surveys, and interviews. The chapter concludes with a discussion of the strengths
and weaknesses of the evaluation approach and quality assurance procedures. For
complete information on methods, refer to the evaluation methodology document in
Appendix A.
EVALUATION DESIGN

IEc employed a mixed-methods approach to collecting information for this evaluation.
Key sources of data include:
Existing information:
• Existing data and documentation on the WasteWise program, including data and
documents related to partners’ use of WasteWise program activities and services,
such as the WasteWise website, helpline, annual conference, and awards program.
• Peer-reviewed literature on impacts and attribution issues associated with
voluntary programs.
• Company websites and publications, including FedEx, UPS, DHL, and USPS.
• Websites of select EPA partnership programs and non-EPA voluntary programs,
and government websites.
Original research:
• Focus group with representatives from a sector participating in WasteWise
• Survey of USPS facility staff
• Post-survey interviews with select USPS HQ and District staff
ANALYSIS OF EXISTING DATA

EPA provided IEc with a variety of documents and data related to partners’ use of
WasteWise program activities and services, such as the WasteWise website, helpline,
annual conference, and awards program. IEc reviewed these documents for relevance to
Evaluation Question 1 (i.e., which program activities are most effective for which types
of partners?). IEc evaluated each data source for evidence of utility to WasteWise
partners, as well as information on who (i.e., which sectors) are looking for information
provided by the resource.


Website Data

EPA tracks a variety of statistics, or “webstats,” from the WasteWise website. EPA
provided IEc with webstats from September 2007 through August 2008. One of the key
statistics tracked in webstats is the number of times various files are downloaded from the
website. IEc identified the ten most commonly downloaded files during one year
(September 2007 through August 2008) as indicators of the most relevant content for
WasteWise website users.
Another useful statistic tracked by the WasteWise website is the most commonly used
search phrases. By summing data on the number of times users of the WasteWise
website entered a search phrase between September 2007 and August 2008, IEc identified
popular phrases to serve as an indicator of what users are looking for on the website and
more generally, what topics are of concern/interest to them.
One key limitation of the WasteWise webstats is that they do not provide information on
who is using the various features of the WasteWise website. Thus, we cannot derive the
type of user (e.g., WasteWise partner, non-member, or individual citizen), or, for
professional users, the sector of the user.
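
As an illustration of the tallying steps described above, the following Python sketch sums monthly counts and returns the most common items. The layout of EPA's webstats exports is not documented in this report, so the file names and column names used here (downloads.csv and searches.csv, with "file", "downloads", "phrase", and "count" columns) are hypothetical.

import csv
from collections import Counter

def top_items(path, key_col, count_col, n=10):
    """Sum counts per item across monthly webstat rows and return the n most common."""
    totals = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row[key_col]] += int(row[count_col])
    return totals.most_common(n)

# Ten most commonly downloaded files, September 2007 through August 2008.
print(top_items("downloads.csv", "file", "downloads"))
# Most commonly used search phrases over the same period.
print(top_items("searches.csv", "phrase", "count"))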
WasteWise Conference Data

EPA provided IEc with a list of the 2007 WasteWise Annual Conference attendees,
including sector information. The purpose of WasteWise conferences is to provide
networking opportunities, information sharing, and recognition of participants who have
excelled in their waste management efforts. The most recent conference included a
discussion regarding zero waste, the use of climate profiles provided to participants by
WasteWise, and a general program update.
Using the conference attendance data, IEc identified the ten most represented industries at
the 2007 conference. EPA also provided IEc with the 2007 conference evaluations,
submitted by conference participants, as well as the minutes from the conference. IEc
reviewed these documents for information on the types of WasteWise materials and
activities that conference participants find useful.
EPA later provided IEc with data for the 2008 WasteWise Annual Conference attendees.
Thus, IEc expanded our original analysis of 2007 conference attendees and analyzed
attendee breakdown by sector for 2008.
Helpline Data

EPA provided monthly correspondence logs in Excel format for May 2007 through
August 2008. The monthly correspondence logs track the name and affiliation of the
contact, the date of the inquiry, and the nature of the inquiry and response or action taken
(for technical assistance inquiries only). All inquiries are coded based on the following
categories: program implementation question from a WasteWise member, data
verification, program information request, technical assistance, request from WasteWise
regional contacts, or a general waste/recycling inquiry. At EPA’s recommendation, IEc
focused on assessing the technical assistance inquiries, and limited consideration to the
past year (September 2007 to August 2008). The technical inquiry log categorizes each


inquiry by keyword. IEc grouped these keywords into categories to determine the
subjects of the most frequent inquiries.
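
The grouping step can be illustrated with a short Python sketch; the actual keywords and category assignments IEc used are not listed in this report, so the mapping below is hypothetical.

# Illustrative sketch of the keyword grouping step. The keyword-to-category mapping is a
# placeholder, not the mapping used in the evaluation.
from collections import Counter

KEYWORD_CATEGORIES = {
    "recycling": "Recycling logistics",
    "composting": "Organics",
    "reporting": "Program reporting",
}

def tally_inquiry_categories(inquiry_keywords):
    """Map each logged inquiry keyword to a category and count inquiries per category."""
    return Counter(
        KEYWORD_CATEGORIES.get(keyword.lower(), "Other") for keyword in inquiry_keywords
    )

print(tally_inquiry_categories(["Recycling", "Reporting", "recycling", "pallets"]))
# e.g. Counter({'Recycling logistics': 2, 'Program reporting': 1, 'Other': 1})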
Awards and Recognition

EPA provided IEc with data on all WasteWise award winners from 1997 to 2005. The
data included each participant who has won a WasteWise award, and the specific award
that they won and year of the award.
As an initial analysis, IEc tallied the number of award winners by industry in 2007 to
determine the sectors that most actively participated in the awards program that year. To
discern trends in award recipients, IEc analyzed award recipients by sector and by
company, from 1998 through 2008, using the WasteWise Award Winner spreadsheet. IEc
performed several analyses to determine the presence of trends among award recipients,
including analyzing awards won by each participant and tallying the number of
participants for each category. The 2007 WasteWise Conference evaluations also
provided additional information about the WasteWise awards program.
NEW DATA COLLECTION EFFORTS

In addition to using existing files and data sources, IEc undertook new data collection
efforts to support this evaluation. These efforts included:
• Literature reviews related to Evaluation Questions 2 and 3
• Focus group related to Evaluation Questions 1 and 3
• Survey of USPS facility staff related to Evaluation Question 3
• Interviews with USPS HQ and district staff related to Evaluation Question 3
• Review of data collection and quality control practices related to Evaluation
Question 4
Literature Review

IEc used literature review as the primary method for addressing Evaluation Question 2:
In addition to participation in WasteWise, what other factors influence a partner
organization’s decisions to improve management of MSW (e.g., cost savings,
consumer pressure, other voluntary program opportunities)?
Evaluation Question 2 represents an initial step in the attribution of WasteWise benefits,
or identifying beneficial impacts specifically resulting from WasteWise. It is important
to identify and correct for external factors that are unrelated to WasteWise program
design but may drive participation in WasteWise and overall program performance.
These factors include, for example:
• Regulatory requirements in other markets (e.g., European Union directives or
some State regulations);
• Participation in other voluntary programs;
• Changes in technical requirements by significant customers or suppliers; and


• Market volatility that changes production levels.
A significant body of literature exists on the reasons that companies join partnership
programs and the impact of external factors (e.g., threat of regulation) on program
performance. As part of a previous project addressing attribution methodology, IEc
developed a Draft Literature Review of Approaches to Estimating Attribution of
Voluntary Program Benefits (Memorandum submitted to EPA Office of Solid Waste,
February 25, 2008). To address Evaluation Question 2, IEc updated this literature search
with information published in 2008, and used the body of information to develop an
inventory of the main external factors that influence organizational behavior related to
MSW management. To identify recent publications pertinent to the evaluation, IEc
employed the following search engines: Dialog, EconLit, EPA, Environmental Valuation
Reference Inventory (EVRI), Social Sciences Research Network (SSRN), National
Bureau of Economic Research (NBER), EBSCOhost, and a targeted search for authors.
The complete literature review deliverable is included in Appendix B.
Once the key external factors (i.e., other than WasteWise) that may influence behavior
were identified in Question 2, Question 3 was designed to consider the leverage points
specific to WasteWise, and identify key questions for assessing the impacts specifically
associated with WasteWise participation:
What can be determined about how WasteWise participation contributes to partner
behavior regarding MSW management (e.g., by effecting waste management
improvements sooner, better incorporating waste management as a permanent feature of
corporate culture, facilitating non-participant changes by providing information)?
IEc reviewed literature on materials and waste management in the air delivery and freight
services sector, the private sector of most relevance to USPS, to inform the development
of the USPS survey. We used the following data sources to identify relevant literature for
this review:
• Company websites and publications, including FedEx, UPS, DHL, and USPS;
• Government websites, including EPA (e.g., the SmartWay program), NTIS, and
State transportation agencies;
• Trade associations, including Express Delivery and Logistics Association and
Global Trade and Logistics; and
• Research organizations, including the Transportation Research Board, University
Transportation Centers, and the Transit Cooperative Research Program.
Focus Groups

On September 29, 2009, IEc conducted a focus group addressing the potential benefits of
WasteWise membership. The purpose of the focus group was to address Evaluation
Questions 1 and 3 as proposed in the evaluation methodology. Question 1 addresses the
relative effectiveness of WasteWise tools for influencing partners’ waste management
practices. Question 3 explores the contributions of WasteWise to partners’ waste
management practices. We explored both questions throughout the focus group and
obtained information regarding members’ opinions and views of the WasteWise Program.

Below, we first discuss the criteria used to select sectors for the focus group, followed by
the criteria used to recommend specific companies from each sector for participation in
the focus group. Finally, we describe the focus group procedures and analysis of results.
Selection of Focus Group Members

IEc recommended a set of eleven sectors for inclusion in the focus groups, with one
company or organization to represent each sector, for a total of eleven participants. We
made this recommendation because the Paperwork Reduction Act limited us to a total of
nine non-federal participants (2 of the participants recommended are federal, resulting in
a total of eleven participants). Moreover, larger focus groups can be unwieldy and are
less likely to capture perspectives from all members.
We selected sectors using a series of criteria. The primary criterion for selecting the
sectors was a high level of participation in WasteWise. To determine the highest
participating sectors, we queried the online WasteWise Membership Listing to obtain a
count of the number of WasteWise partners by sector. 1 We defined a high-participation
sector as one with a minimum of 40 partners in the program. Of the 18 sectors that had
40 or more partners, we selected the top five for inclusion in the focus groups:
• Local Government
• Colleges & Universities
• Consulting & Employment Services
• Waste Management Services
• U.S. Postal Service 2
We selected the remaining six sectors for inclusion in the focus groups using a blend of
two additional criteria: sector type and average quantity of waste generated by facilities in
each sector. We characterized each sector as belonging to one of the following types:
government, institutional (e.g., schools and NGOs), services, or production/
manufacturing. In addition, we obtained data from the WasteWise database on the
quantity of waste generated by each sector in 2007. 3 We then chose sectors reporting the
highest average waste generation per facility (calculated as the total waste quantity per
sector divided by the number of partners generating that waste). 4 Finally, we aimed to
ensure adequate representation of all sector types. For example, if two sectors had
roughly equal waste generation quantities but different sector types, we selected the
sector type with less representation in the final set. In addition, we tried to achieve some
diversity across sector types (e.g., if two production/manufacturing sectors made products
in the same general category, such as automotive/vehicle parts, we selected only one of
those sectors).
1 Accessed at: http://WasteWise.tms.icfi.com/wisesearch/search.asp on January 15, 2009.
2 One sector, the Federal Government, contains a total of 146 partners; we divided this sector into the USPS (86 partners) and other Federal Government partners (60 partners).
3 We analyzed data only for WasteWise partners that are flagged as currently active in the database.
4 For each sector, we looked at average waste generation per facility instead of total waste generation by sector to normalize the reported waste generation data. Not all partners reported waste generation in 2007, so straight sector totals would not have been easily comparable. By dividing sector waste totals by the number of reporting partners, we account for the variability between sectors in the number of partners reporting waste generation.
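
A minimal Python sketch of this screening step follows: keep sectors with at least 40 WasteWise partners and rank them by average 2007 MSW reported per reporting partner. The sector names and figures are placeholders rather than values from the WasteWise database.

HIGH_PARTICIPATION_THRESHOLD = 40  # minimum number of partners to qualify as high participation

sectors = [
    # (sector, partners, total 2007 waste reported in lbs, partners reporting waste)
    ("Example Sector A", 174, 500_000_000, 120),
    ("Example Sector B", 94, 900_000_000, 60),
    ("Example Sector C", 35, 80_000_000, 20),   # screened out: fewer than 40 partners
]

def screen_sectors(sector_rows):
    """Return qualifying sectors sorted by average waste per reporting partner, largest first."""
    qualifying = [
        (name, partners, total_waste / reporting)
        for name, partners, total_waste, reporting in sector_rows
        if partners >= HIGH_PARTICIPATION_THRESHOLD and reporting > 0
    ]
    return sorted(qualifying, key=lambda row: row[2], reverse=True)

for name, partners, avg_waste in screen_sectors(sectors):
    print(f"{name}: {partners} partners, {avg_waste:,.0f} lbs per reporting partner")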


Exhibit 4 presents the eleven sectors selected for inclusion in the focus groups, the
associated data for each sector, and a summary of the rationale for selection.
Recommended Focus Group Participants

From the eleven sectors selected for the focus groups, IEc used the following criteria for
selecting specific companies/facilities to participate in the focus groups:
1. High quantity of waste generation.
2. A diversity of recent and long-time WasteWise members.
3. Diversity in awards and recognition (e.g., some companies that have received one
or more awards and others that have not).
4. Diversity in reporting behavior (e.g., some companies/facilities that regularly
report to WasteWise and some that do not).
Participant Selection

EPA selected two organizations within each sector as top choices, and IEc ranked these
choices to ensure diversity. Of the eleven sectors identified, nine sectors participated;
participants from the Motor Vehicle and Parts and the Federal Government sectors
not attend the focus group. Details about the organizations representing each sector in the
focus group are presented in Exhibit 5.


EXHIBIT 4: SECTORS RECOMMENDED FOR INCLUSION IN FOCUS GROUPS

SECTOR | NO. OF PARTNERS | SECTOR TYPE | AVG MSW QUANTITY PER PARTNER (LBS) | RATIONALE FOR SELECTION
Local Government | 174 | Government | 3,680,658 | Very high participation in WasteWise
Colleges & Universities | 131 | Institution | 1,117,365 | Very high participation in WasteWise; provides an example of the institution sector type
Consulting & Employment Services | 126 | Service Sector | 36,060 | Very high participation in WasteWise
Waste Management Services | 94 | Service Sector | 10,819,119 | Very high participation in WasteWise
US Postal Service | 86 | Service Sector* | 41,945,333 | Very high participation in WasteWise
Electronics & Electrical Equipment | 67 | Production/Manufacturing | 1,084,962 | Provides another example of a private production/manufacturing sector type; provides diversity within production/manufacturing sector type
Printing & Publishing | 64 | Production/Manufacturing | 10,559,959 | Large quantity of waste generated; provides diversity within production/manufacturing sector type
Federal Government (Other)** | 60 | Government | 4,766,288 | Provides another example of government sector type
Utilities | 53 | Production/Manufacturing | 16,247,978 | Very large quantity of waste generated; provides diversity within production/manufacturing sector type
Entertainment | 45 | Service Sector | 46,848,591 | Large quantity of waste generated; provides another example of a private service sector type
Motor Vehicle & Parts | 42 | Production/Manufacturing | 69,513,940 | Very large quantity waste generator; provides diversity within production/manufacturing sector type

Source: EPA, "WasteWise Membership Listing," accessed at: http://WasteWise.tms.icfi.com/wisesearch/search.asp
Notes:
*More closely represents a service sector than government sector
**Excludes the U.S. Postal Service


EXHIBIT 5: FOCUS GROUP PARTICIPANTS

SECTOR | ORGANIZATION | LENGTH OF MEMBERSHIP (YEARS) | EVER REPORTED? | REPORTED MSW GENERATED 2007 (TONS) | TOTAL AWARDS RECEIVED
Local Government | King County, Washington | 12 | Yes | Not Reported | 5
Colleges and Universities | University of Colorado at Boulder | 15 | No | Not Reported | 0
Consulting and Employment Services | CDM | 2 | Yes | 152,418 | 0
Waste Management Services | Inland Empire Regional Composting Authority | 2 | Yes | 15,041 | 0
US Postal Service | USPS Northeast Area | 12 | Yes | 14,932,913 | 7
Electronics and Electrical Equipment | General Dynamics - Lincoln Operations | 2 | Yes | 156,850 | 0
Printing and Publishing | FedEx Kinkos | 12 | Yes | 52,543,958 | 1
Utilities | PSEG | 15 | Yes | 17,975,048 | 9
Entertainment | The Walt Disney Company | 15 | Yes | 321,619,163 | 14

EPA and IEc worked together to select the focus group date. IEc prepared the draft focus
group protocol and information sheet for participants (attached here as Appendix C). The
focus group was held at the IEc office in Cambridge, Massachusetts; Andy Schwarz, a
Principal at IEc, moderated the focus group. IEc took notes to assist in summarizing the
findings from the focus group. After the focus group, IEc synthesized responses to each
question and developed a focus group summary that identified the key findings, available
in Appendix D.
USPS Survey

Due to constraints under the Paperwork Reduction Act, EPA could not survey most
program participants without undertaking an ICR process. However, EPA can conduct
surveys within the federal family. Because USPS is a very active partner in the
WasteWise program, with all of its facilities enrolled in WasteWise, IEc conducted a


survey of USPS facilities. We surveyed USPS processing and distribution centers
(P&DCs) and bulk mail centers (BMCs) to investigate the effects of WasteWise
membership on waste management behavior within USPS, by looking for differences in
facilities that joined WasteWise many years ago and facilities that joined WasteWise
relatively recently.
Characterization of the USPS Universe in WasteWise

USPS entities began joining WasteWise in 1997. Initially, USPS entities joined at many
different levels within the organization. The range of partners initially included entities
as diverse as individual post offices and processing facilities, as well as whole USPS
districts and even larger USPS areas. Now, most partners join WasteWise and report at
the district level, and all USPS districts are enrolled in WasteWise. As of late 2008,
USPS WasteWise membership was organized into 86 USPS individual partners. Of those
86 individual USPS partners, 75 partners reported at the district level, 6 partners reported
at the area level, and 6 partners reported at the individual facility level.
Survey Approach

After discussing the goals and intent of this survey with USPS, IEc determined that
district staff members and managers at P&DCs and BMCs represented the most
appropriate target universe. District staff members play a key role in organizing waste
management activities and therefore are likely to have direct experience implementing
WasteWise-related activities and other waste management strategies, and P&DCs and
BMCs generate and manage large quantities of non-hazardous waste and are therefore
able to identify the effectiveness of USPS efforts at a facility level.
The USPS organization includes nine areas, 80 districts, and 460 P&DCs and BMCs.
The Northeast area (which includes eight individual districts) and four districts (Alabama,
Dallas, Sacramento, and South Florida) joined WasteWise several years before the other
areas and districts. Together, these 12 districts contain a total of 55 P&DCs and BMCs;
we defined this group of early joiners as “Group A” and surveyed the entire Group A
universe. The majority of USPS WasteWise partners, however, joined in 2007 and 2008.
This universe, labeled “Group B,” contains 405 P&DCs and BMCs. Instead of surveying
the entire Group B universe, IEc developed a statistically valid sampling strategy to
survey 200 facilities. 5 The sample plan is summarized in Exhibit 6.
Thus, to discern the effects of WasteWise participation, we surveyed all facilities that
were early joiners, as well as a statistically valid sample of facilities that joined later. We
hypothesized that due to their longer tenure participating in WasteWise, USPS facilities
and districts that joined the program earlier than others would report higher utilization of
“greener” waste management approaches.
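
The report does not specify how differences between the two groups were tested statistically. As one illustrative approach, a two-sample proportion z-test could compare Group A and Group B on a yes/no survey item; the sketch below (Python) uses hypothetical response counts.

# Illustrative comparison of early joiners (Group A) with later joiners (Group B) on a
# yes/no item such as "does your facility recycle cardboard?". Counts are hypothetical.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(yes_a, n_a, yes_b, n_b):
    """Return (z statistic, two-sided p-value) for the difference in two proportions."""
    p_a, p_b = yes_a / n_a, yes_b / n_b
    pooled = (yes_a + yes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical counts: 27 of 30 Group A respondents vs. 70 of 102 Group B respondents.
z, p = two_proportion_z_test(27, 30, 70, 102)
print(f"z = {z:.2f}, p = {p:.3f}")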
IEc also developed a survey for district staff and planned to survey one staff member
from each district. However, due to the USPS reorganization and consolidation of
districts, IEc was not able to administer this survey.

5 See WasteWise evaluation methodology in Appendix A for detailed information about the survey approach.


EXHIBIT 6: SAMPLE PLAN SUMMARY

FACILITY LEVEL GROUPS | POPULATION SIZE | EXPECTED RESPONSE RATE | INITIAL SAMPLE SIZE | EFFECTIVE SAMPLE SIZE
Group A (early joiners) | 55 | 75% | 55 | 41
Group B (later joiners) | 405 | 50% | 200 | 100

To ensure that the sample reflected a variety of geographic locations, we developed a
plan to stratify the sampling of P&DCs based on the area in which the center is located.
The USPS organization contains nine areas. The entire Northeast Area joined early on;
therefore all P&DCs and BMCs in the Northeast were surveyed. We applied
stratification across the remaining eight areas. See Exhibit 7 for the stratification of
Group B.


EXHIBIT 7: LATER JOINERS (GROUP B) STRATIFICATION

STRATUM | DESCRIPTION | POPULATION SIZE (N) | RELATIVE PROPORTION | SQUARE OF RELATIVE PROPORTION | INITIAL SAMPLE SIZE | EXPECTED RESPONSE RATE | EFFECTIVE SAMPLE SIZE (N) | ESTIMATED STRATUM PROPORTION | VARIANCE FOR STRATUM ESTIMATED PROPORTION
1 | BMCs | 28 | 0.07 | 0.005 | 14 | 50% | 7 | 0.5 | 0.032
2 | Capital Metro | 26 | 0.06 | 0.004 | 13 | 50% | 6 | 0.5 | 0.035
3 | Eastern | 59 | 0.15 | 0.021 | 29 | 50% | 15 | 0.5 | 0.014
4 | Great Lakes | 45 | 0.11 | 0.012 | 22 | 50% | 11 | 0.5 | 0.019
5 | New York Metro | 18 | 0.04 | 0.002 | 9 | 50% | 4 | 0.5 | 0.055
6 | Pacific | 23 | 0.06 | 0.003 | 11 | 50% | 6 | 0.5 | 0.040
7 | Southeast | 42 | 0.10 | 0.011 | 21 | 50% | 10 | 0.5 | 0.020
8 | Southwest | 53 | 0.13 | 0.017 | 26 | 50% | 13 | 0.5 | 0.016
9 | Western | 111 | 0.27 | 0.075 | 55 | 50% | 27 | 0.5 | 0.007
Total | | 405 | 1.00 | | 200 | | 100 | |

ASSUMPTIONS:
Estimated population proportion: 0.50
Variance of estimated population proportion: 0.0021
Standard deviation of SRS (for comparison): 0.0019


Stratified sampling has the added benefit of guaranteeing a geographic spread for the
sample. The sampling plan assumes "proportional allocation." That is, the sample size
for each stratum is proportional to the size of the stratum. IEc chose a sample size of 200
for Group B in consideration of both the need for statistical validity as well the need to
minimize survey burden on USPS staff. See the Evaluation Methodology for more
information on the sampling process.
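
As a minimal sketch of the proportional-allocation step, the initial stratum sample sizes in Exhibit 7 can be reproduced in Python from the stratum population counts; with the 50 percent expected response rate from Exhibit 6, roughly 100 responses would be anticipated.

GROUP_B_SAMPLE_SIZE = 200

stratum_sizes = {            # P&DC and BMC counts by stratum, from Exhibit 7
    "BMCs": 28, "Capital Metro": 26, "Eastern": 59, "Great Lakes": 45,
    "New York Metro": 18, "Pacific": 23, "Southeast": 42, "Southwest": 53, "Western": 111,
}
population = sum(stratum_sizes.values())  # 405 later-joining facilities

allocation = {
    name: round(GROUP_B_SAMPLE_SIZE * size / population)  # proportional allocation
    for name, size in stratum_sizes.items()
}
for name, n in allocation.items():
    print(f"{name}: sample {n} of {stratum_sizes[name]} facilities")
print("Total initial sample:", sum(allocation.values()))  # 200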
Survey Instruments

The final survey instrument for facility staff members is available in Appendix E.
The survey was designed to investigate:
• Waste management activities at the facility-level, including recycling of
specific materials, and source reduction;
• Knowledge of waste management outcomes;
• Influences on waste management attitudes and behaviors (including WasteWise
membership); and
• Use of WasteWise tools, and assessment of those tools.
IEc shared the facility and district surveys with Charlie Vidich of USPS Headquarters to
ensure that the survey questions were clear and understandable. IEc received feedback
for the facility survey and then revised the survey to use language that was more
consistent with USPS operations.
Survey Mode

Based on conversations with USPS, IEc confirmed that USPS staff have ready Internet
access and a familiarity with online surveys. As such, IEc conducted the survey by
Internet using the ESurveysPro Basic online survey service.
Overview of Respondents

Of the 255 facilities contacted about the survey, 132, or 52%, responded. Thirty of 55
long-term partners responded to the survey, for a response rate of 54.5% from Group A.
Similarly, Group B had a response rate of 51%, with 102 of 200 facilities responding. 6
See Exhibit 8 for a summary of facility surveys and the number of responses by group.
EXHIBIT 8: RESPONSES BY GROUP A AND GROUP B

GROUP | FACILITIES SURVEYED | RESPONDED | DID NOT RESPOND
Group A | 55 | 30 | 25
Group B | 200 | 102 | 98
Total | 255 | 132 | 123

6 Two respondents left significant sections of the survey blank.


IEc and USPS undertook several activities to increase the response rate, including:
• USPS Headquarters verbally notified managers of the upcoming survey in August 2009.
• IEc sent a survey invitation to subjects with an explanation of the purpose of the survey on August 31, 2009.
• IEc sent a follow-up invitation and reminder on September 14, 2009.
• USPS contacts conducted outreach among facilities to increase the response rate.
• IEc sent a second round of follow-up invitations in November 2009.

Exhibit 9 presents a breakdown of respondents by facility type. IEc worked with USPS
staff to develop the list of current P&DCs and BMCs. It is important to note, however,
that some of the facilities labeled as P&DCs were post offices that contain or
previously contained some processing equipment, and perform(ed) some P&DC
functions. Fifteen respondents selected “other,” indicating that their facility is not a
BMC or P&DC. A comparison of these “other” facilities to respondent positions
indicated that six “other” facilities appeared to be post offices with some P&DC
functions, and the remaining nine facilities appeared to be sorting or processing centers
(and thus, may have been miscategorized by respondents).
EXHIBIT 9: USPS TYPES OF FACILITIES RESPONDED

WHAT TYPE OF FACILITY DO YOU WORK AT? | TOTAL
Processing and Distribution Center (P&DC) | 108
Bulk Mail Center (BMC) | 9
Other | 15
Total | 132

As seen in Exhibit 10, about half of the respondents (73) indicated that they determine
waste management methods at their facilities, either independently or in conjunction with
others. However, many respondents indicated that someone else determined waste
management methods, typically District or Area staff. Twenty-six facilities checked the
“other” box, sometimes indicating specific positions from a list of choices presented on
the survey form; most of the positions listed under “other” are facility-level positions (as
opposed to positions at the district or area level).


EXHIBIT 10: DECISION-MAKING ON WASTE MANAGEMENT

WHO DETERMINES WASTE MANAGEMENT METHODS? (CHECK ALL THAT APPLY) | COUNT
I do | 73
District Staff | 55
Area Staff | 33
Headquarters Staff | 17
Other | 26

The survey asked respondents about their tenure at USPS. Despite recent changes at
USPS, most of the respondents have been in their positions for at least a year, and nearly
half have been in their positions for over five years. See Exhibit 11.
EXHIBIT 11: TENURE OF RESPONDENT POSITIONS

HOW LONG HAVE YOU BEEN IN YOUR POSITION? | TOTAL
5+ years | 63
3 – 5 years | 23
1 – 3 years | 34
6 – 12 months | 9
0 – 6 months | 3
Total | 132

Based on the above responses, we are confident that respondents were staff members who
were knowledgeable about waste management practices at their facilities. Survey results
are discussed at length in the Findings chapter; IEc’s survey results memo with complete
results is included in Appendix F.
USPS Interviews

IEc identified three USPS interview participants at the area and district level. IEc used
the interviews to follow up on survey results, and specifically to clarify and expand upon
key survey findings, including differences among early and later joiners (these survey
findings are discussed at length in Chapter 3). See Exhibit 12 for the interview guide. IEc
provided a synthesis of the interviews to ESD and ORCR staff.


EXHIBIT 12: USPS SURVEY FOLLOW-UP INTERVIEW GUIDE

1. General Interviewee Information
   a) Name?
   b) Title?
   c) How long have you been in your position?

2. What is your role in determining waste prevention methods at facilities?

3. What types of communications occur among HQ, Area, and District staff when considering or implementing waste prevention methods?
   a) How does WasteWise figure into these communications?

4. Do you believe that WasteWise participation influenced and/or supported changes in waste management practices at the facility, district, or area level?
   a) If yes, provide some examples (e.g., initiating or retaining recycling of specific materials?)
   b) Has this influence changed over time? If so, how?

5. The survey indicates that district or area encouragement, and sometimes requirements, are a key influence on facility waste management practices. In your opinion, how much influence did WasteWise membership have on district/area encouragement/requirements for greener waste management at the facility level?
   a) Has this influence changed over time? If so, how?

6. The survey results indicate that, in general, partners that joined WasteWise a long time ago (Group A) reported greener waste prevention activities (e.g., recycling frequency, number of recycling activities) when compared to newer partners that joined over the last couple of years (Group B). Why do you think older partners report greener waste prevention activities?

7. Older WasteWise partners also indicated that they are more aware of their recycling rates. Can you think of any reasons why older partners might be more aware of recycling rates?

8. Do you think that WasteWise, or Area/District use of WasteWise materials, has influenced personnel attitudes about waste management at the facility level? If so, how?

9. Is there any other information that you could provide that would assist us in our analysis?


Best Practices Review for Data Collection and Quality Control Practices

Early in the process of developing a program evaluation methodology for the WasteWise
program, IEc determined that partner environmental data previously collected by EPA for
WasteWise are not robust enough to support performance measurement and program
evaluation. As such, EPA developed Evaluation Question 4 to include as part of this
evaluation:
• What can EPA do to encourage WasteWise partners to submit sufficient
environmental data for performance measurement and evaluation purposes?
To address this evaluation question, IEc conducted a review of data collection and
QA/QC best practices across select EPA partnership programs and voluntary programs
outside of EPA. The focus of the best practices review was to identify practices that
encourage program partners to submit robust and consistent environmental data, and
could be utilized for ongoing performance measurement as well as future program
evaluation. We also compared the identified best practices to current WasteWise practices,
and determined where WasteWise has implemented these practices and whether there are
areas where WasteWise goes beyond the best practices used by other partnership programs.
IEc conducted a review of data collection best practices across select EPA partnership
programs and non-EPA voluntary programs, focusing on methods to increase data
quality. To identify programs to review, we applied the following criteria:
• Voluntary participation (non-mandatory)
• Facility or firm-based (not product based)
• Program data collection and reporting responsibilities exist at the facility or firm
level
• Some programs should have a follow-up component, for quality control
• Some programs included should have a waste reporting component
• Some programs should use electronic reporting
As discussed in the WasteWise Evaluation Methodology, IEc identified seven EPA
partnership programs to include in the review. Below, we list each program, and describe
the rationale for including them.
• Hospitals for a Healthy Environment (H2E): A previous EPA evaluation of the
H2E program suggested that EPA collect data for normalization purposes and
require baseline and annual reporting for new partners, as well as annual reporting
for existing partners. H2E implemented the suggestions and now partners are
required to submit annual reports. In addition, the H2E toolbox (cms.h2eonline.org/partners/toolbox/) contains useful guidance for current and prospective
partners, including steps for getting started, sample partner goals, data collection
practices, and normalizing guidance to account for changes in activities across
different types of facilities (e.g., # of patients seen, # of patient beds occupied).
• Laboratories for the 21st Century (Labs 21): Labs21 differs from WasteWise and
many other EPA voluntary programs in that the partnership is project-based and
partners do not submit annual reports until after the project is complete. However,
Labs21 is included in this review as the program provides useful materials on
topics relevant to WasteWise such as best practices, case studies, and
benchmarking.
• National Environmental Performance Track: Performance Track required all
members to submit baseline data and annual data, and aggregated and published
performance measurement results. Performance Track had a strong focus on
QA/QC. The program reviewed all data submitted, followed up with members to
ensure accuracy in reporting, and conducted site visits at 5 – 10% of member
facilities each year.
• Natural Gas Star: The Natural Gas Star program provides many sector-specific
resources to partners, such as emission quantification guidance and information on
cost-effective technologies. Natural Gas Star has also been able to aggregate and
publish results.
• National Partnership for Environmental Priorities (NPEP): NPEP is a project-based program; partners report their baseline quantities and associated
achievements to EPA. Since 2006, NPEP has inquired about QA/QC for data
associated with partner success stories.
• SmartWay: The SmartWay program has also developed sector-specific resources,
including models and standards for reporting baseline and performance
measurement data.
• Energy Star Buildings and Plants: At the request of EPA, IEc added this program
to the original list. Many of the "Plants & Buildings" partners
match WasteWise partner sectors. In addition, the program maintains a reporting
database for partners that can also be used for benchmarking.
IEc found that the following non-EPA programs and initiatives contained reporting
guidance or tools that could inform WasteWise data quality and increase reporting; as
such, we included them in this review:
• Australia’s Greenhouse Challenge Plus: This program is mandatory for a small
number of companies, but the majority of partners join voluntarily. The program
provides resources to help partners calculate their greenhouse gas emissions, and
reporting is completed through a universal reporting system. To minimize
reporting burden and data duplication, the reporting system shares data with
various agencies and programs.
• Stewardship Ontario’s Blue Box: This mandatory program offers a variety of
calculators and guidance documents for waste/recycling reporting.
Synthesis of Data Collection and Quality Control Efforts

For each of the above programs (EPA and non-EPA), IEc conducted a comprehensive
review of the following materials to identify and compare practices across programs:
• Environmental reporting forms
• Environmental reporting instructions
• Reporting follow up and quality control procedures
• Reporting requirements and/or incentives for reporting
• Program data aggregation
• Program evaluations
IEc asked the following data collection and QA/QC questions of each program reviewed.
We answered these questions by reviewing program documents and, when needed, by
following up with program staff.
• Baseline: How does the program establish a credible baseline?
• Reporting standards: What reporting standards does the program use to ensure
consistent and accurate data collection? (Examples could include: standard
reporting frequency, mandating absolute data, mandating facility-wide reporting,
providing definitions of program indicators, and asking for text descriptions to
provide context on reported data.)
• Reporting materials: How does the program use reporting materials to encourage
adherence to reporting standards? (Examples could include: providing clear
reporting instructions; using standard reporting forms; using advanced forms such
as Excel, PDF, or online forms to minimize reporting confusion or mistakes; using
innovative reporting methods or materials to assist program participants in
providing quality information.)
• Reporting compliance: How does the program encourage or require compliance
with reporting standards? (e.g., by making reporting a condition of program
participation, or by providing incentives for reporting?)
• Reporting quality control: How does the program ensure the quality of reported
data? (Examples could include: using a standard guide to review all submissions,
comparing data to previously submitted data, comparing data to other data sets
such as TRI, following up with members on questionable numbers, conducting site
visits, and performing reference checks.)
• Data normalization: Does the program encourage or require members to
normalize environmental data to account for external factors, such as economic
conditions? (A brief sketch of this kind of normalization follows this list.)
• Data aggregation: If the program aggregates data, how does the program ensure
that data are suitable for aggregation? Does the program systematically exclude
data that should not be aggregated?
• Double counting: How does the program address potential double counting within
its own reporting, and across programs?
• Transparency: How does the program ensure transparency of data limitations in
its communication of program results? (Examples could be noting the existence or
potential effects of: external conditions, double counting, missing data, excluded
data, or other quality control issues.)
• Benchmarking: Does the program's data collection facilitate benchmarking of
performance among participants and/or between participants and non-participants,
and if so, how?
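To make the data normalization question above concrete, the sketch below shows one common approach: dividing an absolute waste figure by an activity measure so that year-to-year comparisons are not distorted by changes in production or occupancy. The facility records, field names, and activity measure are hypothetical illustrations, not WasteWise reporting requirements.

```python
# Minimal sketch of activity-based normalization (hypothetical data and field
# names; WasteWise does not prescribe this exact calculation).

def normalized_waste(tons_of_waste: float, activity_units: float) -> float:
    """Return waste generated per unit of activity (e.g., tons per 1,000 patient-days)."""
    if activity_units <= 0:
        raise ValueError("activity_units must be positive")
    return tons_of_waste / activity_units

# Hypothetical two-year comparison for a single facility.
baseline = {"tons": 120.0, "patient_days_thousands": 40.0}   # baseline year
current = {"tons": 110.0, "patient_days_thousands": 30.0}    # reporting year

base_rate = normalized_waste(baseline["tons"], baseline["patient_days_thousands"])
curr_rate = normalized_waste(current["tons"], current["patient_days_thousands"])

# Absolute waste fell (120 -> 110 tons), but normalized waste rose
# (3.0 -> 3.7 tons per 1,000 patient-days), suggesting the decline reflects
# lower activity rather than improved waste management.
print(f"baseline: {base_rate:.2f}  current: {curr_rate:.2f} tons per 1,000 patient-days")
```

This kind of calculation is why several reviewed programs, such as H2E, provide normalizing guidance to their partners.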
We discuss the outcome of the best practices review in Chapter 3 under Evaluation Question 4.
In summary, Exhibit 13 maps each evaluation question to methods used to answer it.
EXHIBIT 13: CROSSWALK OF EVALUATION QUESTIONS AND PRIMARY AND SECONDARY DATA COLLECTION METHODS

1. WasteWise uses a variety of approaches to influence the behavior of partners. Which approaches—for example technical assistance, information, awards and recognition—are most effective for which types of partners?
   Primary method(s): Focus group; review of existing program data, including website statistics, award program data, and conference attendance data
   Secondary method(s): USPS survey

2. In addition to participation in WasteWise, what other factors may influence a partner organization’s decisions to improve management of MSW (e.g., cost savings, consumer pressure, other voluntary program opportunities)?
   Primary method(s): Literature review
   Secondary method(s): USPS survey; USPS interviews

3. What can be determined about how WasteWise participation contributes to partner behavior regarding MSW management (e.g., by effecting waste management improvements sooner, better incorporating waste management as a permanent feature of corporate culture, facilitating non-participant changes by providing information)?
   Primary method(s): USPS survey; focus group
   Secondary method(s): USPS interviews; literature review

4. What can EPA do to encourage WasteWise partners to submit sufficient environmental data for performance measurement and evaluation purposes?
   Primary method(s): Best practices review
   Secondary method(s): None


Quality Assurance Procedures

In conducting the evaluation, IEc, ESD, and ORCR agreed on a set of three key quality
assurances:
• IEc and EPA agreed on the key data sources to inform the evaluation, including:
• Existing data and documentation on the WasteWise program,
including data and documents related to partners’ use of WasteWise
program activities and services, such as the WasteWise website,
helpline, annual conference, and awards program
• Previous literature review: Draft Literature Review of Approaches to
Estimating Attribution of Voluntary Program.
• Company websites and publications, including FedEx, UPS, DHL, and
USPS; and government websites, including EPA (e.g., the SmartWay
program), NTIS, and state transportation agencies
• Focus group with representatives from sectors participating in
WasteWise
• Survey of USPS facility staff
• Interviews with select USPS HQ and District staff
• Review of data collection best practices across select EPA partnership
programs and non-EPA voluntary programs
• IEc designed its analyses in the context of the project’s overarching evaluation
questions and the program logic model, and used statistical techniques to describe
the significance of analytical findings where possible and appropriate.
• EPA staff from ESD and ORCR reviewed IEc’s outputs, including:
• Program Evaluation Methodology
• Summary of Award Data
• Summary of Conference Data
• Literature Review
• Focus Group Summary
• USPS Survey Results
• Summary of USPS Interviews
• Best Practices Review
Appendix H contains the Quality Assurance Plan that IEc delivered to EPA in July 2009.


Strengths and Weaknesses of the Methodology

This project has significant strengths that make it unique. The evaluation
methodology is well designed for understanding how and why a partnership program is
effective, which can provide useful information for program managers of WasteWise and
other partnership programs.
The greatest strength is that this evaluation relied on a multitude of data collection and
analytical methods, including a literature review, a focus group, a survey, interviews, a
best practices review, and analyses of existing data. Using multiple sources of
information to address the same question provides the opportunity for findings from one
source to validate or contradict findings from another source. When findings are
validated by more than one information source, it results in increased confidence in the
research findings. As discussed in Chapter 4, several of the evaluation findings are
bolstered by validation from more than one source.
In addition, the USPS survey was designed to discern statistically significant differences
between long-term WasteWise partners and recent joiners regarding waste management
attitudes and behaviors, a good indicator of WasteWise impacts. IEc designed the survey
in conformance with best practices for evaluation research. In particular, IEc:
• Utilized the expertise of a survey expert and statistician to develop the survey and review survey questions;
• Selected a sample large enough to support statistical analysis;
• Used random stratified sampling to ensure geographic representation (a brief sketch follows this list); and
• Set a clear boundary between the two groups to be studied: Group A joined from 1997 to 2004, whereas Group B primarily joined later, with the majority of districts joining in 2007 and 2008.
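As a rough illustration of the stratified sampling bullet above, the sketch below draws a proportional random sample from each geographic stratum. The facility list, the area names, and the 25 percent sampling fraction are hypothetical; they are not the actual sampling frame or rate used for the USPS survey.

```python
import random
from collections import defaultdict

# Hypothetical sampling frame: (facility_id, usps_area). Neither the facilities
# nor the areas reflect the actual survey frame.
facilities = [
    ("F001", "Northeast"), ("F002", "Northeast"), ("F003", "Southeast"),
    ("F004", "Southeast"), ("F005", "Western"), ("F006", "Western"),
    ("F007", "Western"), ("F008", "Great Lakes"),
]

def stratified_sample(frame, fraction, seed=0):
    """Sample the same fraction of facilities from every geographic stratum."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for facility_id, area in frame:
        strata[area].append(facility_id)
    sample = []
    for area, members in strata.items():
        k = max(1, round(fraction * len(members)))  # at least one facility per stratum
        sample.extend(rng.sample(members, k))
    return sample

print(stratified_sample(facilities, fraction=0.25))
```

Sampling within strata, rather than from the pooled list, is what keeps every geographic area represented in the final sample.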

A limitation of the survey is that it includes only USPS facilities as opposed to a broader
sample of WasteWise members. As discussed above, EPA would have had to file for an
ICR to conduct a broader survey. Filing an ICR with OMB for this kind of survey is a
time-consuming process and, based on the program’s ICR history, it is unclear if OMB
would have approved such an ICR. Moreover, an ICR process was beyond the resources
of this evaluation. Although there may be limits to the transferability of USPS findings to
the broader WasteWise universe, IEc used the focus group to compensate for this survey
weakness. In addition, it should be noted that if anything, USPS behavior is a
conservative proxy for behavior of the larger WasteWise universe. USPS has extreme
cost pressures, and is unlikely to sustain a long-term involvement with any voluntary
program that does not offer clear and compelling value to the organization.
A general limitation of this methodology is that although it finds significant evidence of
WasteWise impacts on participant behavior (as discussed in Chapter 4), it cannot quantify
the contribution of WasteWise to changes in waste management attitudes and behavior.
As discussed in the OMB White Paper in Appendix I, methodologies for mathematically
attributing impacts to partnership programs are elusive given the complex ways in which
these programs share information and influence behavior among their memberships as
well as in broader markets, and the variety of factors that influence firm-level
decision-making around environmental issues.


CHAPTER 3 | FINDINGS

In this chapter, we present findings from the WasteWise evaluation, organized by the four
evaluation questions. As discussed in the previous chapter, IEc used multiple methods to
address the first three Evaluation Questions; as such, we synthesize findings across
methods for these evaluation questions. We include key data and exhibits from interim
deliverables to explain and illustrate findings, but do not replicate the full detail of
interim deliverables here. We have included a number of project deliverables in the
Appendices for reference; the literature review is included in Appendix B; the focus
group summary is included in Appendix D; the survey results deliverable is included in
Appendix F; and the Best Practices Review for Data Collection and Quality Control is
included in Appendix G.
EVALUATION QUESTION 1: WasteWise uses a variety of approaches to influence the behavior of partners. Which approaches—for example technical assistance, information, awards and recognition—are most effective for which types of partners?
Findings:

The focus group was the most helpful method to address this question. During the
focus group, IEc was able to collect direct feedback on WasteWise technical tools and the
program’s general approach to interfacing with partners. In the case of conferences and
awards, IEc also had good existing data on their use that served to supplement focus
group findings. The survey was not particularly helpful in addressing Evaluation
Question 1 because materials are often not branded as WasteWise materials at the facility
level in USPS. Hence, facility-level staff may not be aware that technical assistance
materials provided by USPS management integrated the WasteWise framework and
WasteWise content.
The WasteWise awards program reaches many participants and receives very
positive feedback. Focus group participants noted that WasteWise awards resonate with
executives, and many participants find them helpful in promoting their participation in
the WasteWise program, and for communicating their environmental programs to the
public. One participant stated that if his organization had not received a WasteWise
award, they would have stopped recycling marginal commodities three to four years ago.
Because the organization received an award for its recycling program, however, recycling
of the material became standard operating procedure and is now perceived as mandatory
throughout the organization.


The award program reaches many WasteWise partners. As shown in Exhibit 14, a large
number of program participants have been involved in the awards program, with 171
different participants having won one or more awards over the course of the program.
EXHIBIT 14: SUMMARY OF AWARD WINNERS BY NUMBER OF AWARDS WON, 1998 - 2008

NUMBER OF    NUMBER OF       PERCENT OF ALL    TOTAL NUMBER    PERCENT OF ALL
AWARDS       PARTICIPANTS    AWARD WINNERS     OF AWARDS       AWARDS WON

1                78              45.61%             78             15.82%
2-5              65              38.01%            184             37.32%
6-9              21              12.28%            157             31.85%
10-12             7               4.09%             74             15.01%
Total           171             100%               493            100%

To investigate the representation of award winners by sector, IEc identified the top ten
award winners by sector. This is shown in Exhibit 15.
EXHIBIT 15: TOP TEN AWARD WINNERS BY SECTOR, 1998 - 2008

SECTOR                                            NUMBER OF AWARDS    PERCENT OF TOTAL AWARDS

Local Government                                         47                   9.53%
Federal Government                                       45                   9.13%
Utilities                                                40                   8.11%
Furniture Manufacturing                                  34                   6.90%
Colleges and Universities                                30                   6.09%
State Government                                         29                   5.88%
Electronics & Electrical Equipment                       26                   5.27%
Motor Vehicles & Parts                                   24                   4.87%
Scientific, Photographic, & Control Equipment            22                   4.46%
Entertainment                                            17                   3.45%
Total                                                   314                  63.69%

As shown in Exhibit 15, local and federal government agencies won a combined total of
almost 20% of all awards given between 1998 and 2008, with state government agencies
winning an additional 6%. Utilities, furniture manufacturing, and colleges and
universities are also well represented among award winners, accounting for slightly more
than 20% of the total.
The WasteWise conference received generally positive feedback from focus group
participants, but conference data and survey data call the value of conferences into
question. The annual conference received generally positive reviews from focus group
participants. One participant, who has participated since the mid-90s and whose
organization is involved in many other voluntary programs, finds the WasteWise
conference to be the overall best-run conference of its type. However, another participant
thought the conference was too small and is of limited value because it is focused on the
awards ceremony. The networking opportunities provided by the conference were
commended by most focus group participants, and many of them expressed an interest in
expanding the networking opportunities available through WasteWise.
The conference attendance data are not as positive as the focus group feedback. IEc
analyzed 2007 and 2008 conference attendance data by participant and sector. There were
210 conference attendees in 2007 and 170 attendees in 2008, a drop of 40 participants.
The data revealed that both the 2007 and 2008 WasteWise Conferences were attended by
federal government participants more than any other WasteWise participant sector, as
shown in Exhibit 16. Federal agencies represent approximately 25% of all participants by
sector that attended in 2007 and 37% in 2008.
EXHIBIT 16: WASTEWISE CONFERENCE ATTENDEES BY SECTOR

SECTOR                                              NUMBER OF         NUMBER OF
                                                    ATTENDEES 2008    ATTENDEES 2007

Federal Government                                        51                63
Electronics and Electrical Equipment                      15                 6
Utilities                                                 11                14
Waste Management Services                                 11                 5
Local Government                                          10                27
Fossil Fuel Production                                    10                 2
Consulting and Employment Services                         7                 7
(sector not identified)                                    6                 3
Colleges and Universities                                  5                 9
Food Manufacturing                                         3                 1
Furniture Manufacturing                                    3                 2
Medical Services                                           3                 2
Motor Vehicles and Parts                                   3                 9
Non-Profit Organization                                    3                 3
Pharmaceuticals                                            3                 3
Schools – K-12                                             3                 3
State Government                                           3                 7
Building Materials                                         2                 0
Chemicals                                                  2                 0
Communication                                              2                 4
Forest and Paper Products                                  2                 4
Retail and Mail Order                                      2                 1
Rubber and Plastic Products                                2                 2
Wholesale Distribution                                     2                 0
Airlines                                                   1                 1
Banking, Finance and Savings                               1                 0
Beverages                                                  1                 1
Computer and Office Equipment                              1                 1
Construction and Engineering                               1                 1
Entertainment                                              1                 1
Food, Drug and Convenience Stores                          1                 0
Hotels, Resorts and Lodging                                1                 0
Scientific, Photographic and Control Equipment             1                 1
Printing and Publishing                                    0                 2
Property Management and Real Estate                        0                 2
Industrial and Farm Equipment                              0                 1
Restaurant and Food Services                               0                 1
Other/Unknown                                              4                17
Total                                                    170               210

Note: Attendees from WasteWise contractor ICF (5 in 2008, 9 in 2007) were excluded from the totals.

Within the federal government agencies, US EPA represents an overwhelming proportion
of federal government participation, not surprisingly. Exhibit 17 shows the breakdown of
WasteWise conference attendees from federal government agencies in 2007 and 2008.
EXHIBIT 17: FEDERAL AGENCY 2007 AND 2008 WASTEWISE CONFERENCE ATTENDANCE

FEDERAL GOVERNMENT AGENCY                                  2008 CONFERENCE    2007 CONFERENCE
                                                           ATTENDANCE         ATTENDANCE

US Environmental Protection Agency                               36                 41
US Postal Service                                                  9                  8
US Department of Agriculture                                       1                  0
US Air Force                                                       1                  0
National Partnership for Environmental Priorities                  1                  0
National Institute of Health                                       1                  4
Naval Facility Engineering Command – Atlantic                      1                  0
Naval Institute for Dental and Biomedical Research                 1                  0
US Army, Fort Hood                                                 0                  1
Department of Homeland Security                                    0                  1
Pentagon                                                           0                  2
Sandia National Labs                                               0                  1
Federal Aviation Administration                                    0                  3
Oak Ridge Lab                                                      0                  2


EPA represents approximately 65% of all federal agency attendees at the 2007 WasteWise
Conference, and approximately 70% in 2008. The next largest federal representation in
both years was the US Postal Service (USPS); however, USPS represented less than
20% of federal attendees in 2007 and 2008. The National Institute of Health is the only
other federal group that attended both the 2007 and 2008 WasteWise Conferences, with all other
federal agencies attending only one of the two conferences. Sandia National Labs, which
is historically one of the highest WasteWise award winners, having won ten awards
between 1998 and 2008, sent one representative to the 2007 Conference and had no
representatives at the 2008 Conference. This high degree of variability in federal agency
attendance could be interpreted in different ways. On one hand, it is a positive sign that
different agencies are interested in the WasteWise program. On the other hand, it is
curious that many agencies that attended in 2007 did not attend in 2008.
Examining the remaining top five sectors for attendance, we found a similar variability
among participation rates. Exhibit 18 shows the number of partners in each of these
sectors that were represented at both the 2007 and 2008 WasteWise conferences. The
totals reflect the number of partners and not total participation, as several partners sent
more than one representative.
EXHIBIT 18: CONFERENCE REPEAT ATTENDEES BY SECTOR

                                        2007 CONFERENCE:    2008 CONFERENCE:    # OF PARTNERS WITH
                                        # OF PARTNERS       # OF PARTNERS       REPEAT ATTENDANCE
SECTOR                                  REPRESENTED         REPRESENTED         IN 2007 AND 2008

Electronics and Electrical Equipment           3                   5                    2
Utilities                                      8                   6                    2
Waste Management Services                      5                   8                    1
Local Government                              25                   8                    4

As seen in Exhibit 18, three of the sectors had similar participation numbers at both
conferences. There is little repeat attendance among any of the four sectors in 2007 and
2008. Local government representation varied greatly between 2007 and 2008, which
may be an effect of the economic downturn. Although repeat attendance for this sector is
the highest, the rate of repeat attendance is still low. While we would have to look at a
longer time period to analyze trends in conference attendance with confidence, the low level of
repeat attendance from 2007 to 2008 is not a positive indicator for the WasteWise
conference. However, partners that attended both conferences sent a similar number of
representatives, if not more, to the 2008 conference. For example, Raytheon sent four
representatives to the 2007 conference and 10 in 2008.
Finally, although the USPS survey is generally of limited value in assessing WasteWise
approaches because of the lack of WasteWise branding at the USPS facility level, it is
worth noting that conferences in general (not just WasteWise conferences) were the tools
least cited as influencing waste management in the USPS survey, as shown in Exhibit 19.
Furthermore, the majority of survey respondents were not familiar with the WasteWise
conference in particular.
EXHIBIT 19: TOOLS THAT INFLUENCE WASTE MANAGEMENT ACTIVITIES AT FACILITIES

[Bar chart showing the number of facilities (0 to 100) citing each tool: conferences, staff/group meetings, training sessions, internal emails, fact sheets, information from other USPS facilities, and direction from Area and/or District staff members.]

WasteWise receives consistently positive feedback on the technical tools offered to
partners, including greenhouse gas calculations, Re-TRAC, the program website, and the
helpline. Greenhouse gas calculations were cited by focus group participants as
one of the WasteWise tools used extensively. Participants noted that the fact that the
calculations come from EPA gave them credibility within their organizations. One
participant went as far as to say that the use of the calculations was a key component of
their continued involvement in the program. Although survey respondents were generally
not familiar with WasteWise tools, among facilities that were familiar with these tools,
the WARM greenhouse gas calculations received the most positive reviews.
Similarly, focus group participants indicated that the Re-TRAC system is extremely
helpful, and were very enthusiastic about the system’s ability to assist with waste
management and reporting. The ability to select different commodities was a popular
component of Re-TRAC. (As discussed under Evaluation Question 4, Re-TRAC has also
been critical for facilitating robust data collection and management for performance
measurement.) USPS interviewees noted that Re-TRAC is a key benefit for assisting new
partners in particular with waste tracking.


Some focus group members raised questions about how up-to-date the greenhouse gas
calculators are. One participant expressed concern that portions of the website have not
been updated in several years and that the WARM model, in particular, was not reflective
of current advancements in GHG calculations. Also, many participants expressed interest
in syncing their internal greenhouse gas calculators with the calculators offered through
the WasteWise program, to streamline GHG monitoring and reporting.
Focus group participants largely agreed about the overall helpfulness of the website, due
in particular to the availability of useful resources, calculators, and methodologies.
WasteWise partners who are aware of the helpline report that they find it extremely
helpful, especially with regard to seeking out information about annual reporting and
award applications. However, some participants were completely unaware of the helpline,
or unaware of the breadth of service that it provides.
Despite overall enthusiasm for WasteWise technical tools, some WasteWise partners
perceive that EPA’s communication on the availability of these tools is lacking.
Participants who did not utilize specific tools often cited their lack of knowledge about
them. One participant suggested that EPA provide training sessions aimed at new
members that would involve using the website, annual reporting, and the applicability of
WasteWise tools.
WasteWise partners are hungry for more communication from the program.
During the focus group, the discussion of WasteWise tools led to a broader discussion on
the dissemination of information throughout the WasteWise program. There was a clear
divide between perceptions of long-term members and more recent joiners. Several
newer members had not used the helpline and were not aware of the services provided by
it. One new joiner indicated that his involvement was minimal due to a lack of
information and training. A long-term member indicated that, in the past, WasteWise
information was much more prevalent and available, but over the past year, the level of
information he received from WasteWise had dropped drastically. However, not all long-term members agreed with this sentiment. Participants did share general agreement that
the WasteWise contact information is out of date and that information distribution is not
reaching all members.
Partners are looking for several ways to become more informed about WasteWise and
take better advantage of program offerings, including:

• Training for using the WasteWise tools, annual reporting, and award applications.

• More frequent contact from WasteWise about annual reporting, award applications, and other program announcements.

• An updated, browsable, online WasteWise directory to replace the current system, which only allows for searching but not browsing.

• Opportunities for newer members to network with older members who have won awards and who are more knowledgeable regarding annual reporting and other aspects of WasteWise.


Finally, focus group participants indicated that EPA should work harder to champion the
importance of WasteWise to increase its potential as a means of establishing closer
strategic relationships with other members (who may be suppliers).
Sector-based differences in perceptions of WasteWise approaches are minimal. The
focus group participants did not express divisions in perceptions of WasteWise
approaches among sectors, with the one exception of the waste management sector,
whose representative expressed concern that many WasteWise approaches were not
generally applicable to the sector. From IEc’s review of existing data, it appears that a
diversity of sectors participate in and benefit from the awards program. Although certain
sectors attend the conferences more than others (USPS, local government partners,
electronics/electrical equipment manufacturers, utilities), these attendance numbers
appear to correlate with the sectors’ overall facility participation rates in WasteWise. It is
interesting that USPS is highly represented at conferences, given the low marks that
facility-level staff gave to conferences in general. However, it is quite possible that
USPS District and Area staff attend WasteWise conferences, as opposed to the facility-level staff that participated in the survey.
EVALUATION QUESTION 2: In addition to participation in WasteWise, what other factors may influence a partner organization’s decisions to improve management of MSW (e.g., cost savings, consumer pressure, other voluntary program opportunities)?
Findings:

IEc used the literature review as the main method to address this evaluation question; the
complete literature review is included in Appendix B. The literature review is comprised
of a targeted review and analysis of recent literature related to partnership programs,
focusing on the identification of the key external factors (i.e., factors not part of program
design) that may influence decisions to participate in the WasteWise program and to
change management practices. Below, we summarize findings of the literature review.
Under Evaluation Questions 3 and 4, we refer back to literature review findings in the
discussion of WasteWise impacts and the discussion of best practices for data collection
and quality control.
The literature review identified 12 general factors that influence environmental decision-making. IEc grouped these factors into the following categories: external market forces,
potentially complementary factors, pre-existing requirements, and factors with uncertain
impacts vis-à-vis WasteWise.
External market forces includes two factors, production levels/market trends and firm
size. Decreases in waste generation may be the direct result of a decrease in production
levels to respond to broader market forces. Broader market or sector trends can have a
direct effect on the changes in waste generation and waste management reported by
existing partners. Thus, production levels could result in overstatement or understatement
of WasteWise impacts. (In consideration of this dynamic, IEc addresses the issue of
normalizing for economic conditions within the Best Practices Review and Evaluation
Question 4). The firm size factor indicates that different sizes may have different
motivations for joining partnership programs. For example, larger firms that are more
likely to have sophisticated waste management approaches may focus on recognition,
while smaller companies may find technical assistance more important.
Potentially complementary factors to WasteWise include four factors: customer/supply
chain pressure, community pressure/public image, environmental ethic, and cost savings.
EPA has traditionally addressed these factors as “alternative” motivations to WasteWise,
and has discounted the role of WasteWise and other voluntary programs in partner
outcomes when these factors are clearly present. The literature, however, suggests that
these factors can work in complementary ways with WasteWise and similar programs by
assisting firms in obtaining and sharing information, and in adopting practices that confer
cost savings, demonstrate responsiveness to suppliers/public, and demonstrate adherence
to the firm’s environmental ethic. For example, a firm may join WasteWise to
demonstrate an environmental ethic, and may also, as a result of WasteWise, implement a
waste management plan earlier or on a broader scale, and thus enjoy greater cost savings.
Therefore, WasteWise may provide specific program resources or activities that represent
real program achievement, even in the context of other motivations. To evaluate the
impact of these factors on a particular firm, one would need to understand the role of
WasteWise and the extent to which the program’s tools, resources, and activities
contributed to the waste reduction or management outcomes.
Pre-existing requirements: The literature review found that in situations where
partners have separate, pre-existing requirements associated with other regulatory or
legally-binding agreements, these requirements are likely to drive documented waste
management changes, and WasteWise participation would have little or no impact. The
literature review indicates that changes in waste generation at companies that, for
example, are subject to state-level waste bans for certain wastes, should not be considered
the result of WasteWise activities. As discussed under Evaluation Question 3,
information that IEc collected from interviews with USPS conflicts to a certain degree
with this finding.
Uncertain impacts: The remaining five factors fall under the category of uncertain
impacts because it is not clear in general whether they complement WasteWise’s
structure, or indicate a motive that precludes a significant impact by WasteWise. These
factors include public disclosure laws, threat of future regulation, pressure from
environmental groups, and industry pressure or internal industry codes. The impact of
each of these factors is context-specific, and a complete evaluation of WasteWise impacts
requires firm-specific information to determine how these factors intersect with
WasteWise activities. The fifth uncertain impact, participation in other voluntary
programs, raises uncertainty because of potential double counting (this factor is addressed
in best practices review under Evaluation Question 4).


EVALUATION QUESTION 3: What can be determined about how WasteWise participation contributes to partner behavior regarding MSW management (e.g., by effecting waste management improvements sooner, better incorporating waste management as a permanent feature of corporate culture, facilitating non-participant changes by providing information)?
Findings:

The survey results provide clear evidence that WasteWise contributes to better
waste management practices among USPS facilities. Early WasteWise joiners (Group
A) reported greener approaches to waste management overall compared to later joiners
(Group B), and many results were statistically significant. Specifically:
• Early USPS WasteWise joiners (Group A) conduct more recycling activities
than later joiners (Group B).
The survey asked respondents about recycling activities that are undertaken at their
facility. As shown in Exhibit 20, on average, participants in Group A reported 2.77
recycling activities per facility, versus 2.00 activities for participants in Group B. This
difference is statistically significant at the 1% level (t = 3.13).
EXHIBIT 20: RECYCLING ACTIVITIES AT USPS FACILITIES

RECYCLING ACTIVITY                                              GROUP A    GROUP B    TOTAL
                                                                (N=30)     (N=102)

Reverse hauling of undeliverable mail                              19         48        67
Separate collection/contracts with recyclers in addition
to waste haulers                                                    19         61        80
Participate in specific recycling approach identified by
local government                                                     6         15        21
Work with post offices to collect waste materials from
customers (e.g., unwanted mail from customer PO Boxes)              21         31        52
Reuse of recycled materials in-house                                14         30        44
Other                                                                4         19        23
Total number of recycling activities                                83        204       287
Average number of recycling activities per facility               2.77       2.00      2.17
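For readers who want to see how a comparison like the one reported above (t = 3.13) is computed, the sketch below implements a standard pooled two-sample t statistic from summary statistics. The group means and sizes are taken from Exhibit 20, but the standard deviations are hypothetical placeholders because the report does not publish them, so the sketch is not expected to reproduce the reported value exactly.

```python
import math

def pooled_t_statistic(mean1, sd1, n1, mean2, sd2, n2):
    """Two-sample t statistic with a pooled variance estimate (equal-variance form)."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    standard_error = math.sqrt(pooled_var * (1 / n1 + 1 / n2))
    return (mean1 - mean2) / standard_error

# Means and group sizes from Exhibit 20; the standard deviations below are
# hypothetical, since the report does not include them.
t = pooled_t_statistic(mean1=2.77, sd1=1.2, n1=30, mean2=2.00, sd2=1.2, n2=102)
print(f"t = {t:.2f} on {30 + 102 - 2} degrees of freedom")
```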

• Early USPS WasteWise joiners (Group A) have higher recycling frequencies for
every material, and a higher recycling frequency across materials, compared to
later joiners (Group B).
The survey asked a series of questions about the frequency of recycling for a variety of
materials (undeliverable mail, plastic pallets, wooden pallets, corrugated cardboard,
mixed paper, office supplies, and plastic). For each material, the survey asked facilities
if the material is recycled:
• Always or almost always (90-100% of the time)
• Usually (50-90% of the time)
• Occasionally (10-50% of the time)
• Rarely or never (0-10% of the time)
For communication purposes, IEc color-coded results of the recycling frequency question
using King County’s Environmental Behavior Index (EBI) [8], presented in Exhibit 21.
EXHIBIT 21: ENVIRONMENTAL BEHAVIOR INDEX CLASSIFICATION

FINDING ON RECYCLING FREQUENCY                                      COLOR CODING

Always/Almost Always: 90 – 100% of the time                         Green
Usually: 50 – 90% of the time                                       Light Green
Occasionally: 10 – 50% of the time                                  Yellow
Rarely/Never: 0 – 10% of the time                                   Brown
Other                                                               White
Not Applicable: this facility does not use/receive the material     White

[8] The EBI approach involves coding responses to communicate the environmental soundness of different actions (e.g., green indicates most environmentally sound action, brown indicates least environmentally sound). King County, Washington, used the EBI approach to communicate survey results on the adoption of environmentally preferable behaviors among County residents.

IEc analyzed material-specific results and rolled up results across materials. Exhibit 22
presents a rollup of recycling frequency across all materials. As shown in Exhibit 22,
Group A more frequently indicated that materials are always or almost always recycled,
and Group B more frequently indicated that materials are rarely or never recycled.
Material-specific results can be found in the WasteWise Survey Results memo in
Appendix F.
EXHIBIT 22: RECYCLING FREQUENCY ACROSS ALL MATERIALS (ROLLUP ANALYSIS)

RECYCLING FREQUENCY                              GROUP A    GROUP B    DIFFERENCE

Always/Almost Always: 90 – 100% of the time       69.31%     55.46%       13.85%
Usually: 50 – 90% of the time                     15.84%     11.93%        3.91%
Occasionally: 10 – 50% of the time                 1.98%      4.74%       -2.76%
Rarely/Never: 0 – 10% of the time                  6.44%     19.97%      -13.53%
Other                                              5.45%      6.75%       -1.30%
Not Applicable                                     0.99%      1.15%       -0.16%
Total                                            100.00%    100.00%
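The rollup in Exhibit 22 pools every facility-material response within a group and reports the share falling in each frequency category. A minimal sketch of that tabulation appears below; the response records are hypothetical stand-ins for the survey data.

```python
from collections import Counter

# Hypothetical (group, material, frequency) responses; the real analysis pooled
# seven materials across all responding facilities in each group.
responses = [
    ("A", "cardboard", "Always/Almost Always"),
    ("A", "mixed paper", "Usually"),
    ("A", "plastics", "Rarely/Never"),
    ("B", "cardboard", "Always/Almost Always"),
    ("B", "plastics", "Rarely/Never"),
    ("B", "mixed paper", "Rarely/Never"),
]

def rollup_shares(records, group):
    """Share of all facility-material responses in each frequency category for one group."""
    counts = Counter(freq for g, _material, freq in records if g == group)
    total = sum(counts.values())
    return {freq: count / total for freq, count in counts.items()}

for group in ("A", "B"):
    print(group, rollup_shares(responses, group))
```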

We conducted a statistical analysis of the difference in recycling frequency of Group A
and Group B for always/almost always recycle and rarely/never recycle. On average,
participants in Group A reported always/almost always recycling 4.7 materials, versus 3.8
materials for participants in Group B. This difference is statistically significant at the 5%
level (t = 2.52). Participants in Group A rarely/never recycle an average of 0.4 materials,
while Group B reported rarely/never recycling an average of 1.4 materials, as presented
in Exhibit 23. This difference is statistically significant at the 1% level (t = 3.24). We
did not conduct statistical analyses for the “usually” and “occasionally” frequencies, as
those two categories were very broad, accounting for frequencies ranging from 10 to
90%.
EXHIBIT 23: STATISTICAL ANALYSES OF RECYCLING FREQUENCIES ACROSS ALL MATERIALS (ROLLUP ANALYSIS)

RECYCLING FREQUENCY                             AVERAGE # MATERIALS,    AVERAGE # MATERIALS,    TEST
                                                GROUP A                 GROUP B                 STATISTIC [9]

Always/Almost Always: 90 – 100% of the time             4.8                     3.8             2.5214**
Rarely/Never: 0 – 10% of the time                       0.4                     1.4             3.1846***

[9] *** denotes 99% significance level, ** denotes 95% significance level, and * denotes 90% significance level.

As shown in Exhibit 24, recycling frequency varied by material. Cardboard was the
material most frequently cited as being recycled always or almost always, with 97% of
respondents from Group A reporting that cardboard is always or almost always recycled.
Cardboard was also the most frequently reported material for Group B, with 81% of
respondents indicating that cardboard is always or almost always recycled. Undeliverable
mail was the second most recycled material for both groups.
Group A reported higher recycling rates than Group B for every individual material.
Differences in recycling rates between Group A and Group B ranged from a small
difference, such as 3% for office supplies, to a larger difference of 20% for recycling
plastic pallets. In general, the difference in responses ranged from 10 to 15%. The
survey results memo in Appendix F provides details on recycling rates by material for
Group A and Group B.
EXHIBIT 24: FREQUENCY OF ALWAYS/ALMOST ALWAYS RECYCLING RESPONSES BY MATERIAL

MATERIAL                  GROUP A    GROUP B    DIFFERENCE

Corrugated Cardboard       96.55%     81.00%       15.55%
Undeliverable Mail         89.66%     71.43%       18.23%
Mixed Paper                75.00%     58.59%       16.41%
Office Supplies            68.97%     66.00%        2.97%
Wooden Pallets             62.07%     48.51%       13.55%
Plastic Pallets            58.62%     38.38%       20.24%
Plastics                   34.48%     24.24%       10.24%
Average                    69.33%     55.45%       13.88%

Note: Percentages cannot be aggregated because this table presents only the frequency of selecting always/almost always recycles; Appendix F contains detailed results for the response options provided for this question.

• Early USPS WasteWise joiners (Group A) have been recycling for a longer time
than later joiners (Group B).

The survey asked USPS facilities about the tenure of recycling activities. IEc analyzed
material-specific results (presented in the survey results memo in Appendix F) and rolled
up results across materials. As shown in Exhibit 25, respondents most frequently
indicated a recycling tenure of more than five years, across all materials. However, 53%
of respondents from Group A reported first recycling materials more than five years ago,
compared to 40% in Group B. In addition, Group A reported that facilities started
recycling an average of 3.7 materials more than five years ago. Group B reported first
recycling an average of 2.7 materials more than five years ago. This difference is
statistically significant at the 5% level (t = 2.13).
EXHIBIT 25: TENURE OF RECYCLING ACROSS ALL MATERIALS (ROLLUP ANALYSIS)

TENURE OF RECYCLING                                      GROUP A    GROUP B    DIFFERENCE

More than 5 years ago                                     52.74%     40.18%       12.56%
3 – 5 years ago                                           10.95%     11.09%       -0.14%
2 – 3 years ago                                            3.48%      7.50%       -4.02%
1 – 2 years ago                                            9.95%      6.60%        3.35%
6 – 12 months ago                                          2.49%      2.40%        0.09%
In the past 6 months                                       1.00%      1.50%       -0.50%
I do not know                                             12.94%     10.49%        2.45%
Question was not asked (material is rarely/never
recycled)                                                  6.47%     20.24%      -13.77%
Total                                                    100.00%    100.00%

• Early USPS WasteWise joiners (Group A) are more aware of their recycling
rates than later joiners (Group B).
The survey asked about awareness of the facility’s recycling rate across materials. As
shown in Exhibit 26, over 70% of respondents from Group A indicated that they either
know their recycling rate, or could research it for all or some materials, while just over
50% from Group B reported the same. This difference is statistically significant at the
10% level (z = 1.92). Very few facilities in either group knew their overall recycling rate
off-hand.
EXHIBIT 26: AWARENESS OF RECYCLING RATE ACROSS ALL MATERIALS

DO YOU KNOW THE APPROXIMATE RECYCLING RATE FOR THE
MATERIALS YOUR FACILITY RECYCLED IN 2008?                         GROUP A    GROUP B    DIFFERENCE

No, this metric is not tracked.                                    28.57%     48.98%      -20.41%
I know or could research recycling rates for some of the
materials we recycle, but not all.                                 32.14%     24.49%        7.65%
Yes, but I would need to research it.                              32.14%     18.37%       13.77%
Yes, I roughly know the percentage of materials that were
recycled.                                                           7.14%      8.16%       -1.02%
Total                                                             100.00%    100.00%
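The significance claim above (z = 1.92) is a comparison of two proportions: the share of each group that knows or could research its recycling rate. The sketch below shows a standard pooled two-proportion z statistic; the respondent counts are hypothetical illustrations loosely inferred from the percentages, so the sketch is not expected to reproduce the reported z value.

```python
import math

def two_proportion_z(successes1, n1, successes2, n2):
    """Pooled two-proportion z statistic for the difference between two sample proportions."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    standard_error = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / standard_error

# Hypothetical counts: roughly 71% of 28 Group A respondents and 51% of 49
# Group B respondents reported knowing or being able to research their rate.
z = two_proportion_z(successes1=20, n1=28, successes2=25, n2=49)
print(f"z = {z:.2f}")
```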


For two survey questions, results indicated that Group A has greener practices than Group
B, but results were not statistically significant. These include:
• Frequency of changes to operations or to the facility’s organization resulting from
recycling and waste prevention.
• Number of waste prevention activities.
Differences between Group A and Group B on the following survey questions were
mixed and/or marginal:
• Changes in attitudes of facility personnel about waste prevention.
• Waste prevention/recycling leading to other environmental initiatives.
Additional information on the above survey questions and responses can be found in the
survey results memo in Appendix F.
Survey respondents cite several reasons for initiating recycling that are potentially
proxies for WasteWise influence, or complementary to WasteWise factors.
The survey asked USPS staff why they started to recycle various materials; results are
presented in Exhibit 27. Cost savings opportunity was the most common response, cited
527 times across all materials. As discussed under Evaluation Question 2, despite
conventional wisdom, the role of cost savings in motivating behavior does not necessarily
detract from the role of WasteWise, as WasteWise is designed to help facilities realize
cost savings from waste prevention and recycling. In fact, focus group participants and
USPS interviewees indicated that WasteWise provided cost savings opportunities, as
discussed later in this section.
Encouragement from District/Area representatives was the second most common
response, with 312 responses across the two groups. Requirement of District/Area
representatives was the fourth most frequently cited reason for first recycling materials,
with 188 responses across the two groups. Given that District and Area representatives
use WasteWise as the organizing framework for USPS waste management approaches,
and these representatives are a conduit for WasteWise information to USPS facilities, we
view these responses as potential proxy indicators for WasteWise influence at the facility
level. EPA voluntary program participation, another proxy for WasteWise, was cited 57
times across the two groups.
Survey respondents cited local initiatives 90 times across Groups A and B, and cited state
or local requirements 36 times. These factors are more likely to discount the role of
WasteWise in firm behavior, but they are also far less common than potential proxies for
WasteWise, and potential complementary factors to WasteWise. Also, the effect of state
and local requirements in this context is non-linear and difficult to decipher; see further
discussion of WasteWise’s role in USPS response to waste bans within the discussion of
WasteWise interview findings below.


EXHIBIT 27: REASONS CITED FOR WHY FACILITIES STARTED RECYCLING (ACROSS MATERIALS)

REASON FOR RECYCLING                            TIMES CITED BY    TIMES CITED BY             GROUP A RESULTS      GROUP B RESULTS
(CHECK ALL THAT APPLY)                          GROUP A (N=30)    GROUP B (N=102)    TOTAL   NORMALIZED BY        NORMALIZED BY
                                                                                             FACILITY COUNT       FACILITY COUNT

Cost savings opportunity                              133               394            527         4.4                  3.9
District/Area representatives encouraged it            98               214            312         2.5                  2.1
District/Area representatives required it              76               112            188         0.9                  1.1
Local initiatives                                      26                64             90         0.6                  0.6
EPA voluntary program participation                    19                38             57         0.2                  0.4
Other                                                   7                46             53         0.5                  0.5
Required by local or state law                         15                21             36         0.5                  0.2
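The normalized columns in Exhibit 27 appear to divide the number of times a reason was cited by the number of responding facilities in each group (N = 30 and N = 102); that interpretation is an inference from the table rather than a stated formula, but the cost savings row reproduces it. The snippet below checks that row.

```python
# Normalization check for the cost savings row of Exhibit 27, assuming the
# normalized value is simply times cited divided by group facility count.
group_a_citations, group_a_facilities = 133, 30
group_b_citations, group_b_facilities = 394, 102

print(round(group_a_citations / group_a_facilities, 1))  # 4.4, matching the exhibit
print(round(group_b_citations / group_b_facilities, 1))  # 3.9, matching the exhibit
```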

Self-selection bias is unlikely to explain the extent of differences found in the survey
between early and later WasteWise joiners. Early USPS WasteWise joiners may have
benefited from Area and District management that were generally more proactive on
environmental issues than Group B. Thus, one could argue that Group A has a self-selection bias, and may have undertaken the improvements to waste management seen in the
survey results in the absence of WasteWise. We therefore looked for evidence, beyond
differences in Group A and Group B, that WasteWise contributed or did not contribute to
waste management practices. We found some indications of WasteWise’s direct
influence from survey responses, which counter the notion that proactive Area and
District management can explain differences in Group A and Group B:
• The timing of when facilities started to recycle is generally consistent with when
facilities joined WasteWise. Group A joined from 1997 to 2004 whereas Group B
primarily joined later, with the majority of districts joining in 2007 and 2008. As
discussed above, the survey found a statistically significant difference in the
number of facilities that started recycling or improved waste management over
five years ago between Group A and Group B, with many more facilities in Group
A starting recycling more than five years ago. Moreover, given that Group B
joined WasteWise mostly in the 2007-2008 timeframe, we would expect to see
more Group B recycling activity starting during this time if the activity was tied to
joining WasteWise. Across all materials, the proportion of respondents in Group
B that started recycling 2-3 years ago is 4% more than Group A respondents.
However, this difference is larger for individual materials, including a 9%
difference for undeliverable mail recycling, and a 6% difference in plastic pallet
and cardboard recycling. To the extent that WasteWise efforts have been directed
at these materials over the last few years, this would be further evidence of
causality.
• Although we did not expect respondents at the facility level to be familiar with
WasteWise by name, some facilities directly cited WasteWise as a reason for
originating recycling activities, including 27% of respondents in Group A and
14% from Group B. Moreover, the higher proportion of respondents in Group A
citing WasteWise as a factor in originating recycling activities does not support
the notion that self-selection bias accounts for differences seen in Group A and
Group B.
• Many survey respondents from both Group A and Group B indicated that
District and Area representatives either encouraged or (less frequently) required
recycling of various materials, as shown in Exhibit 27. Given that District and
Area representatives are a conduit for WasteWise information to USPS facilities,
these are potential proxy indicators for WasteWise influence, as discussed above.
As shown in Exhibit 27, the number of times that influence of District and Area
representatives was cited is similar in Group A and Group B, which does not
support the notion that District and Area staff are generally more proactive in
Group A. In addition, if being independently environmentally proactive was the
main reason that Group A started recycling earlier, we would not expect to see
cost savings cited more by Group A (on a normalized basis) as a factor motivating
behavior.
Focus group results and USPS interviews validate survey findings that WasteWise
contributes to changes in waste management.
• Several focus group participants identified tangible waste prevention or
recycling achievements that WasteWise contributed to. Specific benefits of
WasteWise cited by focus group participants include:
- Initiating waste management initiatives that led to environmental benefits and
cost savings, and which would not have occurred outside of WasteWise, or
would have occurred later in the absence of WasteWise.
- Continuing greener waste management practices that were environmentally
preferable but not justified on a cost basis, because the firm had communicated
the improved practices to stakeholders.
- Using WasteWise data and framework to support broader sustainability goals
such as carbon neutrality and green building operations.
- Communicating waste management achievements to stakeholders.
Specifically, focus group members stated that communications to the public
about waste management were more credible when mentioning WasteWise, and
that these communications contributed to the practices becoming standard,
permanent procedures.


• USPS staff interviewed reported that early joiners have tangibly benefited from
cumulative knowledge gained from being a WasteWise partner, which has led to
better results seen in the survey. This assessment was provided by all three of the
USPS interviewees, including an interviewee from a district that joined later.
Interviewees underscored that WasteWise gave USPS a framework and a game
plan to implement the organization’s general waste management goals. One
interviewee stressed that WasteWise assistance in identifying recycling markets
was quite valuable and led to recycling of additional waste streams.
One USPS interviewee indicated that the organization used WasteWise as a
framework to respond to a patchwork of different state waste bans in a coherent,
efficient way. Without the framework and tools offered by WasteWise, the
interviewee stated that USPS would have taken longer to come into compliance
with the waste bans, and compliance costs would have been higher. The literature
review underscores that WasteWise cannot take “credit” for mandated
environmental improvements. However, if data were available, WasteWise could
conceivably take credit in this case for “early adoption” – the mandated
improvements that occurred prior to the compliance deadline. Moreover, the
program could take credit for compliance cost savings associated with the new
regulation.
The inability to conduct the district survey hindered learning about some potential
areas of WasteWise influence on USPS, including personnel attitudes on waste
management; and changes in relationships with regulators, suppliers, competitors,
and the public. The facility survey indicated little difference between Group A and B
on personnel attitudes; interviewees noted that WasteWise influence on attitudes is
mostly seen at the district level. Given IEc’s understanding of USPS structure, we did
not ask questions about changes in relationships in the facility-level questionnaire, as
these questions are likely not applicable to facility-level staff. Transmission of WasteWise
principles and activities through partner organizations and/or to external organizations
(i.e., spillover effects) would, if found, be a significant area of program benefit.
We discuss this further in the OMB White Paper in Appendix I.
EVALUATION QUESTION 4: What can EPA do to encourage WasteWise partners to
submit sufficient environmental data for performance measurement and evaluation
purposes?
Findings:

We discuss the findings of the best practices review below; the complete best practices
review is included in Appendix G.
It should be noted that since IEc commenced this evaluation in late 2008, EPA instituted
new data reporting requirements for WasteWise. For example, WasteWise now requires
partners to sign a Partnership Agreement when registering, and the program has
developed a Partnership Assurance Protocol requiring partners to report baseline and
annual data in order to remain in active status. In addition, WasteWise transitioned to
a fully online reporting system in July 2009. The best practices review took these recent
program changes into account.
Also, it would be unrealistic to expect WasteWise or any other EPA partnership program
to implement all of the data collection and QA/QC best practices identified by IEc in the
best practices review. Some of the best practices are resource intensive, and cannot be
implemented in the absence of staff able to dedicate much of their time to performance
measurement. In addition, some of the best practices and specific data recommendations
identified in this report may fall outside of the current ICR and approved data collection
forms.
WasteWise is now collecting data necessary to establish a credible baseline.
WasteWise has requested baseline data from new partners since 2005, although most
partners did not provide baseline data on a voluntary basis. As noted above, WasteWise
recently developed a Partnership Agreement that requires partners to register in the
program and submit baseline and annual data. Upon registering for WasteWise, the
“Welcome to WasteWise” email generated by Re-TRAC communicates that prospective
partners need to report baseline information within 60 days of joining the program.
Partnership is activated by submitting baseline data; for example, EPA lists the entity as a
partner and distributes an electronic logo once the data are submitted. If EPA does not
receive data within 60 days, the Re-TRAC account is deactivated, and partnership is
never established. WasteWise staff may grant extensions to this reporting schedule on a
case-by-case basis depending on partner circumstances.
Now that WasteWise is collecting baseline and annual data (see below) for all partners, it
will develop a data set that could be mined for performance measurement purposes in the
future. Although this evaluation takes an in-depth look at changes in partner behavior
associated with WasteWise membership, the program will be in a better position to
analyze program environmental outcomes in a few years, when it has accrued enough
baseline and annual data to support trend analysis.
WasteWise has created a powerful incentive for program participation and
reporting by offering free access to Re-TRAC. Re-TRAC is popular, proprietary
online software that assists organizations in tracking waste prevention, disposal, and
recycling at the commodity level. In the absence of WasteWise, organizations pay a
subscription fee to use Re-TRAC. Offering Re-TRAC for free to WasteWise participants
is a key program benefit. Focus group participants and USPS interviewees indicated that
Re-TRAC is very helpful for tracking waste minimization and recycling efforts.
WasteWise also provides a GHG report on the waste and recycling data reported, which
focus group members identified as a valuable service. Independent of this evaluation,
EPA has received positive feedback on Re-TRAC from its partnership.10
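To make the GHG reporting service concrete, the following is a minimal illustrative sketch (in Python) of how reported diversion tonnages could be converted into a greenhouse gas figure. The commodity names and emission factors shown are hypothetical placeholders, not actual WARM model values or Re-TRAC fields.

# Illustrative sketch only: converts reported waste diversion into a rough
# GHG-equivalent figure. The factors below are placeholders, NOT actual
# WARM model values; EPA's WARM model is the authoritative source.
ILLUSTRATIVE_FACTORS_MTCO2E_PER_TON = {
    "mixed_paper_recycled": 3.5,   # hypothetical factor
    "cardboard_recycled": 3.1,     # hypothetical factor
}

def ghg_equivalent(diversion_tons):
    """Sum GHG-equivalent reductions across the commodities a partner reports."""
    total = 0.0
    for commodity, tons in diversion_tons.items():
        factor = ILLUSTRATIVE_FACTORS_MTCO2E_PER_TON.get(commodity)
        if factor is None:
            continue  # a production report would flag unrecognized commodities
        total += tons * factor
    return total

# Example: a partner reporting 120 tons of mixed paper and 40 tons of cardboard
print(ghg_equivalent({"mixed_paper_recycled": 120, "cardboard_recycled": 40}))
# 120 * 3.5 + 40 * 3.1 = 544.0 (illustrative figure only)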
WasteWise has taken steps to encourage participant adherence to the program’s
reporting standards, although EPA could do more to improve the first-time quality
of data submitted by partners. The new WasteWise Partnership Assurance Protocol
requires annual reporting by March 31st for the previous calendar year. Re-TRAC
facilitates incremental reporting, so facilities can enter data weekly or monthly, or at
intervals customized by a partner, as an alternative to entering annual quantities.
(Re-TRAC automatically sums data reported across time at the end of the year to develop
annual quantities.) EPA deactivates WasteWise partnership for partners who do not
submit annual data within 60 days of the March 31st deadline, unless EPA staff grants an
extension for extenuating circumstances.

10 WasteWise Re-TRAC factsheet, available at: http://www.epa.gov/osw/partnerships/WasteWise/pubs/retrac.pdf
Re-TRAC provides instructions to participants for reporting waste data; it also offers
guidance on data collection practices and links to FAQs and other
resources. In addition, WasteWise advertises on its reporting page that the program
hotline can provide assistance with data reporting; hotlines can be very effective tools for
ensuring partner understanding of program requirements and compliance with them.
Finally, WasteWise maintains a library of on-line documents to assist partners with
reporting-related activities including: setting up recycling programs, conducting waste
audits, monitoring program effectiveness, and communicating program results.
Other programs reviewed have taken additional steps to encourage adherence to reporting
standards, including:
• Providing model baseline and annual reports.
• Providing direct links from the online reporting form to a list of common
conversions, to avoid conversion errors.
• Specifying data fields that participants are required to complete before submitting
an annual report. IEc was able to submit a blank test annual report through the
Re-TRAC system. Although all annual reports submitted via Re-TRAC are
reviewed and approved before being formally entered into the WasteWise
database, requiring fields would help to clarify reporting rules, improve first-time
quality, and reduce resources necessary for reviewing reported data (see the
sketch following this list).
• Collecting supplemental information on how partners measure or estimate
reported quantities. For example, recycling data are typically of high quality
because recyclers have an incentive to calculate the exact quantity of materials
collected from suppliers. However, data on waste disposed can be subject to
some errors from conversions, or from questionable methods of estimating
tonnage disposed (as some waste hauling contracts, especially those that are
based on a waste hauling schedule for emptying a set number of dumpsters, do
not generate invoices that specify tonnage disposed). Also, data on waste
prevention typically need to be estimated, often involving a series of
assumptions and calculations that can introduce error and/or uncertainty.
Obtaining this supplemental information may require revisions to existing
WasteWise forms currently approved by OMB.
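As a minimal sketch of the required-field check described in the third bullet above, a reporting system could block submission until key fields are completed. The field names are hypothetical and do not reflect the actual Re-TRAC form.

# Minimal sketch of a required-field check for an annual report submission.
# Field names are hypothetical illustrations, not actual Re-TRAC fields.
REQUIRED_FIELDS = ["reporting_year", "tons_recycled", "tons_disposed"]

def missing_required_fields(report):
    """Return the required fields that are absent or left blank in a submission."""
    return [field for field in REQUIRED_FIELDS if report.get(field) in (None, "")]

submission = {"reporting_year": 2009, "tons_recycled": 85.0, "tons_disposed": ""}
problems = missing_required_fields(submission)
if problems:
    print("Submission incomplete; please fill in: " + ", ".join(problems))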

EPA takes steps to validate waste data reported to WasteWise, but could adopt
additional measures to bolster confidence in self-reported data. WasteWise staff
review waste data before they are published, using ad-hoc logic tests to assess the plausibility
of data reported. For example, if a partner reports a waste quantity that appears to
conflict with previous reporting, or be out of step with the type and scale of operations,
WasteWise staff note the issue and follow up with the partner. EPA will not finalize the
data for aggregation until staff are satisfied with the quality of the data.
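A minimal sketch of one such plausibility check appears below; the year-over-year threshold is an arbitrary illustration, not an actual WasteWise review rule.

# Illustrative sketch of a simple year-over-year plausibility check of the
# kind described above. The 50% threshold is an arbitrary example only.
def flag_for_review(previous_tons, current_tons, threshold=0.5):
    """Flag a report whose quantity changed by more than the threshold fraction."""
    if previous_tons <= 0:
        return current_tons > 0  # no usable baseline; route nonzero reports to manual review
    change = abs(current_tons - previous_tons) / previous_tons
    return change > threshold

# A facility that reported 200 tons disposed last year and 620 tons this year
# would be flagged for staff follow-up:
print(flag_for_review(200.0, 620.0))  # True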

While logic tests are helpful, other programs have developed guidance for reviewing data
submitted. Guides typically cover appropriate responses to each data field on the
reporting form. Guides also call out common quality problems, such as pitfalls in
estimating waste disposal and prevention (discussed above), and in assessing changes in
quantities from year to year. Guides also sometimes include information on the range of
impacts (in this case, waste) typical for various industries and scales of operation.
EPA does not conduct site visits or require third-party certification to verify WasteWise
data. It should be noted that conducting site visits to verify data would be very resource
intensive for EPA, while requiring third-party certification would be similarly resource
intensive for WasteWise partners. As such, most EPA partnership programs do not
utilize site visits or require third-party data certification.
While many EPA partnership programs encourage or require partners to submit
normalized data, OMB has precluded WasteWise from collecting normalized data.
Many programs reviewed encourage or require partners to provide normalized data, to
control for external factors when reviewing progress. External factors include changes in
economic conditions (or other measures of an organization's activity). Absolute data do
not indicate whether reductions or increases in waste are due to economic conditions as
opposed to partner environmental initiatives, which is important context for understanding
individual facility progress.
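The following short sketch illustrates the point; the figures are invented solely to show how a normalized (intensity) measure can move in the opposite direction from absolute tonnage when activity levels change.

# Illustrative sketch of normalizing waste data by a measure of activity.
# Units and figures are hypothetical examples, not partner data.
def waste_intensity(tons_disposed, activity):
    """Tons disposed per unit of activity (e.g., per $1 million of revenue)."""
    if activity <= 0:
        raise ValueError("activity measure must be positive")
    return tons_disposed / activity

# Absolute tonnage fell from 900 to 800 tons, but activity fell faster,
# so waste per unit of activity actually increased:
print(waste_intensity(900.0, 45.0))  # 20.0 in the baseline year
print(waste_intensity(800.0, 32.0))  # 25.0 in the reporting year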
However, when EPA coordinated with OMB on ICR approval for the WasteWise program
in 2007, OMB raised background economic conditions as a factor that WasteWise should
account for when attributing partner improvements to the program. Thus, OMB included
background economic conditions in the list of other factors commonly cited as influencing
firm behavior, such as cost savings, state and local laws, and customer expectations.11
Given OMB's inclusion of economic conditions as an attribution factor, EPA has not
pursued collection of normalized data from WasteWise partners. EPA is awaiting the
results of this evaluation to engage in a comprehensive discussion of WasteWise
attribution issues with OMB. It is not clear why OMB did not raise similar concerns
within the ICRs for the other EPA partnership programs that collect data normalized for
economic conditions.
WasteWise emulates other data quality best practices identified across partnership
programs. WasteWise has taken proactive steps to ensure data quality by adopting best
practices including:
• Ensuring internal consistency by using the WARM model, ORCR's official
model, to generate greenhouse gas reports for members.
• Using automated data aggregation tools in Re-TRAC to eliminate the possibility
of human error in aggregating program results.
• Taking steps to avoid double-counting of waste reductions reported to
WasteWise with waste reductions reported to other programs.
• Including a disclaimer on reported aggregate results that WasteWise does not
take credit for all improvements reported by partners, given the constellation of
factors that influence partners' waste management decisions.

11 See the previous literature review conducted for this evaluation for a discussion of influences on firms that join partnership programs, dated December 14, 2009.

CHAPTER 4 | RECOMMENDATIONS

In this chapter, IEc draws on lessons learned from this program evaluation to provide
recommendations to EPA on future directions for WasteWise. As seen in the previous
chapter, findings from this evaluation are largely positive; they reflect that WasteWise
provides real value to its partners, and drives positive changes in waste management
within partner organizations.
The findings from this evaluation are particularly positive considering the clear resource
strain on the program. From 2003 through 2008, WasteWise lost half of its staff, going
from five FTEs to 2.5 FTEs over the course of five years. The program did not increase
contractor spending to substitute for the loss of FTEs. Furthermore, WasteWise received
significant program funding from OAR through 2006, but since 2006, OAR has not
contributed to WasteWise, resulting in a 60% reduction in WasteWise funding. Since
2006, OSWER has funded the entire program out of its budget. As a result of budget
cutbacks, WasteWise eliminated several partner services over the past five years,
including a campaign to promote state-level WasteWise programs, WasteWise bulletins
and other regular partner communications, exhibits at conferences and trade shows, and
regional recruiting events.
In light of the generally positive assessment of the program, IEc does not recommend
making sweeping changes to the program, especially any that would result in further
reductions of resources. Moreover, given the resource constraints faced by the program,
we focus on recommendations that would: strengthen program design in a low cost
manner, respond to the stated needs of WasteWise partners, and help EPA measure and
demonstrate the benefits of the program moving forward.
Increase communications from EPA to WasteWise partners. WasteWise staff
reduced communications to partners over the past five years as a result of budget
cutbacks. Focus group participants and USPS interviewees noticed this change, and
expressed a desire for more regular communications from the program. EPA should
explore if it is possible to augment communications in a low-cost manner. Some specific
ideas to consider include:
• Develop an electronic welcome packet to distribute to new WasteWise partners,
and to be stored on the partners-only portion of the WasteWise website. The
welcome packet should provide cursory information on all WasteWise services,
and include links to those services and/or applicable contact information.
• Develop standard email communications to distribute to members regarding
annual reporting, awards applications, and other regularly occurring program
happenings. Distribute these email communications on a standard cycle.
• Use social networking platforms such as Twitter and LinkedIn to communicate to
the WasteWise partnership at a low cost, and through media that members may
be actively using already. Several EPA programs as well as the Administrator's
office use Twitter and LinkedIn to communicate to various audiences.
• To address existing confusion among the WasteWise partnership, work with
colleagues in ORCR to develop a one-pager that clarifies the relationship
between WasteWise and ORCR's Resource Conservation Challenge (RCC),
including differences in benefits. Post the one-pager on the WasteWise and RCC
websites, and distribute it to respective email lists.
• Update the WasteWise contacts database, and ensure that it is kept up-to-date by
including a line on each email communication from the program that asks
recipients if WasteWise is reaching the right people, and provides clear
instructions for partners to update contact information.

Promote communications among WasteWise partners by providing an online venue
for networking. In addition to desiring more communication from EPA, WasteWise
partners are eager to network more among themselves, both to share information and
lessons learned on environmental strategies, and to form strategic business relationships.
However, it is unlikely that EPA can sponsor additional in-person networking events for
WasteWise partners with current resources available to the program. Alternatively, EPA
should explore virtual networking models available to the Agency. LinkedIn is a possible
solution for fostering networking among WasteWise partners, as well as facilitating
regular communication from EPA staff to the WasteWise membership. In addition, EPA
could explore other existing commercial networking sites that are designed to organize
and promote information sharing among groups, and facilitate ongoing discussion.
Multiple free or low-cost online applications have features that cater to common
organizational needs such as discussion threads, blogs/wikis, the ability to post
documents and links, and event calendars and notifications. Two such sites known to IEc
are www.huddle.net and www.ning.com.
Also, to allow partners to find one another more easily, EPA should use contact
information from Re-TRAC or the program’s existing contacts database to develop a
web-view of contact information that is browsable by sector, and available on the
partners-only area of the WasteWise website. If categorizing partners by sector is
currently cost-prohibitive due to the need to research sectors, EPA could add a question
to the WasteWise application form and annual reporting form that asks partners to select
a sector from a drop-down list. It is unclear if this change would be covered by the
current WasteWise ICR, or if EPA would need to seek approval from OMB to make this
change.
In the absence of additional program funding, consider recasting the conference as an
awards ceremony. In the past, WasteWise held conferences over a two-day period that
included working sessions, forums, and partner networking. This format was well-liked
and well-attended by partners. However, IEc's review of WasteWise conference
attendance data from 2007 and 2008 suggests that the current conference format may not
be delivering as much value as participants expect. Overall conference attendance
declined from 2007 to 2008, repeat attendance is low, and the conference appears
dominated by EPA attendees. During the focus group, IEc heard from one participant
that the conference feels more like an awards ceremony than a broader networking event.
If EPA lacks the resources to expand the focus of the conference to include broader
sessions and networking events, the Agency should consider rebranding it as an awards
ceremony only.
Keep a focus on offering high-value technical tools to partners. Partners clearly value
WasteWise’s technical tools, including Re-TRAC and GHG calculations from waste
reporting. EPA should bolster the value of current tools by ensuring that underlying data
are up-to-date; and by developing frequently asked questions for the WARM model and
GHG reports tailored to the WasteWise audience. If WasteWise is looking for an
incentives area to invest in, developing additional technical tools would be a good area to
explore. Previously, WasteWise had considered developing a series of technical issue
papers in conjunction with Hall of Fame companies; this may beone strategy for
providing additional technical resources to partners. Also, WasteWise could foster
communications about technical issues and desired tools on an online networking
platform (discussed above). For example, during the focus group, participants discussed
their interest in assistance in integrating various GHG reporting tools. Even if
WasteWise cannot address this need with resources available, through online networking,
it could foster dialogues among partners about technical solutions to integration that
partners are experimenting with or have had success with.
Invest in enhancements to annual reporting to improve the efficiency of the
reporting review process, and collect information on potential benefits of
WasteWise. The best practice review identified several potential enhancements to the
WasteWise annual reporting process that are utilized by other partnership programs,
including:
• Additional training materials for annual reporting, such as model reports, which
would clarify reporting rules and likely increase first-time quality of data
submitted (thereby reducing staff or contractor hours needed to review reports).
• Adding questions to baseline and annual reporting forms to inquire about how
partners estimate reported data on waste disposal and waste prevention in
particular. Having this information will often provide confidence in data
reported; in some cases, it will highlight potential problems for EPA to follow up
on.
• Combining new member registration with baseline reporting, to establish a one-step
process for new members. This change also has the potential to reduce
transaction costs for both members and WasteWise staff.
• As part of ongoing Re-TRAC enhancements, EPA should require that partners
complete non-optional fields in the reporting form, to make sure that sufficient
data are included in annual reports. As with other proposed enhancements,
adding required fields will reduce review time.
• Develop internal guidance for WasteWise staff and contractors for systematically
reviewing WasteWise partner data, to replace current use of ad-hoc logic tests.

In addition, EPA should add questions to the annual reporting form that inquire about
potential WasteWise contributions to partner operational decision-making on waste
prevention and management initiatives, as well as any other business decisions that relate
to waste prevention and management (i.e., supply chain alterations). Again, some of
these recommendations would increase data collected by WasteWise, and may require
OMB approval and ICR modification.
As resources allow, conduct research into spillover effects. Communication of
information on environmental best practices from the WasteWise program and its
partners to non-partners (i.e., spillover effects) is a potentially important area of program
benefits, and one that we have not been able to assess in this evaluation. If funding is
available for additional research, we recommend that EPA examine WasteWise spillover
effects in sectors with high representation in WasteWise. EPA could add questions to the
annual reporting form to gauge whether (and how) WasteWise has improved
relationships with competitors, suppliers, and/or customers. If EPA could obtain ICR
clearance for a survey, it could also investigate potential spillover effects by surveying
partners and non-partners in selected sectors.
Develop high-level communications around the interplay of factors that encourage
firms to make decisions on waste management and other environmental issues. It is
clear from the literature, focus groups, and interviews that significant changes in
environmental practices are driven by a constellation of motivations and organizational
structures. WasteWise clearly caveats its reporting to communicate that the program
does not take credit for all of the results reported by members. But more broadly,
confusion about the role of partnership programs in motivating change is widespread. As
such, we recommend that based on this evaluation and related work, EPA develop
communication pieces for various audiences (internal management, political, academic,
partnership program members and stakeholders, and the general public) on all of the
factors that encourage firms to make voluntary environmental investments, and on how
partnership programs intersect with some of these factors to spur positive changes in
behavior. We also suggest that EPA develop a companion set of communications around
the new white paper that ESD is developing on how partnership programs can
demonstrate their value.
