Chesapeake Bay Program Citizen Stewardship Index, Diversity Profile, and Local Leadership Surveys (New)

OMB: 2003-0002




PART B OF THE SUPPORTING STATEMENT (FOR STATISTICAL SURVEYS)











INTRODUCTION TO PART B

The Environmental Protection Agency (EPA) will conduct the following type of statistical survey to estimate end-user satisfaction with its research products. The survey will evaluate EPA’s research products based on their quality, usability, and timeliness by collecting survey-based feedback from the key users of EPA’s research products, who include the Agency’s state, local, and non-governmental partners.








B.1 SURVEY OBJECTIVES, KEY VARIABLES AND OTHER PRELIMINARIES

B.1.a Survey Objectives

This request is for three related surveys that will provide data to track progress under the Stewardship goal of the 2014 Chesapeake Bay Watershed Agreement (the Agreement).1 The Stewardship goal is stated as:

Increase the number and the diversity of local citizen stewards and local governments that actively support and carry out the conservation and restoration activities that achieve healthy local streams, rivers and a vibrant Chesapeake Bay.2

The three surveys contained in this request provide data to track three outcomes defined under the Stewardship goal:

  • Citizen Stewardship Outcome: Increase the number and diversity of trained and mobilized citizen volunteers with the knowledge and skills needed to enhance the health of their local watersheds.

  • Local Leadership Outcome: Continually increase the knowledge and capacity of local officials on issues related to water resources and in the implementation of economic and policy incentives that will support local conservation actions.

  • Diversity Outcome: Identify stakeholder groups that are not currently represented in the leadership, decision-making and implementation of conservation and restoration activities and create meaningful opportunities and programs to recruit and engage these groups in the Partnership’s efforts.

The three surveys are aligned with the three outcomes stated above.

B.1.b Key Variables

In designing the surveys, EPA and its partners in the Chesapeake Bay Program (the “Program”) have endeavored to include only the necessary information to track the outcomes cited above, as well as information that can assist in interpreting the data (e.g., demographics and other context-related information).

The key data elements for tracking each outcome are described below.

For the Citizen Stewardship Outcome (Stewardship Survey), the key data elements are:

  • Information on activities the respondent performs related to lawn and/or yard care

  • Stewardship behaviors (e.g., use of rain barrels, use of fertilizer) that the respondent performs currently and an assessment of the likelihood the respondent would perform them in the future

  • The respondent’s volunteer-related activities and attitudes toward volunteer activities

  • The respondent’s civic engagement and attitudes toward civic engagement

  • The respondent’s perceptions and attitudes towards environmental and/or water quality

For the Local Leadership Outcome (Local Leaders survey), the key data elements are:

  • A set of questions on the extent to which the respondent’s community takes into account upstream and downstream communities when making decisions, both overall and in specific policies

  • Priorities for conservation and restoration activities and policies

  • The impact that protecting the Chesapeake Bay has on the various aspects (e.g., economic, water quality) of the community

  • Tools that the community considers in protecting Chesapeake Bay

  • Their level of understanding of rules and regulations at various levels of government

  • Access to knowledgeable people to provide assistance in implementing environmental policy

  • Activities the community has undertaken in the last three years related to water quality

  • An assessment of the respondent’s understanding of watersheds

Finally, for the Diversity Outcome (Diversity Profile survey), the key data elements are:

  • Gender identity

  • Age

  • Racial identity

  • Identification as a member of the LGBTQIA+ community

  • Disability status

  • A list of groups they engage with as part of the Program

B.1.c Statistical Approach

Both the Local Leaders Survey and the Diversity Profile Survey will be sent to all individuals who appear on lists that the Program has compiled for purposes of the surveys. Thus, the Local Leaders Survey and the Diversity Profile Survey are not employing statistical sampling methods.

The Stewardship Survey uses a sample size identical to the one used in the 2017 version of the survey, which was implemented by a Program partner. The statistical approach for the Stewardship Survey is stratified random sampling, with the seven states in the region serving as the strata (see Table 11). Potential respondents will be selected randomly within each state to meet the sample size requirements. Section B.2.c.i discusses the precision associated with these sample sizes in each state.

B.1.d Feasibility

EPA has assessed the feasibility of completing the surveys with respect to the potential obstacles that respondents may face, the available funding, and the timeliness of the data.

  • Obstacles. EPA does not expect the respondents to face any significant obstacles in responding to the three surveys. As noted above, the Stewardship and Diversity Profile surveys were both implemented by EPA partners previously and did not encounter obstacles to response in those instances. The Local Leaders survey, new to this effort, asks respondents a set of questions that should be answerable based on their existing knowledge. Additionally, the modes of implementation – web for the Local Leaders and Diversity Profile surveys and a combination of phone, web, and mail for the Stewardship Survey – were chosen to minimize any potential obstacles to response.

  • Funds. EPA has ensured that sufficient funds are available to complete the survey.

  • Timeliness. The survey results will be available in time to inform EPA’s tracking of outcomes.

B.2 SURVEY DESIGN

This section contains a detailed description of the statistical survey design and sampling approach, including a description of the sampling frame, sample identification, precision requirements, and data collection instruments. Given that only the Stewardship Survey is using a statistical design, much of the focus of this section is on the Stewardship Survey. Where necessary, we address aspects related to the other two surveys as well.

B.2.a Target Population and Coverage

Table 10 below provides a summary of the targeted and covered (sampling) population for each survey.

Table 10. Target and Sampled Populations for Each Survey

| Survey | Target Population | Covered (Sampling) Population |
| --- | --- | --- |
| Stewardship | Adults who live in the Chesapeake Bay region. | Adults living in the Chesapeake Bay region who are reachable by one of the three survey modes: phone, web, or mail. |
| Local Leaders | Elected or appointed officials (or their senior staff) who are involved in policymaking in the Chesapeake Bay region. | Elected or appointed officials (or their senior staff) whose email addresses have been compiled by EPA and its partners in the Program. |
| Diversity Profile | Individuals who are working in roles to preserve the Chesapeake Bay. | Individuals who are working in roles to preserve the Chesapeake Bay whose email addresses have been compiled by EPA and its partners in the Program. |



B.2.b Sample Design

This section describes the sample design. It includes a description of the sampling frame, target sample size, stratification variables and sampling method.

B.2.b.i Sampling Frame

The sampling frames for the Local Leaders survey and the Diversity Profile survey consist of lists of in-scope email addresses that were compiled by EPA and its partners in the Program. For the Diversity Profile survey, the Program developed an initial list when implementing the survey in 2016 and has refined that list since then.

The Stewardship Survey will use a list of phone numbers (both landline and cell), a mailing address list, and a list of email addresses. EPA will rely on the implementing contractor to acquire these lists. As noted, this survey was previously implemented as a phone-only survey; EPA will ensure that a comparable sampling frame is used.

B.2.b.ii Sample Size

The Local Leaders and Diversity Profile surveys are not using statistical sampling; each survey is being sent to all individuals on the lists compiled by EPA and its partners. Thus, the final sample size will be determined by the number of individuals who respond to each survey.

The Stewardship Survey will select stratified random samples from each of the seven states within the Chesapeake Bay region; the sample allocated to each state is shown in Table 11.

B.2.b.iii Stratification Variables

For the Local Leaders and Diversity Profile surveys, no stratification is being used; all individuals who appear on the sample frame will receive the survey.

The Stewardship Survey will stratify the sample by state to ensure that state-level results can be analyzed with known precision and confidence. Table 11 provides a state-level breakdown of the 5,200 people who will be sampled under the survey. The data in Table 11 reflect the breakdown used in the 2017 version of the survey to ensure comparability.

Table 11. Sample by State for Stewardship Survey

| State | Sample Size | Sample Percentage | Estimated Population Living in Chesapeake Bay Region | Population Percentage |
| --- | --- | --- | --- | --- |
| Delaware | 400 | 7.7% | 178,000 | 1.0% |
| District of Columbia | 800 | 15.4% | 658,000 | 3.6% |
| Maryland | 1,000 | 19.2% | 5,895,000 | 32.4% |
| New York | 400 | 7.7% | 599,000 | 3.3% |
| Pennsylvania | 1,000 | 19.2% | 3,634,000 | 19.9% |
| Virginia | 1,000 | 19.2% | 6,990,000 | 38.4% |
| West Virginia | 600 | 11.5% | 268,000 | 1.5% |
| Total | 5,200 | 100% | 18,222,000 | 100% |



B.2.b.iv Sampling Method

For the Local Leaders and Diversity Profile surveys, no sampling method is being employed; all individuals who appear on the sample frame will receive the survey.

For the Stewardship Survey, EPA’s contractor will select random samples within each stratum and will reach out to each potential respondent via mail, email, and phone.

B.2.c Precision Requirements

B.2.c.i Precision Targets

The Local Leaders and Diversity Profile surveys are not using statistical sampling, and each survey is being sent to all individuals on the lists compiled by EPA and its partners. Thus, neither survey has a defined precision in a statistical sense.

Table 12 contains the precision values for the Stewardship Survey. EPA has set a minimum precision of +/- five percentage points for yes/no questions on the survey. EPA has not set a precision target for each state but plans to use the same sample sizes as previously used to ensure comparability. Thus, EPA calculated the precision that can be obtained based on the sample size used in each state in the 2017 survey. The precision values reflect the plus/minus values around a worst-case (maximum variance) assumption of a 50 percent yes response to a yes/no question. For example, in Delaware, if the value for a yes/no question were 50 percent of respondents, a 95 percent confidence interval around that estimate would range from 45.1 percent to 54.9 percent. As measured values move away from the worst-case value of 50 percent, the confidence intervals become more precise as the variance declines. EPA expects these precision values, especially the value of 1.4 percentage points for the entire region, to be more than sufficient to meet its needs.
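The values in Table 12 can be reproduced with the standard large-sample margin-of-error formula, z·sqrt(p(1−p)/n), evaluated at p = 0.5 and z = 1.96. The short Python sketch below is an illustrative calculation only (not EPA’s or its contractor’s code); it omits a finite population correction, which would have a negligible effect at these population sizes.

```python
import math

Z_95 = 1.96      # two-sided z-score for a 95 percent confidence level
P_WORST = 0.5    # worst-case (maximum variance) proportion for a yes/no item

def margin_of_error(n, p=P_WORST, z=Z_95):
    """Return the +/- margin of error, in percentage points, for a sample of size n."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

# State sample sizes from Table 11
sample_sizes = {
    "Delaware": 400, "District of Columbia": 800, "Maryland": 1000,
    "New York": 400, "Pennsylvania": 1000, "Virginia": 1000,
    "West Virginia": 600, "Total": 5200,
}

for state, n in sample_sizes.items():
    print(f"{state}: +/- {margin_of_error(n):.1f} percentage points")
# Prints 4.9 for Delaware, 3.5 for the District of Columbia, 3.1 for the
# 1,000-person states, 4.0 for West Virginia, and 1.4 for the full
# 5,200-person sample, matching Table 12.
```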

Table 12. Statistical Precision Estimates for Stewardship Survey

| State | Sample Size | Estimated Population Living in Chesapeake Bay Region | Precision (+/- percentage points, 95 percent confidence interval) |
| --- | --- | --- | --- |
| Delaware | 400 | 178,000 | 4.9 |
| District of Columbia | 800 | 658,000 | 3.5 |
| Maryland | 1,000 | 5,895,000 | 3.1 |
| New York | 400 | 599,000 | 4.9 |
| Pennsylvania | 1,000 | 3,634,000 | 3.1 |
| Virginia | 1,000 | 6,990,000 | 3.1 |
| West Virginia | 600 | 268,000 | 4.0 |
| Total | 5,200 | 18,222,000 | 1.4 |



B.2.c.ii Non-sampling Error

EPA expects that the primary form of non-sampling error affecting these three surveys will be nonresponse. To minimize nonresponse, EPA will employ good survey practices, including:

  • For the two email-based surveys, EPA and its partners will send a pre-notification email, followed by a distribution email with the link to the survey, and then a series of 3-4 reminder emails.

  • For the Stewardship Survey, EPA has decided to use a multi-modal approach to maximize the likelihood of response. Respondents will be allowed to choose a response mode that is best for them.

The occurrence of nonresponse in each survey could potentially bias the results toward specific types of respondents. For the Local Leaders and Stewardship surveys, EPA expects that respondents who are more interested in conserving the Chesapeake Bay will be more likely to respond. For the Diversity Profile survey, EPA expects that individuals who are more concerned about diversity-related issues will be more likely to respond.

EPA is not basing decisions on the data collected through these surveys but is using the data to track progress under the three outcomes defined under the Stewardship goal of the 2014 Chesapeake Bay Watershed Agreement. Thus, EPA plans to collect these data over time. As long as the effect of nonresponse is constant over time, the data collected in each survey will allow EPA to track its progress toward meeting its objectives under the outcomes and the goal.

B.2.d Data Collection Instrument Design

As noted under Section B.1.b, most of the survey questions on each instrument reflect data needed to track progress under the three outcomes of the Stewardship goal under the 2014 Chesapeake Bay Watershed Agreement. In this section, we provide further justification for each set of questions.

Local Leaders survey

  • The respondent’s role in the community (title, role in decision-making, and functional responsibilities) – Understanding the roles of the respondents will help EPA in interpreting the responses to key questions on the survey regarding the knowledge and capacity of local leaders.

  • The size and location of the community – The size and location of the community will assist EPA in understanding the responses to the key data collected under the survey regarding the knowledge and capacity of local leaders.

  • Several questions on the extent to which their community takes into account upstream and downstream communities when making decisions, both overall and in specific policies – Understanding the extent to which communities take into account upstream and downstream communities is a key aspect of understanding how local leaders can contribute to the stewardship goal.

  • Priorities for conservation and restoration activities and policies – Understanding the priorities of the communities will provide information on the extent to which local leaders are contributing to Stewardship.

  • The impact that protecting the Chesapeake Bay has on the various aspects (e.g., economic, water quality) of the community – Understanding how local leaders perceive the impact of protecting the Bay on their community contributes to EPA understanding the knowledge and capacity of local officials on issues related to water resources.

  • Tools that the community considers in protecting the Chesapeake Bay – Knowing the tools that the community uses to protect the Bay contributes to EPA’s understanding of local leader capacity.

  • Their level of understanding of rules and regulations at various levels of government – These questions measure the knowledge of local leaders in relation to protecting the Bay.

  • Which federal rules the community is subject to – This question measures the knowledge of local leaders in relation to protection of the Bay.

  • Access to knowledgeable people to provide assistance in implementing environmental policy – This question measures the capacity of local leaders to access people who are knowledgeable about protecting the Bay.

  • Resources (e.g., trainings, individuals, guidance documents) the respondent would use to learn more about environmental policy or water resources – This question provides information on the extent to which local leaders know where to go to increase their capacity and knowledge.

  • Activities the community has undertaken in the last three years related to water quality – This question asks about the actions taken that can improve the Bay.

  • An assessment of the respondent’s understanding of watersheds – This set of questions measures the local leaders’ knowledge about watersheds.

  • A set of demographics (age, gender, years in current position) – The demographics will assist EPA in interpreting the results from the other questions.

The Diversity Profile survey asks about the following data items:

  • The respondent’s role in protecting the Chesapeake Bay, including the type of organization they work for, their role at that organization, whether they are in a leadership role in the Chesapeake Bay Program, and their tenure in the Chesapeake Bay Program – Understanding these context-based questions will assist EPA in interpreting the responses to the key questions related to diversity in the survey.

  • A set of questions that ask about gender identity, age, racial identity, identification as a member of the LGBTQIA+ community, and disability status – These are the key questions for understanding the diversity among the respondents.

  • Where they live in terms of geographic location and characteristics (e.g., urban) – This question on location will assist EPA in understanding the geographic scope of diversity.

  • A list of groups they engage with as part of the Program – Understanding the network of groups in relation to diversity will assist EPA in understanding the reach of diversity in the area.

  • Whether they completed the survey in prior years – This question will assist EPA in understanding how the survey results may differ from year to year.

Finally, the Stewardship Survey asks several questions that can be grouped into the following broad categories:

  • Information on the respondent’s location within the Chesapeake Bay watershed – Understanding where the respondent is within the watershed assists with both sample allocation and with interpreting the results.

  • Information on activities the respondent performs related to lawn and/or yard care – These are key data for tracking stewardship.

  • Stewardship behaviors (e.g., use of rain barrels, use of fertilizer) that the respondent performs currently and an assessment of the likelihood the respondent would perform them in the future – These are key data for tracking stewardship.

  • The respondent’s volunteer-related activities and attitudes toward volunteer activities – These are key data for tracking stewardship.

  • The respondent’s civic engagement and attitudes toward civic engagement – These are key data for tracking stewardship.

  • The respondent’s perceptions and attitudes towards environmental and/or water quality – These are key data for tracking stewardship.

  • A set of demographics – Understanding the respondents’ demographics will assist EPA in better understanding the key stewardship data.



B.3 PRE-TESTS AND PILOT TEST

B.3.a Pre-tests

The Stewardship and Diversity Profile surveys were implemented by EPA partners prior to the collection being requested under this ICR. Thus, EPA does not expect to need to conduct pre-tests for the Stewardship and Diversity Profile surveys. EPA also does not expect to perform a pre-test for the Local Leaders survey.


B.3.b Pilot Test

EPA does not expect to conduct pilot testing for these surveys.

B.4 COLLECTION METHODS AND FOLLOW-UP

B.4.a Collection Method

The Diversity Profile and Local Leaders surveys will be collected using a web-based data collection approach. Emails will be sent to potential respondents with a link to take the survey. The Stewardship Survey will be collected as a multi-mode survey that uses a combination of phone, web, and mail to collect responses.

B.4.b Survey Response and Follow-up

Expected Response Rates

EPA expects the following response rates for the three surveys:

  • 10 percent for the Stewardship Survey. This is based on the prior implementation of the survey in the region by EPA’s partners. EPA also notes, however, that the prior version of the survey was collected using only a phone survey; the use of three modes (phone, web, and mail) is meant to increase the response rate for this implementation.

  • 5 percent for the Local Leaders Survey. Given the large number of potential respondents (15,000) and the broad implementation of this survey across the region, EPA expects that 5 percent is a reasonable response rate. (A short sketch of the completed-survey counts implied by these expected rates follows this list.)

  • 53 percent for the Diversity Profile Survey. This is based on prior implementation of the survey in the region by EPA’s partners.
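As referenced above, the arithmetic implied by these expected response rates is sketched below for the two surveys whose invitation counts are stated in this section; the Diversity Profile list size is not given here, so it is left as a placeholder. This is illustrative only, not a burden or results estimate.

```python
# Completed surveys implied by the stated expected response rates.
# The Diversity Profile list size is not stated in this section, so None is a placeholder.
expected = {
    "Stewardship": (5_200, 0.10),     # number invited, expected response rate
    "Local Leaders": (15_000, 0.05),
    "Diversity Profile": (None, 0.53),
}

for survey, (invited, rate) in expected.items():
    if invited is None:
        print(f"{survey}: expected rate {rate:.0%}; list size not stated in this section")
    else:
        print(f"{survey}: roughly {round(invited * rate)} completed surveys expected")
# Stewardship: roughly 520 completed surveys expected
# Local Leaders: roughly 750 completed surveys expected
```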

Implementation and Follow-up

The two surveys being collected using email distribution will involve the following sequence of emails to maximize response:

  1. A prenotification email to alert the potential respondents to the upcoming survey. This will be sent from a recognizable organization or individual to ensure the respondents see the survey as a valid data collection effort.

  2. An initial distribution email that provides the survey link to the respondents. This will be sent 1-2 days after the prenotification email.

  3. A set of 3-4 reminders that are sent at 3-4 day intervals to those who have not responded yet. Each reminder will include the link and will alter the way in which information about the survey is presented (e.g., new subject lines) in an attempt to create new stimuli for the respondents.

For the Stewardship Survey, the contractor implementing the survey will obtain a list of valid phone numbers (landline and cell), email addresses, and physical addresses from a reputable vendor. The contractor will remove duplicates across the different lists. The respondents will be sorted by state for each mode and then randomized within each state. The contractor will then select a sample for each state that totals 5,200 respondents across states and across the three modes.
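The listing below is a hypothetical Python sketch of these selection steps, not the contractor’s actual procedure or schema: it de-duplicates records that appear on more than one vendor list, groups them by state, randomizes within each state, and draws the Table 11 allocation. For simplicity it treats the de-duplicated records as a single frame rather than tracking the three modes separately, and the field names (record_id, state) are illustrative.

```python
import random

# Table 11 allocation by state (total 5,200)
TARGETS = {"DE": 400, "DC": 800, "MD": 1000, "NY": 400,
           "PA": 1000, "VA": 1000, "WV": 600}

def select_sample(records, seed=2021):
    """records: iterable of dicts with 'record_id' and 'state' keys (illustrative schema)."""
    rng = random.Random(seed)
    # Remove duplicates that appear on more than one vendor list.
    unique = {r["record_id"]: r for r in records}.values()
    # Group the de-duplicated records by state (stratum).
    by_state = {}
    for r in unique:
        by_state.setdefault(r["state"], []).append(r)
    sample = []
    for state, target in TARGETS.items():
        frame = by_state.get(state, [])
        rng.shuffle(frame)             # randomize within the stratum
        sample.extend(frame[:target])  # take the first `target` records (fewer if the frame is short)
    return sample
```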

Calculating Actual Response Rates

For each survey, EPA will calculate the response rate by dividing the number of completed surveys by the number of attempted surveys minus out-of-scope respondents. For the Local Leaders and Diversity Profile Surveys (email), out-of-scope respondents will be defined as (1) any respondent who indicates in an emailed reply that they are not in scope (e.g., no longer in the position) and (2) any undeliverable email address. For the Stewardship Survey (phone), EPA will define out-of-scope respondents as any respondent who provides information indicating that they are not in scope (e.g., lives outside the area).
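As a simple illustration of this calculation (the counts below are placeholders, not survey results):

```python
def response_rate(completed, attempted, out_of_scope):
    """Completed surveys divided by attempted surveys minus out-of-scope cases."""
    return completed / (attempted - out_of_scope)

# Placeholder example: 520 completes from 5,200 attempts, of which 150 prove
# out of scope (e.g., undeliverable addresses or respondents outside the area).
print(f"Response rate: {response_rate(520, 5200, 150):.1%}")  # Response rate: 10.3%
```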

B.5 ANALYZING AND REPORTING SURVEY RESULTS

B.5.a Data Preparation

Each survey effort under this request will result in an electronic data file. The data from the two web-based instruments and the data from the computer-assisted telephone interviewing (CATI) system for the third survey will be transferred to an MS Excel file for analysis.

EPA does not expect to use any method to impute missing data items. Those items will be recorded as item nonresponses.

B.5.b Analysis

The data from each survey question in each survey will be tabulated by response option, including a full accounting of missing responses (item nonresponse) for each question. EPA will cross-tabulate responses as needed. For the Stewardship Survey, EPA expects to tabulate the responses for each state separately as well.
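A minimal sketch of this tabulation approach is shown below, assuming the cleaned responses have been loaded from the MS Excel file described in Section B.5.a into a pandas DataFrame; the file name and column names (state, uses_rain_barrel) are illustrative, not the actual instrument variables.

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
df = pd.read_excel("stewardship_survey_responses.xlsx")

# Tabulate one question by response option, showing missing responses explicitly.
question = df["uses_rain_barrel"].fillna("No response")
print(question.value_counts())

# Cross-tabulate the same question by state, as planned for the Stewardship Survey.
print(pd.crosstab(df["state"], question, margins=True))
```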

B.5.c Reporting Results

Results from the surveys will be reported by EPA on the Chesapeake Bay Program website and included in the annual “Bay Barometer” report.




















Appendix A – Data Collection Instruments




2 See footnote 1.

