
Generic Clearance for Citizen Science and Crowdsourcing Projects (Renewal)

OMB: 2080-0083


ICR number 2521.14



1. Circumstances Making the Collection of Information Necessary


Citizen science and crowdsourcing: Innovative research methods that engage the public


Citizen science and crowdsourcing are tools that engage, educate, and empower the public to apply their curiosity and contribute their talents to a wide range of scientific and societal issues. Citizen science is a form of open collaboration in which members of the public participate actively in the scientific process through methods that include asking research questions, collecting and analyzing data, interpreting results, and engaging in problem solving. Crowdsourcing is a process in which individuals or organizations submit an open call for contributions of information from a large group of individuals (“the crowd”).


EPA’s mission to protect human health and the environment


The mission of EPA is to protect human health and the environment. Citizen science and crowdsourcing can support EPA’s mission and purpose, including ensuring that national efforts to reduce environmental and public health risks are based on the best available scientific information and that all parts of society – communities, individuals, businesses, and state, local and tribal governments – have access to accurate information sufficient to effectively participate in managing human health and environmental risks. To meet these goals, EPA fosters the sound use of science and technology, conducts leading-edge research, and funds community-driven projects. Likewise, the Agency conducts educational activities to increase the public’s knowledge and understanding of environmental issues so that people can make appropriate decisions. See Section C (Appendix) for EPA’s statutory authority and examples of policy support for incorporating citizen science and crowdsourcing methods into Agency scientific endeavors.

Benefits of research using citizen science and crowdsourcing approaches in EPA research


Citizen science and crowdsourcing can create engaging opportunities for the public to experience their environment, contribute environmental data at a more local level, and help analyze large environmental datasets. These methods give people the ability to easily share data they encounter in their communities and environments. Whether it is an individual photographing an endangered species encountered on a walk, someone at home adding descriptions to online aerial photographs taken at a disaster site, or owners of personal monitors documenting their daily activities, crowdsourcing and citizen science provide people a fun and accessible way to contribute to science or foster a greater appreciation of their natural environment and community. In addition, citizen science and crowdsourcing projects promote greater openness in the scientific process by actively encouraging participation in various aspects of research. Researchers using citizen science and crowdsourcing are committed to the dissemination of data and results back to the public.


Many federal and non-federal organizations are already using innovative citizen science and crowdsourcing tools to advance their missions. These tools are especially valuable where data are distributed across space and time or when projects rely on large datasets. Successful citizen science and crowdsourcing projects usually result from iterating on the design based on feedback from participants. There can also be uncertainty about whether the time and effort to create a project will capture the public’s interest and yield meaningful participation. Speed and flexibility in developing, testing, and implementing projects allow, for example, internet-based activities to evolve with technology and with variable participation over time. An expedited approval process could facilitate incorporation of citizen science and crowdsourcing methods into EPA’s research and scientific initiatives, yielding large, diverse datasets that can provide a more thorough understanding of environmental issues.


The growth and success of citizen science and crowdsourcing is tied closely to advances in technology. Enhanced tools and methods are constantly making citizen science and crowdsourcing more feasible and effective. EPA researchers want to respond to and interact with industry as technology develops. For example, the ability to quickly incorporate new technologies could allow EPA researchers to contribute to low-cost sensor testing and use. In addition, an expedited approval process is consistent with OMB Memo M-10-06 [1], which promotes the use of new technologies and greater openness in government.


Federal support for citizen science and crowdsourcing


In the 2017 American Innovation and Competitiveness Act [2], Congress authorized agencies to harness the ingenuity of the public by using open innovation methods such as citizen science and crowdsourcing. Citizen science and crowdsourcing are in line with the Paperwork Reduction Act’s intent to “ensure the greatest possible public benefit from and maximize the utility of information created, collected, maintained, used, shared, and disseminated by or for the Federal Government.” [3]


Design principles for citizen science and crowdsourcing projects


Citizen science and crowdsourcing projects under this generic ICR will include the following design principles:


  1. Participants have a meaningful role in the research project and can act as contributors or collaborators.

  2. Projects have a genuine scientific question or goal.

  3. Projects are low-burden for participants.

  4. Projects include active management of data and data quality, including a data quality assurance plan and ongoing evaluation of data quality and data management.

  5. Projects are opt-in, and participants have full control over the extent to which they participate.

  6. The data gathered and/or analyzed are shared with participants and generally made publicly available, unless there are security or privacy concerns that prevent this.

  7. Participants receive feedback on how their contribution adds to the project, e.g., how their data will be used and what the research findings are.

  8. Project leads will evaluate scientific output, data quality, and the impact on participants.

  9. Projects are designed to contribute to research and science, not to inform Agency regulations or policies.



EPA citizen science research will have clear linkages to community problems and concerns; this participatory, “translational science” approach differentiates citizen science research from EPA’s traditional research activities. The objective of citizen science projects under this ICR is to use a rigorous research approach – and then apply findings to real-world problem solving with benefits to the participants.

2. Purpose and Use of the Information Collection


EPA relies on scientific information. Citizen science and crowdsourcing techniques will allow the Agency to collect qualitative and quantitative data that might help inform scientific research, assessments, or environmental screening; validate environmental models or tools; or enhance the quantity and quality of data collected across the country’s diverse communities and ecosystems to support the Agency’s mission. Information gathered under this generic clearance will be used by the Agency to support the activities listed above and might provide unprecedented avenues for conducting breakthrough research.


Collections will be from participants who actively seek to participate on their own initiative through an open and transparent process (the Agency does not select participants or require participation); the collections will be low-burden for participants; collections will be low-cost for both the participants and the Federal Government; and data will be available to support the scientific research (including assessments, environmental screening, tools, models, etc.) of the Agency, states, tribal or local entities where data collection occurs. EPA may, by virtue of collaborating with non-federal entities, sponsor the collection of this type of information in connection with citizen science projects.


Collection of health symptoms, illnesses, and human biological samples (e.g., blood, urine, hair) is outside the scope of this generic clearance.



All such collections will accord with Agency policies and regulations related to human subjects research, as described in EPA Order 1000.17a (Policy and Procedures on Protection of Human Subjects in EPA Conducted or Supported Research). Each project under this generic ICR will be classified as either not human subjects research (NHSR) or human subjects research (HSR). All HSR must be reviewed and approved by the EPA Human Subjects Research Review Official (HSRRO). To receive approval, researchers must submit the Institutional Review Board (IRB)-approved research package, including evidence of IRB approval of no more than minimal/moderate risk, along with evidence of a Federal-wide Assurance (FWA) on file with the U.S. Department of Health and Human Services (HHS). Agency Human Subjects Officers (HSOs) will review and approve all NHSR projects in consultation with the HSRRO. Finally, personally identifiable information (PII) will only be collected when necessary and in accordance with applicable federal procedures and policies. If a new collection is not within the parameters of this generic ICR, the Agency will submit a separate information collection request to OMB for approval.


The methods used for collecting information can vary greatly across projects, including the use of apps, questionnaires, or monitoring devices. All citizen science and crowdsourcing activities under this ICR will be consistent with the citizen science definition in Section 3 of the Crowdsourcing and Citizen Science Act of 2017. All projects conducted by EPA will be reviewed and approved by the EPA citizen science coordinator to ensure compliance with the parameters in this ICR as well as compliance with all other legal and administrative requirements. In addition, all projects submitted under this ICR will manage PII according to federal regulations. As with any scientific endeavor at EPA, citizen science and crowdsourcing projects will have approved data quality and data management plans as part of their project design before implementation. EPA provides employees resources for developing data quality and data management plans [4].




The popularity and application of citizen science and crowdsourcing methods continues to grow with new and low-cost portable technologies. Therefore, the modes of data collection under this generic clearance may include:

  • paper or digital questionnaires, data forms, surveys, focus groups, or interviews;

  • new and existing online collaboration tools;

  • fields in cell or smartphone applications (apps);

  • online web-based forms or interactive computer interfaces that elicit information;

  • social media platforms;

  • text or SMS messages;

  • readings from sensors (personal, mobile, stationary, or portable) or other mobile, portable, or stationary instruments – readings either sent back to the Agency in real time, through an online data collection site, or through another acceptable mode listed here;

  • analog or digital audio or video recordings;

  • digital or analog photographs; and

  • information collected automatically through an app, computer, the metadata accompanying a digital photograph, or a mobile sensor.

Information may include actively submitted information (such as descriptions, measurements, photographs, etc.) as well as passively submitted information (such as the metadata accompanying actively submitted information, e.g., date, time, and location stamps automatically included with apps and digital photographs).

Citizen science and crowdsourcing collections submitted under this generic clearance can be stand-alone projects or the methods may be incorporated into an existing or new project, including, but not limited to, projects in the following typology [5]:

  • Data gathering projects. These projects may include 1) observation, characterization, and documentation of natural phenomena or general environmental observations, opinions, or preferences, or 2) surveying participants or screening environmental conditions, including using specialized equipment provided by project leaders to record and submit data, or submitting samples plus descriptors (e.g., of air or water) for testing. Data may be collected using the technologies mentioned above, through structured data forms, surveys or questionnaires, focus groups or interviews, submitted photographs or other media, or written observations.

  • Classification/problem solving projects. Participants’ tasks may include: 1) observation of recorded materials provided by project organizers (images, video, etc.) through structured data submission forms, surveys or questionnaires in an online or computer program, clicking boxes, highlighting parts of text or images, and providing comments and/or annotations; 2) classification of images or sounds using structured data submission forms or clicking boxes in an online or computer program; 3) transcribing information, such as typing handwritten logs or notes; 4) performing a function meant to generate human behavior data; or 5) problem solving or manipulation of data. Tasks 1-5 may be conducted via structured actions or instructions or through a “human-based computational game” or “game with a purpose,” a technique in which a computational process performs its function by presenting certain steps to humans in an entertaining way.

Data gathering and classification/problem solving projects may include participants providing information, opinions, or observations about a research subject’s environmental surroundings. To the extent applicable, these projects will accord with all Agency policies and regulations related to human subjects research and will follow the established approval paths through the Human Subjects Research Review Official [6].

Citizen science and crowdsourcing collections under this generic clearance may include the following types of questions or requests of participants:



  • Profile/Preference information. Projects may request a username and/or password as well as user preference information to facilitate or customize the user experience. Participants may be asked to submit an email address, name, and zip code, as well as acknowledge a privacy policy or terms agreement. Participants may also be presented with an opportunity to be placed on a mailing list for the project. This includes projects administered through a web form or mobile application.

  • Personal and Contact Information. Citizen science and crowdsourcing projects may solicit contact information. This information may be necessary to organize and analyze data (i.e., it may be necessary to know which data points are from the same observer). Projects may request contact information (name and email address, zip code, address, and phone number) to provide participants with project updates and share data. Participants would be made aware that publicly available data will be anonymized and aggregated, for example, by census tract, zip code, city, or some other level higher than individual addresses.

  • Experience and Expertise. For data quality purposes, citizen science and crowdsourcing projects may request information to evaluate the skill level of the participant by asking about their experience with the project topic. Questions may be about a person’s age range, level or topic of education, participation in organizations, or professional experience.

  • Information about Observations. Projects may request accompanying information, such as the date and time of the activity, the location (e.g., GPS coordinates, address, zip code, etc.), the weather (e.g., temperature, precipitation, wind, humidity, visibility, etc.), and a description or characterization of the location (e.g., vegetation type, type of water body, environmental condition, etc.) or personal senses (e.g. smell, visual cues, sound, etc.).

  • Project Evaluation. Citizen science and crowdsourcing projects may collect information on the participant’s experience for project evaluation and development. This may include questions on how the participant found out about the project, the amount of time spent, distance traveled, how difficult the task was for the participant, whether the participant enjoyed the experience, and if they will participate again. Projects may also request information to evaluate participant outcomes, such as changes in the participant’s understanding of the scientific process or project topic, through survey questions before and after participation.

  • Training. Citizen science and crowdsourcing projects may need to train participants for the purpose of soliciting quality data and increasing participant benefits, including education and engagement. Participants may be asked to read materials, watch training videos, or attend training sessions in person or virtually via a webinar. To ensure that participants understand the training, they may be assessed through testing instruments like a questionnaire or survey, which may be administered online or through a computer program, on paper, in a cell phone app, or in person.

3. Consideration Given to Information Technology


In order to encourage participation and reduce burden on participants, citizen science and crowdsourcing efforts often utilize information technology that is available to a number of potential participants (cell phones, personal computers, tablets, etc.). The projects submitted under this generic clearance may collect information electronically through new and existing online collaboration tools, cell phone applications (apps) or SMS, web-based forms, online computer programs or forms, social media platforms, or sensors (personal, mobile, stationary or portable).

4. Duplication of Information

No similar data are gathered or maintained by the Agency or are available from other sources known to the Agency. Citizen science projects will collect new information that is not already available (e.g., local water and air quality data). In these citizen science projects, volunteers contribute valuable data that can fill data gaps.

5. Reducing the Burden on Small Entities

Participants in the crowdsourcing and citizen science projects under this generic clearance will be individuals, not small businesses or other small entities.

6. Consequences of not Conducting Collection

If unable to collect information through citizen science or crowdsourcing methods under a generic ICR, the Agency would be unable to adapt and utilize these innovative tools in a timely manner to engage the public in Agency science. With these methods, EPA benefits from the public’s knowledge, expertise, and willingness to contribute to scientific endeavors that rely on large and geographically comprehensive datasets. The public and other organizations are beginning to capture and organize data with smartphones and portable sensors; the Agency’s involvement will allow publicly generated data to effectively support EPA research, including initiating data collection, developing innovative methods for data processing, and managing data quality. EPA research innovation benefits significantly from EPA researchers having access to the newest technologies, affording the opportunity to contribute meaningfully to low-cost sensor testing and use. Moreover, members of the public enjoy participating in citizen science and crowdsourcing projects, which are fun, educational, and engaging, and which allow for more open communication between EPA and the public; citizen science projects in other agencies have gathered millions of data points contributed by hundreds of thousands of interested individuals. These projects are always voluntary and low-burden, and they rely on the interest and self-motivation of the participants. Finally, projects under this generic clearance will allow Agency researchers to test ideas more quickly, respond to a project’s needs as they evolve, and incorporate feedback from participants, supporting flexible, innovative research methods that involve the public in a variety of aspects of scientific research.

7. Special Circumstances

There are no special circumstances.

8. Consultations with Persons outside the Agency

In accordance with 5 CFR 1320.8(d), a 60-day and 30-day notice for public comment will be published in the Federal Register (FR notice number). The Agency consulted with representatives at federal agencies with citizen science and crowdsourcing efforts: USGS, HHS and USDA/USFS. The federal representatives are associated with the Federal Community of Practice on Crowdsourcing and Citizen Science (FCPCCS). Representatives reviewed a draft of this generic clearance. This document reflects the feedback and comments from this community.

A 60-day notice for public comment was published on December 27, 2018, and the comment period closed on February 25, 2019. One public comment was received, expressing concern about the quality of citizen science data collected, especially if used for regulatory decision making. The commenter also expressed concern about whether data collected under this ICR would be made publicly available per the proposed rule Strengthening Transparency in Regulatory Science, and noted that supporting documentation for the ICR was unavailable. All these concerns were addressed in the original supporting documents that outline the requirements of projects approved under this ICR, available in the public docket (EPA-HQ-ORD-2015-0659). An updated supporting statement for this renewal is also now available in the docket as part of the additional 30-day comment period. As stated in the design principles of the Generic Clearance for Citizen Science and Crowdsourcing Projects: “[9] Projects are designed to contribute to research and science, not to inform Agency regulations or policies.” Any data collected by EPA are required to be collected in a manner consistent with finalized agency policies, regulations, and data quality guidelines. Only those projects fitting the guidelines of this generic information collection request can be approved under this ICR. Those that do not meet the requirements must undergo separate approval for data collection.

9. Payment or Gift

The Agency will not provide payment or other forms of remuneration to participants.

10. Confidentiality

If a confidentiality pledge is deemed useful and feasible, the Agency will only include a pledge of confidentiality that is supported by authority established in statute or regulation, that is supported by disclosure and data security policies that are consistent with the pledge, and that does not unnecessarily impede sharing of data with other agencies for compatible confidential use. If the agency includes a pledge of confidentiality, it will include a citation for the statute or regulation supporting the pledge.



11. Sensitive Nature

No questions will be asked that are of a personal or sensitive nature as defined by OMB.

12. Burden of Information Collection

A variety of platforms and media will be used to collect information from respondents. We expect that there will be a range of burden hours depending on the details of the citizen science and crowdsourcing method employed. The total range of annual burden hours requested is 351,150 to 402,750 hours based on the number of collections we expect to conduct over the requested period for this clearance.


The total dollar value of the annual burden hours is based on the National Compensation Survey: Occupational Wages in the United States, May 2018, published by the Bureau of Labor Statistics (http://www.bls.gov/oes/current/oes_nat.htm#00-0000). We use the value for All Occupations, an average hourly wage of $24.34, multiplied by 1.4 to account for benefits: $34.08.


  1. Data gathering projects: We estimate approximately 1,425 participants per year per data gathering collection project under this generic clearance. This number is based on the maximum annual number of registrants over four years of data from USGS’s citizen science program, the National Phenology Network (NPN) (OMB Control Number 1028-NEW). NPN estimated 13 minutes for registration, login and reading guidelines. Under this generic clearance, we estimate 7 data gathering projects per year (10,000 registrants). We estimate the number of participants completing training will be 80% of registrants (8,000 participants). Training modules will vary by data collection; we estimate four hours. For this estimate we assume that each trained participant will collect the same number of observations in the same amount of time. USGS’s NPN estimated 500,000 observations per year at 2 minutes per observation for plant phenology, which is a relatively quick observation. For this generic clearance, we estimate that the same number of “trained” participants (8,000) will collect a total of 500,000 observations at 5 minutes per observation = 5.2 hours/participant on an annual basis. The estimated annual burden for 7 data gathering projects is 75,833 hours.


  2. Classification/problem-solving projects: We estimate 2,500 participants per year per classification/problem solving data collection under this generic clearance based on estimates from an example of a classification/problem-solving project at USGS, the citizen science program iCoast (OMB Control Number 1028-NEW). iCoast estimated 10 minutes for registration, login and reading guidelines. We estimate 3 classification/problem-solving projects annually under this generic clearance. We estimate the number of participants completing training will be 80% of registrants. Training modules will vary by data collection; we estimate four hours. The estimated number of participants that will spend time on the website, app, or computer program engaged in the activities will vary, and it is difficult to predict. Participants will continue to engage with the site based on their interest and submit data until the task is complete. For this estimate, we assume data collection tasks (classification/problem solving) will be completed with 50% of the trained participants engaged by the sites for 8 hours per month, or 96 hours per year. The estimated annual burden for 3 classification/problem solving projects is 313,250 hours.


Burden of information collection request table

Estimated Annual Reporting Burden

A. Data gathering projects

Type of Collection | Number of Participants | Estimated Time per Participant (hours unless otherwise noted) | Total Annual Burden Hours
Participant registration, initial login & reading guidelines | 10,000 | 13 minutes | 2,167 hours
Participant training (estimate 80% of those who register will undergo training) | 8,000 | 4 hours | 32,000 hours
Participants contributing observations (estimate all "trained") | 8,000 | 5.2 hours (500,000 observations at 5 minutes each) | 41,667 hours
Total burden hours | | | 75,833 hours
Total annual labor costs | | hourly rate including benefits, $34.08 | $2,584,389

B. Classification/problem-solving projects

Type of Collection | Number of Participants | Estimated Time per Participant | Total Annual Burden Hours
Participant registration, initial login & reading guidelines | 7,500 | 10 minutes | 1,250 hours
Participant training | 6,000 | 4 hours | 24,000 hours
Participants completing data collection tasks | 3,000 | 96 hours (8 hours/month) | 288,000 hours
Total burden hours | | | 313,250 hours
Total annual labor costs | | hourly rate including benefits, $34.08 | $10,675,560

Grand total, annual burden hours | | | 389,083 hours
Grand total, annual labor costs | | | $13,259,949
Grand total, burden hours over 3 years | | | 1,167,249 hours
Grand total, 3-year labor costs | | | $39,779,847
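For readers who wish to check the table, the short Python sketch below (illustrative only, not part of the ICR submission) reproduces the burden and cost arithmetic from the figures stated above; any remaining differences of a few hours or dollars are rounding.

    # Reproduces the Estimated Annual Reporting Burden table from the stated inputs.
    HOURLY_RATE = 34.08  # $24.34 all-occupations mean wage x 1.4 for benefits (rounded)

    def burden_hours(rows):
        # Each row is (count, hours per unit); returns total hours.
        return sum(count * hours for count, hours in rows)

    # A. Data gathering projects (7 projects, ~10,000 registrants total)
    data_gathering = burden_hours([
        (10_000, 13 / 60),   # registration, login, reading guidelines: 13 minutes
        (8_000, 4),          # training: 80% of registrants, 4 hours each
        (500_000, 5 / 60),   # 500,000 observations at 5 minutes each (~5.2 h/participant)
    ])                       # -> 75,833 hours

    # B. Classification/problem-solving projects (3 projects, 7,500 registrants total)
    classification = burden_hours([
        (7_500, 10 / 60),    # registration, login, reading guidelines: 10 minutes
        (6_000, 4),          # training: 80% of registrants, 4 hours each
        (3_000, 96),         # 50% of trained participants at 8 hours/month
    ])                       # -> 313,250 hours

    annual_hours = round(data_gathering) + round(classification)   # 389,083 hours
    print(annual_hours, round(annual_hours * HOURLY_RATE))          # -> $13,259,949/year
    print(3 * annual_hours, 3 * round(annual_hours * HOURLY_RATE))  # 3-year totals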



13. Costs to Respondents

There will be no fees associated with participation in the data collections under this generic clearance. Participants will not be required to purchase any equipment to collect data, but some low-cost sensors or other technical or low-tech supplies may be necessary to complete all of the data collection tasks, should participants decide to complete all tasks. The costs to participants for materials will vary based on the data collection type (data gathering, classification/problem solving, or research subject participation) and medium (e.g., sensors, apps, or paper forms). The Agency does not expect participants to make purchases specifically for citizen science and crowdsourcing projects under this generic clearance. However, the table below reflects an annual, 3-year estimate of Operations and Maintenance (O&M) costs that participants might incur should they decide to purchase equipment to participate fully in a citizen science or crowdsourcing collection under this generic clearance. The estimate is based on the following assumptions: one eighth of the expected participants in the data gathering projects purchase low-tech equipment at a maximum cost of $25 per person (e.g., use of already-owned internet or data plans, or purchase of low-technology equipment like water monitoring kits), and one eighth of the expected participants in the data gathering projects purchase high-tech equipment (e.g., personal monitors, portable/personal air sensors, or other higher-technology equipment) at a maximum cost of $500 per person.

Operations & Maintenance (O&M) Costs

Equipment type | Estimated maximum cost per participant | Estimated number of participants expected to use | O&M costs
Low-tech equipment | $25.00 | 1,000 | $25,000
Low-cost personal or portable technology | $500.00 | 1,000 | $500,000
Total annual non-hour burden cost | | | $525,000
O&M costs over 3 years | | | $1,575,000
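As a quick check on the table, a minimal sketch of the arithmetic (illustrative only): one eighth of the 8,000 expected data-gathering participants at each cost tier.

    participants = 8_000                   # expected data-gathering participants
    low_tech = (participants // 8) * 25    # 1,000 participants x $25  = $25,000
    high_tech = (participants // 8) * 500  # 1,000 participants x $500 = $500,000
    annual_om = low_tech + high_tech       # $525,000 per year
    print(annual_om, 3 * annual_om)        # -> 525000 1575000 ($1,575,000 over 3 years)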



14. Costs to Federal Government

The anticipated cost to the Federal Government is approximately $151,678 annually. These costs comprise project administration and estimated contractor payments. EPA person-costs are estimated using an hourly rate for a GS-14 (step 1) employee based in Washington, DC, plus an additional 60% for benefits. Time spent on each step may vary, as may the GS-level of the employees involved. The estimate for project administration is based on 10 projects per year at 5% time, given the 2,087-hour divisor [7] for an employee’s annual rate of pay.





Task | EPA costs (and person-hours) per project ($50.41/hour) | Estimated contractor costs | Total cost/year (estimated for 10 projects per year)
Project Administration | $9,168 | | $91,678
Contractor costs | | $60,000 | $60,000
Total | | | $151,678
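A minimal sketch of this arithmetic (illustrative only; the per-project administration cost is taken directly from the table rather than re-derived from the hourly rate):

    hours_per_project = 2_087 * 0.05  # 5% of the 2,087-hour work year = 104.35 hours
    admin_total = 91_678              # stated administration total (~10 x $9,168, before rounding)
    contractor = 60_000               # estimated contractor payments per year
    print(admin_total + contractor)   # -> 151678, the stated annual cost to the government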



15. Reason for Change

This three-year renewal will allow EPA to continue its citizen science activities consistent with recent Congressional legislation (the American Innovation and Competitiveness Act), a 2016 GAO report (“Practices to Engage Citizens and Effectively Implement Federal Initiatives”), a 2017 GAO report (“Executive Branch Developed Resources to Support Implementation, but Guidance Could Better Reflect Leading Practices”), and a September 2018 EPA OIG report on improving management controls for EPA citizen science (“EPA Needs a Comprehensive Vision and Strategy for Citizen Science that Aligns with its Strategic Objectives on Public Participation”). The dollar figures have been updated to reflect current wages. Anticipating an increase in the number of EPA citizen science projects, we have increased the total burden hours by 19%.

16. Tabulation of Results, Schedule, Analysis Plans


The tabulation, timeline, analysis, and publication of information collected under this generic clearance will vary by submission.


In accordance with the Presidential Memorandum on Transparency and Open Government [8], information collected under this generic clearance will be disclosed rapidly, in forms that the public can readily find and use, and in compliance with the data policies outlined on Data.gov [9].


Each project submitted under this generic clearance will specify the tabulation, timeline, and analysis of the information collection. The information collected is for Agency scientific purposes; thus a number of projects are likely to publish the results of analyzed data in peer-reviewed scientific journals, white papers, Agency reports, or Agency strategic research plans, which will be available for public consumption.


17. Display of OMB Approval Date

Not applicable for this request.

18. Exceptions to Certification for Paperwork Reduction Act Submissions

These activities comply with the requirements in 5 CFR 1320.9.



B. STATISTICAL METHODS

Data collection methods and procedures will vary; however, the primary purpose of these collections will be qualitative and quantitative data collections that might help inform scientific research, assessments, or environmental screening; validate environmental models or tools; or enhance the quantity and quality of data collected across the country’s diverse communities and ecosystems to support the Agency’s mission.


  1. Universe and Respondent Selection


Statistical methods will not be used in the selection of respondents. Participants in citizen science and crowdsourcing projects are self-selected. The method for soliciting participation will be described fully in each collection request, but participation may be advertised through targeted outreach and engagement methods like standard and social media outlets, collaborations with on-the-ground partners, public talks, and word-of-mouth.


The number of participants will vary by project submitted under this generic clearance. The variation in participation is likely due to multiple factors like personal interests, accessibility, perceived burden, outreach by the Agency, and success over time. For example, the citizen science program Nature’s Notebook (USGS, OMB Control Number 1028-NEW) reported that participation increased from 40 registered observers in 2008 to 530 registered observers in 2011.


Results will not be used to directly inform Agency regulations or policies. Data also will not be generalized beyond the scope of the sample.


  2. Procedures for Collecting Information


Data collection methods and procedures will vary and the specifics of these will be provided with each collection request. Each request under this generic collection will include details on the statistical methodology for stratification and sample selection (if applicable to the collection – this is not applicable to the selection of participants), estimation procedure, degree of accuracy needed for the research purpose described in the justification, unusual problems requiring specialized sampling procedures, and any use of periodic data collection cycles (less frequent than annual) to reduce burden.



  3. Methods to Maximize Response


Participants will have complete control over their participation in citizen science or crowdsourcing projects. Participants will need to proactively seek out opportunities, respond to an email, or actively sign up for a project in order to participate. Outreach and advertising materials will provide information on how to participate but will not assume participation from anyone. Several existing citizen science and crowdsourcing projects employ engagement tactics to support continued participation and reduce non-response, including newsletters that deliver appreciation, motivation, and results to participants, and optional bi-weekly reminders to observe. The collection requests under this generic clearance may utilize some of these techniques while acknowledging that participants have full control over whether to participate.


Each collection request under this generic clearance will specify methods to track and increase response rates. Some collection requests will provide opportunities for participants to submit negative data, for example, information on the time and effort to attempt to obtain an observation in the event of no observation.


  4. Testing of Procedures


Pretesting may be done with internal staff or a limited number of external colleagues (fewer than 10). If the number of pretest respondents exceeds nine members of the public, the Agency will submit the pretest instruments for review under this generic clearance.




  5. Contacts for Statistical Aspects and Data Collection


Projects submitted under this generic clearance can consult with statisticians in the development, design, conduct, and analysis of the data collection. Statistical expertise is available from agency statisticians or contractors and the Agency will include the names and contact information of persons consulted in the specific information collection requests submitted under this generic clearance as needed.



C. Appendix


EPA Statutory Authorities


  • American Innovation and Competitiveness Act § 402, 42 U.S.C. § 1861, authorizes Federal science agencies to “conduct projects designed to advance the mission of” the agency. It also authorizes an agency to work with outside organizations in these projects.

  • Clean Air Act § 103, 42 U.S.C. § 7403, authorizes research into techniques for monitoring and controlling air pollution.

  • Clean Water Act § 104, 33 U.S.C. § 1254, authorizes EPA to encourage, cooperate with and render technical services to individuals, including the general public, to promote the coordination and acceleration of demonstrations, studies and training relating to the causes, effects, prevention and elimination of water pollution.

  • Solid Waste Disposal Act § 8001, 42 U.S.C. § 6981, authorizes EPA to encourage, cooperate with and render technical services to individuals as well as public and private sector entities to promote the coordination and acceleration of demonstrations, studies, training and public education programs relating to, among other things: adverse and welfare effects of the release of solid waste into the environment; operation and financing of solid waste management programs; planning and operation of resource recovery and conservation systems and hazardous waste management systems; production and marketing of recovered resources; reductions in the amount of solid and hazardous waste and unsalvageable waste materials; and, the development and application of improved methods of collecting and disposing of solid wastes to recover and market materials and energy from these wastes.

  • Marine Protection, Research and Sanctuaries Act § 203, 33 U.S.C. § 1443, authorizes EPA to encourage, cooperate with, and render technical assistance to public and private sector entities, including individuals, to promote the coordination of demonstrations, studies and training to minimize dumping of materials into the ocean that may unreasonably degrade or endanger human health, welfare, or the marine environment and economic potential.

  • Safe Drinking Water Act § 1442, 42 U.S.C. § 300j-1, authorizes the Administrator to conduct research, studies, and demonstrations relating to the causes, diagnosis, treatment, control, and prevention of risks to human health related to drinking water supply, and to share information and make recommendations based on this research and investigation.

  • The National Environmental Education Act § 4, 20 U.S.C. § 5503, authorizes EPA to develop and support programs to increase environmental literacy.

  • Comprehensive Environmental Response, Compensation and Liability Act § 311, 42 U.S.C. § 9660, authorizes EPA to conduct research, and provide training and technical assistance to individuals and organizations, to facilitate the inventory, assessment, preparation and remediation of brownfields sites, including associated community involvement.



Policy support

  • GAO Report GAO-17-507. Open Innovation: Executive Branch Developed Resources to Support Implementation, but Guidance Could Better Reflect Leading Practices. June 2017. This report identified key actions agencies and executive offices could do to encourage and expand the use of open innovation in government.

  • EPA Office of Inspector General Report No. 18-P-0240. EPA Needs a Comprehensive Vision and Strategy for Citizen Science that Aligns with Its Strategic Objectives on Public Participation. September 5, 2018. This report evaluated whether EPA has developed controls to manage the use of citizen science results to meet the agency’s mission.

  • 2013 Second Open Government National Action Plan - encourages Federal Agencies to harness the ingenuity of the public by accelerating and scaling the use of open innovation methods such as citizen science and crowdsourcing: https://www.whitehouse.gov/sites/default/files/docs/us_national_action_plan_6p.pdf

  • OMB Memo M-11-07. Facilitating Scientific Research by Streamlining the Paperwork Reduction Act Process. December 9, 2010. Citizen science and crowdsourcing are in line with the Paperwork Reduction Act’s intent to “ensure the greatest possible public benefit from and maximize the utility of information created, collected, maintained, used, shared, and disseminated by or for the Federal Government.”

  • OMB Memo M-10-06. Open Government Directive. December 8, 2009. Promotes open government and the use of new technologies.

  • OMB Memo M-15-16. Multi-Agency Science and Technology Priorities for the FY 2017 Budget. July 9, 2015. “Agencies are encouraged to use approaches to foster innovation such as Grand Challenges, incentive prizes, citizen science, and collaboration with members of the Maker Movement.”



Lessons Learned from EPA’s First Generic ICR for Citizen Science (2016-2019), Generic ICR #2080-0083


EPA used the generic ICR for citizen science to conduct ten new projects. Some lessons learned are that well-designed citizen science projects can 1) fill data gaps and provide another means of identifying potential environmental problems, 2) improve public understanding of environmental issues and actions that address them, 3) create a stronger, more inclusive, and collaborative network of individuals and organizations dedicated to environmental problem solving, and 4) yield cost savings and efficiency in environmental monitoring and protection programs. Anticipated improvements over the next few years will focus on the infrastructure and guidance for EPA citizen science projects. This will help staff identify valuable applications for citizen science and expedite the internal review and approval processes for new projects.

 

EPA benefitted from an Office of the Inspector General (OIG) self-initiated audit of EPA’s citizen science conducted in 2017 and 2018. The final report, issued on September 5, 2018, is titled “EPA Needs a Comprehensive Vision and Strategy for Citizen Science that Aligns with Its Strategic Objectives on Public Participation” (https://www.epa.gov/sites/production/files/2018-09/documents/_epaoig_20180905-18-p-0240.pdf). The report highlights the generic ICR to expedite the approval process for new citizen science projects as a noteworthy achievement.

 

The OIG audit found that EPA is seen as a leader in the US and abroad on engaging in citizen science, and that EPA has made noteworthy progress in providing internal support. Some EPA programs and regions are engaged, as illustrated by promising efforts across a diverse range of topics. Some projects are community oriented; others address research questions; others involve monitoring. However, the OIG concluded that the set of ad hoc projects across the agency lacks a coherent EPA-wide vision and thus does not yield full value.

 

EPA agreed with the four OIG recommendations for improved management of citizen science. Below is a brief summary of the four recommendations and the EPA actions that are underway. Completion of all the actions to respond to OIG recommendations is anticipated by December 2020.

 

1. Establish a strategic vision and objectives for managing the use of citizen science

-       Link to the agency’s strategic goals

-       Define roles and responsibilities for implementation

-       Identify resources to maintain and build upon existing agency expertise

 

Status: An EPA workgroup has prepared a draft strategic vision and principles for EPA citizen science. EPA has conducted outreach to states and tribes through the Environmental Council of the States (ECOS), the E-Enterprise Leadership Council (EELC), and other mechanisms.

       

2. Issue an EPA Quality Assurance Handbook for Citizen Science

 

Status: The QA handbook for citizen science was issued in March 2019 (https://www.epa.gov/citizen-science/handbook-quality-assurance). We are now preparing an online training for EPA staff, state/tribal environmental programs, and citizen science groups.

 

3. Build capacity for using citizen science

-       Policy guidance and checklist for EPA staff on administrative and legal factors

-       Training/outreach on how to develop projects for program and regional staff

-       Communication to highlight successes (i.e., where citizen science data are used)

Status: An EPA staff workgroup prepared a draft checklist and policy guidance to help EPA staff conduct citizen science projects. The document includes information on how to comply with legal requirements (e.g., Paperwork Reduction Act, human subjects, Privacy Act), administrative processes (e.g., the EPA process for approval of new citizen science apps), and ethical issues that arise in citizen science projects. EPA is also upgrading its citizen science website and other communication products.

 

4. Prepare an assessment of data management requirements for using citizen science data

-       Action plan on sharing and using data, data format/standards, and data testing/validation

 

Status: An EPA workgroup prepared an assessment and a framework for an action plan on managing citizen science data. This was shared in draft with the E-Enterprise Leadership Council (EELC), which comprises senior leaders from EPA, states, and tribes.



Background Information About the Projects Included Under the Generic ICR #2080-0083 (from 2016-2019)


CONTINUED PROJECTS

1. CyanoScope

  1. Date: 2016 to present

  2. Location: Primary geographic target is the northeastern U.S. (EPA Regions 1 & 2), but other geographic areas are possible.

  3. Number of Participants: Over 300 organizations participated.

  4. Types of Data Collected: Image based documentation of harmful algal blooms, microscopic images of individual organisms, and fluorometric data

  5. How Results Were Disseminated or Used: Data are available to the public using an existing cyanobacteria collaborative webpage. Further data visualization and data input tools are under development. These data will be incorporated with other data collected using consistent methods.

  6. Project Lead: EPA Region 1 (Boston)

  7. Lessons Learned: There is a great need for educating the public on the perceived and real risks surrounding harmful cyanobacteria blooms. This program has trained hundreds of individuals over the past few years and has been the catalyst for many local startup monitoring and education programs. It continues to be well received and participation continues to expand. There is also a great need for aggregating data across state lines in order to gain regional perspectives. EPA is developing data visualization and exploration tools.

  8. Types of Future Information Collection: Types of information that may be collected in the future include water quality data, biological information on the types of cyanobacteria present in specific waterbodies, cyanobacteria toxin data, and toxin accumulation in biota. These data will be relevant to the principal goals of the Clean Water Act.

2. STEM Education

  1. Date: 2018 to present

  2. Location: Arkansas

  3. Number of Participants: 30

  4. Types of Data Collected: Water quality data

  5. How Results Were Disseminated or Used: Data gathered advance understanding of the quality of water in private domestic wells, helping ensure the safety of the water supplies. Data are made available to the public as confirmed by the owners of the wells.

  6. Project Lead: EPA Office of Research and Development

  7. Lessons Learned: This pilot demonstrated that monitoring of private drinking water wells is suited to crowdsourcing and citizen science.

  8. Types of Future Information Collection: The same information will be collected in the future. Several additional states have expressed interest.

3. Smoke Sense

  1. Date: 2017

  2. Location: Nationwide

  3. Number of Participants: 30,000+

  4. Types of Data Collected: Individual reports of smoke observations, health symptoms, and behavioral actions taken to reduce exposure.

  5. How Results Were Disseminated or Used: To date, the project has resulted in five manuscripts, which provide an unprecedented advance in knowledge of individual-level engagement with the issue of air quality as a health risk. In these manuscripts we explored how people respond to air quality health risks.

  6. Project Lead: [email protected]

  7. Lessons Learned: The Smoke Sense project brings together a range of scientific expertise with local, state, federal, and private partners to collaboratively build knowledge about wildfire smoke, health, and protective actions to improve public health outcomes.

  8. Types of Future Information Collection: Future collections will likely be the same as current collections.

4. Great Lakes Underwater Video

  1. Date: 2017 to present

  2. Location: Great Lakes

  3. Number of Participants: 514

  4. Types of Data Collected: Analysis of underwater video: presence of invasive species.

  5. How Results Were Disseminated or Used: Researchers used the results to evaluate the abundance of invasive species in Lake Ontario, Lake Huron, and the Niagara River, and to evaluate the method of crowdsourcing underwater video analysis.

  6. Project Lead: EPA Office of Research and Development

  7. Lessons Learned: Preliminary results demonstrated that citizen scientists were able to identify substrate type, species, and vegetation. The use of crowdsourcing shows promise as a cost-effective means of interpreting underwater video. Data are useful in the National Coastal Condition Assessment, which is used by resource managers in the Great Lakes.

  8. Types of Future Information Collection: N/A

5. Puerto Rico Drinking Water Study

  1. Date: 2017 to present (project delayed due to hurricane).

  2. Location: Puerto Rico

  3. Number of Participants: 198 (planned)

  4. Types of Data Collected: Data on the incidence and type of gastrointestinal illness using fecal and saliva tests.

  5. How Results Were Disseminated or Used: Data obtained during the study will be publicly available upon publication of peer reviewed journal article without personally identifiable information or sensitive information. Data will also be available in EPA Science Hub publication data sets.

  6. Project Lead: EPA Office of Research and Development

  7. Lessons Learned: The project was delayed due to the hurricane, but there are preliminary indications that simple citizen science and crowdsourcing can contribute to epidemiological studies. Information from this project is critical to identifying the etiological agents and to evaluating the effectiveness of water treatment.

  8. Types of Future Information Collection: Project will continue to collect the same information in the future.

6. Stove Replacement in NM, Navajo Nation

  1. Date: February 2018 - Present

  2. Location: Shiprock, NM Navajo Nation Tribal Reservation

  3. Number of Participants: 14 to date

  4. Types of Data Collected: Household characteristics and household activities related to generation of particulate matter and carbon monoxide.

  5. How Results Were Disseminated or Used: Early results informed design of next phase of study (for example redesigning the activity log to be more user-friendly). Results were disseminated to the Navajo Nation Human Research Review Board (NNHRRB) and at the NNHRRB annual research conference.

  6. Project Lead: EPA Region 9

  7. Lessons Learned: The quick, low-cost approval process for ICRs allowed us to start an intervention study to see if there are air quality improvements from a custom stove-replacement project taking place on the Navajo Nation; a delayed approval process would have kept us from taking advantage of this opportunity to assess the impacts of the replacement.

  8. Types of Future Information Collection: Future collections to be the same.

7. Coastal Acidification

  1. Date: 2018 to present

  2. Location: Coastal Massachusetts

  3. Number of Participants: 3 to 5 organizations will sign up volunteers.

  4. Types of Data Collected: Water quality, including total alkalinity, pH and other related measures, such as salinity and temperature.

  5. How Results Were Disseminated or Used: Results will be disseminated in reports, and data will be made public, but a data management plan has not yet been established.

  6. Project Lead: [email protected]; [email protected]

  7. Lessons Learned: Project still in early stages, but preliminary indications that volunteer monitoring of coastal acidification is a cost-effective way to collect data.

  8. Types of Future Information Collection: Future collections to be the same.

8. LA Library Sensor Loan Program

  1. Date: Information Collection has not yet begun.

  2. Location: Los Angeles, CA

  3. Number of Participants: 0 to date

  4. Types of Data Collected: Experience using air quality sensors and administering sensor loan program.

  5. How Results Were Disseminated or Used: None to date

  6. Project Lead: EPA Region 9

  7. Lessons Learned: Project still underway but preliminary lessons are that loan programs build better relationships with the public, address community concerns, and increase knowledge about how to appropriately use sensor technologies. A “lessons learned” report is planned that will capture experiences of community participants as well as librarians and summarize best practices.

  8. Types of Future Information Collection: Potential for additional loan programs in other communities.



DISCONTINUED PROJECTS

1. Pet Health

  1. Date: Oct 2017 – June 2019

  2. Location: Web survey

  3. Number of Participants: 4,000 total; 1,000 partial and 3,000 complete

  4. Types of Data Collected: Pet health, in- home and near- home Environment

  5. How Results Were Disseminated or Used: Fact sheet posted on website; presentations

  6. Project Lead: EPA Office of Research and Development

  7. Lessons Learned: While the survey was not able to continue through a full three-year period, initial indications are that people are very interested in being engaged when there is a positive feedback loop of information. People were excited and willing to fill out a survey for their pets. In the 24 months of the survey we collected over 3,000 complete survey responses. From the data we are seeing that many of the disease issues faced by humans are also present in our pets (e.g., cancer, flu, Lyme disease, asthma). Future studies should narrow the focus and partner with epidemiologists to better tailor the survey questions.

  8. Types of Future Information Collection: No future collection planned.


2. Honeybee Survey

  1. Date: February 2017

  2. Location: National

  3. Number of Participants: 100

  4. Types of Data Collected: Hive health survey, honey sample

  5. How Results Were Disseminated or Used: Results were posted on the project website, and a manuscript was authored using some of the honey samples collected during this project (DOI 10.1099/acmi.0.000065).

  6. Project Lead: EPA Office of Research and Development

  7. Lessons Learned: The bee keeping community is acutely aware of issues affecting honey bee colony health and is motivated to find solutions.

  8. Types of Future Information Collection: This project was phased out in FY19.

3. Honeybee Survey Addendum – Same information as the Honeybee Survey

4. Smith River MT Survey 

  1. Date: April 2018-September 2018

  2. Location: Smith River, Montana

  3. Number of Participants: 94

  4. Types of Data Collected: Photos and survey

  5. How Results Were Disseminated or Used: Results were processed by Montana DEQ to make informed monitoring and assessment decisions for nutrient impairment in the Smith River.

  6. Project Lead: EPA Region 8

  7. Lessons Learned: The project was planned and implemented for one year (2018). The funding was used to create a phone App to allow users to take georeferenced pictures in an unconnected environment (i.e., no cell phone/wi-fi coverage). While the App was successfully developed and implemented in 2018, maintaining the App, providing technical support, and processing raw data incurred larger-than-anticipated contractor costs, and neither EPA nor the state had funding in 2019 to maintain App support. In the future, continuous funding would be needed to provide continuous App and processing support. Also, while the App was successful, it did not take advantage of newer technologies and was built on the older ESRI Survey123 platform. In the future, App development funding should be directed toward IT companies that specialize in App development, and future Apps should take advantage of the latest (but more expensive) technologies.

  8. Types of Future Information Collection: No additional information collection is anticipated for the future.

5. Citizen Science Motivation and Experiences (subproject of Smoke Sense) - This project was not implemented.

Project Lead: EPA Office of Research and Development







[1] OMB Memo M-10-06. Open Government Directive. December 8, 2009. https://www.whitehouse.gov/sites/default/files/omb/assets/memoranda_2010/m10-06.pdf

[2] American Innovation and Competitiveness Act, S.3084, 114th Cong. (2015-2016).

[3] OMB Memo M-11-07. Facilitating Scientific Research by Streamlining the Paperwork Reduction Act Process. December 9, 2010.

[4] http://www.epa.gov/quality/

[5] Typology adapted from: Teresa Scassa and Haewon Chung. 2015. Typology of Citizen Science Projects from an Intellectual Property Perspective: Invention and Authorship Between Researchers and Participants. Wilson Center, Commons Lab, Case Study Series, Vol. 5.

[6] http://www2.epa.gov/osa/basic-information-about-human-subjects-research

[7] http://www.opm.gov/policy-data-oversight/pay-leave/pay-administration/fact-sheets/computing-hourly-rates-of-pay-using-the-2087-hour-divisor/

[8] FR Doc. E9-1777, Presidential Memorandum for the Heads of Executive Departments and Agencies, 01/26/2009.

[9] https://www.data.gov/data-policy


