Generic Clearance for Citizen Science and Crowdsourcing Projects (New)

OMB: 2080-0083


ICR number 2521.01



1. Circumstances Making the Collection of Information Necessary


Citizen science and crowdsourcing: Innovative research methods that engage the public


Citizen science and crowdsourcing are tools that engage, educate and empower the public to apply their curiosity and contribute their talents to a wide range of scientific and societal issues. Citizen science is a form of open collaboration in which the public can participate actively in the scientific process through methods that include asking research questions, collecting and analyzing data, interpreting results, or engaging in problem solving. Crowdsourcing is a process in which individuals or organizations issue an open call for contributions of information from a large group of individuals (“the crowd”).


EPA’s mission to protect human health and the environment


The mission of EPA is to protect human health and the environment. Citizen science and crowdsourcing can support EPA’s mission and purpose, including ensuring that national efforts to reduce environmental and public health risks are based on the best available scientific information and that all parts of society (communities, individuals, businesses, and state, local and tribal governments) have access to accurate information sufficient to participate effectively in managing human health and environmental risks. To meet these goals, EPA fosters the sound use of science and technology, conducts leading-edge research, and funds environmental justice and community-driven projects. The Agency also conducts educational activities to increase the public’s knowledge and understanding of environmental issues so that people can make appropriate decisions. See Section C (Appendix) for EPA’s statutory authority and examples of policy support for incorporating citizen science and crowdsourcing methods into Agency scientific endeavors.

Benefits of research using citizen science and crowdsourcing approaches in EPA research


Citizen science and crowdsourcing can create engaging opportunities for the public to experience their environment, contribute environmental data at a more local level, and analyze large environmental datasets. These methods give people the ability to easily share data they encounter in their communities and environments. Whether it is an individual photographing an endangered species encountered on a walk, someone at home adding descriptions to online aerial photographs taken at a disaster site, or owners of personal health monitors documenting their daily activities, crowdsourcing and citizen science provide people with a fun and accessible way to contribute to science or to foster a greater appreciation of their natural environment and community. In addition, citizen science and crowdsourcing projects promote greater openness in the scientific process by actively encouraging participation in various aspects of research. Researchers using citizen science and crowdsourcing are committed to disseminating data and results back to the public.


Many federal and non-federal organizations are already using innovative citizen science and crowdsourcing tools to advance their missions. These tools are especially valuable where data are distributed across space and time or when projects rely on large datasets. Successful citizen science and crowdsourcing projects usually result from iterating on the design based on feedback from participants, and there is often uncertainty about whether the time and effort invested in creating a project will capture the interest of the public and yield meaningful participation. Speed and flexibility are therefore beneficial for developing, testing, and implementing good citizen science and crowdsourcing projects, allowing, for example, internet-based activities to evolve with technology and with variable participation over time. An expedited approval process could facilitate incorporation of citizen science and crowdsourcing methods into EPA’s research and scientific initiatives, yielding large, diverse datasets that can provide a more thorough understanding of environmental issues.


The growth and success of citizen science and crowdsourcing is tied closely to advances in technology. Enhanced tools and methods are constantly making citizen science and crowdsourcing more feasible and effective. EPA researchers want to respond to and interact with industry as these technologies develop. For example, the ability to quickly incorporate new technologies could allow EPA researchers to contribute to low-cost sensor testing and use. In addition, an expedited approval process is consistent with OMB Memo M-10-06 [1], which promotes the use of new technologies and greater openness in government.


Federal support for citizen science and crowdsourcing


In the 2013 Second Open Government National Action Plan [2], President Obama called on agencies to harness the ingenuity of the public by accelerating and scaling the use of open innovation methods such as citizen science and crowdsourcing. Citizen science and crowdsourcing are in line with the Paperwork Reduction Act’s intent to “ensure the greatest possible public benefit from and maximize the utility of information created, collected, maintained, used, shared, and disseminated by or for the Federal Government.” [3]


Design principles for citizen science and crowdsourcing projects


Citizen science and crowdsourcing projects under this generic ICR will include the following design principles:


  1. Participants have a meaningful role in the research project, and can act as contributors or collaborators.

  2. Projects have a genuine scientific question or goal.

  3. Projects are low-burden for participants.

  4. Projects include active management of data and data quality, including a data quality assurance plan and ongoing evaluation of data quality and data management.

  5. Projects are opt-in and participants have full control over the extent that they participate.

  6. The data gathered and/or analyzed are shared with participants and generally made publicly available, unless there are security or privacy concerns that prevent this.

  7. Participants receive feedback on how their contribution adds to the project, e.g. how their data will be used and what the research findings are.

  8. Project leads will evaluate scientific output, data quality, and the impact on participants.

  9. Projects are designed to contribute to research and science, not to inform Agency regulations or policies.



2. Purpose and Use of the Information Collection


EPA relies on scientific information. Citizen science and crowdsourcing techniques will allow the Agency to collect qualitative and quantitative data that might help inform scientific research, assessments, or environmental screening; validate environmental models or tools; or enhance the quantity and quality of data collected across the country’s diverse communities and ecosystems to support the Agency’s mission. Information gathered under this generic clearance will be used by the Agency to support the activities listed above and might provide unprecedented avenues for conducting breakthrough research.


Collections will be from participants who actively seek to participate on their own initiative through an open and transparent process (the Agency does not select participants or require participation); the collections will be low-burden for participants; collections will be low-cost for both the participants and the Federal Government; and data will be available to support the scientific research (including assessments, environmental screening, tools, models, etc.) of the Agency, states, tribal or local entities where data collection occurs. EPA may, by virtue of collaborating with non-federal entities, sponsor the collection of this type of information in connection with citizen science projects. To the extent applicable, all such collections will accord with Agency policies and regulations related to human subjects research and will follow the established approval paths through the Human Subjects Research Review Official. Finally, personally identifiable information (PII) will only be collected when necessary and in accordance with applicable federal procedures and policies. If a new collection is not within the parameters of this generic ICR, the Agency will submit a separate information collection request to OMB for approval.


As with any scientific endeavor at EPA, citizen science and crowdsourcing projects will have approved data quality and data management plans as part of their project design before implementation. EPA provides employees with resources for developing data quality and data management plans [4].


The popularity and application of citizen science and crowdsourcing methods continue to grow with new and low-cost portable technologies. Therefore, the modes of data collection under this generic clearance may include: paper or digital questionnaires, data forms, surveys, focus groups or interviews; new and existing online collaboration tools; fields in cell or smart phone applications (apps); online web-based forms or interactive computer interfaces that elicit information; social media platforms; text or SMS messages; readings from sensors (personal, mobile, stationary or portable) or other mobile, portable or stationary instruments, with readings sent back to the Agency in real time, through an online data collection site, or through another acceptable mode listed here; analog or digital audio or video recordings; digital or analog photographs; and information collected automatically through an app, computer, the metadata accompanying a digital photograph, or a mobile sensor.

Information may include actively submitted content (such as descriptions, measurements, photographs, etc.) as well as passively submitted information (such as the metadata accompanying actively submitted content, e.g., the date, time, and location stamps automatically included with apps and digital photographs).
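As an illustration only (not an EPA collection instrument), the sketch below shows the kind of passively submitted metadata that accompanies a digital photograph, reading date/time and related EXIF tags with the Pillow library; the file name observation.jpg is hypothetical:

```python
# Minimal sketch of reading "passively submitted" photo metadata (EXIF tags).
# Assumes the Pillow library and a hypothetical file named observation.jpg.
from PIL import Image, ExifTags

with Image.open("observation.jpg") as img:
    exif = img.getexif()  # EXIF tags recorded by the camera or phone app
    for tag_id, value in exif.items():
        name = ExifTags.TAGS.get(tag_id, tag_id)
        if name in ("DateTime", "GPSInfo", "Make", "Model"):
            # DateTime is a timestamp string; GPSInfo points to GPS data if present
            print(name, value)
```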

Citizen science and crowdsourcing collections submitted under this generic clearance can be stand-alone projects, or the methods may be incorporated into an existing or new project, including, but not limited to, projects in the following typology [5]:

  • Data gathering projects. These projects may include 1) observation, characterization and documentation of natural phenomena or general environmental health observations, opinions, or preferences, or 2) surveying participants or screening environmental conditions, including using specialized equipment provided by project leaders to record and submit data, or submitting samples plus descriptors (e.g., of air or water) for testing. Data may be collected using the technologies mentioned above, through structured data forms, surveys or questionnaires, focus groups or interviews, submitted photographs or other media, or written observations.

  • Classification/problem solving projects. Participants’ tasks may include: 1) observation of recorded materials provided by project organizers (images, video, etc.) through structured data submission forms, surveys or questionnaires in an online or computer program, clicking boxes, highlighting parts of text or images, and providing comments and/or annotations; 2) classification of images or sounds using structured data submission forms or clicking boxes in an online or computer program; 3) transcription of information, such as typing handwritten logs or notes; 4) performing a function meant to generate human behavior data; or 5) problem solving or manipulation of data. Tasks 1-5 may be conducted via structured actions or instructions, or through a “human-based computational game” (also called a “game with a purpose”), a technique in which a computational process performs its function by presenting certain steps to humans in an entertaining way.

Data gathering and classification/problem solving projects may include participants providing health information, opinions or observations about a research subject’s environmental surroundings. To the extent applicable, these projects will accord with all Agency policies and regulations related to human subjects research and will follow the established approval paths through the Human Subjects Research Review Official [6].

Citizen science and crowdsourcing collections under this generic clearance may include the following types of questions or requests of participants:



  • Profile/Preference information. Projects may request a username and/or password as well as user preference information to facilitate or customize the user experience. Participants may be asked to submit an email address, name, and zip code, as well as acknowledge a privacy policy or terms agreement. Participants may also be presented with an opportunity to be placed on a mailing list for the project. This includes projects administered through a web form or mobile application.

  • Personal and Contact Information. Citizen science and crowdsourcing projects may solicit contact information. This information may be necessary to organize and analyze data (i.e., it may be necessary to know which data points are from the same observer). Projects may request contact information (name and email address, zip code, address and phone number) to provide participants with project updates and share data. Participants would be made aware that publicly available data will be anonymized and aggregated, for example by census tract, zip code, city, or some other level higher than individual addresses.

  • Experience and Expertise. For data quality purposes, citizen science and crowdsourcing projects may request information to evaluate the skill level of the participant by asking about their experience with the project topic. Questions may be about a person’s age range, level or topic of education, participation in organizations, or professional experience.

  • Information about Observations. Projects may request accompanying information, such as the date and time of the activity, the location (e.g., GPS coordinates, address, zip code, etc.), the weather (e.g., temperature, precipitation, wind, humidity, visibility, etc.), and a description or characterization of the location (e.g., vegetation type, type of water body, environmental condition, etc.) or personal senses (e.g. smell, visual cues, sound, etc.).

  • Project Evaluation. Citizen science and crowdsourcing projects may collect information on the participant’s experience for project evaluation and development. This may include questions on how the participant found out about the project, the amount of time spent, distance traveled, how difficult the task was for the participant, whether the participant enjoyed the experience, and if they will participate again. Projects may also request information to evaluate participant outcomes, such as changes in the participant’s understanding of the scientific process or project topic, through survey questions before and after participation.

  • Training. Citizen science and crowdsourcing projects may need to train participants for the purpose of soliciting quality data and increasing participant benefits, including education and engagement. Participants may be asked to read materials, watch training videos, or attend training sessions in person or virtually via a webinar. To ensure that participants understand the training, they may be assessed through testing instruments like a questionnaire or survey, which may be administered online or through a computer program, on paper, in a cell phone app, or in person.

3. Consideration Given to Information Technology


In order to encourage participation and reduce burden on participants, citizen science and crowdsourcing efforts often utilize information technology that is available to a number of potential participants (cell phones, personal computers, tablets, etc.). The projects submitted under this generic clearance may collect information electronically through new and existing online collaboration tools, cell phone applications (apps) or SMS, web-based forms, online computer programs or forms, social media platforms, or sensors (personal, mobile, stationary or portable).

4. Duplication of Information

No similar data are gathered or maintained by the Agency or are available from other sources known to the Agency.

5. Reducing the Burden on Small Entities

The collection tools for citizen science activities will be designed so that it is quick and simple for users to enter data. All activities will be voluntary, and thus respondents will not face any burden if the activity does not interest them.

6. Consequences of not Conducting Collection

If unable to collect information through citizen science or crowdsourcing methods under a generic ICR, the Agency would be unable to adapt and utilize these innovative tools in a timely manner to engage the public in Agency science. With these methods, EPA benefits from the public’s knowledge, expertise, and willingness to contribute to scientific endeavors that rely on large and geographically comprehensive datasets. The public and other organizations are beginning to capture and organize data with smartphones and portable sensors; the Agency’s involvement will allow publicly generated data to effectively support EPA research, including initiating data collection, developing innovative methods for data processing, and managing data quality. EPA research innovation benefits significantly when EPA researchers have access to the newest technologies, affording the opportunity to contribute meaningfully to low-cost sensor testing and use. Moreover, members of the public enjoy participating in citizen science and crowdsourcing projects, which are fun, educational, and engaging, and which allow for more open communication between EPA and the public; citizen science projects in other agencies have gathered millions of data points contributed by hundreds of thousands of interested individuals. These projects are always voluntary, low-burden, and rely on the interest and self-motivation of the participants. Finally, projects under this generic clearance will allow Agency researchers to test ideas more quickly, respond to a project’s needs as they evolve, and incorporate feedback from participants, resulting in flexible, innovative research methods that involve the public in a variety of aspects of scientific research.

7. Special Circumstances

There are no special circumstances.

8. Consultations with Persons outside the Agency

In accordance with 5 CFR 1320.8(d), a notice for public comment was published in the Federal Register (80 FR 59148) on October 1, 2015. No comments were received. The Agency consulted with representatives at federal agencies with citizen science and crowdsourcing efforts: USGS, HHS and USDA/USFS. The federal representatives are associated with the Federal Community of Practice on Crowdsourcing and Citizen Science (FCPCCS). Representatives reviewed a draft of this generic clearance. This document reflects the feedback and comments from this community.

9. Payment or Gift

The Agency will not provide payment or other forms of remuneration to participants.

10. Confidentiality

If a confidentiality pledge is deemed useful and feasible, the Agency will only include a pledge of confidentiality that is supported by authority established in statute or regulation, that is supported by disclosure and data security policies that are consistent with the pledge, and that does not unnecessarily impede sharing of data with other agencies for compatible confidential use. If the agency includes a pledge of confidentiality, it will include a citation for the statute or regulation supporting the pledge.



11. Sensitive Nature

No questions will be asked that are of a personal or sensitive nature as defined by OMB.

12. Burden of Information Collection

A variety of platforms and media will be used to collect information from respondents. We expect that there will be a range of burden hours depending on the details of the citizen science and crowdsourcing method employed. The total range of annual burden hours requested is 351,150 to 402,750 hours based on the number of collections we expect to conduct over the requested period for this clearance.


The total dollar value of the annual burden hours is based on the National Compensation Survey: Occupational Wages in the United States, May 2014, published by the Bureau of Labor Statistics (http://www.bls.gov/oes/current/oes_nat.htm). We use the average hourly wage for All Occupations, $22.71, multiplied by 1.4 to account for benefits, yielding a loaded rate of $31.79 per hour.
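As a worked arithmetic check (ours, not part of the submission), the loaded hourly rate can be reproduced as follows, assuming the figures quoted above:

```python
# Loaded hourly rate used to value burden hours.
# Assumes the figures above: $22.71 mean hourly wage for All Occupations
# (May 2014) and a 1.4 multiplier to account for benefits.
BASE_WAGE = 22.71
BENEFITS_MULTIPLIER = 1.4

loaded_rate = round(BASE_WAGE * BENEFITS_MULTIPLIER, 2)
print(loaded_rate)  # 31.79
```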


  1. Data gathering projects: We estimate 2,000 participants per year per data gathering collection under this generic clearance. This number is based on the maximum annual number of registrants (2,153) over four years of data from USGS’s citizen science program, the National Phenology Network (NPN) (OMB Control Number 1028-NEW). NPN estimated 13 minutes for registration, login and reading guidelines. Under this generic clearance, we estimate 7 data gathering projects per year (14,000 registrants). We estimate the number of participants completing training will be 80% of registrants (11,200 participants). Training modules will vary by data collection; we estimate a range of one to four hours. For this estimate we assume that each trained participant will collect the same number of observations in the same amount of time. USGS’s NPN estimated 500,000 observations per year at 2 minutes per observation for plant phenology, which is a relatively quick observation. For this generic clearance, we estimate that the same number of “trained” participants (11,200) will collect a total of 500,000 observations at 5 minutes per observation, or approximately 3.7 hours per participant annually. The estimated range of annual burden for 7 data gathering projects is 55,900 to 89,500 hours.


  2. Classification/problem-solving projects: We estimate 2,500 participants per year per classification/problem solving data collection under this generic clearance, based on estimates from an example of a classification/problem-solving project at USGS, the citizen science program iCoast (OMB Control Number 1028-NEW). iCoast estimated 10 minutes for registration, login and reading guidelines. We estimate 3 classification/problem-solving projects annually under this generic clearance. We estimate the number of participants completing training will be 80% of registrants. Training modules will vary by data collection; we estimate a range of one to four hours. The number of participants that will spend time on the website, app, or computer program engaged in the activities will vary and is difficult to predict. Participants will continue to engage with the site based on their interest and submit data until the task is complete. For this estimate, we assume data collection tasks (classification/problem solving) will be completed with 50% of the trained participants engaged by the sites for 8 hours per month, or 96 hours per year. The estimated range of annual burden for 3 classification/problem-solving projects is 295,250 to 313,250 hours. A worked arithmetic sketch of both burden estimates follows this list.
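The following Python sketch (ours, offered only as a hedged arithmetic check) reproduces the burden ranges stated in items 1 and 2 above, along with the combined annual total quoted at the start of this section:

```python
# Reproduces the narrative burden estimates above for both project types.
# All inputs are the figures stated in items 1 and 2; rounding is ours.

# Item 1: data gathering projects (7 projects per year).
dg_registrants = 7 * 2_000                        # 14,000 registrants
dg_registration = dg_registrants * 13 / 60        # 13 minutes each
dg_trained = int(dg_registrants * 0.80)           # 11,200 trained participants
dg_observations = 500_000 * 5 / 60                # 500,000 observations at 5 minutes
dg_low = dg_registration + dg_trained * 1 + dg_observations
dg_high = dg_registration + dg_trained * 4 + dg_observations
print(round(dg_low), round(dg_high))              # about 55,900 to 89,500 hours

# Item 2: classification/problem-solving projects (3 projects per year).
cp_registrants = 3 * 2_500                        # 7,500 registrants
cp_registration = cp_registrants * 10 / 60        # 10 minutes each
cp_trained = int(cp_registrants * 0.80)           # 6,000 trained participants
cp_tasks = int(cp_trained * 0.50) * 96            # 50% engaged for 96 hours/year
cp_low = cp_registration + cp_trained * 1 + cp_tasks
cp_high = cp_registration + cp_trained * 4 + cp_tasks
print(round(cp_low), round(cp_high))              # 295,250 to 313,250 hours

# Combined annual range quoted at the start of this section.
print(round(dg_low + cp_low), round(dg_high + cp_high))  # 351,150 to 402,750 hours
```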





Burden of Information Collection Request Table

Estimated Annual Reporting Burden

A. Data gathering projects

Type of Collection | Number of Participants | Estimated Time per Participant | Total Annual Burden Hours
Participant registration, initial login & reading guidelines | 10,000 | 13 minutes | 2,167 hours
Participant training (estimate 80% of those who register will undergo training) | 8,000 | 1 to 4 hours | 8,000 to 32,000 hours
Participants contributing observations (estimate all “trained”) | 8,000 | 5.2 hours (500,000 observations at 5 minutes each) | 41,667 hours
Total burden hours | | | 51,833 to 75,833 hours
Total annual labor costs (hourly rate including benefits, $31.79) | | | $1,647,782 to $2,410,742

B. Classification/problem-solving projects

Type of Collection | Number of Participants | Estimated Time per Participant | Total Annual Burden Hours
Participant registration, initial login & reading guidelines | 7,500 | 10 minutes | 1,250 hours
Participant training | 6,000 | 1 to 4 hours | 6,000 to 24,000 hours
Participants completing data collection tasks | 3,000 | 96 hours (8 hours/month) | 288,000 hours
Total burden hours | | | 295,250 to 313,250 hours
Total annual labor costs (hourly rate including benefits, $31.79) | | | $9,385,998 to $9,958,218

Grand total, annual burden hours | | | 347,083 to 389,083 hours
Grand total, annual labor costs | | | $11,033,779 to $12,368,959
Grand total, burden hours over 3 years | | | 1,041,250 to 1,167,250 hours
Grand total, 3-year labor costs | | | $33,101,338 to $37,106,878



13. Costs to Respondents

There will not be fees associated with participation in the data collections under this generic clearance. Participants will not be required to purchase any equipment to collect data, but some low-cost sensors or other technical or low-tech supplies may be necessary should participants decide to complete all of the data collection tasks. The costs to participants for materials will vary based on the data collection type (data gathering, classification/problem solving, or research subject participation) and medium (e.g., sensors, apps, or paper forms). The Agency does not expect participants to make purchases specifically for citizen science and crowdsourcing projects under this generic clearance. However, the table below reflects annual and 3-year estimates for Operations and Maintenance (O&M) costs that participants might incur should they decide to purchase equipment to fully participate in a citizen science or crowdsourcing collection under this generic clearance. The estimate is based on the following assumptions: one eighth of the expected participants in the data gathering projects purchase low-tech equipment with a maximum cost of $25 per person (e.g., use of already owned internet or data plans, or purchase of low-technology equipment like water monitoring kits), and one eighth of the expected participants in the data gathering projects purchase high-tech equipment (e.g., personal health monitors, portable/personal air sensors, or other higher-technology equipment) with a maximum cost of $500 per person.

Operations & Maintenance (O&M) Costs

Equipment | Estimated maximum cost per participant | Estimated number of participants expected to use | O&M costs
Low-tech equipment | $25.00 | 1,000 | $25,000
Low-cost personal or portable technology | $500.00 | 1,000 | $500,000
Total annual non-labor burden cost | | | $525,000
O&M costs over 3 years | | | $1,575,000
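As a hedged arithmetic check (ours, not part of the submission), the O&M figures above follow from the stated assumptions, taking the 8,000 data gathering participants from the burden table as the base:

```python
# O&M cost assumptions: one eighth of the expected data gathering participants
# purchase low-tech equipment (up to $25) and one eighth purchase higher-tech
# equipment (up to $500). Uses the 8,000 trained participants from the burden table.
participants = 8_000
purchasers_per_tier = participants // 8        # 1,000 participants per tier

annual_om = purchasers_per_tier * 25 + purchasers_per_tier * 500
print(annual_om, annual_om * 3)                # $525,000 per year; $1,575,000 over 3 years
```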



14. Costs to Federal Government

The anticipated cost to the Federal Government is approximately $144,164 annually. These costs comprise project administration and estimated contractor payments. EPA person-costs are estimated using an hourly rate for a GS-14 (step 1) based in Washington, DC, with an additional 60% for benefits. Time spent on each step may vary, as may the GS-level of the employees involved. The estimate for project administration is based on 10 projects per year at 5% of an employee’s time, using the 2,087-hour divisor [7] for an employee’s annual rate of pay.





Task | EPA costs per project ($50.41/hour) | Estimated contractor costs | Total Cost/Year (estimated for 10 projects per year)
Project Administration | $8,416 | | $84,164
Contractor costs | | $60,000 | $60,000
Total | | | $144,164
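A hedged arithmetic sketch (ours) showing how the figures in the paragraph and table above fit together, assuming a GS-14 step 1 base rate of $50.41 per hour, a 60% benefits loading, 5% of a 2,087-hour year per project, and 10 projects per year:

```python
# Reconstructs the approximate annual cost to the Federal Government described above.
# Assumptions from the narrative: GS-14 step 1 base rate of $50.41/hour, 60% added
# for benefits, 5% of a 2,087-hour year per project, 10 projects per year, and
# $60,000 in estimated contractor payments.
HOURLY_BASE = 50.41
LOADED_RATE = HOURLY_BASE * 1.60          # about $80.66/hour including benefits
HOURS_PER_PROJECT = 2_087 * 0.05          # 104.35 hours (5% time)

admin_per_project = HOURS_PER_PROJECT * LOADED_RATE   # about $8,416 per project
admin_annual = admin_per_project * 10                 # about $84,164 for 10 projects
total_annual = admin_annual + 60_000                  # plus contractor payments

# Prints roughly 8416, 84165, 144165; the table rounds these to $8,416,
# $84,164, and $144,164 (differences of a dollar are rounding).
print(round(admin_per_project), round(admin_annual), round(total_annual))
```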






15. Reason for Change

This is a new collection.

16. Tabulation of Results, Schedule, Analysis Plans


The tabulation, timeline, analysis, and publication of information collected under this generic clearance will vary by submission.


In accordance with the Presidential Memorandum on Transparency and Open Government [8], the Agency will disclose information collected under this generic clearance rapidly, in forms that the public can readily find and use, and in compliance with the data policies outlined on Data.gov [9].


Each project submitted under this generic clearance will specify the tabulation, timeline and analysis of the information collection. The information collected is for Agency scientific purposes; thus, a number of projects are likely to publish the results of analyzed data in peer-reviewed scientific journals, white papers, Agency reports, or Agency strategic research plans, which will be available for public consumption.


17. Display of OMB Approval Date

Not applicable for this request.

18. Exceptions to Certification for Paperwork Reduction Act Submissions

These activities comply with the requirements in 5 CFR 1320.9.



B. STATISTICAL METHODS

Data collection methods and procedures will vary; however, the primary purpose of these collections will be to gather qualitative and quantitative data that might help inform scientific research, assessments, or environmental screening; validate environmental models or tools; or enhance the quantity and quality of data collected across the country’s diverse communities and ecosystems to support the Agency’s mission.


  1. Universe and Respondent Selection


Statistical methods will not be used in the selection of respondents. Participants in citizen science and crowdsourcing projects are self-selected. The method for soliciting participation will be described fully in each collection request, but participation may be advertised through targeted outreach and engagement methods like standard and social media outlets, collaborations with on-the-ground partners, public talks, and word-of-mouth.


The number of participants will vary by project submitted under this generic clearance. The variation in participation is likely due to multiple factors like personal interests, accessibility, perceived burden, outreach by the Agency, and success over time. For example, the citizen science program Nature’s Notebook (USGS, OMB Control Number 1028-NEW) reported that participation increased from 40 registered observers in 2008 to 530 registered observers in 2011.


Results will not be used to directly inform Agency regulations or policies. Data also will not be generalized beyond the scope of the sample.


  2. Procedures for Collecting Information


Data collection methods and procedures will vary and the specifics of these will be provided with each collection request. Each request under this generic collection will include details on the statistical methodology for stratification and sample selection (if applicable to the collection – this is not applicable to the selection of participants), estimation procedure, degree of accuracy needed for the research purpose described in the justification, unusual problems requiring specialized sampling procedures, and any use of periodic data collection cycles (less frequent than annual) to reduce burden.



  3. Methods to Maximize Response


Participants will have complete control over their participation in citizen science or crowdsourcing projects. Participants will need to proactively seek out opportunities, respond to an email, or actively sign up for a project in order to participate. Outreach and advertising materials will provide information on how to participate but will not assume participation from anyone. Several existing citizen science and crowdsourcing projects employ engagement tactics to support continued participation and reduce non-response, including newsletters that deliver appreciation, motivation, and results to participants, and optional bi-weekly reminders to observe. The collection requests under this generic clearance may utilize some of these techniques while acknowledging that participants have full control over whether or not to participate.


Each collection request under this generic clearance will specify methods to track and increase response rates. Some collection requests will provide opportunities for participants to submit negative data, for example, information on the time and effort to attempt to obtain an observation in the event of no observation.


  4. Testing of Procedures


Pretesting may be done with internal staff or a limited number of external colleagues (less than 10). If the number of pretest respondents exceeds nine members of the public, the Agency will submit the pretest instruments for review under this generic clearance.


  5. Contacts for Statistical Aspects and Data Collection



Projects submitted under this generic clearance can consult with statisticians in the development, design, conduct, and analysis of the data collection. Statistical expertise is available from agency statisticians or contractors and the Agency will include the names and contact information of persons consulted in the specific information collection requests submitted under this generic clearance as needed.



C. Appendix


EPA Statutory Authorities


  • Clean Air Act § 103, 42 U.S.C. § 7403, authorizes research into techniques for monitoring and controlling air pollution.

  • Clean Water Act § 104, 33 U.S.C. § 1254, authorizes EPA to encourage, cooperate with and render technical services to individuals, including the general public, to promote the coordination and acceleration of demonstrations, studies and training relating to the causes, effects, prevention and elimination of water pollution.

  • Solid Waste Disposal Act § 8001, 42 U.S.C. § 6981, authorizes EPA to encourage, cooperate with and render technical services to individuals as well as public and private sector entities to promote the coordination and acceleration of demonstrations, studies, training and public education programs relating to, among other things: adverse health and welfare effects of the release of solid waste into the environment; operation and financing of solid waste management programs; planning and operation of resource recovery and conservation systems and hazardous waste management systems; production and marketing of recovered resources; reductions in the amount of solid and hazardous waste and unsalvageable waste materials; and, the development and application of improved methods of collecting and disposing of solid wastes to recover and market materials and energy from these wastes.

  • Marine Protection, Research and Sanctuaries Act § 203, 33 U.S.C. § 1443, authorizes EPA to encourage, cooperate with, and render technical assistance to public and private sector entities, including individuals, to promote the coordination of demonstrations, studies and training to minimize dumping of materials into the ocean that may unreasonably degrade or endanger human health, welfare, or the marine environment and economic potential.

  • Safe Drinking Water Act § 1442, 42 U.S.C. § 300j-1, authorizes the Administrator to conduct research, studies, and demonstrations relating to the causes, diagnosis, treatment, control, and prevention of risks to human health related to drinking water supply, and to share information and make recommendations based on this research and investigation.

  • The National Environmental Education Act, § 4, 20 U.S.C. § 5503 authorizes EPA to develop and support programs to increase environmental literacy.

  • Comprehensive Environmental Response, Compensation and Liability Act § 311, 42 U.S.C. § 9660 (as amended by Pub. L. 107-118), authorizes EPA to conduct research, and provide training and technical assistance to individuals and organizations, to facilitate the inventory, assessment, preparation and remediation of brownfields sites, including associated community involvement.



Policy support

  • 2013 Second Open Government National Action Plan - encourages Federal Agencies to harness the ingenuity of the public by accelerating and scaling the use of open innovation methods such as citizen science and crowdsourcing: https://www.whitehouse.gov/sites/default/files/docs/us_national_action_plan_6p.pdf

  • OMB Memo M-11-07. Facilitating Scientific Research by Streamlining the Paperwork Reduction Act Process. December 9, 2010. Citizen science and crowdsourcing are in line with the Paperwork Reduction Act’s intent to “ensure the greatest possible public benefit from and maximize the utility of information created, collected, maintained, used, shared, and disseminated by or for the Federal Government.”

  • OMB Memo M-10-06. Open Government Directive. December 8, 2009. Promotes open government and the use of new technologies.

  • OMB Memo M-15-16. Multi-Agency Science and Technology Priorities for the FY 2017 Budget. July 9th, 2015. “Agencies are encouraged to use approaches to foster innovation such as Grand Challenges, incentive prizes, citizen science, and collaboration with members of the Maker Movement.”





[1] OMB Memo M-10-06. Open Government Directive. December 8, 2009. https://www.whitehouse.gov/sites/default/files/omb/assets/memoranda_2010/m10-06.pdf

[2] https://www.whitehouse.gov/sites/default/files/docs/us_national_action_plan_6p.pdf

[3] OMB Memo M-11-07. Facilitating Scientific Research by Streamlining the Paperwork Reduction Act Process. December 9, 2010.

[4] http://www.epa.gov/quality/

[5] Typology adapted from: Teresa Scassa and Haewon Chung. 2015. Typology of Citizen Science Projects from an Intellectual Property Perspective: Invention and Authorship Between Researchers and Participants. Wilson Center, Commons Lab, Case Study Series, Vol. 5.

[6] http://www2.epa.gov/osa/basic-information-about-human-subjects-research

[7] http://www.opm.gov/policy-data-oversight/pay-leave/pay-administration/fact-sheets/computing-hourly-rates-of-pay-using-the-2087-hour-divisor/

[8] FR Doc. E9-1777, Presidential Memorandum for the Heads of Executive Departments and Agencies, January 26, 2009.

[9] https://www.data.gov/data-policy


