Misinformation and disinformation are major threats to climate action.
Climate change misinformation and disinformation distort public perception of climate science and solutions and weaken the public mandate for effective domestic and international policies aligned with the goals of the Paris Agreement.
Universal Definition
Climate disinformation and misinformation refer to deceptive or misleading content that:
- Undermines the existence or impacts of climate change, the unequivocal human influence on climate change, and the need for corresponding urgent action according to the IPCC scientific consensus and in line with the goals of the Paris Climate Agreement;
- Misrepresents scientific data, including by omission or cherry-picking, in order to erode trust in climate science, climate-focused institutions, experts, and solutions; or
- Falsely publicizes efforts as supportive of climate goals when those efforts in fact contribute to climate warming or contravene the scientific consensus on mitigation or adaptation.
Our Policy Asks
Governments should encourage social media, advertising technology, and broadcast and publishing companies to take these steps:
Adopt the following universal definition of climate disinformation as deceptive or misleading online behavior that:
- Undermines public understanding of the existence or impacts of climate change, the unequivocal human influence on climate change, and the need for corresponding urgent action to reduce global warming emissions (mitigation) and prepare for the current impacts and those we must expect (adaptation), according to the IPCC scientific consensus and in line with the goals of the Paris Climate Agreement;
- Misrepresents scientific data, including by omission or cherry-picking, to erode trust in climate science, climate-focused institutions, experts, and solutions; or
- Falsely publicizes efforts as supportive of climate goals when those efforts in fact contribute to climate warming or contravene the scientific consensus on mitigation or adaptation, including through greenwashing.
Produce, publicize, and resource a transparent company plan to stop the spread of climate disinformation, including these elements:
- Community content standards with accompanying monitoring and evaluation indicators.
- An enforcement mechanism for violation of these standards, including downranking.
- A plan to monitor and respond in all languages in which the company hosts content.
- An up-to-date, publicly accessible, and functional ad library that captures all paid ads, including political and issue-based ads.
- An explanation of any fact-checking processes.
- A user-support system for flagging content that violates community content standards, and a commitment to respond to users.
- Strong labor standards for content moderation work.
Report on the prevalence of climate disinformation on their products and services, including (but not limited to): coordinated information operations, including those orchestrated by or affiliated with the fossil fuel lobby; repeat-offender activity from commercial disinformers, media outlets, and other high-traction or ‘verified’ accounts; state-sponsored influence efforts and interference; and the enforcement of content moderation policies on their services.
Allow researchers access to all data needed to conduct research that contributes to the detection, identification, and understanding of systemic climate disinformation risks, as well as to the assessment of the adequacy, efficiency, and impacts of risk mitigation measures taken.
Prevent the monetization of climate disinformation, ensuring that tech companies do not profit from hosting or amplifying such content and weakening the financial incentive for disinformer networks. Mitigation efforts should cover advertisements, ad tech tools and placements; revenue-sharing schemes with content creators; merchandising; and any other monetized activity on their products and services.
Implement platform-wide inoculation efforts that expose misinformation and disinformation tactics, increase users' resilience to false or misleading content, and prioritize scientifically credible content. Companies should provide regular and transparent reporting to regulators on the effectiveness of these measures.
Implement strong labor policies for content moderation staff, including fair pay, clear contracts, accessible mental health/counseling services, sustainable working conditions, and union representation.
Produce and enforce transparency, safety, equity, and accountability measures governing company use of artificial intelligence and other emerging technologies, especially regarding their potential to accelerate the spread of climate-specific and other disinformation and to increase energy use.