Data Monitor December 2025

Well CAAD, we’ve made it (almost) to the end of 2025, a year in which the disinformation landscape was comprehensively mapped not only by our friends in the NGO world, but also by impressive, interdisciplinary academic efforts, like the Social Media Lab’s investigation toolkit.

The International Panel on the Information Environment released a foundational 127-page report this year on who is creating disinformation, what they’re saying, where it’s spreading, who it’s targeting, what effects it’s having and, crucially, what solutions might actually work.

Even more comprehensive, the Climate Social Science Network published its free, 400+ page e-book assessment of Climate Obstruction around the world, documenting how Big Carbon has sabotaged climate action at every level – but also what groups (like CAAD!) are doing to stop those who are stopping the global public from getting the climate action that 89% of the planet wants from their political leaders. So if you’re looking for a cozy read (or podcast!) to curl up with this holiday season, we’ve got you covered!

If you’d rather keep the reading to a minimum, then you can catch up with CAAD’s annual pre-COP report, Deny, Deceive, Delay: Demystified, a quick read of just 15 pages. Because as we close out 2025 and look toward 2026, it’s time to move from assessing just how unfair the information playing field is, and start leveling it out.

2026 is the year to move beyond describing an information landscape, with problematic platforms polluted with disinformation and hate by design, to really start fixing it.

Sound too hard? Don’t worry, we’ve already gotten started!

For example, CAAD and Roots sponsored a fellowship for countering disinformation, and one fellow uncovered a ‘pink slime’ fake news ring in Argentina promoting fossil fuels. Apparently, those findings made their way into the hands of Argentinian Congresswoman Vilma Ripoll, who incorporated them into a draft bill regulating disinformation that will be submitted to the National Congress next year.

That’s the latest – but certainly not the only – example of policymakers finally acting in 2025 – and a warning shot for what’s to come in 2026.

Promises Expired, Fines Applied: Info Integrity Hits Different

This year made one thing we’ve long known unmistakably obvious: Big Tech is not playing nice, and is undeniably playing politics. With the most problematic platforms showing little interest in self-regulation, to put it mildly, 2025 marked a shift from norm-setting to enforcement of regulations.

We no longer need to debate whether to regulate greenwashing; the question now is what to do with the settlement money once regulations bite. In New York State, for example, academics Jennifer Jacquet and Sonali Shukla McDermid argued that a $1.1 million JBS USA greenwashing settlement should fund a public advertising campaign about greenwashing meat – using industry money to expose industry tactics.

After years of mapping a polluted information ecosystem, 2025 also delivered early clean-up efforts. CAAD’s Climate Information Integrity Summit in Brazil put the issue onto the COP30 agenda and into UN leadership discussions, including during the General Assembly.

While the summit ultimately succumbed to the obstructive forces of petrostates and failed to get “fossil fuels” into the final text, the text acknowledged, for the first time, the role of disinformation in sabotaging climate policy. That crack matters. Walls don’t collapse all at once.

Consider The Hague. Not the biggest city in the world, but its ban on fossil fuel ads may turn out to be the hole that bursts the dam of advertising holding up the fossil fuel industry’s social license to operate. A CAAD survey and peer-reviewed research both found the policy is broadly popular across Europe, and when it was upheld in court over industry objections, it served as a warning: the industry’s plan of borrowing the tobacco industry’s disinformation playbook may ultimately end the way the tobacco story did – with an ad ban. (…What’s the opposite of irony?)

2025 was the year that information integrity accountability infrastructure grew teeth, and started using them. CAAD’s policy guidelines of safety, transparency, and accountability stopped being aspirational and started producing consequences.

In Europe, regulators opened an investigation into Google over its use of AI, while a Berlin court handed Xitter a loss, confirming that researchers should have access to data under the Digital Services Act – so we can at least all study how platforms are being weaponized against, in this case, elections.

On the other side of the world, in Australia, CAAD (and others, of course) informed a Senate inquiry on information integrity, which put an executive of Rupert Murdoch’s News Corp in the hotseat, where he denied being part of the denial machine.

We’ll revisit that claim in 2026, when Australia finds itself co-hosting COP31.

Turns Out, ‘Trust Us’ Doesn’t Work for the Planet

When Meta rang in 2025 by throwing out fact-checkers, it prompted reminders about the political beast running the company, and over the course of the year Big Tech only further confirmed its hostility to information integrity.

March’s data monitor showed how climate disinformation was used as a deliberate electoral tactic in Germany, driven largely by the far-right AfD and amplified by social media algorithms, legacy media gaps, and transatlantic far-right actors – a story we are seeing over and over with increasing fervour.

As if we needed any more confirmation that bots impact elections, our analysis found a bot network propagating a climate disinformation conspiracy claim about Canadian PM Mark Carney, who, under consistent pressure from campaigns like this, has since made a number of environmental rollbacks.

April’s data monitor uncovered a raft of greenwashing disinformation in Brazil. Cosy and warm sustainability marketing from some of the country’s biggest polluters was compared against the reality – poor environmental records and lobbying to weaken environmental protections.

Then, May’s data monitor showed how a crisis like the Iberian blackout can be easily exploited by disinformation actors. In this case, media and online networks artificially boosted claims that the blackout was caused by renewables. Several months later, we found that 7 in 10 people in Spain still believed at least one false narrative about the cause of the blackout, and that vast majorities in the UK and Spain want governments to hold social media companies accountable for stopping the spread of harmful false content during emergencies.

In June, far-right networks known for ‘issue stacking’ – talking about whatever controversial issue of the day will garner more engagement – attacked the Madleen aid voyage to Palestine, specifically smearing and abusing Greta Thunberg as a way to discredit climate science and climate action.

Following a crucial Media Matters study into the right-wing dominance of new media in the USA, in July we found how major right-wing podcasts and YouTube channels have become a powerful vector for clean energy disinformation, repeatedly attacking renewables while presenting themselves as apolitical entertainment.

In September, we investigated how, in Europe, coordinated, top-down campaigns backed by powerful financial interests – often from overseas – are artificially inflating fringe climate opinions rooted in misinformation. Big Tech platforms are allowing this ‘digital shoving’ of the debate to take place in the pursuit of ever larger profits. So moving into 2026, we should take to heart a finding from a new study: “If social media platforms are to remain a part of modern society, people should recognize that the opinions they see are not representative of public opinion.”

What Now?

Climate disinformation isn’t just background noise – it’s a policy assassin. Sometimes it drains political will. Other times, it helps elect governments that embrace anti‑science, gutting research institutions and civil society along the way.

In late 2025, the EU finally put its money where its mouth is, handing Elon Musk’s X a €120 million fine – the first major enforcement under the Digital Services Act, covering deceptive design, opaque ad data, and blocking researchers’ access to public data.

Looking ahead to 2026, DSA enforcement is no longer theoretical. Whether platforms comply (or dodge) these rules will shape whether disinformation keeps scaling or finally meets accountability.

At the same time, we know the fossil fueled billionaires are also backing anti-trans groups, so we’ll need to be even more intersectional as they exploit the “Carbon Bros” to stoke the so-called “culture wars” around the world.

We’ll need to remain flexible but resolute in the new year, as the calls get louder for popular policies – like banning fossil fuel ads as we once banned tobacco ads – and for protecting the public from the “wild conspiracy theories” that are making disasters more dangerous.

In Other News