As Meta prepares to phase out its third-party fact-checking program, the company’s dwindling team of fact-checkers faces an unprecedented surge of misinformation surrounding recent wildfires in the United States. From claims that the fires were intentionally set by government operatives to conspiracy theories involving space lasers, the misinformation is spreading rapidly across Facebook and Instagram, creating significant challenges for the few remaining workers tasked with combating it.

A Final Push Amid Uncertainty
For many fact-checkers, this final stretch of their work is bittersweet. While they’ve spent years debunking false claims and striving to improve digital literacy, their roles are set to be eliminated as part of Meta’s ongoing cost-cutting measures. The company announced earlier this year that it would sunset its contracts with third-party fact-checking organizations, citing a shift in priorities and a focus on automated tools.
Despite the looming end of their roles, fact-checkers are determined to make an impact in their final days. "The wildfire misinformation is some of the most dangerous we've seen," said one anonymous fact-checker. "People are scared, and these false narratives spread like wildfire themselves — pun intended."
A Wave of Misinformation
The wildfires, which have devastated large parts of California and other states, have become a hotbed of conspiracy theories. Viral posts have accused government agencies of using the fires as a pretext to seize land or force mass evacuations, while others allege that the fires were caused by secret military technology.
Some posts have gone further, claiming that the fires were intentionally set to clear the way for high-speed rail projects or smart cities, claims with no basis in reality. These theories often gain traction in communities already distrustful of the government or traditional media.
“One video claiming ‘directed energy weapons’ were used has been shared thousands of times in just a few hours,” said another fact-checker. “These kinds of claims not only mislead people but can also erode trust in emergency services and public officials.”
The Consequences of Conspiracy Theories
Experts warn that wildfire-related misinformation can have serious consequences. False claims can lead to panic, distract from evacuation efforts, and undermine trust in the scientists and officials working to address the crisis. In some cases, misinformation has even resulted in harassment of firefighters and utility workers, who have been wrongly accused of starting fires.
Meta’s Response
Meta has stated that it is committed to combating misinformation, even as it transitions away from human fact-checking. The company is reportedly investing in AI tools to identify and reduce the spread of false information, but critics argue that automated systems lack the nuance and context provided by human oversight.
“AI can flag content, but it can’t explain why something is false in a way that resonates with people,” said a former fact-checker. “That’s the human element we’re losing.”
Looking Forward
As Meta winds down its fact-checking program, the wildfire misinformation surge serves as a stark reminder of the ongoing challenges posed by false information online. Advocates are calling for stronger regulations and more robust partnerships between tech companies and independent organizations to address the problem.
For the fact-checkers whose roles are coming to an end, the fight against misinformation remains a deeply personal mission. “We’re doing everything we can with the time we have left,” said one worker. “If we can stop even one harmful narrative from spreading, it’s worth it.”
With wildfires and conspiracy theories both spreading rapidly, the battle over truth online remains as critical as ever.