For years, experts have warned about the potential for generative AI to be used in disinformation campaigns, but one man is showing just how dangerous this can be. John Mark Dougan, a former US Marine and police officer, has reportedly spent the last eight years creating fake news to sow discord and disrupt elections from his hideout in Moscow. His stories include explosive fabrications about the First Lady of Ukraine, illegal FBI operations, and Russian military campaigns.
Dougan’s activities came to light through a BBC investigation. His malicious online activity began after he left the Palm Beach County Sheriff’s Office in Florida: he started a website to collect leaked information about the law enforcement agency, eventually posting officers’ private information online and fabricating rumors about their activities. This led to an FBI raid on Dougan’s home in 2016, prompting his flight to Moscow.
Since then, Dougan’s disinformation campaign has expanded, drawing in other threat actors. He appears on Russian think tank panels, at military events, and on a TV station owned by Russia’s Ministry of Defense. Dougan is behind several fake news outlets and stories aimed at turning internet users against certain political figures. The BBC’s investigation revealed that Dougan’s campaign has created dozens of sites with names meant to sound quintessentially American, such as the Houston Post and the Chicago Crier, using AI to generate thousands of news stories. To appear legitimate, some stories merely regurgitate content from other sites or retell real stories with a conservative slant.
Some stories spin absurd tales: the First Lady of Ukraine buying a $4.8 million Bugatti with American aid money, or the FBI illegally wiretapping Trump’s Mar-a-Lago resort. Visual aids, like YouTube videos and deepfakes, are sometimes used to lend credibility. Some of Dougan’s fake news has even been picked up by real news outlets, inadvertently lending the fabrications momentary credibility. These stories often gain traction among far-right social media users, particularly those who are explicitly pro-Russia. Many people share headlines without reading the associated articles, making it easier for misinformation to spread.
Dougan has denied responsibility for the campaign at times but bragged to the BBC about it at others, showing no concern about the publicity. When asked if he’d slow the spread of his made-up stories, he simply responded, “Don’t worry—the game is being upped.” He also denied being paid by the Russian government to spread lies.
Media experts, including those at Clemson University’s Media Forensics Hub, believe Dougan is just one part of a larger puzzle. Darren Linvill, co-director of the Hub, told the BBC, “He may be just a bit player and a useful dupe, because he’s an American.”
Regardless of Dougan’s exact role, his cryptic comment comes at a crucial time. With the 2024 US presidential election approaching, the campaign’s fake stories are shifting toward American politics. This serves as a stark reminder to voters and social media users to double-check the legitimacy of a headline or “screenshot” before sharing it.