Officials: Fake Kamala Harris videos part of Russian influence operations

U.S. intelligence officials have confirmed that Russia produced fraudulent videos targeting Vice President and Democratic presidential nominee Kamala Harris as part of its influence operations ahead of the upcoming election, reports The Record, a news site by cybersecurity firm Recorded Future.

Aside from releasing videos falsely implicating Harris in a hit-and-run accident, Russia also disseminated phony videos of her speeches, said an Office of the Director of National Intelligence official. The official noted that Russian influence actors were slow to respond to Harris' nomination, in contrast with a Microsoft report last week that described attackers' immediate pivot following President Joe Biden's withdrawal from the race.

While Russia has produced the most extensive artificial intelligence content for text-, image-, audio-, and video-based disinformation efforts, no state-backed threat actors have harnessed AI as a "revolutionary influence tool" due to persistent challenges in maintaining concealment, circumventing tool restrictions, and developing advanced models, the ODNI official added. Intelligence officials also warned that influence campaigns will persist beyond the polls.
