Several high-profile political influencers, each with millions of followers, were collecting a paycheck from the Kremlin but were apparently none the wiser. According to an indictment filed earlier this week, the personalities—including Tim Pool, Dave Rubin, and Benny Johnson—were secretly funded by Russian state media.
The United States Department of Justice has charged two Russian citizens with directing a $10 million effort to influence the 2024 election by spreading disinformation to U.S. audiences. RT, formerly Russia Today, paid the money to Tennessee-based Tenet Media, which managed a network of commentators who focused on Western political and cultural issues.
The indictment identified Kostiantyn Kalashnikov and Elena Afanasyeva as the RT staffers, the BBC reported. Tenet Media was not directly named in the indictment, and it did not respond to a request for comment.
The U.S. influencers have claimed they were victims of the alleged plot, insisting that they maintained full editorial control over their content.
The Old Russian Playbook
The influencers likely would have produced the same content even without a paycheck from Moscow, but the money allowed their voices to be amplified on social media.
“We should absolutely be concerned about Moscow supporting influencers on social media. Even now that we know about it, our knowledge is not going to cure the problem, and these pieces of information are already out there. The damage is likely larger and more persistent than people realize, as these biased views have already spread widely,” warned Dr. Dan Ariely, professor of psychology and behavioral economics at Duke University.
Spreading misinformation has long been a tool of the Kremlin, one drawn from the same Russian playbook that has been in use for centuries.
“The concern with Moscow is its goals. For the last one hundred years, Russia has used the same set of tactics to discredit, sow division and chaos, introduce mistrust, and seek to undermine democratic institutions. Only the tools have changed—social media,” explained Morgan Wright, chief security advisor at cybersecurity firm SentinelOne.
Loss Of Influence
Another question worth asking is whether Moscow supported the influencers in order to spread misinformation through them, or whether it sought to discredit them once a link to Russian money was established.
“History is filled with similar attempts by many nations, including the U.S.A.’s ‘spoiling operation,’ approved by the 40 Committee, which sought to alter elections in Chile and prevent the spread of communism,” said Wright. “Tactics and tools are more accessible to discern than goals. It’s still not proven what the actual Russian goals were, and that could be one of the goals.”
For now, the influencers are claiming they were victims, and it is possible their supporters will see them as such. It isn’t clear whether the disclosure of Russian funding will cost them any followers.
“Even when we’re aware that someone is biased or has been paid for their opinion, we still tend to trust them more than we should,” said Ariely. “While we might take their views with a grain of salt, we don’t discount them enough. This means that even if we know certain influencers are being paid to promote specific viewpoints, we’ll likely still give their opinions more weight than they deserve. We end up entertaining ideas from sources with agendas very different from our own more seriously than we should, even when we know we shouldn’t fully trust them.”
Can It Be Countered?
There is the old standby that knowing is half the battle, but in an era of social media, where people believe what they want to believe, it could take much more to counter such misinformation campaigns.
“We need to demand that influencers be transparent about the source of their influence in its entirety—from their funding sources to their paid sponsorships,” suggested Ariely.
Transparency may not solve the problem entirely, but influencers should still prominently disclose sponsorships.
“It would make influencers more cautious about accepting certain deals and allow their audiences to better evaluate the content,” added Ariely.
The greatest challenge may be in managing trust in digital spaces, especially with political influencers, where election results could be affected.
“Simply teaching people not to trust isn’t viable or desirable, as trust is a fundamental and positive aspect of human nature,” Ariely continued. “Instead, we need to build information systems that are worthy of our trust. Rather than changing human nature, we should adapt our online environments to deserve the trust we’re predisposed to give.”