In this webinar, DarkOwl analysts explore the disinformation landscape on the dark web in the context of the upcoming U.S. presidential election. What emerges is a complex, multifaceted online space characterized by a variety of actors, ranging from nation states to American citizens and U.S.-based conspiratorial political movements. All of the above play key roles in both creating and amplifying mis- and disinformation, which has seeped from the deep and dark web onto the surface web, and vice versa. As a number of prominent social media platforms maintain policies of limited disinformation regulation, false narratives previously concentrated on the dark web and alternative social media platforms have become mainstream, thereby gaining traction and reaching greater audiences. Combined, these factors reflect a complex environment in the lead-up to the election and highlight the importance of identifying and combating mis- and disinformation.
Make sure to check out our full report on this topic.
NOTE: Some content has been edited for length and clarity.
Erin: We’re excited to talk about this topic. I’m Erin, the Director of Collections and Intelligence at DarkOwl, and I’m joined by my colleague Bianca, who works on all of our investigations and services and has been digging into this topic quite a bit. So obviously, it’s November next week, which I find insane, and we’re just about two weeks out from the election. There’s a lot going on out there in mainstream media, obviously, but we wanted to take a deep dive and see what we’re seeing from our side of things on the dark web. So, with that being said, I think we can dive right in. Bianca, I guess the first question would be: what does the disinformation landscape look like right now, and what role is the dark web playing?
Bianca: Well, during this election period, as with previous elections in recent years, particularly since 2016, we’re seeing disinformation narratives gaining pretty significant traction. And disinformation, as we know, can play quite a significant role in influencing voters. Many of the false narratives that we’re seeing are originating on the dark web and dark web adjacent spaces, especially Telegram. Because of that, in order to get a comprehensive picture of the online disinformation landscape and the role it can play in influencing voters, it really is vital to examine the role that the dark web plays in spreading that disinformation.
I think you can broadly divide the main groups into two categories: nation states and domestic actors. Starting with the nation states, two of the main actors we’re seeing are Russia and Iran. Russia, of course, has a history of leading influence operations against the US, as we’ve seen since 2016. Russia’s strategy this year, though, it’s worth noting, does seem quite different compared to previous years. Most notably, they really seem to be taking advantage of domestically produced conspiracy theories more and more this year, as opposed to what we’ve seen previously from them: creating their own false narratives and then sharing and disseminating those narratives. I think that shift in tactics is a reflection of the domestic disinformation landscape that we’re seeing right now, where you have these absurd conspiracy theories entering the mainstream and then being viewed by millions of people online. So really, nation states like Russia that are leading these foreign influence operations are recognizing that, unfortunately, these domestically produced conspiracy theories are something they can take advantage of.
Moving on from Russia to other nation-state actors, we are of course seeing Iran emerging as a key player right now in election influence operations. In the lead-up to November 5th, Iran has already carried out cyber-attacks against election campaigns, with the DOJ just recently announcing the indictment of, I believe, three Iranian hackers for targeting former President Donald Trump’s campaign. Importantly, though, Iran is also actively sharing content that, like Russia’s, is aimed at sowing discord in the US, something we’ve seen from Russia increasingly since 2016. And for Iran specifically, Microsoft researchers identified websites associated with Iran that are posing as American news sources and spreading disinformation.
So we’ve got Russia and Iran, and continuing on with nation states, we really shouldn’t forget that China is also leading its own election-focused influence operations. One of its influence operation campaigns has been active since 2017, and we’ve recently been seeing increased activity from that campaign. But I do want to highlight that researchers seem to believe China’s efforts will likely be more restrained compared to Russia’s and Iran’s, and they don’t really seem to be aiming to undermine one campaign over another. So whereas you see Russia attempting to undermine Vice President Kamala Harris’s campaign and Iran attempting to undermine former President Donald Trump’s campaign, we’re not really seeing that lean or favoring from China to the same extent. So those are the main nation-state actors.
Erin: It’s interesting as well, sorry to interrupt you, how the landscape has changed since 2016, right? I saw some reporting on Russia suggesting that they didn’t necessarily get what they wanted out of the Trump presidency, and that may be shaping what their goals are and how they’re reacting now. It seems like, as you were just saying, they’re focusing more on creating that conflict internally in the US, as well as still promoting Trump, but it’s interesting how they’ve changed their tactics.
Bianca: Yeah, that’s a great point. And they’re just continuing to sow discord; that seems to be the number one priority, really, along with undermining faith in the election process and in democracy. So that’s something we’re still seeing from them. Those, to answer your question, are the main nation-state actors in the disinformation landscape right now.
But I do also want to highlight that second bucket I mentioned: domestic actors. There are US-based individuals and political movements generating disinformation related to the election and the candidates right now. For instance, the far-right conspiratorial movement QAnon, which first appeared in 2017, seems to have effectively entered the mainstream at this point, and its conspiracy theories are seen across the surface web. A lot of the disinformation in the current landscape is coming from these far-right conspiratorial movements. So, to answer your question, I’d say those are the two main buckets: nation-states, but then also domestic actors.
I’d say broadly you can group the main narratives into two categories: those that are questioning election integrity, and those that are targeting presidential candidates. The first category covers essentially all of the disinformation questioning election integrity, such as unfounded claims of voter fraud, which of course was also a very dominant narrative in 2020; we’ve seen that narrative persist and increasingly enter the mainstream. Some of those narratives are being amplified by foreign actors, but American citizens themselves are also responsible, I think, for a lot of that amplification. The second category, broadly, is disinformation aimed at undermining either Vice President Kamala Harris’s campaign or former President Donald Trump’s campaign. To give an example, you have Russia spreading disinformation meant to support Trump and undermine Harris, and at the same time Iran spreading disinformation meant to support Harris and undermine Trump. To give a more specific example, one of the most recent pieces of disinformation aimed at undermining a candidacy was a staged video created by Russia that falsely accused Governor Tim Walz of sexual misconduct. That was a story in the news this week. The video has already been debunked, but it nonetheless gained hundreds of thousands of views on Twitter and has been shared on the dark web and in groups on Telegram. So, I’d say those are really the two main categories that we’re seeing right now.
Erin: I think with AI and everything, it really highlights how easily videos can be made and shared these days. And by the time they’re debunked or shown to be false, the damage is almost done; the genie’s out of the bottle. So definitely concerning. But you just touched on the dark web and Telegram. What are we seeing there?
Bianca: Well, to address Telegram, right now we are seeing lots of groups there, especially far-right ones, that are spreading disinformation meant to sway voters. Again, some of that disinformation is coming from nation states. There are Russian news bots in a lot of these channels sharing headlines and articles that are false and have no basis in fact; you’ll see Russian bots sharing RT News content, RT News of course being Russian-funded propaganda. And then you’ll also have some of these same Telegram groups and channels sharing disinformation that originates from U.S.-based individuals and, again, conspiratorial movements like QAnon, going back to the role that domestic actors are playing in addition to nation-states. It’s really interesting that a lot of the conspiratorial content we’re seeing on spaces like Telegram is leaking into the surface web, and vice versa; there is a lot of content overlap. And that’s concerning, given that there used to be a much clearer distinction between the surface web and dark web adjacent platforms like Telegram. So, you’re seeing a lot of interaction in terms of the content on both spaces.
Erin: I think that’s an interesting point, right? Because we tend to think of the dark web and dark web adjacent platforms like Telegram, where there’s limited oversight (although obviously that seems to be changing at the moment), as places where people want to hide their intentions and stay anonymous. And with this, we’re really seeing people move over and have less concern about hiding their identity. How do you see that happening, and why do you think that’s happening?
Bianca: I think it’s not surprising that we’re seeing anonymity being weaponized to spread disinformation, right? It’s more difficult to attribute disinformation to a specific group, nation state, or individual if they’re remaining anonymous. And that’s not just on the dark web; we’re also seeing that anonymity on the surface web, with users on Twitter, now X, spreading disinformation while hiding their true identity. That’s become a lot easier on Twitter especially, where the verified checkmarks don’t signify reputability anymore; you can just buy the checkmark. So it’s easier to stay anonymous and sell yourself as a reputable source.
I did want to touch back on Telegram, though. I think it’s not surprising that we’re seeing a lot of disinformation there. It’s worth flagging that just a few months ago, in August, the app’s founder was arrested and charged in France in relation to an investigation into criminal activity on Telegram. So it’s really not just disinformation being shared on the platform; the main concerns right now are also violent extremist content and child sexual abuse material that we’re seeing on Telegram. But in terms of disinformation, I think it’s worth highlighting that one of the main concerns about Telegram is the sheer size of the groups and channels there. Channels don’t have a limit on the number of subscribers, and groups can have, I think, as many as 200,000 members, which is massive, right? And that scale means that disinformation can very quickly reach large audiences and then get shared and amplified by these massive groups over and over and over again. So overall, Telegram is absolutely hosting a lot of the disinformation we’re seeing regarding the election, whether that’s false claims of voter fraud or disinformation targeting presidential candidates. And that’s definitely something to be concerned about.
Erin: Yeah, and I think we’ve definitely seen Telegram being used in other arenas in that way as well. The Israel-Hamas war is an excellent example of disinformation, and even actual news, being shared more quickly on Telegram than on mainstream media. Someone was asking me earlier this week, actually, what I think is next after Telegram now that the CEO’s been arrested, and honestly, I don’t think people are going to move, or not quickly, because there are too many people in too many groups, and they’re too well established; I think it will be difficult for them to recreate that on any of the other apps that are out there. But it’s definitely having an impact, I think, on a lot of the things that are going on. So that’s a really interesting insight.
Bianca: Conspiracy theories are significantly distorting the information landscape right now, in the lead-up to the election. And as you noted, a lot of them are gaining a lot of traction. To give a good example of the prominence of conspiracy theories right now, look at the information landscape we saw during Hurricanes Helene and Milton. You had far-right groups and individuals spreading disinformation claiming that the US government was using weather control technology to steer the hurricanes toward Republican voters. And you had, as you noted, prominent figures reiterating these theories; there were politicians and public figures amplifying that conspiracy theory. Former President Donald Trump claimed that hurricane relief funds were being spent on illegal migrants. Having public figures reiterate those conspiracy theories lends them more credence, right, and makes it easier for them to gain traction, even though they are completely false. A lot of these conspiracy theories gained millions of views on Twitter and were reshared by prominent figures in the Republican Party and also by Twitter’s own owner, Elon Musk. And a lot of the most viral posts were from far-right individuals sharing often xenophobic and racist conspiracy theories. So I think the fact that there are millions of people engaging with this content, on Twitter especially, and amplifying and agreeing with the conspiracy theories is very concerning. It’s ultimately a reflection of the divisiveness that we’re seeing ahead of the election. What we saw with Hurricanes Helene and Milton was effectively the weaponization of tragic events to influence voters ahead of the election. That weaponization unfortunately worked and reached a massive audience, and it of course also had real-world implications, with meteorologists receiving death threats. So absolutely, conspiracy theories are playing a key part in this disinformation landscape right now.
Well, that’s a really interesting question because, of course, no political party is immune to conspiracy theories. But based on the research we’re doing right now, far-right individuals, including public figures and Republican members of Congress, are dominating the disinformation landscape, on the dark web and, importantly, also on the surface web; like I said, there is a lot of content overlap between those two spaces. A lot of the dominant conspiracy theories we’re seeing right now are rooted in far-right ideas. Again, in the information landscape around the response to Hurricanes Helene and Milton, we saw a lot of conspiracy theories and disinformation aimed at undermining the Biden-Harris administration and the Harris-Walz presidential campaign. And on dark web adjacent platforms like Telegram, far-right groups are also dominant in terms of election disinformation. The groups spreading significant disinformation, and those with the largest numbers of subscribers, are far-right groups, as we’ve seen up until now. That’s consistent with findings that this type of disinformation tends to be particularly prevalent and toxic in the far-right online space.
Turning to left-wing conspiracy theories, the most prominent one I’d say we’ve seen up until now was the baseless claim that the July 13th assassination attempt against former President Donald Trump in Pennsylvania was staged by the Trump campaign. Interestingly enough, a lot of the chatter surrounding that unfounded conspiracy theory was on Twitter/X rather than on the dark web. Ultimately, no political movement is free of conspiracy theories, but the ones gaining the most traction right now do appear to be far-right conspiracy theories.
Erin: Yeah, I feel like the far right are just a lot better at organizing and weaponizing things like social media and Telegram, etc. We did a lot of work to try to balance this out and see what we could find from left-wing groups out there talking, and maybe they’re just better at hiding what they’re saying, or maybe they’re just not doing it in the same way, but it’s interesting how it does always seem to lean to that far-right side.
Bianca: Yes, absolutely. For more context, earlier this month the DOJ announced that they had arrested an Afghan national based in Oklahoma City, like you said, for plotting an attack on election day on behalf of ISIS. He was arrested by the FBI after purchasing two AK-47s with his brother-in-law, who was an accomplice, and the suspect admitted that he was going to carry out the attack on election day and expected to die in that attack and go down as a martyr. In terms of his connections to Telegram, the suspect, interestingly, was very active in pro-ISIS Telegram groups and allegedly saved ISIS propaganda from Telegram, as was noted in the indictment document, to his iCloud account and, I believe, also to his Google account. He had also been in contact with an ISIS associate via Telegram who was giving him guidance regarding the attack he was plotting. So there are definitely Telegram connections there, and it’s ultimately not that surprising given that Telegram is notorious for being a hotbed of extremist activity, particularly for ISIS. There are lots of pro-ISIS groups there, and unfortunately not just pro-ISIS groups but also a lot of domestic extremist groups, as I noted, that being one of the main issues leading to the CEO’s recent arrest in France. But absolutely, the individual had ties to individuals in ISIS, and those connections were through Telegram.
Erin: Yeah, it’s interesting how we see these groups really using Telegram and how the arrest of the CEO may impact that. I mean, we definitely saw, after the announcements that Telegram is going to cooperate with law enforcement, individuals talking about moving to other messaging platforms. As I said, I’m not sure that they’re all going to move, but I think it’s interesting that they’re having those conversations, because Telegram really has been that hotbed. Obviously, we’re talking about elections now, but I think you can go to any big event that’s happened, or any kind of extremist group, and find some kind of Telegram footprint for them at the moment.
Bianca: Well, in 2016 we of course had Russia leading extensive disinformation operations against the U.S. in an effort to interfere with the presidential election, and, as you mentioned, the aim of those campaigns was to sow discord and undermine American democracy. They used bots and intelligence officers masquerading as American citizens to spread disinformation and, again, exacerbate divisions. And those operations have not stopped, right? We’re still seeing that activity today. But what’s different now, in 2024 compared to 2016, is that other nation-states have significantly ramped up their influence operations as well, as I mentioned, particularly Iran, and they’re engaging in similar large-scale campaigns. Iran in this election has really emerged as a prominent actor in the disinformation landscape in the lead-up to November 5th. They’ve already carried out cyber-attacks against presidential candidates’ campaigns, and they’ve actively disseminated disinformation meant to sow discord among American voters, like Russia did in 2016. And as I mentioned, we’ve also seen China similarly amplifying divisive rhetoric, and there are Chinese-linked influence operations and campaigns that are spreading disinformation and conspiracy theories.
So, to answer your question, this year is ultimately quite different from 2016, just in terms of the variety of actors we’re seeing engage in large-scale influence operations. But also, importantly, I think what’s particularly concerning right now, and especially different from 2016, is the way that, as I’ve noted, conspiracy theories have effectively become mainstream. That’s not to say that 2016 was devoid of conspiracy theories. There were, of course, conspiracy theories in 2016, and there will always be conspiracy theories. But the scale of their reach today is on a completely different level. As I mentioned, on mainstream platforms, particularly X, so not just the dark web, false claims about presidential candidates and about the validity of the election are gaining millions of views. And part of the reason their reach is so significant is that you have prominent US-based individuals amplifying those conspiracy theories and allowing them to gain even more traction. Because of that, these conspiracy theories have entered the mainstream and are not just in the dark corners of the internet anymore. So I think that’s really the main difference between 2016 and 2024.
Erin: Yeah, I feel like domestically, people are just more emboldened to share their views, whether they’re conspiracy theories or not; I think people are less concerned about the impact that’s going to have because, on both sides, so many politicians are backing that kind of rhetoric. And as you say, it’s interesting: obviously we focus on the dark web and dark web adjacent platforms, but it’s kind of impossible to look at this topic these days without looking at social media, because there’s such an overlap and they interact so much. Things that are shared on Twitter are immediately put onto Telegram, and vice versa, and there’s no one policing or checking that. The likes of Facebook and Instagram will try to say “this isn’t true” or “this isn’t verified” or “read this at your own risk,” but Twitter seems to have moved away from doing that a little bit in recent years. And I think it’s very difficult, with the amount of information individuals are receiving, to make sense of everything that’s going around, given, as you say, the sheer volume of data and conspiracy theories being shared now compared to previously. I can see why it’s difficult for people to make a judgment. And as I said earlier, once these things are out there, it’s really hard to walk them back. There are a lot of people who, however many times you tell them something isn’t true and has been debunked, aren’t going to believe you.
Bianca: Yes, absolutely. It’s very likely that we’ll see a pretty significant increase in disinformation targeting American voters as we get closer to November 5th. Russia, Iran, and China are well aware that their influence operations can have a greater impact closer to the date of the election, when they can influence voters and as individuals have already begun to vote. US intelligence officials are actually already warning of this increase; there were reports stating that influence operations targeting specific political campaigns have already increased. I think it’s really important to note, though, that foreign influence operations aren’t going to stop after November 5th. The ODNI actually just released a report, I think yesterday, warning that Russia, China, and Iran are all expected to continue their influence operations well through Inauguration Day. It’s very likely that they’ll continue spreading disinformation meant to sow discord among Americans and to undermine trust in the election process, which is something we already saw with the presidential election in 2020. Election officials and intelligence officials have particularly warned that there’s a possibility Russia, Iran, and China could actually try to stoke post-election violence, so that’s something that definitely needs to be closely monitored. But yes, we’re expecting to see an increase in that kind of activity leading up to November 5th, but also well after November 5th, up until Inauguration Day.
I think the most important step, and the quickest one, at least for individuals, to combat disinformation, and it seems very simple, is to verify sources. Before sharing or reposting anything online, just take a few minutes to check the credibility of the source, and also take the time to cross-reference and see if you can find another reputable source sharing the same information. If you can cross-reference, there’s a greater likelihood that the information is valid. For organizations, I’d say carrying out fact-checking initiatives is vital. Social media platforms, it’s worth noting, have the ability to give users the opportunity to report disinformation, and that’s huge. But Twitter, again, coming back to Twitter, unfortunately removed a feature that allowed users to report misinformation and disinformation. So bringing back that feature, I think, and for other organizations and social media platforms implementing that feature, is a pretty vital first step to combat election-related disinformation. But yeah, fact-checking in general and verifying your sources is the way to go.
Erin: I think knowing where something came from and making sure it’s not just circular reporting, where everything is coming from one place, usually, you know, a place that may not be that legitimate, is such an important thing to do, and so is having discussions about that. So just going back to the dark web briefly, we’ve talked about how there’s a lot of crossover with mainstream social media sites. Would you say that there’s anything specific on the dark web relating to elections? I know in the past we’ve seen things related to voting machines and hacking, and DEF CON is famous for having its hacking village. Have we seen an increase in that kind of discussion, or not really?
Bianca: Absolutely, we’re seeing a lot of narratives questioning election integrity, like you said, around voting systems. There’s a lot of that on the dark web and on Telegram, especially in channels and groups that have as many as 200,000 subscribers. Again, a lot of them are aimed at undermining confidence in the US election process and sowing discord. So we’re definitely seeing those conspiracy theories dominant on Telegram, but as you noted as well, you really can’t look at it in a vacuum, because a lot of those disinformation narratives are also being seen on mainstream platforms. So it’s interesting that we’re seeing this kind of dialogue between the two spaces, and that theories that previously would probably have been limited to the corners of the internet, as it were, are now very much in the mainstream. It’s sometimes even hard to identify where these conspiracy theories first originated, just because we’re seeing them all over the place.
Erin: Yeah, absolutely. And I think that’s the thing: on the dark web, most of what we see are the traditional dark web activities, like talking about hacking, or about leaking voter information or other information relating to voters. That’s the dark web’s bread and butter, whereas outside of things like Telegram, I’m not sure that people are using the dark web for these kinds of conversations, because they don’t need to; they can do it on mainstream platforms without fear of reprisal. So it’s a really interesting shift, I think, that you’re highlighting.
Bianca: Well, just to highlight again, I’m glad that you asked the question about things people can do to combat disinformation, and I’ll flag once more the importance of verifying sources. There are also lots of great resources online from CISA on steps election officials can take to combat disinformation right now. Organizations and individuals can do a lot to combat this rise in misinformation and disinformation that we’re seeing. Thank you all for joining this webinar.
Erin: That just made me think as well: I was at some sessions recently, and I feel like you can’t have a dark web or OSINT chat these days without mentioning AI. With the way AI is improving, deepfakes, in terms of generating stories, videos, and images, are just something people need to be so aware of. It goes back to your point about really validating those sources, because things can look so believable these days in a way that they couldn’t several years ago. So I think that’s an interesting point as well.