Impact of Disinformation Campaigns to Disrupt US Elections

Posted by Brandon Hoffman in Security on Oct 27, 2020

I recently sat down with fellow cybersecurity experts Sean Cordero and John Bambenek to discuss recent cyber activities targeting US elections. We unanimously agreed that information warfare and disinformation campaigns are peaking, far surpassing in sophistication those we encountered in the 2018 elections.

With only a week to go before final votes are cast, adversaries have increased their cyber efforts, targeting both the election process and infrastructure (e.g., ransomware) and mounting propaganda campaigns to ignite heated emotions and disillusionment among voters. Major online players (Google, Facebook, Twitter, Microsoft) have stepped up measures to strengthen security across the board, including helping users distinguish “real” from “not real” information.

Nevertheless, our internal research (Knowledge Now) substantiates recent findings of information warfare activity. Facebook recently discovered two networks leveraging social media to spread government propaganda, and removed the related accounts, pages, and groups for violating its policies on coordinated inauthentic behavior (CIB).

The networks originated in the Philippines and in China. Facebook removed over 200 accounts, 42 pages, 9 groups, and 27 Instagram accounts related to these campaigns. The Chinese network used GANs (Generative Adversarial Networks), an AI technique capable of fabricating realistic faces, in an attempt to elude detection. The campaign mounted pages and groups in support of each presidential candidate. The malicious intent was to spread propaganda about a particular party or candidate and to gather information on users who support a particular candidate or ideology.

We also noticed the use of mobile apps and ads to spread “fake news.” In response, Google Play took measures to prohibit apps that could be used for political influence campaigns. Google defined “misrepresentation” as apps or developers that “impersonate any person or organization, or that misrepresent or conceal their ownership or primary purpose.” This includes apps that misrepresent their location or affiliation and contain content relating to politics or social issues. Google has applied similar parameters to its Ads policy.

Examples of “deepfake” video, audio, and photo manipulation designed to deliver misinformation also hit the airwaves, or at least tried to. Recent news cited Fox, CNN, and MSNBC affiliates’ refusal to air two “deepfake” election ads featuring manipulated videos of North Korean dictator Kim Jong Un and Russian President Vladimir Putin warning Americans that their democracy is in danger.

“Deepfakes” and other synthetic media are used to fabricate “evidence” to convince a target audience that a candidate or other individual did or said something scandalous or potentially damaging to their reputation, thereby affecting their candidacy for an elected or appointed position. It’s the smear campaign at its finest, leveraging social media to accomplish its malicious intent.

To understand threat actors’ motives behind disinformation campaigns, here are insights from John and Sean to provide perspective.

Sean: As much as US voters are aware of fake news, people are still numb to information and unable to discern what is “real” or “not real.” Nation states have realized this and are using it to their advantage.

John: Russia exploited our core vulnerability in 2016 and has since amped up its activities, leveraging the emotional and psychological impacts of COVID-19, lockdowns, and other related conditions.

As people become more emotionally charged, they tend to revert to their pre-existing notions about presidential candidates and party lines. Russia is infiltrating both camps with organized threats and disinformation campaigns. Its foreign policy isn’t necessarily aimed at one candidate or the other; the objective is destabilization.

The same holds true for other nation states. China and Taiwan support US trade efforts because doing so benefits them. Ultimately, they are all pursuing long-term geopolitical objectives to advance their interests in the world without a countervailing United States or Western bloc to push back.

We’re seeing US citizens angered and emotionally charged by the political stage of the elections, as with the COVID-19 pandemic. They’re not paying close attention to “fake news” or to malicious activities such as phishing emails, SMS texts, or criminal actions like donations to fake fundraising entities. With voters blinded by emotion, this scenario allows information warfare to be effective. The nation states’ end goal is to expand their sphere of influence, no matter what.

Sean: Social media and the way we share and communicate information are driving the prevalence of disinformation campaigns. Disinformation spreads in a drip-like process: it slowly and continuously makes its way into social media, where it is exponentially expanded and amplified.

Disinformation is communicated and spread through your followers and friends. This “amplification aspect” is a powerful capability, especially viewed in a global context. Social media and the speed of disinformation are growing and accelerating, and there are no controls or “reality checks” on what’s true or untrue. It’s very troubling.

Brandon: In closing, US voters and organizations must be on full alert over the next week for disruptions to the election process, voting, and propaganda activities. Take extra security measures to protect your right to vote, and protect yourself and your information to minimize your exposure to attacks and compromises.

Visit our elections resources to become more aware and informed:

About the Author

Technology enthusiast, avid traveller, and culinary expert (in my own mind). Trying to contribute meaningfully to the cybersecurity community and solve problems while maintaining my time-ninja status.