SMA hosted a speaker session, presented by Dr. Todd Helmus (RAND), Dr. William Marcellino (RAND), and Dr. Marek Posard (RAND) as a part of its SMA EUCOM Speaker Series.
Dr. Posard began the speaker session by explaining that he and his colleagues examined Russia’s interference in the 2016 US presidential election in an effort to mitigate the impact that Russian interference could have on the 2020 election. He emphasized that Russia’s ultimate goal in its election interference efforts is not to help one candidate win but to paralyze the US election system: to halt political cooperation between legislators, further polarize the US civilian population, and ultimately discourage citizens from participating in future democratic elections. Dr. Posard noted that while the US has been targeted by foreign influence campaigns before, Russia’s campaign has been the most effective due to its efficient use of technology.
Dr. William Marcellino then described the research team’s methodology. According to Dr. Marcellino, the team documented more than two million Twitter conversations from 630 unique accounts to map the communication network that makes up the information environment (IE). The team built its model of the IE by first applying a community detection algorithm and then passing the results to human analysts. Dr. Marcellino stated that this approach allowed the research team to identify which organizations communicate with one another and which groups act as communication hubs.
Dr. Marcellino went on to point out that Russian bots use parallel tactics to target both politically conservative and progressive groups. Russia targets these groups by framing discussions around cultural and societal issues, including racism, and divisive issues, such as immigration. He added that as analysts continue to help the algorithm learn to detect sarcasm and recognize persuasion techniques, the team will better understand how and why Russian trolls target specific groups. This improvement could also help the RAND team evaluate why Russian bots targeting politically conservative groups used sarcasm more frequently than bots targeting progressive groups. Dr. Marcellino ended his portion of the presentation by noting that the research team will eventually need to expand its language capabilities for further research, as Russian trolls are active in many different languages.
Next, Dr. Helmus claimed that Russian bots use politicized memes to target US citizens more than any other tool. He stated that Russian trolls are not concerned with facts; instead, they will use both false and true information if it supports an individual’s preconceived worldview, which in turn is likely to deepen societal fissures. Dr. Helmus then identified an encouraging sign: regardless of political affiliation, US citizens reacted negatively after learning that information came from a Russian source, even if the information supported their political beliefs. He concluded by emphasizing that if information can be attributed to a Russian source, audiences become less enthusiastic about and less drawn to it.
According to Dr. Posard, he and his colleagues convened a focus group, which was divided into three sub-groups: 1) individuals who like Republicans and dislike Democrats, 2) individuals who like Democrats and dislike Republicans, and 3) individuals who are non-partisan. He commented that most individuals in the study did not want to claim a party affiliation; instead, they preferred to describe themselves using other identifiers, such as their professions. He noted, however, that Russia views the US’s current culture of activism as an opportunity to sow discord. Dr. Posard stated that participants in the focus group were shown a conservative-leaning meme, a progressive-leaning meme, and a public service announcement (PSA) about misinformation. As Dr. Helmus explained earlier, most individuals had a negative emotional reaction when the memes were attributed to a Russian source. Dr. Posard believes this is evidence that much social media content in the US suffers from mistaken attribution, an issue that can be corrected. He added that audience members also appreciated the PSA because it came from a government organization with authority.
Dr. Posard concluded the presentation by offering three recommendations to US government officials: 1) collect intelligence on misinformation before an election cycle, 2) release a PSA about misinformation without directly addressing societal issues, and 3) flag posts from Russia or other countries that are meant to manipulate US politics.
Note: We are aware that many government IT providers have blocked access to YouTube from government machines during the pandemic in response to bandwidth limitations. We recommend viewing the recording on YouTube from a non-government computer or listening to the audio file (below), if you are in this position.
Please email Ms. Nicole Omundson (nomundson@nsiteam.com) for access to our speakers’ slides.
To access our speakers’ relevant reports, please visit https://www.rand.org/nsrd/projects/cal-oes.html.