Trump’s time is up, but his Twitter legacy lives on in the global spread of QAnon conspiracy theories




Verica Rupar, Auckland University of Technology and Tom De Smedt, University of Antwerp

“The lie outlasts the liar,” writes historian Timothy Snyder, referring to outgoing president Donald Trump and his contribution to the “post-truth” era in the US.

Indeed, the mass rejection of reason that erupted when a political mob stormed the US Capitol mere weeks before Joe Biden’s inauguration tests our ability to comprehend contemporary American politics and its emerging forms of extremism.

Much has been written about Trump’s role in spreading misinformation and the media failures that enabled him. His fuelling of extremism, his flirtation with the political fringe, his support for conspiracy theories and, above all, his Twitter demagoguery created an environment in which he has been seen as an “accelerant” in his own right.

While the scale of the international damage is yet to be calculated, there is something we can measure right now.

In September last year, the London-based Media Diversity Institute (MDI) asked us to design a research project that would systematically track the extent to which US-originated conspiracy theory group QAnon had spread to Europe.

Titled QAnon 2: spreading conspiracy theories on Twitter, the research is part of the international Get the Trolls Out! (GTTO) project, which focuses on religious discrimination and intolerance.

Twitter and the rise of QAnon

GTTO media monitors had earlier noted the rise of QAnon support among Twitter users in Europe and were expecting a further surge of derogatory talk ahead of the 2020 US presidential election.

We examined the role religion played in spreading conspiracy theories, the most common topics of tweets, and what social groups were most active in spreading QAnon ideas.

We focused on Twitter because its increasing use — some sources estimate 330 million people used Twitter monthly in 2020 — has made it a powerful political communication tool. It has given politicians such as Trump the opportunity to promote, facilitate and mobilise social groups on an unprecedented scale.




Read more:
QAnon and the storm of the U.S. Capitol: The offline effect of online conspiracy theories


Using AI tools developed by data company Textgain, we analysed about half-a-million Twitter messages related to QAnon to identify major trends.

By observing how hashtags were combined in messages, we examined the network structure of QAnon users posting in English, German, French, Dutch, Italian and Spanish. We identified about 3,000 different QAnon-related hashtags used by 1,250 Twitter profiles.
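
As a rough illustration of this kind of hashtag co-occurrence analysis, the sketch below counts how often pairs of hashtags appear in the same tweet. The example tweets, hashtags and threshold are invented for illustration and are not drawn from the study’s data or from Textgain’s tools.

    # A minimal sketch of hashtag co-occurrence counting, assuming tweets have
    # already been collected as plain strings. The example tweets, hashtags and
    # threshold below are invented for illustration, not taken from the study.
    import re
    from collections import Counter
    from itertools import combinations

    def hashtag_pairs(tweets, min_count=2):
        """Count how often pairs of hashtags appear together in the same tweet."""
        counts = Counter()
        for tweet in tweets:
            tags = sorted({tag.lower() for tag in re.findall(r"#(\w+)", tweet)})
            for a, b in combinations(tags, 2):
                counts[(a, b)] += 1
        # Keep only pairs that co-occur often enough to suggest a real link
        # in the network of profiles and topics.
        return {pair: n for pair, n in counts.items() if n >= min_count}

    tweets = [
        "#WWG1WGA stands with #Trump2020",
        "The storm is coming #Trump2020 #WWG1WGA",
        "#fakenews everywhere",
    ]
    print(hashtag_pairs(tweets))  # {('trump2020', 'wwg1wga'): 2}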

Making the connection: demonstrators in Berlin in 2020 display QAnon and US imagery.
Shutterstock

An American export

Every fourth QAnon tweet originated in the US (300). Far behind were tweets from other countries: Canada (30), Germany (25), Australia (20), the United Kingdom (20), the Netherlands (15), France (15), Italy (10), Spain (10) and others.

We examined QAnon profiles that share each other’s content, Trump tweets and YouTube videos, and found over 90% of these profiles shared the content of at least one other identified profile.

Seven main topics were identified: support for Trump, support for EU-based nationalism, support for QAnon, deep state conspiracies, coronavirus conspiracies, religious conspiracies and political extremism.




Read more:
Far-right activists on social media telegraphed violence weeks in advance of the attack on the US Capitol


Tweets using hashtags rooted in US evangelicalism sometimes portrayed Trump as Jesus, as a superhero, or clad in medieval armour, with underlying Biblical references to a coming apocalypse in which he defeats the forces of evil.

Overall, the coronavirus pandemic appears to function as an important conduit for all such messaging, with QAnon acting as a rallying flag for discontent among far-right European movements.

Measuring the toxicity of tweets

We used Textgain’s hate-speech detection tools to assess toxicity. Tweets written in English had a high level of antisemitism. In particular, they targeted public figures such as Jewish-American billionaire investor and philanthropist George Soros, or revived old conspiracies about secret Jewish plots for world domination. Soros was also a popular target in other languages.

We also found a highly polarised debate around the coronavirus public health measures employed in Germany, often using Third Reich rhetoric.

New language to express negative sentiments was coined and then adopted by others — in particular, pejorative terms for face masks and slurs directed at political leaders and others who wore masks.

Accompanying memes ridiculed political leaders, depicting them as alien reptilian overlords or as antagonists from popular movies, such as the Star Wars Sith Lords and the cyborg from The Terminator.

Most of the QAnon profiles tap into the same sources of information: Trump tweets, YouTube disinformation videos and each other’s tweets. This creates a mutually reinforcing confirmation bias: the tendency to search for, interpret, favour and recall information that confirms prior beliefs or values.




Read more:
Despite being permanently banned, Trump’s prolific Twitter record lives on


Where does it end?

Harvesting discontent has always been a powerful political tool. In a digital world this is more true than ever.

By mid-2020, Donald Trump had six times more followers on Twitter than when he was elected. Until he was suspended from the platform, his daily barrage of tweets found a ready audience among ultra-right groups in the US, who helped his misinformation and inflammatory rhetoric jump the Atlantic to Europe.

Social media platforms have since attempted to reduce the spread of QAnon. In July 2020, Twitter suspended 7,000 QAnon-related accounts. In August, Facebook deleted over 790 groups and restricted the accounts of hundreds of others, along with thousands of Instagram accounts.




Read more:
Trump’s Twitter feed shows ‘arc of the hero,’ from savior to showdown


In January this year, all Trump’s social media accounts were either banned or restricted, and Twitter suspended 70,000 accounts sharing QAnon content at scale.

But further Textgain analysis of 50,000 QAnon tweets posted in December and January showed toxicity had almost doubled, including 750 tweets inciting political violence and 500 inciting violence against Jewish people.

Those tweets were being systematically removed by Twitter. But calls for violence ahead of the January 20 inauguration continued to proliferate, with Trump’s QAnon supporters appearing as committed and vocal as ever.

The challenge for both the Biden administration and the social media platforms themselves is clear. But our analysis suggests any solution will require a coordinated international effort.

Verica Rupar, Professor, Auckland University of Technology and Tom De Smedt, Postdoctoral research associate, University of Antwerp

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Why QAnon is attracting so many followers in Australia — and how it can be countered




Kaz Ross, University of Tasmania

On September 5, a coalition of online groups is planning an Australia-wide action called the “Day of Freedom”. The organisers claim hundreds of thousands will join them on the streets in defiance of restrictions on group gatherings and mask-wearing mandates.

Some online supporters believe Stage 5 lockdown will be introduced in Melbourne the following week and the “Day of Freedom” is the last chance for Australians to stand up to an increasingly tyrannical government.

The action is the latest in a series of protests in Australia against the government’s COVID-19 restrictions. The main issues brought up during these protests centre around 5G, government surveillance, freedom of movement and, of course, vaccinations.

And one general conspiracy theory now unites these disparate groups — QAnon.




Read more:
QAnon believers will likely outlast and outsmart Twitter’s bans


Why QAnon has exploded in popularity globally

Since its inception in the US in late 2017, QAnon has morphed beyond a specific, unfounded claim about President Donald Trump working with special counsel Robert Mueller to expose a paedophile ring supposedly run by Bill and Hillary Clinton and the “deep state”. Now, it is an all-encompassing world of conspiracies.

QAnon conspiracy theories now range from wild claims that Microsoft founder Bill Gates is using the coronavirus as a cover to implant microchips in people, to claims that governments are erecting 5G towers during lockdown to surveil the population.

Donald Trump has tacitly endorsed QAnon, saying its followers are “people that love our country”.
Leah Millis/Reuters

Last week, Facebook deleted over 790 groups, 100 pages and 1,500 ads tied to QAnon and restricted the accounts of hundreds of other Facebook groups and thousands of Instagram accounts. QAnon-related newsfeed rankings and search results were also downgraded.

Facebook is aiming to reduce the organising ability of the QAnon community, but so far such crackdowns seem to have had little effect on the spread of misinformation.

In July, Twitter removed 7,000 accounts, but the QAnon conspiracy has become even more widespread since then. A series of global “save the children” protests in the last few weeks is proof of how resilient and adaptable the community is.

Why Australians are turning to QAnon in large numbers

QAnon encourages people to look for evidence of conspiracies in the media and in government actions. Looking back over the last several years, we can see a range of events or conspiracy theories that have helped QAnon appeal to increasing numbers of followers in Australia.

1) Conspiracies about global governance

In 2015, Senator Malcolm Roberts described the UN’s 1992 “Agenda 21” plan for sustainable development as a foreign global plan aimed at depriving nations of their sovereignty and citizens of their property rights.

The belief that “Agenda 21” is a blueprint for corrupt global governance has become a core tenet of QAnon in Australia.

Any talk of “global bankers and cabals” directly taps into longstanding anti-Semitic conspiracies about supposed Jewish world domination often centred on the figure of billionaire George Soros. The pandemic and QAnon have also proven to be fertile ground for neo-Nazis in Australia.

2) The impact of far-right social media

QAnon has its roots on the far-right bulletin boards of the websites 4Chan and 8Chan. Other campaigns from the same sources, such as the “It’s OK to be White” motion led by One Nation leader Pauline Hanson in the Senate, have been remarkably successful in Australia, showing our susceptibility to viral trolling efforts.

3) Perceived paedophiles in power

During the Royal Commission into Institutional Responses to Child Sexual Abuse, Senator Bill Heffernan tried unsuccessfully to submit the names of 28 prominent Australians who, he alleged, were paedophiles.

His failure is widely shared in QAnon circles as proof of a cover-up of child abuse at all levels of Australian government. The belief the country is run by a corrupt paedophile cabal is the most fundamental plank of the QAnon platform.

Among the QAnon conspiracy theories in the US is that Hollywood actors have engaged in crimes against children.
CHRISTIAN MONTERROSA/EPA

4) Increasingly ‘unaccountable and incompetent’ governments

A number of recent events have eroded public trust in government — from the “sports rorts affair” to the Witness K case — and all serve to further fuel the QAnon suspicion of authority figures.

5) Longstanding alternative health lobbies

Australia’s sizeable anti-vax movement has found great support in the QAnon community. Fear about mandatory vaccinations is widespread, as is a distrust of “big pharma”.

Also, the continuing roll-out of 5G technology throughout the pandemic has confirmed the belief among QAnon followers that there are ulterior motives for the lockdown. Wellness influencers such as celebrity chef Pete Evans have amplified these messages to their millions of followers.

6) The ‘plandemic’ and weaponising of COVID-19

In the QAnon world, debates about the origin of the coronavirus, death rates, definition of cases, testing protocols and possible treatments are underpinned by a belief that governments are covering up the truth. Many believe the virus isn’t real or deadly, or it was deliberately introduced to hasten government control of populations.

Understanding QAnon followers

Understanding why people become part of these movements is the key to stopping the spread of the QAnon virus. Research into extremist groups shows four elements are important:

1) Real or perceived personal and collective grievances

This year, some of these grievances have been linked directly to the pandemic: government lockdown restrictions, a loss of income, fear about the future and disruption of plans such as travel.

2) Networks and personal ties

Social media has given people the ability to find others with similar grievances or beliefs, to share doubts and concerns and to learn about connecting theories and explanations for what may be troubling them.

3) Political and religious ideologies

QAnon is very hierarchically structured, similar to evangelical Christianity. QAnon followers join a select group of truth seekers who are following the “light” and have a duty to wake up the “sheeple”. Like some religions, the QAnon world is welcoming to all and provides a strong sense of community united by a noble purpose and hope for a better future.

4) Enabling environments and support structures

In the QAnon world, spending many hours on social media is valued as doing “research” and seen as an antidote to the so-called fake news of the mainstream media.

Social isolation, a barrage of changing and confusing pandemic news and obliging social media platforms have been a boon for QAnon groups. However, simply banning or deleting groups runs the danger of confirming the beliefs of QAnon followers.




Read more:
How misinformation about 5G is spreading within our government institutions – and who’s responsible


So what can be done?

Governments need to be more sensitive in their messaging and avoid triggering panic around sensitive issues such as mandatory or forced vaccinations. Transparency about government actions, policies and mistakes all help to build trust.

Governments also need to ensure they are providing enough resources to support people during this challenging time, particularly when it comes to mental and emotional well-being. Resourcing community-building to counter isolation is vital.

For families and friends, losing a loved one “down the Q rabbit hole” is distressing. Research shows that arguing over facts and myths doesn’t work.

Like many conspiracy theories, QAnon contains elements of truth. Empathy and compassion, rather than ridicule and ostracism, are the keys to remaining connected to the Q follower in your life. Hopefully, with time, they’ll come back.

Kaz Ross, Lecturer in Humanities (Asian Studies), University of Tasmania

This article is republished from The Conversation under a Creative Commons license. Read the original article.

How misinformation about 5G is spreading within our government institutions – and who’s responsible




Michael Jensen, University of Canberra

“Fake news” is not just a problem of misleading or false claims on fringe websites, it is increasingly filtering into the mainstream and has the potential to be deeply destructive.

My recent analysis of more than 500 public submissions to a parliamentary committee on the launch of 5G in Australia shows just how pervasive misinformation campaigns have become at the highest levels of government. A significant number of the submissions peddled inaccurate claims about the health effects of 5G.

These falsehoods were prominent enough that the committee felt compelled to address the issue in its final report. The report noted:

community confidence in 5G has been shaken by extensive misinformation preying on the fears of the public spread via the internet, and presented as facts, particularly through social media.

This is a remarkable situation for Australian public policy – it is not common for a parliamentary inquiry to have to rebut the dodgy scientific claims it receives in the form of public submissions.

While many Australians might dismiss these claims as fringe conspiracy theories, the reach of this misinformation matters. If enough people act on the basis of these claims, it can cause harm to the wider public.

In late May, for example, protests against 5G, vaccines and COVID-19 restrictions were held in Sydney, Melbourne and Brisbane. Some protesters claimed 5G was causing COVID-19 and the pandemic was a hoax – a “plandemic” – perpetuated to enslave and subjugate the people to the state.




Read more:
Coronavirus, ‘Plandemic’ and the seven traits of conspiratorial thinking


Misinformation can also lead to violence. Last year, the FBI for the first time identified conspiracy theory-driven extremists as a terrorism threat.

Conspiracy theories that 5G causes autism, cancer and COVID-19 have also led to widespread arson attacks in the UK and Canada, along with verbal and physical attacks on employees of telecommunication companies.

The source of conspiracy messaging

To better understand the nature and origins of the misinformation campaigns against 5G in Australia, I examined the 530 submissions posted online to the parliament’s standing committee on communications and the arts.

The majority of submissions were from private citizens. A sizeable number, however, made claims about the health effects of 5G, parroting language from well-known conspiracy theory websites.

A perceived lack of “consent” about the planned 5G roll-out featured prominently in these submissions. One person argued she did not agree to allow 5G to be “delivered directly into” the home and “radiate” her family.




Read more:
No, 5G radiation doesn’t cause or spread the coronavirus. Saying it does is destructive


To connect sentiments like this to conspiracy groups, I looked at two well-known conspiracy sites that have been identified as promoting narratives consistent with Russian misinformation operations – the Centre for Research on Globalization (CRG) and Zero Hedge.

CRG is an organisation founded and directed by Michel Chossudovsky, a former professor at the University of Ottawa and opinion writer for Russia Today.

CRG has been flagged by NATO intelligence as part of wider efforts to undermine trust in “government and public institutions” in North America and Europe.

Zero Hedge, which is registered in Bulgaria, attracts millions of readers every month and ranks among the top 500 sites visited in the US. Most stories are geared toward an American audience.

Researchers at Rand have connected Zero Hedge with online influencers and other media sites known for advancing pro-Kremlin narratives, such as the claim that Ukraine, and not Russia, is to blame for the downing of Malaysia Airlines flight MH17.

Protesters targeting the coronavirus lockdown and 5G in Melbourne in May.
Scott Barbour/AAP

How it was used in parliamentary submissions

For my research, I scoured the top posts circulated by these groups on Facebook for false claims about the health threats posed by 5G. Some stories I found had headlines like “13 Reasons 5G Wireless Technology will be a Catastrophe for Humanity” and “Hundreds of Respected Scientists Sound Alarm about Health Effects as 5G Networks go Global”.

I then tracked the diffusion of these stories on Facebook and identified 10 public groups where they were posted. Two of the groups specifically targeted Australians – Australians for Safe Technology, a group with 48,000 members, and Australia Uncensored. Many others, such as the popular right-wing conspiracy group QAnon, also contained posts about the 5G debate in Australia.




Read more:
Conspiracy theories about 5G networks have skyrocketed since COVID-19


To determine the similarities in phrasing between the articles posted in these Facebook groups and the submissions to the Australian parliamentary committee, I used the same text-similarity technique that is commonly used to detect plagiarism in student papers.

The analysis rates the similarity of documents on a scale of 0 (entirely dissimilar) to 1 (exactly alike). There were 38 submissions with at least a 0.5 similarity to posts in the Facebook group “5G Network, Microwave Radiation Dangers and other Health Problems” and 35 with at least a 0.5 similarity to posts in the Australians for Safe Technology group.

This is significant because it means that, for these 73 submissions, at least 50% of the language was, word for word, exactly the same as posts from extreme conspiracy groups on Facebook.
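
For readers curious about what such a score involves, here is a minimal sketch of one common way to produce a 0-to-1 text-similarity score, using TF-IDF vectors and cosine similarity. The posts and submissions below are invented for illustration, and the article does not specify which similarity measure was actually used.

    # A hedged sketch of a 0-to-1 document similarity score (TF-IDF + cosine).
    # The short texts are invented examples, not real submissions or posts.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    facebook_posts = [
        "5G wireless technology will be a catastrophe for humanity",
        "Hundreds of scientists sound the alarm about the health effects of 5G",
    ]
    submissions = [
        "5G technology will be a catastrophe for humanity and for our health",
        "I am concerned about the cost and timing of the 5G roll-out",
    ]

    # Fit one vocabulary over both sets of texts, then compare every
    # submission against every Facebook post.
    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform(facebook_posts + submissions)
    post_vectors = matrix[: len(facebook_posts)]
    submission_vectors = matrix[len(facebook_posts):]

    scores = cosine_similarity(submission_vectors, post_vectors)
    for i, row in enumerate(scores):
        print(f"submission {i}: highest similarity {row.max():.2f}")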

The first 5G Optus tower in the suburb of Dickson in Canberra.
Mick Tsikas/AAP

The impact of misinformation on policy-making

The process for soliciting submissions to a parliamentary inquiry is an important part of our democracy. In theory, it provides ordinary citizens and organisations with a voice in forming policy.

My findings suggest Facebook conspiracy groups and potentially other conspiracy sites are attempting to co-opt this process to directly influence the way Australians think about 5G.

In the pre-internet age, misinformation campaigns often had limited reach and took a significant amount of time to spread. They typically required the production of falsified documents and a sympathetic media outlet. Mainstream news would usually ignore such stories and few people would ever read them.

Today, however, one only needs to create a false social media account and a meme. Misinformation can spread quickly if it is amplified through online trolls and bots.

It can also spread quickly on Facebook, with its algorithm designed to drive ordinary users to extremist groups and pages by exploiting their attraction to divisive content.

And once this manipulative content has been widely disseminated, countering it is like trying to put toothpaste back in the tube.

Misinformation has the potential to undermine faith in governments and institutions and make it more challenging for authorities to make demonstrable improvements in public life. This is why governments need to be more proactive in effectively communicating technical and scientific information, like details about 5G, to the public.

Just as nature abhors a vacuum, a public sphere without trusted voices quickly becomes filled with misinformation.

Michael Jensen, Senior Research Fellow, Institute for Governance and Policy Analysis, University of Canberra

This article is republished from The Conversation under a Creative Commons license. Read the original article.