Robyn Torok, Edith Cowan University
Prime Minister Malcolm Turnbull has joined Britain’s Prime Minister Theresa May in calling on social media companies to crack down on extremist material being published by users.
It comes in the wake of the recent terror attacks in Australia and Britain.
Facebook is considered a hotbed for terrorist recruitment, incitement, propaganda and the spreading of radical thinking. Twitter, YouTube and encrypted services such as WhatsApp and Telegram are also implicated.
Addressing the extent of such content on social media requires international cooperation, including from the large social media platforms and encrypted services themselves.
Some of that work is already underway by many social media operators, with Facebook’s rules on this leaked only last month. Twitter says that in one six-month period it suspended 376,890 accounts related to the promotion of terrorism.
While these measures are a good start, more can be done. A focus on disruption, encryption, recruitment and creating counter-narratives is recommended.
Disruption: remove content, break flow-on
Disruption of terrorists on social media involves reporting and taking down radical accounts and posted content that breach community safety and standards.
Both the speed and the thoroughness of this removal are critical.
Disruption is vital for removing extreme content and breaking the flow-on effect while someone is in the process of being recruited by extremists.
Taking down accounts and content is difficult as there is often a large volume of content to remove. Sometimes it is not removed as quickly as needed. In addition, extremists typically have multiple accounts and can operate under various aliases at the same time.
Encryption: security authorities need access
When Islamic extremists use encrypted channels, it makes the fight against terrorism much harder. Extremists readily shift from public forums to encrypted areas, and often work in both simultaneously.
Encrypted networks are fast becoming a problem because of the “burn time” (destruction of messages) and the fact that extremists can communicate mostly undetected.
Operations to attack and kill members of the public in the West have been propagated on these encrypted networks.
Extremists have set up dedicated ways of communicating within encrypted channels to offer advice. That way a terrorist can directly communicate with the Islamic State group and receive directives to undertake an attack in a specific country, including operational methods and procedures.
This is extremely concerning, and authorities – including intelligence agencies and federal police – require access to encrypted networks to do their work more effectively. They need the ability to access servers to obtain vital information to help thwart possible attacks on home soil.
This access will need to be granted in consultation with the companies that offer these services. But such access could be challenging and there could also be a backlash from privacy groups.
Recruitment: find and follow key words
It was once thought that the process of recruitment occurred over extended periods of time. This is true in some instances, and it depends on a multitude of individual experiences, personality types, one’s perception of identity, and the types of strategies and techniques used in the recruitment process.
There is no one path toward violent extremism, but what makes the process of recruitment quicker is the neurolinguistic programming (NLP) method used by terrorists.
Extremists use NLP across multiple platforms and are quick to usher their recruits into encrypted chats.
Key terms are always used alongside NLP, such as “in the heart of green birds” (which is used in reference to martyrdom), “Istishhad” (operational heroism of loving death more than the West loves life), “martyrdom” and “Shaheed” (becoming a martyr).
If social media companies know and understand these key terms, they can help by removing any reference to them on their platforms. This is being done by some platforms to a degree, but in many cases social media operators still rely heavily on users reporting inappropriate material.
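As a minimal illustration of the kind of keyword matching described above, a moderation pipeline might scan posts against a watch list of monitored terms. This is a hypothetical sketch only (the term list, function name and sample text are illustrative); real moderation systems combine such matching with context, behavioural signals and human review.

```python
# Hypothetical sketch: flag posts containing known extremist key terms.
# The term list below is an illustrative subset, not a real watch list.
import re

KEY_TERMS = [
    "in the heart of green birds",  # phrase used in reference to martyrdom
    "istishhad",
    "shaheed",
]

# One case-insensitive pattern with word boundaries, so ordinary words
# that merely contain a term as a substring are not flagged.
PATTERN = re.compile(
    "|".join(r"\b" + re.escape(term) + r"\b" for term in KEY_TERMS),
    re.IGNORECASE,
)

def flag_post(text: str) -> bool:
    """Return True if the post contains any monitored key term."""
    return PATTERN.search(text) is not None
```

A common English word such as “martyrdom” would need contextual analysis rather than bare matching, which is one reason platforms still lean on user reports.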
Create counter-narratives: banning alone won’t work
Since there are so many social media applications, each with a high volume of material that is both very dynamic and fluid, any attempts to deal with extremism must accept the limitations and challenges involved.
Attempts to shut down sites, channels, and web pages are just one approach. It is imperative that efforts are not limited to such strategies.
Counter-narratives are essential, as these deconstruct radical ideologies and expose their flaws in reasoning.
But these counter-narratives need to be more sophisticated given the ability of extremists to manipulate arguments and appeal to emotions, especially by using horrific images.
This is particularly important for those on the social fringe, who may feel a sense of alienation.
It is important for these individuals to realise that such feelings can be addressed within the context of mainstream Islam without resorting to radical ideologies that leave them open to exploitation by experienced recruiters. Such recruiters are well practised and know how to identify individuals who are struggling, and how to usher them along radical pathways.
Ultimately, there are ways around all procedures that attempt to tackle the problem of terrorist extremism on social media. But steps are slowly being taken to reduce the risk and spread of radical ideologies.
This must include counter-narratives, as well as the timely eradication of extremist material based on keywords and the removal of any material from key radical preachers.
Robyn Torok, PhD – researcher and analyst, Edith Cowan University
This article was originally published on The Conversation. Read the original article.