A new online safety bill could allow censorship of anyone who engages with sexual content on the internet




Zahra Zsuzsanna Stardust, UNSW

Under new draft laws, the eSafety Commissioner could order your nude selfies, sex education materials or slash fiction to be taken down from the internet with just 24 hours’ notice.

Officially, the Morrison government’s new bill aims to improve online safety.

But in doing so, it gives broad, discretionary powers to the commissioner, with serious ramifications for anyone who engages with sexual content online.

Broad new powers

After initial consultation in 2019, the federal government released the draft online safety bill last December. Public submissions closed on the weekend.

The bill contains several new initiatives, from cyberbullying protections for children to new ways to remove non-consensual intimate imagery.

Julie Inman Grant was appointed as the government’s eSafety Commissioner in 2016. (Lukas Coch/AAP)

Crucially, it gives the eSafety Commissioner — a federal government appointee — a range of new powers.

It contains rapid website-blocking provisions to prevent the circulation of “abhorrent violent material” (such as live streams of terror attacks). It reduces the timeframe for “takedown notices” (where a hosting provider is directed to remove content) from 48 to 24 hours. It can also require search engines to delete links and app stores to prevent downloads, with civil penalties of up to $111,000 for non-compliance.

But one concerning element of the bill that has not received wide public attention is its takedown notices for so-called “harmful online content”.

A move towards age verification

Due to the impracticality of classifying the entire internet, regulators are now moving towards systems that require access restrictions for certain content and make use of user complaints to identify harmful material.

In this vein, the proposed bill will require online service providers to use technologies to prevent children gaining access to sexual material.




Read more:
Coalition plans to improve online safety don’t address the root cause of harms: the big tech business model


Controversially, the bill gives the commissioner power to impose their own specific “restricted access system”.

This means the commissioner could decide that, to access sexual content, users must upload their identity documents, scan their fingerprints, undergo facial recognition technology or have their age estimated by artificial intelligence based on behavioural signals.

But there are serious issues with online age-verification systems. Similar schemes have already been considered and abandoned in comparable jurisdictions: the United Kingdom dropped its plans in 2019, following implementation difficulties and privacy concerns.

The worst-case scenario here is that governments amass databases of people’s sexual preferences and browsing histories, which could then be leaked, hacked, sold or misused.

eSafety Commissioner as ‘chief censor’

The bill also creates an “online content scheme”, which identifies content that users can complain about.

The bill permits any Australian internet user to make complaints about “class 1” and “class 2” content that is not subject to a restricted access system. These categories are extremely broad, ranging from actual to simulated to implied sexual activity, as well as explicit nudity.

In practice, people can potentially complain about any material depicting sex that they find on the internet, even on specific adult sites, if there is no mechanism to verify the user’s age.

The potential for complaints about sexual material online is very broad under the proposed laws. (Shutterstock)

The draft laws then allow the commissioner to conduct investigations and issue removal notices as they “think fit”. There are no criteria for what warrants removal, no requirement to give reasons, and no process for users to be notified or given an opportunity to respond to complaints.

There is no requirement to publish transparent enforcement data, so the commissioner could simply remove content that is neither harmful nor unlawful. The commissioner is also specifically exempt from liability for damages or civil proceedings.

This means users will have little clarity on how to actually comply with the scheme.

Malicious complaints and self-censorship

The potential ramifications of the bill are broad. They are likely to affect sex workers, sex educators, LGBTIQ health organisations, kink communities, online daters, artists and anyone who shares or accesses sexual content online.

While previous legislation was primarily concerned with films, print publications, computer games and broadcast media, this bill applies to social media, instant messaging, online games, websites, apps and a range of electronic and internet service providers.

Sex education material may be subject to complaints. (Shutterstock)

It means links to sex education and harm reduction material for young people could be deleted by search engines. Hook-up apps such as Grindr or Tinder could be made unavailable for download. Escort advertising platforms could be removed. Online kink communities like Fetlife could be taken down.

The legislation could embolden users – including anti-pornography advocates, disgruntled customers or ex-partners – to make vexatious complaints about sexual content, even where there is nothing harmful about it.

The complaints system is also likely to have a disproportionate impact on sex workers, especially those who turned to online work during the pandemic, and who already face a high level of malicious complaints.

Sex workers consistently report restrictive terms of service as well as shadowbanning and deplatforming, where their content is stealthily or selectively removed from social media.




Read more:
How the ‘National Cabinet of Whores’ is leading Australia’s coronavirus response for sex workers


The requirement for service providers to restrict children’s access to sexual content also provides a financial incentive to take an over-zealous approach. Providers may employ artificial intelligence at scale to screen and detect nudity (which can confuse sex education with pornography), apply inappropriate age verification mechanisms that compromise user privacy, or, where this is too onerous or expensive, take the simpler route of prohibiting sexual content altogether.

In this sense, the bill may operate in a similar way to United States “FOSTA-SESTA” anti-trafficking legislation, which prohibits websites from promoting or facilitating prostitution. This resulted in the pre-emptive closure of essential sites for sex worker safety, education and community building.

New frameworks for sexual content moderation

Platforms have been notoriously poor when it comes to dealing with sexual content. But governments have not been any better.

We need new ways to think about moderating sexual content.

Historically, obscenity legislation has treated all sexual content as if it lacked value unless redeemed by literary, artistic or scientific merit. Our current classification framework of “offensiveness” is also based on outdated notions of “morality, decency and propriety”.




Read more:
The Chatterley Trial 60 years on: a court case that secured free expression in 1960s Britain


Research into sex and social media suggests we should not simply conflate sex with risk.

Instead, some have proposed human rights approaches. These draw on a growing body of literature that sees sexual health, pleasure and satisfying sexual experiences as compatible with bodily autonomy, safety and freedom from violence.

Others have pointed to the need for improved sex education, consent skills and media literacy to equip users to navigate online space.

What’s obvious is we need a more nuanced approach to decision-making that imagines sex beyond “harm”, thinks more comprehensively about safer spaces, and recognises the cultural value in sexual content.

Zahra Zsuzsanna Stardust, Adjunct Lecturer, Centre for Social Research in Health, Research Assistant, Faculty of Law and Justice, UNSW

This article is republished from The Conversation under a Creative Commons license. Read the original article.

The picture of who is affected by ‘revenge porn’ is more complex than we first thought



Tackling the harms of image-based abuse will require a combination of efforts. (Shutterstock)

Anastasia Powell, RMIT University; Asher Flynn, Monash University, and Nicola Henry, RMIT University

“Revenge porn” – the sharing of nude or sexual images without consent – has been widely understood as the spiteful actions of a jilted ex-lover. As the term has gained popularity, however, so too have understandings grown about the use of nude or sexual images as a tool of abuse and control by perpetrators of domestic violence.

But according to our new research, image-based abuse affects many Australians from across diverse communities and in different types of relationships. The picture is more complex than has previously been identified.

Key findings

Our recent survey of 4,274 Australians aged 16 to 45 found that 23% reported having been a victim of image-based abuse.

Most common was having sexual or nude images taken without consent: 20% of those surveyed reported this experience.

Also common was having sexual or nude images sent on to others or distributed without consent: 11% of those surveyed reported this experience.

Finally, 9% of survey respondents had experienced threats that a sexual or nude image would be sent on to others or distributed without their consent.

Some groups in Australia were more likely than others to report having been a victim. One in two Indigenous Australians, one in two Australians with a disability, and one in three lesbian, gay and bisexual Australians reported having experienced image-based abuse.

Also, 30.9% of those aged 16 to 19, and 27% of those aged 20 to 29, reported having been a victim.

Impacts of image-based abuse

Our survey found victims were almost twice as likely as non-victims to report experiencing high levels of psychological distress.

These impacts were highest for those who had experienced threats to distribute an image. 80% of these people reported high levels of psychological distress, consistent with a diagnosis of moderate to severe depression and/or anxiety disorder. This is a very important finding: it demonstrates the severity of the harm associated with image-based abuse victimisation.

Many victims also reported they were “very” or “extremely” fearful for their safety as a result.

Feeling afraid for your safety is an important indicator of potential stalking and/or domestic violence perpetration. Many legal definitions of stalking and abuse, such as for the purposes of an intervention or protection order, require victims to fear for their safety.

Yet there were also important differences in fear experienced by women compared to men.

Gendered nature

Overall, our survey found both men and women were equally likely to report being a victim of image-based abuse. This shows such abuse is not exclusively a form of gender-based violence.

However, there do appear to be some very important differences in the nature and impacts of such abuse according to gender.

For example, the majority (54%) of victims reported the perpetrator was male, while 33% reported a female perpetrator. The remaining 13% of perpetrators were either unknown or a mixed group of both male and female perpetrators.

Both men and women experienced the majority of abuse from known persons such as an acquaintance, friend, or family member. Women (39%) were more likely than men (30%) to be victimised by an intimate partner or ex-partner.

These gendered patterns are similar to other forms of violence and abuse, where both men and women are most likely to experience abuse from male perpetrators, and where women are more likely than men to experience abuse from an intimate partner or ex-partner.

Women victims were also more likely than men to report feeling afraid for their safety.

For example, for images taken without consent, 32% of women victims reported fearing for their safety, compared with 23% of men. For images distributed without consent, 40% of women and 36% of men said they felt afraid. For threats to distribute images, 50% of women and 42% of men reported fearing for their safety.

Our survey has a key limitation: victims can only self-report their victimisation if they have become aware that a sexual or nude image of them was either taken or distributed without their consent. One only has to scratch the surface of content shared online to see there are many more sites and platforms dedicated to sharing women’s nude or sexual images without their consent than men’s.

Identifying these sites and the ways in which they operate is an important avenue for future research. It may shed further light on the gendered nature of image-based abuse.

Where to from here?

Tackling the harms of image-based abuse will require a combination of efforts.

Working alongside social media and website providers to better detect and remove material is vital to improving responses. Improving legal protections and providing information and support services for victims are also key priorities for reform. Information and support will need to cater to the different experiences of the diverse Australian community.

But whether nude or sexual images are being taken or shared by an intimate partner or ex-partner, a friend, family member or stranger, consent is crucial. That is what lies at the heart of this problem. It will take a long-term prevention plan to promote a culture of consent and respect in the digital age.


If you or someone you know is impacted by sexual assault or family violence, call 1800RESPECT on 1800 737 732 or visit www.1800RESPECT.org.au. In an emergency, call 000.

Anastasia Powell, Senior Research and ARC DECRA Fellow, Justice and Legal Studies, RMIT University; Asher Flynn, Senior Lecturer in Criminology, Monash University, and Nicola Henry, Associate Professor & Vice-Chancellor’s Principal Research Fellow, RMIT University

This article was originally published on The Conversation. Read the original article.

NEW PARTNERSHIP HELPS THOSE TRAPPED IN PORNOGRAPHY TO GET FREE


SurfRecon, Inc., Shelley Lubben, and the Pink Cross Foundation have partnered to bring the latest Internet-safety software to families and communities struggling with Internet pornography, and to raise awareness about the Pink Cross Foundation, which helps individuals trapped in the adult-entertainment industry start a new life, reports SurfRecon, Inc.

“We realize that parents are struggling with trying to protect their families from Internet pornography, and filters cannot do the job by themselves, especially when someone in the home has a pornography problem,” said Shelley Lubben, Director of the Pink Cross Foundation. “Filters are great, when they work. But I have heard too many scary stories about smart, tech-savvy kids bypassing an Internet filter to access Internet porn.

“We all need to do a better job watching our kids, and SurfRecon is the tool that helps parents do just that.”

The new internet-safety software the partnership promotes is the SurfRecon pornography-detection tool, which works hand in hand with a filter to offer “protection + detection” in a home or business.

Besides raising awareness about SurfRecon pornography-detection tools, the partnership also provides much-needed funding for the Pink Cross Foundation: a portion of every purchase of SurfRecon products made through the Pink Cross Foundation’s website goes back to the foundation.

“I thought teaming up with Shelley Lubben and the Pink Cross Foundation was a great idea, because not only are we working together to help parents protect their families from pornography,” said Matthew Yarro, Executive VP for SurfRecon, Inc., “but we are also solving another problem. We are helping individuals, performers and sex workers, leave the adult-entertainment industry and start a new life.

“We are proud to be contributing to the Pink Cross Foundation.”

 

What Is a SurfRecon Pornography Detection Tool?

The latest wave in Internet-safety tools is a pornography-detection tool, and SurfRecon is the leader. A pornography-detection tool leverages digital signatures, similar to fingerprints, that uniquely identify a pornographic image or video. SurfRecon currently maintains the largest collection of digital signatures with over 200 million in its database.
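The release does not explain how these digital signatures work internally. A common way to build this kind of scanner is perceptual hashing, which produces similar hashes for visually similar images and so survives resizing and re-encoding, unlike exact checksums. The sketch below illustrates that general approach in Python using the open-source Pillow and ImageHash libraries; the signature value, match threshold and file-type list are illustrative assumptions, not SurfRecon’s actual algorithm or database.

```python
# Minimal sketch of signature-based image scanning via perceptual
# hashing. Illustrates the general technique only: the signature
# below is a made-up placeholder, not a real SurfRecon signature.
from pathlib import Path

import imagehash           # pip install ImageHash
from PIL import Image      # pip install Pillow

# Hypothetical signature database: 64-bit perceptual hashes of known images.
KNOWN_SIGNATURES = {imagehash.hex_to_hash("fd8181b1c3c68181")}

IMAGE_SUFFIXES = {".jpg", ".jpeg", ".png", ".gif", ".bmp"}

def scan_directory(root: str, max_distance: int = 4) -> list[Path]:
    """Return image files whose perceptual hash lies within
    max_distance bits (Hamming distance) of any known signature."""
    flagged = []
    for path in Path(root).rglob("*"):
        if path.suffix.lower() not in IMAGE_SUFFIXES:
            continue
        try:
            candidate = imagehash.phash(Image.open(path))
        except OSError:
            continue  # unreadable or corrupt file: skip it
        # Subtracting two ImageHash objects yields the Hamming distance in bits.
        if any(candidate - sig <= max_distance for sig in KNOWN_SIGNATURES):
            flagged.append(path)
    return flagged

if __name__ == "__main__":
    for hit in scan_directory("."):
        print(f"possible match: {hit}")
```

In practice, a tool like this stands or falls on the size and quality of its signature database and on threshold tuning: too loose a match floods reviewers with false positives, while too strict a match misses re-encoded copies.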

The SurfRecon software comes pre-installed on a standard USB thumb drive, which can be used on almost any Windows, Macintosh or Linux computer system. The software is easy to use and allows an individual to quickly and accurately scan a computer for pornographic content. It also offers a number of safety features for individuals reviewing any content found.

 

About SurfRecon, Inc.

SurfRecon, Inc. is an Orem, Utah-based company that develops cutting-edge digital detection technologies. Its flagship product, SurfRecon, is a pornography-detection tool that is in use by families, businesses and law-enforcement agencies around the world.

 

About Shelley Lubben

Shelley Lubben is a mother, a missionary to the sex industry, a fighter for truth, and an advocate for sex workers and porn performers who are abused by the adult industry.

Shelley is also a former porn actress fighting tirelessly against the pornography industry, which affects much of the world in a destructive way. Unrelenting in the cause of human rights, Shelley is passionate about educating people all around the world about the abusive and illegally operating porn industry, as well as inspiring the world to stop viewing pornography and stop contributing to the destruction of the men and women who are abused daily in the pornography industry.

 

About The Pink Cross Foundation

The Pink Cross Foundation is a compassionate humanitarian outreach dedicated to helping improve the lives of persons struggling with pornography addiction, sex-industry abuse, sexual abuse and more. Shelley Lubben, a former porn actress and prostitute in the ’90s, was diagnosed with bipolar disorder, post-traumatic stress disorder, depressive disorder, impulse-control disorder and substance abuse due to years of trauma from the sex industry. She was prescribed anti-depressants, lithium and sleeping pills, and was recommended counseling for the next twenty years!

After eight years of recovery at the Champion’s Center, Shelley conquered the horrible effects of her past and became a Champion in life through the power of Jesus Christ. Ten years later Shelley is on a mission to go back to the sex industry to reach out to porn stars and sex workers with the power and love of Jesus Christ. Shelley is also on a mission to smash the illusion of porn and help people overcome pornography addiction.

Report from the Christian Telegraph