
Online Safety Act 2023: what protections are there for women and girls? (Part 2)

Online violence against women and girls is a pressing issue in the UK, with many people largely unaware of the scale of abuse perpetrated online.

In the second of three blogs looking at the Act and what it means for women and girls, SARSAS comms volunteer Toby gives an overview of the positive impacts the Act may have, as well as some of its potential problems.

What does the Act aim to do?

The Online Safety Act 2023 (the Act) introduces regulation of online spaces for people and businesses in the UK by imposing duties on providers of internet services. It applies a duty of care to providers of user-to-user services and search services, requiring them to put stronger safeguarding and mitigation measures in place for users of their platforms and services.

This means that services hosting user-generated content, search engines, and internet services which publish or display pornography will now be regulated, with illegal material required to be removed.
An estimated 25,000 companies will be affected by the Act, with the general public likely to see the most change on social media platforms.

Social media companies and online platforms now have a responsibility to prevent and remove ‘priority’ illegal content from their sites. This includes online abuse-related offences such as stalking, harassment, coercive control and intimate image abuse.1

Content that will now need to be removed from all services impacted by the Act includes:2

child sexual abuse
controlling or coercive behaviour
cyberbullying
extreme sexual violence
extreme violence against animals or people
fraudulent adverts involving scams
hate crime and speech
inciting violence
violence against women and girls
illegal immigration and people smuggling
promoting or facilitating suicide
promoting self-harm
intimate image abuse
selling illegal drugs or weapons
sexual exploitation

What does this mean for women and girls?

The Act should support safer, regulated online spaces in the UK with service users having more control over what they are interacting with online. Muting, filtering, and blocking other users, content, or phrases should support safer spaces where women and girls have more control over what they see online in their day-to-day lives. If a user or a piece of online content is reported and flagged as harmful or illegal, social media companies now have a legal duty to investigate the report and act quickly to prevent the spread of online abuse.

Instead of relying on watchdogs or a user base to monitor and flag content, as was the case prior to the Act, service providers are now required to ensure that harmful or illegal material is removed from their website. The emphasis is on organisations to demonstrate that they have effective processes and safeguards in place to protect their users and remove any content that has been flagged as inappropriate or illegal.

Pornography and harmful content will be much harder for children to access, with services required to verify users’ ages before granting them access. Harmful or distressing pornographic content will now have to be removed from social media sites, and any content users find concerning or potentially harmful can be easily reported and flagged under the new regulations.

Positive impacts of the Act

With tech platforms now having a greater responsibility to provide safer spaces for people to interact online, service users should find it easier to block and mute other users, while social media companies can ban perpetrators of online abuse.

Reporting and complaint procedures within large social media companies will have to be strengthened and made easier for users to access, so that online services can react quickly to prevent the spread of harmful content online.

Users of social media and search engines will have the ability to filter and mute content they do not want to see, tailoring online spaces for each user.

The new offences specifically aimed at tackling online abuse will hopefully have a deterrent effect on potential perpetrators. Significant prison sentences, fines and inclusion on the sex offenders register are all possible outcomes in criminal cases where an individual is found guilty of one of the new offences.

Find out more about the new offences introduced by the Act for online perpetrators in part one of our three-part blog series here.

Thoughts from industry leaders

“Along with survivors, other experts and over 100,000 members of the public, we called for the Online Safety Bill to tackle and prevent violence against women and girls. We welcome this landmark new guidance for tech companies to reduce harm to women and girls online, which is a step in the right direction for tackling this abuse. But we also know that implementation and enforcement is key if we are to address the rapid spread of misogyny and online abuse, and we will work with government and Ofcom to ensure it is as robust as possible and well enforced.”2

Andrea Simon, Director of the End Violence Against Women Coalition

“Through our frontline work across the UK, we support children whose mental health and understanding of healthy relationships are damaged by what they see online. We welcome the new duty placed on pornography sites to verify that users are over 18 which will help to stop children from viewing this type of harmful content.”3

Lynn Perry MBE, Chief Executive of Barnardo’s

“We are pleased to see coercive and controlling behaviour recognised as a priority offence in the Act. This means social media platforms are required to respond to these abusive behaviours and take steps towards preventing them from being able to happen in the first place.  It is our hope that the protections included in the Act, will allow women and girls to exist online safely, without abuse.”4

Ellen Miller, Interim CEO of Refuge

Potential problems with the Act

There is widespread concern about multiple issues with the Act: many groups are uneasy about its impact on free speech, freedom of expression and the right to privacy, and about the extent of Ofcom’s power to regulate online spaces.

In addition, there are worries that the Act does not fully cover the ever-changing landscape of online interaction. With the rapid acceleration of AI-generated content, deepfake technology, growing online anonymity and increasing complexity in how we communicate online, the Online Safety Act may already be obsolete only months after coming into effect.

A major critique of the Act is that it fails to address the business model of big online media platforms, which mostly relies on driving user engagement. Harmful, distressing and even illegal content garners engagement and is rewarded with the same ‘eyeball’5 time as any other content. By targeting the removal of illegal and harmful content, the Act should reduce the spread of online harms, but it may fail to address the systemic models that allow harmful content to flourish online.

While many industry leaders have praised the Online Safety Act and the potential protection it offers to women and girls in online spaces, there are concerns that the Act doesn’t go far enough.

Laura Petrone, an analyst at GlobalData, said the issue is not with the bill, which classifies sexism as harmful, but rather with the definition of a hate crime in the UK. “According to the bill, online platforms are obliged to take down illegal content, which includes things like child sexual abuse, hate crime, promotion of self-harm, and revenge porn. Misogynistic content falls under the realm of harmful but legal content, as this is not classified as a hate crime.” Petrone goes on to argue, “Making misogyny a hate crime would make a big difference, as it would make harassment of women online automatically illegal.”6

“The Online Safety Act 2023 is an ambitious piece of legislation with a huge scope, we were very pleased to see women and girls included in the protections offered by the Act. Online abuse is widespread across user-to-user online services, online harassment, stalking, image-based abuse and harmful and illegal content impact women and girls on a huge scale. However, the online landscape is changing at a rapid rate. In the last two years AI technology has developed to the point where it is almost indistinguishable from real images. We welcome the Online Safety Act’s protections for users of online services and spaces, but it may already be obsolete in many areas of the online world.”
Lisa Durston, SARSAS Communications Manager

Toby Howells is a volunteer communications officer with SARSAS. Toby graduated from the University of York’s Law School in 2020, studying public and criminal law and now works for Bristol City Council.

Legal disclaimer: This article contains general legal information; the legal information is not advice or guidance and should not be treated as such. The information on this website is provided without any representations or warranties and is published exclusively to provide opinion and awareness.


  1.
  2. –-a-canter-through/#:~:text=The%20Aim%20of%20the%20Online%20Safety%20Act&text=It%20aims%20to%20prohibit%20providers,like%20child%20sexual%20abuse%20material
  3.
  4.
  5. ,over%20the%20content%20they%20see
  6.
  7.
