Advanced Television

Ofcom: “Heavy fines for defying Online Safety Act”

October 17, 2024

Ofcom has provided an update on its progress in implementing the Online Safety Act, and set out what to expect over the next 12 months.

The Online Safety Act was passed in October 2023. When fully in force, it will place new legal duties on platforms available in the UK. Before Ofcom can enforce these duties, it is required to consult publicly on codes of practice and guidance.

“In the space of six months, we consulted on our codes and guidance for illegal harms, pornography age verification and children’s safety, and submitted our advice to Government on the thresholds that would determine which services will be ‘categorised’ and subject to additional duties. We have also been speaking to many tech firms – including some of the largest platforms as well as smaller ones – about what they do now and what they will need to do next year,” Ofcom said in its update.

Ofcom added that it has already secured better protections from UK-based video-sharing platforms, including OnlyFans and other adult sites introducing age verification; BitChute improving its content moderation and user reporting; and Twitch introducing measures to stop children seeing harmful videos. Additionally, Meta and Snapchat have made changes that Ofcom proposed in its illegal harms consultation to protect children from grooming. These include changes on Instagram, Facebook and Snapchat to help prevent children being contacted by strangers, and Instagram's 'Teen Accounts', which limit who can contact teens and what they can see.

“These are positive steps, but many platforms will have to do far more when the Online Safety Act comes into force,” Ofcom advised.

Parliament has set a deadline of April 2025 for Ofcom to finalise its codes and guidance on illegal harms and children’s safety, but the media regulator expects to finalise its illegal harms codes before then. Ofcom laid out its tentative key milestones over the next year:

  • December 2024: Ofcom will publish first edition illegal harms codes and guidance. Platforms will have three months to complete their illegal harms risk assessments.
  • January 2025: Ofcom will finalise children’s access assessment guidance and guidance for pornography providers on age assurance. Platforms will have three months to assess whether their service is likely to be accessed by children.
  • February 2025: Ofcom will consult on best practice guidance on protecting women and girls online, earlier than previously planned.
  • March 2025: Platforms must complete their illegal harms risk assessments, and implement appropriate safety measures.
  • April 2025: Platforms must complete children’s access assessments. Ofcom to finalise children’s safety codes and guidance. Companies will have three months to complete their children’s risk assessments.
  • Spring 2025: Ofcom will consult on additional measures for second edition codes and guidance.
  • July 2025: Platforms must complete children’s risk assessments, and make sure they implement appropriate safety measures.

Ofcom advised that it has the power to take enforcement action against platforms that fail to comply with their new duties, including imposing significant fines where appropriate. In the most serious cases, Ofcom will be able to seek a court order to block access to a service in the UK, or limit its access to payment providers or advertisers.

Dame Melanie Dawes, Ofcom’s Chief Executive, commented: “The time for talk is over. From December, tech firms will be legally required to start taking action, meaning 2025 will be a pivotal year in creating a safer life online. We’ve already engaged constructively with some platforms and seen positive changes ahead of time, but our expectations are going to be high, and we’ll be coming down hard on those who fall short.”

Categories: Articles, Policy, Regulation, Social Media