Advanced Television

Ofcom publishes draft guidance on transparency reporting

July 26, 2024

The user-safety practices of tech firms will be put under the spotlight by draft transparency reporting plans announced by Ofcom.

Under the Online Safety Act, ‘categorised services’, which include some of the most widely used social media and search services, are required to produce transparency reports at least annually. This means that, for the first time in the UK, big tech must publish detailed information about the safety of their platforms for all to see.

Ofcom is now consulting on draft industry guidance setting out its proposed approach to issuing ‘transparency notices’ to applicable online services. These notices will be issued by the media regulator, starting in 2025, subject to secondary legislation being passed. They’ll set out the detailed safety information that providers must reveal in their transparency reports, the format it should take, and the deadline by which they must make it public.

The information that Ofcom requires companies to publish will differ from platform to platform, taking account of the type of service, its number of users, the proportion who are children, along with certain other factors. Data Ofcom could compel companies to disclose include: how prevalent illegal content is on their service, how many users have come across such content, and the effectiveness of features used by a platform to protect children.

Ofcom will also shine a light on the best and worst practices across the industry through its own summary reports. These will help the regulator to drive better safety outcomes for users in two ways. First, people will be able to judge whether firms are doing enough to make their platforms safe and how different services compare. This will help them to make informed decisions about which apps and sites they are happy for themselves and their children to use.

Second, by revealing what goes on within popular sites and apps, Ofcom ultimately hopes to encourage firms to go even further to improve their safety standards.

Information is power

Complementing its transparency reporting powers, the Online Safety Act also gives Ofcom wide-ranging powers to access information held by regulated tech firms, as well as by a range of third parties. These powers will help Ofcom to understand the effectiveness of the safety measures tech firms have in place and to gather evidence if it has specific compliance concerns.

Ofcom has now published a second consultation, which sets out its draft guidance for industry on the regulator’s general approach to its online safety information gathering powers and on firms’ duties to comply. It covers the range of circumstances in which Ofcom might use these powers, including, for example, to:

  • carry out an audit of a tech firm’s safety measures or features;
  • remotely inspect the workings of their algorithms in real time;
  • obtain information to allow Ofcom to respond to a Coroner’s request in the event of the death of a child; and
  • in exceptional cases, enter UK premises of tech companies to access information and examine equipment.

Firms can face enforcement action or, in the most serious cases, criminal liability for failure to respond to information notices in an accurate, complete and timely way.

Next steps

Failure to comply with either a transparency or information notice from Ofcom could result in tech companies facing fines of up to £18 million or 10 per cent of a company’s worldwide revenue – whichever is higher.

Responses to both the transparency and information gathering guidance consultations must be submitted to Ofcom by October 4th 2024. Subject to responses, Ofcom expects to publish its final guidance by early 2025.

Responding to the news, Andy Lulham, COO of Verifymy, commented: “Ofcom’s proposed transparency reporting guidance is an important step towards holding firms to account on online safety and driving much needed change. Without this kind of information, it has been all too easy for platforms to just claim that their platforms are safe without the hard evidence to back this up. When enacted next year, these audits will put social media and other internet platforms in the spotlight, essentially creating a watchlist of the worst offenders. The exact requirements are to be defined through this consultation phase, but firms’ approaches to tackling harmful content will undoubtedly be very high on this list.”

“In Verifymy’s recent research, three quarters of parents told us their kids had experienced harm online, including coming across inappropriate, violent, and even illegal content. This is clearly an unacceptable situation, so it’s encouraging to see Ofcom starting to get serious in tackling the issue. Making the internet a truly safe place for kids will require significant effort and will from everyone involved. For Big Tech this means enforcing clear policies and enhancing processes, teams and technology to stop harmful content ever making it to their platforms. But, you can’t change what you can’t see, so Ofcom’s transparency reporting will be a vital first step in holding platforms to account and driving the change parents so desperately want to see,” he concluded.
