After years of negotiation, the UK’s Online Safety Act has finally passed, and the regulatory body Ofcom has now outlined its proposed guidelines for how tech companies should comply with the legislation. This is the first set of guidelines in a multi-phase publication process that will culminate in approval by the UK Parliament next year. Officially, the guidelines are proposals, and Ofcom is seeking feedback from stakeholders before finalizing them. Even once approved, following the specifics of the guidelines will be voluntary. While this may seem lenient, tech companies will still be held to the act’s overarching rules and may face penalties if they fail to demonstrate compliance.
Ofcom’s online safety lead, Gill Whitehead, explained that the guidelines aim to hold tech firms accountable by placing a duty of care on them to keep their users safe. For example, platforms will be required to remove illegal content, such as child sexual abuse material, terrorism content, and fraud, as soon as they become aware of it. They will also have to conduct risk assessments to understand and mitigate the specific risks their services carry. The intention is to prompt tech firms to address illegal content proactively rather than simply reacting to it after the fact, tackling the problem at its roots.
Ofcom estimates that approximately 100,000 services may fall under the broad rules of the Online Safety Act, although only the largest and highest-risk platforms will be subject to the strictest requirements. For these platforms, the guidelines outline measures such as not allowing strangers to send direct messages to children, using hash matching to detect and remove child sexual abuse material, maintaining content and search moderation teams, and offering users ways to report harmful content. While many large tech platforms already follow similar practices, Ofcom hopes that the guidelines will encourage more consistent implementation across the industry.
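Hash matching, in broad terms, works by comparing a fingerprint of each uploaded file against a database of fingerprints of already-identified abusive images. The sketch below is a minimal illustration of the lookup step, with a hypothetical hash set standing in for an industry database; production systems use perceptual hashes such as PhotoDNA, which still match resized or re-encoded copies, whereas the cryptographic hash shown here only catches exact byte-for-byte duplicates.

```python
import hashlib

def is_known_match(file_bytes: bytes, known_hashes: set[str]) -> bool:
    """Return True if the upload's fingerprint appears in the database.

    known_hashes is a hypothetical set of hex digests standing in for
    an industry hash list. A cryptographic hash (SHA-256) only catches
    exact byte-for-byte copies; real deployments use perceptual hashes
    (e.g. PhotoDNA) that also survive re-encoding and resizing.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in known_hashes

# Example: screen an incoming upload before it is published.
# (The hash below is the SHA-256 of the empty byte string, used as a stand-in.)
known = {"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"}
upload = b""  # placeholder payload
if is_known_match(upload, known):
    print("Upload matches a known hash: block it and report it.")
```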
However, the guidelines may prove challenging for certain platforms. Following Elon Musk’s acquisition of Twitter (now known as X), for instance, there have been concerns about the platform’s willingness to comply: Musk has publicly floated removing the block feature, which would run counter to Ofcom’s specific guidelines. This highlights the potential tension between the regulator and tech companies that have taken controversial approaches to content moderation.
The act’s duties around illegal content extend beyond child sexual abuse material, terrorism, and fraud to other offenses, such as content encouraging suicide, harassment, and exploitation. Ofcom’s guidelines also cover how sites should deal with these issues, providing a comprehensive framework for handling a wide range of harmful content online, with potentially severe penalties for those who breach the rules.
As Ofcom releases future updates, it will address more contentious subjects, including content that is legal but harmful to children, underage access to pornography, and protection measures for women and girls. Ofcom will also have to navigate delicate matters, such as interpreting a section of the act that critics argue could undermine end-to-end encryption in messaging apps: the act can require services to scan messages for illegal content, which encrypted messaging services and digital rights groups say is impossible without breaking or weakening encryption, with serious implications for user privacy.
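To see why critics frame scanning and end-to-end encryption as incompatible, consider the toy sketch below (using Python’s third-party `cryptography` package, with key exchange simplified away): the relaying server only ever handles ciphertext, so any content scanning would have to run on the user’s device or the encryption itself would have to be weakened.

```python
from cryptography.fernet import Fernet

# Toy model of end-to-end encryption: sender and recipient share a
# key; the relaying server never holds it. (Key exchange is
# simplified away here.)
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"hello")

# The server only ever relays this opaque token, so it has nothing
# readable to scan; detection would have to run on the user's device
# (client-side scanning) or the scheme would have to be weakened.
print(ciphertext)

# Only the key holder can recover the plaintext.
assert Fernet(key).decrypt(ciphertext) == b"hello"
```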
Furthermore, the act takes a technology-neutral approach: it applies to content regardless of how it was created, including content generated with artificial intelligence. This may pose particular challenges for smaller platforms, given the practical implications and the extra work required to ensure compliance.
The Wikimedia Foundation, the non-profit organization behind Wikipedia, has raised concerns over the growing difficulty of complying with different regulatory regimes around the world. While it supports the principle of regulation and acknowledges that platforms should carry responsibilities, it stresses that for non-profit organizations with limited resources and a complex legal landscape to navigate, compliance can be problematic.
As Ofcom forges ahead with implementing the Online Safety Act, the regulator aims to streamline the regulatory landscape and facilitate cooperation across countries. While the UK has made significant progress in regulating online safety, the practical implications and the global nature of the internet pose ongoing challenges. Given the complexity of the legislation and the diversity of industry players, the true test of the Online Safety Act may lie in its enforcement and its effectiveness in creating a safer online environment.