The UK’s communication services regulator, Ofcom, has set a deadline for social media platforms and other online services to bolster user safety or face substantial penalties. With the introduction of over 40 stringent safety measures under the newly enacted Online Safety Act, companies have until mid-March 2025 to comply. This move marks a significant shift towards ensuring safer digital environments for both children and adults in the UK.
The new measures
Ofcom’s new directives are aimed at combating various online threats, including fraud, child sexual abuse material (CSAM), and the spread of illegal content. Here’s what companies are expected to do:
- Senior accountability: Companies must designate a senior manager responsible for ensuring compliance with safety duties, including handling complaints and reporting mechanisms.
- Content moderation: Moderation teams will be expected to be adequately trained and resourced so that illegal content is removed swiftly, reducing the risk of harm or exposure to inappropriate material.
- Algorithmic adjustments: Social media algorithms must be modified to minimize the promotion and visibility of harmful content. This includes content that could encourage self-harm, eating disorders, or extremism.
- Child protection: Specific measures are designed to protect minors online. For instance, children’s profiles should not display location data, direct messaging from strangers to children should be restricted, and technologies such as hash-matching and URL detection should be employed to identify and remove CSAM rapidly (see the sketch after this list).
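Hash-matching works by comparing a fingerprint (hash) of an uploaded file against a database of hashes of known illegal material. The following is a minimal, illustrative Python sketch of that general idea using a plain SHA-256 digest; it is not Ofcom’s or any platform’s actual implementation, the hash set and function names are hypothetical, and real deployments typically rely on perceptual hashing (such as PhotoDNA) and curated industry hash lists rather than exact cryptographic digests.

```python
import hashlib

# Hypothetical set of SHA-256 digests of known prohibited files.
# The single entry below is the SHA-256 of empty input, used purely
# as a placeholder so the demo at the bottom is verifiable.
KNOWN_PROHIBITED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of_bytes(data: bytes) -> str:
    """Return the hex SHA-256 digest of an uploaded file's contents."""
    return hashlib.sha256(data).hexdigest()

def should_block_upload(data: bytes) -> bool:
    """True if the upload's hash matches a known prohibited hash."""
    return sha256_of_bytes(data) in KNOWN_PROHIBITED_HASHES

if __name__ == "__main__":
    # An empty upload matches the placeholder hash above, so this
    # prints True; any other content would print False.
    print(should_block_upload(b""))
```

Note that an exact cryptographic hash only catches byte-for-byte copies of known material; perceptual hashes are preferred in practice because they still match after re-encoding, resizing, or minor edits.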
Ofcom’s approach
In developing these codes, Ofcom has taken a collaborative approach, consulting with the tech industry, charities, parents, and even children themselves. The feedback has been instrumental in shaping these regulations. “As an evidence-based regulator, every response has been carefully considered, alongside cutting-edge research and analysis, and we have strengthened some areas of the codes since our initial consultation,” Ofcom noted in its official statement. The focus has been on creating measures that, although not currently in widespread use, could significantly enhance online safety.
Wide scope of the act
The Online Safety Act isn’t limited to tech giants; it extends to all sizes of businesses, from large corporations to small “micro-businesses,” and even to individuals operating online services. The applicability depends on having a “significant number” of UK users or if the UK is a targeted market for the service. This includes user-to-user services like social media, online gaming, and dating platforms, as well as search services and sites hosting pornographic content.
Enforcement and penalties
Non-compliance with these regulations could lead to severe repercussions. Ofcom is empowered to impose fines up to £18 million or 10% of a company’s global annual turnover, whichever is greater. In cases deemed extremely serious, Ofcom can even seek judicial intervention to block a company’s access to the UK market.
Looking forward
Ofcom plans to release further guidance in the first half of 2025, continuing to refine and possibly expand upon the current framework to adapt to emerging online threats and technologies. This ongoing evolution suggests a dynamic approach to regulation, aiming to keep pace with the ever-changing digital landscape.
This initiative by Ofcom not only underscores the UK’s commitment to digital safety but also sets a precedent for other nations grappling with how to regulate the vast and complex world of online interactions. With these new regulations, the digital age in the UK aims to become safer, more accountable, and more protective of its youngest and most vulnerable users.