Spotify has quietly rolled out one of the most stringent age-verification processes to date in the UK, forcing users to prove they’re over 18 before they can view certain music videos labelled as adult. This move, which went live shortly after the Online Safety Act took effect on July 25, has left many long-time Spotify subscribers scratching their heads – and, in some cases, scrambling for passports.
When a UK user attempts to play a music video flagged “18+ by rights holders,” Spotify first asks for camera access. The user snaps a quick selfie, which Yoti’s facial age-estimation software analyses to judge whether they appear to be over 18. If that scan can’t conclusively confirm your age, the app then prompts you to upload a government-issued ID – think passport or driving licence.
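For readers curious about the mechanics, the flow described above is essentially a two-step fallback: try a selfie-based age estimate first, and only escalate to a document check if the estimate is inconclusive. Here is a minimal, purely illustrative sketch of that logic – the names and types are hypothetical and are not Spotify’s or Yoti’s actual APIs:

```typescript
// Hypothetical sketch of the two-step age gate described above.
// None of these names correspond to Spotify's or Yoti's real APIs.

type AgeCheckResult = "confirmed_adult" | "inconclusive" | "underage";

interface AgeVerifier {
  // Facial age estimation from a selfie (step one).
  estimateFromSelfie(selfie: Blob): Promise<AgeCheckResult>;
  // Document verification, e.g. passport or driving licence (step two).
  verifyFromDocument(idScan: Blob): Promise<AgeCheckResult>;
}

async function canPlayRestrictedVideo(
  verifier: AgeVerifier,
  getSelfie: () => Promise<Blob>,
  getIdScan: () => Promise<Blob>
): Promise<boolean> {
  // Step 1: selfie-based age estimation.
  const selfieResult = await verifier.estimateFromSelfie(await getSelfie());
  if (selfieResult === "confirmed_adult") return true;
  if (selfieResult === "underage") return false;

  // Step 2: the estimate was inconclusive, so fall back to an ID check.
  const idResult = await verifier.verifyFromDocument(await getIdScan());
  return idResult === "confirmed_adult";
}
```

The point of the fallback design is that most adults clear the quick selfie check, and only edge cases are asked to hand over a document.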
Fail both checks, and you’re effectively locked out. Spotify warns that anyone who doesn’t pass an ID check within 90 days will have their account deactivated and, ultimately, deleted. “You cannot use Spotify if you don’t meet the minimum age requirements for the market you’re in. If you cannot confirm you’re old enough to use Spotify, your account will be deactivated and eventually deleted,” the company states.
The timing isn’t a coincidence. On July 25, the child-safety duties of the UK’s landmark Online Safety Act 2023 took effect, imposing a new duty of care on platforms to protect children from harmful content – including anything deemed adult, from pornography to extreme violence. Under the Act, Ofcom can levy fines of up to 10% of a company’s global turnover (or £18 million, whichever is higher) for non-compliance.
Although the spotlight has mainly shone on pornographic websites – which Ofcom began probing last week to ensure “highly effective” age-verification systems are in place – social platforms and streaming services like Spotify quietly fall under the same umbrella. The goal: to prevent under-18s from inadvertently accessing content that rights holders consider unsuitable.
Spotify isn’t alone. Over the past fortnight, Reddit introduced age-check prompts for NSFW communities, X (formerly Twitter) added extra verification steps before showing adult media, and Discord began flagging mature servers behind a gate. Even smaller niche forums have bolted on similar pop-ups. All told, more than 6,000 adult sites have signalled compliance, but non-compliant services face block orders or hefty fines.
Despite the shared regulatory impetus, Spotify’s approach stands out for its biometric twist. While many platforms accept a simple “enter your birth date” box, Spotify leaps straight to Yoti’s facial-scanning tech – a measure that has raised eyebrows among privacy advocates.
Digital-rights groups warn that any sort of biometric scan poses inherent risks. Though Spotify insists that all facial data is encrypted and purged after verification, critics point out that once photos are in the system, the potential for mission-creep or data breaches remains. The UK’s Information Commissioner’s Office (ICO) has previously cautioned against over-collecting sensitive personal data, urging firms to adopt the least intrusive methods possible.
A petition demanding repeal of the age-verification provision has already amassed over 400,000 signatures, with signatories arguing the measures are disproportionate for a music-streaming service. Some users have taken to social media, vowing to “quit Spotify forever” rather than hand over a selfie.
In its public communications, Spotify underscores that the new checks only target explicit music videos – not audio tracks – and that the minimum age to hold a Spotify account in the UK remains 13. The company also reminds users that declining verification simply means losing access to that tiny slice of 18+ content; your curated playlists, podcasts, and vast library of tracks remain untouched.
Behind the scenes, Spotify views this as a compliance exercise, not a bid to roll out biometrics across every corner of its service. A spokesperson told 404 Media that “biometric data is encrypted, deleted immediately after verification, and only used to ensure our platform remains safe and age-appropriate” – dovetailing with Yoti’s own privacy-first policies.
The UK’s Online Safety Act is just the beginning. Ofcom is gearing up to enforce age checks on categories beyond adult videos – potentially including content related to self-harm, eating disorders, and violent extremism. Many expect further rule-making later this year to flesh out those requirements. For Spotify, that could mean deeper scrutiny or expanded verification steps down the line.
For now, the message is clear: if you live in the UK and care about uninterrupted access to every remix, live concert clip, or uncensored music video, you’d better have your ID – and your selfie game – ready. Yet the broader debate raises thorny questions about the price of digital safeguards. As UK platforms navigate these new legal shoals, both businesses and users will be watching closely to see whether these age gates truly keep minors safe – or simply drive everyone to VPNs and rival services.