Ofcom announced today that it has secured voluntary commitments from X, formerly known as Twitter, to start abiding by its legal duty to tackle the widespread hate crimes being committed on the platform.
Our experience of being targeted with sustained racist abuse on X is that the perpetrators of unlawful hate crimes openly boast that they can get away with it – because they believe Musk and his platform are on their side. The sad thing is just how often the racist trolls have been proved right over the last year.
It made sense for Ofcom to investigate X first on hate speech, given the deteriorating experience so many of its users have had under its current ownership. When this process began last December, the Chair wrote to MPs that “if we uncover significant compliance concerns, we would not hesitate to move to formal enforcement action”. Yet Ofcom has chosen not to do so, despite this investigation receiving a tsunami of compelling evidence from civil society that should put it beyond reasonable doubt that non-compliance is as much the norm on the platform as the exception.
Our own evidence of reporting hate crimes using the racist slur “Paki” showed that the X platform protects the perpetrator, not the victim, over 95% of the time. If that does not yet count as a significant compliance concern, it seems almost impossible to imagine what would.
So what is missing today as this investigation concludes is a simple statement of the undeniable fact: X is not yet upholding its legal duties in this country. Big changes are needed to its systems, practices and culture to bring it back within Britain’s laws. Ministers, MPs and others should ensure the regulator is open about what it has found.
Ofcom may have a sound practical case that securing voluntary commitments to act could bring about some changes more quickly than formal sanctions. But that is no barrier to recognising the scale of the problem, including its impact on those targeted with hatred.
The core message should be that the legal duties of platforms under the Online Safety Act are the law of this land. They are not an optional extra. So the regulator should not weaken that message by continuing to hesitate – apparently for fear that it could somehow lose impact or influence unless it asks nicely that the platform operate within the law.
We do welcome Ofcom’s assurances that it will rigorously scrutinise the promises made, rather than trusting X to mark its own homework. Ofcom also needs to use its powers more effectively to demand information – especially to get the data on just how often the platform impedes police requests to investigate hate crimes by refusing to provide user information.
X has made a commitment to remove unlawful content within 48 hours. This is useful too, making it possible to start assessing whether anything significant is changing within three to six days, rather than three to six months. That will reveal whether X’s promises do lead to the platform taking compliance with its legal obligations seriously. These pledges need to deliver significant and rapid change if X is to start protecting users who are the victims of hate crimes, rather than defending and protecting those who use it to target people with unlawful hatred and abuse.
There must be no let-up in the urgency to ensure we make social media lawful again. X needs to stop actively protecting the racist trolls who make the site a hostile environment for ethnic minorities, women and so many other people. This regulated platform has arguably become the most effective catalyst for the rising fear of hate crime that is viscerally felt across just about every minority group in our society today. It is vital that those with the legal powers to bring the platform back within the law do not hesitate to act on that responsibility.
Sunder Katwala is Director of British Future; Avaes Mohammad is Programme Manager for the British South Asian Bridgers Project at British Future.
Image: Alicia Christin Gerald on Unsplash



