Ofcom tested as Elon Musk's X accused of law breaches on 'racism and antisemitism'

The social media platform, formerly called Twitter, is the subject of a super-complaint lodged with Ofcom by the Good Law Project, which alleges that posts which are “illegal because they incite racial or religious hatred” have been left untouched by moderators.

A 100-page annexe published by the legal outfit includes screenshots of posts featuring extreme racism and antisemitism, their URLs, the dates they were highlighted to X moderators, and the action taken.

WARNING: EXPLICIT RACISM

Two of the posts featured in the 100-page annexe of examples of racism and antisemitism on X submitted to Ofcom (Image: Good Law Project)

Posts which have been ignored by moderators or ruled acceptable include claims that antisemitism “is the solution to societal collapse” and that the world should “finish what [Adolf Hitler] started”, that African people are “worthless primates” who are “not human” and should “go off back to the dark continent”, and calls to “get rid of all the n****** and sh**skins”.

In a legal letter to Ofcom, leading media lawyer Brett Wilson, representing the Good Law Project, alleges that “X is failing to comply with its duties under the Online Safety Act 2023 [OSA]”.

Good Law Project said: “The UK’s Online Safety Act requires X to remove illegal content ‘swiftly’. But of the posts we reported to X – posts we believe were unlawful – only 8% were removed.”

Graph showing the rate at which X moderators ignored or did not act to remove racist posts on the platform (Image: Good Law Project)

On Friday, the European Commission fined X €120 million after ruling that its use of blue check marks for paid subscribers – and not verified users, as had previously been the case – was “deceptive”, and that the platform’s advertising library did not provide public data as required by law.

Other parts of the EU’s probe into Musk’s X, which examine the platform’s efforts to counter illegal content and disinformation, are ongoing. However, getting X to pay up may prove a bigger challenge.
US vice-president JD Vance defended the platform, saying the EU “should be supporting free speech, not attacking American companies over garbage”. “Much appreciated,” Musk wrote in response.

In the UK, Ofcom has tried to fine a porn company more than £1 million for failing to comply with the Online Safety Act, but the firm has simply ignored it.

At the same time, the media regulator announced that an unnamed “major social media company” was going through compliance remediation, and that there may be formal action if there is not sufficient improvement.

Digital policy specialist Heather Burns told the Sunday National there is a “good chance of some sort of action from Ofcom on account of the Good Law Project's complaint being a test of the OSA's super-complaints regime, where organisations are able to issue a form of class action complaint on behalf of many users of a service, as opposed to individuals being left on their own”.

She went on: “That being said, the Good Law Project knows that this super-complaint is performative, because they know that X is not going to respond as a good faith actor to any questioning from any area of the UK Government.”

Burns pointed to an “astonishing bit of testimony” given in November by X’s head of global government affairs, Deanna Romina Khananisho, at the Southport Public Inquiry, which is investigating the circumstances around murderer Axel Rudakubana killing three young girls at a dance class in 2024.

Elon Musk's X defended leaving a video of a stabbing on the platform on religious grounds (Image: Apu Gomes/Getty Images)

Asked about a video of a stabbing on X, which Rudakubana watched just minutes before going on the killing spree, Khananisho said it would be “tyrannical overreach” to remove it because, in the video, she “saw an angel protecting Mar Mari”, the bishop who was attacked.

Burns said: “For better or for worse, X is going to be a test case.
“From the earliest days of the draft Online Safety Act, my concern has been that regulators end up litigating the entire open internet around the bad actions of a small handful of US big tech platforms, with X now being at the top of that list.

“The real risk for all of us here, even those of us who are no longer using X, is that everyone gets punished, and all of our rights to freedom of expression and privacy are curtailed because of this one bad faith actor.”

An Ofcom spokesperson said: “X, like other major platforms, is currently subject to a period of intense oversight regarding compliance with its online safety duties. Should this process result in any enforcement action, we would announce this publicly.”

X was approached for comment.
