Britain’s media watchdog has launched an urgent investigation into Elon Musk’s social media platform X after its AI chatbot Grok was found to be generating non-consensual sexualised images of women and children, prompting warnings that the site could face enforcement action under the Online Safety Act.
Ofcom confirmed it has demanded an explanation from X and has begun an expedited assessment of whether the platform breached UK safety laws by allowing users to create and manipulate explicit images without consent. The regulator said it had set a deadline for the company to respond and would update the public within days.
Technology Secretary Liz Kendall said the sexual manipulation of images of women and children was “despicable and abhorrent” and backed Ofcom’s use of its full legal powers if X is found to be in breach. Downing Street said recent changes by the company, which now restrict AI image generation to paying subscribers, were “insulting” to victims of sexual abuse.
Grok Used to Create Explicit Images Without Consent
The BBC has verified multiple examples of Grok digitally “undressing” women and placing them in explicit sexual scenarios without permission. Conservative influencer Ashley St Clair, the mother of one of Elon Musk’s children, told BBC Newshour that the AI had produced sexualised images of her as a child despite her repeatedly stating she did not consent.
St Clair accused X of failing to act quickly enough to prevent the creation of illegal and harmful content, including child sexual abuse imagery, and said technical safeguards could be implemented immediately if the company chose to do so.
Political Pressure Mounts on Musk and X
Prime Minister Sir Keir Starmer described the use of AI to create sexualised images of children as “disgraceful” and “disgusting”. Reform UK leader Nigel Farage said the revelations were “horrible in every way” and called on X to take stronger action, while warning against a full ban on free speech grounds.
The Liberal Democrats urged the government to consider temporarily restricting access to X in the UK while Ofcom’s investigation continues.
Online Safety Act Could Lead to Platform Restrictions
Under the Online Safety Act, Ofcom has the authority to seek court orders that could limit X’s ability to operate or generate revenue in the UK if it fails to protect users from illegal and harmful content. The regulator said it is assessing whether X’s systems and safeguards are adequate to prevent the creation and spread of non-consensual sexual imagery, particularly imagery involving minors.
Elon Musk responded to criticism by accusing opponents of using the controversy as “an excuse for censorship”, as X defended its decision to limit advanced image generation tools to paid subscribers.
Wider UK Crackdown on AI and Online Abuse
The investigation comes as the UK government increases scrutiny of AI tools capable of creating deepfakes and synthetic sexual imagery. Ministers have pledged tougher enforcement against platforms that fail to remove child abuse material and non-consensual intimate images, with new offences and penalties already being introduced across England and Wales.
