
When did “keeping children safe” become shorthand for dismantling the last vestiges of private communication?
The question demands an answer this week as the Starmer government threatens to ban Elon Musk’s X platform over its Grok AI tool creating sexualised images of minors. The Prime Minister’s language was emphatic: “This is disgraceful. It’s disgusting. And it’s not to be tolerated.” Few would disagree. But beneath the moral clarity lies something more troubling: a government exploiting genuine horror to construct surveillance infrastructure that would make the Stasi proud.
The immediate crisis involves Grok’s image manipulation features, which users exploited to create non-consensual sexual images, including those depicting children aged 11 to 13. The Internet Watch Foundation confirmed these images appeared on dark web forums. X’s response, restricting the feature to paid subscribers, was rightly called “insulting to victims” by Downing Street. No argument there.
But the government’s response reveals an agenda that predates this scandal by years. Starmer instructed Ofcom, Britain’s communications regulator, to consider “all options,” including what would effectively constitute a platform ban under the Online Safety Act.
That legislation, passed in October 2023 and enforced since March 2025, grants Ofcom extraordinary powers under Section 121: the authority to compel any messaging service to install “accredited technology” scanning for terrorism or child sexual abuse material.
The mechanism is client-side scanning. Every message gets examined on your device before encryption occurs. Not suspicious messages. Not flagged accounts. Every single message, from every single user, scanned before the encryption that supposedly protects your privacy even engages.
This is not a bug. This is the design.
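For readers who want the mechanism spelled out: below is a minimal Python sketch of the client-side scanning pattern, with every function name and blocklist entry hypothetical. Real deployments (Apple’s shelved NeuralHash proposal, Microsoft’s PhotoDNA) use perceptual hashes rather than the exact SHA-256 match used here for simplicity, but the order of operations is the same: scan the plaintext first, encrypt second.

```python
# A minimal sketch, NOT any vendor's actual code. Real deployments use
# perceptual hashes (PhotoDNA, NeuralHash) so near-duplicates match;
# SHA-256 stands in here for simplicity. Every name is hypothetical.
import hashlib

# Fingerprint database shipped to the device (placeholder entry).
BLOCKLIST = {"hex-fingerprint-of-known-material"}

def fingerprint(plaintext: bytes) -> str:
    """Fingerprint the message BEFORE any encryption takes place."""
    return hashlib.sha256(plaintext).hexdigest()

def send_message(plaintext: bytes) -> bytes:
    # Step 1: every outgoing message is scanned on-device, in plaintext.
    if fingerprint(plaintext) in BLOCKLIST:
        report_match()  # hypothetical reporting hook
    # Step 2: only after the scan does end-to-end encryption engage.
    return encrypt(plaintext)

def report_match() -> None:
    # In a real system this would notify the platform or authorities.
    print("flagged: match found before encryption")

def encrypt(plaintext: bytes) -> bytes:
    # Stand-in for a real E2EE library call (e.g. the Signal protocol).
    return plaintext[::-1]

send_message(b"hello")  # scanned first, then "encrypted"
```

Note that the encryption itself is untouched. The scan simply runs where encryption cannot protect you, which is why civil libertarians call it infrastructure: widening its remit is a database update pushed to every device, not a visible policy change.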
The Infrastructure of Surveillance

The Online Safety Act is not operating in isolation. Whilst the public fixates on each manufactured crisis, a comprehensive surveillance architecture is being assembled, piece by piece, contract by contract. Consider what has emerged in the last eighteen months alone.
In November 2023, the NHS awarded Palantir Technologies a seven-year, £330 million contract to run the Federated Data Platform, gathering sensitive patient data from up to 240 NHS trusts and integrated care systems. Palantir is not a neutral tech firm. It is a defence-intelligence contractor built in the orbit of the CIA, embedded in modern warfare, policing, border control, and population surveillance. The company was co-founded in 2003 by Peter Thiel and received seed funding from In-Q-Tel, the CIA’s venture capital arm. Its tools are designed to map, predict, target, and control. They were never designed to care.
Yet by May 2025, only 72 NHS trusts (less than a third of the total) were actually using Palantir’s platform, with many reporting it represented “a step backwards” from existing systems. Leeds Teaching Hospitals stated: “From the descriptions we have of these FDP products we believe we would lose functionality rather than gain it by adopting them.” Greater Manchester’s Chief Intelligence and Analytics Officer warned adoption “may represent a time-consuming and possibly retrograde step” (Corporate Watch), whilst noting the decision “has caused some public consternation in relation to data security.”
The government’s response to this resistance? Award consultancy giant KPMG an £8 million contract to “promote the adoption” of Palantir’s software across the NHS. When your product is so poor that trusts refuse to use it, you don’t improve the product. You hire consultants to overcome their objections. This is not healthcare technology. This is corporate colonisation of public infrastructure.
Whilst NHS workers were raising alarms, journalists who dared investigate found themselves running into walls of silence. Not debate. Not rebuttal. Censorship. Exactly as documented in Labour Heartlands’ coverage of the Bilderberg Meeting 2025, where power brokers met behind closed doors whilst public discussion of Palantir, data sovereignty, and democratic oversight was quietly sidelined.
The Bilderberg gathering, held from 12 to 15 June in Stockholm, assembled 121 of the world’s most powerful individuals under strict secrecy. The published agenda included “AI, Deterrence and National Security” and, chillingly, “Depopulation and Migration.” Among the attendees was Albert Bourla, CEO of Pfizer, a man whose company profited handsomely from a global pandemic, now seemingly intent on mining health data for “AI-driven drug personalisation” (Labour Heartlands). Starmer himself travelled to Washington in February to meet with both Donald Trump and Alex Karp, Palantir’s CEO, at the company’s headquarters on Thomas Jefferson Street (New Statesman), where he enthused about Britain’s embrace of AI without “over-regulation.”
The pattern is clear. Behind closed doors, at gatherings like Bilderberg, where no minutes are taken and no accountability exists, our futures are being mapped by unelected technocrats and corporate interests. What emerges are contracts like Palantir’s: secretive, expensive, resisted by experts, yet pushed through regardless. The NHS published Palantir’s heavily redacted contract on the last working day before Christmas 2023 (The Register), with “most of the section describing ‘protection of personal data’” blacked out. This is governance by obfuscation.
And whilst Palantir embeds itself in our healthcare system and the Online Safety Act constructs scanning infrastructure for our messages, the government announced in September 2025 a mandatory Digital ID scheme. The Prime Minister called it “an enormous opportunity for the UK,” claiming it would “cut the faff” whilst combating illegal migration. The scheme will be mandatory for right-to-work checks, stored on people’s phones via a GOV.UK Wallet, and will include name, date of birth, nationality, and photo (House of Commons Library).
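No technical schema for the credential has been published, so purely as illustration, here is a hypothetical Python sketch of the record the scheme describes; every field name below is an assumption drawn only from the attributes listed above.

```python
# Hypothetical sketch only: the government has published no schema, so
# these field names are assumptions based on the announced attributes.
from dataclasses import dataclass

@dataclass
class DigitalIdCredential:
    name: str
    date_of_birth: str   # e.g. "1990-01-31" (ISO 8601, assumed format)
    nationality: str
    photo: bytes         # the photo announced as part of the credential

# Held in the citizen's GOV.UK Wallet app and presented for right-to-work
# checks, the one use the government has confirmed will be mandatory.
```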
The estimated cost? £1.8 billion over three years, with the Office for Budget Responsibility noting “there has been no specific funding identified for the scheme.” Opposition is nearly universal. A parliamentary petition against mandatory digital ID has gathered over 2.9 million signatures, making it one of the largest in parliamentary history. Every major opposition party, from the Conservatives to the Liberal Democrats to Reform UK, opposes it. Scottish First Minister John Swinney and Northern Ireland First Minister Michelle O’Neill have denounced the scheme as incompatible with their nations’ identities and the Good Friday Agreement.
The government’s response to this democratic outcry? Press ahead regardless. The consultation keeps getting delayed. Details remain vague. But the infrastructure marches forward.
Do you see it yet? The Online Safety Act to scan your messages. Palantir’s system to analyse your medical data. Digital IDs to track your movements, employment, and entitlements. Separately, each might be defended as pragmatic modernisation. Together, they constitute the architecture of a surveillance state.
The Historical Parallel: When Security Required Total Visibility
Britain has form here. In 1844, Giuseppe Mazzini, the Italian revolutionary exiled in London, discovered that British authorities had been opening his mail. The scandal rocked Victorian society. Parliament held inquiries. The Times thundered about state overreach. The practice continued, but the controversy established a principle: in a democracy, surveillance requires justification, oversight, and public scrutiny.
Fast forward 182 years. The Online Safety Act, the Palantir contract, and the Digital ID scheme collectively reverse that principle entirely. Now the assumption is total visibility: privacy becomes the exception that must be justified, rather than the default whose violation demands justification. One might marvel at how thoroughly we have inverted the democratic presumption without even noticing.
The Act’s defenders argue necessity. Ofcom’s draft codes, enforceable since 17 March 2025, require platforms accessible to children to implement “highly effective age assurance,” filter harmful content from children’s feeds, and establish swift content moderation processes. Failure carries fines up to £18 million or 10% of global revenue, whichever proves greater.
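To make that penalty scale concrete, the rule reduces to a one-line maximum; a trivial Python sketch, with a hypothetical revenue figure:

```python
# The Act's maximum penalty as described above: the greater of £18 million
# or 10% of qualifying worldwide revenue. The example revenue is hypothetical.
def max_fine(global_revenue_gbp: float) -> float:
    return max(18_000_000, 0.10 * global_revenue_gbp)

print(f"£{max_fine(2_500_000_000):,.0f}")  # a £2.5bn firm faces up to £250,000,000
```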
By July 2025, age verification went live across pornography sites, social media networks, dating apps, and gaming platforms. Methods vary: facial age estimation, open banking, photo ID matching, credit card checks. VPN downloads surged. A petition demanding the Act’s repeal gathered over 500,000 signatures. Parliamentary debate followed in December 2025, changing precisely nothing. Democracy performed its theatre whilst the machinery ground forward regardless.
Throughout, the government maintained its target was harmful content, not encryption. Yet the Act contains no protection for end-to-end encryption. Section 121 exists precisely to circumvent it. Ofcom’s final guidance on “accredited technologies” is scheduled for spring 2026. No such technology currently exists that can scan encrypted content without breaking the encryption. The government admits this. It plans to proceed anyway.
Think about that for a moment. They know it cannot work as advertised. They are implementing it regardless.
The Counterargument: Surely Child Protection Justifies Extraordinary Measures?
Let us engage this argument seriously, because dismissing it would be both morally wrong and intellectually lazy. Children deserve protection. Online child sexual abuse material is genuinely horrific. Technology companies have historically done too little to combat it. These are not controversial claims.
But three problems demolish the case for client-side scanning.
First, the infrastructure problem. Once you build a system scanning for one category of content, that system exists. Future governments expand its remit. Today: child abuse. Tomorrow: terrorism. Next week: “misinformation.” Within a year: political dissent. The Chinese government uses identical logic (and identical technology) to monitor WeChat messages for threats to “social stability.” Britain is implementing the same architecture whilst condemning Beijing’s authoritarianism. The irony appears lost on Whitehall.
Palantir’s NHS contract and Digital IDs fit this pattern. Palantir’s AI model was used to sift through submissions for the UK’s Strategic Defence Review in June 2025 (Corporate Watch), after which the chancellor announced 10% of the Ministry of Defence budget would go to new technologies. The company already holds a £75 million data processing contract with the MoD, awarded in December 2021 (Corporate Watch). A firm built for military intelligence, now embedded in our healthcare, defence policy, and increasingly our public services. Digital IDs, meanwhile, create the perfect tracking mechanism. Today it’s employment checks. Tomorrow it’s benefit eligibility. Next year it’s movement restrictions during the next “emergency.”
“Those who would give up Essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.”
Benjamin Franklin
Second, the effectiveness problem. Criminals will migrate to unregulated platforms, peer-to-peer encryption, or simply meet offline. Law-abiding citizens lose privacy whilst predators adapt. The net effect: weakened security for everyone, marginal impact on the targeted harm. Multiple cryptography experts, including from GCHQ itself, have detailed why client-side scanning cannot work as advertised without introducing catastrophic security vulnerabilities. The government heard this evidence. It was unpersuaded.
Third, the precedent problem. Britain’s approach is being studied by authoritarian regimes worldwide. The European Commission’s “Chat Control” proposal goes even further, though it faces growing opposition. When democratic nations normalise mass surveillance infrastructure, they hand despots a blueprint and strip away moral authority to criticise similar measures elsewhere. We become the model for the very regimes we claim to oppose.
The Current Crisis: X, Grok, and Opportunistic Expansion
The Grok scandal provides perfect cover for this expansion. Starmer’s threat to ban X unless it complies with the Online Safety Act sounds reasonable in isolation. Few would defend AI-generated child abuse imagery. But examine the mechanism.
Under the Act, Ofcom can order payment providers, advertisers, and internet service providers to stop working with non-compliant platforms, effectively banning them without formally banning them. This requires court approval but sets no meaningful standard beyond Ofcom’s determination of “necessity and proportionality.” The regulator becomes judge and jury, prosecutor and legislator, all wrapped in the comforting language of “child safety.”
X responded by restricting Grok’s image manipulation feature to paid subscribers. Reports suggest these restrictions are easily circumvented. The government called this inadequate. Perhaps it is. But the alternative on offer is not better platform moderation. It is comprehensive infrastructure for monitoring every private communication in Britain, analysing every medical record through Palantir’s algorithms, and tracking every citizen’s movements and status through Digital IDs.
Consider the international response. Republican Congresswoman Anna Paulina Luna threatened to sanction both Starmer personally and Britain as a whole if the platform is banned, explicitly comparing the action to previous US responses to Brazilian and European restrictions on X. Elon Musk shared her post and called the UK government “fascist,” asking why Britain arrests more people for online posts than any comparable democracy.
This is not to defend Musk’s erratic leadership or X’s content moderation failures. It is to observe that the government’s response to a genuine problem is to demand powers that would fundamentally alter the relationship between citizen and state. And they wonder why people notice.
Accountability Without Surveillance
Real child protection requires resources: police investigators, social workers, mental health services, education. These cost money. They require political will. They demand sustained attention beyond immediate scandal. In short, they require governance rather than gesture.
Client-side scanning, Palantir contracts, and Digital IDs offer something more attractive to any government: the illusion of a technological solution to social problems. They promise to “do something” whilst avoiding the hard work of actually addressing harm’s root causes. It is security theatre elevated to national infrastructure.
The Starmer government inherited the Online Safety Act from the Tories. Its enthusiasm for using it, its embrace of Palantir, its push for Digital IDs, all reveal continuity where there should be rupture. Labour once understood that state power required democratic constraint. It once opposed mass surveillance on principle. Those principles have evaporated, replaced by the same authoritarian reflex that characterises neoliberal governance: control the information environment, own the data infrastructure, track the population, and you control everything.
What should concern us is not any single government’s possession of these powers. It is that the powers exist at all. Section 121 of the Online Safety Act, Palantir’s access to NHS data, and Digital ID tracking grant future governments (governments we cannot predict and may actively oppose) the technical infrastructure for monitoring and controlling every aspect of British life. That infrastructure, once built, will be used. The only question is for what purposes. History suggests the answer will not comfort civil libertarians.

The path forward requires three commitments.
First, repeal Section 121. If client-side scanning is necessary, the government should make that case explicitly, in primary legislation, with full parliamentary scrutiny. Hiding it in regulatory frameworks under Ofcom’s discretion is cowardice dressed as pragmatism.
Second, terminate the Palantir contract and invest in actual child protection: trained investigators, proper funding for support services, international cooperation on criminal networks. Build NHS data systems in-house with democratic oversight, not through CIA-linked defence contractors. This requires resources, not just surveillance. It requires governance, not gadgetry.
Third, abandon the Digital ID scheme and establish meaningful oversight for any future surveillance expansion. If the state insists on such powers, they must come with robust parliamentary scrutiny, independent oversight, and automatic sunset clauses requiring periodic renewal. No blank cheques. No permanent powers. No trust without verification.
Labour MPs should be leading this fight, not enabling it. Instead, they sit silent whilst their government constructs infrastructure that any future Conservative administration (or a Reform UK government, God forbid) could weaponise against working-class movements, trade unions, and political dissent. They imagine they are building safety. They are building chains.
The Online Safety Act passed with bipartisan support, minimal opposition, and inadequate scrutiny. Its enforcement reveals what was always intended: not child protection, but control. The Palantir contract was signed behind closed doors whilst elites gathered at Bilderberg to discuss our futures in secret. The Digital ID scheme pushes forward despite nearly three million citizens demanding it stop. The Grok scandal simply provides the justification for using powers that were always going to be used regardless.
When the government tells you mass surveillance is necessary to protect children, remember this: every authoritarian regime makes identical claims. And when you cannot communicate privately, when your health data flows to intelligence contractors, when your movements and status are perpetually tracked, you cannot organise collectively. That is not coincidence. That is the point. But I wouldn’t worry too much about that: protest is also banned.
Your phone should not be a government informant.
“The future is not set. There is no fate but what we make for ourselves.”
Sarah Connor – Terminator
The choice to resist that future begins now.