Blog
Cloud & Datacentre
Ed Yeates
22 September 2025
Public sector organisations are under pressure to move fast on AI. At the same time, the quantum threat is building quietly in the background: encrypted data is being harvested today, ready for decryption when Q-Day arrives. Both are live concerns. Both require urgent attention.
But the thing that makes both manageable (or both catastrophic, if it’s missing) is the same: data security maturity. Knowing what data you hold, where it lives, who has access to it, how it’s protected, and whether you could recover it if the worst happened.
This is the foundation. And in most public sector organisations, it hasn’t kept up with the pace of change.
The assumption that data security is a solved problem (or at least someone else’s problem) is the most dangerous position a public sector IT or digital leader can hold right now.
AI systems need data to function. The data feeding those systems needs to be correctly classified, appropriately governed, and protected from misuse. Without visibility of your data estate, you can’t govern AI responsibly. Without governance, you can’t deploy AI safely. The two are inseparable.
Quantum adds a different dimension. The encryption protecting your most sensitive data today was designed for a world where classical computers would take millions of years to break it. Quantum computers will do it in minutes. Any data that hasn’t been migrated to quantum-safe encryption before Q-Day arrives is already compromised in all but name, and the harvesting of that data is already underway.
Neither threat is theoretical and both require the same starting point: a clear, honest picture of where you are.
Wherever your organisation is on the journey, progress follows the same path. Most public sector organisations are somewhere in the middle: further along than they think in some areas, more exposed than they realise in others.
You can’t secure what you don’t know exists. The first stage is building comprehensive visibility over your data. Knowing where it lives, how it’s structured, what it contains, and who has access to it.
In most public sector environments, data is spread across on-premise systems, cloud platforms, SaaS applications, and legacy infrastructure that predates modern data management. Structured and unstructured data sits in silos, often without consistent classification or labelling. AI tools are accessing and processing data whose provenance isn’t fully understood.
This stage is about closing that visibility gap. It’s not glamorous work, but nothing else is possible without it.
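Even before specialist data-discovery tooling is in place, a first pass at the visibility gap can be automated: sweep a file estate and apply coarse classification rules. The sketch below is illustrative only; the patterns, labels, and approach are assumptions, not a substitute for a proper discovery tool or an organisation-specific taxonomy.

```python
import re
from pathlib import Path

# Illustrative classification rules (pattern -> label). Real deployments
# would use organisation-specific taxonomies and dedicated tooling.
RULES = [
    (re.compile(r"\b\d{3}\s?\d{3}\s?\d{4}\b"), "contains-nhs-number-like-id"),
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "contains-email"),
]

def classify_text(text: str) -> list[str]:
    """Return coarse sensitivity labels for a piece of text."""
    return [label for pattern, label in RULES if pattern.search(text)]

def inventory(root: Path) -> dict[str, list[str]]:
    """Walk a directory tree and label each readable file."""
    report = {}
    for path in root.rglob("*"):
        if path.is_file():
            try:
                labels = classify_text(path.read_text(errors="ignore"))
            except OSError:
                labels = ["unreadable"]
            report[str(path)] = labels or ["unclassified"]
    return report
```

Even a crude sweep like this surfaces the question that matters at this stage: how much of the estate comes back "unclassified"?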
Visibility creates the foundation for compliance. For public sector organisations this means mapping data assets against regulatory and governance requirements — whether that’s DSPT for NHS organisations, NCSC guidance for central government, or the broader information governance frameworks that apply across the public sector.
Compliance isn’t a checkbox exercise. It’s a continuous process of monitoring, auditing, and demonstrating due diligence. Organisations that treat it as a point-in-time activity find themselves exposed when requirements change or when incidents occur and they need to demonstrate what controls were in place.
With visibility and compliance established, the focus shifts to active protection. This means implementing security controls that are proportionate to the risk and sensitivity of the data involved — and making sure those controls extend across the full data lifecycle, not just at the perimeter.
For AI, that means securing the data used in training and inference, monitoring model behaviour, and ensuring that access to AI systems is governed with the same rigour as access to the underlying data. For quantum, it means beginning the assessment of cryptographic exposure and prioritising migration to quantum-safe encryption for the highest-risk data first.
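The cryptographic exposure assessment can start as a simple inventory: map each system to the public-key algorithm it relies on, and flag the ones broken by Shor's algorithm. A minimal sketch, with the system names and estate structure being illustrative assumptions (the algorithm names follow NIST FIPS 203/204 for the post-quantum set):

```python
# Public-key algorithms whose security rests on factoring or discrete
# logarithms are breakable by Shor's algorithm on a large quantum computer.
QUANTUM_VULNERABLE = {"RSA-2048", "RSA-4096", "ECDSA-P256", "ECDH-P256", "DH-2048"}

# NIST-standardised post-quantum algorithms (ML-KEM per FIPS 203,
# ML-DSA per FIPS 204).
QUANTUM_SAFE = {"ML-KEM-768", "ML-DSA-65"}

def assess(estate: dict[str, str]) -> dict[str, str]:
    """Classify each system's algorithm: migrate, ok, or review."""
    result = {}
    for system, algo in estate.items():
        if algo in QUANTUM_VULNERABLE:
            result[system] = "migrate"   # priority for PQC migration
        elif algo in QUANTUM_SAFE:
            result[system] = "ok"
        else:
            result[system] = "review"    # unknown: needs manual investigation
    return result
```

The "review" bucket is usually the revealing one: legacy systems whose cryptography nobody can name are exactly where limited cryptographic visibility bites.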
The organisations that are furthest ahead have stopped treating AI security and quantum readiness as separate workstreams. They’re the same programme.
The threat landscape doesn’t stay still. Future-proofing means building the flexibility to adapt as it changes — crypto-agility for quantum, governance frameworks that can scale as AI deployment grows, and backup and recovery infrastructure that’s tested continuously rather than annually.
This last point is one that consistently gets underestimated. Prevention is necessary but not sufficient. If a breach occurs, a ransomware attack hits, or data is exfiltrated, the question is: how quickly can you recover, and how confident are you that the backup you’re relying on is actually working? For many organisations, the honest answer is “not confident enough”.
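Part of moving from annual to continuous testing can be automated: after each test restore, verify the restored copy byte-for-byte against the source. A minimal sketch, assuming a simple file-based backup (paths and layout are illustrative):

```python
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Checksum a file in chunks so large backups fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(source: Path, restored: Path) -> list[str]:
    """Check that every source file exists in the restored copy with an
    identical checksum. Returns a list of failures (empty means pass)."""
    failures = []
    for src in source.rglob("*"):
        if not src.is_file():
            continue
        copy = restored / src.relative_to(source)
        if not copy.is_file():
            failures.append(f"missing: {copy}")
        elif sha256(src) != sha256(copy):
            failures.append(f"corrupt: {copy}")
    return failures
```

A check like this, run after every scheduled test restore rather than once a year, turns “we think the backup works” into evidence.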
In our experience, the picture is consistent: most organisations have made progress at Stages 1 and 2. They have some visibility of their data estate and they’re meeting basic compliance requirements. The gaps tend to open at Stages 3 and 4.
AI governance is the clearest example. AI tools are in use, often more widely than senior leaders realise, but the accountability structures, access controls, and monitoring processes that should sit around them are lagging behind. The data feeding those tools is often insufficiently classified or governed.
Quantum readiness is at an earlier stage for most organisations, but the gap between where they are and where they need to be by 2030 is larger than it looks. Particularly for organisations with complex legacy infrastructure and limited cryptographic visibility.
Cyber recovery is the area where we most often find a false sense of confidence: annual DR tests, backup infrastructure that hasn’t kept pace with hybrid and cloud environments, no clear definition of what “recovered” means in practice.
The organisations making the most progress share one thing: they’ve been honest about their current state before trying to fix it. That means a structured assessment of current security controls, data visibility, and recovery posture to create a clear baseline from which to build.
That baseline is what makes everything else actionable. It tells you where to prioritise, what the dependencies are, and what good looks like for your organisation in practice.
Celerity works with public sector organisations on data security, AI governance, and cyber resilience, from initial assessment through to managed services across AI and cyber security that keep pace with how the threat landscape is changing. If you’d like to talk through where your organisation stands, we’re happy to have that conversation.