Overcoming public sector patch paralysis and risk culture

The global ransomware attack has been followed by an outcry about the NHS’ reliance on legacy systems. Harry Metcalfe and Lee Maguire of dxw say you can’t properly diagnose the problem – or understand an organisation’s culture – from the outside, but call for technicians to be empowered to act quickly when security risks are identified.

Organisations should respond to security risks the way they respond to cyber attacks – Photo credit: Flickr, FabianOrtiz, CC BY 2.0

The recent ransomware attack that impacted NHS computer systems has brought into focus how much the public sector relies on outdated and poorly maintained systems.

At our company, dxw, we focus primarily on delivering web technologies. We follow the Service Manual for guidance on the minimum supported browser – but exceptions often still need to be made to accommodate older software in use by public sector organisations.

This has knock-on effects on security elsewhere.

For example, services can’t use up-to-date cryptography while they still need to support Windows XP users (regardless of the web browser they use), which in practice means that any upgrade that would drop support for the obsolete RC4 encryption cipher is put on hold.
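
As a rough illustration, a short Python sketch like the one below (the hostname is a placeholder) reports which cipher suite a service negotiates with an up-to-date client – a useful starting point when auditing a service’s TLS configuration before deciding what can safely be dropped.

```python
import socket
import ssl


def negotiated_cipher(host: str, port: int = 443):
    """Open a TLS connection and report the cipher suite the server negotiates."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            # Returns (cipher name, protocol version, secret bits), e.g.
            # ('TLS_AES_256_GCM_SHA384', 'TLSv1.3', 256)
            return tls.cipher()


if __name__ == "__main__":
    # Placeholder hostname - substitute the service you want to audit.
    print(negotiated_cipher("example.com"))
```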


It’s easy to imagine an entire chain of upgrades and updates delayed until the old PCs get taken out of service.

And while much has been said about the use of unsupported operating systems, it’s the ageing-yet-supported systems that have been left unpatched that are more worrying.

You can’t really diagnose the causes of IT failures, such as a ransomware infection, from outside an organisation.

It may appear to have been caused by the failure to apply a simple fix, perhaps with an identifiable culprit. But, from the outside, you can’t see the processes, the priorities, the roadblocks, the contracts, the approvals, or the available time.

You can see the job titles, but you can’t see the culture in which they work.

Our approach for high-profile security updates is to apply them quickly, test them in-place, and have the resources available to revert if necessary. Our contracts are structured so that when high-impact security issues emerge, we can apply patches immediately, without additional sign-offs.
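
A minimal sketch of that pattern, assuming a Debian-style host and a hypothetical health-check endpoint (both are illustrative, not a description of any particular organisation’s tooling): apply the update straight away, test in place, and keep the means to revert close to hand.

```python
import subprocess
import urllib.request

HEALTH_URL = "https://service.example/healthcheck"  # hypothetical monitoring endpoint


def healthy() -> bool:
    """Crude in-place test: does the service still answer after the change?"""
    try:
        with urllib.request.urlopen(HEALTH_URL, timeout=5) as response:
            return response.status == 200
    except OSError:  # covers URLError, HTTPError and socket timeouts
        return False


def apply_security_update(package: str, new_version: str, old_version: str) -> None:
    """Apply a patch immediately, then revert if the in-place test fails."""
    subprocess.run(
        ["apt-get", "install", "-y", f"{package}={new_version}"], check=True
    )
    if not healthy():
        # Reverting is a planned-for outcome, not an emergency.
        subprocess.run(
            ["apt-get", "install", "-y", "--allow-downgrades",
             f"{package}={old_version}"],
            check=True,
        )
```

In practice the in-place test and the rollback path would be richer than this, but the point is that the revert is planned in advance rather than improvised under pressure.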

Treating a patch with the same immediacy and urgency as an attack brings different risks. But these risks are much easier to manage and plan for than a chaotic set of failures with an unknown cause. Ironically, the very change control processes designed to reduce risks often worsen them.

The impression we sometimes get from public sector organisations is that change control and a culture of risk aversion prevent people from proactively addressing risks at all.

Not many public sector computers are attached to life-support machines or control nuclear submarines, but many organisations have cultures and processes that oblige people to treat them as if they were.

The reasons I hear for inaction are understandable. Security updates haven’t been applied, they explain, because they have many competing demands and they can’t risk introducing disruption every time some theoretical new risk is inconveniently announced.

The tragedy is that there’s a kernel of truth there.

Not every demand that “security” makes can be practically accommodated. But succumbing to this philosophy is dangerous: everything will work fine, sometimes for years, right up to the point at which lots of things break at once. Recent events are a case in point.

There are, I’m told, public sector organisations that manage cost and disruption by only applying updates on a quarterly basis – an approach that would cause most information security people to tear out their (remaining) hair.

This paralysis can be overcome by treating the discovery of a new risk, rather than the attack that follows it, as the unexpected event that demands an immediate response.

Agree in advance on the approach to security updates, especially those that occur outside of a predictable cycle, so that whoever has the technical means to act is empowered to do so.
