GDPR: Five things we will only discover after 25 May

However well versed the public sector is in the text of the incoming regulation, some questions will not be answered until the law is a living, breathing – and enforcing – entity. With two months until GDPR becomes law, PublicTechnology examines the biggest known unknowns

With the long-awaited General Data Protection Regulation due to come into force in just two months, public-sector organisations should have already found answers to all their most pressing questions about what the incoming law will mean for them.

Data-protection officers will have been appointed, data-protection impact assessments (DPIAs) should have been carried out, and staff ought to be up to speed on any new procedures and policies.

But it is impossible to know how exactly a law will work in practice until it is taken from the statute books and let loose in the real world.

Some questions about GDPR will, necessarily, remain unanswered until 25 May, and the days, weeks, months, and years thereafter.

Here are five of the biggest issues that we do not – and cannot – know more about until the talking ceases, the dust settles, and GDPR becomes law.

 

  • How will the right to erasure relate to archives?

GDPR introduces a number of new rights for individuals, most notably the right to have their data erased, if they so wish, providing the data controller has no grounds on which to keep them. This part of the law has become known as the ‘right to be forgotten’.

Forgetting someone from a backup environment should not prove too onerous a task. This is live, operational data that will be regularly accessed and updated.

But archived data is a rather different matter. 

Archives – which may, for many public-sector organisations, take the form of tapes stored by an external provider somewhere off-site – are designed to provide a home to data that may not need to be accessed for years at a time, but has typically been kept for legal or regulatory purposes. Cleansing this environment of discrete items of data will likely prove a much trickier task, one that might not be feasible at the speed the individuals concerned – and the law – may demand.

How the ideals of the law will square with the realities of organisations’ technological restrictions will only become apparent when the authorities begin to enforce GDPR in two months’ time.

 

  • How severe will the fines be?

The risks of failing to comply with GDPR have been most often illustrated by reminding organisations of the potential size of the financial penalties. The headline figures – fines of up to €20m or 4% of global turnover, whichever is the greater amount – represent a massive increase on the £500,000 maximum penalty for breaching the existing Data Protection Act.

But just because the Information Commissioner’s Office can mete out such severe punishment does not mean that it will. And it is not yet clear how the seriousness of breaches will be calculated, and how the resultant penalties will be extrapolated from that.

All eyes will be on the first high-profile casualty, and whether that organisation is made an example of and fined heavily, or shown comparative lenience.

 

  • How will the public sector deal with compliance issues around special-category data?

The ICO advises public-sector organisations not to rely on consent as a legal basis for processing people’s data, but instead to demonstrate that the processing is necessary to perform a task in the public interest, or to exercise a legally granted authority.

This legal basis is likely to cover the vast majority of the public sector’s data processing.

But the rules are a little different for information that is classed as special-category data. This includes information pertaining to factors such as an individual’s ethnicity, sexual orientation, religion, political beliefs and, perhaps most pertinently, their biometric details.

In addition to establishing one of the six options for the legal basis of the processing of personal data, organisations dealing with special-category data must also meet one of nine additional conditions. One of these appears to cover healthcare providers, while another offers legal protection to those performing duties of “substantial public interest” – so long as data-protection measures and other safeguards are in place.

Central government already has two major programmes for biometric data – one for law enforcement-related information, and another for immigration. In the coming years, more and more organisations will use biometric data-collection technologies. Anyone collecting, sharing, or using this data will need to work out how to ensure they are doing so legally.
 
 

  • Will suppliers attempt to raise prices?

Although data controllers retain absolute responsibility, GDPR puts more onus on data processors – typically external suppliers of some form of storage, or perhaps analysis tools – to ensure they are compliant. 

This additional culpability needs to be codified in the contracts that exist between public-sector bodies and their external providers of backup, archiving, and other services. The need for suppliers to implement additional data-protection measures called for in GDPR must also be committed to paper.

Consequently, many existing contracts will need to be adapted to become GDPR-compliant. Crown Commercial Service has clearly advised public-sector organisations that they should not stand for suppliers attempting to use this as a means to raise prices. Companies, the CCS argument runs, are legally compelled to comply with GDPR in any case, and should not charge more simply for obeying the law.

But suppliers could easily argue that accepting additional liability and providing new services comes at a cost. 

Even if, as seems likely, most existing contracts are updated at no extra charge, the impact of GDPR could eat into suppliers’ margins and, ultimately, drive up prices in the longer term.

 

  • What does a good DPIA look like?

Organisations must perform a data-protection impact assessment for any data-processing activity that is “likely to result in a high risk to individuals’ interests”.

According to the ICO, this assessment must do four things: “describe the nature, scope, context, and purposes of the processing; assess necessity, proportionality, and compliance measures; identify and assess risks to individuals; and identify any additional measures to mitigate those risks”.

But there is no absolute standard for risk, nor proportionality – such things are in the eye of the beholder. We will only really know what the terms above mean once that beholder is the ICO, which, in all likelihood, will be acting in the aftermath of a serious compliance breach.

The results of its assessments of risk and culpability will, hopefully, help instruct organisations in how their DPIA procedures should work. And, perhaps more importantly, how they should not.
 

Stay tuned to PublicTechnology for more GDPR news and analysis over the coming months. To find out more about what the regulation means for your organisation, visit the ICO website.
 

Sam Trendall
