Government ‘must be wary’ of purely consent-based approach to policies on data use

The Government Digital Service’s director of data has said that, in keeping the framework for data use up to date, government must look beyond the privacy and consent rules of the retail sector.

Governments must make decisions based on collective good, says Paul Maltby – Photo credit: GDS

Paul Maltby, writing in a blogpost following his talk at the Government ICT 2.0 conference last week, said that the “evolved GDS” was focusing on broad service transformation, “not just fixing websites”.

As well as outlining ways in which the service was working with departments to improve their data infrastructure and encouraging more use of data analytics, Maltby addressed questions around the policy frameworks for using that data.

He said it was “essential” that the government had a vision of what such frameworks should look like from a citizen’s point of view.

“At the heart of our vision is a future with radically more visibility and control over transactional services,” Maltby said – noting that people can already see and amend the data behind their driving licences.

This is likely to mean a move away from bulk data transfer between departments towards a “more efficient, consent-based, and privacy-aware way of managing personal data”, he said.

However, he stressed that government should “be careful of simply transplanting perfectly valid arguments about consent and ownership of personal data from retail sectors”.

Government is not a “vending machine”, and it makes decisions on behalf of collective interests – “it’s not always about you as an individual”, he said. He acknowledged that the idea of collective good and collective decisions was not always popular with “voices towards the libertarian end of the political spectrum”, but said that “this is a principle well established in parliament”.

“The government’s view is that we should be wary of a purely consent-based approach that would see individuals able to withdraw data that refers to them for research, for fraud and for crime.

“Or an approach that would step back and allow vulnerable older people to miss out on support for heating bills because they hadn’t proactively ticked the right box,” he said.

Finding the right balance will be part of the discussion about the Digital Economy Bill, which is currently passing through parliament and is due to be considered by a House of Commons Public Bill Committee this month.

Maltby added that, where consent isn’t a suitable protection, the government will need to rely on other safeguards to protect the public interest.

“The Data Protection Act will remain a critical part of that environment, but while in some areas we are removing barriers for data access, elsewhere we will need to consider new protections for how we store, access and use data,” he said.

The data science code of ethics – launched in May this year to help civil servants carry out data science projects properly – will form part of this, he said, but added that this was “the first, not the last, word” and would need to be regularly updated.

“We are seeking to build this into our emerging data science function in government,” he said.

‘Reforms have been surface level’

Maltby also set out the ways in which government was helping departments make better use of data.

He said that, to date, reforms around open data had too often been “surface level” and that government needed to “roll up our sleeves and get stuck in deeper”.

This includes work to reform open data registers, many of which are out of date and rely on free-text input in web forms – leading, for instance, to multiple entries for a single region, which prevents departments from analysing the data properly.

Maltby added that every team across government currently holds duplicated data that isn’t part of their core business – for instance, they will all hold their own lists of local authorities, countries or businesses.

“They know they’re out of date and it makes analysis hard,” he said. “We want them to concentrate on the things that they do best.”

As an example of the work being done to reduce this duplication, Maltby cited work between GDS and the Department for Communities and Local Government to create a definitive list of local authorities in England.

The local authorities register, which Maltby described as “a giant leap forward”, is in alpha, but when it moves into beta it will be available for all organisations to build straight into their databases or software, such as the Food Standards Agency’s 1-5 star ratings for restaurants.
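To illustrate what “building the register straight into software” could mean in practice, here is a minimal sketch of a service looking up a canonical local authority name from a register-style JSON endpoint rather than maintaining its own free-text list. The URL, field names and record key below are illustrative assumptions, not details of the real service.

    # Minimal sketch: resolving a local authority from a shared register
    # instead of keeping a duplicated local list. URL, fields and keys
    # are hypothetical placeholders, not the actual GDS register API.
    import json
    from urllib.request import urlopen

    REGISTER_URL = "https://example.gov.uk/local-authority-register/records.json"  # hypothetical

    def load_register(url=REGISTER_URL):
        """Fetch the register once and index its records by unique key."""
        with urlopen(url) as response:
            records = json.load(response)
        return {record["key"]: record for record in records}

    def canonical_name(register, key):
        """Resolve a register key to the authority's official name, if present."""
        record = register.get(key)
        return record["name"] if record else None

    if __name__ == "__main__":
        register = load_register()
        # A food-hygiene ratings system, say, could store only the register key
        # and look up the display name when rendering a rating.
        print(canonical_name(register, "BIR"))  # hypothetical key for Birmingham

The point of the pattern is that downstream systems hold only a reference to the register entry, so a change to an authority’s name is made once, by the register’s custodian, rather than in every department’s copy.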

Another issue that GDS has to deal with is getting the data right at the source, Maltby said.

“Perhaps counter-intuitively in an age of big data in our team we often talk about creating minimum viable datasets,” he said. “This is important if other services are to trust this core reference data, and end the practice of duplicating data within our silos.”

To address this, there is a named custodian responsible for each register’s upkeep within the relevant department, and this person can only be held accountable for the data they “mint”, Maltby said.

The director of data also emphasised the work that GDS is doing to encourage the use of data in government, including its accelerator scheme to boost staff’s skills and offering meet-ups for the 350-plus data scientists in Whitehall.

“Yes, we build things with data in GDS, but our real aim is to create an understanding and demand for data scientists, and data science techniques, across government,” Maltby said.

Rebecca Hill
