We all have something to hide – and the government must let us
The public sector needs to be careful it does not fall foul of citizens’ growing disquiet about how their data is used and by whom, according to PublicTechnology editor Sam Trendall
I’ll admit it: I have something to hide.
I’m sure, at some point, we’ve all been party to a discussion – perhaps about surveillance, or ID cards, or fingerprint databases, for example – wherein someone has invoked an argument along the lines of: ‘if you don’t have anything to hide, I don’t see why you would have a problem with this’.
Well, OK. You’ve got me there.
I do have something to hide. In fact, I have everything to hide.
My entire life.
In an age when many of us cannot get through breakfast without sharing the experience with the world, the concept of maintaining a ‘private life’ can often seem a little quaint. But, in the letter of the law, at least, it remains a fundamental right.
Article 8 of the European Convention on Human Rights – given effect in UK law by the Human Rights Act 1998 – reads: “Everyone has the right to respect for their private and family life, their home, and their correspondence. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.”
As a social-media user whose status is very happily – and probably permanently – lapsed, there is now no real way for people who have no direct contact with me to get any form of Sam Trendall news.
I doubt it is much missed.
Even before the Cambridge Analytica revelations, years had passed since I last posted anything remotely personal on Facebook or Twitter. And the recent headlines have certainly done nothing to make me want to rekindle my use of social media.
But, of course, even my private communications are conducted digitally, and so are necessarily stored not just on the devices of my friends and family, but somewhere on the servers of technology companies.
Even for those of us who actively seek it, a truly private life can feel hard to achieve in this day and age.
While it is easy enough to stop using – or even delete – our social-media accounts, none of us can log out of the laws of the land.
Two recent stories have shone a light on the tightrope that lawmakers and law-enforcers must walk when using technology to – they would contend – better perform their duties.
Last month, a high-court ruling gave the government six months to amend the Investigatory Powers Act – known to many as the snoopers’ charter – after judges found it to be incompatible with EU law and the European Convention on Human Rights.
And, last week, the facial-recognition-enabled cameras that have been used by police in London and South Wales were also the subject of unwelcome headlines.
The privacy campaign group Big Brother Watch reiterated its belief that the way the technology has been used to date “is likely to be unlawful”, as it published a report claiming that FOI requests showed the software has been 95% inaccurate so far. Perhaps even more significantly, the information commissioner Elizabeth Denham said that facial recognition would be a key focus area for her office in the coming months.
Denham has already written to the Home Office and the National Police Chiefs’ Council to express her concerns about the technology’s use.
“Should my concerns not be addressed, I will consider what legal action is needed to ensure the right protections are in place for the public,” she said.
One counterargument that has often been put forward in response to criticisms and concerns such as those seen recently is that surveillance technologies are used to prevent far, far worse crimes, such as terrorism or child abuse. And that, as such, their use can be justified legally, as well as ethically.
But, for the government and the wider public sector, the legality of how it monitors citizens is really only half the problem.
Public sentiment, which is increasingly turning against anyone that profits from processing our data, is equally important.
Making better use of data in designing policy and delivering services is a major ambition for the government, as well as for many individual organisations such as NHS trusts, police forces, and local authorities. Longer-term goals will include improved data sharing and, in time, effective use of analytics and artificial intelligence.
For this to happen, the public sector needs to ensure that this data is given freely, by a citizenry that is becoming increasingly aware of – and suspicious of – how its information is being used. Even by public-service entities.
The response to the recent headlines, from the Home Office and the police forces of London and South Wales, was rather bullish.
It may be worth remembering that consent has always been a watchword for both the legal and the political systems in this country. The police force was built on the Peelian principles of policing by consent and, like all democracies, ours is founded upon the idea of the consent of the governed.
The deluge of GDPR emails that has swamped all our inboxes recently is a timely reminder: consent can always be revoked.
If it wants to ensure that citizens do not try to opt out of its vision for the data-enabled future, the government needs to maintain the trust of a public that may, increasingly, share my wish to protect its private life.