DCMS committee claims it has ‘pulled back the curtain on the secretive world of the tech giants’
The Digital, Culture, Media and Sport Committee has recommended a range of legislative and technical measures to tackle the “crisis in our democracy” being caused by fake news.
The committee’s inquiry into online disinformation is ongoing, but an interim report published yesterday contains an array of conclusions and recommendations. The committee has assessed the impact of fake news to date, as well as the roles government, businesses, and citizens can play in combatting the harm it causes.
The key findings and suggestions of the report include:
- The government should avoid using the term ‘fake news’, as it is poorly defined. It should, instead, agree on a definition for the words ‘disinformation’ and ‘misinformation’ and use these in its place.
- The government should work with experts to create a “verification” system and set of annotations that allow people to instantly ascertain a site’s credibility.
- Legislation used by Ofcom to regulate the content put out by radio and television broadcasters should be taken as the basis for creating similar rules for online content.
- Facebook and other online platforms need to take greater responsibility for how their platforms are used, and by whom. During MPs’ evidence-gathering, the committee claimed that “Facebook did not accept their responsibilities to identify or prevent illegal election campaign activity from overseas jurisdictions”.
- MPs on the committee support the recommendation of the Electoral Commission that all online campaigning should have an embedded digital imprint providing provenance and sponsorship details.
- The Electoral Commission should also be given the power to levy fines greater than the current maximum penalty of £20,000.
- The law should reflect that technology companies do not always fall neatly into one of two definitions – ‘platform’ or ‘publisher’ – and a new category should be created.
- There ought to be a “clear legal liability” that compels technology firms to combat illegal or harmful content posted on their sites.
- The data security measures and algorithms of internet firms should be subject to audit in much the same way as company finances.
- The UK government could better collaborate with its counterparts in the US by establishing a “digital Atlantic Charter” covering citizens’ digital rights.
- Technology companies should work with governments and global industry organisations to create and implement a worldwide Code of Ethics. This code should “set down in writing what is and what is not acceptable by users on social media, with possible liabilities for companies and for individuals working for those companies, including those technical engineers involved in creating the software for the companies”.
- A levy placed by the government on social-media companies should be used to help finance digital literacy programmes. The committee said that “digital literacy should be the fourth pillar of education, alongside reading, writing and maths”, and that a national framework for digital skills should be developed by charities and non-governmental organisations.
- Central government departments and regulators should work together to deliver an ongoing “public-awareness initiative” to help inform citizens of the rights and laws applicable to their data.
Yesterday’s publication is an interim report; a final report – featuring “further conclusions based on the interrogation of data and other evidence” – is due to be released later this year.
Committee chair Damian Collins MP said: “We are facing nothing less than a crisis in our democracy – based on the systematic manipulation of data to support the relentless targeting of citizens, without their consent, by campaigns of disinformation and messages of hate. In this inquiry we have pulled back the curtain on the secretive world of the tech giants, which have acted irresponsibly with the vast quantities of data they collect from their users. Despite concerns being raised, companies like Facebook made it easy for developers to scrape user data and to deploy it in other campaigns without their knowledge or consent.”
Collins added: “Throughout our inquiry these companies have tried to frustrate scrutiny and obfuscated in their answers. The light of transparency must be allowed to shine on their operations and they must be made responsible, and liable, for the way in which harmful and misleading content is shared on their sites.”