Rachel Neaman of Corsham Institute believes that facing down the challenge of online misinformation needs a long-term and wide-ranging strategy
The fake news phenomenon underlines the pressures posed by an increasingly digital world, with endless volumes of unregulated information now available to all at the touch of a button.
Self-publishing platforms and the social media revolution have brought great openness and opportunity, which is to be celebrated. But they have also brought risk and uncertainty.
The government’s recent announcement of an anti-fake news unit symbolises the seriousness of the threat this trend poses to individuals, companies and wider democracy.
It is no surprise that such an initiative is now a key element of the UK’s national security strategy. The full consequences of fake news are yet to be determined, but at its worst it can sway election results, trigger social unrest and support terrorist activity.
The ability to rapidly create believable articles published on credible channels is now widespread. Even when moderators and regulators move swiftly to delete misinformation, they are often already too late. And, in many cases, they themselves struggle to judge what is genuinely true or false.
Instant, viral online sharing, through text messages and apps, means that inflammatory and misleading content can reach thousands of people globally before it is even reported. Even then, screengrabs, copies and duplicates can be easily uploaded elsewhere.
The public are both confused and concerned by this phenomenon. Recent research from the BBC World Service across 18 countries revealed that 79% of respondents said they worried about what was fake and what was real on the internet. According to Demos, 67% of the public are concerned about fake news, and the 2018 Edelman Trust Barometer shows that less than a quarter of the UK population now trust social media.
Light-hearted fake news stories in the mainstream press, particularly on April Fool’s Day – who can forget the spaghetti tree Panorama programme? – are not new. But it is becoming increasingly difficult to distinguish legitimate news stories from fake ones, and tackling the problem raises complex questions about the limits of regulation and of freedom of information online.
What is clear is that the government cannot be expected to fight this battle alone. A challenge of this magnitude requires a long-term strategy, encompassing technology, public-awareness campaigns, and specialist skills.
That’s why Corsham Institute recently gave evidence to the DCMS Select Committee on Fake News, and why we are establishing a project to take this work further. It is also why we run a programme on Countering Violent Extremism, at the heart of which is a hugely successful podcast bringing together experts, academics and community practitioners to debate the tactics used to spread extremist views online and the approaches that can limit their impact on susceptible individuals.
The prime minister argued in her speech to the World Economic Forum in Davos that tech companies, investors and international partners must make a much more concerted effort to deal with extremist content on their platforms. Publishing platforms also need to make a much more focused effort to screen content, and to provide an appropriate system for members of the public to report fake news.
More initiatives are also needed to help people identify fake news, and to develop more sophisticated critical thinking and digital literacy skills among the general population. The subject should be covered in schools, alongside training for adults about the risks posed by this growing phenomenon.
With a comprehensive and cohesive strategy combining prevention and skills development, the pervasive and negative influence of fake news stories can be reduced.
Doing this requires a collaborative effort from the government, the public, publishing and social media platforms. It will also require constant vigilance, backed by ongoing programmes of education and skills that keep up with the pace of tech change, to preserve the interests and ideals of civil society and democracy.