‘We cannot accept this as the new normal’ – European lawmakers urge action on disinformation

Written by Liam Kirkaldy on 5 November 2019 in News

European Commission report finds that ‘large-scale automated propaganda’ is a persistent threat


Social media giants have improved transparency on their platforms over the last year, but the scope of action varies considerably between different companies, a European Commission report has found.

The EC found there is a closer dialogue with platforms regarding their policies against disinformation, but “differences in implementation of platform policy, cooperation with stakeholders and sensitivity to electoral contexts persist across member states”.

The findings come as Facebook, Google, Microsoft, Mozilla, Twitter and seven European trade associations published self-assessment reports on the progress made over the past year in the fight against online disinformation.

The reports provide information on the policies implementing the code, though the EC noted that their consistency and level of detail vary.

In a June report, the EC found that while the European elections in May were “clearly not free from disinformation” the actions taken by the EU, together with journalists, fact-checkers, platforms, national authorities, researchers and civil society, helped “narrow down the space for foreign interference as well as coordinated campaigns to manipulate public opinion”.

In a joint statement, commissioner for justice, consumers and gender equality Věra Jourová, commissioner for the security union Julian King, and commissioner for the digital economy and society Mariya Gabriel warned “large-scale automated propaganda and disinformation persist and there is more work to be done under all areas of the Code”, adding, “we cannot accept this as a new normal”.

They said: "In particular, we commend the commitment of the online platforms to become more transparent about their policies and to establish closer cooperation with researchers, fact-checkers and Member States. However, progress varies a lot between signatories and the reports provide little insight on the actual impact of the self-regulatory measures taken over the past year as well as mechanisms for independent scrutiny.”

They added: “While the efforts of online platforms and fact-checkers can reduce harmful virality through platforms' services, there is still an urgent need for online platforms to establish a meaningful cooperation with a wider range of trusted and independent organisations. Access to data provided so far still does not correspond to the needs of independent researchers. 

“Finally, despite the important commitments made by all signatories, we regret that no additional platforms or corporate actors from the advertising sector have subscribed to the code.”

The EC will publish its comprehensive assessment of action against disinformation in early 2020.

It warned that if the results under the code prove unsatisfactory, the EC may propose further measures, including the possibility of greater regulation.

Catherine Stihler, chief executive of the Open Knowledge Foundation, said: “Facebook, Google and Twitter must act on the growing demands for greater transparency.

“The social media giants have been at the centre of a series of rows about disinformation, particularly in connection with the Brexit referendum, and that simply cannot be allowed to happen once again in the run-up to December’s UK General Election.

“Urgent action is required, and if the platforms don’t act, then they need to be forced to.

“The institutions of the EU must use their influence to require online platforms to provide more detailed information allowing the identification of malign actors, put pressure on Facebook, Google and Twitter to increase transparency, and encourage closer working with fact-checkers to prevent the spread of disinformation.

“The best way to tackle disinformation is to make information open, allowing journalists, developers and the research community to carry out analysis of disinformation operations.”

 

About the author

Liam Kirkaldy is online editor at PublicTechnology sister publication Holyrood, where this story first appeared. He tweets as @HolyroodLiam.
