‘We cannot accept this as the new normal’ – European lawmakers urge action on disinformation
European Commission report finds that ‘large-scale automated propaganda’ is a persistent threat
Social media giants have improved transparency on their platforms over the last year, but the scope of action varies considerably between different companies, a European Commission report has found.
The EC found there is a closer dialogue with platforms regarding their policies against disinformation, but “differences in implementation of platform policy, cooperation with stakeholders and sensitivity to electoral contexts persist across member states”.
The findings come as Facebook, Google, Microsoft, Mozilla, Twitter and seven European trade associations published self-assessment reports on the progress made over the past year in the fight against online disinformation.
The reports provide information on the policies implementing the code, though the EC said their consistency and level of detail vary.
In a June report, the EC found that while the European elections in May were “clearly not free from disinformation”, the actions taken by the EU, together with journalists, fact-checkers, platforms, national authorities, researchers and civil society, helped “narrow down the space for foreign interference as well as coordinated campaigns to manipulate public opinion”.
In a joint statement, commissioner for justice, consumers and gender equality Věra Jourová, commissioner for the security union Julian King, and commissioner for the digital economy and society Mariya Gabriel warned “large-scale automated propaganda and disinformation persist and there is more work to be done under all areas of the Code”, adding, “we cannot accept this as a new normal”.
They said: "In particular, we commend the commitment of the online platforms to become more transparent about their policies and to establish closer cooperation with researchers, fact-checkers and Member States. However, progress varies a lot between signatories and the reports provide little insight on the actual impact of the self-regulatory measures taken over the past year as well as mechanisms for independent scrutiny.”
They added: “While the efforts of online platforms and fact-checkers can reduce harmful virality through platforms' services, there is still an urgent need for online platforms to establish a meaningful cooperation with a wider range of trusted and independent organisations. Access to data provided so far still does not correspond to the needs of independent researchers.
“Finally, despite the important commitments made by all signatories, we regret that no additional platforms or corporate actors from the advertising sector have subscribed to the code.”
The EC will publish its comprehensive assessment of action against disinformation in early 2020.
It warned that if the results under the code prove unsatisfactory, the EC may propose further measures, including the possibility of greater regulation.
Catherine Stihler, chief executive of the Open Knowledge Foundation, said: “Facebook, Google and Twitter must act on the growing demands for greater transparency.
“The social media giants have been at the centre of a series of rows about disinformation, particularly in connection with the Brexit referendum, and that simply cannot be allowed to happen once again in the run-up to December’s UK General Election.
“Urgent action is required, and if the platforms don’t act, then they need to be forced to.
“The institutions of the EU must use their influence to require online platforms to provide more detailed information allowing the identification of malign actors, put pressure on Facebook, Google and Twitter to increase transparency, and encourage closer working with fact-checkers to prevent the spread of disinformation.
“The best way to tackle disinformation is to make information open, allowing journalists, developers and the research community to carry out analysis of disinformation operations.”