Government urged to bring in emergency legislation to combat vaccine disinformation

Shadow minister and campaign group call for new laws

Credit: Ulrike Leone/Pixabay

An emergency law to tackle disinformation about Covid-19 vaccines on social media needs to be introduced by Boris Johnson this week if he wants to make a success of the rollout, Labour has said.

As the country prepares for a mass Covid-19 vaccination programme, shadow culture and media secretary Jo Stevens is calling on the government to introduce a bill in the Commons that would legally require social media companies to remove false information within a fixed time period and, if material remains online, apply sanctions to bosses at platforms including Facebook and Twitter.

The emergency bill is needed because the government’s own Online Harms Bill – which covers some of the same ground – might not pass through Parliament until spring 2021 as it is still at the white paper stage.

“This is about preventing social media platforms from facilitating the spread of anti-vaxx information. This is being done at an industrial scale by groups and bad actors,” she said.

The government should put a statutory duty on media firms to remove material in a fixed period of time, according to the shadow minister.


“If there are repeated and aggravated breaches of that then there should be personal liability on senior executives on those platforms,” Stevens said. “[The government] could bring a bill forward. It needs to be very simple. Although government has talked about online harms for a long time, even if we get their response to the white paper shortly… we know it’s unlikely that legislation will have Royal Assent until after the vaccine rollout has happened.”

An advertising campaign on billboards, notice boards and bus stops should also have been rolled out in anticipation of the vaccinations, she suggested, to encourage eligible members of the public to get the jab.

Johnson said at Prime Minister’s Questions last week that he will be publishing a paper “very shortly” on online harms, designed to tackle disinformation such as anti-vaccination material.

It is understood members of the government’s Counter Disinformation Unit, set up at the beginning of the Covid crisis, met online last week.

Imran Ahmed, founder of the Center for Countering Digital Hate, said legislation was needed as a matter of urgency. The Online Harms Bill is necessary, he said, but the government must devise an immediate solution to the spread of anti-vaxx material.

“The failure of social media companies to tackle malignant behaviour on their platforms, first with Covid misinformation and now anti-vaccine extremism, demonstrates why the Government must bring forward its Online Harms Bill as soon as possible,” Ahmed added. “However, the most urgent problem needs a far more immediate solution. Lies designed to persuade people not to protect themselves and their families are being broadcast to millions of people online every day.”

Social media companies have already put in place some measures to direct users towards reputable information concerning vaccines. 

Facebook has made changes so that anyone searching for ‘anti-vaxx’ is shown a list of reputable global medical organisations that promote vaccination, and the company can also remove videos containing false information. Searching for ‘anti vaccine’ on Twitter brings up a “know the facts” section and a link to the NHS Covid-19 vaccinations page.

The Pfizer/BioNTech vaccine is due to be rolled out this week, and the government has orders in with other companies, such as Moderna and AstraZeneca, which have also developed vaccines.

It emerged recently that health secretary Matt Hancock and digital secretary Oliver Dowden last month met with representatives from social media companies “to reduce the spread of harmful and misleading narratives, particularly around the potential Covid-19 vaccine”.

Discussing the meeting in answering a written parliamentary question, minister for digital and culture Caroline Dinenage said: “Social media platforms agreed to continue to work with public health bodies to ensure that authoritative messages about vaccine safety reach as many people as possible; to commit to swifter responses to flagged content and to commit to the principle that no user or company should directly profit from Covid-19 vaccine misinformation or disinformation.”

 

Sam Trendall
