'Inoculate' social media users against fake news, MPs told

The best way to fight so-called fake news is by "inoculating" social media users against online propaganda, MPs have been told.

Regularly showing social media users how to spot hoaxes, and alerting them to the sources of online lies, is far more effective than trying to correct misinformation after it has been read, according to academic experts who study the issue.

"The most important problem we're facing is that once people take in information, it's extremely difficult to undo that," said Professor Stephan Lewandowsky, an expert in cognitive psychology at the University of Bristol.

But giving people an "inoculation, like a vaccine" by showing them how lies spread online can be effective, said Professor Lewandowsky, giving evidence to the parliamentary inquiry into fake news.

"If we can get to people before the misinformation does then there's evidence to show they will be able to filter it out better," he added.

In October, Facebook released figures showing that Russian operatives published 80,000 posts over a two-year period in an attempt to influence the 2016 US election, reaching 126 million Americans in the process.

And last week Twitter said nearly 700,000 users had been "exposed" to Russian propaganda around the US election sent by more than 50,000 accounts.

Executives from Facebook, Twitter and Google are all due to give evidence to the parliamentary fake news inquiry in February about Russian meddling on their platforms during the Brexit vote.

Last year's French election was also targeted by online propaganda attempting to sway the vote away from the centrist Emmanuel Macron and towards the far-right Marine Le Pen.

But France was better prepared, said Professor Lewandowsky, because it "knew what was coming" after the lessons of the US election and the EU referendum in 2016.

"But we also need to change the whole ecosystem of online information to make it harder for misinformation to spread," he said.

Facebook promotes news and information that supports people's existing beliefs, said Professor Vian Bakir, from Bangor University, and people are less likely to question ideas they already hold, whether those ideas are true or not.

And Dr Caroline Tagg, from the Open University, said that, on Facebook in particular, people tend to ignore or block out information which they do not agree with, rather than correct it.

"This increases the filter bubble effect which increases the polarisation of views which enables fake news to be circulated more easily," said Dr Tagg.

While there is "no silver bullet", Professor Bakir said online information and misinformation should be addressed "at all levels" of education and that political leaders should take responsibility for any lies they spread.

MPs will hear from executives at Facebook, Twitter and Google in Washington on February 8.