Social media ‘failure’ on Trump must cause rethink on disinformation, MP says

Social media giants must rethink how they define harmful disinformation, with the violence at the US Capitol proving their efforts so far have been a “failure” and the move to ban Donald Trump was “too little, too late”.

That is according to MP Damian Collins, the former chair of the Digital, Culture, Media and Sport (DCMS) select committee which led an inquiry into social media and disinformation.

Mr Collins said platforms such as Facebook and Twitter had allowed the US president to “whip up” supporters and frequently share false claims with them because they had “for too long believed this is simply a matter of free speech and personal choice”.

Supporters of Mr Trump stormed the US Capitol building on Wednesday in protest at his election loss to Joe Biden, leading to violent clashes and the deaths of five people.

The president was subsequently locked out of his social media accounts temporarily after being accused of inciting the violence, although Facebook has since extended the ban until the end of his presidency on January 20.

Conservative MP Mr Collins, who as chair of the DCMS Committee produced a report on social networks and the rise of disinformation in 2018, said at the time his committee had labelled disinformation “a threat to democracy”.

“Over the last few days we’ve seen that play out, we’ve seen what it could do,” he told the PA news agency.

“We do not yet, I think, know the damage that’s been done with the poison that’s been pumped into the system, the impact that will have on politics in America in the coming years. It’s something that we still don’t fully understand.”

Mr Collins said Mr Trump had been allowed to “incite” interference with the US election process since losing to Joe Biden because much of the content shared by the president and his supporters was not classed by the platforms as an imminent physical threat to others, and therefore not a breach of their rules.

“It’s not just about content moderation, it’s about the amplification of that content and that really is the most important thing,” Mr Collins said.

“It’s the fact that people can use the social media platforms to reach large numbers of people, and the platforms themselves recommend content to people based on what they’re interested in – the platforms are designed to succeed in an attention economy and they’re largely designed to be blind as to what it is people are interested in.

“Now that may be fine if you’re selling music or a pair of shoes, but if you’re selling a political ideology and the platforms are directing people to that content because they think they’re interested in it – that’s where the platforms have a clear responsibility to try and make sure their systems aren’t used to amplify harmful messages.”

He said sites should now recognise that their systems are being used to spread disinformation, and that this represented “both an immediate harm and a long term attack on democracy, on mainstream media” as well as “driving divisions in society”.

To force this change, Mr Collins said the Government’s planned Online Harms regulation, designed to curb social media and tech giants, should take a closer look at its definition of harmful misinformation, making platforms enforce rules which focus on content beyond just that which causes imminent physical danger to an individual.

“I think we need to look wider than that and say OK, are there circumstances where what we’re talking about is not necessarily an immediate physical danger to an individual, but actually something which is potentially more widely damaging to society as a whole, and I would say that someone using social media to incite an insurrection is an example of how that is dangerous,” he said.

However, the Tory MP also warned that the incident highlighted the power social media wielded, and regulation must ensure that decisions such as de-platforming a president or prime minister “aren’t left to people like Mark Zuckerberg”.

“I think there needs to be some sort of legal or regulatory framework around the way those decisions are made,” he said.

“Like other media, there should be some sort of guidelines within which these companies have to operate, and they should be held accountable or liable for the decisions that they make.”