Facebook reveals elections ‘war room’ at California headquarters
Facebook has created a physical “war room” at its California headquarters as part of its efforts to tackle election interference tactics.
Earlier this week the social network introduced new rules around political advertising in the UK, requiring anyone who wants to post political ads to verify their identity and location, and added the UK to its library of political adverts, which already includes the US and Brazil.
Now the social network has revealed that in September, in the run-up to elections in both Brazil and the US, it created a physical room at its Menlo Park campus where, it says, its election interference prevention experts can work together.
Director of product management for civic engagement Samidh Chakrabarti said: “The war room has over two dozen experts from across the company – including from our threat intelligence, data science, software engineering, research, community operations and legal teams.
“These employees represent and are supported by the more than 20,000 people working on safety and security across Facebook. When everyone is in the same place, the teams can make decisions more quickly, reacting immediately to any threats identified by our systems, which can reduce the spread of potentially harmful content.”
Facebook said the space could be used to monitor activity around election issues in real time, including efforts to prevent people from voting and increases in spam content.
Mr Chakrabarti said the teams involved had also prepared responses to different techniques that could be used by those seeking to use the platform to influence voters or mislead them.
“These preparations helped a lot during the first round of Brazil’s presidential elections,” he said.
“For example, our technology detected a false post claiming that Brazil’s Election Day had been moved from October 7 to October 8 due to national protests. While untrue, that message began to go viral.
“We quickly detected the problem, determined that the post violated our policies, and removed it in under an hour. And within two hours, we’d removed other versions of the same fake news post.”
Facebook has been repeatedly accused of failing to adequately prevent previous efforts to meddle in elections, something the company itself has also admitted.
Mr Chakrabarti said the company would continue to invest in security in an attempt to stay ahead of “adversaries”.
“The work we are doing in the war room builds on almost two years of hard work and significant investments, in both people and technology, to improve security on Facebook, including during elections,” he said.
“Our machine learning and artificial intelligence technology is now able to block or disable fake accounts more effectively – the root cause of so many issues.
“We’ve increased transparency and accountability in our advertising. And we continue to make progress in fighting false news and misinformation.
“That said, security remains an arms race and staying ahead of these adversaries will take continued improvement over time. We’re committed to the challenge.”