
Facebook launches ‘war room’ to combat manipulation

By AFP
18 October 2018   |   10:20 am

FILE PHOTO: Silhouettes of mobile users are seen next to a screen projection of Facebook logo in this picture illustration taken March 28, 2018. REUTERS/Dado Ruvic/File Photo

In Facebook’s “War Room,” a nondescript space adorned with American and Brazilian flags, a team of 20 people monitors computer screens for signs of suspicious activity.

The freshly launched unit at Facebook’s Menlo Park headquarters in California is the nerve center for the fight against misinformation and manipulation of the largest social network by foreign actors trying to influence elections in the United States and elsewhere.

Inside, the walls have clocks showing the time in various regions of the US and Brazil, maps and TV screens showing CNN, Fox News and Twitter, and other monitors showing graphs of Facebook activity in real time.

Facebook, which has been blamed for doing too little to prevent misinformation efforts by Russia and others in the 2016 US election, now wants the world to know it is taking aggressive steps with initiatives like the war room.

“Our job is to detect … anyone trying to manipulate the public debate,” said Nathaniel Gleicher, a former White House cybersecurity policy director for the National Security Council who is now heading Facebook’s cybersecurity policy.

“We work to find and remove these actors.”

Facebook has been racing to get measures in place and began operating this nerve center — with a hastily taped “WAR ROOM” sign on the glass door — for the first round of the presidential vote in Brazil on October 7.

It didn’t take long to find false information and rumors spreading that could have influenced voters in Brazil.

“On election day, we saw a spike in voter suppression (messages) saying the election was delayed due to protests. That was not a true story,” said Samidh Chakrabarti, Facebook’s head of civic engagement.

Chakrabarti said Facebook was able to remove these posts in a couple of hours before they went viral.

“It could have taken days.”

– Humans and machines –
At the unveiling of the war room for a small group of journalists including AFP this week, a man in a gray pork pie hat kept his eyes glued to his screen where a Brazilian flag was attached.

He said nothing but his mission was obvious — watching for any hints of interference with the second round of voting in Brazil on October 28.

The war room, which will ramp up activity for the November 6 midterm US elections, is the most concrete sign of Facebook’s efforts to weed out misinformation.

Staffed with specialists in computer science, cybersecurity and law, the center is operating during peak times for the US and Brazil at present, with plans to eventually work 24/7.

The war room adds a human dimension to the artificial intelligence tools Facebook has already deployed to detect inauthentic or manipulative activity.

“Humans can adapt quickly to new threats,” Gleicher said of the latest effort.

Chakrabarti said the new center is an important part of coordinating activity — even for a company that has been built on remote communications among people in various parts of the world.

“There’s no substitute to face to face interactions,” he said.

The war room was activated just weeks ahead of the US vote, amid persistent fears of manipulation by Russia and other state entities, or efforts to polarize or inflame tensions.

The war room is part of stepped-up security measures announced by Facebook, which will add some 20,000 employees.

“With elections we need people to detect and remove (false information) as quickly as possible,” Chakrabarti said.

The human and computerized efforts to weed out bad information complement each other, according to Chakrabarti.

“If an anomaly is detected in an automated way, then a data scientist will investigate, will see if there is really a problem,” he said.

The efforts are also coordinated with Facebook’s fact-checking partners around the world including media organizations such as AFP and university experts.

Gleicher said the team will remain on high alert for any effort that could lead to false information going viral and potentially impacting the result of an election.

“We need to stay ahead of bad actors,” he said. “We keep shrinking the doorway. They keep trying to get in.”
