Published: 2:27 p.m., Jan. 23, 2018 | Updated: 1:26 a.m., Jan. 24, 2018
WASHINGTON — Facebook acknowledged on Monday that the explosion of social media posed a potential threat to democracy, pledging to tackle the problem head-on and turn its powerful platform into a force for “good.”
The comments from the world’s biggest social network were its latest response to intense criticism for failing to stop the spread of fake news among its 2 billion users, most strikingly in the run-up to the 2016 US presidential election.
In a blog post, Facebook civic engagement chief Samidh Chakrabarti said he was “not blind to the damage that the internet can do to even a well-functioning democracy.”
“In 2016, we at Facebook were far too slow to recognize how bad actors were abusing our platform,” Chakrabarti said. “We’re working diligently to neutralize these risks now,” he added.
The blog post, one in a series dubbed “Hard Questions,” was part of a high-profile push by Facebook to reboot its image, including the announcement last week that it would let users “rank” the trustworthiness of news sources to help stem the flow of false news.
For democratic good
“We’re as determined as ever to fight the negative influences and ensure that our platform is unquestionably a source for democratic good,” Katie Harbath, Facebook’s head of global politics and government outreach, said in an accompanying statement.
Facebook, along with Google and Twitter, faces global scrutiny for facilitating the spread of fake news, some of which had been directed by Russia ahead of the US presidential election, Brexit vote and other electoral battles.
The social network has concluded that Russian actors created 80,000 posts that reached around 126 million people in the United States over a two-year period.
“It’s abhorrent to us that a nation-state used our platform to wage a cyberwar intended to divide society,” Chakrabarti said.
“This was a new kind of threat that we couldn’t easily predict, but we should have done better. Now we’re making up for lost time,” he added.
Need to tread carefully
Chakrabarti pointed to Facebook’s pledge last year to identify the backers of political advertisements, while also stressing the need to tread carefully, citing the example of rights activists who could be endangered if they were publicly identified on social media.
He also elaborated on the decision to let Facebook users rank the “trustworthiness” of news sources, saying: “We don’t want to be the arbiters of truth, nor do we imagine this is a role the world would want for us.”
While acknowledging concerns over the rise of “echo chambers,” Chakrabarti argued that “the best deterrent will ultimately be a discerning public.”
Mixed response
Facebook’s plan to rank news organizations based on user “trust” surveys has drawn a mixed response.
Renee DiResta of the nonprofit Data for Democracy was optimistic.
“This is great news and a long time coming. Google has been ranking for quality for a long time, it’s a bit baffling how long it took for social networks to get there,” DiResta wrote on Twitter.
‘Wikiality’
But technology columnist Shelly Palmer warned that Facebook appeared to be equating trust and truth with whatever the public believed, a notion some have called “wikiality.”
“Wikiality is Facebook’s answer to fake news, alternative facts, and truthiness,” Palmer wrote.
“Facebook, the social media giant, is going to let you rank the news you think is most valuable. What could possibly go wrong?” he added.
‘More interesting than truth’
For media writer Matthew Ingram, the changes “not only won’t fix the problem of fake news, but could actually make it worse instead of better.”
“Why? Because misinformation is almost always more interesting than the truth,” Ingram wrote in the Columbia Journalism Review.