Facebook post deleted? Appeals may soon be easier

Facebook officials explain community standards during a round-table discussion with media held in Singapore on November 13, 2018. INQUIRER.NET / MAILA AGER

Facebook has almost completed the rollout of a mechanism that will give its users greater leeway to appeal content that has been deleted by mistake.

Last year, Facebook users could only appeal the deletion of their profiles and pages.

But the company later extended this right to people whose posts were taken down for hate speech, nudity, sexual activity, and graphic violence violations.

“Since then, we have expanded this further to cover content that was removed for bullying and harassment,” Facebook said in a statement Wednesday.

“By the end of this year, we will have completed our appeals roll-out, and will be offering appeals for content removed for all policy violations.”

“In the first quarter of 2019, we will start rolling out the ability for people to appeal a decision when they reported content and were told it did not violate our policies,” it added.

Monika Bickert, vice president of Facebook’s global policy management, said the company had been working to further expand the scope of the appeals process to include content deleted for other policy violations like terror propaganda, hate speech, and acts of violence.

“We’re still in the process of rolling that out because as you might imagine this is a huge undertaking,” Bickert said during a recent round-table discussion with the media held in Singapore.

“We will continue to build out our appeals offerings, and plan to extend appeals to people who report content and are told it does not violate our community standards,” she added.

But how do Facebook users appeal when their posts have been deleted?

Bickert explained that if a post, photo or video had been removed for violating community standards, a Facebook user would be given the option to “Request Review.”

The appeals, she said, would then be reviewed by the Community Operations team within 24 hours.

“If we made a mistake, the content will be restored and we will notify the person who requested the appeal,” Bickert said.

For certain types of content violations like terror propaganda and child sexual abuse imagery, she said, technology has been effective in detecting “known bad images” and stopping them even before they are posted on the platform.

The same could not be said, however, of other violations like bullying and harassment.

“That’s so contextual,” Bickert said, noting that friends may just be joking with each other on Facebook, which would not be considered bullying.

“It’s very hard for us to know when something is truly bullying,” she said.

Another “very contextual” violation that is harder to detect on Facebook is hate speech.

“It’s a lot harder than removing child sexual abuse imagery in terms of using technology to do it. But the engineering team is making a lot of progress,” Bickert said.

Despite the difficulties, she said, Facebook is “getting better” at removing content deemed to have violated its policies.

In its latest Community Standards Enforcement Report, Facebook noted a significant increase in the amount of hate speech and violent and graphic content that it detected even before it was reported by users.

For hate speech, the social network said its proactive detection rate has more than doubled, from 24% in the last quarter of 2017 to 52% in the third quarter of this year.

The proactive detection rate for violence and graphic content, on the other hand, increased by 25 percentage points — from 72% in the last quarter of 2017 to 97% in the third quarter of this year.

The report also noted the more than 1.5 billion fake accounts that had been deleted from April to September this year.

READ: Facebook takes down over 1.5B fake accounts in six months
