Facebook's new policy on fake content and misinformation will see the largest social media platform removing content that can instigate violence.
According to Facebook, content that could incite physical violence will be taken down. The move comes after months of criticism from groups who say the platform has been used to target ethnic minorities in countries like Myanmar and Sri Lanka.
Facebook product manager Tessa Lyons said:
“We have identified that there is a type of misinformation that is shared in certain countries that can incite underlying tensions and lead to physical harm offline.”
“We have a broader responsibility to not just reduce that type of content but remove it,” Tessa was quoted by New York Times as saying.
Facebook has been in the spotlight for what some social media analysts call negligence over its inability to monitor content and apparent hate speech on the platform.
Most recently, Facebook CEO Mark Zuckerberg caused controversy after he suggested in an interview that the company wouldn’t remove posts by Holocaust deniers because they weren’t “intentionally getting it wrong.”
But knowing the implications of his statement, Zuckerberg swiftly clarified his remarks, saying that he personally found “Holocaust denial deeply offensive” and “didn’t intend to defend the intent of people who deny that.”
“Our goal with fake news is not to prevent anyone from saying something untrue — but to stop fake news and misinformation spreading across our services,” Zuckerberg said.
He further reiterated the company’s stance on fake content and misinformation: “And of course if a post crossed the line into advocating for violence or hate against a particular group, it would be removed.”
According to reports, Facebook's new policy on fake content and misinformation will see it partner with local organizations to identify offending content.
Civil society groups in Myanmar have blamed Facebook for allowing its platform to be used to spread misinformation that put thousands of Rohingya people in danger.
Similar issues have been seen in Sri Lanka, where riots broke out over a misinformation campaign, and the Times says there have been attacks in India and Mexico related to social media posts.
In Africa, particularly Nigeria, Facebook users have used the platform to spread fake content and misinformation, triggering hatred and sometimes attacks.
During election periods, Facebook has played a significant role, with party loyalists using the platform to spread rumours and propaganda.
Fake content and misinformation on Facebook in Nigeria generally attract large numbers of shares and have in the past stoked hatred of certain ethnic groups in a country largely divided along religious and ethnic lines.
Zuckerberg told Recode that the company has begun “downranking” fake posts on its website that have been identified by independent fact checkers.
“There are really two core principles at play here. There’s giving people a voice, so that people can express their opinions,” he said. “Then, there’s keeping the community safe, which I think is really important. We’re not gonna let people plan violence or attack each other or do bad things.”
The implementation of Facebook's new policy on fake content and misinformation will see the platform fulfilling its core founding principle: FRIENDSHIP, not hatred and violence.