Facebook releases its first-ever measure of the prevalence of hate speech on its platform


The world’s biggest social media company, under scrutiny over its policing of abuses, particularly around November’s U.S. presidential election, released the estimate in its quarterly content moderation report.

For the first time, Facebook has published figures on the prevalence of hate speech on its platform. They show that out of every 10,000 views of content on Facebook, 10 to 11 are views of hate speech.

“We specifically measure how much harmful content may be seen on Facebook and Instagram because the amount of times the content is seen is not evenly distributed. One piece of content could go viral and be seen by lots of people in a very short amount of time, whereas other content could be on the internet for a long time and only be seen by a handful of people. We evaluate the effectiveness of our enforcement by trying to keep the prevalence of hate speech on our platform as low as possible while minimizing mistakes in the content that we remove,” said Facebook in its report.
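To make that distinction concrete, here is a minimal sketch in Python, with made-up numbers and not Facebook’s actual code, of how a view-weighted prevalence figure differs from simply counting violating posts:

```python
# Illustrative sketch: prevalence is measured over views, not over
# pieces of content. All posts and view counts below are invented.

posts = [
    {"views": 1_000_000, "violating": True},   # one viral violating post
    {"views": 50, "violating": False},
    {"views": 120, "violating": False},
    {"views": 30, "violating": True},          # a barely-seen violating post
]

total_views = sum(p["views"] for p in posts)
violating_views = sum(p["views"] for p in posts if p["violating"])

# Content-level rate: 2 of 4 posts violate -> 50%
content_rate = sum(p["violating"] for p in posts) / len(posts)

# View-weighted prevalence: exposure is dominated by the viral post
view_prevalence = violating_views / total_views

print(f"share of posts that violate: {content_rate:.2%}")
print(f"share of views that are violating (prevalence): {view_prevalence:.2%}")
```

A single viral violating post dominates the view-weighted figure, which is why prevalence captures actual exposure to harmful content better than a count of removed posts does.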

Prevalence measures how often people see violating content on the platform. Facebook estimates hate speech prevalence by selecting a sample of the content viewed on Facebook and then calculating what percentage of those views violates its hate speech policies.

Since hate speech depends on context such as language and culture, Facebook sends these samples to reviewers around the globe. Using this method, it measured prevalence from July 2020 to September 2020 at 0.10% to 0.11%. In other words, out of every 10,000 views of content, 10 to 11 were of hate speech.
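For illustration, here is a minimal sketch of how such a sample-based estimate works. The sample counts and the normal-approximation (Wald) confidence interval are assumptions for the example; the report’s 0.10%–0.11% range is not necessarily computed this way.

```python
import math

def prevalence_estimate(violating_in_sample: int, sample_size: int, z: float = 1.96):
    """Estimate view-based prevalence from a labeled sample of content views,
    with a normal-approximation (Wald) 95% confidence interval.
    This is an illustrative sketch, not Facebook's methodology."""
    p = violating_in_sample / sample_size
    margin = z * math.sqrt(p * (1 - p) / sample_size)
    return p, max(0.0, p - margin), p + margin

# Hypothetical sample: 105 of 100,000 sampled views labeled as hate speech
p, lo, hi = prevalence_estimate(105, 100_000)
print(f"estimated prevalence: {p:.3%} (95% CI: {lo:.3%} to {hi:.3%})")
# Point estimate of about 0.105%, i.e. roughly 10-11 violating views
# per 10,000, in line with the figure Facebook reported.
```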

Facebook said it took action on 22.1 million pieces of hate speech content in the third quarter, about 95% of which was proactively identified, compared with 22.5 million pieces in the prior quarter.

[Chart: Facebook hate speech prevalence. Image source: Facebook]

Facebook defines ‘taking action’ as removing content, covering it with a notice or warning, disabling accounts, or escalating it to external agencies.

“We’ve invested billions of dollars in people and technology to enforce these rules, and we have more than 35,000 people working on safety and security at Facebook. As speech continues to evolve over time, we continue to revise our policies to reflect changing societal trends. But we believe decisions about free expression and safety shouldn’t be made by Facebook alone, so we continue to consult third-party experts in shaping our policies and enforcement tactics. And for difficult and significant content decisions, we can now refer cases to the Oversight Board for their independent review and binding decisions,” Facebook added.