New Delhi, November 20: Social media giant Facebook has for the first time disclosed the prevalence of hate speech on its platform, saying that out of every 10,000 content views globally in the third quarter, 10 to 11 were of hate speech. Facebook, which has 1.82 billion daily users globally, has drawn flak in the past for its handling of hate speech in India, which is among its biggest markets.
In its Community Standards Enforcement Report for the September 2020 quarter, Facebook said it is including the prevalence of hate speech on its platform globally "for the first time".
"In Q3 2020, hate speech prevalence was 0.10 per cent – 0.11 per cent or 10 to 11 views of hate speech for every 10,000 views of content," it added. Facebook said due to its investments in artificial intelligence, the company has been able to remove more hate speech and find more of it proactively before users report it.
"Our enforcement metrics this quarter, including how much hate speech content we found proactively and how much content we took action on, indicate that we're making progress in catching harmful content," it added.
Prevalence, unlike enforcement metrics that count the pieces of content acted on, estimates the percentage of times people actually see violating content on the platform, Facebook explained. During the third quarter, Facebook took action on 22.1 million pieces of hate speech content, about 95 per cent of which was proactively identified.
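As a rough illustration of how such a prevalence estimate relates to the figures in the report (the sampling methodology is Facebook's and not public; the function below is a simplified sketch, not the company's actual calculation):

```python
# Illustrative sketch only: prevalence as the share of sampled content
# views that contained violating content. Facebook's real methodology
# (sampling, weighting) is not public; this just reproduces the arithmetic.

def prevalence(violating_views: int, total_views: int) -> float:
    """Fraction of sampled views that were of violating content."""
    return violating_views / total_views

# 10 to 11 hate-speech views per 10,000 sampled content views
low = prevalence(10, 10_000)
high = prevalence(11, 10_000)
print(f"{low:.2%} - {high:.2%}")  # prints "0.10% - 0.11%"
```

This makes the distinction in the report concrete: the 22.1 million figure counts pieces of content removed, while prevalence measures how often violating content was seen before removal.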
On Instagram, the company took action on 6.5 million pieces of hate speech content (up from 3.2 million in June quarter), about 95 per cent of which was proactively identified (up from about 85 per cent in the previous quarter), it added.
The latest Community Standards Enforcement Report provides metrics on how Facebook enforced its policies from July to September, and includes metrics across 12 policies on Facebook and 10 policies on Instagram.
Facebook Vice President (Integrity) Guy Rosen said the company is also updating its Community Standards website to include additional policies that require more context and can't always be applied at scale. These policies often require specialised teams to gather more information on a given issue in order to make decisions, he added.
The company said that while the COVID-19 pandemic continues to disrupt its content review workforce, it is seeing some enforcement metrics return to pre-pandemic levels.
"Our proactive detection rates for violating content are up from Q2 across most policies, due to improvements in AI and expanding our detection technologies to more languages. Even with a reduced review capacity, we still prioritise the most sensitive content for people to review, which includes areas like suicide and self-injury and child nudity," it added.
In the September quarter, Facebook took action on 19.2 million pieces of violent and graphic content (up from 15 million in the June quarter), 12.4 million pieces of child nudity and sexual exploitation content (up from 9.5 million in Q2), and 3.5 million pieces of bullying and harassment content (up from 2.4 million in the previous quarter).
For Instagram, action was taken on 4.1 million pieces of violent and graphic content (up from 3.1 million in Q2), 1 million pieces of child nudity and sexual exploitation content (up from 481,000 in Q2), 2.6 million pieces of bullying and harassment content (up from 2.3 million in the June quarter), and 1.3 million pieces of suicide and self-injury content (up from 277,400 in Q2).
The Community Standards Enforcement Report is published in conjunction with Facebook's biannual Transparency Report. The Transparency Report shares numbers on government requests for user data, content restrictions based on local law, intellectual property takedowns and internet disruptions.
During the first six months of 2020, government requests for user data increased 23 per cent from 140,875 to 173,592 globally, it said. Of the total volume, the US continues to submit the largest number of requests, followed by India, Germany, France, and the UK, it added.
During the period, the volume of content restrictions based on local law increased globally by 40 per cent from 15,826 to 22,120. The increase was in part related to COVID-19 related restrictions, it said.
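The two growth figures above are consistent with the underlying counts, as a quick percentage-increase calculation shows (a sketch; the helper function name is ours):

```python
# Verify the report's stated growth rates from its raw counts.

def pct_increase(old: int, new: int) -> float:
    """Percentage increase from old to new."""
    return (new - old) / old * 100

# Government requests for user data: 140,875 -> 173,592
print(round(pct_increase(140_875, 173_592)))  # prints 23

# Content restrictions based on local law: 15,826 -> 22,120
print(round(pct_increase(15_826, 22_120)))    # prints 40
```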
Also, in the first half of 2020, the company identified 52 disruptions of Facebook services in nine countries, compared to 45 disruptions in six countries in the second half of 2019.