
San Francisco, November 14
Facebook has for the first time shared data on how it takes action against child nudity and child sexual exploitation, terrorist propaganda, illicit firearm and drug sales, and suicide and self-injury on its photo-sharing app Instagram.
In Q2 2019, Facebook removed about 512,000 pieces of content related to child nudity and child sexual exploitation on Instagram.
"In Q3 (July-September period), we saw greater progress and removed 754,000 pieces of content, of which 94.6 per cent we detected proactively," Guy Rosen, VP Integrity, said in a statement on Wednesday.
Like Facebook, Instagram too has become a platform for such content.
"For child nudity and sexual exploitation of children, we made improvements to our processes for adding violations to our internal database in order to detect and remove additional instances of the same content shared on both Facebook and Instagram," Rosen explained.
In its "Community Standards Enforcement Report, November 2019," the social networking platform said it has been detecting and removing content associated with Al Qaeda, ISIS and their affiliates on Facebook above 99 per cent.
"The rate at which we proactively detect content affiliated with any terrorist organisation on Facebook is 98.5 per cent and on Instagram is 92.2 per cent," informed the company.
In the area of suicide and self-injury, Facebook took action on about 2 million pieces of content in Q2 2019.
"We saw further progress in Q3 when we removed 2.5 million pieces of content, of which 97.3 per cent we detected proactively.
"On Instagram, we saw similar progress and removed about 835,000 pieces of content in Q2 2019, of which 77.8 per cent we detected proactively, and we removed about 845,000 pieces of content in Q3 2019, of which 79.1 per cent we detected proactively," said Rosen.
In Q3 2019, Facebook removed about 4.4 million pieces of drug sale content. It removed about 2.3 million pieces of firearm sales content in the same period.
On Instagram, the company removed about 1.5 million pieces of drug sale content and 58,600 pieces of firearm sales content.
On the spread of hate speech on its platforms, Facebook said it can detect such harmful content before people report it and, sometimes, before anyone sees it.
"With these evolutions in our detection systems, our proactive rate has climbed to 80 per cent, from 68 per cent in our last report, and we've increased the volume of content we find and remove for violating our hate speech policy," said Rosen.—IANS