Facebook Removes 8.7 Million Child Nudity Images

Facebook says company moderators removed 8.7 million user images of child nudity during the last quarter with the help of previously undisclosed software that automatically flags such photos.

The machine learning tool, rolled out over the last year, identifies images that contain both nudity and a child, allowing increased enforcement of Facebook's ban on photos that show minors in a sexualised context.

A similar system also disclosed on Wednesday catches users engaged in "grooming," or befriending minors for sexual exploitation.


Facebook's global head of safety, Antigone Davis, told Reuters in an interview that the "machine helps us prioritise" and "more efficiently queue" problematic content for the company's trained team of reviewers.
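The article does not describe how the flagging works internally. Purely as an illustrative sketch, the example below shows one way two hypothetical classifier signals (a nudity score and a minor-detection score) could be combined to flag an image and queue it so higher-confidence matches reach human reviewers first. Every name, score, and threshold here is an assumption, not Facebook's actual system.

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class FlaggedImage:
    # Negative combined score so heapq pops the highest-confidence case first.
    priority: float
    image_id: str = field(compare=False)

def flag_for_review(image_id: str, nudity_score: float, minor_score: float,
                    queue: list, threshold: float = 0.8) -> bool:
    """Flag an image when both hypothetical signals exceed a threshold,
    and queue it for human review by combined confidence."""
    if nudity_score >= threshold and minor_score >= threshold:
        heapq.heappush(queue, FlaggedImage(-(nudity_score * minor_score), image_id))
        return True
    return False

# Usage: reviewers pull the most confident matches from the front of the queue.
review_queue: list = []
flag_for_review("img_001", nudity_score=0.95, minor_score=0.91, queue=review_queue)
flag_for_review("img_002", nudity_score=0.40, minor_score=0.97, queue=review_queue)
if review_queue:
    print(heapq.heappop(review_queue).image_id)  # -> img_001
```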


The company is exploring applying the same technology to its Instagram app.

Under pressure from regulators and lawmakers, Facebook has vowed to speed up removal of extremist and illicit material.


Machine learning programs that sift through the billions of pieces of content users post each day are essential to its plan.

Machine learning is imperfect, and news agencies and advertisers are among those that have complained this year about Facebook's automated systems wrongly blocking their posts.

Antigone Davis, Facebook's Global Head of Safety. Image: Getty

Davis said the child safety systems would make mistakes but users could appeal.

"We'd rather err on the side of caution with children," she said.

For years, Facebook's rules have banned even family photos of lightly clothed children uploaded with "good intentions," out of concern about how others might abuse such images.

Before the new software, Facebook relied on users or its adult nudity filters to catch child images.

A separate system blocks child pornography that has previously been reported to authorities.
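The article does not say how that separate blocking system works. Known-image matching in this space typically compares uploads against a database of hashes of previously reported material (industry tools such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding). As a simplified sketch only, the example below substitutes an exact SHA-256 digest for a perceptual hash; the database contents and function names are hypothetical.

```python
import hashlib

# Hypothetical digests of images already reported to authorities.
# The value below is a placeholder, not a real record.
KNOWN_REPORTED_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_known_reported(image_bytes: bytes) -> bool:
    """Return True if the upload exactly matches a previously reported image."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_REPORTED_HASHES

def handle_upload(image_bytes: bytes) -> str:
    # Block known matches outright; everything else continues to the
    # classifier-and-review pipeline described above.
    return "blocked" if is_known_reported(image_bytes) else "continue_review"

print(handle_upload(b"example image bytes"))  # -> continue_review
```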

Facebook has not previously disclosed data on child nudity removals, though some would have been counted among the 21 million posts and comments it removed in the first quarter for sexual activity and adult nudity.
