YouTube pulls 8.3M objectionable videos with help of detection software

24 April 2018

YouTube removed 8.3 million videos in three months for breaching its guidelines, the majority of which were flagged by an automatic detection system, the company said in a report released this week.

Seeking to address criticism of offensive and violent content on the video platform, YouTube said in its first quarterly moderation report that it had stepped up its policing, removing millions of videos between October and December last year. The company said the flagged videos were "mostly spam or people attempting to upload adult content."

YouTube said 76 percent of the videos were removed before receiving a single view, and that more than 80 percent of the removed content was automatically flagged by its machine detection systems.

The systems work in several ways, including algorithms that detect inappropriate footage, tracking systems that monitor suspicious uploading patterns, and machine learning technology that identifies offensive videos based on similarity to previous uploads.
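The report does not spell out how these systems are built. As a rough illustration of the third approach only, the sketch below flags a new upload whose feature vector closely resembles vectors of previously removed videos; the feature-extraction step it presumes, the function names, and the similarity threshold are assumptions for illustration, not a description of YouTube's actual pipeline.

# Hypothetical sketch of similarity-based flagging, not YouTube's system.
# Assumes each video has already been reduced to a fixed-length feature
# vector by some separate embedding step; all names and the threshold
# below are invented for this example.
import numpy as np

SIMILARITY_THRESHOLD = 0.9  # assumed cutoff for "close to known-bad content"

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def should_flag(new_video: np.ndarray, removed_videos: list[np.ndarray]) -> bool:
    """Flag a new upload if it closely resembles any previously removed video."""
    return any(cosine_similarity(new_video, v) >= SIMILARITY_THRESHOLD
               for v in removed_videos)

# Toy usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
known_bad = [rng.normal(size=128) for _ in range(3)]
upload = known_bad[0] + rng.normal(scale=0.05, size=128)  # near-duplicate of a removed video
print(should_flag(upload, known_bad))  # True: similar enough to route to human review

In practice such a flag would only queue the video for review rather than remove it outright, which is consistent with the human review the company describes below.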

YouTube said the software has been key in dealing with the problem of offensive content, though human reviewers also flag and remove videos.

It said the latest report would "help show the progress we're making in removing violative content from our platform."

A subsidiary of Google's parent company Alphabet, YouTube has come under pressure from national regulatory bodies and the EU to tighten policing of objectionable content.

Google promised to expand its enforcement staff to 10,000 people by the end of 2018, most of whom will be human reviewers working on YouTube.

