YouTube 'failing to remove terrorist videos'

20 September 2017
Extremist footage posted by Islamists and neo-Nazis remains online despite pledges from the world’s largest internet companies to remove it, a study has found, The Independent reports.
Research carried out by the Henry Jackson Society think-tank found hundreds of extremist videos are available on YouTube, including many that have already been flagged to monitors.

One was entitled “Adolf Hitler was right”, another showed a Muslim man being attacked and Taliban propaganda was also found on the world’s second most-viewed website.

Yvette Cooper, who commissioned the report, said it was “unacceptable” that footage glorifying extremist violence remained online.

The Labour MP, who is chair of the Home Affairs Select Committee, accused YouTube of taking too long to remove the material, adding: “Whether that’s Islamic extremism or far Right extremism, the reality is that this material is far too easy to access.

“We know social media can play a role in the radicalisation of young people, drawing them in with twisted and warped ideology.

“YouTube have promised to do more, but they just aren't moving fast enough.”

The Henry Jackson Society found that 61 reported far-right videos and 60 Islamist videos were still online on YouTube, although dozens of videos that had been flagged were removed.

Those remaining included a video entitled “Adolf Hitler was right”, which showed praise of the Nazi leader alongside images of Jewish families being taken to concentration camps.

Another video showed a child singing over footage glorifying terrorism, and Taliban propaganda was also found on the site.

Another video showed a suspected hate crime in which a man slapped a Muslim teenager with bacon and called him “Isis scum”.

All four videos were flagged in July and August, but remained online this week.

Dr Alan Mendoza, executive director of the Henry Jackson Society, said his group’s research showed that in more than a third of Islamist terror cases between 1998 and 2015, the internet had a major impact on the offender’s engagement with extremism and terrorism.

“These ideologies can be freely disseminated and amplified online and there is room for improvement by technology firms to provide spaces to expose and debate their inconsistencies,” he added.

Ms Cooper called on the Government to introduce “proper penalties and fines for social media companies who do not act swiftly enough to remove dangerous and illegal content”.

Her comments came after separate research by the Policy Exchange think-tank found that almost three quarters of the British public want large internet companies to do more to find and delete content that could radicalise people.

Its report warned that Isis is winning an ongoing “netwar” against authorities trying to stop the spread of extremist material online and remains able to distribute propaganda and instructions on carrying out terror attacks.

The Policy Exchange found that jihadi content was accessed more frequently in the UK than anywhere else in Europe, with the country in fifth place globally behind Turkey, the US, Saudi Arabia and Iraq.

General David Petraeus, the former director of the CIA, said the current situation was “clearly unacceptable”, adding: “It is clear that our counter-extremism efforts and other initiatives to combat extremism online have, until now, been inadequate.”

The retired general, who commanded Nato forces in Afghanistan, said the attempted bombing in Parsons Green underscored the threat generated by instructions and other materials available online.

He cautioned that while few doubt Isis’ physical “caliphate” in Syria and Iraq will be eradicated along with most of its militants, the group will continue to inspire atrocities around the world by targeting the most vulnerable sections of society with its “poisonous ideology”.

Analysts have warned that the role of the internet in radicalisation has been overplayed, with research showing that personal relationships and real-world networks play a defining role. Nevertheless, online propaganda has been targeted in intensifying crackdowns after being linked to a series of terror attacks.

In May 2016, Facebook, Microsoft, Twitter and YouTube signed up to an EU-sponsored code of conduct that pledged to establish improved ways to take down illegal hate speech and other extremist material.

Amber Rudd, the Home Secretary, said she had already made it “crystal clear” to internet firms that “they need to go further and faster to remove terrorist content from their websites and prevent it being uploaded in the first place”.

She added: “This Government has been instrumental in the creation of the Global Internet Forum to Counter Terrorism, which is being led by the major companies and will develop technical solutions to automatically detect and remove terrorist propaganda.

“The internet cannot be used as a safe space for terrorists and criminals, and industry need to ensure that the services they provide are not being exploited by those who wish to do us harm.”

Theresa May is due to co-host a meeting on terrorist groups’ use of the internet alongside Emmanuel Macron and Paolo Gentiloni on the margins of the UN General Assembly in New York on Wednesday.

A spokesperson for YouTube said the site was “determined to be part of the solution” to extremism.

He added: “We’ve put our best talent and technology to the task and we’re making progress through new machine learning technology, partnerships with experts and collaborations with other companies through the Global Internet Forum.

“Through new uses of technology, the majority of videos we removed for violent extremism over the past month were taken down before receiving a single human flag. We’re doing more every day to tackle these complex issues.”

There have been concerns that YouTube's new algorithms intended to identify extremist content have resulted in the removal of footage documenting war crimes in Syria.

Other mainstream platforms including Google, Twitter and Facebook have recently been found to contain extremist links and posts, with analysts comparing the continuing battle to stop removed content re-appearing elsewhere to a game of “whack-a-mole”.
