Kent Walker, a senior vice-president and general counsel at Google, said in a blogpost: “Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all.
“Google and YouTube are committed to being part of the solution. We are working with government, law enforcement and civil society groups to tackle the problem of violent extremism online. There should be no place for terrorist content on our services.
“While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now.”
Google says its engineers have developed technology to prevent re-uploads of known terrorist content, using image-matching techniques.
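Google has not published details of this system, but "image-matching" for re-upload detection is commonly done with perceptual hashing: a compact fingerprint that stays nearly identical when a file is re-encoded or lightly altered. The sketch below illustrates the general technique with a simple average hash; it is a generic illustration, not Google's actual implementation.

```python
# Generic sketch of perceptual "average hash" image matching --
# illustrating the general technique, not Google's actual system.

def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grayscale grid.

    pixels: list of 8 rows, each a list of 8 brightness values (0-255).
    Each bit marks whether a pixel is brighter than the image's mean.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for value in flat:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_match(hash_a, hash_b, threshold=5):
    """Near-identical images produce hashes within a small bit distance."""
    return hamming_distance(hash_a, hash_b) <= threshold

# A known image and a slightly re-encoded copy hash to nearby values.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
re_upload = [[min(255, (r * 8 + c) * 4 + 2) for c in range(8)]
             for r in range(8)]

h1 = average_hash(original)
h2 = average_hash(re_upload)
print(is_match(h1, h2))
```

In practice a platform would keep a database of hashes of known prohibited content and compare each new upload's hash against it, flagging anything within the distance threshold for removal or review.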
Google will increase the number of independent experts in YouTube’s Trusted Flagger programme, and will expand its work with counter-extremism groups to help identify content being used to radicalise viewers.
It will take a tougher stance on videos that contain inflammatory religious or supremacist content.
Walker said: “Collectively, these changes will make a difference. And we’ll keep working on the problem until we get the balance right. We are committed to playing our part.”
Labour’s Yvette Cooper, chair of the Commons home affairs select committee, welcomed the pledges. “This is a very welcome step forward from Google after the [committee] called on them to take more responsibility for searching for illegal content,” she said.
“The select committee recommended that they should be more proactive in searching for and taking down illegal and extremist content, and to invest more of their profits in moderation.
“News that Google will now proactively scan content and fund the trusted flaggers who were helping to moderate their own site is therefore important and welcome, though there is still more to do.”