Facebook and Twitter could be hit by new tax as part of crackdown
Web firms will have a chance to give their views on the levy being proposed by Culture Secretary Karen Bradley and even suggest alternatives, as part of a drive to make the UK the “safest place in the world” to be online.
It comes as the chair of Ofcom spoke in Parliament to raise the prospect of web companies facing greater regulation to tackle the rise of “fake news”.
Theresa May has also already challenged the world’s biggest technology firms to take down terrorist propaganda in as little as one hour or face the threat of new fines.
Among the options proposed in Ms Bradley’s internet safety green paper is “an industry-wide levy” so social media companies and service providers fund schemes that “raise awareness and counter internet harms”.
The Independent understands that the Government is interested to see what action the private sector takes first – with a voluntary funded approach possible – before imposing any new levy on firms.
Officials said that how any new tax would work, and how much it would raise, would be subject to a consultation with industry and other stakeholders.
Ms Bradley said: “The Internet has been an amazing force for good, but it has caused undeniable suffering and can be an especially harmful place for children and vulnerable people.
“Behaviour that is unacceptable in real life is unacceptable on a computer screen. We need an approach to the Internet that protects everyone without restricting growth and innovation in the digital economy.
“Our ideas are ambitious – and rightly so. Collaboratively, government, industry, parents and communities can keep citizens safe online, but only by working together.”
Other proposals include a new voluntary social media “code of practice” to remove bullying, intimidating or humiliating online content as quickly as possible; an annual internet safety transparency report to show progress on addressing abusive and harmful content; and extra guidance for tech and digital start-ups to ensure safety features are built into apps and products from the very start.
Shadow Culture Secretary Tom Watson said: “This announcement is short on detail.
“The Government needs to say more about who exactly will pay the proposed levy, how much they will pay and how it will be spent. And they need to explain what transparency information they will be asking social media companies to provide.”
Ofcom Chair Dame Patricia Hodgson revealed that the organisation’s board last week discussed how the internet could be regulated in the future, although she said this was ultimately a matter for government.
Asked about the rise of ‘fake news’ and whether internet companies should face tougher rules to remove it from platforms, Dame Patricia pointed out that firms are not within Ofcom’s responsibility, but said she personally believed that they are “publishers” and so should be in the regulator’s remit.
She said: “We feel very strongly about the integrity of news in this country and we are totally supportive of steps that should and need to be taken to improve matters.”
Research from regulator Ofcom has shown half of adults say they are concerned about online material, while a fifth of 12- to 15-year-olds encountered something online they found “worrying or nasty”.
Children are particularly at risk, with the proportion exposed to hate content rising – 64 per cent of children and young people aged 13-17 have seen images or videos that are offensive.
The UK Safer Internet Centre, a partnership of charities Childnet, the Internet Watch Foundation and South West Grid for Learning, welcomed the measures.
Social enterprise Parent Zone, which offers services to help parents manage children’s web use, said: “It is encouraging to see the Government proposing concrete steps to ensure that industry is doing everything they can to support families and make the Internet a place that contributes to children flourishing.”
An NSPCC spokesperson said: “Keeping young people safe online is the biggest child protection issue of our generation. Social media companies are marking their own homework when it comes to keeping children safe, so a code of practice is definitely a step in the right direction but ‘how’ it is implemented will be crucial.”
At the UN last month Ms May joined other world leaders to pile pressure on web giants to deal better with extremist content, challenging them to take it down in as little as an hour.
Currently, it takes an average of 36 hours for such material to be removed – a figure cut from 30 days one year ago – so one to two hours would be a dramatic reduction.