YouTube has already removed 50 channels deemed inappropriate in the past week, including the popular Toy Freaks and a number running fake "adult" versions of cartoons featuring beloved animated characters from Peppa Pig, Frozen, Minions and Thomas the Tank Engine.
According to investigations by BBC News and The Times, tens of thousands of predatory accounts are still estimated to be leaving indecent comments on videos of children. Some of the videos are posted by paedophiles, while many are innocently posted by youngsters.
Some of the comments are said to be sexually explicit, while others reportedly encourage children posting the videos to perform sexual acts.
Anne Longfield, the Children's Commissioner, said the findings were “very worrying”, while the National Crime Agency said it was “vital” online platforms have robust protection mechanisms in place when they are used by children.
The BBC and The Times spoke to people from the site's “trusted flagger” scheme who report inappropriate content or behaviour by users to YouTube employees.
Some of the volunteer moderators told the BBC there could be “between 50,000 to 100,000 active predatory accounts still on the platform” while another told The Times there are “at least 50,000 active predators” on the site.
As well as trusted flaggers, YouTube also uses algorithms to identify inappropriate sexual or predatory comments.
However, the system is said to be failing to tackle the problem and paedophiles are continuing to comment on videos of children.
According to The Times, adverts for several major international brands, including a global sportswear brand and food and drink giants, appear alongside the videos, raising concerns that they could be indirectly funding child abuse.
Several of the companies are reported to have pulled adverts from the site on the eve of Black Friday.
YouTube said it had noticed a growing trend around content “that attempts to pass as family-friendly, but is clearly not” in recent months and announced new ways it was “toughening our approach”.
Johanna Wright, vice president of product management at YouTube, said in a blog post: “We have historically used a combination of automated systems and human flagging and review to remove inappropriate sexual or predatory comments on videos featuring minors.
“Comments of this nature are abhorrent and we work... to report illegal behaviour to law enforcement. Starting this week we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.”
Ms Longfield told the BBC: “This is a global platform and so the company need to ensure they have a global response. There needs to be a company-wide response that absolutely puts child protection as a number one priority, and has the people and mechanisms in place to ensure that no child has been put in an unsafe position while they use the platform.”
The National Crime Agency told the BBC: “It is vital that online platforms used by children and young people have in place robust mechanisms and processes to prevent, identify and report sexual exploitation and abuse.”