On Feb. 9, more than a month after Paul posted a video featuring the blurred face of a suicide victim, YouTube announced that it would suspend monetization on his channel, cutting off a lucrative revenue stream for the creator. The decision came not only after Paul posted a video that was deemed insensitive to mental health issues, but also after he released videos that showed him tasering a rat and encouraging viewers to participate in the Tide Pod challenge.
"We think that's actually a pretty strong statement in itself," Wojcicki said of the decision to demonetize Paul's account during a talk at the Code Media conference on Monday evening in Huntington Beach, Calif.
Paul's video in Japan's so-called "Suicide Forest" was posted in the final hours of 2017 and immediately drew criticism from the creator community. Many people who operate channels were frustrated by how long it took YouTube to respond. When the Google-owned company took its first action in January, it removed him from its premium ad tier, Google Preferred, and put its original projects with him on hold.
But it was clear that the broader community was still concerned that Paul's actions could have a negative impact on how advertisers and other partners view YouTube. So on Feb. 9, YouTube announced that, in response to the Paul situation, it would tighten its policies and provide a more detailed list of possible consequences if a YouTuber violates its guidelines or does something that has the potential to cause widespread harm to the community. At the same time, it temporarily suspended monetization on his channel.
While YouTube has a three-strike rule that can get channels kicked off its platform, Wojcicki noted that Paul's actions wouldn't have warranted that level of punishment. "We can't just be pulling people off of our platform," she explained. "They need to violate a policy and we need to have consistent behavior."
Part of YouTube's challenge is how to balance being an open technology platform and one that must moderate content uploaded into its ecosystem. That dilemma has become especially relevant over the last year as YouTube has faced one advertiser revolt after another over everything from inappropriate videos to exploitative children's content.
Wojcicki announced in December that YouTube would grow its content and moderation teams to 10,000 people in 2018 to help it better review the videos posted to its platform. But she noted on stage on Monday that more than 400 hours of video are uploaded to the platform every minute, making it basically impossible for humans to moderate it all.
Wojcicki said that she's been especially focused on figuring out how much moderation is the right amount. "It's really important to figure out where are you drawing that line," Wojcicki noted, explaining that "on one side is censorship, but on the other side is too much freedom of speech."
When interviewer Kara Swisher asked whether Wojcicki considers YouTube a media platform, the CEO responded, as tech executives predictably do, by saying, "We are a technology platform. The output of our product is media."
This article appeared on billboard.com.