YouTube viewers asked to help uncover how users are sent to harmful videos

18 September 2020

YouTube viewers are being asked to become “watchdogs” and record their use of the site to help uncover the ways in which its recommendation algorithm can lead to online radicalisation.

Mozilla, the non-profit behind the Firefox web browser, has produced a new browser extension, called RegretsReporter, which will allow YouTube users to record and upload information about harmful videos recommended by the site, as well as the route they took to get there.

“For years, people have raised the alarm about YouTube recommending conspiracy theories, misinformation, and other harmful content,” said Ashley Boyd, Mozilla’s head of engagement and advocacy. “One of YouTube’s most consistent responses is to say that they are making progress on this and have reduced harmful recommendations by 70%. But there is no way to verify those claims or understand where YouTube still has work to do.

“That’s why we’re recruiting YouTube users to become YouTube watchdogs. People can donate their own recommendation data to help us understand what YouTube is recommending, and help us gain insight into how to make recommendation engines at large more trustworthy.”

More than two years ago, the Guardian revealed how YouTube’s recommendation algorithm, the workings of which are not fully understood outside the company, gave “dangerously skewed” video suggestions.

In the years since, users have repeatedly expressed surprise and dismay at the harmful content they are encouraged to view. After watching a YouTube video about Vikings, one user said they were recommended content about white supremacy; another found that You’ve Been Framed-style footage led them to grisly clips of real-life fatal accidents.

“More than 70% of all videos viewed on YouTube are suggested by the site’s recommendation engine,” Mozilla said, “but even the basics of how it works are poorly understood.” The organisation says it wants to research questions such as which types of recommended videos lead to racist, violent or conspiratorial content, and whether specific YouTube usage patterns make harmful recommendations more likely. It says it will share findings from the research in an open-source fashion.

In a statement, a YouTube spokesperson said: “While we welcome more research on this front, we have not seen the videos, screenshots or data in question and can’t properly review Mozilla’s claims. Generally, we’ve designed our systems to help ensure that content from more authoritative sources is surfaced prominently in recommendations. We’ve also introduced over 30 changes to recommendations since the beginning of the year, resulting in a 50% drop in watch time of borderline content and harmful misinformation coming from recommendations in the US.”

The campaign is a high-stakes move for Mozilla. The organisation derives most of its revenue from a deal with Google under which Google Search is set as the default search engine in the Firefox browser, and though it has been trying to diversify its income streams, a revenue squeeze in January this year led to 70 staff members being laid off.

 

The Guardian

