Videos recommended by YouTube's algorithm were 40 percent more likely to be flagged by a volunteer as regrettable than videos the volunteer found through search.
The nonprofit Mozilla asked users of its Firefox web browser to install RegretsReporter, an open-source browser extension that tracks the videos they watch on YouTube and asks whether they regretted watching each one.

Mozilla launched the extension in September 2020, allowing viewers to report videos they regretted watching, and recruited some 30,000 volunteers to install it in their browsers. The extension also recorded the path that led to each video, capturing whether it was found through search or surfaced by the recommendation algorithm.

By donating their data, volunteers gave researchers access to a crowdsourced pool of YouTube recommendation data.
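To illustrate the general idea, here is a minimal sketch (not RegretsReporter's actual implementation) of how a WebExtension could record which YouTube videos a user watches and classify whether each was reached via search or via an on-site recommendation, by remembering the previous URL in each tab. It assumes the `browser` global from webextension-polyfill typings and the `webNavigation` and `storage` permissions declared in manifest.json; all names below are hypothetical.

```typescript
// Hypothetical sketch of a study extension's background script.
// Assumes "webNavigation" and "storage" permissions in manifest.json.

const lastUrlByTab = new Map<number, string>();

// Guess how the viewer arrived at a watch page from the tab's previous URL.
function classifyArrival(previousUrl: string | undefined): string {
  if (!previousUrl) return "direct";                                        // new tab, external link, etc.
  if (previousUrl.includes("youtube.com/results")) return "search";        // came from a search results page
  if (previousUrl.includes("youtube.com/watch")) return "recommendation";  // clicked from "Up next" / sidebar
  return "other";
}

async function handleNavigation(details: { tabId: number; url: string; frameId: number }) {
  if (details.frameId !== 0) return;  // top-level navigations only
  const url = new URL(details.url);

  if (url.hostname.endsWith("youtube.com") && url.pathname === "/watch") {
    const record = {
      videoId: url.searchParams.get("v"),
      arrivedVia: classifyArrival(lastUrlByTab.get(details.tabId)),
      timestamp: Date.now(),
    };
    // Store locally; a real research tool would only share data with explicit consent.
    const { watchLog = [] } = await browser.storage.local.get("watchLog");
    await browser.storage.local.set({ watchLog: [...watchLog, record] });
  }
  lastUrlByTab.set(details.tabId, details.url);
}

browser.webNavigation.onCommitted.addListener(handleNavigation);
// YouTube is a single-page app, so many in-page navigations fire this event instead:
browser.webNavigation.onHistoryStateUpdated.addListener(handleNavigation);
```

The key design point is that the classification relies only on navigation history visible to the browser, without any access to YouTube's internal recommendation system.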
The study found that viewers in non-English-speaking countries are more likely to encounter regrettable content on YouTube, and that videos featuring hate speech, violence, and scams show up in recommendations. It also revealed an unexpected detail: in 43.3 percent of cases, the recommended video was completely unrelated to the videos the volunteer had previously watched.
Volunteers reported videos to the platform's moderators for spreading coronavirus panic, political misinformation, and inappropriate cartoons aimed at children.
More than 71 percent of all videos reported by volunteers had been actively recommended by YouTube's own algorithm. Nearly 200 of those videos have since been removed from the platform, including several that YouTube determined violated its own policies. Before removal, they had accumulated a combined 160 million views, according to the Mozilla report.