Mozilla RegretsReporter data shows YouTube keeps recommending harmful videos


Mozilla RegretsReporter data: in countries where English is not the primary language, the rate of regret reports is higher.

That YouTube’s machine-learning-based recommendation feed can often surface results that are annoying, or even radicalizing, is no longer in doubt. YouTube itself promotes tools that it says give users more control over their feed and more transparency about certain recommendations, but it’s hard for outside observers to know what impact those tools actually have. Now, after spending much of the last year collecting data with the RegretsReporter extension (available for Firefox or Chrome), the Mozilla Foundation has a clearer picture of what people see when the algorithm makes poor choices, and it has released a detailed report (PDF).

Mozilla launched the extension in September 2020, taking a crowdsourced approach to identifying “regrettable” content that people encounter via the recommendation engine. After receiving 3,362 reports (along with data from people who installed the extension but never submitted a report), the trends in the data illustrate the dangers of YouTube’s approach.


While the foundation acknowledges that the concept of a “regret” is purposely vague, it found that 12.2 percent of the reported videos violated YouTube’s own content rules, and noted that about nine percent of them (nearly 200 in total) have since been removed from YouTube, after racking up more than 160 million views. As for why these videos were recommended in the first place, a possible explanation is their popularity: Mozilla noted that reported videos averaged 70 percent more views per day than other videos watched by volunteers.

Mozilla’s senior director of advocacy Brandi Geurkink says, “YouTube needs to admit that their algorithm is designed in a way that harms and misinforms people.” Two statistics from the study particularly caught my attention: Mozilla states that “in 43.3 percent of the cases where we have data on videos that volunteers watched before a Regret, the recommendation was completely unrelated to previous videos that the volunteer had watched.” In addition, the rate of reported regrets was 60 percent higher in countries where English is not the primary language. Despite the small sample size and possible bias in the data, the findings suggest that something worth scrutinizing is happening in places where the mostly English-speaking press and researchers aren’t paying attention.

NBC News included a statement from YouTube about the report, in which the company claims that “in the last year alone, we’ve launched more than 30 different changes to reduce recommendations of harmful content.” YouTube offered a similar response when the project launched last year. The reforms Mozilla proposes include transparency reporting and the ability to opt out of personalization, but given that YouTube brings in more than $6 billion a quarter from advertising, an opt-out from profiling seems unlikely.

