YouTube’s ‘dislike’ and ‘not interested’ buttons don’t work, study finds

Even when users tell YouTube they’re not interested in certain types of videos, similar recommendations keep coming, a new study by Mozilla has found.

Using video recommendation data from more than 20,000 YouTube users, Mozilla researchers found that buttons like “Not interested,” “Dislike,” “Don’t recommend channel,” and “Remove from history” are largely ineffective at preventing similar content from being recommended. Even at their most effective, these buttons still let through more than half of the recommendations similar to content a user said they weren’t interested in, the report found. At their worst, the buttons barely made a dent in blocking similar videos.

To collect data from real videos and users, Mozilla researchers recruited volunteers who used the foundation’s RegretsReporter, a browser extension that overlays a simple “stop recommending” button onto the YouTube videos participants watch. On the back end, users were randomly assigned to a group, so each time they clicked the button placed by Mozilla, a different signal was sent to YouTube — dislike, not interested, don’t recommend the channel, or remove from history — while a control group sent no feedback to the platform at all.
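For illustration, here is a minimal Python sketch of how that kind of random group assignment and signal dispatch could work. The arm names, the hashing scheme, and the send_feedback helper are hypothetical stand-ins, not Mozilla’s actual extension code (which is a browser extension, not a Python script):

```python
import hashlib

# Hypothetical experiment arms; the control arm sends nothing to YouTube.
FEEDBACK_ARMS = [
    "dislike",
    "not_interested",
    "dont_recommend_channel",
    "remove_from_history",
    "control",
]

def assign_arm(user_id: str) -> str:
    """Deterministically assign a participant to one experiment arm.

    Hashing the user ID keeps the assignment stable across sessions
    without any server-side bookkeeping -- one common way to
    randomize a study panel.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return FEEDBACK_ARMS[int(digest, 16) % len(FEEDBACK_ARMS)]

def send_feedback(signal: str, video_id: str) -> None:
    """Placeholder: a real extension would trigger the matching
    YouTube UI action for the given signal here."""
    print(f"sending {signal!r} for video {video_id}")

def on_stop_recommending_clicked(user_id: str, video_id: str) -> None:
    """Handle a click on the extension's 'stop recommending' button."""
    arm = assign_arm(user_id)
    if arm == "control":
        return  # control group: record the click, send nothing to YouTube
    send_feedback(arm, video_id)
```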

Using data collected from more than 500 million recommended videos, the researchers assembled more than 44,000 pairs of videos — a “rejected” video alongside a video YouTube subsequently recommended. Research assistants then evaluated the pairs themselves, or used machine learning, to determine whether the recommendation was too similar to the video the user had rejected.
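A rough sketch of that pairing step, under the assumption that each rejected video is matched against recommendations shown to the same user afterwards; the study’s exact pairing rules and similarity classifier aren’t described here, so every name below is illustrative:

```python
from dataclasses import dataclass

@dataclass
class VideoPair:
    rejected_id: str     # video the participant asked to stop recommending
    recommended_id: str  # video YouTube later recommended to them

def build_pairs(rejections, recommendations):
    """Pair each rejected video with recommendations shown afterwards.

    `rejections` and `recommendations` are hypothetical per-user lists
    of (timestamp, video_id) events; this only illustrates the idea.
    """
    pairs = []
    for r_time, r_vid in rejections:
        for t, vid in recommendations:
            if t > r_time:
                pairs.append(VideoPair(r_vid, vid))
    return pairs

def too_similar(pair: VideoPair, classifier) -> bool:
    """Decide whether a recommendation resembles the rejected video.

    `classifier` stands in for the study's mix of human raters and a
    machine-learning model; the 0.5 threshold is an assumption.
    """
    return classifier.similarity(pair.rejected_id, pair.recommended_id) > 0.5
```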

Compared to the baseline control group, the “not interested” and “dislike” signals were only marginally effective at preventing bad recommendations, blocking just 11 percent and 12 percent of them, respectively. The “Don’t recommend channel” and “Remove from history” buttons were somewhat more effective — they prevented 43 percent and 29 percent of bad recommendations, respectively — but the researchers say the platform’s tools for keeping out unwanted content are still insufficient.
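One hedged reading of those percentages, assuming effectiveness is measured as the reduction in the bad-recommendation rate relative to the control group (the report’s exact metric may differ):

```python
def percent_prevented(bad_rate_arm: float, bad_rate_control: float) -> float:
    """Share of bad recommendations prevented, relative to control.

    This formula is an assumption about how such percentages could be
    derived, not the report's documented methodology.
    """
    return 100 * (1 - bad_rate_arm / bad_rate_control)

# Example with made-up rates: if the control arm saw bad recommendations
# 20% of the time and the "don't recommend channel" arm saw them 11.4%
# of the time, roughly 43 percent of bad recommendations were prevented.
print(round(percent_prevented(0.114, 0.20)))  # -> 43
```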

“YouTube should respect the feedback users share about their experience, treating it as meaningful signals about how people want to spend their time on the platform,” the researchers wrote.

YouTube spokesperson Elena Hernandez says this behavior is intentional because the platform doesn’t try to block all content related to a topic. But Hernandez criticizes the report, saying it doesn’t consider how YouTube’s controls are actually designed.

“Importantly, our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers,” Hernandez told The Verge. “We welcome academic research on our platform, which is why we recently expanded Data API access through our YouTube Researcher Program. Mozilla’s report doesn’t take into account how our systems actually work, and so it’s difficult for us to glean many insights.”

Hernandez says Mozilla’s definition of “similar” fails to take into account how YouTube’s recommendation system works. The “Not interested” option removes a specific video, and the “Don’t recommend channel” button prevents the channel from being recommended in the future, Hernandez said. The company says it doesn’t aim to stop all recommendations related to a topic, opinion, or speaker.

Besides YouTube, other platforms like TikTok and Instagram have introduced more feedback tools to train the algorithm to show users relevant content. But users often complain that similar recommendations persist even when they flag something they don’t want to see. It’s not always clear what the different controls actually do, says Mozilla researcher Becca Ricks, and platforms aren’t transparent about how feedback is taken into account.

“In the case of YouTube, I think the platform is balancing user engagement with user satisfaction, which is a trade-off between recommending content that ultimately leads people to spend more time on the site and content the algorithm thinks people will like,” Ricks told The Verge by email. “The platform has the power to adjust which of these signals gets the most weight in its algorithm, but our study suggests that user feedback may not always be the most significant one.”
