A new report from Mozilla, maker of the privacy-focused Firefox browser, suggests that YouTube’s user controls do little to shape what people actually see on the platform, despite what Google claims. Using data from almost 23,000 volunteers, Mozilla showed that YouTube kept recommending similar videos even when people used the various options to indicate that they didn’t want to see that kind of content.
YouTube is the second-most popular website in the world (the first is Google) and, according to Mozilla, an estimated 70 percent of the 1 billion hours watched on the platform daily result from algorithmic recommendations. Various reports have shown how the algorithm can polarize people and recommend misinformation and harmful content, something Google claims it has worked hard to fix. In this study, Mozilla set out to test the effectiveness of the controls YouTube offers users to manage the recommended videos they see.
In a previous report released in July last year, Mozilla found that people were routinely recommended videos they didn’t want to see and felt that the controls available to them were ineffective. This new study used a browser plug-in Mozilla developed, called RegretsReporter, to test whether this was true.
Mozilla looked at four different Google-suggested controls: clicking the thumbs-down “Dislike” button, “Not interested,” “Don’t recommend channel,” and “Remove from watch history.” Meanwhile, users of the RegretsReporter plug-in saw a “Stop Recommending” button on YouTube videos. When they clicked it, the control option corresponding to their test group (such as the Dislike button) was sent to YouTube, while data about subsequently recommended videos was sent to Mozilla. (There was also a control group for which clicking the button did nothing.)
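The experimental design described above can be sketched in a few lines of JavaScript. This is a hypothetical illustration, not RegretsReporter’s actual source: the names `ARMS`, `assignArm`, and `actionForArm` are assumptions made for clarity, and the real extension’s group assignment and data reporting are more involved.

```javascript
// Hypothetical sketch of the study's five-arm design: four YouTube feedback
// controls plus a control group whose button click does nothing.
const ARMS = [
  "dislike",
  "not_interested",
  "dont_recommend_channel",
  "remove_from_history",
  "control",
];

// Deterministically assign a participant to an arm from their ID, so every
// click by the same user exercises the same control throughout the study.
function assignArm(participantId) {
  let hash = 0;
  for (const ch of participantId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return ARMS[hash % ARMS.length];
}

// Map an arm to the feedback action sent to YouTube; the control group's
// click is a no-op, isolating the effect of each individual control.
function actionForArm(arm) {
  return arm === "control" ? null : arm;
}
```

The key design choice here is the control group: because its “Stop Recommending” clicks send nothing to YouTube, any difference in subsequent recommendations between it and the other arms can be attributed to the control being tested.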
Over the course of the study, 22,722 participants used RegretsReporter, allowing Mozilla to analyze 567,880,195 recommended videos. To assess this huge amount of data, the researchers reviewed 40,000 pairs of recommended videos and rated their similarity. This allowed the team to quantitatively study whether the videos participants were being recommended resembled videos they had previously rejected; in other words, whether YouTube’s tools effectively reduced the number of bad recommendations.
For example, if someone saw an anti-vax video recommended to them, clicked “Not interested,” and then got recommended a cat video, that would count as a good recommendation. If they kept getting suggested anti-vax videos after indicating they weren’t interested in them, those would be bad recommendations. Page 22 of the report [PDF] has some good visual examples.
Mozilla’s report found that no user control was especially effective at preventing unwanted recommendations. The “Don’t recommend channel” option had the biggest impact, preventing 43 percent of bad recommendations, with “Remove from watch history” preventing 29 percent, and “Dislike” and “Not interested” preventing 12 percent and 11 percent, respectively. Mozilla argues that its “research suggests that YouTube is not really that interested in hearing what its users really want, preferring to rely on opaque methods that drive engagement regardless of the best interests of its users.”
As a result of its findings, Mozilla is calling on people to sign a petition asking YouTube to fix its feedback tools and give users actual control over the videos they get recommended. It also has four specific recommendations for YouTube and policy makers based on its study.
Mozilla suggests that YouTube’s user controls should be easy to use and understand, and be designed to put “people in the driver’s seat.” It also wants YouTube to grant researchers better access to data (so they don’t have to use browser extensions to study these kinds of things). Finally, it calls on policy makers to pass laws providing legal protections for those engaged in public-interest research.
Whether this report is enough to get Google to add some real user controls to YouTube remains to be seen. For now, it’s a fairly damning indictment of the ineffective controls that are currently in place.
Source: Popular Science