YouTube’s ‘dislike’ barely works, according to new study on recommendations

If you’ve ever felt like it’s difficult to “un-train” YouTube’s algorithm from suggesting a certain type of video once it slips into your recommendations, you’re not alone. In fact, it may be even more difficult than you think to get YouTube to accurately understand your preferences. One major issue, according to a new study conducted by Mozilla, is that YouTube’s in-app controls, such as the “dislike” button, are largely ineffective as a tool for controlling suggested content. According to the report, these buttons “prevent less than half of unwanted algorithmic recommendations.”

Researchers at Mozilla used data gathered from RegretsReporter, its browser extension that allows people to donate their recommendations data for use in studies like this one. In all, the report relied on millions of recommended videos, as well as anecdotal reports from thousands of people.

Mozilla tested the effectiveness of four different controls: the thumbs down “dislike” button, “not interested,” “don’t recommend channel” and “remove from watch history.” The researchers found that these had varying degrees of effectiveness, but that the overall impact was “small and inadequate.”

Of the four controls, the most effective was “don’t recommend channel,” which prevented 43 percent of unwanted recommendations, while “not interested” was the least effective, preventing only about 11 percent of unwanted suggestions. The “dislike” button performed nearly the same at 12 percent, and “remove from watch history” weeded out about 29 percent.

In their report, Mozilla’s researchers noted the great lengths to which study participants said they would sometimes go to prevent unwanted recommendations, such as watching videos while logged out or while connected to a VPN. The researchers say the study highlights the need for YouTube to better explain its controls to users, and to give people more proactive ways of defining what they want to see.

“The way that YouTube and a lot of platforms operate is they rely a lot on passive data collection in order to infer what your preferences are,” says Becca Ricks, a senior researcher at Mozilla who co-authored the report. “But it’s a little bit of a paternalistic way to operate where you’re kind of making choices on behalf of people. You could be asking people what they want to be doing on the platform versus just watching what they’re doing.”

Mozilla’s research comes amid increased calls for major platforms to make their algorithms more transparent. In the United States, lawmakers have proposed bills to rein in “opaque” recommendation algorithms and to hold companies accountable for algorithmic bias. The European Union is even further ahead. The recently passed Digital Services Act will require platforms to explain how recommendation algorithms work and to open them up to outside researchers.
