From cute cat videos to sourdough bread recipes: sometimes, it feels like the algorithm behind YouTube's "Up Next" section knows the user better than the user knows themselves.

Often, that same algorithm leads viewers down a rabbit hole. How many hours have you spent clicking through the next suggested video, each time promising yourself that this one would be the last?

The scenario gets thornier when, as some users have complained, the system steers them towards conspiracy theory videos and other forms of extreme content.


To get an idea of how often this happens, and how, the non-profit Mozilla Foundation has launched a new browser extension that lets users take action when YouTube recommends videos they later wish they hadn't watched.

Dubbed RegretsReporter, the extension provides a tool to report what Mozilla calls "YouTube Regrets": the one video that derails the recommendation system and leads the viewer down a bizarre path.

Mozilla has been collecting examples of users' YouTube Regrets for a year now[2], in an attempt to shed light on the consequences that the platform's recommendation algorithm can have. 

YouTube's recommendation AI is one of the most powerful curators on the internet, according to Mozilla. YouTube is the second most visited website in the world, and its AI-enabled recommendation engine drives 70% of total viewing time on the site. "It's no exaggeration to say that YouTube significantly shapes the public's awareness and understanding of key issues across the globe," Mozilla said. And yet, the foundation added, for years people have raised the alarm about YouTube recommending conspiracy theories, misinformation, and other harmful content.
