If you use YouTube a lot, you’ve probably come across this: the same suggestions pop up again and again, and it often seems like we’re just trapped in a bubble.
Camille Roth, a researcher at the French National Centre for Scientific Research, studied something rather unusual in the world of science: YouTube. Or rather, YouTube’s recommendation algorithm.
Along with his colleagues, he explored the recommendations attached to a thousand videos on different subjects, roughly half a million recommendations in total. The team then compared how different starting videos sent users down different paths through these recommendation chains.
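As a rough illustration of what collecting such chains can look like (this is a sketch, not the authors’ actual pipeline), one can crawl outward from a set of seed videos and record every recommendation edge encountered. Here, fetch_recommendations is a hypothetical placeholder for whatever scraping or API call returns the recommended video IDs for a given video.

```python
from collections import deque

def fetch_recommendations(video_id):
    """Hypothetical placeholder: return the list of video IDs
    recommended alongside `video_id` (e.g. obtained by scraping)."""
    raise NotImplementedError

def crawl_recommendation_graph(seed_ids, depth=2, per_video=20):
    """Breadth-first crawl: start from seed videos, follow recommendations
    up to `depth` hops, and record each (source, recommended) edge."""
    edges = []
    frontier = deque((vid, 0) for vid in seed_ids)
    seen = set(seed_ids)
    while frontier:
        vid, d = frontier.popleft()
        if d >= depth:
            continue
        for rec in fetch_recommendations(vid)[:per_video]:
            edges.append((vid, rec))
            if rec not in seen:
                seen.add(rec)
                frontier.append((rec, d + 1))
    return edges
```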
Since almost 2 billion people use YouTube every month, the content recommendations the site makes can affect a large share of humanity. The popular belief is that YouTube, and social media in general, tends to form confinement bubbles (or echo chambers).
In this case, the popular belief seems to be true.
While some social networks thrive on helping users find new types of content, that is often not the case on YouTube.
When the researchers looked at the recommendation chains that formed, they found that the YouTube algorithm tends to confine people in the same bubbles, promoting the same type of content over and over again.
“We show that the landscape defined by non-personalized YouTube recommendations is generally likely to confine users in homogeneous clusters of videos. Besides, content for which confinement appears to be most significant also happens to garner the highest audience and thus plausibly viewing time.”
It tends to work like this: when you watch a video, you essentially enter a network of interconnected videos that can serve as recommendations. Depending on which video you start with, that network is more or less closed, which means the suggestions lead you either toward more of the same content or toward more varied content.
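To make the idea of a “more or less closed” network concrete, here is a toy sketch of my own (not the study’s method): simulate short random walks of clicks over a small, made-up recommendation graph and count how many distinct videos each walk reaches. A lower count means a more confined neighborhood.

```python
import random

# Toy recommendation graph (invented data): each video maps to the videos
# it recommends. The "viral" cluster only recommends itself; the "niche"
# cluster branches out toward other content.
toy_graph = {
    "viral_A": ["viral_B", "viral_C"],
    "viral_B": ["viral_A", "viral_C"],
    "viral_C": ["viral_A", "viral_B"],
    "niche_X": ["niche_Y", "viral_A", "doc_1"],
    "niche_Y": ["doc_1", "doc_2", "viral_B"],
    "doc_1":   ["doc_2", "niche_X"],
    "doc_2":   ["niche_Y", "viral_C"],
}

def confinement_score(start, steps=10, trials=200):
    """Average number of distinct videos reached by random walks of
    `steps` clicks starting from `start`; lower means more confined."""
    total = 0
    for _ in range(trials):
        current, visited = start, {start}
        for _ in range(steps):
            current = random.choice(toy_graph[current])
            visited.add(current)
        total += len(visited)
    return total / trials

print(confinement_score("viral_A"))   # stays inside the viral loop -> small
print(confinement_score("niche_X"))   # wanders across clusters -> larger
```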
In addition, the content that leads to the most confined recommendation networks also seems to revolve around the most-viewed videos or those with the longest viewing time. In other words, the more popular a video is, the more likely it is to send you into a closed loop, which creates a self-reinforcing mechanism.
“To simplify our findings very roughly and informally, let us say that there are two main stereotypes of YouTube videos: some with a high number of views, featuring loopy and not very diverse suggestions, and some with a low number of views, which generally feature more diverse and exploratory suggestions,” the researchers explain.
It’s important to keep this in mind as you’re using YouTube, particularly if you’re watching polarizing or biased videos: the more you watch of something, the more likely the algorithm is to suggest similar content and reinforce that bias.
The study has been published in PLOS ONE.