This month, YouTube published a blog post giving some insight into how its video recommendations work, and why users might sometimes see videos that seem irrelevant to what they’ve been watching.
Though detailed, the post doesn’t say anything particularly surprising. Recommendations are often driven by what other people who watch the same videos as you also tend to watch, so interests can cross over – people who like a certain sport, for example, may also tend to listen to a certain type of music. Popularity is determined by more than just clicks, with watch time, likes and dislikes – and sometimes even personalised survey responses – among the metrics used. It’s also interesting to read how YouTube’s recommendations have evolved since 2008, when the site would simply recommend the most popular videos on a topic, to today’s technique of analysing users’ data (with their consent) to offer more personalised and niche suggestions.
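The co-viewing idea can be illustrated with a minimal sketch. This is not YouTube’s actual system – the data, names and scoring here are entirely hypothetical – but it shows the basic principle: videos watched by users whose histories overlap with yours, weighted by how much they overlap, bubble up as suggestions.

```python
from collections import Counter

# Hypothetical watch histories (user -> set of video IDs).
# Illustrative only; real systems use far richer signals.
histories = {
    "alice": {"tennis_tips", "jazz_mix", "serve_drills"},
    "bob":   {"tennis_tips", "jazz_mix", "bebop_hour"},
    "carol": {"tennis_tips", "serve_drills", "bebop_hour"},
}

def recommend(user, histories, top_n=2):
    """Suggest unseen videos co-watched by users with similar histories."""
    seen = histories[user]
    scores = Counter()
    for other, watched in histories.items():
        if other == user:
            continue
        overlap = len(seen & watched)  # shared videos act as a similarity weight
        for video in watched - seen:   # only score videos the user hasn't seen
            scores[video] += overlap
    return [video for video, _ in scores.most_common(top_n)]

print(recommend("alice", histories))  # → ['bebop_hour']
```

Here, the tennis fan who also listens to jazz gets a jazz video recommended – the crossover of interests the post describes – because other tennis viewers watched it too.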
Perhaps in light of ongoing concerns about social media’s role in spreading misinformation related to Covid-19 and politics, a significant portion of the post is devoted to YouTube’s efforts to differentiate between “authoritative” and “borderline” content.
In 2018, the New York Times published a piece called “YouTube, the Great Radicalizer”, accusing the site of sending people down a “rabbit hole” of extremist content. It noted that watching slightly left- or right-leaning content resulted in YouTube recommending videos that pushed viewers towards the extreme ends of the political spectrum, often towards unfounded conspiracy theories and hateful content.
Three years on, YouTube insists it is doing more to determine the authoritativeness of videos, including using human “evaluators” – doctors and other professionals in some cases – to assess them. Videos identified as “borderline” are those that come close to breaching YouTube’s Community Guidelines, such as conspiracy theories, but the site has decided to allow them out of respect for users’ freedom to access and express unorthodox views.
It may surprise many people to learn that, according to YouTube, most users do not want to see “borderline” content and prefer authoritative information. The site says that when it began to demote “tabloid-style” content in people’s recommendations, average watch time actually increased, and that there is no evidence that videos promoting outlandish theories, such as flat Earth, are any more engaging than more authoritative ones. It goes on to say that views of borderline content driven by its recommendations are now below 1%, and that its goal is to eventually reduce this to below 0.5%.
The blog offers reassurance to those of us who have always believed that the key to successful internet marketing is not to insult your audience’s intelligence, but to offer them authentic, high-quality content. With algorithms increasingly taking this human element into consideration, why not speak to Engage Web about how you can become an authoritative voice in your industry through your website and social media channels?