Wiki Youtube

Wikipedia info to be added to YouTube conspiracy videos

In a blog post a short while ago, I suggested that one way social media sites could better inform their users about the sort of content they are reading, sharing and interacting with would be to team up with Wikipedia, providing brief snippets of neutral information alongside that content. At the risk of sounding a right smug little so-and-so, I’m very pleased to see that YouTube appears to be considering working with the online reference site.

The video streaming site is full of videos of outlandish conspiracies and other extreme content, and as we mentioned earlier this week, it sometimes nudges viewers of politically weighted but fairly moderate material towards these more radical offerings.

TechCrunch reports that in a recent panel discussion, company CEO Susan Wojcicki discussed plans to introduce “information cues” to certain controversial videos. As viewers watch videos on conspiracies to do with subjects like the Apollo missions, chemtrails and 9/11, text boxes beneath the video would inform them that they were watching an alternative viewpoint. Wojcicki enthused about the versatility of this feature, noting that it could be expanded to include information from Wikipedia.

Not everyone has a positive view of this move, however, including TechCrunch itself. Writer Darrell Etherington notes that the information provided about one of the videos seemed insufficient, and asks whether YouTube is really tackling the problem here or simply shifting the blame onto somebody else. His article rather bleakly concludes:

“The bottom line is that all social platforms relying on user-generated content will eventually become completely co-opted and unusable.”

Also, from a statement tweeted by the Wikimedia Foundation, it appears that Wikipedia itself had not been informed of these plans.

But how far should sites like YouTube go? Should they be blitzing controversial and extreme videos the minute they see them, or should they adopt an ‘anything goes’ approach and allow users to decide for themselves what they want to look at?

I would like to see something in between the two. I’m generally not in favour of actually removing content from social media sites unless it directly encourages hatred towards a certain person or group. I prefer the idea of social media sites doing a better job of educating people.

This week, Facebook followed in the footsteps of Twitter by banning the pages of far-right group Britain First. You could certainly argue that this is a group that spreads hatred, but in my opinion so are newspapers like the Daily Mail, given the frequency and tone with which they report on groups like migrants and asylum seekers, so where do you draw the line?

On Facebook, Britain First had nearly 2,000,000 likes, but this support is not at all mirrored by the party’s popularity away from Facebook, where its occasional electoral bids always result in only a tiny fraction of the vote. This suggests that most of Britain First’s Facebook followers are simply misinformed.

I’ve long thought Wikipedia is the best site on the internet. It’s advert free, backed up by sources and anyone can add to it, with moderation from a committed and highly discerning team keeping it in check. If you want to find out about a sporting event, a musician or band, or a film release, it’s often quicker, more reliable and more up-to-date than going to the site of the entity itself.

To me, teaming up with Wikipedia sounds like a very good way to help people make informed social media choices without unnecessary censorship.

John Murray
