Notre Dame

YouTube can’t tell one burning landmark from another

Footage of the Notre Dame fire uploaded to YouTube has been surfacing information about the 9/11 attacks, an incident that reminds us that video and image recognition technology is still in its embryonic stage.

As flames engulfed the Parisian cathedral on Monday evening, many people headed to social media to keep up to date with firefighting efforts in the French capital. Those who went to YouTube looking for newly captured videos, however, may have been confronted with one of YouTube’s ‘knowledge panels’ directing them to articles from the Encyclopedia Britannica and Wikipedia. Unfortunately, these articles related to an incident from nearly 18 years earlier, on a different continent and with an entirely different cause, the only similarity being that both resulted in a large, globally recognised building catching fire.

Google-owned YouTube introduced its knowledge panels last year in an effort to keep users better informed and steer them away from false or misleading video content. In this case, however, the feature has muddied the waters and demonstrated the potential pitfalls of algorithmically generated information boxes.

The video-sharing site explained that its algorithms “sometimes make the wrong call” and said it had disabled the knowledge panels for live streams of the fire, but the error raises two important questions about how YouTube handles breaking news of international interest.

The first is whether, when an incident as major as Monday’s fire occurs, YouTube should be content to sit back and let its algorithms do the work. With young people now turning to social media more often than TV as a source for news, these sites surely have a duty to “report” it accurately, with staff working on the story much as a TV or press news team would. In times of crisis, YouTube may be better advised to disable such algorithms altogether rather than risk presenting users with wrongly matched accompanying information.

The second is transparency: some in the tech sector have called for YouTube to be more open about how its algorithms work. Speaking to The Guardian, Harvard University machine-learning researcher Caroline Sinders argued:

“In this case specifically, with the recommendation being something so unrelated, we really need better audits to see why it is recommending what it’s recommending. Hiding it is not helping.”

Image recognition is a remarkable concept, and it could be argued that it is an achievement for algorithms to recognise a burning building at all. Given that Notre Dame looks nothing like the World Trade Center towers, though, this mix-up shows that the technology still lacks sophistication and that tech companies need to learn to walk before they can run.

John Murray
