Last week, free newspaper and light-reading website Metro published a quirky article about how Google’s search results, especially its image search, could be seen as racially insensitive. In particular, it highlighted that Googling “unprofessional hairstyles for work” and “professional hairstyles for work” brings up a notable contrast in the ethnicities of the people shown in the results.
Google argues that its search results reflect not its own opinions but societal views and the way people metatag their images. Still, are there instances of Google letting its own views affect its results and suggestions?
Is Google a Tory?
In February, Scottish Twitter user and regular anti-Conservative tweeter Andrew Barr picked up on a peculiar trait of Google’s suggestions when searching for political parties:
Type any UK party followed by 'are' into Google search and you get questions. But conservatives/tories get zero. Explains the tax issue!
— Andrew Barr (@scotab) February 1, 2016
Barr is quite right, or at least he is about the autosuggest results. If we start typing most of the main UK parties into Google followed by ‘are’, we get suggestions, many of which are rather unflattering. They include:
– Labour are scum/a joke/finished in Scotland
– SNP are liars/racist/evil
– UKIP are racist/finished/Nazis
– Lib Dems are finished/traitors
However, a search beginning ‘Conservatives are’ or ‘Tories are’ suggests nothing at all, leading some left-leaning social media users to allege political bias on the part of the search engine, or even that it is in cahoots with the party over tax avoidance.
Google soon explained this behaviour in a story run by The Telegraph, a Conservative-leaning newspaper. It pointed out that its algorithms filter out potentially offensive autocomplete suggestions, hence nothing for ‘Conservatives are’ or ‘Tories are’.
I suppose ‘Conservatives’ and ‘Tories’ can refer to people, whereas ‘Labour’ and ‘UKIP’ name groups, so the filtering is perhaps understandable, and searches beginning ‘the Conservative Party are’ and ‘Conservative voters are’ do generate some negative suggestions. I think this one should be put down to an algorithm quirk rather than anything more sinister, but it’s one Google should sort out rather than offering wishy-washy excuses.
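If you want to repeat these checks yourself without typing every prefix into the search box, the sketch below queries Google’s unofficial autosuggest endpoint. Treat it as an illustration under assumptions, not a supported API: the suggestqueries.google.com endpoint is undocumented and may change, be rate-limited or blocked, and the results vary by locale and over time; the `autosuggest` function and `SUGGEST_URL` name are my own.

```python
import json
import urllib.parse
import urllib.request

# Unofficial, undocumented Google autosuggest endpoint (assumption: it may
# change, be rate-limited, or be blocked at any time; results vary by locale).
SUGGEST_URL = "https://suggestqueries.google.com/complete/search"


def autosuggest(prefix, lang="en-GB"):
    """Return the list of suggestions Google offers for a search prefix."""
    params = urllib.parse.urlencode({"client": "firefox", "hl": lang, "q": prefix})
    with urllib.request.urlopen(f"{SUGGEST_URL}?{params}") as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        # Response shape: [original query, [suggestion, suggestion, ...]]
        data = json.loads(resp.read().decode(charset))
    return data[1]


if __name__ == "__main__":
    for prefix in ["labour are", "tories are", "conservatives are"]:
        print(f"{prefix!r} -> {autosuggest(prefix) or '(no suggestions)'}")
```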
What’s Google’s religion?
An autosuggest oddity that hasn’t been clarified, as far as I can see, is what happens when you try to ask Google what it thinks of religions.
Searches for ‘Christians are’, ‘Jews are’ and ‘Buddhists are’ bring up nothing, suggesting that Google’s anti-offensiveness algorithm is at work here. A search for ‘Muslims are’ brings up just one result – “Muslims are not terrorists” – which is a wise statement but probably not an organic one.
However, how do we explain the single suggestion Google offers when we search for “atheists are”: “atheists are wrong”?
That’s a very bold statement, isn’t it, Google? Declaring everyone from Socrates to Stephen Fry to be “wrong”, without suggesting anything else?
Generally speaking, Google does a good job of working out what we’re searching for, and we probably shouldn’t get too worked up about its autosuggestions. If it’s trying not to upset certain groups, though, these findings suggest a little more balance might be required next time it tweaks its algorithms.