How did Google’s predictions go this weekend?

Posted on November 21, 2016


It’s time once again to look at the Google Trends data we went over last week, and to see whether the search engine has continued the pattern we’ve observed of search volumes hinting at future real-world outcomes.

We put sports to one side last week and looked at the two big Saturday night reality TV shows, and explored which contestants were not being Googled as much as the others. The idea behind this is that if people couldn’t be bothered to type their names into a search engine, it was unlikely that they were going to pick up their phones and vote for them.
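The logic described above – merge any alternate spellings of an act’s name, then flag the act with the lowest combined search volume as the likely elimination – can be sketched as a toy function. The contestant names are from this series, but the volume figures below are made up purely for illustration; they are not actual Trends numbers:

```python
def predict_elimination(search_volumes, aliases=None):
    """Return the act with the lowest combined search volume.

    search_volumes: dict mapping search term -> relative volume
    aliases: optional dict mapping alternate spellings to a
             canonical name, e.g. {'Five After Midnight': '5 After Midnight'}
    """
    aliases = aliases or {}
    totals = {}
    for term, volume in search_volumes.items():
        name = aliases.get(term, term)  # fold aliases into one act
        totals[name] = totals.get(name, 0) + volume
    # The least-searched act is our predicted elimination
    return min(totals, key=totals.get)

# Illustrative figures only – not real Google Trends data
volumes = {
    '5 After Midnight': 20,
    'Five After Midnight': 25,
    'Saara Aalto': 30,
    'Honey G': 90,
}
aliases = {'Five After Midnight': '5 After Midnight'}

print(predict_elimination(volumes, aliases))  # prints "Saara Aalto"
```

Note how the alias step matters: on raw terms alone, ‘5 After Midnight’ looks like the least-searched act, but once the ‘Five’ searches are folded in, Saara Aalto comes out bottom – exactly the adjustment discussed below.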

Were we right, and did Google’s crystal ball serve us well again? Let’s find out:

The X Factor

Google Trends data actually showed 5 After Midnight to be the act receiving the least search traffic in the run-up to the event, but I discounted this because a lot of people were searching for ‘Five’ instead of ‘5’. Taking this into account, it was actually Saara Aalto who wasn’t getting the love from Google, and I predicted that this meant she was likely to be on her way home this weekend.

But this was way off the mark. Saara was nowhere to be seen in the bottom two, which was made up of Ryan Lawrie (as per bookmakers’ predictions) and the controversial Honey G, whom we had discounted completely because her search volume had been consistently astronomical in recent weeks.

In the end, she survived, Ryan got the chop, and Google goofed completely.

Strictly Come Dancing

Google had also gone against the grain in its prediction for the BBC’s big Saturday crowd puller, forecasting gymnast Claudia Fragapane to be on her way out, despite being one of the better performers to date.

This was a much better effort from the search engine, with Claudia finding herself in the bottom two, but being spared elimination at the expense of Greg Rutherford thanks to the judges’ decision.

Claudia’s bottom two finish could perhaps be compared to the Max Verstappen prediction the week before, where Google went for an outsider and, while it didn’t quite come off, it was a decent effort. Overall though, it’s a disappointing weekend for this Google prediction system, with neither of the two predicted outcomes actually taking place.

So what went wrong this week? Well, as I’ve said all the way through this, there are a lot of variables. Google is an indicator of hype, which in turn can be an indicator of popularity, but it doesn’t consider how people are going to perform on the night. It should also be remembered that even though it’s a public vote, the judges on both programmes always have some control over who is eliminated, and they’re unswayed by the search engines. In the case of Strictly, Google Trends data did actually tell us who the bottom two would be.

This has been an interesting experiment and I firmly believe that this use of ‘Big Data’ can give us an inkling into what’s about to happen, but perhaps it’s time to let it lie for a while now after this fairly indifferent weekend. I may resurrect it next time there’s a major election, as I think Google Trends data in the run-up to the EU referendum and U.S. presidential votes suggests that we should be taking people’s searches seriously.

John Murray

Content Team Leader at Engage Web
John works for Engage Web as a Content Team Leader and regularly contributes to the website and programmes of his beloved Chester F.C.
