Machine learning is the science of getting computers to act without being explicitly programmed. It is a subfield of computer science that evolved from the study of pattern recognition and computational learning theory in artificial intelligence. As the term suggests, we are talking about machines learning to complete tasks without human intervention.
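A toy sketch can make that distinction concrete. Instead of hand-coding a rule, we show a model some labelled examples and let it infer the rule itself (this uses scikit-learn, and the data is entirely made up for illustration):

```python
# "Learning without being explicitly programmed": no rule is written by hand;
# the model infers one from labelled examples. The data below is invented.
from sklearn.tree import DecisionTreeClassifier

# Hypothetical examples: [hours of daylight, temperature in C] -> season label
X = [[16, 28], [15, 24], [14, 22], [8, 2], [9, 5], [10, 7]]
y = ["summer", "summer", "summer", "winter", "winter", "winter"]

model = DecisionTreeClassifier()
model.fit(X, y)                   # the "learning" step

print(model.predict([[13, 20]]))  # generalises to an example it has never seen
```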
Who would be more interested in harnessing the potential of such technology than Google? Two recent Wall Street Journal articles show how important it is for Google to focus on machine learning and how Google envisions the use of machine learning and artificial intelligence in the future. Google is already using machine learning in automatic translation, voice-based searching, self-driving cars and the Nest connected thermostat. Larry Page wants to make this technology available to everyone, not just the high rollers of the world.
If you actually think about it, what is the easiest way for Google to make machine learning technology available to “everybody”? Certainly not the self-driving cars! The best way is for Google to apply machine learning to its search engine algorithm.
The irony here is that machine learning is already being used to reverse engineer the Google ranking algorithm. Completely decoding the algorithm is practically impossible, with or without machine learning, not least because it is constantly changing. Nevertheless, the parts of it that carry the biggest weight or have been in the algorithm for a long time can be, and have been, identified. In his infographic, Neil Spencer explains how machine learning is being used to decode the Google ranking algorithm and clearly highlights some of its weaknesses.
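The basic idea behind that reverse engineering can be sketched in a few lines: collect pages whose ranking positions you can observe, describe each page by candidate ranking factors, fit a model, and see which factors the model leans on most. Everything below, the factor names, the numbers and the model choice, is hypothetical and only illustrates the approach, not Google's actual data or formula:

```python
# Hedged sketch of inferring which candidate factors best explain observed
# rankings. All feature names and values are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

feature_names = ["page_speed", "backlinks", "word_count", "mobile_friendly"]

# Hypothetical observations: one row of factor values per page ...
X = np.array([
    [1.2, 350, 1800, 1],
    [3.5,  20,  400, 0],
    [0.9, 900, 2500, 1],
    [2.8,  60,  700, 1],
    [4.1,   5,  300, 0],
])
# ... and the search position each page was observed at (1 = top result).
positions = np.array([3, 40, 1, 12, 55])

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, positions)

# Feature importances hint at which factors best explain the observed rankings.
for name, importance in zip(feature_names, model.feature_importances_):
    print(f"{name}: {importance:.2f}")
```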
At the moment, the brains at Google HQ who build and change the variables of the algorithm do so with the sole purpose of making it more effective at delivering the best search results possible. But imagine if the algorithm were learning on its own how to become more effective at delivering the best search results. What could this mean for the complexity of the algorithm? For search engine optimisation professionals? For website owners and for the whole SEO industry? Rand Fishkin has taken a look at the Google algorithm, machine learning and its effects on SEO in his weekly vlog Whiteboard Friday.
With the masses of data available to Google today, a self-learning algorithm could start identifying patterns that optimise the effectiveness of search results in ways the people in charge now would never even think of. Imagine an algorithm with practically unlimited calculating capabilities, weighing an unimaginable number of variables connected by the strangest patterns and parameters, simply because it identified patterns the human mind could not see.
On the other hand, people are quite unpredictable, and they are the ones affecting many of the inputs that the algorithm now takes into account and will probably continue to take into account once it becomes self-learning. This means the algorithm will work within certain probabilities, calculating which search result is most likely to satisfy the user. The algorithm is already constantly evolving; the developers at Google are reported to change it as often as twice a day. Once it is self-learning, will it change twice a second, or even faster? If that happened, no one would be able to decipher what is a ranking factor and what is not. SEO experts would only be able to take a snapshot of the algorithm, which would no longer be true the next moment and might not even be decipherable because of its complexity.
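Working "within certain probabilities" can be pictured with a very small sketch: for each candidate result, estimate from past interactions how likely it is to satisfy the user, then order results by that probability. The pages and counts below are invented purely to show the shape of the idea:

```python
# Hypothetical interaction history:
# result -> (times users seemed satisfied, times they bounced straight back)
history = {
    "page_a": (80, 20),
    "page_b": (45, 55),
    "page_c": (12, 3),
}

def satisfaction_probability(satisfied: int, bounced: int) -> float:
    """Smoothed estimate of P(user satisfied | this result is shown)."""
    # Laplace smoothing so results with little data are not over-trusted.
    return (satisfied + 1) / (satisfied + bounced + 2)

ranking = sorted(history,
                 key=lambda page: satisfaction_probability(*history[page]),
                 reverse=True)
print(ranking)  # most-probably-satisfying result first
```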
For end users this would possibly be the best service they could ever receive from a search engine. Their chances of getting what they want from a search could not be better, and the Google algorithm would have fulfilled its purpose, continuing to do so better with every new search query.
For the website owner who wants to attract organic traffic, as well as for the whole SEO industry, there will be a significant change of focus. It is a change that, in our view, should have taken place already, and apathy towards it will soon diminish if, or rather when, this self-learning algorithm is introduced.
The whole SEO industry is based on identifying the key factors the Google algorithm takes into account and trying to manipulate them in order to make websites rank better. A lot of SEO professionals say that great content is the key to success, which is true to an extent. The caveat is that you cannot rely solely on great content without also looking at the links on your website, its page load speed, its bounce rate, keyword densities, URLs, sitemaps, mobile optimisation and so on. The list of factors thought to affect Google rankings is endless. Nevertheless, professionals dedicate time to researching these factors and adapting websites accordingly, because great content on its own does not bring results, or at least not as quickly as business leaders demand.
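A hypothetical scoring sketch shows why content alone is rarely enough: if a ranking score blends many factors, a page that is strong on content but weak on speed or mobile-friendliness can still score poorly overall. The factors and weights below are invented and are in no way Google's actual formula:

```python
# Invented factor weights, each page scores every factor from 0 to 1.
weights = {
    "content_quality":  0.40,
    "page_speed":       0.20,
    "backlink_profile": 0.25,
    "mobile_friendly":  0.15,
}

def ranking_score(page: dict) -> float:
    """Weighted sum of factor scores for one page."""
    return sum(weights[f] * page.get(f, 0.0) for f in weights)

great_content_only = {"content_quality": 0.95, "page_speed": 0.2,
                      "backlink_profile": 0.1, "mobile_friendly": 0.0}
well_rounded = {"content_quality": 0.7, "page_speed": 0.8,
                "backlink_profile": 0.6, "mobile_friendly": 1.0}

print(ranking_score(great_content_only))  # ~0.45
print(ranking_score(well_rounded))        # ~0.74
```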
If the Google algorithm becomes so complex that people cannot identify the ranking factors, they will need to focus their attention on the other side of the equation: user experience. The focus will finally be on what a website should do to delight visitors and provide them with the best experience possible, so that the Google algorithm can understand which websites are most valuable and should perform better (rank higher) in the search results.
The SEO industry will need to change once more and focus more on customer satisfaction rather than website optimisation.
Since the dawn of search engines, SEO professionals have been trying to find and exploit cracks or opportunities in the rules of the search algorithms to make their websites rank higher. Keyword density led to keyword spamming; link building led to link farms and negative SEO. Google now rules above all and is always fighting back to overcome these ‘black hat’ SEO techniques and evolve its algorithm. Whether it is Panda, Penguin, Pigeon, Hummingbird or any other animal, these updates are designed to fight the attempts of black hat SEO to improve website rankings with unorthodox methods. A consequence of these black hat techniques, however, is that they make life difficult for all website owners, who have to constantly evolve in order to keep ranking on Google.
It seems that the use of machine learning in the Google ranking algorithm could signify the end of all the black hat techniques and the beginning of an era where the Internet will be a world of valuable information and great user experience.