Understanding The Purpose Behind Each Google Algorithm Update Is More Crucial For SEO

Google turned 15 last month, and Amit Singhal wrote on the Google blog:

"We’ll keep improving Google Search so it does a little bit more of the hard work for you. This means giving you the best possible answers, making it easy to have a conversation and helping out before you even have to ask. Hopefully, we’ll save you a few minutes of hassle each day. So keep asking Google tougher questions—it keeps us on our toes! After all, we’re just getting started...."

From the time it started as a search engine, Google has always focused on search as its main activity. Its social media initiatives (especially Google+) have primarily been made to add quality social signals that improve search results. In fact, I think every Google product, along with the search algorithm updates, is directly or indirectly connected to the quality of the results displayed on the Google search engine.

The latest talk of the SEO town, the Hummingbird update, is simply the newest in the long line of efforts Google has made since it started in 1997. Every update since then has had the same goal: to improve the quality of search results and display the ten blue links that best answer the user's query.

Google has remained focused on one main goal: to give quality search results to the user. It has tried to get closer to this goal with each algorithmic update. But while Google is a single entity, its users include nearly every person on the planet, so every update gets interpreted, and implemented, differently by each of them.

Let us go back in time and dissect a little:

In the late 90s, when AltaVista was the main search engine and word-to-word mapping was the sole ranking factor, people spammed their websites by repeating keywords and by hiding them, camouflaging the font colour against the background colour of the page. When Google came up with PageRank to beat this keyword spam, it became THE search engine, churning out high-quality results that made people discard AltaVista and make Google their constant search companion.

PageRank (an effort by Google to improve search results for the user) is not bad in itself; what is bad is the way people interpreted and spammed it to rank better. It was abused so heavily that a whole new industry, the link building industry, came into existence, and the practice went so far that people paid huge amounts to buy links. This link spam grew until the main purpose of PageRank was defeated, and to beat it Google recently had to come up with the Penguin update and the disavow tool (another effort by Google to improve search results for the user), which penalised websites with low-quality and off-topic links.

The Penguin update put people back on their toes and made them undo all the paid and irrelevant links pointing to their websites, for which they again paid a huge amount. In other words, first they paid to get links so they could remain in the search results, and now they are paying to remove those same links for the same reason.

PageRank itself remains the same and is still one of the main quality signals Google gets about a website, but as people spammed it, the essence of the whole technology evaporated, leaving unnecessary clutter on the web and diluting the quality of the search results.

When Google integrated social signals into its search algorithms (an effort by Google to improve search results for the user), the whole web took an about-turn towards social media sites in order to remain in the search results. But just being on social media is not enough; a social media profile needs to send quality signals to search engines before it can influence their results. Which social signal parameters are actually integrated into search is, of course, a very debatable issue. In order to get first-hand social data, Google came up with its own platform, Google+, and today almost every business has a Google+ profile even if it is not active there. Is that enough? The answer is an emphatic no, and beating spam on social media is the next big challenge for Google.

Apart from the signals that reach the search algorithms via links, social media, meta tags and so on, Google is currently focusing on semantic search, which looks at the meaning of the search query and of the content of the websites in the Google index. For this, Google has introduced the Knowledge Graph, authorship markup and now the Hummingbird update.

Google started with the goal of giving quality results to users within the hardware and software limits of any given point in time, but as it went ahead it also had to keep beating the spam that people brought onto the web just to remain in the search results. This keeps Google on its toes and forces it to keep becoming better than it was, which is both good and bad for Google. Good, because all the spam it has to face keeps it humble, and humility is very necessary given that the kind of monopoly Google holds over search could otherwise be dangerous and unhealthy for the entire web. Bad, because unnecessary clutter keeps getting added to the web in the form of spammy links, low-quality content, a bad reputation for the SEO industry, misleading and wrong social signals, mistaken identities and so on.

Google has focused on the term QUALITY right from the beginning, but people have kept interpreting quality in terms of the algorithms instead of the true meaning of the word. We have to understand that reputation and popularity cannot be bought; they have to be earned. We cannot buy friends, followers or people to include in our circles; we make friends by the way we interact with, connect with and help them. Just adding content is not enough; every piece of content needs to be informative or offer a solution people are actually searching for. The content, links, social signals and the overall online reputation resulting from the total footprint of your online presence, from the moment you created an online identity, will determine your Knowledge Graph, which should grow stronger with every Google update rather than force you to undo things just to stay in the search results. After all, the online world is a reflection of the real world, and the same rules of life and business apply.

A very good real-life example of schema manipulation on a site came up during one of the sessions I attended at SMX East 2013, held in the first week of October.

Chris Silver Smith, President, Argent Media (@si1very), mentioned during his presentation that, since Google is giving more importance to rich snippets and schemas, it is good practice to add selected Google reviews to your site using schema markup. In reality, this is just another misleading signal of the same kind the spammy link building efforts generated. PageRank, misinterpreted by some, produced a clutter of links that connected all the wrong wires, short-circuiting quality instead of lighting the bulb of quality. Similarly, adding hand-picked reviews to a site, even in the form of schemas, will not achieve the purpose of semantic search. The ideal way to add reviews and ratings to a site is to provide a rating-and-review tool that users themselves can use to express their views, so that Google gets a true idea of the product or service the business is offering and can correlate it with user experience and feedback to ascertain the quality signal.

Quality signals can be ascertained by Google only if it gets the aggregate feedback about a business via genuine user responses on the site. When a website owner filters the reviews and marks up only a select few, it surely falls under the spam category.
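To make the contrast concrete, here is a minimal Python sketch of the aggregate approach. The function name, the sample review data and the exact JSON-LD shape are illustrative assumptions rather than a Google specification; the point is simply that the markup is computed from every review collected by an on-site tool, not from a hand-picked subset.

```python
import json
from statistics import mean

def aggregate_rating_jsonld(business_name, reviews):
    """Build schema.org AggregateRating markup from ALL user reviews.

    `reviews` is a list of dicts like {"author": "...", "rating": 4},
    as collected by an on-site review tool -- not a filtered selection.
    """
    if not reviews:
        return None  # no markup is better than fabricated markup
    ratings = [r["rating"] for r in reviews]
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": business_name,
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": round(mean(ratings), 2),  # average over every review
            "reviewCount": len(ratings),
            "bestRating": 5,
            "worstRating": 1,
        },
    }
    return '<script type="application/ld+json">\n{}\n</script>'.format(
        json.dumps(data, indent=2)
    )

# Example: the aggregate reflects every review, good and bad alike.
all_reviews = [
    {"author": "A", "rating": 5},
    {"author": "B", "rating": 2},  # a critical review is kept, not filtered out
    {"author": "C", "rating": 4},
]
print(aggregate_rating_jsonld("Example Business", all_reviews))
```

If a site filtered that list down to only the five-star entries before aggregating, the markup would send exactly the kind of misleading signal described above.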

This was in fact pointed out strongly by Pierre Far of Google, who was attending the session and could not resist stepping in to say that efforts of this kind are not in line with Google's guidelines.

Many times, a hasty implementation made unknowingly can harm a site rather than heal it after a previous update or prepare it for a future one, as the example above shows. Hence, interpreting Google's updates correctly and understanding what Google is trying to achieve with each one should be a priority for every SEO before going on an implementation and execution spree.

Author

  • Bharati Ahuja

    Bharati Ahuja is the Founder of WebPro Technologies LLP. She is also an SEO trainer and speaker, blog writer, and web presence consultant who first started optimizing websites in 2000. Since then, her knowledge of SEO has evolved along with the evolution of search on the web. She is a contributor to Search Engine Land, Search Engine Journal, Search Engine Watch, and others.

October 25, 2013