This article looks at the history of how Google ranks the webpages it indexes. I'll take you from Google's humble beginnings all the way through the future outlook of webpage rankings, and how to stay ahead of the curve.
First, we'll look at the era from the late '90s through 2010. Yes, I know that's a long stretch of time, but aside from minor updates, Google kept its ranking system the same. In this era, the best way to optimize your website was to pick the keywords you wanted your site to rank for and abuse the hell out of them. Basically, you would spam those keywords in your meta tags and all throughout the HTML of your webpage, and Google's bots would look for the sites with the most keywords matching what a user typed into the search engine. Many people built "doorway pages" in this era: webpages stuffed with as many of the desired keywords as possible. These were not the main websites; they existed purely to push the main site's name onto the front page of Google.
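To make the idea concrete, here's a toy sketch (my own illustration, not Google's actual code) of keyword-count ranking: each page is scored by how many times the query's terms appear in its text, so a keyword-stuffed doorway page wins. The page names and text are made up for the example.

```python
# Toy keyword-count ranker: more matching keywords = higher rank.
# This mimics the stuffing-era incentive, not any real search engine.

def keyword_score(page_text, query):
    """Count occurrences of each query term in the page text."""
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

# Hypothetical pages: one stuffed doorway page, one written for humans.
pages = {
    "doorway": "cheap shoes cheap shoes cheap shoes buy cheap shoes",
    "honest": "we sell quality footwear at fair prices including shoes",
}

query = "cheap shoes"
ranked = sorted(pages, key=lambda name: keyword_score(pages[name], query),
                reverse=True)
print(ranked)  # the stuffed "doorway" page outranks the honest one
```

Under this scoring, the doorway page scores 8 against the query while the honest page scores 1, which is exactly why stuffing paid off in that era.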
The next step in Google rankings came in 2011, when Google released the "Panda" update. This update changed the whole way Google ranked webpages, which was evident when websites that had never been seen before started appearing on the top pages. Webpages were now ranked by the number of links pointing to them. Google's thinking was that if a website had lots of people and organizations linking to it, it must be a quality site, and therefore the best-quality websites would surface for its users. However, Google did not expect companies to start buying links to artificially boost themselves onto the first page. When Google caught on, it cracked down on fake links with the "Penguin" update, and A LOT of webpages vanished, including those of giant companies and well-known websites. Penguin acted as an online authority that decided what counted as a quality link, and therefore which websites were of the best quality. As this case study explains, site speed also came into play in how high your page ranked: if the quality links were similar, it came down to load speed, and why come in second on something that is so easy to correct?
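The link-counting idea above can be sketched in a few lines. This is a simplified toy (my assumption of the basic principle, not Google's implementation, which weighted links far more cleverly): each site's score is the number of distinct sites linking to it. All the domain names are hypothetical.

```python
# Toy link-based ranker: score = number of distinct linking sites.
# A deliberate simplification of the real, much more sophisticated system.
from collections import defaultdict

# Hypothetical (source_site, target_site) link pairs crawled from the web.
links = [
    ("blog-a.com", "shop.com"),
    ("news-b.com", "shop.com"),
    ("forum-c.com", "shop.com"),
    ("blog-a.com", "spam.com"),
]

# Collect the distinct set of sites linking to each target.
inbound = defaultdict(set)
for src, dst in links:
    inbound[dst].add(src)

scores = {site: len(sources) for site, sources in inbound.items()}
print(sorted(scores, key=scores.get, reverse=True))  # shop.com ranks first
```

Counting *distinct* linking sites (a set, not a raw tally) hints at why bought link farms were detectable: many links from the same few sources look very different from genuine, widespread endorsement.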
Now, from 2013 onward, Google has turned to yet another approach: it is no longer supporting the Penguin update and has something big in the works. It seems Google has decided social media is the new measuring stick for where a page should rank. A webpage must now be highly regarded, linked to, and talked about on all the major social networking sites in order to rank highly. So my suggestion: get your butt on as many social media sites as possible and get people talking about your webpage. Also, as this case study explains, you should author articles online that receive good feedback in order to build reputation with Google's rankings.
I strongly suggest watching the video below. It covers everything I've stated here in depth and will give you further background on the history of Google rankings and where they're heading.