The Early Days of Search Engines
Once upon a time, all it took to rank highly on a search engine was a meta tag containing your target keyword and a high density of that keyword on your page. The result was site owners spamming results pages with ad-ridden websites that gave users frustrating, unsatisfying experiences.
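To make that concrete, the gamed signal was often as simple as a single line of markup. Here is a hypothetical example of the meta keywords tag that early engines took at face value (Google has long since stopped using it):

```html
<!-- Early engines trusted this self-declared keyword list outright,
     so site owners stuffed it with whatever they wanted to rank for. -->
<meta name="keywords" content="cheap flights, best cheap flights, cheap flight deals" />
```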
Enter Google.
Google arrived on the search engine scene with a revolutionary algorithm called PageRank. PageRank was a formula that ranked sites based on the number and quality of links pointing to them. By building its metric on data from other websites, PageRank made it much harder for spammy websites to dominate the results pages, since each qualifying link could be seen as a 'vote of confidence' for the site it pointed to.
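In simplified form, the formula from the original PageRank paper looks like this, where d is a damping factor (commonly set to 0.85), T_1 through T_n are the pages linking to page A, and C(T_i) is the number of outbound links on page T_i:

$$
PR(A) = (1 - d) + d\left(\frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)}\right)
$$

In plain terms, each page divides its own score among the pages it links to, so a link from a high-scoring, well-linked page passes along far more weight than a link from an obscure one; the scores are computed iteratively until they settle.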
Although PageRank was a game changer, spammers once again found their way around it through link spam and other black hat techniques. Link spam can take many forms, but it essentially involves pointing large volumes of links from illegitimate sources at a site to trick Google's algorithm into assigning it more authority than it deserves. This loophole marked the beginning of Google's ongoing battle and the never-ending stream of algorithm updates we are still experiencing today.
Overview of Google Algorithm Updates
From the beginning of Google's dominance in the search engine space to today – with Google still accounting for approximately 78% of searches – the company has been continuously refining its algorithm. As long as spammers and black hat SEO specialists keep finding loopholes, Google will keep making improvements and additions. It adjusts and tinkers with the algorithm daily, to the point that nobody outside its core team knows exactly what it contains. Analyzing every minor update Google has made over the years would take days, and who has time for that? Instead, we've done the dirty work for you and pulled out the major updates, in chronological order, with a brief synopsis of each and what it means for you and your website. Let's get started with one of the most important updates still in effect today: the Panda Update.
1. The Panda Update
The Google Panda Update was released in February 2011 and is still refreshed from time to time, so following the rules laid out here is important for ongoing SEO success.
The Panda Update introduced a filter into Google's process. By January 2016, the filter had become such a critical tool that it was officially incorporated into the core algorithm. The Panda filter is in charge of siphoning websites with poor overall content out of the search results, or at the very least preventing them from ranking well. The major flags Panda checks for are duplicate content, thin content and keyword stuffing (pages padded with repeated target keywords).
To keep your site from being penalized by Panda, follow a few practices that keep your content quality high:
- Eliminate duplicate content from your site, whether it appears as wholesale duplicate pages or as near-identical content across multiple pages (a common technical fix is sketched after this list).
- Build every page on your site up to at least 800 words of content, and refresh that content periodically.
- Avoid keyword stuffing or any use of keywords that reads unnaturally.
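On the duplicate content point, one widely used technical fix is the canonical tag, which tells Google which version of a page is the preferred copy when several URLs serve near-identical content. A minimal sketch, with a placeholder URL:

```html
<!-- Placed in the <head> of each duplicate or near-duplicate page,
     this consolidates ranking signals onto one preferred URL. -->
<link rel="canonical" href="https://www.example.com/blue-widgets/" />
```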
2. The Venice Update
The Venice Update went live in February 2012 as part of a batch of changes to the existing algorithm. Frequently overlooked, it remains an important guideline for local SEO tactics.
Venice improved Google's results for local queries. Until this point, a user who wanted location-based results had to attach the appropriate modifier to their search (e.g. to find restaurants in London, the search needed to include the word London). After the Venice Update, Google began using a computer's IP address along with the user's physical location to shape results for searches with a local intent that was not explicitly typed out (e.g. if you searched for 'good Italian restaurant', Google could understand you wanted not just a good Italian restaurant but one close to you).
Further updates, known as the Pigeon and Possum Updates, have since been made to complement Venice. These have further homed in on serving the results closest to you, and on weeding out new local spammers who had found loopholes in the Venice Update.
All in all, if your site lives off local traffic, it is important to make sure your pages reflect that intent. Optimizing your metadata and on-page content with local references is a good start toward ranking higher with a local audience.
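As a sketch of what reflecting local intent can look like – taking an 'seo agency' in San Francisco as the example – here is a hypothetical page head pairing a location-aware title tag with LocalBusiness structured data from schema.org (every name, address and number below is a placeholder):

```html
<title>SEO Agency in San Francisco | Example Agency</title>
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Agency",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Market St",
    "addressLocality": "San Francisco",
    "addressRegion": "CA",
    "postalCode": "94105"
  }
}
</script>
```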
3. The Penguin Update
The Google Penguin Update was first introduced in April 2012 and has since been revised a number of times. Unlike Panda, this update is a real-time part of Google's algorithm.
Penguin is a piece of software that focuses on backlinks. It was an update long overdue since the early days of PageRank, and it has caused many black hat SEO specialists a major headache. The essence of the update is that any links Penguin deems spammy or manipulative are given zero value, passing no additional authority to the site they point to. In earlier versions of Penguin, those links would also have triggered a penalty against the website itself, but Google noticed anti-competitive actors pointing manipulative links at their rivals and changed the behaviour to simply devaluing the links.
To avoid being penalized by Penguin, the following measures should be taken:
- Quality over quantity: avoid low-authority domains when prospecting for new links. A good way to check whether a domain is trustworthy is to enter its URL into Majestic and look at its Citation Flow and Trust Flow scores. Both run from 0 to 100; a double-digit score is respectable, and the two scores should be relatively close to each other.
- Diversified anchor text: anchor text that is always keyword-rich will be flagged by Google as manipulative. Instead, diversify your anchor text so each backlink reads naturally – see the comparison after this list.
- Avoid buying links or using tools to generate backlinks: both tactics tie you to low-quality sites known for black hat behaviour. A strong network of links is not built overnight; it takes a good amount of work, and Google knows that.
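To make the anchor text point concrete, compare a backlink profile where every inbound link repeats the same exact-match phrase with a more natural mix (a hypothetical sketch; URLs and names are placeholders):

```html
<!-- Risky: every link into the site uses the same commercial phrase. -->
<a href="https://www.example.com/">best seo agency london</a>
<a href="https://www.example.com/">best seo agency london</a>

<!-- Healthier: branded, generic and descriptive anchors mixed together. -->
<a href="https://www.example.com/">Example Agency</a>
<a href="https://www.example.com/">this case study on local rankings</a>
<a href="https://www.example.com/">www.example.com</a>
```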
4. The Hummingbird Update
In August 2013, Google shook up the game and changed the core of their algorithm. This major update came to be known as the Hummingbird Update.
This update, unlike Panda and Penguin, was less about identifying and penalizing black hat techniques and more about improving Google's search results. The idea was to better understand users' search intent and provide them with more relevant answers. That meant expanding results beyond pages that merely matched on-page keywords to include latent semantic indexing, co-occurring terms and synonyms. With this more advanced language processing, more low-quality content was cut out and results pages were filled with more relevant pages than ever.
As mentioned above, the Hummingbird Update was less about catching spammers and more about organically improving Google's results pages. To be successful under Hummingbird, make sure your content reads naturally and targets key themes, not just individual keywords.
5. The Mobile Update
In April 2015, Google launched their Mobile Update, which is as straightforward as it sounds.
This update punished sites that lacked a mobile-friendly version of their website or had poor mobile usability. When a search was made on a mobile phone, results with a mobile site were given higher priority, and sites without one were pushed down the results pages.
The solution here is as straightforward as the issue – if you haven't already, launch a mobile version of your website. Once it's live, run it through Google's Mobile-Friendly Test to check for usability issues. In July 2018, Google released an additional update making page speed an official ranking factor, so in terms of best practices, usability and speed are both important considerations. To learn more about the benefits of going mobile, check out this post on how to rank better with mobile SEO.
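If you are building that mobile version from scratch, the usual route is a responsive layout, and the one non-negotiable piece of markup is the viewport tag:

```html
<!-- Without this, mobile browsers render the page at desktop width and
     scale it down – exactly what the Mobile-Friendly Test flags. -->
<meta name="viewport" content="width=device-width, initial-scale=1" />
```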
6. The RankBrain Update
RankBrain was released in October 2015 as a complement to the Hummingbird algorithm. Although we don't know exactly how RankBrain works, Google has publicly stated that it is the third most important ranking factor.
RankBrain is a machine learning system that acts as a query processor, further helping Google understand search queries. It is believed that RankBrain continuously records and stores written and spoken queries and maps them to likely intents.
The strategy for staying successful under RankBrain is no different from Hummingbird's; it simply puts an even larger premium on a website that is searchable, user-friendly and appropriately rich in content throughout. A wide range of keywords, supported by backlinks from authoritative partner websites, also helps.
7. The Fred Update
The most recent major Google update came in March 2017 and was nicknamed Fred. Fred's goal is to identify websites that violate Google's Webmaster Guidelines, send a warning, and apply a penalty if the issue is not resolved. The guidelines target many tactics, but the majority of flagged sites have content issues – anything from thin content built around an obvious upsell to pages blanketed in advertisements.
The best practice for avoiding a Webmaster Guidelines flag is to review the published rules and adjust your site wherever it falls short. Hosting ads or promoting third-party sites will not get you into trouble, as long as they are presented naturally and the parties you advertise are reputable.
Best Practices on Google Today
If you've made it this far, give yourself a pat on the back. Google algorithm updates aren't always the most captivating reads, but needless to say, it's important to stay up to date on them. If any of the updates above have you worried that you may be being penalized, a good resource is the Barracuda Tool, which helps you investigate whether you've been affected. Looking across the major updates above, a couple of overarching themes emerge that can help us be successful:
- Google is transparent about SEO and wants your site to be impactful
- SEO can be complicated and difficult to master, but knowing just a few best practices goes a long way
It's common to fall into seeing Google as the unfair ruler of the search engine world. But Google really does want your website to be as good as it can be; their success and relevance depend on it. When someone enters a search, they expect Google to answer their question instantaneously, regardless of how many ways the query could be interpreted. That is a lot to ask. For Google to pull off that trick every time, it needs our help in making our sites as easy as possible to crawl, understand and categorize. In return, Google will rank your site higher and higher up its results pages.
Further proof of Google's openness came in 2016, when Google announced the top three search ranking factors. The announcement not only shows that Google is more of an open book about SEO than we may have assumed; it also lets us home in on those factors and optimize our websites to rank as highly as possible. Diving in, it is no surprise that content and RankBrain are both in the top three – both stress the importance of quality, natural content, with Google rewarding the best sites for the best pages. Finally, seeing backlinks rank so highly tells us that Google is using the web to grade itself: backlinks are votes of confidence, so the more votes a site earns, and the higher the authority those votes come from, the more legitimate that site looks to Google.
At the end of the day, SEO can be a daunting task to master. But knowing what Google considers important, avoiding black hat techniques like buying links and keyword stuffing, and monitoring what is on your site and who links to it are all manageable tasks that can get your site ranking. From there, it's worth finding an SEO agency to partner with for the long haul. Take your time reviewing candidates and choose a partner who understands your business objectives and has a proven track record with customers in your industry.