Google Algorithm Updates You Should Know
September 12, 2017
Search engines have come a long way since the first ones appeared in the early ’90s.
Most of those early search engines have since been pushed out by the big players we have now, like Google, Bing, and Yahoo.
Now, when it comes to search engines, Google comes out on top.
Although search engine market share fluctuates, Google consistently takes the majority.
Since 1998, Google has been working on and perfecting its algorithm to show the best results to its users.
As the years have gone by, they’ve regularly updated the core algorithm in addition to releasing updates to ensure their search engine performs for users and stays on top.
Some of these updates weren’t a big deal; they were expected based on Google’s best practices and evolving technology.
However, other updates threw the SEO industry for a loop. In hindsight, it was for the best, but at the time, it left many companies scrambling, some of which never recovered.
Search engine algorithms are always changing and evolving. It can be hard to keep up and to understand how each one affects your site.
The algorithm updates Google has released have grown it into the search engine it is today and brought the digital marketing industry to where it is now, but not every update is equally important when it comes to knowing how best to develop and maintain your website.
To clear up the confusion and make it easy to reference the updates that matter most to your site, here is an alphabetical list of the Google algorithm updates you should know:
Ads Above the Fold (a.k.a. “Top Heavy”)
The Panda algorithm update targeted high ad-to-content ratios, among other things, but the Ads Above the Fold update, or “Top Heavy” update as some SEOs called it, of January 2012 targeted and devalued sites that placed too many ads “above the fold” (the area you can see on a website without scrolling).
February 2014 rolled around and it seems some webmasters didn’t get the memo about having too many ads above the fold, so “Top Heavy” was refreshed and more sites were penalized.
AdWords Shake-up
This AdWords update of February 2016 may have been all about paid search, but it still made a big difference in the SERPs (search engine results pages). With this update, Google completely removed the right-column ads and added a 4-pack of paid ads to the top of commercial searches.
Because of these changes to the layout of the SERPs, the organic site listings were pushed even further down the page. More than ever before, success in the search engines was less about “ranking #1” for a keyword and instead was about becoming the most valuable, relevant resource for your users.
Austin
Austin was released in January 2004 and was considered a continuation of Florida. It cracked down further on what Google had started considering deceptive on-page tactics, like keyword stuffing through invisible text and in meta tags.
Around this time, it’s speculated that Google began to take page relevance more seriously as they activated the “Hilltop” algorithm, which considered the quality and relevancy of the outbound links on your website.
Boston
This was the first named Google update, announced in February 2003 at SES Boston.
On its journey to a more effective algorithm and more relevant SERPs, Google aimed for one major update every month.
Before Boston, it was the “Google Dance,” which referred to the monthly combination of algorithm changes and major index refreshes that normally resulted in PageRank fluctuations and shakeups in rankings.
The Boston update was one of these monthly updates; the only difference this time was that it got a name.
Brandy
Brandy’s release in February of 2004 marked a huge index expansion with the inclusion of LSI (Latent Semantic Indexing).
This introduced the concept of link “neighborhoods” and began grouping topics together, increasing the attention Google paid to anchor text relevance. It also expanded upon Google’s ability, at the time, to understand synonyms and related keywords.
Caffeine
Caffeine marked a major infrastructure change. It was designed to speed up crawling and expand the index, moving indexation (and thus ranking) closer to real time, or as close as Google could get.
A preview was released in August 2009. The official rollout of Caffeine began in 2010, spanning several months and ending around summertime (June 2010).
Cassandra
Cassandra was released back in April 2003. It addressed, and cracked down on, a few link-quality issues, like tons of links from co-owned domains. It also targeted hidden text and hidden links.
With hindsight being 20/20 and the focus slowly starting to shift towards quality links, we could consider Cassandra an early precursor to Penguin.
Esmeralda & Fritz
Released in June 2003 and July 2003 respectively, Esmeralda and Fritz marked the end of the monthly “Google Dance.”
Updates became part of a rolling process and grew more frequent; the idea of one major update a month, and with it the monthly “Google Dance,” faded.
Google “dances” still happen as other updates are rolled out and results change, but the idea of a monthly “Google Dance” and a quota for algorithm updates no longer exists.
These updates marked the beginning of a more continuous, rolling update process and an index that began to update daily instead of monthly.
Florida
Released in November of 2003, Florida was the first of several updates that shook up the SEO industry. This one caused several pages on sites to lose ranking, but also caused other pages to gain ranking.
Looking back, this was the first big alert that keyword stuffing was on its way out the door. We could also argue this was another step for Google towards identifying and beginning to devalue black hat SEO tactics.
Fred
In March of 2017, Google released an update that affected websites with outdated or irrelevant content. Commonly called “Fred,” this update has been analyzed and monitored by SEO professionals to understand how it affects different types of websites.
Even though Google hasn’t officially released information regarding this update, industry experts have researched its effects and uncovered a few common insights. Websites that have been hit the hardest by “Fred” share a few common characteristics:
Outdated or Irrelevant Content
Significant Amount of Paid Ads
To avoid being penalized by this update, there are a few key areas you should improve on your website. If you have product/service descriptions or other content that hasn’t been updated in a few years, it may be time to refresh what you have. Giving search engine crawlers new content to index not only helps your ranking but also helps create a better user experience for your visitors overall.
Paid ads can be a great way to earn extra revenue for your website. However, if your visitors feel bombarded by them, it may negatively affect both your rank and traffic. There are several tools available on the market to assess the overall experience of your website, but one of the easiest ways is simply visiting it yourself. If you are distracted by the number of pop-ups and paid ads on your website, chances are your visitors are too.
Freshness Update
The Freshness Update was released in November 2011 and it rewarded fresh content. Fresher content got a boost in time-sensitive results; outdated content didn’t.
This update was a strong statement from Google that keeping your content updated and adding fresh content to your site mattered.
Google+
Google+ was Google’s attempt to create a social network to compete with Facebook. It was released in June of 2011. Though it never reached the levels of Facebook, Google+ did amass 10 million users within 2 weeks after launch.
The Search + Your World update in January of 2012 pushed Google+ social data into the search results, including user profiles and posts.
Although Google+ currently has over 2.5 billion users (because each Gmail address automatically gets a Google+ profile), only approximately 4 to 6 million users actively post and engage.
Google Local
Google launched Local Business Center in March of 2005. After several years and iterations, we now have Google My Business, which integrated local information on Google with your Google+ business page into one easy admin center.
Back in October of 2005, however, we had the Local Business Center and it merged with Google Maps data. This would be the beginning of an increased focus on Local SEO and the foundation for the Universal Search update of May 2007 and the local pack we’re used to seeing in today’s SERPs.
Google Places
In 2005, information from the Local Business Center merged with data from Google Maps to integrate local into the search results. In April 2010, Google Places was officially launched and Local Business Center was re-branded as Google Places.
This move from Google integrated information from the new Google Places even further and more closely with local search results.
Google Suggest & Google Instant
Suggest was released in August of 2008. Again, not a traditional algorithm update, but a significant change to the appearance and function of search. With Google Suggest, as a user typed their query, suggested searches would be displayed in a drop-down below the search box.
Personalized Search laid the foundation for this when it was released June 2005. And, Google Suggest went on to evolve into Google Instant, which was released September 2010.
Hawk
The focus of many companies in supporting a website is to drive leads and generate revenue for their business. One of the easiest ways to help do this is to create a Google Business listing and link it to your website. August 2017’s algorithm update aims to help businesses differentiate themselves in a search and improve their ranking.
Since the release of the Possum update, businesses with mailing addresses in the same building or on the same street have had trouble differentiating themselves during a search. In some cases, businesses would unintentionally block other local competitors due to Google’s initiative to eliminate duplicate content.
The “Hawk” update seems to be an attempt to resolve this bug and give businesses in close proximity to one another a strong presence in Google Searches. Even though this update has only been released in August of this year, experts believe that certain businesses have already seen a profound effect in both traffic and ranking.
HTTPS/SSL
In August of 2014, Google released the HTTPS/SSL update, which gave preference to secure sites. Google also announced that adding encryption would provide a “lightweight” boost in site visibility.
Though the boost would start out small, Google implied it could increase if the change was positive (in Google terms, this means if the users like secure sites more than “un-secured” sites).
It’s still a bit vague when it comes to how much of a boost adding encryption gets your site, but it undoubtedly does help.
Hummingbird
Hummingbird was released in August of 2013. It was the biggest move towards semantic search Google had made thus far. When it comes to Hummingbird, think “conversational search.”
This was Google’s attempt to focus on the meaning behind the words, not just the words themselves.
When it comes to your site and the content on it, focus on topics, not singular keywords, to make sure you’re addressing the questions your users may be asking.
“In the News” Box
Before October 2014 and the “In the News” Box Update, only content from traditional news sites showed up in the News Box results. After the update, any content deemed newsworthy could show up there.
This definitely wasn’t the birth of newsjacking, but it certainly made it a more effective tactic for sites in any industry.
Jagger
With the biggest impact occurring in October 2005, Jagger was another update aimed at improving link quality. It targeted low-quality links like reciprocal links, paid links, and link farms.
This update, along with the Cassandra update a couple of years prior, showed a slow but steady move towards valuing link quality over link quantity, something Google really put its foot down on when it released the Penguin updates.
Knowledge Graph
In May of 2012, Google rolled out the “Knowledge Graph.” This was a display integrated into the SERPs (usually appearing on the right-hand side) that offered additional information about people, places, and things users were searching.
The goal was to provide answers, not just links. The Knowledge Graph was another step towards semantic search for Google.
In December of 2012, the Knowledge Graph received some enhanced capabilities and additional functionality for several non-English queries. Another expansion occurred July 2013 to increase the number of searches for which Knowledge Graph entries appeared.
May 2017 Quality Update
May 2017’s algorithm update focused on UX and made a powerful statement to remind website owners that content is key! Similar to “Fred” and the previous Panda update, this update focused on directing users to websites with strong user experiences.
Like many recent algorithm updates, Google hasn’t officially released information regarding the effects the update would have on websites. Industry experts and bloggers have done research and kept a watchful eye on how it has affected them. Here’s what I’ve found.
Websites that faced a severe drop in organic traffic as well as search rank exhibited several common characteristics: large amounts of paid ads placed throughout multiple pages, deceptive ads that link to unrelated sites or spam, and overall content that relates poorly to the product/service or organization.
This research supports the idea that strong, relevant content and a positive user experience not only drive traffic to your website but also help improve how well it ranks. For a complete list of guidelines published by Google regarding content, visit Google’s Rater Guidelines.
May Day
Hailed as a precursor to the more widely known Panda update, the May Day update of May 2010 hit thin content hard. Websites with a lot of thin content saw significant drops in traffic, particularly from long-tail searches.
Mobile Update (a.k.a. “Mobilegeddon”)
Several months before this update was released in April 2015, Google announced it was coming.
It was a rare move for them, but after years of dealing with SEOs, Google knew by announcing upcoming penalties, they might be able to get webmasters to respond accordingly to avoid it – they were right.
No one could deny the mobile update, dubbed “Mobilegeddon” was coming. Google had made it clear – get mobile-friendly or get out.
As such, once the update began rolling out April 2015, the impact of the update was smaller than expected, at least for the short-term.
A second mobile-friendly update was released in May 2016. This would serve as a good foundation for Google’s Mobile-First Index.
Nofollow
Although it was not a traditional algorithm update, the “nofollow” attribute was introduced in January 2005 and, over time, has had a significant impact on the link graph.
It was a combined move from Google, Yahoo, and Microsoft to help control outbound link quality as well as combat spam. With the “nofollow” attribute, it was much easier for webmasters to clean up unvouched-for links, spammy blog comments among them.
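In practice, the attribute is just an addition to a standard anchor tag. A minimal sketch (the URLs are hypothetical):

```html
<!-- A normal link implicitly "vouches" for the page it points to -->
<a href="https://example.com/partner">Partner site</a>

<!-- rel="nofollow" tells crawlers not to pass link equity or vouch for the target -->
<a href="https://example.com/user-comment-link" rel="nofollow">User-submitted link</a>
```

Blogging platforms soon began applying the attribute to comment links automatically, which is a big part of what blunted comment spam as a link-building tactic.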
Panda
Though several previous updates targeted content quality, none made as big of an impact as the Panda algorithm update of February 2011.
Panda came down hard on thin content, high ad-to-content ratios, content farms, and several other issues related to quality.
For sites still using black hat SEO tactics and relying on thin content (and low-quality links), this update left them reeling; some never recovered, and even more were then blasted by the impending Penguin update.
The message from Google was clear – create quality content for your users, or else.
Over the next few years, updated versions of Panda would be rolled out until January 2016 when it became part of the rolling algorithm update.
Penguin
Originally referred to as the “Webspam Update” and later named “Penguin,” this update rolled out in April 2012. It targeted “over-optimization,” so if you were caught keyword stuffing or using link schemes, you were hit hard by Penguin.
Penguin changed the way SEOs approached link building. It was no longer about which site had the most links pointing to it, it was about the quality, relevancy, and balance of those links as well.
Much like Panda, several iterations of Penguin were released until September 2016 when it too became part of the rolling algorithm updates. September 2016 was also when Penguin shifted from site-wide penalties for bad links to simply devaluing bad links and if penalizing, doing so on a page-by-page basis.
Personalized Search
Though this was not the first time Google had tried to tap into personalized search, this June 2005 update used users’ search histories to automatically adjust results.
The initial impact of this update was small. However, this would be the first of many steps towards the more personalized and semantic-based search we have today.
Pigeon
Pigeon was released in the United States in July 2014 and was all about local.
According to Google, Pigeon tied the local algorithm and the core algorithms closer together to improve how they interpreted location cues and to provide better local results.
In December of 2014, Pigeon expanded to include Canada, Australia, and the United Kingdom.
Pirate
Also referred to as the DMCA Penalty, “Pirate” was released in August 2012 to target and penalize sites with a track record of copyright violations. Duplicate content, scraped content, and similar spam had been targeted before, but this was the first time Google cracked down specifically on copyright violations.
Since some sites still didn’t get the message, Pirate 2.0 was released in October 2014. A much smaller group of sites was hit this time around, but the ranking drops for this segment were significant.
Google’s message was clear – don’t steal or scrape content.
Query Encryption
October 2011 marked another significant change with Google’s announcement that they would be encrypting search queries. Google cited privacy reasons as to why they made this move.
Regardless of their reasoning, this was the beginning of “(not provided)” and a big jump away from focusing on rankings for specific keywords and towards offering the best content around a topic.
RankBrain
In October 2015, Google made a major announcement – machine learning had been part of the core algorithm for months, and it had become the third most influential ranking factor.
This announcement indicated a level of automation on the horizon for providing better results in real-time.
Real-time Search
Universal Search in May 2007, Google Suggest in August 2008, and the preview of Caffeine released in August 2009 paved the way for Real-time Search in December 2009.
This integrated several sources, including Twitter feeds, Google News, newly indexed content, and more to make some top SERPs into a real-time feed.
rel=canonical
In February 2009, Google, Yahoo, and Microsoft teamed up again to announce the release, and support of, canonical tags.
These tags allowed webmasters and SEOs to signal canonicalization to search bots without affecting human visitors. This gave webmasters and SEOs a better solution for dealing with self-created duplicate content on their site.
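For example, if the same product page is reachable at several URLs (tracking parameters, session IDs, and the like), a canonical tag in the page’s head tells crawlers which version to index. A minimal sketch with hypothetical URLs:

```html
<!-- Served on a duplicate URL such as
     https://www.example.com/widget?utm_source=newsletter -->
<head>
  <!-- Points crawlers at the preferred, indexable version of the page -->
  <link rel="canonical" href="https://www.example.com/widget">
</head>
```

Human visitors see nothing different; only crawlers act on the tag, which is exactly why it solved the self-created duplicate content problem so cleanly.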
Schema.org
When you look at the SERPs today, a website listing is not just a URL and a description; it’s so much more. All of that extra information makes for richer search results (rich snippets), and you can thank structured data, or schemas, for that.
The Schema.org announcement from Google, Yahoo, and Microsoft in June 2011 was the first big move towards these rich snippets in the search results.
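Schema.org launched with microdata markup (Google later came to recommend JSON-LD as well). A minimal microdata sketch for a hypothetical local business:

```html
<!-- Structured data identifying this block as a LocalBusiness entity -->
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <span itemprop="name">Example Bakery</span>
  <span itemprop="telephone">(555) 555-0100</span>
  <span itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Springfield</span>
  </span>
</div>
```

Markup like this is what lets search engines pull names, phone numbers, ratings, and more directly into rich snippets.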
Social Signals
Though the use of social signals as a ranking factor is widely debated today, back in December 2010, both Google and Bing confirmed they used social signals from Twitter and Facebook to determine ranking.
Universal Search
Universal Search was launched in May of 2007. Though it was not a “typical algorithm update,” it changed the traditional 10-listing SERP forever by integrating News, Video, Images, Local, and more into the search results.
This changed the format of the SERPs and started the shift away from “ranking” and towards “visibility.” After all, with these other verticals included in the search results, your website listing being “#1” didn’t mean what it used to.
XML Sitemaps
Before the XML Sitemaps update of June 2005, traditional HTML sitemaps were all that was available. With XML sitemaps, webmasters and SEOs were given a little bit more influence over the crawling and indexation of their website.
An XML sitemap, along with a robots.txt file and other methods, allows you to tell search engine crawlers which pages of your site you want to be indexed, which to give priority, which pages they can ignore, etc.
All of this works together to improve the crawl rate of your website and improve the rate at which the content on your pages is refreshed and indexed in the SERPs.
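A minimal XML sitemap looks like this (the URL and dates are hypothetical; note that the changefreq and priority fields are hints to crawlers, not directives):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's full URL and when it last changed -->
    <loc>https://www.example.com/</loc>
    <lastmod>2017-09-01</lastmod>
    <!-- Hints about update frequency and relative importance -->
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

The sitemap is typically referenced from robots.txt with a Sitemap: line or submitted directly through Google Search Console.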
Google and other search engines are always updating their algorithms and releasing other updates to ensure they are delivering the best results to users.
It’s a lot to take in and a lot to track. The one thing you can count on is that the algorithm will continue to change along with evolving technology and user needs.
For right now, the message is clear – what’s good for users is good for your site.
So, get to know your users, and then do what’s actually best for them (not what you think is best for them).
Not sure if your site is keeping up?
Reach out to the VIG for a quick chat.