After reading this article, you will have key knowledge of the main Google algorithm updates, allowing you to monitor any SEO services you employ.
We also give you actionable insights to apply immediately.
Understand the SEO implications of the main Google Algorithm updates in 9 min
After reading this article about how the main Google algorithm updates affect SEO services, you will be able to tell whether your SEO services have fallen victim to an update and what corrective steps your SEO company would need to take. Pay close attention to this free guide and you could avoid a penalty from one of these updates altogether. We guide you on how to optimise content to work with the updates instead of against them. We will only explore the main updates that we believe are most relevant to the main SEO issues we find when helping clients.
It’s important to understand that Google makes algorithm updates all the time, at least 500 to 600 changes per year. Most of these go unnoticed, but when your results suddenly drop or rise by a large margin without warning, it’s usually because of what we SEOs call a major update. These updates have been known to ruin companies in the past, but they also allow good, honest SEO companies like BrandSeed to rank on page 1. So how do you identify whether you have been hit by a penalty from one of these updates?
If your SEO company practices white-hat, Google-supported SEO techniques, there should be no major negative changes in your rankings. Support Google in delivering its value proposition of superior, relevant content to its searchers and you will never be negatively affected by a new update. In the meantime, let’s learn about four major updates and how they affect SEO services.
How does RankBrain affect SEO services?
RankBrain is Google’s spearhead in machine learning. It is an AI-driven component that swaps and compares website rankings according to measured metrics such as dwell time, bounce rate and whether or not pogo-sticking takes place. Websites measured to have more engagement, through longer site visit duration and lower bounce rates, will rank preferentially over counterparts vying for the top positions.
Let’s take a look at what these metrics mean:
Bounce rate:
The percentage of users that click the back button within a few seconds of landing on a webpage.
Site visit duration (dwell time):
The amount of time a user spends on the site in a single session.
Pogo-sticking:
When users jump back off pages of your website, suddenly and often, after clicking on an inbound URL.
Google wants to know that your website is indeed what the searcher is searching for. One of the most logical ways to measure this is how long the visitor views the website and whether the user hits the back button almost immediately. Both are telltale signs of a user not finding what they are looking for.
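To make these metrics concrete, here is a minimal Python sketch, purely an illustration of the definitions above and not how Google actually computes them, that derives a bounce rate and an average dwell time from a list of hypothetical session durations (the 10-second bounce threshold is our own illustrative cut-off):

```python
# Hypothetical session records: seconds a visitor spent on the site.
sessions = [3, 45, 120, 2, 300, 8, 95]

BOUNCE_THRESHOLD = 10  # seconds; an illustrative cut-off, not Google's

# A "bounce" here is any visit shorter than the threshold,
# i.e. the user hit the back button almost immediately.
bounces = [s for s in sessions if s < BOUNCE_THRESHOLD]
bounce_rate = len(bounces) / len(sessions)      # share of quick exits
avg_dwell_time = sum(sessions) / len(sessions)  # mean time on site

print(f"Bounce rate: {bounce_rate:.0%}")            # → Bounce rate: 43%
print(f"Average dwell time: {avg_dwell_time:.0f}s")  # → Average dwell time: 82s
```

In real life you would read these figures straight out of Google Analytics rather than compute them yourself, but seeing the arithmetic makes it obvious why a page that fails to deliver on its promise above the fold drags both numbers in the wrong direction.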
How do we lower the bounce rate?
The design, layout and copy of your website above the fold, along with mobile optimisation, are crucial to keeping bounce rates healthy. The copy should be bold and headings large so that the user can ascertain the value proposition immediately. It goes without saying that the content should be exactly what the user was looking for; do not try to use keyword combinations that match synonyms or unrelated content.
How to improve the dwell time?
Dwell time is how much time a user spends on your website. In order for content to be absorbed or even noticed by a user, it must be formatted accordingly:
Break ideas down into one or two sentences. Large paragraphs of 5 or more sentences tend to be skipped. Break paragraphs down if you want the user to carry on reading.
Use images and video
Humans are overwhelmingly visual processors. Images and video help to break up large swaths of copy, help the user understand the copy faster, and engage them more. When users watch a video embedded on a page, it also affects dwell time positively, as more time is spent on the page. This in turn positively affects RankBrain, as user data aggregates over time, thereby improving ranking.
Make sure that your content is extensive. Long-form content has proven to be a shared characteristic of websites ranking in the upper positions of page one results. It is also logical: more words mean longer reading and more dwell time. The caveat, of course, is making sure that the content is formatted correctly and is of high quality so as to keep the user engaged.
Use variations in keyword phrases relative to concepts in copy
RankBrain, like a human, can group ideas or keywords into concepts. In fact, RankBrain will expect certain associations or variations of keywords to exist if a blog post or page is written naturally. Make sure that you use enough variations of your keyword phrase in your content to satisfy RankBrain’s sense of relevance. Those of you who think I’m confusing this point with Hummingbird should know that although RankBrain is part of Hummingbird, it has evolved to be the forerunner of the two.
Monitor your Google Analytics carefully under the Behaviour and Content reports. Look at where your site users travel and where they drop off your site. Analyse the results yourself when working with an SEO company; the results are what you are paying for!
How to work with RankBrain?
1. Ensure that headings convey the value proposition immediately.
2. Ensure the value proposition matches the user intent.
3. Format the layout of the content for skimming.
4. Ensure you have Google Analytics installed correctly to monitor bounce rate, dwell time and page clickthrough depth.
How does the Panda update affect SEO services?
The Panda update originally rolled out in 2011, with various refreshes until 2014; Matt Cutts, former head of webspam at Google, even tweeted about it. The update aimed to prevent low-quality content websites from appearing in the upper search echelons. This update was, and still is, functioning as part of the core algorithm. It continues to be fine-tuned today, and when it comes to SEO services it is extremely important to keep in mind when looking at content marketing and user-generated content.
Weather report: expect some Panda-related flux in the next few weeks, but will have less impact than previous updates (~2%).
— Matt Cutts (@mattcutts) October 5, 2011
To simplify Panda, you can see it as Google’s main drive to ensure quality content for its searchers. Thin content, with lots of affiliate ads and lots of content pulled from other websites, is a recipe for disaster when it comes to Panda.
(source: Calogero Dimino)
Let’s take a look at the main Panda ranking factors to take into consideration when designing and deploying a content strategy.
Thin Content will affect SEO services
Content marketing is the new SEO. Content is the commodity consumed on the web, after all. Google wants searchers who use Google search to find in-depth knowledge of the topic they are searching for. Content should be lengthy, with images and video to enhance it where possible.
We mentioned long form content earlier in this article as it improves dwell time. Long form content shows depth which is also relevant to the Panda update.
Duplicate content is another trigger for a Panda penalty to your website. Google frowns heavily on duplicate content. Use tools like siteliner.com to scan for duplicate content. It is normal for websites to have some duplicate and especially common content, but the percentage of duplicate content should be under 10% to be safe.
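To build an intuition for how such scanners flag duplication, here is a simplified Python sketch, our own illustration rather than siteliner’s actual method, that estimates the overlap between two pages using overlapping word chunks ("shingles"):

```python
def shingles(text, n=3):
    """Split text into overlapping n-word chunks ('shingles')."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def duplicate_percentage(page_a, page_b, n=3):
    """Rough share of page_a's shingles that also appear in page_b."""
    a, b = shingles(page_a, n), shingles(page_b, n)
    if not a:
        return 0.0
    return 100 * len(a & b) / len(a)

original = "our seo company helps local businesses rank on page one"
copied = "our seo company helps local businesses rank on page one today"

print(f"{duplicate_percentage(copied, original):.0f}% duplicate")  # → 89% duplicate
```

A page that scores this high against another URL, on your site or someone else’s, is exactly the kind of thing to repurpose, rewrite or canonicalise before Panda notices it.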
Lack of trustworthiness
A Google staffer, Amit Singhal, went on record to say that websites should work to establish themselves as an authority on the subject matter of their content. Google wants to see that the authors of content are identified, referred to by backlinks and even socially active. Another Google concept that may be tied to this is AuthorRank, where Google monitors your followers and social interactions across the web in order to rank your website. Recently, Google even added a feature to verify your influencer knowledge panel.
Amit Singhal, in his article on Google Webmasters about the Panda Update, went on to offer some questions that may guide you when looking at the quality of your content in relation to Panda:
Would you trust the information presented in this article?
Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
How relevant is the content to the search query?
It is also very important that your content matches the user’s search query as closely as possible. Some SEO companies use keywords that are not completely relevant to their client’s content in the hope of gaining extra sales or subscriptions, but this is a big “no-no”. Make sure you help Google connect its users with the exact content they are looking for when employing SEO services.
How to make the best of Panda update?
1. Create in-depth content.
2. Create authority for yourself by supporting your points from other expert sources.
3. Use siteliner.com to scan your website for duplicate content.
4. Repurpose or edit duplicate content, or use rel canonical links (a `<link rel="canonical" href="…">` tag pointing at the main version) to canonicalise the main content.
5. Read our SEO Cape Town Guide for on-page SEO to help with relevancy.
How does the Penguin update affect SEO services?
Originally rolled out in 2012, the Penguin update was made part of Google’s core algorithm in 2016. As a core algorithm component, you should take note of Penguin when evaluating your SEO services. The Penguin update relates to the link profile of a website, both in- and outbound links, penalising websites that game search engines through unnatural link profiles.
Building backlinks has always meant operating in murky waters when unnatural links are built in order to rank. Google still strongly relies on backlinks in its ranking algorithm, yet many SEO companies employ black-hat or paid backlink strategies. The problem is that these links are often beneficial for a period of time, and Google penalises the site long after the unnatural backlink profile was created. Often the SEO company and client have long since parted ways, leaving the client with SEO services now working against them, after the fact.
Google specifically addressed the release of the Penguin update, in an original post by Matt Cutts in 2012, using the example below of how the outbound anchor text of a blog was not relevant to the content of the featured blog. This is a strong indicator that Penguin relates to link profiles, both inbound and outbound, and their relative anchor text.
The example below is considered black hat:
As of last week at the time of writing, Google announced that it had managed to bring down spammy backlinks by 50%. If you weren’t too sure about the severity of the Penguin update, this announcement should make it very clear: contrived backlinks are a disaster.
Google made another announcement last week: even backlinks from forums would now count for less, to combat unnatural backlink building through comment spam. They went further and downgraded the authority, or link juice, of backlinks from press releases, a technique that was always considered a highly effective, white-hat standard. Google is serious about the quality and authenticity of backlinks. Make sure that the SEO consultants or agency delivering your SEO services are aware of this!
How to make the best of the Penguin update?
1. Do not allow your SEO company to buy links!
2. Quality of backlinks is more important than the quantity!
3. Do not automate link-building with any SEO company!
4. Create amazing content worth linking to for authentic backlinks.
5. Employ a white-hat SEO company!
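One simple sanity check you can run on a backlink export (from Search Console or a third-party tool) is the share of exact-match "money keyword" anchor text, since a natural link profile is dominated by branded, generic and bare-URL anchors. A hedged Python sketch, assuming a list of (anchor text, linking URL) pairs that you supply yourself; the data and threshold are purely illustrative:

```python
# Hypothetical backlink export: (anchor text, linking URL) pairs.
backlinks = [
    ("BrandSeed", "https://example-blog.com/post"),
    ("click here", "https://example-forum.com/thread"),
    ("cheap seo services", "https://example-dir.com/listing"),
    ("cheap seo services", "https://example-dir2.com/listing"),
    ("https://brandseed.co.za", "https://example-news.com/article"),
]

TARGET_KEYWORD = "cheap seo services"  # the money phrase being checked

exact_match = [a for a, _ in backlinks if a.lower() == TARGET_KEYWORD]
ratio = len(exact_match) / len(backlinks)

# A high exact-match ratio is a red flag worth raising with your SEO company.
print(f"Exact-match anchors: {ratio:.0%}")  # → Exact-match anchors: 40%
```

There is no published "safe" percentage, but a profile where commercial exact-match anchors dominate is precisely the unnatural pattern Penguin was built to catch.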
As a side note:
In the same Google Webmaster post as the Penguin release, Matt Cutts mentioned that keyword stuffing is not advised. Keyword stuffing is an ancient and ineffective technique of trying to rank higher by overloading copy, meta descriptions, headings and title tags with keywords in a way that is unnatural and not enjoyable to read. This is more related to the Panda algorithm that we have already discussed. In 2018, any company offering SEO services that still applies this technique should be parted ways with, immediately!
Interestingly enough, John Mueller, a well-known Google engineer in the search department, recently announced, as of last week at the time of writing, that if the content is of a very high standard Google may turn a blind eye to this practice. This is certainly not an invitation to keyword stuff: Google has been ambiguous in the past, but today black-hat gaming of rankings is heavily penalised. The slight advantage gained is not worth a potential Google penalty.
Yeah, but if we can ignore boring keyword stuffing (this was popular in the 90’s; search engines have a lot of practice here), there’s sometimes still enough value to be found elsewhere. I don’t know the page, but IMO keyword stuffing shouldn’t result in removal from the index.
— John ☆.o(≧▽≦)o.☆ (@JohnMu) June 20, 2018
How does the Pigeon update affect SEO?
Local search has become a major component of SEO services of late. The reason is that almost all of us click on the local stack, the results panel next to a snippet of Google Maps. We do this because we know the business is real and close by, and when we require a service it is usually more convenient when it is close by.
Google Pigeon was a historic local algorithm update, released on December 22, 2014, changing the original 7-pack of results to just 3 plus a Google Maps listing snippet. Google wanted a way to marry its organic results with its local results, and the Pigeon release was intended to achieve this.
Pigeon uses the number of local, contextual directory backlinks to gauge how locally relevant the website is to the search query entered by the user. Pigeon also relies on mentions of actual geographic location names within the copy, and on geographic keywords being used in the copy of the pages of the website.
Google wanted to improve its local results even further, so it introduced a sub-algorithm update called Possum. Possum basically took the distance between the company’s physical location and the searcher into account. The three results in the local stack are displayed to the searcher relative to their geographic position and the company’s actual physical or walk-in store location. Your local business will also show if you demarcate a local area as a service area, but these are heavily overshadowed by actual brick-and-mortar shopfronts.
How to make the best of Pigeon update?
1. List your company on Google My Business; keep your name the same as in all other listings.
2. If possible, have a walk-in brick-and-mortar store; it will rank preferentially compared to zoned service areas.
3. Use location terms in copy.
4. Create quality directory citations.