Often as an SEO company, we find clients initially want to dictate to us what content they will produce. The issue is that although a client understands its unique service or product offering best, the content will only be found in search engines if it matches users' queries entered into search engines. The SEO company has to advise which user queries will deliver the most traffic.
It is important to realise that Google has a value proposition to fulfil. Imagine if, one day, Google began serving results that weren't related to the query you entered. You would soon be looking for another search engine. An SEO company has to see this bigger picture.
The point is that search engines give primacy, and therefore rankings, to content that matches the user's query. The challenge is matching your content with the queries your target audience is actually typing into the search engine.
It is better to have traffic driven to your website by queries that are searched in sufficient volume than to have no traffic at all from content you simply prefer.
Site architecture comes into play here too. An SEO company will make sure that any point on the user's path or journey is at most three clicks away. Users want the information that solves their query as soon as possible, and the site architecture should support that.
SEO companies match user queries with content to avoid high bounce rates. A bounce is when someone lands on a page and hits the back button almost immediately. When someone is trying to solve a pain point, they won't hesitate for a moment to hit the back button in their browser. High bounce rates, in turn, affect Google's RankBrain algorithm negatively, as it measures user metrics.
Another important metric affected by matching user queries with content is dwell time. Dwell time is the duration of each user's visit per session. Google measures dwell time, which is why it is an available metric in Google Analytics. Dwell time is one of the main ways Google can "see" whether your content is engaging. If someone is looking for a specific solution or piece of information and spends a long time, say longer than 2.5 minutes, on your site, this is a strong signal to Google that your content is solving their pain point or need and is thus appropriately matched.
Again, if we help Google fulfil its value proposition, we are already halfway there with SEO.
Google can only "see" what its Googlebot crawling program can crawl. When technical SEO errors prevent Google from "seeing" your content and, in turn, indexing it, your website won't rank. Read our SEO Cape Town Guide for how to optimise content so that Googlebot can crawl it efficiently.
SEO companies should do a technical SEO audit as part of their SEO services. There is a range of free SEO tools to help an SEO company with such a task.
Screaming Frog is a leading website crawling tool that scans your site and shows it the way a crawler would see it.
The idea is to use crawl software that mimics Googlebot's crawl path to see if there are any errors. Large enterprise sites often have strange redirects or inaccessible pages, and their owners are frequently not even aware of it.
As a golden rule, every main page, if not every page, should be accessible within three clicks.
Simple but crucial elements of technical SEO, such as XML sitemaps and .htaccess files, are often negligently overlooked.
XML sitemaps are basically a content map and hierarchy of your site's pages and media, such as videos and images. Sitemaps are often not updated when new content is produced. XML sitemaps help search engines crawl your site more efficiently. It is an SEO company's duty to build appropriate sitemaps, with appropriate sub-levels for enterprise-size websites, as a roadmap for search engines.
Search engines can and will crawl your site without sitemaps, but nowadays it is crucial that Google gets to content sooner in order to rank it sooner and therefore drive traffic sooner. An XML sitemap makes Googlebot's job easier and therefore speeds up the indexing and ranking of your pages.
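As a sketch, a minimal XML sitemap follows the standard sitemaps.org format; the domain and URLs below are invented for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-09-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/services/seo</loc>
    <lastmod>2018-08-15</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Once the file is live, submitting it via Google Search Console tells Google where to find it.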
As an SEO company, we often come across sites that have redirected page upon page in the .htaccess file. .htaccess files are there to add redirects, but also to declare the preferred version of your website, such as https://www vs https:// (Google sees these as separate sites). Whenever a company redirects pages, the redirects are coded here. The more lines of code here, the slower the load speed of each page. SEO companies need to ensure a light yet efficient version of this file for their clients' websites.
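As a minimal sketch, assuming an Apache server with mod_rewrite enabled (the domain and paths are hypothetical), a lean .htaccess that enforces one preferred version of the site and handles a moved page might look like this:

```apacheconf
RewriteEngine On

# Force the preferred version of the site: https://www.example.com
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

# A single 301 redirect for a moved page.
# Avoid letting hundreds of these accumulate; each adds processing per request.
Redirect 301 /old-services /services
```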
The robots.txt file is used by an SEO company to tell Google what it may or may not crawl. Password-protected or member areas of sites usually use robots.txt to prevent Google from indexing those sections. Be aware, however, that preventing Google from crawling parts of a site can have negative consequences.
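A typical robots.txt is only a few lines; the members area below is a hypothetical example:

```txt
User-agent: *
Disallow: /members/
Disallow: /wp-admin/

Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` line also points crawlers straight at the XML sitemap discussed above.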
Technical SEO is best left to an SEO company due to the unforgiving nature of errors in this sphere of SEO. Google Search Console, however, is a great tool for being made aware of issues by Google directly. Have your SEO company or developer upload the code snippet downloadable from Google Search Console. It will give you, as a novice, the means to monitor your SEO company, and it plays a large part in the reporting of SEO services.
Users often follow a buying or conversion cycle. The user will begin with problem or need identification, research a solution, compare alternatives and finally purchase or convert. At different stages of this buying cycle, the user will be after different content. The idea is to structure content to offer value at each stage.
The idea is to continue to expand content to answer direct questions, especially with voice search set to surpass typed search. Begin with the keyword phrases most pertinent to ROI and then work towards encompassing the subject as a whole, creating a wide net of search terms, each providing value for users at a different stage of their buying or conversion cycle.
This is especially important for service companies and brick-and-mortar stores, as people use voice search on their mobile phones.
Again, dwell time and bounce rates will be affected by whether the content satisfies the user's intent, and a good SEO company will ensure that it does. When user intent is satisfied, you can expect long dwell times and low bounce rates, as there is no reason for the user to go elsewhere or hit the back button.
The user journey should also follow intent. For example, a search query with a high level of intent, such as "SEO Cape Town prices" in our case, would be better served by content pointing to "packages" than to "what is SEO?". A good SEO company understands the journey of user intent in terms of content.
Almost three weeks ago, at the time of writing, came the "Medic Update", and Google said that there was no fix. A lot of SEOs around the world are focusing on EAT as a possible solution or safeguard against future content-quality updates. EAT is an acronym featured in Google's Search Quality Evaluator Guidelines. We'll break down this crucial SEO success factor.
EAT relates to the Google algorithm called Panda. Panda was designed to separate low-quality sites from high-quality sites. EAT (Expertise, Authoritativeness, Trustworthiness) is the acronym Google suggested as the best way to approach your site content in order to be ranked favourably by Panda.
How does Google decide, based on your content, whether you should rank below or above another website with similar content?
As an SEO company, we need to make sure that our clients' content departments are producing content, and structuring it, in a way that signals to Google that they are indeed the experts. Signals we take into consideration to achieve expert status:
The more someone knows about something, the more we trust them. It is therefore logical that when you create content in large volumes, it seems as if you know a lot. In a massive study of 1 million websites, it was discovered that pages of 1,890 words or more featured regularly on the first page of results.
How would Google be able to judge the depth of content? One simple way may be to analyse what levels of headings and subheadings exist. For instance, are H1, H2, H3, H4 and H5 headings included, offering a hierarchy of information similar to the framework one would use in a university thesis?
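For illustration, a page with such a hierarchy might be structured like this (the topic and headings are invented, and the indentation is only to make the nesting visible):

```html
<h1>A Guide to Technical SEO</h1>
  <h2>Crawling and Indexing</h2>
    <h3>XML Sitemaps</h3>
    <h3>The robots.txt File</h3>
  <h2>Site Speed</h2>
    <h3>Measuring Load Times</h3>
```

One H1 for the page topic, with H2 and H3 tags breaking it into chapters and sub-sections, mirrors a thesis outline.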
Good content references sources from websites that are seen as experts, just as university students must reference work appropriately in their theses or essays.
The authority of your own content is determined by who is linking to your content and how many times this is the case. We will visit this crucial element later in this article but for now, it will suffice to understand that if many quality websites are linking to your articles it is a clear signal to Google that you are an authority on the subject.
How does one trust a website? One of the main yet basic signals is whether your website uses HTTPS or not. Search for any common keyword phrase, such as "buy a new car", and see how many sites on the first page of results are not HTTPS.
Do you have a LinkedIn profile that is linked to the site? LinkedIn profiles are a source of trust for many people and are indexed by Google. The idea is to indicate who is responsible for the content. Some SEOs have speculated that there is a sub-algorithm called AuthorRank.
Furthermore, Amit Singhal, on Google's official Webmaster blog, produced a set of questions that website owners or content production teams can use to work towards higher rankings. Many of these questions relate to the acronym EAT.
John Mueller from Google has come out and said that freshness is not a ranking factor. Our website data disagrees strongly, and John Mueller appears to contradict himself elsewhere, where he mentioned that Google crawls websites more often if they produce content more often. Why would Google do this if freshness were not a ranking factor?
From our own testing, we can say that freshness is, at the very least, a short-term ranking factor. You can experiment with your own website: put new content up, and for a few days your site as a whole will rank better than its current positions before falling back again. This only works when other on-page SEO factors are in place, of course, and in very competitive industries off-page factors too.
Content should be produced on a regular cycle: every Tuesday, or every second Wednesday, for example. In this way, you train, or at least indicate to, Googlebot when to crawl the site and at what frequency.
On-page SEO factors are what the world thinks SEOs do. On-page SEO means adding keywords to the correct title and heading tags, labelling images correctly with keywords, and ensuring a prominence and density of keywords in your copy (text). Although this is the simplest SEO factor, if Google cannot tell what your website is about in terms of keywords, your website will not rank for those search queries.
Once again, content length comes into play. The more you have to say about something, the more it seems that you know about the subject. Referring to the case study mentioned under EAT above, the amount of copy plays an important role.
Keyword density, the number of keyword occurrences in relation to the total amount of text, is seen today as less important than keyword prominence.
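As a rough sketch of the arithmetic behind keyword density (the copy and keyword below are invented for illustration, and this simple version only handles single-word keywords):

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

copy = "SEO drives traffic. Good SEO matches content to queries."
# "SEO" appears 2 times in 9 words, so the density is roughly 22%
print(round(keyword_density(copy, "SEO"), 1))
```

Tools like Screaming Frog and Yoast compute variations of this figure automatically; the point is only to see what the ratio means.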
Click-through rates are determined by measuring clicks, when people click on a link, against impressions, how many times your link is shown in the search results. RankBrain is Google's machine learning or AI algorithm. Its purpose is to learn from engagement metrics such as CTRs on the SERP (Search Engine Results Page) and the metrics already mentioned above, such as dwell time and bounce rates. Therefore, the more clicks you receive when your website's link is shown in the SERP, the more positively your ranking is affected.
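The CTR arithmetic itself is simple; a sketch with invented numbers (Search Console reports these figures for you):

```python
def ctr(clicks, impressions):
    """Click-through rate as a percentage of impressions."""
    return 100.0 * clicks / impressions if impressions else 0.0

# A result shown 2,000 times in the SERP and clicked 90 times has a 4.5% CTR
print(f"{ctr(90, 2000):.1f}%")
```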
We optimise for this scenario by ensuring that your page and site titles match the user's search query almost identically. As an example, some companies use Schema markup, structured data that search engines read, to display the company's star reviews, which then attract more clicks because the orange stars draw more attention among the many search results.
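Review stars are commonly marked up with Schema.org structured data in JSON-LD; a minimal sketch, with a hypothetical company name and figures:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example SEO Company",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "127"
  }
}
```

This block goes in a `<script type="application/ld+json">` tag in the page, and whether stars actually appear in the SERP remains at Google's discretion.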
Titles should be alluring and the meta descriptions should have a compelling value proposition too, inducing the user to click on your result in listings.
There are two aspects to the uniqueness of content.
Duplicate content has become almost mythical in its misunderstood proportions. Duplicate content is normal for most websites; what matters is the ratio of duplicate, common and unique content. The idea is to keep duplicate content to a minimum within your own site.
Duplicate content from another website, hosted on your own, however, is a seriously negative ranking factor and should be avoided. There are exceptions, such as quoting part of an article, but do not copy and paste large swathes of content from other websites.
Because Google has an index of the web, it is really easy for Google to detect where duplicate content comes from. To be safe, always publish content on your website first, before syndication. Google will then associate that content with your domain.
Unique content is what SEO companies focus on when designing a content strategy, and you should too. Unique content pays huge dividends. It addresses longer-tail search terms, which are less competed for, so you have a better chance of ranking. Longer-tail search queries are usually used by searchers deeper in their buying cycle. For a company just starting out with its SEO strategy, this is crucial advice if you can't seem to rank for shorter-tail queries.
In 2018, speed is most certainly a ranking factor. Speed is inextricably linked to Google's mobile-first index. On 26 March 2018, Google told the world that there would only be one version of its index, the mobile index. Combine this announcement with the widely accepted fact that the longer a website takes to load, the higher its bounce rate will be.
Think of the last time you searched for a gym or restaurant on local search: did you wait for the slower-loading pages? Of course not. We live in the age of instant gratification.
An SEO company hack is to always make sure that your website is hosted in the same country as your target audience; enterprise-sized companies often achieve this with multiple domains. If a company opts for a single domain, other platforms such as CDNs need to be used.
CDNs, or content delivery networks, are ideal platforms for large enterprise-size companies with a global reach. CDNs save a cached version of your site and serve it instantly from whichever of their server locations around the world is closest to the user accessing the site. This offers a consistently fast, high-performance user experience on a global scale.
Great free tools to test your site speed are:
PageSpeed Insights gives you Google's own recommendations. This is obviously an important checklist to work through, and the great news is that Google points out exactly what it recommends you fix.
GTmetrix is based in Canada, so take that into account when you test your site. Test a few local equivalent sites to use as a benchmark for comparison. GTmetrix gives you a good set of analyses and recommendations too.
Pingdom has more or less the same functionality as GTmetrix but offers various server locations from which to test your site.
As mentioned above, on 26 March 2018, Google announced that it would only have a mobile index. Mobile versions of websites are thus crucial and are an actual ranking factor.
Media queries are CSS (the styling language that controls a website's display) rules that adapt the layout and presentation of your website to various devices, from iPhones and Samsungs to desktops.
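A minimal sketch of a media query (the class names and breakpoint are invented for illustration):

```css
/* Default: a single-column layout for small screens */
.content { width: 100%; }

/* On screens 768px and wider, switch to a two-column layout */
@media (min-width: 768px) {
  .content { width: 70%; float: left; }
  .sidebar { width: 30%; float: left; }
}
```

The browser applies the rules inside the `@media` block only when the screen meets the stated condition, which is how one stylesheet serves both phones and desktops.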
A poor mobile user experience can influence other metrics, such as dwell time and bounce rates, which in turn detract from ranking too. Mobile searches surpassed desktop searches as long ago as 2015. Make sure that your site displays properly and engages your target audience efficiently.
Backlinks are links from other websites to yours. The more authority or popularity those websites have, the more their links act as votes of confidence for your website. Links are still one of Google's three primary ranking factors!
Internal links carry the same "vote of confidence" as mentioned above, yet are used within your own site. The pages with the most internal links pointing to them will be seen as the most important by Google. Internal links do not have as much ranking power as external backlinks, but they play a pivotal role in signalling to Google what you believe to be most important.
Internal links also play a crucial role in signalling to Google what your website is most authoritative about. You can think of it this way: internal links are how you tell Google what's important, and backlinks, or external links, are how other people signal to Google what's important.
If you have no backlinks, however, you will certainly not rank in competitive industries. The danger is that there are several penalties for link buying or artificial link building. Google wants natural, organic links.
The best possible way for a novice to build links without worry is to create valuable content that solves users' pain points or informational requests. Think of an amazing infographic that explains a difficult-to-grasp methodology, or 10 steps to solve a pain point. Sites will link to you naturally when your content supports their content and offers value.