– Top products
– Internal links
– Googlebot downloads the HTML file.
– Googlebot then downloads the JS and CSS files.
– The Google Web Rendering Service fetches data from the database, external APIs, and so on.
– The indexer can then index the content.
– Google can discover new links and add them to the crawling queue.
This process is far more involved than crawling a traditional HTML site, and several things can go wrong along the way. For example, it is common for search engines to be unable to discover any links on a page until the page has been rendered.
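To illustrate the link-discovery problem, here is a minimal sketch (the HTML and URLs are made up): a parser that does not execute JavaScript, like a non-rendering crawler, finds no links at all in a client-side rendered page.

```python
from html.parser import HTMLParser

# Hypothetical initial HTML of a client-side rendered page. The product
# link only appears after the script runs in a browser or rendering
# service, so a non-rendering crawler never sees it.
RAW_HTML = """
<html>
  <body>
    <div id="app"></div>
    <script>
      // Runs only after rendering:
      // document.getElementById('app').innerHTML =
      //   '<a href="/products/1">Product 1</a>';
    </script>
  </body>
</html>
"""

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags, like a non-rendering crawler."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

extractor = LinkExtractor()
extractor.feed(RAW_HTML)
print(extractor.links)  # [] -- no links until the page is rendered
```

Until the rendering step runs, there is simply nothing for the crawler to queue.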
Googlebot does not behave like a real browser
3. Crawl budget
Firstly, crawlability relates to whether Google can crawl your site: you need to ensure your site has the correct structure and that Google can discover all of your valuable resources. Next, renderability means that Google should be able to render all of your site. Finally, crawl budget refers to how much time it takes the search engine to crawl and render your site.
Can Google technically render your site?
When reviewing the screenshot, there are several things to look out for. First and foremost, make sure that the main content is visible. Then check that Google can see the other crucial elements of your page and can access areas like similar products and articles. You may also want to look at the HTML tab of the generated report to dig a bit deeper.
Is your content indexed in Google?
The more accurate option is to check Google Search Console. You may want to run the ‘site:’ command first to get some quick information, but since accuracy is what matters most, we would still recommend the Google Search Console option.
It is advisable to repeat this process for a random selection of URLs to see if Google has indexed your content correctly. Don’t merely stop after checking one page!
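For the quick ‘site:’ check, a small script can build the query URLs for a random sample of pages (the URL list below is made up; in practice you would pull it from your sitemap or crawl data):

```python
import random
from urllib.parse import quote

# Hypothetical URLs from your site -- replace with your own.
urls = [
    "https://example.com/products/red-shoes",
    "https://example.com/blog/javascript-seo",
    "https://example.com/category/sale",
]

def site_query(url: str) -> str:
    """Build a Google 'site:' search URL for a quick indexation check."""
    return "https://www.google.com/search?q=" + quote(f"site:{url}")

# Spot-check a random sample rather than stopping after one page.
for url in random.sample(urls, k=2):
    print(site_query(url))
```

Open each printed URL: if the page does not appear in the results, it is a candidate for a closer look in Google Search Console.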
Client-side rendering and server-side rendering
There’s a really simple way of understanding the concept. With server-side rendering, it’s like Google receiving a cake, iced and ready to enjoy. With client-side rendering, it’s like Google getting the recipe and ingredients that are needed to make the cake.
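The cake analogy can be sketched in a few lines of code (function names and markup are illustrative, not a real framework): the server-side version hands over finished HTML, while the client-side version hands over an empty shell plus the script that would build it.

```python
def server_side_render(product: str) -> str:
    # The "iced cake": the content is already in the HTML Google receives.
    return f"<html><body><h1>{product}</h1></body></html>"

def client_side_render() -> str:
    # The "recipe and ingredients": an empty shell plus a script reference.
    return ('<html><body><div id="app"></div>'
            '<script src="/app.js"></script></body></html>')

print("Red Shoes" in server_side_render("Red Shoes"))  # True
print("Red Shoes" in client_side_render())             # False
```

A crawler that does not execute JavaScript only ever sees what is in the string the server returns.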
Because of this, you need to include Facebook Open Graph markup and Twitter Cards in the initial HTML. If you don’t, your content will not be displayed properly when people share it on social media. This can seriously hurt your online marketing efforts, as social media users are much less likely to click on poorly presented content shared from your website.
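As a sketch of what "in the initial HTML" means in practice, the snippet below builds the social markup into the HTML string the server sends, rather than injecting it with JavaScript afterwards (the page title, description, and image URL are made up; the property and name keys are the standard Open Graph and Twitter Card ones):

```python
def social_meta(title: str, description: str, image: str) -> str:
    """Render Open Graph and Twitter Card meta tags as HTML."""
    # Open Graph tags use the property attribute...
    og = {"og:title": title, "og:description": description, "og:image": image}
    # ...while Twitter Card tags use the name attribute.
    tw = {"twitter:card": "summary_large_image", "twitter:title": title}
    lines = [f'<meta property="{k}" content="{v}">' for k, v in og.items()]
    lines += [f'<meta name="{k}" content="{v}">' for k, v in tw.items()]
    return "\n".join(lines)

# Build the tags into the initial HTML, server-side.
head = "<head>" + social_meta(
    "Red Shoes", "Our best-selling shoes", "https://example.com/shoes.png"
) + "</head>"
print('property="og:title"' in head)  # True
```

Social crawlers generally fetch the raw HTML and do not render JavaScript, so markup added client-side may never be seen.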
Using hashtags in URLs
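The core issue with hash-based URLs is that everything after the `#` is a fragment: it is handled entirely client-side and never sent to the server, so two URLs that differ only in their fragment point at the same server-side resource. A quick illustration with Python’s `urllib.parse` (the URLs are made up):

```python
from urllib.parse import urlparse

# Two "pages" that differ only in the part after the hash.
a = urlparse("https://example.com/products#red-shoes")
b = urlparse("https://example.com/products#blue-shoes")

print(a.fragment)         # red-shoes -- never sent to the server
print(a.path == b.path)   # True: the same server-side resource
```

If your routing relies on fragments, distinct views may collapse into a single URL from the search engine’s point of view.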
Pagination implemented incorrectly
Pagination is a popular way of fragmenting a large quantity of content. However, it is common for sites to implement it in a way that only lets Googlebot reach the first page of the pagination. This is something you need to watch out for when developing your website.
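A minimal sketch of the difference (the markup and URLs are hypothetical): Googlebot follows real `<a href>` links, but it does not click buttons wired up with JavaScript handlers, so the second variant leaves page 2 undiscoverable.

```python
import re

# Two hypothetical "Next" controls for the same paginated listing.
crawlable = '<a href="/products?page=2">Next</a>'
uncrawlable = '<button onclick="loadPage(2)">Next</button>'

def discoverable_links(html: str) -> list:
    # Crude href extraction, standing in for a crawler's link discovery.
    return re.findall(r'href="([^"]+)"', html)

print(discoverable_links(crawlable))    # ['/products?page=2']
print(discoverable_links(uncrawlable))  # [] -- page 2 is never queued
```

If only the first page exposes plain links, everything deeper in the pagination may go uncrawled.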
Blocking CSS and JS files for Googlebot
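You can check for this with Python’s standard `urllib.robotparser` module. The robots.txt rules below are a made-up example of the problem: if Googlebot cannot fetch your scripts and stylesheets, it cannot fully render the page.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks the script and stylesheet folders.
robots_txt = """
User-agent: Googlebot
Disallow: /js/
Disallow: /css/
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

print(parser.can_fetch("Googlebot", "https://example.com/js/app.js"))   # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

Running this against your real robots.txt for a handful of JS and CSS URLs is a quick way to confirm nothing critical to rendering is blocked.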