Crawlability refers to how effectively Google's spiders can crawl and browse your site for indexing.
Poor crawlability can cost bigger brands dearly. For personal niche blogs, it causes a downward trend in search traffic, or leaves many of your pages unindexed.
Why is that?
You are not allowing search spiders to crawl your entire site effectively.
Here are some tips that actually work.
Tips to boost crawlability of your site
The tips below help you make sure that Google crawls every part of your site as effectively as possible.
Build backlinks to internal pages
Building backlinks to the homepage strengthens SEO! This is a huge misconception.
Building tonnes of backlinks only to your homepage increases the page authority (ranking power) of your homepage, and ONLY your homepage.
Linking to internal pages increases domain authority (the ranking power of the whole site).
So build backlinks to the deep internal pages of your site and to your old blog posts.
Here are the benefits of linking to internal pages.
- As you have a link to the homepage in the header of every page, it will also pass some juice to the homepage.
- Your old content gets some juice and exposure.
- It helps in maintaining the rankings of your old blog posts.
- It passes juice to untapped content on your blog.
- Increases domain authority as well as the page authority of that internal page.
Internally link related content
Internally link your related blog posts.
It helps spread ranking power uniformly among all the pages on your blog, which in turn lifts the domain authority (DA) of the entire site.
A handful of internal links can sometimes act like an inbound backlink to a page.
Pages that are not linked from any other page may never reach their full potential in search engines.
Interlinking helps Google better understand your site's structure and the relationships between pieces of content on your blog. That, in turn, boosts the crawlability of your site.
To help Google understand the structure even better, use LSI (related) keywords as the anchor text of your internal links. It also helps your posts rank for some lateral search terms.
Do not use AJAX and flash menus
Including AJAX- or Flash-based elements on your site reduces its crawlability. Google can read some AJAX and Flash content, but it does not handle that sort of content well, and it causes crawlability issues with other search engines.
Avoid navigational menus that are AJAX- or Flash-heavy. Craft menus using only HTML and CSS (CSS3 offers much richer features).
Link to your pillar articles in the navigational menu. Pillar content normally contains internal links to dozens of your blog posts, so linking to pillar articles passes the homepage's ranking juice to your pillar content, and from there to all the linked blog posts.
Along with these, include an About page and a Contact page.
Create an HTML sitemap if you do not have one
Sitemaps help Google crawl your content effectively. They mainly help Google classify the content on your site, whether by category, tags, or anything else. You should include a sitemap on your site.
There are two sorts of sitemaps: XML and HTML. HTML sitemaps are readable by both Google and users, so one HTML sitemap serves both purposes.
I am using the XML sitemap feature on my blog; it comes bundled with the WordPress SEO (Yoast) plugin.
But after watching the Matt Cutts video above, I am thinking of building an HTML sitemap for my site.
After creating a sitemap, make sure you submit it in Google Webmaster Tools.
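For reference, an XML sitemap is just a plain list of URL entries. Here is a minimal example following the sitemaps.org protocol (the example.com URLs and the date are placeholders for your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2016-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/my-first-post/</loc>
  </url>
</urlset>
```

Plugins like WordPress SEO generate this file for you automatically, so you rarely need to write it by hand.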
Increase page speed
Google and other major search engines treat how long a site takes to load fully as a ranking factor.
Sites that load faster have an edge in rankings over their competitors.
I have already posted about this. You should prioritize high page speed.
Google allocates only a limited amount of time to crawling your site (the crawl budget). To let Google crawl more pages within that budget, you have to improve your page speed.
High page speed is also a signal of great user experience. People love fast-loading pages; they want results quickly, and the attention span of web users is dropping rapidly.
So if you want to offer a great user experience, increasing page speed is a must.
Take a note of robots.txt
Robots.txt is a file that resides in the root directory of your site. It tells search engines what to crawl and what not to crawl.
Your robots.txt should be something like:

User-agent: *
Disallow: /wp-admin/

But not like:

User-agent: *
Disallow: /

The latter tells search engines not to crawl your entire site! Remember, you should disallow only the admin pages of your site.
If you are facing any indexing issues on your site, then do check your robots.txt file once.
You can edit it with the help of the WP Robots plugin for WordPress. The feature also comes bundled with the WordPress SEO plugin, under the "Edit files" section.
If you want to generate a robots.txt from scratch, then use this tool.
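You can also verify what your robots.txt rules actually allow, before a search spider ever sees them. Here is a small sketch using Python's built-in `urllib.robotparser`; the `example.com` URLs are placeholders for your own domain:

```python
from urllib.robotparser import RobotFileParser

rules = RobotFileParser()
# parse() lets us test rules locally; against a live site you would
# instead call rules.set_url("https://example.com/robots.txt") and rules.read().
rules.parse([
    "User-agent: *",
    "Disallow: /wp-admin/",
])

# The admin area is blocked, while a normal post stays crawlable.
print(rules.can_fetch("*", "https://example.com/wp-admin/"))  # False
print(rules.can_fetch("*", "https://example.com/my-post/"))   # True
```

If `can_fetch` unexpectedly returns False for a post URL, your robots.txt is the reason that page is not being indexed.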
Avoid 404 error pages, internal server errors
A 404 error means that the requested page cannot be found on the server. It is usually caused by deleted blog posts or changed permalinks.
You can keep an eye on these broken links using certain WordPress plugins, but be warned: broken-link-monitoring plugins are resource hogs, and in shared hosting environments they lead to lots of problems.
Instead, make use of online tools like Broken Link Check.
Also make sure your hosting provider keeps your site's uptime as close to 100% as possible. If you use a service like Cloudflare, make sure your blog is not returning error 522.
Broken pages can make search spiders stop crawling your site, and they break your internal linking strategy.
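If you prefer to check for broken links yourself, here is a minimal sketch using only Python's standard library. It extracts the links from a page's HTML and looks up each one's HTTP status; `https://example.com` is a placeholder, and a real crawler would need rate limiting and robots.txt handling on top of this:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html, base_url):
    """Return the absolute URL of every link found in the page."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]

def status_of(url):
    """Return the HTTP status code for a URL (404 means a broken page)."""
    try:
        with urlopen(url, timeout=10) as response:
            return response.status
    except HTTPError as err:
        return err.code          # 404, 500, etc.
    except URLError:
        return None              # DNS failure, timeout, etc.

# Example: pull the links out of a snippet of homepage HTML.
page = '<a href="/about/">About</a> <a href="https://other.com/">Friend</a>'
print(extract_links(page, "https://example.com"))
```

To use it, fetch your homepage HTML, run `extract_links` on it, then call `status_of` on each link and flag anything that returns 404 or 5xx.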
Over to you
Increasing the crawlability of your site makes sure that Google indexes every page of it. And indexing is the first step towards gaining search rankings, right?
What are you waiting for? Go ahead, speed up your site. Internally link your blog posts. Fix broken links.