
SEO website auditing in 18 Practical Steps: How-to auditing guide (2021)

In this article, we will cover one of the most important processes in SEO: the website audit.

I am going to share my personal auditing steps and explain my thought process behind each of them.

Let’s get started.

Essential SEO website auditing steps

Step #1: Audit your website’s indexability

Your website has to be indexable by search engines; this is a must if you want to stay in the game of SEO. Even with the best website and the best content strategy, you can’t rank if your pages are not available for indexing, because you are not letting the engine put you on the results page at all.

Make sure to check for noindex meta tags and X-Robots-Tag headers that block search engine bots.
Also analyze for server errors (5xx) and forbidden (403) error pages.
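To make these checks concrete, here is a minimal Python sketch that inspects one URL for all three blockers. It is a sketch only, assuming the third-party requests and beautifulsoup4 packages; the URL is a placeholder:

    # Check one page for the indexability blockers described above.
    import requests
    from bs4 import BeautifulSoup

    url = "https://example.com/some-page/"
    response = requests.get(url, timeout=10)

    # Server errors (5xx) and forbidden pages (403) also prevent indexing.
    print("HTTP status:", response.status_code)

    # The X-Robots-Tag HTTP header can carry "noindex" outside the HTML.
    print("X-Robots-Tag header:", response.headers.get("X-Robots-Tag"))

    # A robots meta tag with "noindex" blocks indexing from inside the HTML.
    soup = BeautifulSoup(response.text, "html.parser")
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    print("Robots meta tag:", robots_meta.get("content") if robots_meta else "none")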

Step #2: Audit your website’s crawlability

When search engines come to your website, the process of going through all of its pages is known as crawling. If your website is not open for crawling, the engine will not be able to find your content.

The only way for search engines to put you on the results page is if they know your website exists.

You can’t get indexed and stay indexed if you don’t allow search engines to discover you and put you in their database (the index).

Allow them to crawl your site. Audit your robots.txt file and see whether you have blocked all bots or a specific bot.

If you see “Disallow: /” under “User-agent: *”, you have blocked everything.
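You can also test this programmatically with Python’s standard library; a small sketch (the URLs are placeholders):

    # Test whether specific bots may fetch a page under your robots.txt rules.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("https://example.com/robots.txt")
    parser.read()

    # Check a generic bot and Googlebot specifically.
    for agent in ("*", "Googlebot"):
        allowed = parser.can_fetch(agent, "https://example.com/some-page/")
        print(f"User-agent {agent} may fetch the page: {allowed}")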

Also check whether your internal links carry nofollow attributes. If you don’t let engines crawl you, your results will suffer for sure.

Also, check for password-protected pages. Search engines can’t open pages behind a password, so they can’t access them. If content is only reachable after a login or a button click, search engines can’t crawl it.

Step #3: Check if your website is mobile-friendly

In 2015, Google launched the Mobilegeddon update (not a separate product, just a change in the search algorithm). The update affects how your website shows up in the search results: if you have a mobile-friendly or responsive site, you will rank higher in mobile SERPs.

If you don’t have a responsive or mobile-friendly site, it will be less visible and will not rank as high as it could.

In other words, if search engines do not see the content in the mobile version, they won’t take it into consideration when deciding how to rank you.

Make sure that all the content is accessible to search engines on both mobile and desktop.

Google offers a Mobile-Friendly Test tool that checks whether your website is mobile-friendly. Run your website through it and see what Google thinks about it (and whether you really are mobile-friendly).
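The tool is the authoritative check, but as a rough first pass you can verify that your pages declare a viewport meta tag, which responsive pages normally do. A small Python sketch, assuming the third-party requests and beautifulsoup4 packages; the URL is a placeholder:

    # Rough heuristic: responsive pages normally declare a viewport meta tag.
    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://example.com/", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    viewport = soup.find("meta", attrs={"name": "viewport"})
    print("Viewport meta tag:", viewport.get("content") if viewport else "missing")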

Step #4: Test how many versions of the website are available

Sometimes developers forget to redirect all versions of a website to the main one. That causes duplicate content and lowers the quality of the whole website.

The things that you have to look for are:

  • www vs non-www – choose one and stick to it;
  • HTTP vs HTTPS – everything should redirect to a single protocol (preferably HTTPS);
  • / vs /index.html – /index.html should redirect to the main version;
  • slash vs non-slash – pick one main version (with or without a trailing slash) for the whole website;
  • uppercase vs lowercase – redirect uppercase URLs to lowercase to prevent duplicate content.
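To see where each version actually ends up, here is a quick Python sketch, assuming the third-party requests package; the domain is a placeholder. Ideally every variant resolves to one and the same canonical URL:

    # Request common URL variants and report where each one redirects.
    import requests

    variants = [
        "http://example.com/",
        "http://www.example.com/",
        "https://example.com/",
        "https://www.example.com/",
        "https://example.com/index.html",
    ]

    for variant in variants:
        response = requests.get(variant, timeout=10, allow_redirects=True)
        print(f"{variant} -> {response.url} ({response.status_code})")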

Step #5: Audit your website speed

If you have fixed everything so far and you think nothing stops your website from being present in the SERP (search engine results page), that’s good news! But one more thing can harm your indexation and crawlability: page speed. If your pages do not load in time, even though they are not blocked by any code, they might not show up in the SERP. A page that is too slow to respond to the crawl bots can be treated as missing or not crawlable.

As Google has explained, slow page load times frustrate users and hurt your business, and they often surface as crawl errors: when a URL times out, is blocked, or triggers unexpected redirects, the crawler may report an error instead of fetching the page.

Now that you know that, how do you optimize your website for higher speed?

Use Google’s PageSpeed Insights, GTmetrix, Pingdom, WebPageTest, or any other tool you like.

These tools help you figure out what the problem is so you can fix it. If you have lots of images and scripts, for example, make sure to compress them. You can also host the site on a VPS if your current server can’t handle all the requests, or if you simply want better speed than shared hosting offers.

The most frequently seen problems are:

Non-minified CSS and JavaScript – Minify them with build or online tools (or plugins if you use WordPress);

Bad server settings – Enable server-side caching and compression, add a CDN, or host some of the images on another server;

Invalid HTML and CSS – Validate your markup: browsers may not render invalid code as intended, and search engines can have trouble parsing it;

Too large image sizes – Optimize and compress your images (Google’s PageSpeed Insights is very helpful here);

Too many render-blocking resources – The browser cannot render the page until the blocking resources (scripts and stylesheets) are loaded. Reduce them by combining, deferring, or simplifying your website;

Long TTFB (Time to First Byte) – the time the server needs to start sending the first byte of the response to the client. Google recommends keeping TTFB under 1 second, and most of the time the server (hosting) is responsible for it. Contact your hosting provider and try to figure out why it is slow; a reliable host will help you out;

Gzip compression not enabled – Uncompressed responses are larger and slower to transfer, which also stretches the total response time. Enable gzip compression in your web server settings.
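To check TTFB and compression quickly, here is a minimal Python sketch, assuming the third-party requests package; the URL is a placeholder:

    # A rough TTFB and compression check, not a full speed test.
    # response.elapsed measures the time until the response headers arrive,
    # which approximates time to first byte.
    import requests

    response = requests.get("https://example.com/", timeout=10, stream=True)
    print("Approximate TTFB:", round(response.elapsed.total_seconds(), 3), "seconds")
    # "gzip" or "br" here means compression is enabled on the server.
    print("Content-Encoding:", response.headers.get("Content-Encoding"))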

Step #6: Audit the canonicalization of your website

Canonical links are links that point search engines to the original version of a piece of content when it is available at more than one URL. Search engines like Google, Yahoo, and Bing use canonical links to determine which URL is the original and which ones are duplicates.

Canonicalizing your website helps you avoid duplicate content issues, so make sure your pages carry canonical tags.

But there is more to it than just adding canonical tags. You have to use them in the right manner.

For example, if your home page carries a canonical link pointing to another page on your website, you will run into serious ranking problems, because the home page has the highest authority and is the main page of the site.

It also works the other way around. If a lot of pages declare the home page as their canonical URL, search engines receive conflicting signals about which URL stands for which content, and that can cause real problems.
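Here is a small Python sketch that extracts the canonical tag from a page so you can spot both patterns, assuming the third-party requests and beautifulsoup4 packages; the URL is a placeholder:

    # Extract and sanity-check the canonical tag of one page.
    import requests
    from bs4 import BeautifulSoup

    url = "https://example.com/some-page/"
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    canonical = soup.find("link", rel="canonical")
    if canonical is None:
        print("No canonical tag found")
    else:
        target = canonical.get("href")
        print("Canonical points to:", target)
        if target != url:
            # The page declares another URL as the original version.
            print("Note: this page canonicalizes to a different URL")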

Step #7: Fix broken links

Let’s finish the initial checks with the last thing that can limit crawling and indexing: broken links on the website. A bot uses your website’s HTML as a roadmap to the rest of your content, but when it runs into a 404 error (a bad link), it moves on to the next URL, and the page behind the broken link never gets crawled or indexed. This is probably the most common issue you will encounter.
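A minimal single-page link checker in Python looks like this, assuming the third-party requests and beautifulsoup4 packages; the start URL and domain are placeholders, and a real audit would crawl the whole site:

    # Collect the links on one page and report any that return 404.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    start_url = "https://example.com/"
    soup = BeautifulSoup(requests.get(start_url, timeout=10).text, "html.parser")

    links = {urljoin(start_url, a["href"]) for a in soup.find_all("a", href=True)}
    for link in sorted(links):
        if not link.startswith("https://example.com"):
            continue  # audit internal links only
        # Some servers reject HEAD requests; fall back to GET if needed.
        status = requests.head(link, timeout=10, allow_redirects=True).status_code
        if status == 404:
            print("Broken link:", link)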

Additional SEO website auditing steps

Step #8: Check your organic traffic

If you see pages that look well structured but do not rank or get organic traffic, you should audit them. One reason such pages fail to rank is that they are not actually structured in the correct manner.

Another reason could be that those pages have duplicate content issues and are being penalized for this.

It is also possible that those pages are no longer relevant in the eyes of search engines, or that they have been de-indexed and no longer show up in search results at all.

Step #9: Find and delete “Dead pages”

Brian Dean from Backlinko calls them “Zombie pages”. They are “dead” because they do not generate any traffic, yet “live” because they are still ready to be crawled and indexed. Make sure to exclude these dead pages from your live site: either delete them and redirect their URLs to relevant content, or add a “noindex, nofollow” meta tag to them.

In my opinion, you should delete them. There is no need to show the crawler that you have a lot of noindex, nofollow pages. It only decreases the quality of your website.

Step #10: Analyze your backlinks

Sometimes the only reason your website is not ranking is that you have no backlinks. Sometimes it is that you have toxic backlinks. Either way, you have to monitor and audit them! You can use Ahrefs’ free backlink checker.

Most websites do not distribute their backlinks properly: tons of backlinks point to the homepage, while the internal pages get almost none. That is bad backlink distribution.

Audit your internal pages. Do they all have zero domains pointing to them? Well, you know what to do: build some backlinks to them!

Step #11: Audit if your pages and content match the keyword intent

Make sure that your content is relevant to the user’s query. If Google decides that a certain keyword does not match your page’s topic, the page will rank low. So write content that matches the intent:

  • If they are searching for information, offer them a blog post.
  • If they are searching for a product, offer them a category.
  • If they are searching for a visual representation of an item/service, offer them an image/video.

You should have your keyword research and content strategy, including keyword mapping, ready before you analyze the intent of the pages you optimize.

Step #12: Get rid of duplicate content

A lot of people have duplicate content issues because they have duplicated their content across different pages, domains, and subdomains.

You should not do that. That will create a lot of problems for you in the future.

Google identifies possible duplicate content issues and demotes or filters out the affected pages.

To find duplicate content you can use Copyscape, Ahrefs, or even Google itself (search for an exact sentence from your page in quotes).

Step #13: Optimize your XML sitemap

An XML sitemap is a list of the URLs of the pages on your website, organized in a structured manner. If you are a WordPress user, install an XML sitemap plugin.

You should not have low-quality pages in your XML sitemap. Keep it to informative pages, product pages, and blog posts.

Your XML sitemap should not contain broken links. If you have pages that are no longer relevant, remove them from the sitemap.

Don’t forget to submit your XML sitemap to the search engines.
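A quick way to catch broken or removed URLs in the sitemap is to fetch it and test every entry; a Python sketch, assuming the third-party requests package; the sitemap URL is a placeholder:

    # Read an XML sitemap and flag URLs that no longer return 200.
    import requests
    import xml.etree.ElementTree as ET

    sitemap_url = "https://example.com/sitemap.xml"
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)

    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    for loc in root.findall(".//sm:loc", ns):
        status = requests.head(loc.text, timeout=10, allow_redirects=True).status_code
        if status != 200:
            print(f"{loc.text} returns {status} - consider removing it")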

Step #14: Audit your HTML sitemap (if you have one)

If you have a lot of pages and it is hard to manage them all, then add an HTML sitemap to the mix. The XML sitemap is for crawlers; the HTML sitemap is for your human visitors (and it gives crawlers extra paths to follow). You need them both.

First of all, create an HTML sitemap if you don’t have one. If you do, check whether it has any broken links; fix any you find and republish the sitemap.

Step #15: Check for manual actions or penalties

Go to Google Search Console and see if you have any manual actions or penalties.

If you have any, act on them. These are penalties Google applies for different reasons, and you have to fix the underlying issues if you want the penalty lifted and your rankings restored.

Bonus SEO website auditing steps

Bonus step #1: Check for structured data errors

Schema.org markup helps search engines better understand what your web pages are about. It lets you mark up your content with structured data, which can generate rich snippets in the search results.

So let’s say you have a blog post about pets on your website, but the schema markup is broken. The result will show up as a regular listing instead of a rich snippet. Make sure you implement schema correctly and that Googlebot has crawled your pages so the markup can take effect.
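One quick self-check is to pull the JSON-LD blocks from a page and make sure they parse, since broken JSON is a common reason rich snippets fail to appear. A Python sketch, assuming the third-party requests and beautifulsoup4 packages; the URL is a placeholder. For the authoritative verdict, use Google’s Rich Results Test.

    # Extract JSON-LD structured data from a page and check that it parses.
    import json
    import requests
    from bs4 import BeautifulSoup

    url = "https://example.com/blog/pets/"
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    for script in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(script.string or "")
        except json.JSONDecodeError as error:
            print("Invalid JSON-LD block:", error)
            continue
        # @type names the schema (Article, Product, ...); a block can also be a list.
        print("Parsed JSON-LD:", data.get("@type") if isinstance(data, dict) else data)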

Bonus step #2: Check your hreflang settings if you have a multilanguage version

A hreflang tag gives search engines a hint about the language and regional targeting of a page so they can serve the right version in the search results.

If you have errors in your hreflang settings, a page might not appear in the search results for a given language even if it is relevant to the query.

For example, suppose your website has English and German versions. If there is an error in a hreflang tag on the German version, that version might not rank for German queries even if it is relevant to them.

This step is very important as it can help search engines choose the correct version of your website in case there are different versions.
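Here is a small Python sketch that lists the hreflang annotations on a page so you can verify them by eye, assuming the third-party requests and beautifulsoup4 packages; the URL is a placeholder. Remember that hreflang must be reciprocal: every alternate version has to link back.

    # List the hreflang annotations declared on one page.
    import requests
    from bs4 import BeautifulSoup

    url = "https://example.com/de/"
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    for link in soup.find_all("link", rel="alternate", hreflang=True):
        print(f"hreflang={link['hreflang']} -> {link.get('href')}")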

Bonus step #3: Try to fix broken links TO your website

If you have broken backlinks, you might be missing out on a lot of link juice. Make sure to audit them; you can find broken backlinks with a backlink checker tool.

You can also reach out to those sites and ask them to fix the link. It is a great way to win back backlinks. If you can’t get them fixed, redirect the broken URLs on your site to relevant live pages so the link equity isn’t wasted.

Conclusion

I hope this detailed list of steps helps you with your future audits and makes your job much easier. It took me a long time to understand the importance of each of these steps. I will keep improving this article, so stay tuned for future updates.

See you in the next article!
