What Is a Technical SEO Audit? A Technical SEO Checklist and How to Carry Out an Audit
An SEO audit is a comprehensive check of the site parameters that are crucial for promotion.
Regular audits are a must if you have a resource that you promote on search engines, as they will enable you to avoid problems with crawling, indexing, and the ranking of pages.
An audit is no less important at the very beginning of working with a website. If you are going to promote a new resource, first check all of its parameters to make sure they are correct and meet the requirements of search engines.
A correctly performed audit provides a lot of useful information. It helps you understand which problems on the site need to be solved for successful promotion in search engines.
What Is Technical SEO?
As we have already mentioned, SEO consists of on-page, off-page, and technical SEO.
For quality SEO promotion, it is necessary to carry out audits of all three types. In this article, we are going to talk about technical SEO, which is the identification and elimination of technical faults that prevent a site from displaying properly for users and search engines.
Good content and quality backlinks will not place a site at the top of the SERP if it has technical mistakes. That is why every SEO specialist has to keep an eye out for such problems on the resource and solve them in a timely manner.
Why Draft a Checklist and How to Use It
Understanding the definition of technical SEO is not enough to work productively in this direction: specialists can miss important aspects of the site audit.
That is why it is necessary to draft a checklist that contains all the steps of the check. That way, nothing will be omitted, and you will be able to spot all the mistakes that need to be corrected at once.
You can start with a basic SEO checklist that contains the most important elements of technical SEO and expand it later at your discretion.
We advise drafting the checklist in Google Sheets as a convenient, easy-to-read table that lists all the site checkpoints. You can also add columns for the status of each checkpoint, the problems that have been identified, and the ways to fix them.
A simple table like this will remind you to check all the necessary parameters and fix the mistakes. But the checklist will only be effective if you perform audits regularly and do not neglect them.
What a Technical SEO Audit Checklist Contains
Fixing technical problems is not complicated, but you need to clearly understand what to pay attention to. The checklist below covers some of the most important elements of technical SEO.
1. Complete Indexing of the Site Pages
The audit can start with checking whether all the pages are present in the search engine's index. To do this, enter the command site:www.apple.com in the Google search bar, and you will see all the pages of the Apple site that are in Google's index.
In a perfect scenario, the number of pages in the SERP coincides with the actual number of pages on the site. If it does not, it is necessary to identify the reason and understand why some pages are not being indexed by the search engine.
2. The Robots.txt File
Robots.txt is a file located in the root directory of your site. It contains information about which pages search robots should index. The directives in this file describe access to various sections of the resource; this set of rules is called the robots exclusion standard.
To put it simply, the robots.txt file tells search engines which pages or files on your site can or cannot be processed. With its help, it is possible to set separate access rules for mobile devices and PCs.
This file has to be composed correctly so that it blocks only the pages you do not want indexed.
When you adjust robots.txt, pay attention to the following:
- Restricted and unnecessary pages are closed from indexation.
- The relevant User-Agent is set for each search robot.
- The main site mirror is indicated.
- Pages with dynamic data are closed from indexation.
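To illustrate the points above, a minimal robots.txt might look like this (the paths and sitemap URL here are hypothetical placeholders, not recommendations for a real site):

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /search?

# Separate rules for a specific crawler
User-agent: Googlebot-Image
Disallow: /private-images/

Sitemap: https://example.com/sitemap.xml
```

Each User-agent block applies to the named crawler, and Disallow lines close the listed paths from crawling.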
To find out more about robots.txt and how to set it up correctly, see the recommendations from Google, where the peculiarities of working with this file are described.
Here, you will find a guide on how to check the robots.txt file on your site; note that this function is available only to verified owners of the resource.
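If you prefer to test rules programmatically, Python's standard library includes a robots.txt parser. Below is a minimal sketch; the rules and URLs are illustrative, not taken from a real site:

```python
from urllib import robotparser

# parse() accepts the lines of a robots.txt file, which is handy
# for testing rules locally before deploying them to the server.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
]

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Check whether a given crawler may fetch a given URL
print(parser.can_fetch("*", "https://example.com/blog/post"))   # True: allowed
print(parser.can_fetch("*", "https://example.com/admin/login")) # False: disallowed
```

The same parser can load a live file with set_url() and read() if you want to verify the deployed version.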
3. Setting Up the Sitemap.xml File
The sitemap file contains information about how content is organized on the site. Thanks to it, search engines can index the pages of the resource more accurately.
Google provides recommendations on how to create sitemap.xml files, and they should not be ignored if you want your site to meet search engine requirements.
Here are the situations when the site would need this file:
- The site has a lot of pages.
- The resource contains an archive of pages that are not interlinked.
- The site was created recently and has few backlinks.
- There is a large number of multimedia and news content on the site.
Remember that the sitemap.xml file has to be kept up to date and must not contain mistakes.
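For reference, a minimal sitemap.xml has the following structure (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/products/</loc>
    <lastmod>2021-01-10</lastmod>
  </url>
</urlset>
```

Each url entry lists one page in loc; the optional lastmod date helps search engines decide when to recrawl it.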
4. The Correct Setting of Canonical Tags
Sometimes, site owners accidentally create different pages with identical or very similar content. The canonical tag helps prevent this problem from harming the resource, as it indicates to the search engine which page is preferable for indexing.
This tag should refer to a page that has no redirects and is indexed, and the URL has to be written in full, absolute form.
It is possible to check canonical with the help of the Screaming Frog in the relevant tab of the report.
Here, it is possible to see which URL is indicated as canonical for every page of your resource.
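As an illustration, a canonical tag placed in the head section of a duplicate page could look like this (the URL is a placeholder); note that it is written in full, absolute form:

```html
<link rel="canonical" href="https://example.com/shoes/red-sneakers/">
```

Every duplicate or near-duplicate version of the page points to this one preferred URL.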
5. Correct Setting of 301 Redirects
A 301 redirect informs the search engine that a page has been moved permanently; it is used when pages are deleted or when a URL is changed.
The page must be redirected to its final destination, avoiding chains of redirects.
For example, page A has a 301 redirect to page B, while page B redirects to page C.
This is what a chain of redirects looks like. In this case, page A must be redirected directly to page C, and page B should also be redirected to page C.
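On an Apache server, for instance, the fix for such a chain could be sketched in .htaccess like this (the paths are illustrative, and the exact rules depend on your server setup):

```apache
# Send both old URLs straight to the final destination,
# so no visit ever passes through more than one redirect
Redirect 301 /page-a /page-c
Redirect 301 /page-b /page-c
```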
It is possible to check your redirects in the Screaming Frog report in the Response Codes tab.
This function will help you see which pages are redirected and where to. For convenience, you can always export the report from the program and check the data in a table.
6. Crawling Errors
A crawling error means that Google has experienced difficulties while checking the pages of your site. This can negatively influence the indexing and ranking of the resource pages, which is destructive for SEO promotion.
These errors can easily be found in the Google Search Console in the Coverage report.
If you have noticed that the web crawler experiences problems accessing any of your resource pages, it is necessary to fix these errors.
Another convenient method is to scan the site with Screaming Frog; during the check, it will show the server's response for every page of your resource.
If a page is closed in robots.txt, the report will show this in the page's status.
Make sure that all the unnecessary pages are closed for crawling and all the necessary ones are open.
7. How Google Views Pages and How to Get Google to Crawl Your Site Faster
Sometimes the search bot cannot view all the information on a site page. In this case, it won't be able to crawl the page correctly and, thus, rank it.
This can be checked by entering the URL in the upper search bar of the Google Search Console. By doing this, you will get a report on how Google views this page.
By clicking the View crawled page button, you can see the HTML code of the page, as well as scripts that have not been loaded or which are erroneous. The Screenshot function can display the visual content that the search system crawls.
With the help of the Live test, it is possible to see the actual information about the page.
8. Make Sure that the Site is Optimized for Mobile Devices
Recently, Google launched the Mobile-First Index. This means that Google predominantly uses the mobile version of the content for indexing and ranking. Therefore, if a site is not optimized for mobile devices, its rankings will worsen. Read how to optimize your site for mobile devices.
Sites can be checked for mobile optimization with the help of Google's Mobile-Friendly Test.
Also, note that Google recommends using AMP versions of pages for mobile devices.
AMP is a special technology that makes it possible to speed up mobile pages.
AMP pages use certain special tags whose functionality and number are strictly limited. Google finds these tags and caches the information in them. Then, when a user searches for something, the browser loads the information from the Google CDN in a special iframe, and by following the link, the user opens the previously loaded page in a special window.
You can read more detailed information about this in Google’s recommendations.
9. Checking the Site Loading Speed
A low loading speed influences the site's ranking in the search engine. If a resource loads slowly, you can lose a lot of visitors.
Faster loading leads to higher conversion rates and lower bounce rates, which is why it is necessary to regularly check the site loading speed with the help of PageSpeed Insights.
Below is a report for the Apple site as an example. As we can see, the resource has a problem with its loading speed on mobile devices.
It also has insignificant problems with the loading of the desktop version.
Taking the importance of the Mobile-First Index into account, webmasters of the Apple site should fix the problem of the mobile version’s loading speed. However, the Apple resource would remain popular even without SEO optimization.
But if you discover a similar problem on your site, it is necessary to fix it immediately. PageSpeed Insights will tell you which parameters need to be fixed; just scroll down the report and read the recommendations on how to fix the problems.
10. Content Duplicates
Duplicate content means that the site has several pages with the same or similar content. When search engines run into duplicate content, they do not understand which version of the page should be indexed and ranked.
Duplicate pages are also among the most common reasons why crawl budget is wasted.
There are several ways to solve and prevent duplicate content:
- Use the canonical tag to indicate to Google which version of the page is preferable.
- Use the 301 redirect to redirect pages with similar content to a preferable page.
- Make sure that you use either the version with www or the one without www, but not both. Otherwise, a duplicate of the whole site is created.
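For example, on Apache the non-www version can be made the single host with rules like these (a sketch, assuming mod_rewrite is enabled and example.com stands in for your domain):

```apache
RewriteEngine On
# Permanently redirect every www request to the bare domain
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```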
11. Make Sure That Your Site Uses the HTTPS Protocol
Safety is a top priority for Google, which is why the search engine has taken steps to create a safer web for users and recommends that webmasters use HTTPS encryption. Google boosts the ratings of sites using HTTPS and marks HTTP pages as not secure to warn users.
Make sure that your site is safe by installing a valid SSL certificate. More information on the use of HTTPS protocol can be found in the Google report.
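Once the certificate is installed, all HTTP traffic should be sent to the HTTPS version. On Apache, a sketch could look like this (assuming mod_rewrite is enabled):

```apache
RewriteEngine On
# If the request did not arrive over HTTPS, redirect it there permanently
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```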
12. Language Markup
If your site has several language versions, it is necessary to write correct hreflang attributes. This attribute informs the search engine which language is used on a certain page, so it can serve that page to users who search in that language.
For example, Apple’s home page contains a lot of language versions, which is why all of them are indicated in the page code.
It is possible to manually check the availability of hreflang on the site in the code of pages or using the Screaming Frog in the relevant tab of the report.
By scrolling to the right in the report, you will see all the hreflang attributes present on a certain page.
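As an illustration, the hreflang annotations for two language versions of a page could look like this (the domain and paths are placeholders); each version of the page should list all alternates, including itself:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/us/">
<link rel="alternate" hreflang="fr-fr" href="https://example.com/fr/">
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```

The x-default entry tells the search engine which version to show users whose language does not match any listed alternate.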
13. Fixing Broken Links
Non-working links can be harmful to your promotion, which is why it is necessary to find and eliminate them in a timely manner. One way to do this is with DrLinkCheck.com, a free tool that scans the site and detects broken links.
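If you want to sketch your own check, the first step, collecting all the links on a page, can be done with Python's standard library (the HTML below is illustrative). Each collected URL would then be requested, and any 4xx or 5xx response flagged as a broken link:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag it encounters."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

collector = LinkCollector()
collector.feed('<p><a href="/about">About</a> and <a href="https://example.com/x">X</a></p>')
print(collector.links)  # ['/about', 'https://example.com/x']
```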
Conclusions: Is it Necessary to Carry out Audits and Draft Checklists?
Technical SEO is an integral part of any SEO strategy, which is why the site needs to meet all the recommendations of search engines.
Even if you are not a technical marketer, you should always know the state of your resource and the areas that can be improved. Just draft a checklist for every type of audit (on-page, off-page, and technical) and check whether your site meets the necessary criteria. This will enable you to fix, in a timely manner, the errors that hinder successful site optimization.