Technical SEO Checklist – SEO Audit

Technical SEO is an integral part of any SEO strategy, which is why a site needs to meet the recommendations of search engines.
The technical SEO audit checklist covers multiple aspects of a website's content and structure that search engines analyze and rank. Let's briefly discuss what technical SEO is and compile an SEO audit guide.

Completeness of Site Page Indexing

Start the audit by checking that all pages are present in the search engine's index. To do this, enter site:www.domain.com in the Google search bar; the results will show all pages of your site that are currently indexed.

Crawling Errors

A crawling error means that Google has run into difficulties while checking the pages of your site. This can negatively affect the indexing and ranking of those pages, which is damaging for SEO.
These errors can easily be found in Google Search Console, in the Coverage report.

Setting the Primary Website Address

Of all the addresses at which your site can be reached, only one should be available. Select the main mirror of the site and set up 301 redirects to it from all other addresses; most experts consider this one of the ranking factors. Check and decide on the following (a redirect sketch follows the list):
  • The main version of the site in Google Search Console
  • Duplicate pages with and without www, and on HTTP vs. HTTPS
  • Directory index files (e.g., /index.html duplicates of a directory URL)
  • Duplicate pages with and without trailing slashes
  • URL accessibility in different letter cases
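As a minimal sketch, assuming an Apache server with mod_rewrite and https://www.example.com as the chosen main mirror (both are placeholders), the redirect could look like this:

  # Send every HTTP or non-www request to the main mirror with a 301
  RewriteEngine On
  RewriteCond %{HTTPS} off [OR]
  RewriteCond %{HTTP_HOST} !^www\. [NC]
  RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]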

Checking Robots.txt

Sometimes, indexing issues are related to the robots.txt file, which tells Google which pages it may crawl. If there are errors in this file, Googlebot may misinterpret your instructions. Google's documentation describes the robots.txt format in detail.
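For reference, a minimal robots.txt might look like this (the disallowed paths are placeholders for sections you do not want crawled):

  User-agent: *
  Disallow: /admin/
  Disallow: /search/
  Sitemap: https://www.example.com/sitemap.xml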

Pages with Noindex

General recommendations for the noindex tag
There are two basic ways to hide a page from indexing:
  • Close the page in robots.txt with Disallow
  • Add a meta tag to the page's <head>: <meta name="robots" content="noindex, nofollow" />
The Main Differences:

In robots.txt, you can close not only a single page but also a folder, a file type, the site's service pages, site search results, and so on. That is, you can work in bulk with groups of pages.
<meta name="robots" content="noindex, follow" /> lets you exclude pages individually.

If you need to close a specific page, it is better to use the meta tag so as not to overload robots.txt with extra lines. The rule is also more likely to be respected (compared to robots.txt).
Remember that robots.txt directives are just recommendations; in other words, search engines can ignore them and still index disallowed URLs, for example when other sites link to them. Therefore, if you want to hide a URL reliably, do it through the meta tag. And if you need a hard guarantee, you can, for example, protect the directory with a password.

So, if you do not want a page indexed, use noindex in the robots meta tag. To do this, add the following meta tag to the page's <head> section: <meta name="robots" content="noindex, nofollow" />. Note that the page must remain crawlable (not blocked in robots.txt); otherwise, the crawler will never see the tag.
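For example, the tag placed in a hypothetical page's markup:

  <!DOCTYPE html>
  <html>
  <head>
    <meta name="robots" content="noindex, nofollow" />
    <title>Internal search results</title>
  </head>
  <body>…</body>
  </html>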

Sitemap Verification

XML sitemaps describe the website's structure and facilitate search engine indexing. This is especially true for large websites with many pages, where a large amount of content is updated dynamically. Using Google Search Console, you can check the site's XML files and make the necessary changes according to the Sitemap report.

The Sitemap protocol format consists of XML tags. All data values in the Sitemap must be entity-escaped, and the file must use UTF-8 encoding.

Sitemap Rules:

Begin the file with the opening <urlset> tag and end it with the closing </urlset> tag. Specify the namespace (protocol standard) in the <urlset> tag. Include a <url> entry for each URL as the parent XML tag, with a child <loc> tag inside each parent <url> tag. A minimal example follows below.
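Putting these rules together, a minimal sitemap with a single entry might look like this (the URL and date are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/</loc>
      <lastmod>2024-01-15</lastmod>
      <changefreq>daily</changefreq>
      <priority>1.0</priority>
    </url>
  </urlset>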

Pages meeting the following conditions should not be included in the sitemap:
  • Disallowed in robots.txt
  • Tagged with <meta name="robots" content="noindex" />
  • Pages returning 4xx or 3xx status codes
Set priorities:
  • For the main page: <priority>1.0</priority>
  • For landing pages: <priority>0.9</priority>
  • For technical pages: <priority>0.8</priority>
Set the probable change frequency:
  • For the main page: <changefreq>daily</changefreq>
  • For landing pages: <changefreq>weekly</changefreq>
  • For technical pages: <changefreq>monthly</changefreq>
Recommendations for tags in the sitemap:
  • Every URL should have a <lastmod> tag indicating the date the document was last edited.
  • The <urlset>, <url>, and <loc> tags are required.
  • A sitemap should not contain more than 50,000 URLs, and its uncompressed size should not exceed 50 MB. If a sitemap is too large, break it into several parts: sitemap1.xml, sitemap2.xml, and so on.
  • All URLs must use consistent syntax.
  • Do not include session identifiers in URLs.
  • The sitemap must declare the XML namespace xmlns="http://www.sitemaps.org/schemas/sitemap/0.9".
  • A sitemap can only describe pages of the domain on which it is located; pages of subdomains or other domains cannot be described.
  • When the file is requested, the server should return a 200 response code.
  • Before uploading the file to the site, it is recommended to validate it with a Sitemap file validator.
  • The sitemap should update automatically when pages are added to or removed from the site.
Technical SEO does not include:
  1. Content optimization.
  2. Website semantics.
  3. Link profiling and building.
Read more in the article on types of SEO.

Multilingual Settings - Hreflang and Alternate

If your site has several language versions, you must set correct hreflang attributes. The attribute indicates which language is used on a given page, so the search engine can show that page to users searching in that language.
You can check the presence of hreflang manually in the page code or with Screaming Frog, in the relevant tab of its report.
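As a sketch, assuming English and German versions at placeholder URLs, the tags in each page's <head> might look like this (x-default marks the fallback version):

  <link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
  <link rel="alternate" hreflang="de" href="https://www.example.com/de/" />
  <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />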

The Correct Setting of Canonical Tags

The canonical tag helps prevent duplicate content from harming the site: it tells the search engine which version of a page is preferred for indexing. The tag should point to a page that is indexed and involves no redirects, and the URL must be absolute, as in the placeholder example below.

<link rel="canonical" href="https://www.example.com/preferred-page/" />

You can check canonical tags with Screaming Frog, in the relevant tab of its report.

Read how to implement this in the article: https://www.seoquake.com/blog/canonical-tag/

Correct Setting of 301 Redirects

A 301 redirect tells the search engine that a page has moved permanently. It is used when pages are deleted or when a URL changes. A page must redirect to its final destination, avoiding chains of redirects.
For example, avoid a setup in which page A 301-redirects to page B while page B redirects on to page C; page A should point directly to page C.
You can check your redirects in the Screaming Frog report, in the Response Codes tab.
Read how to implement this in the article: https://www.seoquake.com/blog/301-vs-302-redirects/
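As a sketch, assuming an Apache server and placeholder URLs, a single permanent redirect that points straight to the final destination might look like this:

  # One hop only: the old URL goes directly to the final page
  Redirect 301 /old-page/ https://www.example.com/new-page/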

Checking Your Website URL and Pages

Make sure the URLs of the site and its pages are relevant to their subject and content and will be understandable to search engines and users. It is crucial to address this at the very beginning, ideally before the site launches, since changing URLs later can cause recognition problems for search engines. When creating a URL, use keywords and keep names concise.

Checking for Duplicate Pages

Duplicate pages may appear for various reasons, including:
  • Errors in the domain settings.
  • Printable versions of pages left available for indexing.
  • Duplicated product catalog pages.
  • Use of session identifiers.
Duplicate pages are especially characteristic of the e-commerce segment. To detect them, check the indexed pages with the query site:mysite.com, using Google Search Console and a crawler. After finding duplicates, get rid of them in robots.txt using the disallow directive, as in the sketch below.
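For illustration, hypothetical robots.txt rules that close printable versions and session-ID duplicates (the path and parameter name are placeholders; Google supports the * wildcard):

  User-agent: *
  Disallow: /print/
  Disallow: /*?sessionid=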

Broken Links (404)

Links are divided into internal and external: the former lead to pages within the website, the latter to third-party resources. All links should work, so during the technical site audit, check every link and make sure it resolves.

You may find broken links for the following reasons:
  1. The resource the link leads to has changed its domain or URL.
  2. There are problems with the server.
  3. The linked document has been deleted.
In these cases, configure a redirect to an existing page or a relevant website, or delete the link if the resource no longer works.
“The Best Place to Hide a Dead Body Is Page Two of Google.” – Unknown

Server Errors (5xx)

HTTP Status Code 500 - Internal Server Error
Unlike codes that signal missing or not-found pages, this status code indicates a problem with the server itself. 500 is the classic server error, and it blocks access to your site. Both people and bots are turned away, and your link equity goes nowhere. Search engines prefer well-maintained sites, so examine these status codes and fix them as soon as you come across them.
HTTP Status Code 503 - Service Unavailable
A 503 response means the server is unavailable: everyone, human or bot, is asked to return later. The cause may be temporary server overload or maintenance. The 503 status code lets search engines know to come back soon, because the page or site will be unavailable only briefly. An example response is shown below.
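For example, a maintenance response can include the standard Retry-After header so crawlers know when to come back (the value is in seconds and is a placeholder):

  HTTP/1.1 503 Service Unavailable
  Retry-After: 3600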

Designing 404 Pages

To minimize the loss of traffic, a 404 error page must be formatted correctly:
  • Ideally, the design of the 404 page matches that of the rest of the site.
  • A correct 404 page contains a link to the home page and the menu so the visitor can keep working with the site. Additionally, you can add a search bar.
  • Tell visitors briefly why they are seeing the 404 error page.

Links to External Resources

Outbound links can be subdivided into two types: nofollow and dofollow. Dofollow links pass ranking power from your site to the linked page, while nofollow links do not; they tell search engines that the link should not be followed.
By default, all links are dofollow. To make a link nofollow, add the following attribute to its code: rel="nofollow"
We recommend applying the rel="nofollow" attribute to all outbound links, as in the example below.
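For example, a hypothetical outbound link with the attribute applied:

  <a href="https://external-site.example/" rel="nofollow">Partner resource</a>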

Check Page Titles

This is one of the most important internal ranking factors. The main purpose of the title is to give the user a brief description of the page (see the example after this list).
  • The title must be unique for each individual page.
  • It should differ from the level 1 heading (h1).
  • Recommended length: 35 to 70 characters.
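For example, a hypothetical title within the recommended length (the brand name is a placeholder):

  <title>Technical SEO Audit Checklist and Guide | ExampleBrand</title>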

Check Meta Description

For websites with a large number of pages, especially e-commerce websites, duplicated meta descriptions are a typical problem. This confuses crawlers and interferes with indexing. It is important to identify duplicate meta descriptions and optimize them using keywords.

This attribute gives users a more accurate picture of the page. Remember that the meta description should be informative and high-quality; following these rules can increase the number of clicks to your site.

The description must contain the primary keyword, be no longer than 165 characters including spaces, and be unique for each individual page, as in the example below.
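For example, a hypothetical description for a page like this one, unique and under 165 characters:

  <meta name="description" content="A step-by-step technical SEO audit checklist: indexing, robots.txt, sitemaps, redirects, page speed, and more." />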
Separately, when you adjust robots.txt, pay attention to the following:
  • Restricted and unnecessary pages are closed to indexing.
  • The relevant User-agent is set for each search robot.
  • The main site mirror is indicated.
  • Pages with dynamic data are closed to indexing.

Check H1

The <h1> tag is the most important first-level heading. It should come first in the heading hierarchy and be the only one on the page. Other requirements for this heading:
  • The h1 should not duplicate the title tag.
  • It has to be unique on every single page.
  • Recommended length: up to 60 characters.
  • The main keyword should be close to the beginning of the heading.
Read more in the article: https://www.seoquake.com/blog/h1-tag/

Showing the Site in Frames

Why is it important?
A situation may arise where other sites start displaying the content of your site in iframe blocks. The goals of such embedding vary: fraud, clickjacking, content theft. In any case, it can adversely affect the promotion of your site.
How do you prevent your site from being shown in an iframe?
To prohibit embedding your site in an iframe on foreign sites, add the X-Frame-Options header in your server settings (a configuration sketch follows). It can take two values:
  • DENY - never allow the site to be embedded
  • SAMEORIGIN - allow embedding only on the same domain
For example: X-Frame-Options: DENY
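As a sketch, assuming an nginx server, the header can be set like this:

  # Allow frames only from our own domain
  add_header X-Frame-Options "SAMEORIGIN" always;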

Checking Your SSL Certificates

SSL certificates are small data files that encrypt data packets as they are sent over the Internet. SSL is often used to transfer login details and credit card information.

There are three types of SSL certificates:

  • Domain Validated (DV)
  • Business (Organization) Validated (OV)
  • Extended Validation (EV)
An SSL-certified website ensures that all traffic between your web server and the user's browser is encrypted and cannot be read by third parties. When your website has an active SSL certificate, the application protocol changes from HTTP to HTTPS.

No Mixed Content

Mixed content occurs when a page served over HTTPS loads some resources, such as images, scripts, or styles, over plain HTTP; such a page is considered only partially encrypted. Make sure every resource on an HTTPS page is also loaded over HTTPS.

Mobile Version

How a website works on mobile devices plays a key role for users and for crawling by Google. It is worth auditing your website with the Google Mobile-Friendly Test and optimizing it if necessary.

Speed and Load Time

Page load speed greatly affects user behavior: more than half of users will not wait for a page whose load time exceeds 3 seconds. To analyze your site's load speed, use Google's PageSpeed Insights.

User-Friendly Website

To optimize the technical parameters of your website while taking its user-friendliness into account, pay attention to navigation with breadcrumbs and to structured data. This will make your website convenient and comprehensible for users and increase the time they spend on it.

Image Optimization

To respond to queries properly, search robots need to understand what is shown in images, and you can help them. The basic guidelines on how to optimize images for SEO are provided in Google's Webmaster Guidelines and on the Google Images best practices page:

  • Choose relevant images. They should be unique, high-quality, consistent with the subject of the page, and complement the text blocks in meaning.
  • Try to avoid text in images. Headings, menu items, and other important information should be placed in the text. If an image does not load, the page structure will look illogical. In addition, images with text interfere with full translation of the page into other languages.
  • Optimize images for mobile devices. Distorted proportions or a cropped piece of an image can spoil the impression of the page. For each type of device, use images of the appropriate format.
  • Use a descriptive URL. Google experts advise indicating not only the name of the file but also the logical structure of its placement.
  • Fill in the meta tags. Google Images automatically generates the page title and snippet. To make the result as informative and useful for the user as possible, pay attention to the title and description fields.
  • Define the Alt attribute (see the example after this list). If images do not load because of a slow connection or user settings, you should have a plan B: alternative text conveys the meaning of the image and can also serve as an anchor.
  • Provide structured data for products, recipes, and videos. This can earn you a special badge and help you better serve your target audience. Follow the rules carefully, though: Google penalizes violators.
  • Speed up website loading. Big images take longer to display, and 53% of users will not wait more than 3 seconds for a site to load. You can check your site's speed using the PageSpeed Insights tool. Add AMP markup to image pages; the site will receive the AMP badge and load faster.
  • Use sitemaps. They may contain additional information necessary for indexing.
  • Optimize images for SafeSearch. Many users want to exclude adult content from results, especially if the whole family, including children, uses the computer. Google listens to them and strongly recommends adding special markup to explicit images.
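To illustrate the Alt attribute point above, a hypothetical product image might be marked up like this (the path, text, and dimensions are placeholders):

  <img src="/images/red-running-shoes.jpg" alt="Red running shoes with white soles" width="600" height="400" />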

Make Sure That Your Site Has HTTPS Protocol

Security is a top priority for Google, which is why the search engine has taken steps to create a safer web for users and recommends that webmasters use HTTPS encryption.

The Validity of HTML and CSS

Valid HTML and CSS are presumably not a direct ranking factor. Still, it is better to check your code, as this helps search engines properly crawl and index your content. Invalid HTML can trip up a crawler just as it can break rendering in the browser. Therefore, make sure your code is correct and easy for search engines to read, crawl, and interpret.

Micro-markup

Micro-markup helps search engines quickly find content on the site and understand it correctly. Introducing micro-markup means using tags and attributes whose purpose is to structure information. Let's figure out how to do it quickly and efficiently; a minimal example follows.
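As a sketch, a minimal schema.org micro-markup block in JSON-LD format might look like this (all values are placeholders):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Technical SEO Checklist",
    "description": "A checklist for auditing the technical health of a website."
  }
  </script>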
