Technical SEO Checklist – SEO Audit
Technical SEO is an integral part of any SEO strategy, which is why a site needs to meet the recommendations of search engines.
A technical SEO audit checklist covers the many technical parameters of a website's content and structure that search engines analyze and rank. Let's briefly discuss what technical SEO is and compile an SEO audit guide.
Completeness of Site Page Indexing
Indexing errors can easily be found in Google Search Console, in the Coverage report.
Setting the Primary Website Address
- The main version of the site in Google Search Console
- Duplicate pages with/without www and on HTTP
- Duplicates created by directory index files (index.html, index.php)
- Duplication of pages with and without a trailing slash
- URL accessibility in different letter cases (uppercase vs. lowercase)
Pages with Noindex
- Close a page in robots.txt with Disallow
- Add a meta tag to the page in <head>: <meta name="robots" content="noindex, nofollow" />
In robots.txt, you can close not only a single page from the index but also a folder, a file type, service pages of the site, site search results, etc. That is, you can work in bulk with groups of pages.
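As a sketch, a robots.txt that closes a folder, a file type, and internal search results in bulk might look like this (all paths are hypothetical examples):

```
User-agent: *
Disallow: /admin/
Disallow: /*.pdf$
Disallow: /search?
```

Google supports the `*` wildcard and the `$` end-of-URL anchor in Disallow rules, which is what makes this bulk approach possible.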
<meta name="robots" content="noindex, follow"> allows you to close pages individually.
If you need to close a specific page, it is better to use the meta tag so as not to overload robots.txt with extra lines. The rule is also more likely to be respected than a robots.txt directive.
Remember that robots.txt rules are just recommendations; in other words, search engines can ignore them and still crawl and index disallowed URLs. Therefore, if you want to reliably hide a URL, it is better to do so through the meta tag. And if you want to be completely sure, you can, for example, close directories with a password.
So, if you want a page not to be indexed, use noindex in the robots meta tag. To do this, add the following meta tag to the page's <head> section: <meta name="robots" content="noindex, nofollow">
Sitemap.xml
XML sitemaps structure the website and facilitate search engine indexing. This is especially true for large websites with many pages where a large amount of content is dynamically updated. Using Google Search Console, you can check the site's XML sitemaps and make the necessary changes according to the Sitemaps report.
The Sitemap protocol format consists of XML tags. All data values in the Sitemap must be entity-escaped. The file itself must use UTF-8 encoding.
At the beginning, put the opening <urlset> tag, and at the end, the closing </urlset> tag. Specify the namespace (protocol standard) in the <urlset> tag. Include a <url> entry for each URL as the parent XML tag, and a child <loc> entry inside each parent <url> tag.
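Putting these rules together, a minimal sitemap entry looks like this (the URL and date are hypothetical examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```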
The sitemap should not include pages that are:
- Prohibited from indexing in robots.txt
- Tagged with <meta name="robots" content="noindex" />
- Returning 4xx or 3xx status codes
Recommended <priority> values:
- For the main page: <priority>1.0</priority>
- For landing pages: <priority>0.9</priority>
- For technical pages: <priority>0.8</priority>
Recommended <changefreq> values:
- For the main page: <changefreq>daily</changefreq>
- For landing pages: <changefreq>weekly</changefreq>
- For technical pages: <changefreq>monthly</changefreq>
- All URLs must have a <lastmod> tag indicating the date the document was last modified.
- The <urlset>, <url>, and <loc> tags are required in the sitemap.
- A sitemap should not contain more than 50,000 URLs, and its uncompressed size should not exceed 50 MB. If the sitemap is too large, break it into several parts: sitemap1.xml, sitemap2.xml, etc.
- All URLs must use the same syntax (the same protocol and host).
- Do not include session identifiers in URLs.
- The sitemap must declare the following XML namespace: xmlns="http://www.sitemaps.org/schemas/sitemap/0.9".
- A sitemap can only describe pages of the domain on which it is located; pages of subdomains or other domains cannot be included.
- When the file is requested, the server should return a 200 response code.
- Before uploading the file to the site, it is recommended to check its correctness with a Sitemap file validator.
- The sitemap should be updated automatically when pages are added to or removed from the site.
Beyond the technical audit, an SEO strategy also includes:
1. Content optimization.
2. Website semantics.
3. Link profile analysis and link building.
Multilingual Settings - Hreflang and Alternate
You can check the presence of hreflang attributes manually in the page source code or with Screaming Frog in the Hreflang tab of its report.
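For reference, hreflang annotations in a page's <head> typically look like this (URLs are hypothetical examples):

```html
<link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/page/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Each language version should list all alternates, including itself, and the annotations must be reciprocal across the versions.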
The Correct Setting of Canonical Tags
The canonical tag helps prevent the resource from being harmed by duplicate content, as it indicates to search engines which page version is preferred for indexing. This tag should point to a page that is indexable and has no redirects, and the URL must be absolute, including the protocol and domain.
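A canonical tag is placed in the page's <head> and points to the preferred version with an absolute URL (the URL here is a hypothetical example):

```html
<link rel="canonical" href="https://example.com/catalog/shoes/" />
```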
You can check canonical tags with Screaming Frog in the Canonicals tab of its report.
Correct Setting of 301 Redirects
For example, page A has a 301 redirect to page B, while page B redirects to page C. Such redirect chains should be avoided: point page A directly at page C.
You can check your redirects in the Screaming Frog report in the Response Codes tab.
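As a sketch, redirect chains like the A to B to C example above can be detected from a crawl export; the `redirects` mapping below is hypothetical sample data, not a real crawl:

```python
def redirect_chain(url, redirects):
    """Follow `url` through a {source: 301 target} map and return the full chain."""
    chain = [url]
    seen = {url}
    while chain[-1] in redirects:
        nxt = redirects[chain[-1]]
        chain.append(nxt)
        if nxt in seen:  # redirect loop: stop to avoid walking forever
            break
        seen.add(nxt)
    return chain

# Hypothetical crawl data: each key 301-redirects to its value.
redirects = {
    "https://example.com/a": "https://example.com/b",
    "https://example.com/b": "https://example.com/c",
}

chain = redirect_chain("https://example.com/a", redirects)
# Any chain longer than two URLs means the intermediate hops should be
# re-pointed directly at the final destination.
```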
Checking Your Website URL and Pages
Checking for Duplicate Pages
Common causes of duplicate pages:
- Printable page versions left open for indexing.
- Duplication of product catalog pages.
- Use of session identifiers in URLs.
Duplicate pages are especially characteristic of the e-commerce segment. To detect them, check the indexed pages with the site:mysite.com search operator, Google Search Console, and a crawl. After finding duplicates, get rid of them using robots.txt with the Disallow directive.
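As a rough sketch of hunting duplicates in a crawl export, URLs can be normalized (session parameters stripped, host lowercased, trailing slash removed) and then grouped; the parameter list and URLs below are hypothetical examples:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Hypothetical list of session/tracking parameters to strip.
SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "utm_source", "utm_medium"}

def normalize(url):
    """Lowercase the host, drop session params, strip the trailing slash."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k.lower() not in SESSION_PARAMS]
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, parts.netloc.lower(),
                       path, urlencode(query), ""))

# Two crawled URLs that are really the same page.
urls = [
    "https://Example.com/catalog/?sessionid=abc123",
    "https://example.com/catalog",
]
groups = {}
for u in urls:
    groups.setdefault(normalize(u), []).append(u)
duplicates = {k: v for k, v in groups.items() if len(v) > 1}
```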
Broken Links (404)
Links are divided into internal and external ones. The former lead to pages within the website, and the latter to third-party resources. It is important that all links work, so during a technical site audit, check all the links and make sure none of them are broken. A link usually breaks for one of the following reasons:
1. The resource the link leads to has changed its domain or URL.
2. There are problems with the server.
3. The linked document has been deleted.
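Combining the two ideas above, a crawl report of (URL, status) pairs can be split into broken internal and broken external links; the sample data below is hypothetical:

```python
from urllib.parse import urlparse

def classify_broken(links, site_host):
    """Split crawled (url, status) pairs into broken internal and external links."""
    internal_broken, external_broken = [], []
    for url, status in links:
        if status < 400:
            continue  # only 4xx/5xx responses count as broken
        host = urlparse(url).netloc.lower()
        if host in ("", site_host):  # relative URLs are internal too
            internal_broken.append(url)
        else:
            external_broken.append(url)
    return internal_broken, external_broken

# Hypothetical crawl results.
crawl = [
    ("https://mysite.com/about", 200),
    ("https://mysite.com/old-page", 404),
    ("https://partner.example/resource", 404),
]
internal, external = classify_broken(crawl, "mysite.com")
```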
Server Errors (5xx)
Design 404 pages
- It is desirable for the 404 page design to match the rest of the site;
- A proper 404 page should contain a link to the home page and the menu so that the visitor can keep using the site. Additionally, you can add a search bar;
- Briefly tell visitors why they are seeing the 404 error page.
Links to External Resources
By default, all links are created as dofollow. To make a link nofollow, add the following attribute to its tag: rel="nofollow"
We recommend applying the rel="nofollow" attribute to all outbound links.
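An outbound link with the attribute applied looks like this (the URL is a hypothetical example):

```html
<a href="https://external.example/resource" rel="nofollow">Partner resource</a>
```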
Check Page Titles
- The title must be unique to each individual page.
- It should be different from the level 1 heading (h1).
- The recommended length: 35 to 70 characters.
Check Meta Description
For websites with a large number of pages, especially e-commerce websites, the problem of meta description duplication is typical. This confuses crawlers and interferes with indexing. It is important to identify duplicate meta descriptions and optimize them using keywords.
This attribute gives users a more accurate picture of the page. It is important that the meta description be informative and high-quality, since following these rules can increase the number of clicks to your site. The description must contain the primary keyword, be up to 165 characters long (including spaces), and be unique to each individual page.
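The title and description rules above are easy to check in bulk. A minimal sketch, assuming the limits given here (title 35-70 characters, description up to 165, both unique per page) and hypothetical sample pages:

```python
def audit_meta(pages):
    """Return a list of (url, issue) pairs for (url, title, description) rows."""
    issues = []
    seen_titles, seen_descs = set(), set()
    for url, title, desc in pages:
        if not 35 <= len(title) <= 70:
            issues.append((url, "title length"))
        if len(desc) > 165:
            issues.append((url, "description too long"))
        if title in seen_titles:
            issues.append((url, "duplicate title"))
        if desc in seen_descs:
            issues.append((url, "duplicate description"))
        seen_titles.add(title)
        seen_descs.add(desc)
    return issues

# Hypothetical sample pages; /b has a short title and a reused description.
pages = [
    ("/a", "Technical SEO Checklist - a Complete Site Audit Guide", "Short description."),
    ("/b", "Too short", "Short description."),
]
issues = audit_meta(pages)
```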
Check Robots.txt
- Restricted and unnecessary pages are closed from indexing.
- The relevant User-agent directives are set for different search robots.
- The main site mirror is indicated.
- Pages with dynamic data are closed from indexing.
Check H1 Headings
- The H1 should not duplicate the title.
- It has to be unique on every single page.
- Length: up to 60 characters.
- The main keyword should be closer to the beginning of the heading.
Showing the Site in Frames
A situation may arise when some sites start displaying the content of your site in iframe blocks. The goals of this embedding can be different: fraud, clickjacking, theft of content. In any case, this can adversely affect the promotion of your site.
How do you prevent your site from being embedded in an iframe?
To prohibit embedding your site in an iframe on other sites, add the X-Frame-Options header in your server settings. It can take two values:
- DENY - never allow the site to be embedded;
- SAMEORIGIN - allow embedding only on pages of the same domain.
For example: X-Frame-Options: DENY
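For example, on nginx the header can be added in the server block; Apache users would use the Header directive from mod_headers instead:

```
# nginx configuration (inside the server block)
add_header X-Frame-Options "SAMEORIGIN" always;

# Apache equivalent (requires mod_headers)
Header always set X-Frame-Options "SAMEORIGIN"
```

After deploying the change, you can verify it by inspecting the response headers in the browser's developer tools.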
Checking Your SSL Certificates
SSL certificates are small data files that encrypt data packets when they are sent over the Internet. SSL is often used to transfer login information and credit card information on the Internet.
There are three types of SSL certificates:
- Domain Validated (DV)
- Organization Validated (OV), sometimes called Business Validated
- Extended Validation (EV)
No Mixed Content
After moving a site to HTTPS, make sure no page loads resources (images, scripts, styles) over plain HTTP. Such mixed content triggers browser warnings and undermines the security of the connection.
Speed and Load Time
Page load time affects both user experience and rankings, so check it regularly, for example with the PageSpeed Insights tool.
Image Optimization
To properly respond to requests, search robots need to understand what is shown in images, and you can help them. The basic guidelines on how to optimize images for SEO are provided in Google's Webmaster Guidelines and on the Google Images best practices page:
- Choose relevant images. They should be unique, high-quality, consistent with the subject of the page, and complement the meaning of the text blocks.
- Try to avoid text in pictures. Headings, menu items, and other important information should be placed in the text. If the image does not load, the page structure will look illogical. In addition, images with text interfere with the full translation of the page into other languages.
- Optimize images for mobile devices. Distorted proportions or a cropped piece of the image can spoil the impression of the page. For each type of device, use images of the appropriate format.
- Use a descriptive URL. Google experts advise indicating not only the file name but also a logical directory structure for its placement.
- Fill in the meta tags. Google Images automatically generates the page title and snippet. To make the result as informative and useful for the user as possible, pay attention to the title and description fields.
- Define the Alt attribute. If the pictures do not load due to a slow connection or user settings, you should have a plan B. Alternative text conveys the meaning of the image and can be used as an anchor.
- Provide structured data for products, recipes, and videos. This will help you earn a special badge and better serve your target audience. Follow the rules carefully, though: Google penalizes violators.
- Speed up website loading. Big images take longer to display, and 53% of users will not wait more than 3 seconds for a site to load. You can check your site's speed using the PageSpeed Insights tool. Add AMP markup to the image page; the site will receive the AMP badge and load faster.
- Use sitemaps. They may contain additional information necessary for indexing.
- Optimize images for safe search. Many users want to exclude adult content from results, especially if the whole family, including children, uses the computer. Google listens to them and strongly recommends adding special markup to explicit images.
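Tying several of these points together, a descriptive file name plus alternative text might look like this (the file name and alt text are hypothetical examples):

```html
<img src="/images/products/red-running-shoes.jpg"
     alt="Red running shoes, side view"
     width="640" height="480" />
```

Explicit width and height attributes also let the browser reserve space before the image loads, which reduces layout shift.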