How to Create a Sitemap by John Mueller
How can you decide what to focus on when creating a sitemap from an SEO point of view? Few can tell you better than Google's own experts. In this article, we share the recommendations of John Mueller, one of the company's leading search analysts.
What Is a Sitemap?
Before giving recommendations on how to make a sitemap for Google, let’s look at the terms.
A sitemap is a file with a list of links leading to site pages. It helps search robots navigate a site faster and discover pages without having to follow internal links from page to page. There are two types of sitemaps: HTML and XML.
The first type (HTML) is a regular web page with a list of links to categories and other important pages. An HTML map typically holds a limited number of links (up to about a hundred), and pages it doesn't link to are harder for search robots to discover. That is not enough to cover everything, especially on large-scale sites with hundreds of thousands of pages.
The second type (XML) is usually generated automatically. XML maps are intended for search engines and can contain up to fifty thousand links each. They let webmasters assign a priority to each individual link and tell search engine crawlers when pages were last updated.
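As a minimal sketch of the format just described, a Sitemap.xml file with the last-modified date and priority fields might look like this (the URLs are placeholders, not from any real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2023-01-15</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/category/widgets</loc>
    <lastmod>2023-01-10</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each `<url>` entry holds one page: its address, when it last changed, and how important it is relative to other pages on the same site.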
Such maps can be created manually, but modern webmasters rely on automatic generation from the CMS. You will find out why in this article.
The Role of a Sitemap in SEO
How to Create a Sitemap for Google Effectively?
The search engine gives preference to resources with an XML map generated automatically by the CMS. Search robots scan the XML files, and Google's algorithms recognize the page structure and use the metadata and other information that matters for ranking.
John Mueller: Who It Is and How He Influenced the Creation of the Sitemap
John Mueller is a Webmaster Trends Analyst, one of the top positions at Google. He is relevant to this article because of a Reddit thread on how to create a sitemap for Google, which he answered in detail.
Mueller noted that manually creating HTML maps for large sites is no longer worthwhile, and that it is better to generate sitemaps through a CMS. HTML maps can still be used, but on a large site they suggest to crawlers that navigation is poor and internal linking is inconsistent.
Let’s take a look at how to create a quality sitemap.
John Mueller’s Sitemap Guidelines
About a year ago, a Reddit user posted a thread asking whether it is still worth making sitemaps in the traditional HTML format. John Mueller left a detailed, competent answer, and later recorded a video with guidelines on how to create a sitemap for Google.
We have summarized the information provided by Mueller and presented it in five tips.
Tip # 1: remove the old HTML map
John Mueller says there is no need to use manually generated HTML maps.
His comment was that such maps should not be indexed on large sites, and it is better not to create them manually at all. They can be useful for ordinary visitors, but to search crawlers they are a direct signal that web developers have done a poor job with the site's own navigation.
Instead, it is better to place your sitemap in a dedicated XML Sitemap file and build a logical network of internal links, so that users can reach any page they need in a minimum number of clicks.
John Mueller also noted that using traditional CMSs greatly simplifies the SEO process in terms of building a sitemap. Search robots find it much easier to navigate the site.
Tip # 2: use a CMS
On large-scale sites, keeping the sitemap up to date is a real problem. A mistake that is easy to make on a site with tens of thousands of URLs can collapse the entire structure of a web resource.
That is why it is so important to use a CMS that can generate Sitemap.xml files on its own. Automatic generation keeps the file current and reduces the risk of human error.
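To illustrate what such automatic generation does under the hood, here is a minimal Python sketch that builds a Sitemap.xml from a list of pages. The URLs, page list, and `build_sitemap` helper are hypothetical, not part of any particular CMS:

```python
from datetime import date
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build a sitemap XML string from (url, last_modified) pairs."""
    # Register the sitemap namespace as the default so output tags are unprefixed.
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for url, last_modified in pages:
        entry = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}loc").text = url
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}lastmod").text = last_modified.isoformat()
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical page list; a real CMS would pull this from its database.
pages = [
    ("https://example.com/", date(2023, 1, 15)),
    ("https://example.com/blog/first-post", date(2023, 1, 10)),
]
print(build_sitemap(pages))
```

A CMS plugin does essentially this on every publish, which is why the file never drifts out of date the way a hand-edited one does.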
Tip # 3: do internal linking
Any SEO specialist knows how important it is to set up internal linking correctly. Internal links keep visitors moving through the site and strengthen its position in search results.
A properly organized site structure also builds trust with search robots: when users follow links within the site, it signals that the content is engaging and can be trusted.
Tip # 4: prohibit sitemap indexing
John Mueller argues that search engine crawlers should not index the sitemap file itself. You can block indexing manually using an X-Robots-Tag header or a Disallow rule.
Otherwise, crawlers may quietly mark your site as one whose navigation is not built in the best way, and search results may favor other, better-organized web resources.
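As a sketch of the two mechanisms Mueller mentions, assuming an Apache server with mod_headers and an HTML sitemap page at /sitemap.html (both the path and the server setup are assumptions for illustration):

```
# robots.txt — keep crawlers from fetching the HTML sitemap page
User-agent: *
Disallow: /sitemap.html

# .htaccess (Apache, mod_headers) — serve a noindex header for the same page
<Files "sitemap.html">
  Header set X-Robots-Tag "noindex"
</Files>
```

Note that these apply to the HTML sitemap page you don't want in search results; the XML Sitemap file itself should stay accessible so crawlers can read it.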
Tip # 5: automate updates
John Mueller explains that URLs are crawled in no particular order, regardless of when they were last modified. Search robots periodically use last-modified hints but are guided mainly by their own algorithms.
Crawlers are programmed to provide an even load on the server. Therefore, some pages may be scanned once a week, and others once every six months.
This means that when a webmaster changes several URLs at once, some of them may not be recrawled for six months.
If you want to automate updates and speed up reindexing, update the last modification date for the affected pages in the Sitemap file. This is the hint that crawlers use as a guideline.
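That last-modified update can be sketched in a few lines of Python, assuming a sitemap in the standard format; the `touch_lastmod` helper and URLs are hypothetical, shown only to make the step concrete:

```python
from datetime import date
from xml.etree import ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def touch_lastmod(sitemap_xml, url, new_date):
    """Set the <lastmod> entry for one URL and return the updated XML string."""
    ET.register_namespace("", NS["sm"])  # keep output tags unprefixed
    root = ET.fromstring(sitemap_xml)
    for entry in root.findall("sm:url", NS):
        if entry.findtext("sm:loc", namespaces=NS) == url:
            lastmod = entry.find("sm:lastmod", NS)
            if lastmod is None:  # add the field if the entry lacks one
                lastmod = ET.SubElement(entry, f"{{{NS['sm']}}}lastmod")
            lastmod.text = new_date.isoformat()
    return ET.tostring(root, encoding="unicode")

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/page</loc><lastmod>2022-06-01</lastmod></url>
</urlset>"""

print(touch_lastmod(sitemap, "https://example.com/page", date(2023, 1, 15)))
```

A CMS that regenerates the sitemap automatically performs this bookkeeping for you on every edit, which is exactly why Mueller recommends it.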
Are Old HTML Maps No Longer Relevant?
Can we say that traditional handcrafted HTML maps have lost their SEO relevance for good?
The question of how to create a sitemap for Google matters both for small web resources with a minimal budget and for sites with hundreds of thousands of URLs. For a small site, it is indeed much easier to create a sitemap manually, while for a huge one, the XML format caps each file at 50,000 addresses. What do you do in these situations?
Judging by Mueller's statements, manual mapping should be abandoned in both cases. Subsequent updates to manually maintained Sitemap.xml files become a real problem for web developers and can bring down the entire site structure. It is better to generate the sitemap's links from the start through the CMS the site is built on, or directly from the DBMS.
If the 50,000-URL limit is exceeded, it makes sense to split the sitemap logically into several files. These can then be listed in a single sitemap index file, which acts as a folder for the individual sitemaps.
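The sitemap index that ties the split files together is itself a small XML file. A sketch with placeholder URLs might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-products-1.xml</loc>
    <lastmod>2023-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-products-2.xml</loc>
    <lastmod>2023-01-12</lastmod>
  </sitemap>
</sitemapindex>
```

You then point crawlers at the index file alone, and they follow it to each child sitemap.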
As you can see, the old manual ways of creating a sitemap have long since lost their usefulness for SEO. It is better to follow the current recommendations of one of Google's leading experts, John Mueller.
Have you noticed your site's position in Google search results improve after generating a sitemap through the CMS? Share your experience with us and our readers in the comments below.