Paginated Pages: A Complete Best Practices Guide

What is Pagination and How Does It Affect SEO?

Pagination is the division of content into a series of separate web pages for ease of use. Almost every large site needs it, including online stores: large product categories and blog sections are all paginated.

Properly executed SEO pagination helps search crawlers understand the relationship between the pages in a series. This prevents duplicate-content issues caused by their thematic similarity: bots index only the relevant URLs rather than every technical one.

There are several options for pagination:

  • A list of numbers, where each number links to a page (see the markup sketch after this list)
  • An alphabetical breakdown, where pages and content are grouped by letter
  • Numeric ranges, where clicking a range takes the user to another page
  • A single page with scrolling instead of separate pages (this option became especially relevant after Google’s 2019 announcement).
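
For instance, a numbered pagination block is usually just a list of ordinary crawlable links. A minimal sketch, with the shop.com URLs assumed purely for illustration:

  <nav>
    <a href="https://www.shop.com/category-1">1</a>
    <a href="https://www.shop.com/category-1?page=2">2</a>
    <a href="https://www.shop.com/category-1?page=3">3</a>
  </nav>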

Which of these types to choose is up to the webmaster; it depends on the site’s design, the content it hosts, and so on.

The main purpose of such pages is to make it convenient for users to browse the company’s products or news and facilitate the search for goods and services on the site.

How Does Pagination Affect SEO?

Done correctly, pagination is positive for SEO. It improves the site’s usability: information is easier for users to take in when a site has a large number of products and categories, and pages load faster.

Many experts note that pagination helps improve behavioral factors. Most users dislike an endless wall of content on a single page and prefer more structured sites. This is especially true for online stores, which have large numbers of nearly identical product pages.

Google also recommends that a page contain no more than 100 links (external and internal). This helps search crawlers move deeper into the site architecture quickly. Many SEO specialists note that this is a recommendation rather than a strict quality requirement, but it is still worth following so that bots can quickly index the content that really matters for promotion.

Pagination by itself does not improve a site’s position in the search results, but implemented incorrectly it can harm SEO. Let’s look at the main risks associated with incorrect pagination on sites:

1. Pagination Can Weaken Ranking Signals

Pagination spreads internal links, inbound links, social media shares, and other ranking signals across many pages instead of concentrating them on one.

Before paginating, assess the benefits. If a single long page degrades usability, the content should be divided into pages; in other cases, splitting it will only make the situation worse.

2. Pagination Can Cause Duplicate Content

Although we wrote that pagination prevents duplicates from appearing, you will face problems if rel=canonical is missing or if you additionally created a ?page=1 URL that duplicates the root page. Read the related article – Rel=Canonical Tag: Best SEO Practices for Canonical URLs in 2020.

This is easy to avoid if the site is optimized correctly: even when the H1 and Title are identical, a page whose content differs will not be treated as a duplicate.
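
As a minimal sketch (assuming the ?page= parameter scheme and the shop.com URLs used for illustration throughout this article), the ?page=1 duplicate can be resolved by canonicalizing it to the root URL, while every other page in the series points to itself:

  <!-- On https://www.shop.com/category-1?page=1 (duplicate of the root): -->
  <link rel="canonical" href="https://www.shop.com/category-1">

  <!-- On https://www.shop.com/category-1?page=2 (a unique page, self-canonical): -->
  <link rel="canonical" href="https://www.shop.com/category-1?page=2">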

John Mueller has confirmed this point on Twitter as well.

3. Pagination Consumes a Crawling Budget

This statement is only true if you allow search bots to crawl pagination pages. You can block crawling of the pages that are less important for promotion in Google Search Console or in robots.txt, saving your crawl budget. Read the related articles – What is Google Search Console and Robots.txt: How to Create the Perfect File for SEO.
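
If you do decide to block them, a minimal robots.txt sketch (assuming the ?page= parameter scheme) might look like this; note that later sections of this article argue for keeping pagination pages crawlable, so weigh that trade-off first:

  # Block crawling of paginated URLs; the root category page stays crawlable
  User-agent: *
  Disallow: /*?page=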

4. Pagination Can Negatively Affect Crawl Depth

This parameter affects Google’s PageRank and Bing’s StaticRank, algorithmic ranking systems that estimate a URL’s popularity. If spiders cannot reach meaningful content in a minimal number of clicks, the site will rank worse in search results.

5. Pagination Can Lead to Thin (Meaningless) Content

This happens when a webmaster focuses on generating banner ad revenue rather than on creating quality page content. If you split a gallery into multiple pages just to inflate page views, thin content appears, and it will rank worse in the search engine.

What Does Google Say About Pagination?

In 2011, Google introduced rel=”next” and rel=”prev”. The announcement said that these attributes help search bots understand how paginated pages are interrelated.

The attributes were placed in the <head> of a web page or in an HTTP header, and they signaled to the search engine that:

  1. The pagination pages should be combined into one piece of content.
  2. The first page of the series should be given priority when ranking in search results.
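
As a sketch of that historical markup, page 2 of a three-page series carried something like this in its <head> (URLs assumed for illustration):

  <link rel="prev" href="https://www.shop.com/category-1">
  <link rel="next" href="https://www.shop.com/category-1?page=3">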

However, on March 21, 2019, Google announced on its blog and officially on Twitter that rel=”next” and rel=”prev” are no longer used as an indexing signal.

This information was confirmed by a search engine analyst, John Mueller.

Google commented on their decision as follows:

We have been analyzing indexing for several years and decided to retire the rel=prev and rel=next attributes after all. Our research concluded that the majority of the audience prefers content posted on a single page. We recommend that webmasters take this into account, but do not forget that some users prefer pagination as well. You understand your target audience better, so the decision is yours.

Google analyst John Mueller described how the search engine handles paginated pages in its index in his Webmaster Hangouts session on March 22, 2019:

The search engine does not treat pagination pages in any special way. We treat them as standard web pages.

Search bots consider such URLs unique. Here is how it looks on the website of an online store with a product category page and three more pagination pages:

https://www.shop.com/category-1

https://www.shop.com/category-1?page=2

https://www.shop.com/category-1?page=3

https://www.shop.com/category-1?page=4  

After the retirement of rel=”next” and rel=”prev”, the bot sees not one landing page but four separate ones.

For webmasters, this was an unpleasant discovery: they had to change their pagination setup for Google. We will look at the modern practices later in this article.

Should You Remove the rel=”next” and rel=”prev” Attributes?

Despite Google’s announcement, you are not required to remove these attributes. The announcement concerned indexing only; it is not entirely clear whether the search engine still uses the attributes for link discovery or in other algorithms, and no further announcements have been made.

Here are some more reasons to keep rel=”next” and rel=”prev” on your site:

  • The attributes do no harm to the site, while removing them would take a lot of time.
  • Some browsers use them to preload pages.
  • Other search engines, such as Bing, still use them to better understand the relationship between pagination pages. This has been confirmed by a search engine official on Twitter.

Remember that rel=”next” and rel=”prev” are hints, not directives, so search engines may ignore them; usually, though, bots take this information into account when crawling a site.

Also, some webmasters write rel=”previous” instead of rel=”prev” – this is not an error, but most prefer the shorter form.

How to Paginate Pages: SEO Best Practices

1. Use Crawlable Anchor Links

To let search crawlers efficiently crawl pagination pages, use anchor links with href attributes – <a href="your-paginated-url-here"> – for internal links. Do not generate pagination links purely through JavaScript without a crawlable href.

For the Bing search engine, experts still recommend using rel=”next” and rel=”prev” to indicate the relationship between pagination pages, combined with a rel=”canonical” attribute. If a URL has additional parameters, include them in the rel=”prev”/”next” links, but not in the rel=”canonical” link.

Here’s a sketch of how it might look on the second page of a series (the shop.com URLs and the ?page=/?sort= parameters are assumed for illustration):

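  <!-- In the <head> of https://www.shop.com/category-1?page=2&sort=price -->
  <link rel="canonical" href="https://www.shop.com/category-1?page=2">
  <link rel="prev" href="https://www.shop.com/category-1?sort=price">
  <link rel="next" href="https://www.shop.com/category-1?page=3&sort=price">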

This will help bots understand how pagination pages are interconnected and prevent duplicate content from appearing.

The main mistakes when implementing this practice are as follows:

  1. Placing the attributes in the <body> of the page instead of the <head>;
  2. Adding rel=”prev” to the root page or rel=”next” to the last one – only the pages between them should carry both attributes;
  3. Canonicalizing every page in the chain to the root page. The problem most often appears on the second page (?page=2), which should keep a self-referencing canonical alongside a rel=”prev” pointing to the root.

2. Don’t Use Noindex and Nofollow for Pagination Pages

It is not recommended to use the noindex robots directive on pagination pages.

If pages remain unindexed for a while, search bots receive a signal to stop crawling their content, and even with rel=”next” and rel=”prev” in place such pages will appear in search results extremely rarely. This can significantly affect site traffic.

A similar rule applies to the nofollow attribute, which tells search crawlers not to follow the links on a page. With it, you simply hide the pagination pages from the search engine and prevent bots from finding new, useful content that could earn your site a higher position in the SERP.
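
For reference, this is the meta robots directive the advice above tells you to avoid on pagination pages (a generic example, not tied to any particular site):

  <!-- Avoid this on pagination pages: -->
  <meta name="robots" content="noindex, nofollow">

  <!-- Leaving the tag out entirely is equivalent to the crawlable default: -->
  <meta name="robots" content="index, follow">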

3. Change the Elements of Pagination Pages

Because of the changes in how Google handles pagination, each page in a series can now compete with the root page for rankings. To get the search engine to return the root page in the SERP, and to avoid duplicate meta descriptions and titles in Google Search Console, follow these steps (a sketch follows the list):

  1. De-optimize the H1 tags on pagination pages.
  2. Add more relevant, quality content to the root page.
  3. Add images with optimized alt attributes to the root page. Read more in the article, SEO Images.

Experts recommend intentionally de-optimizing pagination pages so that the search engine displays them in search results less often and focuses on the root one.
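
A minimal sketch of what such de-optimization might look like (the titles and category are invented for illustration):

  <!-- Root page: fully optimized -->
  <title>Buy Women's Running Shoes | Shop.com</title>
  <h1>Women's Running Shoes</h1>

  <!-- Page 2: intentionally weaker, clearly secondary -->
  <title>Women's Running Shoes – Page 2 | Shop.com</title>
  <h1>Women's Running Shoes – Page 2</h1>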

4. Set up Pagination Using Google Search Console

You can configure pagination parameters in Google Search Console with the URL Parameters tool’s Paginates setting. It lets you quickly change the crawling signal for parameterized links (Every URL or No URLs). The option you choose will depend on how you want to spend your crawl budget.

It is not recommended to use fragment identifiers (#) in pagination URLs: everything after the # is not crawled or indexed by bots, so such URLs are unsuitable for search engines.
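
To illustrate with the assumed category URL from earlier:

  https://www.shop.com/category-1?page=2   <- crawlable query parameter
  https://www.shop.com/category-1#page=2   <- fragment, invisible to bots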

5. Don’t Include Pagination Pages in Your XML Sitemap

We recommend including in your XML sitemap only the pages you want to rank in the SERP. Most pagination URLs do not fall into this category.
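
A minimal sketch of such a sitemap, reusing the assumed shop.com URLs: only the root category page is listed, and the ?page= URLs are deliberately left out.

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.shop.com/category-1</loc>
    </url>
    <!-- category-1?page=2, ?page=3, ?page=4 intentionally omitted -->
  </urlset>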

6. Optimize Faceted Navigation

Using facets in conjunction with pagination can make it hard for search bots to crawl content effectively. Most large online stores face this challenge, and it results in many duplicates.

It is very important to make sure that pagination pages are not blocked or canonicalized away together with faceted URLs. Otherwise, search crawlers will simply stop following the links or will drop the pagination pages from the index.
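
One common pattern, sketched here with an invented color facet, is to canonicalize a faceted URL to its paginated equivalent: the facet is deduplicated while the pagination page itself stays indexable.

  <!-- On https://www.shop.com/category-1?page=2&color=red -->
  <link rel="canonical" href="https://www.shop.com/category-1?page=2">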

7. Improve Content Quality

Google now treats every pagination page the same way, so webmasters face the acute question of improving content quality on all pages; duplicating content to save time is no longer an option.

John Mueller gave webmasters a recommendation on this matter in his English Webmaster Hangouts session on March 22, 2019:

I would recommend that specialists make sure all pagination pages can work independently, so that a user who lands on any page in the series can find something useful there. Keep in mind that, with the changes in search engine algorithms, pagination is no longer just a group of pages from 1 to 100 with different types of products: each of them must contain relevant information. However, you can always define a priority root page for yourself and spend more time optimizing it.

That is, pagination should primarily improve the user experience, and the content should be relevant to the audience’s search queries. Unique text is not required on every page; it is enough to eliminate duplicates. You can use third-party services to check how similar the pagination pages are.

8. Controlling Keyword Cannibalization

In addition to content quality, check that pagination pages do not compete with the root page for keywords: they should remain useful to the user while staying de-optimized for the root page’s target keys.

To understand which URLs need improvement, use the Performance report in Google Search Console, or the Pi Datametrics tool to find competing pages.

Don’t overdo it, though: excessive de-optimization can hurt indexing. You only need to identify and fix the SEO elements that keep the root page from ranking for its keywords.

Common Pagination Errors

1. Bill Slawski blogged that the most common mistake is canonicalizing the whole series to the first page.

Pagination spreads PageRank across the whole group of pages and, ideally, leads users to the variation most relevant to their search query. If you have canonicalized the root page incorrectly, the search engine will think that you have only one page.

Each pagination page must have a canonical link pointing to itself; otherwise, Googlebot will simply ignore the rest of the content.

2. Some webmasters canonicalize the series to a View All page, which confuses search bots.

3. You’re not giving search bots clear signals about what to crawl and index. Google believes its crawler is smart enough to find the most relevant pagination URL on its own, but many experts dispute this: there have been cases where the URL shown in the search results was not the root page, even though the root contained the more relevant content.

4. Do not build pagination on pages with infinite scrolling and loading. Googlebot cannot fully mimic user actions, which makes it less efficient at crawling such content. Instead, use anchor links with href attributes, which remain reachable even when JavaScript is disabled.

It is recommended to use pushState to update the URL when the user clicks or actively scrolls through the results. You can see a demo of this feature, created by John Mueller, here.
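
A minimal sketch of the idea, assuming the ?page= URL scheme used earlier (the function name and trigger are invented for illustration):

  <script>
  // Called whenever the infinite scroll appends the next batch of items;
  // updates the address bar so each "page" keeps a real, shareable URL.
  function onNextPageLoaded(pageNumber) {
    history.pushState(
      { page: pageNumber },   // state object for back/forward navigation
      '',                     // title argument (ignored by most browsers)
      '?page=' + pageNumber   // URL that mirrors the crawlable version
    );
  }
  </script>

  <!-- A plain crawlable link as a fallback for bots and no-JS users: -->
  <a href="https://www.shop.com/category-1?page=2">Next page</a>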

How to Track the Effectiveness of Pagination for SEO?

Even if you did everything right, we always recommend tracking the optimization results when paginating pages. You can analyze the success of the work performed using:

  1. Server log data: shows how many pagination pages bots have crawled
  2. The site: search operator (e.g. site:shop.com inurl:page=): returns the number of pages that have been indexed
  3. Google Search Console search analytics: reports showing how many impressions and clicks pagination pages receive and for which specific URLs
  4. The Google Analytics landing pages report: shows user behavior on the site.

Changes in Google’s indexing have caused many webmasters to change their approach to paginating site pages. The priority is to determine which URL is the root and how to canonicalize the links so that the bot crawls only the content important for promotion and the crawl budget is not wasted.

Think about simple, user-friendly site navigation. When optimizing, remember that search crawlers cannot fully reproduce human actions, which can lead to crawling errors. So give clear signals on internal URLs during pagination to ensure the relevant content ends up in the index.

Author

Anna Stunkin

Anna has been a content manager and copywriter since 2013. In 2017, she started working as a copywriter and editor at a digital agency, and in 2019 she began cooperating with a SERM agency, where her main responsibility was writing for the corporate blog. In 2020, she completed SEO optimization courses and began working with SeoQuake as a content manager.
