Google Algorithms That Affect SEO: Panda, Penguin, and Hummingbird
If you work with SEO, you probably know about Google algorithms such as Panda, Penguin, and Hummingbird. They can affect your site’s ranking in Google’s search results and impose penalties on sites that do not follow their policies.
In the article below, we will discuss these algorithms in more detail and explain how to recover from the penalties they impose.
How Have Google Search Algorithms Changed? A Brief History Since 2002
Let’s talk about Google’s ranking policy. Initially, copywriters and SEO specialists could simply stuff a site with keywords to reach the desired positions in search results; this was openly discussed at the Webmaster World Forum in 2002.
However, Google’s search algorithms become more sophisticated every year. Today, getting to the top of the results page takes more than an unreadable text stuffed with keywords.
So, as Google’s search technologies developed, the company’s engineers began to focus on making search results more and more relevant and useful to Internet users.
They began to pay special attention to the uniqueness of content and its relevance (topicality) for the target audience. Learn how to properly conduct a content audit of the website.
Credibility became an additional ranking criterion, as evidenced by the number of naturally acquired external links pointing to a web resource.
Currently, the Google search algorithm is changing quite often. It is constantly filled with new technological innovations, such as machine learning and artificial intelligence.
The first real breakthrough was the now-obsolete Caffeine update, which appeared back in 2010. It shook the foundations of traditional SEO and laid the groundwork for truly high-quality content on commercial websites.
Following Caffeine, Google launched three new and much more advanced algorithms in succession: Panda, Penguin, and Hummingbird. We have compiled a detailed cheat sheet of these algorithms.
Google Takes a Step Forward: The Advent of the Panda Algorithm
The history of modern Google dates back to 2011, with the advent of the Panda algorithm. Its first launch took place in the winter, on February 23, 2011.
The goal of the developers of this algorithm was to automate the monitoring of the quality of text content offered by web resources. As a result, sites with non-unique, uninformative content were pushed to the very bottom of the search results, while fresh, useful, and relevant content was given priority.
This algorithm was especially strict with sites whose content makers brazenly lifted texts from other sources and, with barely any rewriting, published them on the web resources they were promoting.
This, of course, turned out to be a very useful innovation for ordinary Internet users. They got access to mostly relevant web pages that contained exactly what they were looking for (rather than simply the most keywords). The algorithm also forced website owners to take content creation seriously and to review their former content policies. Panda has since become part of the core algorithm, which means its action is constant (though its policies are not as strict as before).
More information about the Panda algorithm can be found in this video.
Panda Website Evaluation Criteria
By now, you can probably guess which sites might fall under the sanctions of the Google Panda algorithm. To make this clearer, we have compiled a list of questions that will help you evaluate how well your web resource fares against this algorithm’s criteria. So let’s go.
- How reliable is the information presented in the article? Are there any links leading to other reputable sources?
- How deep is the topic in the article? Is it written by a specialist or a person who is slightly familiar with the subject of discussion?
- Are there other articles on your site covering the same topic from the same angle?
- Does the material presented in the article enhance the credibility of the site?
- Are there spelling or stylistic mistakes in the text?
- How relevant is the content presented on the site for its target audience (especially for online stores)?
- What is the uniqueness of the submitted content?
- How competitive is your article, in terms of informativeness and usefulness, with articles on similar topics published by third-party web resources?
- How objective is the material presented in the article (are both the pros and cons of the subject of discussion presented)?
- How well is the article laid out, and does it look visually sloppy or hastily done?
- Does the quality of site content encourage you to share it with your friends as an authoritative source of information?
- How comprehensive is the topic of the article?
- Does the content contain some analytic findings, perhaps not obvious to the vast majority of readers?
- Is the article full of ad units and links to other sources?
- How big is the article?
- How detailed is the material presented in the article?
- Does the article contain elements that certain groups of users may complain about?
How to Understand that Your Site Is under the Sanctions of the Panda Algorithm?
Now it’s time to figure out which sites automatically fall under the Panda algorithm’s penalties.
Sites with Thin Pages
Thin pages are web pages that offer little to no value to readers. As a rule, such pages contain a few uninformative sentences and force users to look for the information they need on third-party sources.
The Panda algorithm is especially unfavorable for sites that have more than a couple of such pages. Google search robots immediately put such a site at the very bottom of search results, and in order to get out of there, site owners have to work hard on the content.
Sites with Non-unique Content
The second pitfall of the Panda algorithm, which any SEO specialist knows about, is the presence of non-unique content on the site.
Sometimes the reason for non-uniqueness is not plagiarism but stuffing with keywords and other frequently used phrases. To protect your site from such problems, we recommend checking the uniqueness of your texts with special web services such as Duplichecker, Quetext, and Plagiarismdetector. Read more about the best SEO tools of 2020.
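As a rough local illustration of the idea behind such checkers (not what Google or these services actually use), you can estimate how similar two texts are with Python’s standard library; the sample sentences below are invented for the example:

```python
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a rough similarity ratio between two texts (0.0 to 1.0)."""
    # Normalize whitespace and case so pure formatting differences don't count
    a = " ".join(text_a.lower().split())
    b = " ".join(text_b.lower().split())
    return SequenceMatcher(None, a, b).ratio()

original = "Our screw nuts are made of stainless steel and resist corrosion."
copied   = "Our screw nuts are made of stainless steel and resist corrosion!"
fresh    = "This guide explains how to choose the right fastener coating."

print(round(similarity(original, copied), 2))  # near-duplicate: ratio close to 1.0
print(round(similarity(original, fresh), 2))   # unrelated text: much lower ratio
```

A real plagiarism checker compares your text against an index of the whole web, but the same kind of threshold logic applies: texts scoring above some cutoff get flagged for rewriting.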
You can find both paid and free plagiarism checkers – it all depends on how much text you plan to check.
It should also be noted that it is extremely important to check all text blocks on your site, including articles posted in the past. If the uniqueness of the content turns out to be too low (usually below 70%; scores of 90% and higher are considered exemplary), the non-unique texts will have to be rewritten.
What should you do if someone has copied your content? Typically, Google search robots take this into account and give preference to the site on which the unique content was posted earlier. Therefore, it’s certainly not worth worrying about the fact that many copies of your articles are present on the network.
Another source of non-uniqueness is product pages that repeat the same text (for example, if you write descriptions for screw nuts of different diameters from the same manufacturer, it would be foolish to compose a separate, completely unique text for each).
In this case, you have two options. The first is to create a single product page with several selection options. The second is to use a canonical URL: a tag placed in the code of the page with duplicated content. Learn more about the canonical tag.
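A canonical tag is a single line in the page’s head. In this hypothetical sketch, two size variants of a product share one description, so both pages point to a single canonical URL (the paths and domain are invented for illustration):

```html
<!-- Placed inside the <head> of both
     /products/screw-nut-m8 and /products/screw-nut-m10 -->
<link rel="canonical" href="https://example.com/products/screw-nut" />
```

Google then treats the canonical page as the primary version and does not penalize the variants as duplicates.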
Sites with Low-quality Content
It is essential to pay special attention to low-quality content. What specific articles fall under this description?
Above, we identified a list of questions that every copywriter or SEO specialist should ask before posting another article or product description on the site.
Nevertheless, there is one overarching phrase that describes content quality: value for the target audience. It happens that an article seems quite informative, with good images and design, yet the data used in its analytical part is outdated and no longer relevant. This especially concerns topical issues, such as those related to the coronavirus or the world economic situation.
In this case, you have two options: either constantly replenish your site with follow-up articles, marking the period for which the data in earlier materials was relevant, or keep updating the same article over and over (which, in principle, is not the best solution).
In the first case, you automatically create new research topics for yourself (for example, “The most popular JS frameworks for the second half of 2020”). In the second, you significantly limit your choice of topics and may struggle to find something relevant and interesting for the target audience in your field.
Sites with Poor Quality External Links
Now, let’s talk about external links. It is believed that sites with a large number of external links automatically look favorable to search engines, which then place them at the top of search results.
In a simpler world, site owners would just set aside a regular budget for buying external links, and only the sites that invested the most would sit at the top of the results.
Nevertheless, the Panda algorithm turned out to be smarter than the calculating practitioners of black-hat SEO: it learned to recognize sites that rapidly build up link mass in dishonest ways.
Therefore, don’t try to outsmart what Google’s brightest minds have worked on. Instead, look for web resources where linking to your articles would be appropriate. This doesn’t necessarily require money: there are plenty of authoritative Internet portals like Reddit and Quora where users can leave links to third-party sources free of charge, provided the link fits naturally into an answer to someone else’s question and isn’t deleted by moderators. Read how to improve your business with Reddit SEO.
Sites with High Bounce Rates
There is also a separate category of sites: those with high bounce rates. These are usually web resources whose content does not meet users’ expectations, so people leave them.
A high bounce rate can be caused not only by critically irrelevant content (for example, a user googles “How to get rid of dandruff” and, clicking through from the search results, lands on your article about shaving techniques).
The fact is that inept webmasters sometimes create an interface so unusable that visitors (especially first-time visitors) simply cannot figure out the navigation bar.
The bounce rate is also affected by advertising blocks flickering over or obscuring the main text, and by noisy audio that follows visitors from the start page of the site.
How to Get out of Panda Sanctions?
The first thing you should realize is that the changes you make will not immediately improve your site’s position in search results. Usually, the first results appear only a few weeks after the changes are made (since you don’t know when Google Panda was last updated).
The second thing to note, for those whose sites are subject to Panda’s restrictions, is the need to focus on creating content that is genuinely useful to the site’s target audience. What does useful content mean in the context of this algorithm? We have already described this in detail above.
Next Stage of Google Search Algorithm Improvement: Penguin Appearance
Following the Panda algorithm, which increased the accuracy of Google’s search robots, came the Penguin algorithm. It was first launched on April 24, 2012.
The main goal of Penguin was to optimize site ranking scenarios based on their credibility. Authority, in turn, was determined by the presence of a sufficient number of external links to verified sources. We have already discussed this aspect of ranking in sufficient detail in the previous paragraph.
Conversely, sites that artificially inflated their link mass by linking their pages to third-party sites with completely irrelevant content saw their rankings drop in search results.
If we talk about the differences between Panda vs. Penguin, we can see that the latter algorithm, of course, added new challenges for SEO specialists. However, the innovations of the Penguin algorithm do not sweep away all the policies that the Panda algorithm applies to sites. Therefore, in addition to doing link building, we strongly recommend that you not forget about the quality of the content you offer.
Penguin currently evaluates websites and external links in real time, rather than roughly once a month as it did before 2016, so you can see the impact of link building or content updates almost instantly.
Learn more about the Penguin algorithm from this video.
Penguin Website Evaluation Criteria
First, let’s talk about the correct link mass.
So, in the context of the Penguin algorithm, a link is a “vote” for your site. But who casts this “vote” also matters a great deal to the algorithm. If the link is placed on a trusted site with many visitors, Penguin will increase its confidence in your site. But if the link is posted on some local forum, especially as part of a blatant advertising post, it can push your site’s position down.
At one time, a popular tactic for tanking competitors’ rankings was posting links to their sites in the comments on porn sites. We hope that you will use this knowledge for informational purposes only.
However, having external links is far from all you need to know about the Penguin algorithm. The fact is that Google’s search robots respond much better to links wrapped in anchor text. Such text not only attracts readers’ attention but also conveys the theme of the target page to search robots, so click-through rates on these links grow.
You must admit: it’s one thing when you simply left a link to the site and quite another when you covered this link with a good phrase. For example, “visit our blog” is a call to action. The main thing is to insert anchors and links only in those places where it is necessary and not to allow spam.
The Penguin algorithm also has another, rather unobvious, signal for low ranking: a low rate of return visits.
The fact is that many users, having visited a site once, may never return to it, and this too serves as a kind of beacon for Google’s bots.
This can happen because of the same low-quality or useless content, or, for example, because prices for goods are too high compared with competing websites (if we are talking about promoting an online store).
How to Understand that Your Site Is Subject to Penguin Sanctions?
Now let’s try to figure out if your site is subject to the sanctions of the Penguin algorithm.
Using Website Checkers for SEO Professionals
The first thing we recommend is to use website checkers for SEO specialists, which let you evaluate such important indicators as the number and sources of external links. They are usually paid, but in addition to link building tools they contain many other useful SEO features. The most popular are Semrush and Ahrefs.
You can also use the completely free Google Search Console, which is available to all Google account owners. To find the external links pointing to your site, just open the “Links to your site” tab and view the report data. Perhaps this is the reason your web resource fell under the Penguin algorithm’s sanctions.
All of these services will help you find out exactly which sites link to yours. This means that if a competitor used black-hat SEO and placed a link to you on a website with 18+ content, you can take steps to save your site’s rating, for example, by turning to the Google disavow tool, which lets you reject inappropriate external links to your site.
Hiring Manual Beta Testers
Many website owners completely forget about the convenience of visitors when they are in pursuit of a unique interface. As a result, bounce rates are increasing for the simple reason that the site is inconvenient to use.
To get an objective assessment of the usability level of a web site, it makes sense to hire a team of third-party beta testers who will analyze the intuitiveness of the interface and navigation logic.
How to Get Out of Penguin Sanctions?
If you have realized that your site most likely lost its position due to the sanctions of the Penguin algorithm, you will have to deal closely with link building. The first and main thing to do is analyze your list of external links, disavow all inappropriate link sources using the disavow tool, and then redefine your link building strategy.
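The disavow tool accepts a plain-text file with one entry per line: either a full URL or a `domain:` prefix to reject an entire domain, with `#` marking comments. A minimal sketch (the domains below are hypothetical):

```text
# disavow.txt - uploaded through Google's disavow links tool
# Disavow a single spammy page:
http://spammy-forum.example/thread/123
# Disavow every link from an entire domain:
domain:link-farm.example
```

Prefer the `domain:` form when a site links to you from many pages; listing individual URLs is easy to leave incomplete.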
The easiest way to start is with proven sites where you can post links for free: question-and-answer forums, social networks, and other websites for multi-user interaction.
Then you can contact the moderators of sites such as Mashable, HubSpot, Investopedia, Entrepreneur, etc., which allow the placement of guest posts. However, this will cost money.
And finally, probably, you will have to optimize the site interface if the hired team of beta testers has come to the conclusion that some of its elements are too inconvenient or incomprehensible to use.
What Is Google Hummingbird? The Appearance of a Brand New Ranking Tool
In the early 2010s, developers around the world began actively working on machine learning and artificial intelligence, and these advances soon found their way into search engines.
On September 26, 2013, Google introduced a completely new search tool called Hummingbird (although formally it had been launched a month earlier). Its operating principle differed greatly from the previously introduced Penguin and Panda algorithms, so the news of its implementation was something of a landmark event for the global community of SEO specialists.
The fact is that, contrary to popular belief, the Hummingbird algorithm does not provoke serious changes in Google’s search results. Moreover, its sanctions are much milder than in the case of Panda and Penguin. Nonetheless, it is the compliance with the requirements of this algorithm that helps sites get into the top search results for the most specific user queries.
The main goal of the Google Hummingbird algorithm is to learn to better understand user requests based on context. Therefore, it does not have a direct impact on the decrease in the rating of sites.
From a technical point of view, the Hummingbird algorithm relies on two large-scale mechanisms: semantic search and the Google Knowledge Graph. Read more about semantic search and how it can be used in SEO.
The Google Knowledge Graph is a vast knowledge base that combines structured information obtained from industry specialists and associates it with specific entities. This increases the accuracy with which the Google search engine guesses the answers a specific user is looking for.
To understand how this graph works, consider the example of searching for an onion cream soup recipe. If you enter a query like “onion cream soup recipe” in the search bar, then, in addition to recipe links, you will also see the cooking time and calorie content of the dish. This means the Google Knowledge Graph has independently collected the data most important to users looking for such recipes.
Likewise, the artificial intelligence underlying the modern Google search engine understands that users searching for basic information about a city such as Paris very often also google its population statistics and area, so these figures appear right in the search results.
In fact, a semantic search using the Google Knowledge graph compares search results with the contextual queries of Internet users, thereby going beyond the specific meanings of individual keywords. All of this is controlled by machine intelligence. With the constant analysis of user queries, it guesses better and better what they are looking for on the Internet.
What does all this mean for SEO professionals? First of all, that when creating content for promoted sites you will have to take into account both the local features of your target audience and queries that fit a more general context. To understand what Internet users search for most often, just start typing the request you are interested in into the search bar and look at the autocomplete suggestions below it. Read more about local SEO.
You can learn more about this algorithm in this video.
Hummingbird Website Evaluation Criteria
Obviously, the main criterion for evaluating sites using the Hummingbird algorithm is the relevance of content to a specific target audience. The more specific your chosen topic for creating content is and the more you develop it, the more likely it is that your site will occupy one of the top positions in search results for the entered key request.
So, various tutorials, guides, studies, questionnaires, and articles with plenty of analytical data are especially popular with Google. Search robots also welcome articles that include videos and relevant images with corresponding alt attributes. Read more about image SEO.
How Do You Know if Your Site Is Subject to Hummingbird’s Sanctions?
Sites penalized under this algorithm include those whose content carries minimal semantic load: sites spammed with keyword strings such as “red caviar buy Houston” and sites with outdated content dating from last year or earlier.
How to Get Out of Hummingbird Sanctions?
Previously, Google Hummingbird updates occurred approximately once a month. Today, however, the core Google search algorithm re-evaluates sites on an ongoing basis rather than only when the next update rolls out.
At the same time, its operation has become much more lenient: the search engine’s policies are no longer aimed at punishment (most of the latest algorithm updates have been as soft as possible) but at building the correct order in the SERP and improving the quality of query interpretation.
Site owners are now hardly affected by filters. Instead, if a site does not comply with the policies of Panda, Penguin, or Hummingbird, it simply does not reach the top.
So, to promote your site to the top under the policies of the Hummingbird algorithm, you need to carefully align your content with the interests and queries of your target audience.
It is also very important to replenish the website with fresh articles, while touching upon pressing topics (coronavirus, economic decline in the country, etc.).
And, of course, watch your keywords: do not use them in unreadable strings or mention them in every paragraph of the text.
Universal Guidelines for Combating Algorithms
Finally, here is a set of recommendations for SEO professionals to help avoid the sanctions of the core Google algorithm, which incorporates the policies of Panda, Penguin, and Hummingbird.
- Regularly create high-quality, useful, and relevant content of substantial volume. What is high-quality, useful, and relevant content? You can find this information above. Nevertheless, we remind you that for good website promotion it is very important to post articles regularly: search robots take the freshness of published materials into account.
- Delete duplicate pages and articles. Check for duplicate content on your site. Google crawlers are extremely negative about page repeats. If copying some text blocks is a necessary measure (for example, you understand that it makes no sense to make different descriptions for similar products that differ from each other by just a couple of parameters), then the use of canonical tags will help you.
- Accumulate correct external links. Proper link building is a very difficult task, requiring time and effort on the part of your content makers. You can start with forums, social networks, and communities on social networks where people ask each other for advice, and then move on to building link mass through guest posting. We have already given a short list of sites for guest posting in the previous paragraphs. Buying links the way black-hat SEO experts do, on the other hand, is definitely not worth it: search robots will detect illogical links between your site and third-party sources and push it to the bottom of the search results.
- Optimize UI and UX. UI and UX optimization is not a question for SEO specialists but for the site’s web developers. Nevertheless, it is the SEO specialist who helps determine how suitable the site is for the average Internet user, whether on desktop or mobile. To do this, special metrics are calculated using SEO services, such as bounce rates, the number of clicks on site pages, etc.
- Reduce site load time. Of course, optimizing site performance is also a task for web developers. However, an SEO specialist can determine whether low page loading speed is one of the main reasons the site is missing from the top of search results. An acceptable page load time is considered to be no more than 3 seconds, with rendering starting within 1 second.
- Create responsive design for mobile devices. Given that more and more users surf the web on mobile devices, it is very important to have a website with a responsive design that adapts to any device format. You may need to check whether the site you are promoting works properly on small screens, since this may be the reason for a large number of bounces. Learn how to optimize your site for mobile devices.
- Fewer keywords, more focus on the pressure points of your target audience. Once and for all, forget about creating articles with lots of unreadable combinations of keywords; Google blocks such content on a massive scale. Instead, think carefully about what topics would really interest your target audience and how deeply you could cover them. Perhaps it makes sense to create a whole guide or a list of recommendations for solving one of your potential readers’ problems.
- Update obsolete content. Another sore point for website owners is the presence of outdated, irrelevant content. If possible, update it and create new articles using fresh data. Doing so also provides an almost inexhaustible source of new topics of potential interest to your target audience.
- Work through meta tags. Pay special attention to working with meta tags. For each article, you need to come up with a unique h1, title, and description involving your key phrases. The more accurately the key phrase is chosen, the better for your site. You also need to write alt attributes for the media content that usually accompanies articles.
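A minimal hypothetical sketch of what this looks like in markup (the page topic and phrases are invented for illustration; note that alt is an attribute of the img tag, not a tag of its own):

```html
<head>
  <!-- Unique title and description per page, built around the key phrase -->
  <title>How to Choose Screw Nuts: A Buyer's Guide</title>
  <meta name="description"
        content="A practical guide to choosing screw nuts by size, material, and coating.">
</head>
<body>
  <!-- One h1 per page, echoing the key phrase -->
  <h1>How to Choose Screw Nuts</h1>
  <!-- alt text describes the image for search robots and screen readers -->
  <img src="/img/screw-nut-m8.jpg" alt="Stainless steel M8 screw nut">
</body>
```

Each page on the site should get its own distinct title, description, and h1; duplicating them across pages undermines the effort.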
- Add internal links in articles. In addition to accumulating external links, you have to deal with the internal linking of the site. The most successful way to do this is to leave links with eye-catching anchors to previous articles in more recent publications. This also helps improve how Google’s search engine rates the site.
- Fix PHP errors, remove inappropriate HTML tags. Google really does not like pages with broken links and ranks web resources poorly when they send users to 404 pages or through chains of 301 and 302 redirects. Therefore, you will have to carefully check that all the URLs on your site are active and work as a visitor expects. Learn more about redirects.
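As a small illustration of how such a check might begin, the sketch below uses Python’s standard html.parser to collect every link on a page; the collected URLs could then be requested one by one to verify none of them return 404s (the HTML snippet is invented for the example):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect all href targets from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Anchor tags carry their destination in the href attribute
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = """
<p>See our <a href="/blog/seo-tools">SEO tools review</a> and
<a href="https://example.com/old-page">this older article</a>.</p>
"""

extractor = LinkExtractor()
extractor.feed(html)
print(extractor.links)  # every href found on the page
```

Dedicated crawlers do this at site scale, but even a small script like this makes it easy to audit a handful of key pages for dead or redirected URLs.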
In conclusion, we would like to present a table summarizing the main characteristics of the three algorithms described above: Panda, Penguin, and Hummingbird.
The main task of this article was to provide our readers with a general idea of the principles of the three algorithms that Google uses to this day: Panda, Penguin, Hummingbird.
In addition, we have tried to provide the most detailed list of ways to help you remove your site from Google’s penalty list and optimize its position in the search results.
Now tell us: have you ever faced sanctions from Google’s algorithms? If so, what methods did you use to overcome them? Share your success story with our readers in the comments below.