Google Sandbox: Does Google Really Hate Young Websites?

In 2004, SEO specialists noticed that Google would not rank new sites higher than the second page of the SERP. Some concluded that this was caused by a specific new Google algorithm, while others attributed it to something else.

The hypothetical algorithm was dubbed the Sandbox. Until around 2010, posts kept appearing on forums with examples that seemed to confirm its existence. Nowadays, SEO specialists use the term "Google Sandboxed websites" to describe the phenomenon of new sites ranking poorly, not to refer to a specific algorithm.

Let's look at the history of the Sandbox, why the effect exists, and how to avoid it.

What Is Google Sandbox?

The Google Sandbox is a mythical algorithm or penalty that supposedly keeps new sites out of the top positions of the search engine. It is called mythical because Google has never officially confirmed its existence or included it in the list of algorithms alongside Panda and Penguin. You can read more on the matter in the article Google Algorithms That Affect SEO.

The presumed penalty was named the Sandbox by analogy with small children playing in a sandbox: new sites are kept in a safe, limited space.

The Supposed Functional Principle

In the first several months of a new site's existence (up to two years at most), Google supposedly keeps information about it in a cache and either excludes it from the SERP for relevant queries or places it no higher than the second page. During this period, the site's ranking is not influenced by optimization, even though other search engines rank it normally.

How Long Does Google Sandbox Last?

The average stay in the Sandbox ranges from several weeks to a year. However, since this is not a confirmed algorithm, there is no precise data. If you are looking for an answer to the question "What does Sandboxed mean?", it is complicated. Setting aside the mythical penalty, the time it takes for a new site's ranking to grow depends on the niche, the number of competitors, and the optimization strategy. For more information, read the article SEO Competitive Analysis.

Your Website Is In a Sandbox If:

  • Google does not show the site's pages even when you search for queries that exactly match their titles.
  • Other sites or pages are ranked higher.
  • At first, the site holds a reasonable competitive position in the SERP, and then it suddenly drops by 30 to 500 positions.
  • By other metrics and in other search engines, the site is ranked normally.
  • There are no other reasons that could explain such behavior from Google.

How Is Google Sandbox Related to Google Chrome Sandbox?

In brief: almost not at all. The only thing they share is the Google name.

Google Chrome Sandbox mode is a real feature built into the Google Chrome browser. It is used for protection against malware.

How does it work? Chrome runs every site in an isolated process. This increases the load on the system; however, if malicious code appears in one process, it cannot affect the other components.

What is Chrome's no-sandbox mode (the --no-sandbox command-line flag)? It means the sandbox is switched off: the browser runs lighter, but it is less protected.

Does Sandbox Exist?

The Opinion of SEO Specialists

In the SEO community, there is no unanimous answer to the question of the Sandbox's existence. However, experts and SEO bloggers such as Neil Patel tend to believe that the Sandbox is a myth. Some do not cover the topic at all.

The Search Engine Watch team set out to show that the Sandbox does not exist: they launched a new blog in 2018, and within four months it reached the first page of Google search results.

Between 2005 and 2009, Moz published several pieces on the Sandbox, over the course of which the company's opinion shifted from "Yes, the Sandbox exists" to "No, this is a myth."

In 2005, Moz co-founder and former CEO Rand Fishkin wrote on the blog that the company's official website had been sandboxed for nine months before finally being set free.


In his post, he provided the links to site ratings according to several search requests at that moment. However, he didn’t show the metrics screenshots that could have proven that Moz was sandboxed at that time.

In 2006, several posts were published about the reasons and methods of fighting against the Sandbox on the Moz site.

The last article on the matter, together with supposed proof of the Sandbox's existence, was published by Fishkin in 2009. As an example, he chose a site owned by HubSpot. In the post, he compared the site's ranking for the same queries on Google, Yahoo, and MSN Live. For instance, for the query "twitter grader", the site appeared at positions 55, 1, and 2 respectively.

Later that year, Rand Fishkin updated the post, adding that after the site moved to another domain, its ranking became normal. Consequently, Fishkin concluded that the site had not been sandboxed. In the update, he also stated that this placed the final nail in the coffin of the Sandbox myth.

Meanwhile, a number of specialists, such as Edward Sturm, Marketing Director at World of Ether, suppose that even though there is no Sandbox penalty, there are algorithms that can act like one.

Andy Crestodina, co-founder of Orbit Media, agrees that Google does not yet trust new sites and that the system needs time to evaluate them before ranking them properly.

What Google Says

In 16 years, Google has never admitted that a Sandbox penalty exists. Since 2005, Google representatives have debunked the myth several times, albeit with some disclaimers.

In 2005, Matt Cutts, former head of Google's webspam team, commented on the matter for the first time. The news was reported as confirmation of the penalty; however, Cutts actually stated that the Sandbox does not exist, but that for some niches the algorithms can produce a similar effect.

In 2012, Cutts spoke on YouTube again about how Google works with young sites.

The word Sandbox was not mentioned in the talk; however, answering questions about how Google sees a new site and how important domain age is, Cutts said:

“The difference between a domain that’s six months old versus one-year-old is really not big at all. So long as you’ve been around for at least a couple of months, a few months, you should be able to make sure that you are able to show up results. (…) I would say it’s often good to go ahead and buy a website, put up a placeholder page to tell people what’s coming, and just go ahead and develop the website. And by the time you get your website live, often that’s two or three months down already.”

Read more detailed information on the matter in the article How to View Website Source Code.

In effect, Cutts gives a recommendation for escaping the Sandbox effect: register the domain early, so that by the time you start promoting the site it is already 2-3 months old.

In 2016, Gary Illyes, Webmaster Trends Analyst at Google, answered a Twitter user's question briefly and precisely.


Google's most recent statement was posted on Twitter in 2019 by Webmaster Trends Analyst John Mueller. Replying to a user's question, he once again denied the existence of the Sandbox.


Before that, Mueller had also given negative answers to similar questions: in 2017 on Twitter and in 2018 during a Webmaster Hangout stream. The full video is available on YouTube.

An excerpt from his comments on the Sandbox:

“With regards to Sandbox, we don’t really have this traditional Sandbox that a lot of SEOs used to be talking about in the years past. We have a number of algorithms that might look similar, but these are essentially just algorithms trying to understand how the website fits in with the rest of the websites trying to rank for those queries. (…) It’s always kind of tricky in the beginning when we have a new website and we don’t quite know where we should put it.”

One can conclude that other Google algorithms can affect new sites, but the search engine does not intentionally hold a site back in a cache only because it is new.

Why Can the Sandbox Effect Appear?

Neither Google representatives nor SEO specialists deny that it is hard for new sites to climb the rankings. However, the reason is not a search algorithm that holds young domains in a cache for the first several months; it is the combined influence of several algorithms, or a genuine Google penalty incurred by violating the search engine's rules.

In the initial period of a domain's existence, the search engine is trying to understand whether a new site is high-quality and relevant and deserves a high ranking. That is why Google may initially show it in high positions to test users' reactions, and then return it to an average level until growth stabilizes.

For instance, this is how it was with TheBlogging (the example comes from Search Engine Watch). The same happened with another domain.

Three main factors by which Google decides whether a domain deserves top positions in the SERP are expertise, authoritativeness, and trustworthiness (E-A-T). Besides these factors, the search engine also takes into account backlinks and other signals. That is why it is so hard for a new website to convince Google within a couple of months that it can be trusted. You can read more on the topic in the article Build High-Quality Backlinks.

The main aspects that can prevent the site from reaching top positions:

  1. Insufficient content: Google needs to crawl many pages to understand which niche your domain belongs to and which keywords it should be found for.
  2. Absent or weak user signals: when the audience is small, Google cannot collect enough behavioral data to factor it in.
  3. Insufficient link mass: backlinks are one of the defining factors for Google. If they are scarce, the search engine will not display your site in top positions.
  4. Low quality of the link mass: even if you buy a lot of backlinks in one month, Google will discount them for quality reasons.
  5. High competition: if there are already many players in your niche, reaching top positions is harder.

  6. Serious specialization: in niches such as legal and medical content, Google favors established, trusted sites because the information is crucial.

WebMD

The first site in the SERP for the query "medical site" was launched in 2005.

To conclude: Google's algorithms cannot quickly assign a high ranking to a young site and place it in top positions. That is why it is hard to promote new sites in their first several months: not because of a Google filter, but because the search engine is not yet familiar with them.

Sandbox vs Poor SEO

Here is a four-step Google Sandbox check:

  • Google your site with the site: operator (for example, site:yourdomain.com).

The system will display all of your site's pages with their titles and descriptions. If some pages are shown with only URLs or are hidden among the omitted results, the site may have technical errors.

  • Check the domain's ranking in other search engines such as Yahoo!, MSN, Bing, and Ask.
  • Compare the rankings in Google and the other engines. If the other engines also rank the site poorly, the problem is with the site itself. If the gap is large (top 10 in other engines but below position 60 in Google), Google is the cause.
  • Make sure the Sandbox effect actually applies to the whole site, not to a specific keyword or page.
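The steps above can be sketched as a small helper that builds the search URLs to open manually in a browser. The URL patterns, the example domain, and the example keyword are illustrative assumptions, not an official API:

```python
from urllib.parse import quote_plus

def sandbox_check_urls(domain: str, keyword: str) -> dict:
    """Build search URLs for a manual Sandbox check (illustrative only)."""
    site_query = quote_plus(f"site:{domain}")  # step 1: list indexed pages
    kw = quote_plus(keyword)                   # steps 2-3: compare rankings
    return {
        "google_index": f"https://www.google.com/search?q={site_query}",
        "google_rank": f"https://www.google.com/search?q={kw}",
        "bing_rank": f"https://www.bing.com/search?q={kw}",
        "yahoo_rank": f"https://search.yahoo.com/search?p={kw}",
    }

# Open each URL in a browser and note where the site appears.
for name, url in sandbox_check_urls("example.com", "best coffee grinder").items():
    print(f"{name}: {url}")
```

The comparison itself stays manual: automated scraping of search results violates the engines' terms of service, so the helper only saves you from typing the queries by hand.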

How Not to Get in the Sandbox or Quickly Get Out of It

Since the Sandbox effect is the result of multiple factors, it is difficult to control all of them when you launch a new site and present it to Google. Sites with viral content are in the best position, as they gain popularity quickly. If your niche limits you, there are methods that help you grow slowly but consistently. Read the article What Is Link Bait?

Design a New Site on an Old Domain

Using an old domain removes the influence of site age on ranking. Domains for sale can be found on services such as Flippa. For this to work, the old domain should match your niche, have been inactive for the last several years, and have a clean history: no Google filters should have previously been applied to it.

How to check the domain you are about to purchase:

  • Check the domain age, for instance with Small SEO Tools. It is better to consider domains that are older than 5 years.
  • Check whether the site is indexed in Google using the site: operator.
  • Check how the site ranks for keywords and whether it has backlinks. The keywords should match your niche, and the backlinks should not be spammy. You can run this check with SEMrush.
  • Check whether Google filters have been applied to the site; this can be done in Google Search Console, though the service requires you to verify that you are the site owner.

Inform Google About New Content

Google does not crawl new content immediately. If you do not inform Google about changes on the site, it will discover them on its own, but not right away; for new sites, this may take at least two days.

You can manually ask Google to recrawl the site through the URL Inspection tool (for one link) or a Sitemap (for several links) in Google Search Console. The process can be automated with the Google XML Sitemaps plugin for WordPress or the Ping-O-Matic tool.
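As a sketch of what such tools did under the hood: Google long supported a simple sitemap "ping" endpoint. Note that Google retired this endpoint in 2023, so treat the snippet as a historical illustration; today the reliable route is submitting the sitemap in Search Console:

```python
from urllib.parse import quote_plus

def sitemap_ping_url(sitemap_url: str) -> str:
    """Build the classic Google sitemap ping URL (endpoint retired in 2023)."""
    return "https://www.google.com/ping?sitemap=" + quote_plus(sitemap_url)

ping = sitemap_ping_url("https://example.com/sitemap.xml")
print(ping)
# Sending the request (back when the endpoint was live) would have been:
# import urllib.request; urllib.request.urlopen(ping)
```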


Content sharing on social networks also speeds up crawling.

Focus on Quality Content with References to Authoritative Resources

Users value content that is useful and genuinely original, not in the sense of passing plagiarism checkers, but in its actual substance. If you write about something, provide a reference and a link. This way, you kill three birds with one stone:

  1. It is useful to readers, who can follow the link and read more on the matter.
  2. It shows that your information comes from an authoritative source.
  3. The site you mention may share your link in return (if the site is very popular, the chances are low, but not zero).

Some site owners fear that users will click these references and leave their resource. But if your content is high-quality, users who follow a link will check what they wanted and come back.

When Google crawls a domain, it follows all the links on it. If your content links to authoritative resources, this works in your ranking's favor.

When adding such links, it is also important to:

  1. Make sure the link leads to quality content.
  2. Add links only when necessary; do not spam with irrelevant ones.
  3. Use the brand name or the URL as the anchor text.
  4. If possible, let the site owners know that you have mentioned them.

Do Not Stop Working on the Site

The trouble with the Sandbox is that it is very easy to conclude that Google hates you and to blame a mythical algorithm instead of a flawed strategy. However, if you stop optimizing the site, you will never "get out of the Sandbox" on your own when the time comes.

If several months of SEO work bring no results, re-analyze your strategy and change it if necessary. An outside perspective can help, so don't be afraid to ask an expert for advice.

If you have good reasons to stick with your current strategy, or you have only just begun implementing it, carry on and keep an eye on how the results change.

About author
Viktoriia Pushkina is a writer with 5+ years of experience in the field. She started in 2014 as a copywriter and now works as a content and blog writer and freelance journalist. She specializes in writing about SEO, technologies, culture and society.