Perhaps the most contentious and difficult issue in the current SEO industry is Google's "Sandbox" filter, an inaptly named effect that has come to mean:
"The penalty or devaluation in the Google SERPs of sites with SEO efforts begun after March of 2004."
The date of March 2004 is variable, with SEOs claiming it started in Jan. or Feb. (some even claim the famous Florida update in Nov. 2003 is to blame) and others saying it started only in late April or May. March, however, has become the most popular date of identification and serves well as a reference point. There is also contention about whether the penalty is only on new sites, or applies to somewhat older sites (those started in late 2003, etc.). One point of agreement seems to be that sites whose major SEO efforts began after March of '04 are nearly universally affected.
In addition to time-frame issues, measuring the exact effects of the sandbox is exceptionally difficult, fueling speculation, contention & misunderstanding among SEOs. A daunting, hard-to-follow history is to blame for much of the confusion, and even for the original creation of the term itself.
History of the Sandbox
Originally, the 'sandbox' factor was used to describe an odd phenomenon noticed by SEO professionals in conjunction with the Hilltop algorithm. The effect was first introduced to the SEO community at large by Garret French of WebProNews, who wrote Google "SandBox Effect" Revealed on May 6. A second, oft-referenced article by Wayne Hulbert of SEOChat was published a month later, on June 9 - The Sandbox Effect: Not a Nice Place to Play. These articles both focused on the implementation of the Hilltop Algo & how its use created a 'sandbox' where non-expert sites went to play - away from the regular SERPs.
Loosely, Hilltop says that to find an "expert document" on a specific subject, you must have at least 2 "expert" sources linking to the document. For one-, two- and even many three-keyword phrases, the Hilltop algorithm works correctly, but for longer strings its logic fails and no "expert documents" can be found. This was presumed to account for the original "sandbox" factor, which found new sites ranking well for search terms when nonsense gibberish was appended. For example:
- A search for Microsoft Keyboards would return a set of results that did not include the new(er) sites while
- A search for Microsoft Keyboards -alksdjf -adijfsh -dfjh -aiodsfg would return a different set of results that included the new(er) sites that the SEOs suspected of being in the "sandbox"
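The exclusion trick above is easy to reproduce by hand, but a small sketch makes the mechanics explicit. The helper below (a hypothetical function, not anything Google provides) appends random nonsense exclusion terms to a phrase, on the assumption that random lowercase strings will match no real pages and so only trip up the longer-string handling:

```python
import random
import string
from urllib.parse import urlencode

def gibberish_exclusion_query(phrase, n_terms=4, term_len=7, seed=None):
    """Append random nonsense exclusion terms (e.g. '-alksdjf') to a search
    phrase, reproducing the 'pre-sandbox results' trick described above."""
    rng = random.Random(seed)
    terms = ("".join(rng.choice(string.ascii_lowercase) for _ in range(term_len))
             for _ in range(n_terms))
    return phrase + " " + " ".join("-" + t for t in terms)

query = gibberish_exclusion_query("Microsoft Keyboards", seed=42)
print(query)  # the plain phrase followed by four '-xxxxxxx' exclusions
print("https://www.google.com/search?" + urlencode({"q": query}))
```

Comparing the result set for the plain phrase against the result set for the gibberish-appended phrase was, at the time, the quickest way to see whether a new(er) site was being held back.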
The initial effects were also noticed to follow a ranking pattern. Initially, the site would rank very well for its keyword phrases for 1-3 weeks, after which it would not appear in the top 1000 results for almost any keyword phrase. Naturally, this sparked speculation of a specific "punishment" for new(er) sites.
Over the past 6 months, the "sandbox" has evolved to have a very different meaning. The original search with exclusion factors no longer displays "pre-sandbox" results, sites have been in the "sandbox" for far longer than 3-6 months, and some (albeit rare) sites appear to have evaded the filter entirely or been only slightly affected by it. This has sparked even greater controversy and confusion as SEOs look to emulate the few sites that have escaped while continuing to suffer great traffic losses under the sandbox's effects.
Introduction of The March Filter
In a thread from early September of 2004, I wrote - No More Sandbox - Let's say the March Filter. This thread sparked additional discussion and sadly, more confusion across several of the SEO forums. The goal was to remove the "sandbox" term, as technically it refers to a test environment for software and no longer corresponds to the current penalties applied to new(er) sites.
The March Filter discussion did prompt a more careful examination of the sandbox and even sparked the creation of the SEOmoz Website. Although results will take some time to arrive, the sandbox's definition has moved ahead and certain consistent features that stand out can now be used to track & predict its effect. The March Filter discussion was followed up by several theories that coalesced around the idea that backlinks, rather than sites, were being penalized.
BLOOD - Back Link Over Optimization Devaluation vs. TLD - Temporary Link Devaluation
The new term BLOOD, also called BLOOP (with penalization rather than devaluation), refers to the factors that are speculated to affect new(er) sites or new(er) links. This theory is based on the idea that an SEO who builds many new links to a site very quickly, following typical SEO patterns (such as similar anchor text, links from high PR sites, etc.), is subject to penalization for "over-optimizing" the site's inbound links.
A second term, TLD (Temporary Link Devaluation), surmises that new links have only 5-10% of the full value of normal links and that, over time, they gain their full weight in the algorithm. If TLD exists, it would suggest that newer links are not as relevant or helpful as old, established links. This argument conflicts with the theory described by Dr. Garcia (Orion) at SearchEngineWatch in a thread titled - Temporal Link Analysis (TLA).
TLA purports to help search engines return more relevant results by adding a time analysis component to the value of a link. However, if speculation about the 'sandbox' factor holds true, it would suggest that TLA is not yet being included in Google's algorithm, or that sites suffering from sandboxing are not benefiting from it. In the same thread, Orion points out that the TLA patent is currently held by IBM, so it is likely not a part of any search engine algorithm at this point.
Many people have speculated that the sandbox is nothing more than the effects of temporary link devaluation or link over-optimization. This theory would be illogical if it were clear that some sites 'escape' the sandbox filter, but as dissension still arises as to whether a filter even exists, it cannot yet be put to rest.
Are you in the Sandbox?
The first step for SEOs who believe they may be suffering from the sandbox penalties is to rule out all other factors that could be influencing rankings. Many SEOs have mistakenly assumed that the sandbox is to blame, when their SEO work still has a long way to go. In order to rectify this problem, I've created a set of criteria to judge your site against when attempting to determine if the sandbox is affecting you.
- Are you ranked #1,2 or 3 for an allinanchor search for your keyword phrase?
- Are you ranked #1-10 at Yahoo! & MSN?
- Is your visible toolbar PageRank +/- 1 unit away from your top 10 competitors?
- Are you ranked in the top 3-5 results for a search of 6-10 keywords that exactly match your page title? Try this several times with several pages on your site that Google has cached.
If you answer "yes" to all of the above questions, yet your site is not ranking in the top 10-20 for your keyword phrase, you could be under the impact of the sandbox effect. Remember, however, that the sandbox is a "sitewide" effect and will not impact only a single keyword phrase or page. Therefore, consider these other signs of "sandboxing" before ruling out other possibilities:
- Pages on your site do not rank in the top 50 even for very long or obscure searches that match your page titles; i.e. for this page, if I were to search for Guide to SEO > The Sandbox, the March Filter and not find this page in the first 50 results, I could assume the sandbox was at work (make certain the page is in Google's index before trying this).
- Try the allintext, allintitle & allinurl searches for your keyword phrase. If your site consistently comes up in the top 5, but isn't ranking well for the keyword phrase, the sandbox may be to blame.
- If your site is ranking fairly well - top 50 - for a particular keyword phrase, try typing in the phrase in "quotes". Generally, you will see your site show up much farther back in the results - another possible sign of sandboxing.
- Finally, SEOmoz has created a sandbox detection tool that can be used to detect many of the common sandbox features in a website's rankings.
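The checklist above boils down to building a handful of query variants and comparing where your site ranks in each. As a rough sketch (the query names, the rank thresholds and the `looks_sandboxed` rule are my own speculative simplification of the signs listed above, not a definitive test), the diagnostic could be organized like this:

```python
def sandbox_diagnostic_queries(phrase):
    """Build the set of Google query variants used in the checklist above.
    Each must be run by hand (or via an API) and the site's rank recorded."""
    return {
        "allinanchor": f"allinanchor:{phrase}",
        "allintitle": f"allintitle:{phrase}",
        "allintext": f"allintext:{phrase}",
        "allinurl": f"allinurl:{phrase}",
        "quoted": f'"{phrase}"',
        "plain": phrase,
    }

def looks_sandboxed(positions):
    """Speculative rule of thumb from the article: strong allin* rankings
    combined with a weak plain-phrase ranking suggest sandboxing.
    `positions` maps query name -> rank (None if unranked)."""
    strong = all((positions.get(k) or 999) <= 5
                 for k in ("allinanchor", "allintitle", "allintext", "allinurl"))
    weak_plain = (positions.get("plain") or 999) > 20
    return strong and weak_plain
```

For example, a site ranking #1-5 across all the allin* searches but sitting outside the top 20 for the plain phrase would return `True` here, matching the pattern the checklist describes; remember that the sandbox is described as sitewide, so this check should hold across several pages and phrases before drawing conclusions.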
What to Do when Sandboxed
If you believe the sandbox is affecting your site, you should not give up hope. There have been several examples in the past of sites breaking free from the effect and turning up high in the SERPs. Sadly, no common factor has been noted in breaking free, but several theories are worth experimenting with - especially since you can't make things worse...
- Get more deep links - Get your linking partners, ads, etc. to link to deeper pages in your website, rather than the homepage.
- Slow link-building down to 2-4 new links per week.
- Only add new links that could be seen as fully natural - make sure they have related text surrounding the link.
- Turn off or cease buying sitewide-style links - these are estimated to hurt a site's rankings.
- Build Content and a good internal linking structure
- Obtain links from sources beyond reproach - news articles, DMOZ listings, link from high PR and highly reputable sources, sometimes these can be obtained by 'donating' to a foundation or cause...
Remember that these factors are only the subjective, collective theories of some in the SEO community and should not be presumed to be authoritative. Experiment, test and try again - it's the only strategy we currently have. In the future, the results of the SEOSurvey should provide additional guidance.
The filter on new sites, by whatever name, is a strong, overriding factor for many in the SEO industry. By properly recognizing its effects, documenting as much as we can & constantly experimenting with new ideas, an eventual victory seems likely. Unfortunately, the sandbox does not appear to be loosening its grip - more and more SEOs are finding themselves strangled by it, creating a frustrated community. As evidence arises from sites that have escaped, I will post more info in the blog section.