1. Robots and Meta Tags
The first and simplest possibility is that your robots.txt file has been changed to prevent search engines from crawling your site, or that your meta tags are directing search engine robots to exclude it. While this is highly unlikely, it is best to rule out first. So check your robots.txt file (if you have one) and your meta tags. Unless you want your site hidden, you should never see this in your meta tags: <meta name="robots" content="noindex">. If you see this, you are blocking your site from Google.
You can also ban your own site by having a robots.txt with the wrong code. Two examples of robots.txt code are below.
This example allows all robots to visit all files because the wildcard "*" specifies all robots.
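In robots.txt syntax, the allow-all file looks like this (an empty Disallow line blocks nothing):

```
User-agent: *
Disallow:
```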
This example keeps all robots out:
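Here the single slash in the Disallow line covers the site root, so every path is off-limits to every robot:

```
User-agent: *
Disallow: /
```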
Read more about this at: http://en.wikipedia.org/wiki/Robots.txt
2. Cloaking (A Big Google No-No)
Straight from Google's website: "The term "cloaking" is used to describe a website that returns altered web pages to search engines crawling the site. In other words, the web server is programmed to return different content to Google than it returns to regular users, usually in an attempt to distort search engine rankings. This can mislead users about what they'll find when they click on a search result. To preserve the accuracy and quality of our search results, Google may permanently ban from our index any sites or site authors that engage in cloaking to distort their search rankings."
If your website or web pages are set up to display different information to a search engine spider than to a real person, you are cloaking. Cloaking delivers one version of a page to an Internet user and a different version to a search engine. The cloaked page is packed with the keywords and terms the site wants to rank highly for, so, in essence, the site is cheating. There are legitimate reasons for serving different content as well, such as targeted advertising, but if you are trying to manipulate your rankings you should put an end to this immediately.
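To make the mechanism concrete, here is a minimal sketch (purely an illustration of the pattern Google penalizes, not something to deploy; the page strings and the crude User-Agent check are our own) of server-side logic that serves one page to crawlers and another to people:

```python
def choose_page(user_agent: str) -> str:
    """Return different HTML depending on whether the visitor looks like a
    search engine crawler -- this mismatch is exactly what "cloaking" means."""
    crawler_version = "<html>keyword keyword keyword ...</html>"  # stuffed copy for the spider
    human_version = "<html>The normal page people actually see.</html>"
    if "Googlebot" in user_agent:  # crude crawler check on the User-Agent header
        return crawler_version
    return human_version
```

Because the crawler and a human visitor receive different documents, the index no longer reflects what users see, which is the mismatch the quoted policy targets.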
3. Duplicate Content or Websites
If Google finds multiple web pages with the same content, it may penalize each website involved. Of course, someone may have copied your content, leaving you banned even though the original was yours. Make sure no other site is using your content. You can check by performing a Google search on some of your text with quotation marks (") around it. If you do find someone using your original copy, visit this page to learn more about copyright infringement: http://www.google.com/dmca.html.
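That quoted-phrase check can be scripted. A small sketch (the helper name is ours) that builds an exact-match Google query URL for a sentence of your copy:

```python
from urllib.parse import quote_plus

def exact_phrase_search_url(phrase: str) -> str:
    """Build a Google search URL that looks for the phrase verbatim.
    Wrapping the phrase in quotation marks forces an exact match."""
    return "https://www.google.com/search?q=" + quote_plus(f'"{phrase}"')
```

Open the resulting URL in a browser; any hit on a domain other than yours is worth investigating.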
You can check here to see if your site has been duplicated unbeknownst to you: http://www.copyscape.com
4. Hidden Text and/or Links
How can text be hidden? There are a variety of ways, some sneakier than others, but it boils down to this: text or a link is considered hidden if it is invisible to the website visitor but can be seen by search engine spiders. This used to be done quite often, for example by making text white on a white background or using cascading style sheets (CSS) to hide it, but search engines can easily spot these tricks today, so it is best to avoid them altogether.
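As a sketch, here is a crude scan for the inline-CSS tricks described above (the pattern list is ours and far from exhaustive; real crawlers also compare computed text and background colors):

```python
import re

HIDDEN_PATTERNS = [
    r"display\s*:\s*none",           # text removed from the rendered page
    r"visibility\s*:\s*hidden",      # text rendered but invisible
    r"text-indent\s*:\s*-\d{3,}px",  # text shoved far off-screen
]

def flags_hidden_text(html: str) -> bool:
    """Return True if the HTML uses an obvious inline-CSS hiding trick."""
    return any(re.search(p, html, re.IGNORECASE) for p in HIDDEN_PATTERNS)
```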
5. Keyword Spam and Keyword Stuffing
Ever seen a web page with a very awkwardly written first paragraph where a certain word is repeated ad nauseam? Here's an example:
"We sell the best father's day gifts for father's day. If you like to celebrate father's day we can help with the best father's day gifts for father's day."
Care to guess which keywords are being targeted? This is keyword spamming, or stuffing, but it is just the tip of the SEO iceberg. That is only the content on the page; there is probably keyword stuffing happening in the code as well: in the meta tags, invisible text, alt tags, title tags, comment tags, etc. If the word or phrase is repeated too often, Google can apply a filter to reduce the site's rankings or simply ban the site. Keyword density can be tricky but, as a general rule, Big Oak shoots for 1% to 5% of all text on a page to be our targeted keywords. Ultimately you must write for the reader, not the search engine. Be sure the keywords flow naturally.
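That 1% to 5% rule of thumb is easy to check. A minimal sketch (whitespace tokenizing only; a real tool would strip HTML and punctuation first):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that are exactly `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)
```

Anything well above the top of that range is a sign the copy was written for the spider rather than the reader.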
6. Doorway Pages
Defining a doorway page can be difficult, so here is our definition of the kind that could get your site banned from Google: a page created to attract search engine spiders and rank highly for its targeted keywords. Real visitors find this page in the search results and then continue to the "real" website from there, hence the name "doorway page". These pages are usually absent from the site's navigation. If you come across a page where much of the information is duplicated from other pages on the site and differs only in its keywords, it is most likely a doorway page.
As you can see, this can be a gray area. Some pages on a website may focus on a particular subject and be innocent of trying to lure search engine spiders for high rankings. Err on the side of caution and make sure the page is useful and part of your site's navigation.
7. Redirect Pages
Sneaky redirect pages are set up in groups of anywhere from five to hundreds. They all target similar and related keywords or phrases. Usually, the only links on these pages point to other pages in the same family, creating a false sense of related linking.
These pages don't necessarily contain content that any human would be interested in. These pages may show up high in Search Engine Results Pages (SERPs), but when you click on one of these pages from the SERPs, you will be redirected to another page. In other words, the page you click to see is not the page you actually get to read.
The redirect can be automatic, done with a meta refresh command, or triggered through other means, such as JavaScript that fires when the mouse moves on the redirect page.
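The meta refresh variant is easy to recognize in page source. A sketch (our own regex, deliberately simplified; it ignores redirects done in JavaScript) that flags near-instant meta refreshes:

```python
import re

META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv=["\']?refresh["\']?[^>]*content=["\']?(\d+)',
    re.IGNORECASE,
)

def instant_meta_refresh(html: str) -> bool:
    """True if the page declares a meta refresh with a delay of 2 seconds or
    less -- the near-instant bounce typical of sneaky redirect pages."""
    match = META_REFRESH.search(html)
    return bool(match) and int(match.group(1)) <= 2
```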
8. Buying Links
While buying links may not get you banned, it can certainly hurt your PageRank. Google has slowly been catching on to this fad and has measures in place to put your site in limbo for 6-8 months (known as the "sandbox effect"), so you can't instantly benefit from buying links to your website. Many sites that sell links are being devalued by Google, making an investment in this strategy a waste of money and time. Ultimately, stay away from buying links to increase your ranking.
9. Linking to Bad Neighborhoods
Link campaigns are a good thing when done correctly; we would say they are a necessity in today's SEO world. But linking to bad neighborhoods is a sure way to lose your rank in Google. If you aren't careful about whom you link to, you can easily disappear overnight. Basically, while you may be ethical and do everything right, linking to someone who isn't can be considered guilt by association. Always verify your links to other sites. Make sure they have PageRank in Google and are indexed by Google; try searching for their URL to see if they are. Avoid linking to any sites that use spamming techniques to increase their search engine rankings. Regularly checking the outbound links from your site and removing any offenders is a good idea.
A few site types to avoid:
- Free-for-all link farms
- Adult sites
- Gambling sites
10. Code Swapping
This means optimizing a page for a top ranking, then swapping another page into its place once that ranking is achieved.
What does Google say?
"Don't deceive your users, or present different content to search engines than you display to users," Google says, and they list some bullet points on avoiding being banned.
- Avoid hidden text or hidden links.
- Don't employ cloaking or sneaky redirects.
- Don't send automated queries to Google.
- Don't load pages with irrelevant words.
- Don't create multiple pages, subdomains, or domains with substantially duplicate content.
- Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content.
Google also states:
"Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, 'Does this help my users? Would I do this if search engines didn't exist?'"
While creating a page without a thought to search engines is probably going a little too far, optimizing your site for an organic search, as long as it conforms to their standards, is perfectly acceptable.
How to get back into Google
The old link I used to recommend, http://www.google.com/support/bin/request.py, no longer offers the "I'm a webmaster inquiring about my website" option that allowed you to request reinstatement.
However, logging in to Google Sitemaps now shows a direct link at the bottom of your main account page to "Submit a reinclusion request", which takes you here: https://www.google.com/webmasters/sitemaps/reinclusion?hl=en
This means you will have to register your site with Google Sitemaps to do this.
From there you get to check boxes that let you admit guilt, acknowledge modification, agree not to do it again, and even a box to explain yourself.
You don't have to contact Google but it can't hurt. They will eventually spider your site again and see that you have cleaned up your website. You may have to wait a few months for Google to re-index your site so be patient and don't tinker with your website too much unless dictated by your site's products or content needs.
The worst-case scenario is starting over with a new site. Sometimes this can be necessary, but only in the most extreme cases.