I was doing some SEO research for a new client's website, looking into evidence for the devaluation of sitewide links. A sitewide link is a link that appears on every page of a single website. In the past sitewide links were great for SEO, and they still help, but they are becoming less effective on Google. Matt comments on Google's devaluing of sitewide links in a few posts. From my experience, if an older site already has a large number of other links and then picks up a link on, say, 50% of the pages of another website, Google will count them all, and this will be somewhat more valuable than a single link. For a new website, though, avoid sitewide links on Google. If you go from having a few links to having a pile of links all from one website, it appears that Google will completely devalue those links, because they look like they were paid for, and one of Google's current crusades is to cut down the power of paid links.
Buying and selling links on the "black market" is currently one of the major tactics of modern SEO. You build a good site, optimize it for the search engines, get creative, attract as many links as possible, and then buy the rest to give yourself that push above the competition. Although Matt talks about how bought links won't help you, I believe a lot of this is wishful thinking, or a prediction of the near future. For now, as long as the publisher doesn't label your link with "advertisement," "sponsor," "link partner," or the like, the link will still count, especially if it sits in the body of their content with normal anchor text. Of course, you had better be ready to pay a pretty price for that service, because they will actually be sending you some of their traffic and promoting your product, as opposed to just providing a link for SEO.
Some Tricks on Devaluing Links

The short list below highlights ways of diminishing or nullifying the value of a link to your site from another Web page.
Meta Tag Masking
This old trick simply used CGI scripts to hide the Meta tags from browsers while still letting the search engines see them.
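As a rough illustration of how the masking works, a CGI-style script can decide what to print based on who is asking. This is a sketch of the trick rather than anything you would want to deploy, and the user-agent tokens and page content are hypothetical:

    import os

    CRAWLER_TOKENS = ("googlebot", "slurp", "msnbot")  # assumed crawler signatures

    def build_page():
        agent = os.environ.get("HTTP_USER_AGENT", "").lower()
        is_crawler = any(token in agent for token in CRAWLER_TOKENS)
        # Crawlers get the restrictive Meta tag; browsers never see it.
        meta = '<meta name="robots" content="noindex, nofollow">' if is_crawler else ""
        return "<html><head>" + meta + "<title>Links</title></head><body>Our partners...</body></html>"

    print("Content-Type: text/html")
    print()
    print(build_page())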
Robots Meta Instructions
Using noindex and nofollow attributes lets the novice link partner see the visible page with their link while telling the search engines to ignore the page and the links found on it. Nofollow can also be used while still allowing the page to be indexed, which gives the impression that the search engines will eventually count the link.
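A quick way to check a partner page for these instructions is to fetch it and read the robots Meta tag yourself. A minimal sketch, assuming the page is plain HTML and reachable over HTTP; the regex is a naive approximation rather than a full parser, and the URL is a hypothetical example:

    import re
    import urllib.request

    def robots_meta(url):
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        # Naive pattern: assumes name="robots" appears before content="..."
        pattern = r'<meta[^>]*name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']'
        return re.findall(pattern, html, flags=re.IGNORECASE)

    directives = robots_meta("http://example.com/links.html")  # hypothetical partner page
    if any("noindex" in d.lower() or "nofollow" in d.lower() for d in directives):
        print("Warning: this page tells crawlers to ignore it:", directives)
    else:
        print("No restrictive robots Meta tag found:", directives)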
The Nofollow Attribute

This is not a real attribute under the HTML standards; rather, it is an attribute adopted by the search engines to help identify which links should not be followed. It is often used on blogs to prevent comment and link spam. The link will appear on the Web page and in the search engine's cache, but it will never be counted.
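To see whether your own link has been tagged this way, you can list every anchor on the partner page that carries rel="nofollow". A small sketch using Python's standard HTML parser, again with a hypothetical partner URL:

    from html.parser import HTMLParser
    import urllib.request

    class NofollowFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.flagged = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                d = dict(attrs)
                if "nofollow" in (d.get("rel") or "").lower():
                    self.flagged.append(d.get("href"))

    html = urllib.request.urlopen("http://example.com/links.html").read().decode("utf-8", "replace")
    finder = NofollowFinder()
    finder.feed(html)
    print("nofollow links:", finder.flagged)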
Dynamic Listings

Dynamic listing means links appear randomly across a series of pages. Each time a link turns up on a new page, the search engines treat it as a fresh link, and it is entirely possible the link won't be on the same page at the next search engine visit. So a link from a partner displaying rotating, dynamic link listings rarely helps.
Floating Lists

This one is easily missed when checking link partners. Your link could be number one today, but as new link partners are added your link moves down the list. This hurts because links near the bottom of a list are considered less valuable than links at the top. With a floating list, your link can even be moved to a new page whose PR value is significantly lower or non-existent, and the new page may not be visited and indexed for months.
Old Cache Dates

The caching date Google provides indicates the last time the page was cached. Pages with lower PR values tend to be visited and cached less often than pages with medium to high PR values. If the cache is more than six months old, it is safe to surmise that Google has little or no desire to revisit the page.
Denver Pages

While Denver, CO is a nice place to visit, Denver Pages are not a place you want to find your link in a trade. A Denver Page typically has a huge number of links grouped into categories on a single page; some people call it the mile-high list. These pages carry no real value in the search engines and are not topically matched to your site.
Muddy Water Pages
These are dangerous but easy to spot. Your link is piled in with non-topically matched links in no particular order, as if someone threw all the links in the air to see where they would land. These are even worse than Denver Pages.
Cloaking

Cloaking is the practice of serving one page to people while serving a different page to the search engines. You could be seeing your link on the Web page while the search engines never see it, because they are handed a different copy. Checking Google's cache is the most reliable way to catch this ploy.
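If the cloaking keys off the User-Agent header alone, you can also catch it by fetching the page twice, once as a browser and once as a crawler, and comparing. A rough sketch with hypothetical URLs; IP-based cloaking will slip past this check:

    import urllib.request

    def fetch(url, agent):
        req = urllib.request.Request(url, headers={"User-Agent": agent})
        return urllib.request.urlopen(req).read().decode("utf-8", "replace")

    URL = "http://example.com/links.html"  # hypothetical partner page
    MY_LINK = "http://www.mysite.com/"     # hypothetical traded link

    browser_copy = fetch(URL, "Mozilla/5.0")
    crawler_copy = fetch(URL, "Googlebot/2.1 (+http://www.google.com/bot.html)")
    print("browser sees link:", MY_LINK in browser_copy)
    print("crawler sees link:", MY_LINK in crawler_copy)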
Robots.txt Cloaking

This can be done easily with server-side scripting such as PHP and is rarely easy to catch. Here, a person who requests the robots.txt file receives a copy that contains no exclusion instructions, but when the search engines request the file they receive the exclusion instructions. In this situation the links pages will never be indexed, and you'll never know why without expert assistance.
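The same comparison can be applied to robots.txt itself, again assuming the cloaking is based on the User-Agent header rather than the requester's IP address. The URL is a hypothetical example:

    import urllib.request

    def fetch(url, agent):
        req = urllib.request.Request(url, headers={"User-Agent": agent})
        return urllib.request.urlopen(req).read().decode("utf-8", "replace")

    ROBOTS = "http://example.com/robots.txt"  # hypothetical partner site

    as_browser = fetch(ROBOTS, "Mozilla/5.0")
    as_crawler = fetch(ROBOTS, "Googlebot/2.1 (+http://www.google.com/bot.html)")
    if as_browser != as_crawler:
        print("robots.txt differs by user agent -- possible cloaking")
        print("copy served to the crawler:")
        print(as_crawler)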
Meta Tags and Robots.txt Confusion
Which set of instructions carries the most weight? Don't know the answer? Shame. The search engines do. Keep in mind that robots.txt is read first, so a page it blocks may never be fetched at all; when the two conflict on a page the crawler is allowed to fetch, the Meta tags on the page are typically treated as the rule to follow.
Links in the Head
While these links are not counted by the search engines and do not show up on the Web page, they are counted by scripts or programs designed to verify that links exist, because those programs only look for the URL somewhere in the page's source code.
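To see whether a traded link is buried in the head, look at where the URL sits relative to the closing </head> tag instead of just grepping the source. A naive sketch with hypothetical URLs:

    import urllib.request

    URL = "http://example.com/links.html"  # hypothetical partner page
    MY_LINK = "http://www.mysite.com/"     # hypothetical traded link

    html = urllib.request.urlopen(URL).read().decode("utf-8", "replace").lower()
    link = MY_LINK.lower()
    head_end = html.find("</head>")
    body_part = html[head_end:] if head_end != -1 else html

    if link not in html:
        print("link not found anywhere in the source")
    elif link in body_part:
        print("link appears in the body of the page")
    else:
        print("link appears only inside <head> -- a verifier counts it, visitors never see it")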
Invisible Links

This is a nasty trick, though it can be an honest mistake. The links exist and are counted by the search engines, but they are neither visible nor clickable on the Web page, so the link sends you no traffic.
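A partial check is to look for anchors hidden with inline styles such as display:none. This sketch (hypothetical URL again) will miss links hidden through external stylesheets or scripts, which really need a browser to detect:

    from html.parser import HTMLParser
    import urllib.request

    HIDING = ("display:none", "visibility:hidden", "font-size:0")

    class HiddenLinkFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.hidden = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                d = dict(attrs)
                style = (d.get("style") or "").replace(" ", "").lower()
                if any(h in style for h in HIDING):
                    self.hidden.append(d.get("href"))

    html = urllib.request.urlopen("http://example.com/links.html").read().decode("utf-8", "replace")
    finder = HiddenLinkFinder()
    finder.feed(html)
    print("links hidden with inline styles:", finder.hidden)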
The goal of trading links is to trade equal value. Understanding the ways people try to avoid passing quality value from their Web page to yours can help you steer clear of these useless links. If a link partner pulls underhanded tricks, the links they trade you are worthless.
While you may never be an expert in all the latest tricks, traps and tests, you can now be an expert in the thirteen mentioned above. Making sure your link partners are not using these tactics will improve the quality of the links you gain from other Web pages. With quality links pointing to your Web page, you will gain additional traffic both through organic search engine results and from visitors sent directly by your linking partners.
It's the search engines that drive spam, because good rankings for carefully tuned phrases are the pot of gold spammers lust after. With link nullifiers in place, Google will need to look at alternative ways to rank sites, perhaps in the mould of the services Clive mentions, which offer a good measure of human evaluation of sites. I can still see potential for abusing those mechanisms, though.
Perhaps the future of the web and a good rank is unique, informative and genuinely good content ;)