Recently published findings suggested that 85% of all sales generated on the internet come from the first page of results returned on a search request. With this in mind, it is no wonder that competition for this prime real estate in the search results is so fierce. So fierce, in fact, that several major search engine companies have made recent statements concerning the ethical use of certain techniques used to build websites. In this article we will discuss “cloaking”: its methods, its uses, what the search engines will tolerate, and the ethics involved.
Defining the term cloaking has been the subject of much debate on the internet lately, but everyone agrees on one point: cloaking is presenting one version of a website to the search engine spider while displaying another version of the same site to a human visitor. It is the intent behind the cloak that keeps the definition muddied, because there are ethical uses for cloaking.
In recent months Google and several other search engine companies have stated that website cloaking will not be tolerated: if discovered, the offending site will be removed from their indexes. In spite of this, many argue that there are legitimate uses for cloaking; unfortunately, there are also bad apples out there who use these methods simply to gain popularity with the search engine spiders.
There are more than a few methods of cloaking, and countless reasons to use them, but the most popular methods are user agent cloaking and IP cloaking. There are also methods of cloaking links in text or on a page, which will be discussed in more detail later; here are explanations of the first two.
User agent cloaking is where the website inspects the user agent string sent with each request: if the visitor is a web browser such as Internet Explorer or Firefox, the human version of the site is delivered, and if the user agent matches a known search-bot name, the search engine version is delivered instead. Because the user agent string is easily spoofed, this method is no longer widely used.
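The decision described above can be sketched in a few lines. This is a minimal illustration, not a production implementation: the bot names and page filenames are hypothetical, and a real site would serve the chosen file through its web server rather than just returning a name.

```python
# Hypothetical list of search-bot user agent substrings (lowercase).
KNOWN_BOTS = ("googlebot", "bingbot", "slurp")

def select_page(user_agent: str) -> str:
    """Return the spider-friendly page for known bots, the human page otherwise.

    The filenames are placeholders for whatever two versions the site keeps.
    """
    ua = (user_agent or "").lower()
    if any(bot in ua for bot in KNOWN_BOTS):
        return "spider_version.html"
    return "human_version.html"

print(select_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))   # spider_version.html
print(select_page("Mozilla/5.0 (Windows NT 10.0) Firefox/120.0"))  # human_version.html
```

The weakness mentioned above is visible here: nothing stops a human from sending a browser configured to claim it is “Googlebot,” which is exactly why this check alone fell out of favor.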
IP cloaking involves the website checking the IP address of the requester: if the IP belongs to a known search-bot, a search-engine-friendly version of the website is delivered. This is a common practice for websites that contain a lot of graphics or Flash content, or that are generated dynamically.
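A sketch of the IP check, using Python's standard `ipaddress` module. The network range used here is purely illustrative; a real deployment would maintain a current list of spider address ranges, which is precisely the maintenance burden that makes IP cloaking harder to spoof but harder to keep accurate.

```python
import ipaddress

# Illustrative bot address range; real lists are larger and change over time.
BOT_NETWORKS = [ipaddress.ip_network("66.249.64.0/19")]

def is_search_bot(remote_ip: str) -> bool:
    """Check whether the requesting IP falls inside a known bot network."""
    addr = ipaddress.ip_address(remote_ip)
    return any(addr in net for net in BOT_NETWORKS)

def select_page_by_ip(remote_ip: str) -> str:
    # Spiders get a plain, indexable page; humans get the full graphical site.
    return "spider_friendly.html" if is_search_bot(remote_ip) else "full_site.html"

print(select_page_by_ip("66.249.66.1"))   # inside the range above
print(select_page_by_ip("203.0.113.5"))   # an ordinary visitor address
```

Unlike a user agent string, the source IP of a TCP connection cannot be trivially faked by the visitor, which is why this method outlived user agent cloaking.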
Link cloaking is used when you need to put a dynamic link into text that cannot be updated, such as an e-book or other publication. Here is how it works: if a customer clicks a link in a .PDF publication, you do not want to risk sending them to a dead link, so the link in the document points to a redirect page on your own website, which in turn forwards to the current affiliate link. This lets you redirect the customer to the affiliate of your choice at any time, while continuing to collect the commissions generated by the link you created.
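The redirect step can be sketched as a simple lookup: the stable link printed in the PDF carries a slug, and the server maps that slug to whichever affiliate URL is current. All names and URLs here are hypothetical placeholders; in practice the server would answer with an HTTP 302 redirect to the resolved URL.

```python
# Hypothetical mapping from the stable slug printed in the PDF
# to the affiliate URL currently in use. Editing this table
# retargets every copy of the publication without reprinting it.
AFFILIATE_LINKS = {
    "ebook-offer": "https://affiliate.example.com/?ref=YOURID",
}

def resolve_redirect(slug: str, fallback: str = "https://www.example.com/") -> str:
    """Return the URL to redirect to; never send the visitor to a dead link."""
    return AFFILIATE_LINKS.get(slug, fallback)

print(resolve_redirect("ebook-offer"))   # current affiliate URL
print(resolve_redirect("retired-link"))  # falls back to the home page
```

The fallback is the point of the technique: even if an affiliate program shuts down, the printed link degrades to your own home page instead of a dead end.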
Cloaking techniques are widely used on the internet, as evidenced by the amount of software available and the number of websites dedicated to coding it. In many instances cloaking may be the only way for a website to be visited by a search engine's spider; this is often the case for dynamic sites, or for sites built heavily in Flash or Shockwave, which makes them difficult for the spider to index.
The ethical implications of this technology are just starting to impact the industry. SEO and search ranking have become a marketing discipline, with companies scrambling to claim the best possible “real estate” in the search results as quickly as possible. This has sparked many arguments both for and against cloaking.
The ethical conflict surrounding cloaking will continue as long as search engines cannot distinguish the reasoning behind cloaking a website or link. Used properly, it can be a powerful tool for promoting a website that the search engine spiders would not otherwise find; used improperly, it can get a website banned from the search engines entirely, forcing a company to rebuild its site from scratch. Care should be taken when using cloaking scripts.