Cloaking


Cloaking is a technique whereby a website presents one version of a web page to users and a different version to search engine crawlers.

How do you detect cloaking?

Cloaking is not always easy to detect, because those who practise it often try to make the version served to spiders difficult to examine. One way to do this is to add a 'noarchive' directive to the page's meta robots tag. Spiders obey the directive and do not archive the page, so the 'cached' link disappears from the page's search listing and the version the spider crawled becomes hard to view.

With user-agent-based cloaking, this can be worked around with a tool such as the User-Agent Switcher extension for Firefox, which lets a browser present a spider's user-agent string to the server. Skilled cloakers avoid detection by this method: instead of checking the user agent, they serve the spider-only content based on known search engine IP addresses (IP delivery). Anyone requesting the page from a different IP address cannot see the cloaked version, especially if it has also been excluded from the search engine's cache.

Even in these cases, the cloaked page can sometimes be viewed through Google Translate. Cloakers generally do not distinguish between a spider that has come to crawl the content and one that has come to translate it, and the Google Translate spider uses the same range of Google IP addresses, so cloakers doing IP delivery usually serve the Googlebot-only version to the translation tool as well.
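The user-agent comparison described above can be sketched in a few lines of Python. The user-agent strings and function names here are illustrative, a crude byte comparison will produce false positives on pages with dynamic content, and this approach only catches user-agent-based cloaking, not IP delivery:

```python
import hashlib
import urllib.request

# Illustrative user-agent strings: one ordinary browser, one crawler.
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_as(url, user_agent):
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def looks_cloaked(browser_body, spider_body):
    """Crude check: the same URL returning different content to a
    browser and to a spider user agent may indicate cloaking."""
    return hashlib.sha256(browser_body).digest() != hashlib.sha256(spider_body).digest()
```

Usage would be `looks_cloaked(fetch_as(url, BROWSER_UA), fetch_as(url, GOOGLEBOT_UA))`; a real check would compare only the stable parts of the markup rather than raw bytes.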

What is IP cloaking?

IP cloaking, simply put, is hiding one's IP address. Because everything a person does online is linked to their IP address, the main motivation for IP cloaking is to avoid being tracked and monitored. It is achieved by accessing the internet through an intermediary machine called a proxy server: the proxy acts as the gateway to the internet, while the user's own IP address remains hidden.
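As a sketch of the mechanism, Python's standard library can route requests through a proxy so that the destination server sees the proxy's address rather than the client's. The proxy address below is a placeholder from the documentation IP range, not a real service:

```python
import urllib.request

# Hypothetical proxy (203.0.113.0/24 is reserved for documentation);
# replace with a real host:port to actually route traffic.
PROXY = "203.0.113.10:8080"

# Every request made through this opener goes via the proxy, so the
# destination server sees the proxy's IP address, not the client's.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
)
# opener.open("http://example.com/")  # would fetch via the proxy
```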

Is cloaking always against search engine guidelines?

Not all cases of cloaking are against the guidelines. In December 2005, the major search engines went on record to distinguish acceptable cloaking from unacceptable cloaking. They confirmed that it is acceptable to replace search-engine-unfriendly links (for instance, ones containing session IDs and superfluous parameters) with search-engine-friendly links for the sake of spiders.
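The kind of substitution the engines deemed acceptable can be illustrated with a small URL-normalising helper that strips session IDs and other superfluous parameters before a link is shown to a spider. The parameter list below is an assumption for illustration, not an official list:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters treated as superfluous for crawlers (illustrative set).
STRIP_PARAMS = {"sessionid", "sid", "phpsessid", "utm_source"}

def crawler_friendly(url):
    """Return the URL with session IDs and similar parameters removed,
    leaving a cleaner link for spiders to crawl."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in STRIP_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))
```

For example, `crawler_friendly("http://example.com/page?sid=abc123&cat=2")` yields `http://example.com/page?cat=2`: the session ID is gone, while the parameter that actually selects content is kept.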

See also: black hat