A web robot is a program that automatically retrieves web pages by following the links on pages it has already fetched. Search engines use web robots to discover pages for inclusion in their search databases.
Other web robots, however, will try to harvest any email addresses on your web pages to include in their email database, which they then use or sell to email spammers.
Through a special file on your web host, called robots.txt, you can tell a web robot which pages you would like it to ignore. Most email harvesting robots will, of course, simply ignore the instructions in your robots.txt file.
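As a minimal sketch of what such a file might look like (the paths shown are hypothetical examples, not part of any particular site), a robots.txt placed at the root of your site could read:

```text
# Apply these rules to all robots
User-agent: *
# Ask robots not to crawl these example directories
Disallow: /private/
Disallow: /cgi-bin/
```

The file must live at the top level of your site (for example, www.example.com/robots.txt), and as noted above, compliance is entirely voluntary on the robot's part.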