A search engine ‘robot,’ known by many names including the frequently used ‘bot’ and ‘spider,’ is a software program that travels the World Wide Web by following links (clickable or “hot” content that leads to web pages). The mission of the search bot is to access and scan content so that the search engine can index it and later deliver effective results pages: lists of links leading to web pages with content relevant to a search made by an end-user (a real person).
The ‘robot’ is the harvester of content, and there is actually more than one; search engines run many of them at once. Sometimes these persistent little programs are denied access to certain web content. Many websites have secure areas that are configured to keep the robots out. And sometimes websites are unfortunately so badly architected that the robot has no chance of accessing the web content and instead gets caught in a sort of “web” or eddy, going around and around in circles but arriving nowhere.
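The way a well-built crawler avoids that eddy can be sketched in a few lines. This is a minimal illustration, not any real search engine's crawler: the PAGES dictionary is a hypothetical stand-in for a small website (a real bot would fetch pages over HTTP and extract links from the HTML), and the circular “trap” pages mimic the badly architected site described above.

```python
# Hypothetical page graph standing in for a small website.
# The two "trap" pages link to each other in a circle; without a
# record of visited pages, a crawler would loop here forever.
PAGES = {
    "/": ["/about", "/trap-a"],
    "/about": ["/"],
    "/trap-a": ["/trap-b"],
    "/trap-b": ["/trap-a"],
}

def crawl(start):
    """Breadth-first crawl that visits each page exactly once."""
    visited = []          # pages whose content has been "harvested"
    queue = [start]       # pages discovered but not yet scanned
    seen = {start}
    while queue:
        page = queue.pop(0)
        visited.append(page)          # a real bot would index the content here
        for link in PAGES.get(page, []):
            if link not in seen:      # the seen set is what breaks the eddy
                seen.add(link)
                queue.append(link)
    return visited

print(crawl("/"))  # every page reached once, despite the circular links
```

Even this toy version shows the principle: the bot remembers where it has been, so a loop of links wastes at most one visit per page instead of trapping it indefinitely.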
Search engine optimization is a process that clears the path for the search engine bot so that it can conduct its web content harvesting fruitfully, returning to the search engine a horn of plenty that is easily indexed for future retrieval.