Crawl rate refers to how frequently a search engine 'bot' visits, or crawls, your web pages looking for new or refreshed content.
In line with the search engine’s objective to provide the most relevant content, its ‘bots’ are programmed to crawl the web looking for new and updated content according to the rules of its proprietary ‘algorithm.’
Google's Webmaster Tools provides information about Google's bots only, so the only other way to determine crawl rate is to examine your server logs. Having your own dedicated server makes this task easier; however, if you are on a shared server, be prepared to dedicate some time to it.
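If you do go the log-file route, a short script can give you a rough picture of crawl frequency. The sketch below is only an illustration: it assumes a standard Apache/Nginx combined log format and a hypothetical access.log path, and it simply counts log lines whose user-agent string mentions a known crawler, grouped by day.

```python
import re
from collections import Counter

# Minimal sketch: count daily crawler visits in a web server access log.
# Assumes the common/combined log format used by Apache and Nginx, e.g.:
# 1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1 ..."

LOG_PATH = "access.log"  # hypothetical path; adjust for your server setup
BOTS = ("Googlebot", "Bingbot", "Slurp")  # user-agent substrings to look for

# Captures the day portion of the bracketed timestamp, e.g. "10/Oct/2023"
date_pattern = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

visits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if any(bot in line for bot in BOTS):
            match = date_pattern.search(line)
            if match:
                visits[match.group(1)] += 1

for day, count in sorted(visits.items()):
    print(f"{day}: {count} crawler visits")
```

On a shared server you may only have access to rotated or truncated logs, so treat the counts as an estimate of how often bots are visiting rather than an exact crawl rate.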
Knowing the exact crawl rate isn't critical unless you have a very large website or one where information is time-sensitive. In that scenario you will want to know how often bots visit so you can make sure your schedule for adding new information is optimal.
For most others, it is best to adopt the practice of updating your website information at least once every two to three months. Complacency toward your website content may hurt your efforts to rank on page one of the search engine results pages. More importantly, it is in your best interests to continue building out your site. The more pages of relevant content you offer, the greater your chances of reaching and attracting your diverse target audience groups at the moment they are searching for what you offer. If more roads lead to Rome, there are more ways to get there!