An XML sitemap is a file that lists the web page URLs associated with a website, along with metadata that helps search engines intelligently index the content gathered by their search bots.
Search engine bots are software programs (crawlers) that travel the World Wide Web looking for new and refreshed web content to scan. They find and access web page content by following hyperlinks. An XML sitemap is a convenient way for search bots to discover all the available pages within a particular site, including pages that few or no other pages link to. For each URL, the sitemap protocol lets a webmaster provide optional metadata: when the content was last modified, how often it is generally updated, and how important the URL is relative to the other pages in the site. Sitemap extensions can carry additional details, such as a descriptive title or caption for an image or video on the page. (Note that keeping secure or private pages out of the index is handled separately, through robots directives, rather than through the sitemap itself.)
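The structure described above can be illustrated with a minimal sitemap file following the sitemaps.org protocol; the domain and dates here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's full URL (required) -->
    <loc>https://www.example.com/</loc>
    <!-- When the content was last modified (optional) -->
    <lastmod>2024-01-15</lastmod>
    <!-- How often the content generally changes (optional) -->
    <changefreq>weekly</changefreq>
    <!-- Importance relative to other pages on this site, 0.0 to 1.0 (optional) -->
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/contact</loc>
    <lastmod>2023-11-02</lastmod>
    <changefreq>yearly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Only the `<loc>` element is required for each URL; the other elements are hints that search engines may take into account when scheduling crawls.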
Search engines generally provide a submission tool that allows a webmaster to submit the XML sitemap directly to the search engine. Submitting does not guarantee that the listed web pages will be indexed, and declining to submit a sitemap does not mean the pages will be excluded from the index. It is the relevance, quality, and “popularity” of the pages (as measured in part by the quality and number of inbound links pointing to them) that determine whether they are indexed and where they rank in the search engine results pages (SERPs).
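Besides direct submission, a sitemap can also be made discoverable to any crawler by declaring its location in the site's robots.txt file. A minimal example, assuming the sitemap lives at the site root:

```
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` line is independent of the `User-agent` rules, so any bot that reads robots.txt can find the sitemap without a separate submission step.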
A well-formed XML sitemap, combined with search engine optimized website pages, is the surest way to ensure that your web pages can be delivered to and discovered by your target audience in the search engine results pages (SERPs).