A site map (or sitemap) is a list of pages of a web site.
There are three primary kinds of site map:
Site maps used during the planning of a Web site by its designers.
Human-visible listings, typically hierarchical, of the pages on a site.
Structured listings intended for web crawlers such as search engines.
Sitemaps may be addressed to users or to software. Many sites have user-visible sitemaps which present a systematic view, typically hierarchical, of the site. These are intended to help visitors find specific pages, and can also be used by crawlers. Alphabetically organized site maps, sometimes called site indexes, are a different approach.
They also act as a navigation aid by providing an overview of a site’s content at a single glance.
Google introduced the Sitemaps protocol so web developers can publish lists of links from across their sites. The basic premise is that some sites have a large number of dynamic pages that are only available through the use of forms and user entries. The Sitemap files contain URLs to these pages so that web crawlers can find them. Bing, Google, Yahoo and Ask now jointly support the Sitemaps protocol.
Since the major search engines use the same protocol, a single Sitemap keeps all of them supplied with up-to-date page information. Sitemaps do not guarantee that all links will be crawled, and being crawled does not guarantee indexing. Google Webmaster Tools allows a website owner to submit a sitemap that Google will crawl, or the owner can accomplish the same thing by referencing the sitemap in the robots.txt file.
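As a sketch of the robots.txt approach (the example.com host and sitemap filename are placeholders), a single Sitemap directive in robots.txt tells any crawler where to find the sitemap:

```
# robots.txt served at https://www.example.com/robots.txt
User-agent: *
Disallow:

# Point crawlers at the sitemap (the URL must be absolute)
Sitemap: https://www.example.com/sitemap.xml
```

Because robots.txt is fetched by every compliant crawler, this one line makes the sitemap discoverable without submitting it to each search engine separately.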
XML Sitemaps have replaced the older method of “submitting to search engines” by filling out a form on the search engine’s submission page. Now web developers submit a Sitemap directly, or wait for search engines to find it. Regularly submitting an updated sitemap when new pages are published may allow search engines to find and index those pages more quickly than they would by discovering the pages on their own.
Benefits of XML sitemaps for search-optimizing Flash sites
Sitemaps are a useful tool for making sites built in Flash and other non-HTML technologies searchable. If a website’s navigation is built with Flash, an automated search program would probably only find the initial homepage; subsequent pages are unlikely to be found without an XML sitemap.
The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. A Sitemap is an XML file that lists the URLs for a site. It allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is in relation to other URLs on the site. This allows search engines to crawl the site more efficiently and to find URLs that may be isolated from the rest of the site’s content. The Sitemaps protocol is a URL inclusion protocol and complements robots.txt, a URL exclusion protocol.
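A minimal Sitemap file showing these per-URL fields might look like the following sketch (the example.com URLs and dates are placeholders; of the child elements, only loc is required — lastmod, changefreq, and priority are optional):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's full URL (required) -->
    <loc>https://www.example.com/</loc>
    <!-- When the page was last updated (optional, W3C date format) -->
    <lastmod>2010-01-01</lastmod>
    <!-- How often the page is expected to change (optional hint) -->
    <changefreq>weekly</changefreq>
    <!-- Importance relative to other URLs on this site, 0.0–1.0 (optional) -->
    <priority>0.8</priority>
  </url>
</urlset>
```

The file is typically placed at the site root (e.g. /sitemap.xml) so that the URLs it lists fall within its scope.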
Sitemaps are particularly beneficial on websites where:
Some areas of the website are not available through the browsable interface.
Webmasters use rich Ajax, Silverlight, or Flash content that is not normally processed by search engines.
The site is very large, with a huge number of pages that are isolated or not well linked together, or has few external links, so web crawlers may overlook some of the new or recently updated content.
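For very large sites, the protocol allows URLs to be split across multiple Sitemap files (each file is limited to 50,000 URLs), tied together by a Sitemap index file. A sketch, again with placeholder example.com URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <!-- Location of one child Sitemap file (required) -->
    <loc>https://www.example.com/sitemap-products.xml</loc>
    <!-- When that child Sitemap was last modified (optional) -->
    <lastmod>2010-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-articles.xml</loc>
  </sitemap>
</sitemapindex>
```

Crawlers fetch the index once and then retrieve each child Sitemap it lists, so a large site only has to publish the index URL.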
XML Sitemaps for Search Engine Consumption