As many know, Google's search results are built by crawler software that visits our sites much like a regular user, more or less. If for some reason we have pages on our site that are not linked from any other page, they will not appear in search results. To help Google crawl our site, we can point it to an XML file hosted on the site that lists all the pages we want Google to show in its search results. This file is called a Sitemap.
If a site is made of static pages, you just list them in the file and you are done. All you need to do afterwards (after submitting the Sitemap, of course) is update the file whenever you create new pages. If, on the other hand, you are running a dynamic site that pulls its pages out of a database, you have more work to do.
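For a static site, the file can be as simple as the fragment below (the URLs are placeholders, not from the original post):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to index -->
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```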
First, you need to create a permalink for every page. A permalink is a static URL with no ? and no query-string parameters; you can build it from any unique value, such as the product name or the page title.
A permalink can look like this: http://www.5min.com/Video/What-is-5min%20XH9DUGl7fO8%3d.
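As a rough sketch of turning a page title into a permalink slug (the function name and cleanup rules here are my own, not from the post), you could do something like:

```python
import re
import unicodedata

def make_slug(title):
    """Turn a page title into a URL-safe permalink slug (illustrative only)."""
    # Normalize accented characters to plain ASCII equivalents.
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode("ascii")
    # Collapse runs of non-alphanumeric characters into a single hyphen.
    text = re.sub(r"[^A-Za-z0-9]+", "-", text).strip("-")
    return text

print(make_slug("What is 5min?"))  # What-is-5min
```

In practice you would also append a unique ID (as the 5min URL above does) so two pages with the same title cannot collide.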
Then you need to generate the Sitemap from a server page. You can use the XmlDocument object model to build the correct XML format. I suggest using an HttpHandler for this task, but any ASP.NET page will do as well; all you need to do is write the XML into the HTTP response.
Now that you have the Sitemap ready, you have to submit it to Google. For this you need to register for the Google Webmaster Tools service; if you already have a Google account, that is enough. The service gives simple instructions for submitting your Sitemap, and it lets you submit several sites, see how Google reads them, and view other statistics.
You can find good information on Sitemaps at http://www.sitemaps.org/.
A ready-made C# Sitemap class can be found here.
Please comment with your opinion on this post.
Benny.