It's important to point out up front that Google makes it very clear that providing this file doesn't get you a higher page rank (compared to other sites). What it does do, however, is help Google do a better job of indexing your site.
So for instance:
- You can use the sitemap.xml file to give Google a hint of how often a page is likely to change, so it knows roughly how often to re-index it
- You can also assign a priority to pages, so Google will take into account which pages YOU think are the most important
- You can also turn off indexing for a page; this applies not just to Google but to all search engines
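Under the hood, the generated sitemap.xml is just an XML file in the standard sitemap protocol format. As a rough sketch (the URLs and values here are hypothetical, and the plugin's exact output may differ), a file with change frequencies and priorities looks something like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- The important page: changes daily, highest priority -->
  <url>
    <loc>http://www.example.com/index.html</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- A less important page: changes weekly, lower priority -->
  <url>
    <loc>http://www.example.com/archive.html</loc>
    <changefreq>weekly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Priority is a relative value from 0.0 to 1.0 and only ranks your own pages against each other; it doesn't boost you against other sites.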
To use this feature go to the 'Search Engines' tab, where you will see the ability to create a 'sitemap.xml' file as well as assign a priority and change frequency (how often the page changes).
First let's tell Google that we consider our "Jelly Fish Most Important" page the most important page on this site and that the "Jelly Fish" page is, well, less important. Also, since we expect the more important page to update more frequently, let's set their change frequencies to daily and weekly respectively.
Now let's say there is a page you REALLY don't want any search engine to index (for whatever reason). No problem: click 'disable search' and the page is excluded from the sitemap, and a meta tag is also added to it so search engines won't index it (in the example below I've disabled searching on my Home Page).
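For reference, the tag added to an excluded page is the standard robots exclusion meta tag; something along these lines ends up in the page's head (the exact markup the plugin emits may vary slightly):

```html
<head>
  <!-- Tells all search engine crawlers not to index this page -->
  <meta name="robots" content="noindex" />
</head>
```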
Finally, click the 'create sitemap.xml' button and the file will be created.
Where does my sitemap.xml file go?
One annoying aspect of how RW currently works with the sitemap is that the only place it can put files is the 'files' directory of your page (its default name is 'files', but in some situations it may have a different name like 'sitemap_files' or 'index_files' or the like).
This means that if your sitemap page is below the root level, say in a folder called 'sitemap', the sitemap.xml file ends up below the root level too.
The problem is that Google really wants that sitemap.xml file to be at the root level (e.g. /sitemap.xml). This is because Google assumes that if you can put a file at the root level you have a legitimate claim to the site. (OK, if you read the Google docs you know I'm exaggerating at this point: the root level isn't strictly required, but Google will only trust the sitemap for URLs at or below the directory where sitemap.xml lives, so for all practical purposes it needs to be at the root level.)
You have three options at this point.
If your web host allows shell access you can log in and create a symbolic link (using the ln command) pointing from the root of your website to where the sitemap.xml file is. This works great and is what I do at loghound.com, but it won't work for everyone as many hosts don't give shell access.
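As a sketch, assuming your web root is ~/public_html and RW published the file into a 'sitemap/files' folder (both paths are hypothetical; adjust them for your host's layout), the link is a single command:

```shell
# Log in over SSH, then go to your web root
# (paths are hypothetical -- adjust to your host's layout)
cd ~/public_html

# Create a symbolic link at the root that points at the real file
ln -s sitemap/files/sitemap.xml sitemap.xml
```

After that, http://yoursite.com/sitemap.xml serves the same file RW publishes, and you never have to touch it again.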
You can copy it over manually. Kind of a pain, but if you only update your site every so often it's not too bad.
You can create an AppleScript or Automator workflow to copy it. I've done this for one customer using Automator and Transmit; with Transmit it's dead easy to 'download a file' and then 'upload a file' to a new spot, so after you publish a RW home page you just double-click the workflow and it's handled for you.
I've put together a small screencast that shows how the three different approaches work (15MB, 10 minutes). You can view the screencast here.