Feature - Automatic Sitemaps for Qrimp Apps

Posted: 6/16/2008 2:24:00 AM
Keep your sites plugged into the web with automatic sitemap generation for all your public content.
Qrimp now automatically builds XML sitemaps for your sites. To support this, we have added a few new items to your apps. Qrimp also serves a robots.txt file automatically to let crawlers know where your sitemap lives. If you would like to override the default robots.txt, add a url map pointing to an attachment that contains your desired robots.txt settings.
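As a rough sketch, the generated robots.txt would look something like the following; the hostname and sitemap filename shown here are hypothetical placeholders, since the actual values depend on your app's domain and configuration:

```
User-agent: *
Allow: /

Sitemap: http://yourapp.qrimp.com/sitemapindex.xml
```

The Sitemap line is what tells crawlers where to find your sitemap index without you having to submit it manually to each search engine.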

What have we added?

First, we created a new query that returns the list of tables in your site that are visible to anonymous users. You can modify this query if you would like to share more of your site with the search bots. The query is called "SiteMapIndex" and you can see it by visiting Develop > Query Designer, then selecting SiteMapIndex in the query list.

Second, we added two new views, one for the SiteMapIndex and one for the SiteMap. Qrimp builds one sitemap index that references an individual sitemap for each table in your system. This makes the sitemap system more scalable and ensures that more of the records in your tables are available for indexing. The sitemaps protocol allows at most 50,000 urls per sitemap, so that's the most we include.
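For reference, a sitemap index built this way follows the standard sitemaps.org format, with one entry per table. The hostname and filenames below are hypothetical placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://yourapp.qrimp.com/sitemap-Products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://yourapp.qrimp.com/sitemap-Articles.xml</loc>
  </sitemap>
</sitemapindex>
```

Each `<sitemap>` entry points to a per-table sitemap, so no single file ever has to exceed the 50,000-url limit.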

You can modify either of these views. For example, you may want to modify the SiteMap view so that detail items link directly to a particular view. We link them to view 11 by default, but you may have another standard view for your items. Like other parts of Qrimp, this system is customizable, so feel free to tailor it to your specific needs.
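An individual sitemap is a standard urlset, with one entry per record. The URL pattern below is only a hypothetical illustration of linking records to a specific view (such as view 11); substitute whatever pattern your app actually uses for detail pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://yourapp.qrimp.com/db.aspx?t=Products&amp;id=42&amp;vid=11</loc>
  </url>
</urlset>
```

Note that ampersands in the url must be escaped as `&amp;` for the sitemap to be valid XML.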

For more information, read about Sitemaps and robots.txt.

Google Webmaster Tools

You may want to use Google Webmaster Tools to verify that your sitemaps are working as planned. To do this, add a site to your dashboard and then verify it. To verify your site, choose the option to upload an HTML file. Google will give you the name of the file to upload. Copy that name and create a Clean Url for it, mapped to the dirty url "portal.aspx" (without the quotes). It doesn't matter which page you map it to, as long as Google doesn't get a 404 error when it retrieves the page.
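For example, if Google asks you to upload a file named like the one below (the exact name it assigns will differ), the url map would look like this:

```
Clean Url: google1234567890abcdef.html
Dirty Url: portal.aspx
```

With this mapping in place, a request for the verification filename returns the portal page with a 200 status, which is all Google needs to confirm you control the site.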

Once you have verified your site, you can use all the standard Google Webmaster Tools to control access to your site, set crawl rates, and more.


Automatically generated sitemaps improve visibility for your sites by telling search engines and other web crawlers how to crawl your pages. Typically, a crawler entering your site at the default url lands on the portal, and JavaScript navigation and urls containing & and ? can confuse some crawlers. The Sitemaps feature alleviates many of these issues, so your site's pages should appear in the search engines more readily.