I've implemented sitemaps for my site (www.trenchmice.com), and I've run 
into a problem because of my site's size.

TrenchMice has 275K pages, generated from 275K+ database objects 
(these are "topics" and "scoops"). The sitemap classes return 
information on every object, which means they try to return 
information on 275K+ objects at once! As a result, the sitemap.xml 
lookup never finishes.  (I gave up after waiting an hour...)

The sitemap classes dutifully mark infrequently updated objects with a 
low priority and change frequency.  But because each class looks up 
275K+ objects and returns _all_ the items in its set, the request 
never finishes.
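
For reference, the classes look roughly like this (the module path, 
field names, and the is_stale() helper below are placeholders; my real 
code differs only in detail):

    from django.contrib.sitemaps import Sitemap
    from trenchmice.models import Topic  # the Scoop sitemap is analogous

    class TopicSitemap(Sitemap):
        def items(self):
            # Every topic in the database -- all 275K+ of them.
            return Topic.objects.all()

        def lastmod(self, obj):
            return obj.updated_at

        def changefreq(self, obj):
            # Rarely-updated topics are marked accordingly...
            return "yearly" if obj.is_stale() else "daily"

        def priority(self, obj):
            # ...and given a low priority.
            return 0.2 if obj.is_stale() else 0.8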

Unless I'm missing something obvious (and I might be), a straightforward 
implementation of the sitemaps protocol won't work for large sites.

So, what do large sites do?  Do they return only the most recent N 
objects of every class?  If so, then how do the search engines find out 
about the infrequently updated objects?
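
For concreteness, the "most recent N" idea would amount to something 
like this (N = 1000 and the ordering field are arbitrary):

    class RecentTopicSitemap(TopicSitemap):
        def items(self):
            # Only the 1,000 most recently updated topics; everything
            # older silently disappears from the sitemap.
            return Topic.objects.order_by("-updated_at")[:1000]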

John

