On Wed, 2011-01-19 at 10:58 -0500, Mike Rathburn wrote:
> You can speed up and control in more granular detail the Google stuff
> by signing up an account with Google Webmaster Tools.
This is interesting, from Webmaster Tools -> Diagnostics -> Crawl Stats
                                         High      Average   Low
Pages crawled per day                    19,875    1,530     4
Kilobytes downloaded per day             324,257   24,761    14
Time spent downloading a page (ms)       740       482       162
How Google is able to hit on avg 1.5k pages on the wiki is beyond me....
A whole lot of 404s and other things going on. From the looks of it,
the bot hits each page at least twice: the regular version, and then
the printable version.
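If the doubled crawling is a problem, one common fix (sketched here,
assuming a default MediaWiki install where everything runs through
/index.php and pretty URLs live under /wiki/) is a robots.txt that
blocks the query-string variants, which is where the printable
versions live:

```
# Hypothetical robots.txt for a stock MediaWiki setup --
# adjust paths to match the actual install.
User-agent: *
# Block script-generated views (printable=yes, action=edit, etc.)
Disallow: /index.php?
# Allow the canonical article URLs (assumes pretty URLs under /wiki/)
Allow: /wiki/
```

That should cut the per-page crawl count roughly in half, at the cost
of Google re-reading robots.txt before the change takes effect.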
Between how Google's bots and MediaWiki were developed, the two seem
to make each other go crazy ;)
--
William L. Thomson Jr.
Obsidian-Studios, Inc.
http://www.obsidian-studios.com
---------------------------------------------------------------------
Archive http://marc.info/?l=jaxlug-list&r=1&w=2
RSS Feed http://www.mail-archive.com/[email protected]/maillist.xml
Unsubscribe [email protected]