On Fri, 2008-05-23 at 22:04 +0200, BJörn Lindqvist wrote:
> Many years ago, someone enabled Google to index Bugzilla by modifying
> the robots file that keeps it out. It didn't work so well because the
> Google spider thrashed the site too hard. But now Bugzilla is hosted on
> better hardware with more bandwidth, so it should be possible to try
> again. Plus, there are lots of optimizations you can do to reduce how
> many resources the spider uses.
>
> Bugzilla contains a huge amount of information. Being able to google
> that would be awesome.
I'd agree to do this if Bugzilla were rock-stable and always fast in daily use, but these days it is instead quite slow and often times out the connection. So maybe adding extra load isn't the right thing to do.

Cheers!

--
Cosimo
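For reference, the robots-file change and spider-load reduction Björn describes might look roughly like the sketch below. This is only an illustration: the rules, the crawl delay, and the choice of which CGIs to expose are assumptions, not the configuration that was actually used; show_bug.cgi, buglist.cgi, and report.cgi are the standard Bugzilla CGI endpoints.

    # Hypothetical robots.txt sketch for a Bugzilla install: let crawlers
    # fetch individual bug pages while keeping them off the expensive
    # search/report CGIs, and ask well-behaved spiders to slow down.
    User-agent: *
    # "Allow" is a non-standard extension, but Googlebot honours it.
    Allow: /show_bug.cgi
    Disallow: /buglist.cgi
    Disallow: /report.cgi
    Disallow: /
    # Crawl-delay is respected by some crawlers (e.g. Yahoo/Bing) but not
    # by Googlebot, whose crawl rate has to be throttled elsewhere.
    Crawl-delay: 30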
