To comment on the following update, log in, then open the issue:
http://www.openoffice.org/issues/show_bug.cgi?id=58310

Issue #:|58310
Summary:|Overly restrictive robots.txt
Component:|www
Version:|1.0.1
Platform:|All
URL:|
OS/Version:|All
Status:|NEW
Status whiteboard:|
Keywords:|
Resolution:|
Issue type:|DEFECT
Priority:|P3
Subcomponent:|openoffice.org website general issues
Assigned to:|st
Reported by:|mtg
------- Additional comments from [EMAIL PROTECTED] Tue Nov 22 07:13:44 -0800 2005 -------

Hi there,

I was wondering if it would be possible to have less restrictive robots.txt files, as the OpenOffice.org pages are currently not indexed by Google, forcing users to rely on the far inferior built-in search capability. I understand that the reason for the overly restrictive robots.txt files is that CollabNet's SourceCast product is very slow (issues 14352 and 24139), and that the additional load from web crawlers might degrade the user experience even further. However, the system is already almost unusable in terms of page retrieval speed, and a Google index would let people use Google's cached copies as well as actually find what they need.

Thanks,
Martin

---------------------------------------------------------------------
Please do not reply to this automatically generated notification from Issue Tracker. Please log onto the website and enter your comments.
http://qa.openoffice.org/issue_handling/project_issues.html#notification
---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
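
For reference, a less restrictive robots.txt along the lines suggested above might look roughly like the sketch below. This is only an illustration: the paths are hypothetical placeholders rather than the actual SourceCast URL layout, and Crawl-delay is a non-standard directive that some crawlers honor and others (including Googlebot) ignore.

    # Sketch of a less restrictive robots.txt -- paths are illustrative only
    User-agent: *
    Crawl-delay: 10                  # rate-limit well-behaved crawlers (non-standard)
    Disallow: /servlets/             # hypothetical path for heavy dynamic SourceCast pages
    Disallow: /issues/buglist.cgi    # hypothetical path for expensive issue-query pages
    # static project pages and documentation remain open to crawlers

The idea is to block only the expensive dynamic pages while leaving static content crawlable, so Google can index and cache the documentation without adding significant load to SourceCast.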