It is fine for robots to crawl the wiki pages, but they should not
perform actions, generate huge diffs, search/highlight pages, or
generate calendars.
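
The new Disallow lines rely on the widely supported "*" wildcard
extension to robots.txt (not part of the original 1994 convention, but
honored by major crawlers per RFC 9309). A minimal sketch of how a
wildcard-aware crawler would match these patterns — the helper names
and the rule list are illustrative, not part of this change:

```python
import re

# Illustrative helper: translate a robots.txt Disallow pattern, where
# "*" matches any run of characters, into a regex anchored at the
# start of the URL path.
def robots_pattern_matches(pattern: str, path: str) -> bool:
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.match(regex, path) is not None

# The four patterns added by this patch.
WIKI_RULES = [
    "/wiki/*?action=*",
    "/wiki/*?diffs=*",
    "/wiki/*?highlight=*",
    "/wiki/*?calparms=*",
]

def is_blocked(path: str) -> bool:
    return any(robots_pattern_matches(rule, path) for rule in WIKI_RULES)
```

Plain page views such as /wiki/FrontPage match none of the patterns
and remain crawlable; only URLs carrying the listed query parameters
(e.g. /wiki/FrontPage?action=edit) are disallowed.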
---
 htdocs/robots.txt | 4 ++++
1 file changed, 4 insertions(+)
diff --git a/htdocs/robots.txt b/htdocs/robots.txt
index 057c5899..36be4d13 100644
--- a/htdocs/robots.txt
+++ b/htdocs/robots.txt
@@ -14,4 +14,8 @@ Disallow: /bugzilla/show_bug.cgi*ctype=xml*
Disallow: /bugzilla/attachment.cgi
Disallow: /bugzilla/showdependencygraph.cgi
Disallow: /bugzilla/showdependencytree.cgi
+Disallow: /wiki/*?action=*
+Disallow: /wiki/*?diffs=*
+Disallow: /wiki/*?highlight=*
+Disallow: /wiki/*?calparms=*
Crawl-Delay: 60
--
2.43.2