https://bugzilla.wikimedia.org/show_bug.cgi?id=27173

wikipe...@deeprocks.de changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|RESOLVED                    |REOPENED
                 CC|                            |wikipe...@deeprocks.de
         Resolution|WONTFIX                     |

--- Comment #4 from wikipe...@deeprocks.de 2011-02-06 16:51:26 UTC ---
I hereby support the bug Matthias has described above, and therefore the
removal of the meta robots tag for logged-in users. The 50 bytes more or less
do not matter to me at all and are not the point; the point is that the meta
robots tag introduces real risks. These risks are illustrated by the current
debate about Google.de not listing German Wikipedia articles under some
circumstances.

Basically, there are two possible scenarios, and I want to describe them both
below. When I say "Google Bot" I mean any search engine crawler; I use the
Google crawler only because it is currently topical.

1st: Google Bot crawls pages as an anonymous user (sending no session
cookies). This is the standard scenario we assume right now. We do not know of
any search engine bots that crawl while logged in. Therefore, sending robots
information to logged-in users is pointless: they are generally not crawlers
and never read robots directives. 1st scenario means: the meta information is
superfluous.
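
To illustrate what the proposed change would look like, here is a minimal
Python sketch of a generic page-rendering helper. This is NOT MediaWiki's
actual skin code; "Request" and its "is_logged_in" attribute are hypothetical
stand-ins for however the application detects an authenticated session:

    class Request:
        def __init__(self, is_logged_in: bool):
            self.is_logged_in = is_logged_in

    def robots_meta_tag(request: Request) -> str:
        # Crawlers are assumed never to log in (1st scenario), so a
        # logged-in user would never read the tag; drop it entirely.
        if request.is_logged_in:
            return ""
        # Anonymous requests may come from crawlers, so keep the
        # directive (the exact content value depends on the page).
        return '<meta name="robots" content="noindex,nofollow"/>'

    print(robots_meta_tag(Request(is_logged_in=False)))  # tag emitted
    print(robots_meta_tag(Request(is_logged_in=True)))   # nothing emitted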

2nd: Google Bot crawls pages as a logged-in user. In this case, robots
information is relevant for logged-in users as well, and it could then be the
reason (or one reason among others) for the Google <-> Wikipedia problem
existing right now. If the 2nd scenario might apply, the robots information
should be removed temporarily, just to rule out that it is responsible for the
problems. 2nd scenario means: it is likely that the robots information and the
Google problem are related to each other. To fix the problem as fast as
possible, the robots information should be disabled (at least for a while).
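
Whether the 2nd scenario actually occurs could be checked empirically before
touching any code: scan the web server's access logs for crawler user agents
that also send a session cookie. A rough Python sketch follows; the log path,
cookie pattern, and user-agent strings are my assumptions, not Wikimedia's
actual configuration:

    import re

    # Hypothetical values: the real crawler user-agent strings and the
    # real session cookie name must match the site's logging setup.
    CRAWLER_UAS = ("Googlebot", "bingbot", "YandexBot")
    SESSION_COOKIE = re.compile(r"[Ss]ession=\w+")

    def logged_in_crawler_hits(log_lines):
        # Yield log lines where a known crawler also sent a session
        # cookie, i.e. evidence of a crawler browsing while logged in.
        for line in log_lines:
            if any(ua in line for ua in CRAWLER_UAS) \
                    and SESSION_COOKIE.search(line):
                yield line

    with open("access.log") as f:  # assumed log file location
        for hit in logged_in_crawler_hits(f):
            print(hit.rstrip())

If such a scan finds no matches over a reasonable period, the 1st scenario
holds; if it finds some, the robots information becomes a plausible suspect
for the indexing problem.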

As you can see, both possible cases lead me to urge the removal of the robots
information, at least for a couple of weeks. As soon as Google lists all the
Wikipedia articles again and both the MediaWiki techs and the Google techs
have found the cause of the problem, they can deliberate whether these meta
tags are reasonable and should be added back.

However, according to statements from Wikimedia, neither Google Bots nor any
other search engine crawlers log in. If this is true, there is no need for
these meta tags, as they are NEVER read by crawlers and are therefore nothing
more than source code waste.

-- 
Configure bugmail: https://bugzilla.wikimedia.org/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the assignee for the bug.
You are on the CC list for the bug.

_______________________________________________
Wikibugs-l mailing list
Wikibugs-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l
