daniel added a comment.
robots.txt is controlled by the WMF. Special pages are listed there because they generally contain dynamic data and should not be cached.
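For illustration, an exclusion of that kind would look something like the following. This is a hedged sketch of the pattern, not a quote from the WMF's actual robots.txt:

```
# Illustrative robots.txt fragment (not the WMF's exact rules):
# keep crawlers away from dynamic Special pages
User-agent: *
Disallow: /wiki/Special:
```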
Special:EntityData could be indexable, but it's a bit awkward. Depending on the request (particularly the query string and the Accept header), it may produce JSON or RDF, or a redirect to the regular HTML page. I think it would be fine to allow crawlers to index these, but I also see little added value in doing so.
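The dispatch described above can be sketched roughly as follows. This is a hypothetical illustration of the behavior, with assumed function and mapping names; it is not Wikibase's actual implementation, which also consults the query string and supports more formats:

```python
# Hypothetical sketch of Special:EntityData-style format selection:
# an explicit format suffix in the URL wins, then the Accept header,
# and otherwise the endpoint redirects to the regular HTML page.
FORMATS = {
    ".json": "application/json",
    ".ttl": "text/turtle",
    ".rdf": "application/rdf+xml",
}

def entity_data_response(path: str, accept: str = "") -> str:
    """Return the content type that would be served, or 'redirect'."""
    for suffix, ctype in FORMATS.items():
        if path.endswith(suffix):
            return ctype  # explicit format in the URL
    for ctype in FORMATS.values():
        if ctype in accept:
            return ctype  # format negotiated via the Accept header
    return "redirect"  # no data format requested: redirect to the HTML page

print(entity_data_response("/wiki/Special:EntityData/Q42.json"))
print(entity_data_response("/wiki/Special:EntityData/Q42", "text/turtle"))
print(entity_data_response("/wiki/Special:EntityData/Q42"))
```

This variability is what makes the page awkward for crawlers: the same URL can yield machine-readable data or a redirect depending on the request headers.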
TASK DETAIL
EMAIL PREFERENCES
To: daniel
Cc: hoo, daniel, aude, Aklapper, Lydia_Pintscher, Lahi, GoranSMilovanovic, QZanden, Wikidata-bugs, Mbch331
_______________________________________________ Wikidata-bugs mailing list Wikidata-bugs@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs