https://bugzilla.wikimedia.org/show_bug.cgi?id=45347

MZMcBride <b...@mzmcbride.com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
           Keywords|                            |easy

--- Comment #4 from MZMcBride <b...@mzmcbride.com> ---
(In reply to comment #3)
> Note we have a robots.txt generation code from the MediaWiki:Robots.txt
> system message, so technically this is fairly simple to implement.

Trivial, even.

> If you really want to go on this proposal, please submit a humans.txt sample
> content, so we'll have a basis for discussion. I will then give you some
> technical notes about how to generate this content if there are fields to
> automate. Then, you'll be able to launch a discussion with the community on
> en. or meta.

I'm not sure a sample humans.txt file is needed. Project autonomy and
sovereignty can probably guide us here. We could easily implement the ability
to output a 404 error at /humans.txt unless a domain's [[MediaWiki:humans.txt]]
page exists. For example, [[MediaWiki:humans.txt]] would control
<https://en.wikipedia.org/humans.txt>. This approach would let each project
decide for itself whether to have such a file (no MediaWiki page --> no file)
and, if so, what it should contain, based on local community consensus.
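For illustration only, the dispatch logic I have in mind is roughly the
following. This is a hypothetical sketch, not MediaWiki code; the
`get_page_content` lookup and all names here are assumptions standing in
for however the wiki actually resolves a page title to wikitext.

```python
# Hypothetical sketch of the proposed /humans.txt behavior.
# `get_page_content` stands in for a wiki's page lookup (title -> wikitext,
# or None if the page does not exist); it is not an actual MediaWiki API.

def serve_humans_txt(get_page_content):
    """Return (status, content_type, body) for a /humans.txt request."""
    text = get_page_content("MediaWiki:Humans.txt")
    if text is None:
        # No local page means the file does not exist for this wiki.
        return (404, "text/plain", "File not found.")
    # The page's raw content is served verbatim as plain text.
    return (200, "text/plain", text)

# A wiki whose community created the page gets a real file:
wiki = {"MediaWiki:Humans.txt": "/* TEAM */\nEveryone who edits.\n"}
status, ctype, body = serve_humans_txt(wiki.get)
# A wiki without the page gets a 404:
status_missing, _, _ = serve_humans_txt({}.get)
```

The point of the sketch is just that existence of the on-wiki page is the
single switch: no configuration flag, no global default content.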

-- 
You are receiving this mail because:
You are on the CC list for the bug.
You are the assignee for the bug.
You are watching all bug changes.
_______________________________________________
Wikibugs-l mailing list
Wikibugs-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l