On 18.06.2012 16:31, Thomas Morton wrote:
On 18 June 2012 15:16, Tobias Oelgarte <tobias.oelga...@googlemail.com> wrote:
Any tagging by non-neutral definitions would interfere with the project. It's
like creating categories named "bad images", "uninteresting topics" or
"not for ethnic minority X".

Of course; but that is predicated on a bad process design. Solution: design
an appropriate process.

So far I have not seen any indication of work on designing an appropriate process. If such design work is in progress, I would be really interested in what the current ideas look like, and whether they are more convincing than the latest proposals (e.g. the referendum), which only touched the surface and ignored many potential issues.

Editorial judgment is about how to wrap up a topic in a nice way without
making one's own judgment about the topic. A hard job to do, but that is the
goal.

If I were to write the article "pornography", then I would have to think
about what should be mentioned in the article because it is important,
and which parts are not relevant enough or should be put in separate
sections to be elaborated in further detail. This is entirely different from
saying "pornography is good or evil" or "this pornographic practice is good or
evil, and that's why it should be mentioned or excluded".

There is a difference between the relevance of a topic and the attitude
toward a topic. The whole image filter idea is based on the latter and should
not be confused with editorial judgment.

Pornography articles, as it stands, already have a community-implemented
"filter": the tradition that such articles are illustrated with
graphics, not photographs. So the example is a poor one, because we already
have a poor man's filter :)

Similarly, the decision "does this image represent hardcore porn, softcore
porn, nudity or none of the above" is an editorial one. A bad design process
would introduce POV issues - but we are plagued with them anyway. If
anything, this gives us an opportunity to design and trial a process without
those issues (or at least minimising them).


That is already a sad thing, but it does not apply to all language versions. Some only use these illustrations because they are more suitable to illustrate the term or practice, others because of the "community-implemented filter", and it might vary from article to article.

You make me interested to hear what a good design could look like.
I would have nothing against additional work if I could see the benefits.
But in this case I see some good points, and I also see a list of bad points.
At best it might be a very tiny improvement that comes along with a huge
load of additional work, while other parts could be improved with little
extra work and be a true improvement. If we had nothing better to do, then I
would say "yes, let's try it". But at the moment it is a plain "No, other
things have to come first".
Don't confuse opt-in and opt-out if a filter is implemented on an external
platform. There is no opt-in or opt-out for Wikipedia as long as Wikipedia
isn't blocked and the filter is the only access to it. <contains some
irony>We have the long story that parents want their children to visit
Wikipedia without coming across controversial content, which they
apparently do every time they search for something entirely
unrelated.</contains some irony>  In this case an opt-in (to view) filter
actually makes sense. Otherwise it doesn't.

We may be confusing opt in/out between us. The filter I would like to see
is optional to enable (and then stays enabled) and gives a robust method of
customising the level and type of filtering.


While I'm personally not against filtering on a personal level, someone will still have to deal with it (open design question).

We have such discussions. But I'm afraid that most of them do not circle
around the benefits of the image for the article, but around the latter part
that I mentioned above (editorial judgment vs. attitude judgment).

Filtering images would resolve most of these issues.


I think it would just reset the borders, but it won't take long until new lines are drawn and the discussions continue. Now it is "OMG vs. WP:NOT CENSORED"; later it will be "OMG vs. use the filter". But at the same time we will have new discussions about the filter itself (open design question).
Believe me or not: if we introduce such tagging, then the
discussions will only be about personal attitude towards an image, ignoring
the context and its educational benefits entirely.

We successfully tag images as pornographic, apparently without drama,
already. So I find this scenario unlikely.


No. We don't tag images _as_ pornographic. We tag them _as related to_ pornography. Just take a look at the category Pornography on Commons.

http://commons.wikimedia.org/wiki/Category:Pornography

This applies to terms like violence and other stuff as well.

It is a chicken-and-egg problem. One part of our community (including readers)
dislikes tagging/filtering and sees it as (or as the tool for) the creation of
road blocks that don't exist at the moment. A second part of our community
wants it to be more conservative, fearing that it might be the deciding factor
that could create road blocks. I already mentioned this above in the
"benefits vs. effort" section.


We don't have much data on what our readers want; but a not insignificant
portion of them, at least, are concerned about controversial images
(nudity, Mohammed, etc.). I fully advocate finding out what the community
thinks; but when I raised this issue before, it was snorted at with
something along the lines of "the readers aren't the driving force here".

*sigh*


I asked for the same thing and got no response either. We had the referendum, which had big flaws,[1] but not a single neutral survey directed at the readers, in light of the fact that our community is most likely biased...

[1] explained in length at http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/en



_______________________________________________
Wikimedia-l mailing list
Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l
