While this can work in some situations, in a wiki run by volunteers you
rely on people to accurately self-classify their work, which many would
not do. Or you rely on other volunteers to change the rating. Whether up
or down, it would probably lead to a big debate. That means dozens or
even hundreds of debates a day, which would be quite time-consuming. Too
many people already try to AfD photos for phony reasons. ("I don't like
that person; I don't believe you took the picture!" being one I
encountered myself.)
On 7/23/2014 9:51 PM, Kerry Raymond wrote:
I agree that offensiveness is in the eye of the beholder. And while
there may be all manner of very niche groups who find strange things
offensive (maybe some people object to seeing refrigerators or reading
about cakes), nonetheless we know that there are a lot of widespread
categories of offensiveness that generate the bulk of discussions about
the inclusion of items on Wikipedia or Commons.
What we could do is have some system of classification (like the
movies) for articles, images, and/or categories indicating that they
are potentially offensive for various reasons. Perhaps along similar
lines to the "content advisories" on IMDB, e.g.
http://www.imdb.com/title/tt0295297/parentalguide?ref_=tt_stry_pg
People could then put in their profiles that all classifications are
acceptable to them, or list the classifications they don't want to see
(e.g. Sex and Nudity, Gore and Violence, Profanity, etc. -- obviously
our classifications might not be identical to IMDB's as we are dealing
with different kinds of content, but you get the idea). When such a
person searches Wikipedia or Commons, those articles, images and
categories that they would find offensive are not returned. When a
person reads an article containing an image categorised as
offensive-to-them, it is simply not displayed, or is replaced by a
placeholder saying "Suppressed at your request (Sex and Nudity)". We
could then bundle these finer classifications into common collections,
e.g. Inappropriate for Children, Suitable for Muslims, or whatever, so
for many people it's a simple tick-one-box.
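The filtering mechanism described above could be sketched roughly as
follows. This is only an illustration under assumed names: the tag
labels, the bundle table, and the item data model are all hypothetical,
not an actual MediaWiki API.

```python
# Hypothetical sketch of preference-based content filtering.
# Tag names, bundles, and the item structure are illustrative only.

# A "tick-one-box" bundle expands to a set of finer classifications.
BUNDLES = {
    "Inappropriate for Children": ["Sex and Nudity", "Gore and Violence", "Profanity"],
}

def visible_items(items, blocked_tags):
    """Return only the items whose tags don't intersect the blocked set."""
    blocked = set(blocked_tags)
    return [item for item in items if not blocked & set(item["tags"])]

def render_image(image, blocked_tags):
    """Return the image name, or a placeholder naming the matched tags."""
    matched = sorted(set(blocked_tags) & set(image["tags"]))
    if matched:
        return "Suppressed at your request ({})".format(", ".join(matched))
    return image["name"]

# Example: a user ticks one bundle in their profile.
prefs = BUNDLES["Inappropriate for Children"]
images = [
    {"name": "cake.jpg", "tags": []},
    {"name": "statue.jpg", "tags": ["Sex and Nudity"]},
]

print([i["name"] for i in visible_items(images, prefs)])  # ['cake.jpg']
print(render_image(images[1], prefs))  # Suppressed at your request (Sex and Nudity)
```

The same intersection test would drive both search (drop the item from
results) and article rendering (swap in the placeholder image).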
For anonymous users, or users who have not explicitly set their
preferences, rendering of an article or image could first ask: "This
article/image has been tagged as potentially offensive for SuchAndSuch
reason; click OK to confirm you want to view it." For a logged-in
user, it could also offer a link to set their preferences for future
use.
I note that movies are often made with variants for different
countries. Sometimes that's simply a matter of being dubbed into
another language, but it can also include the deletion (or replacement)
of certain scenes or language that would be offensive in those
countries. So it is not as if we are reinventing the wheel here, just
customising it to Wikipedia.
Kerry
_______________________________________________
Gendergap mailing list
Gendergap@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/gendergap