I presume that uploaders only upload images they are personally comfortable
with, so it is almost axiomatic that it would be others who would add such
classifications, just as occurs with movies. I have no idea how IMDB make it
work, but they do, and they are using volunteers too. I note that IMDB use a
1-to-10 scale for the classifications. Maybe they just let people vote and
the result is the average.
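If it really is just an average of volunteer votes, the mechanics would be
trivial. A rough sketch in Python (the names here are mine, not anything
IMDB or MediaWiki actually provides):

from statistics import mean

def classification_score(votes):
    """votes: integers 1-10 cast by volunteers for one classification
    (e.g. 'Sex and Nudity') on one image; returns the displayed average."""
    valid = [v for v in votes if 1 <= v <= 10]  # ignore out-of-range votes
    return round(mean(valid), 1) if valid else None

# classification_score([7, 9, 6, 8]) -> 7.5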

 

But, whether or not my proposal can work, I think we have to use this list
to put forward ideas with a view to rolling out some kind of
trial/pilot/experiment. The gender gap is of long standing and is unlikely
to spontaneously disappear by just talking about it. 

 

Kerry

 

  _____  

From: Carol Moore dc [mailto:carolmoor...@verizon.net] 
Sent: Friday, 25 July 2014 6:34 AM
To: kerry.raym...@gmail.com; Addressing gender equity and exploring ways to
increase the participation of women within Wikimedia projects.
Subject: Re: [Gendergap] Sexualized environment on Commons

 

While this can work in some situations, in a Wiki run by volunteers you rely
on people to accurately self-classify their work, which many would not. Or
you rely on other volunteers changing the rating. Whether up or down, it
will probably lead to a big debate. That means dozens or even hundreds of
debates a day, which would be quite time consuming. Too many people already
try to AfD photos for phony reasons. ("I don't like that person; I don't
believe you took the picture!" being one I encountered myself.)

On 7/23/2014 9:51 PM, Kerry Raymond wrote:

I agree that offensiveness is in the eye of the beholder. And while there
may be all manner of very niche groups who find strange things offensive
(maybe some people object to seeing refrigerators or reading about cakes),
nonetheless we know that there are a lot of widespread categories of
offensiveness that generate the bulk of discussions about the inclusion of
items on Wikipedia or Commons.

 

What we could do is to have some system of classification (like the movies
have) for articles, images, and/or categories, indicating that they are
potentially offensive for various reasons. Perhaps along similar lines to
the "content advisories" in IMDB, e.g.

 

http://www.imdb.com/title/tt0295297/parentalguide?ref_=tt_stry_pg

 

People could then put in their profiles that all classifications are
acceptable to them, or that these are the classifications they don't want to
see (e.g. Sex and Nudity, Gore and Violence, Profanity, etc - obviously our
classifications might not be identical to IMDB's as we are dealing with
different kinds of content, but you get the idea). When that person searches
Wikipedia or Commons, those articles, images and categories that they would
find offensive are not returned. When a person reads an article containing
an image categorised as offensive-to-them, it is simply not displayed, or is
replaced with some image saying "Suppressed at your request (Sex and
Nudity)". We could possibly bundle these finer classifications into common
collections, e.g. Inappropriate for Children, Suitable for Muslims, or
whatever, so for many people it's a simple tick-one-box.
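Just to make the mechanics concrete, here is a very rough Python sketch of
that filtering idea. All the names (the tags, the bundles, the placeholder
text) are illustrative assumptions on my part, not an existing MediaWiki
feature:

# Illustrative sketch only: each article/image carries a set of
# classification tags, and each user profile lists what to hide.

BUNDLES = {
    # hypothetical one-tick collections of finer classifications
    "Inappropriate for Children": {"Sex and Nudity", "Gore and Violence", "Profanity"},
}

def blocked_tags(prefs):
    """Expand any ticked bundles into the full set of blocked classifications."""
    blocked = set(prefs.get("blocked", set()))
    for bundle in prefs.get("bundles", []):
        blocked |= BUNDLES.get(bundle, set())
    return blocked

def filter_search_results(results, prefs):
    """Drop items whose tags intersect the user's blocked classifications."""
    blocked = blocked_tags(prefs)
    return [item for item in results if not (item["tags"] & blocked)]

def render_image(item, prefs):
    """Return the image, or a placeholder naming the suppressed classifications."""
    hits = item["tags"] & blocked_tags(prefs)
    if hits:
        return "Suppressed at your request ({})".format(", ".join(sorted(hits)))
    return item["image"]

# e.g. prefs = {"bundles": ["Inappropriate for Children"]}
#      render_image({"tags": {"Profanity"}, "image": "cat.jpg"}, prefs)
#      -> 'Suppressed at your request (Profanity)'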

 

For anonymous users or users who have not explicitly set their preferences,
rendering of an article or image could first ask "This article/image has
been tagged as potentially offensive for SuchAndSuch reason, click OK to
confirm you want to view it". If they are a logged-in user, it could also
offer a link to set their preferences for future use.
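Again as nothing more than a sketch (these names are made up, not
MediaWiki's), the per-view decision might reduce to three outcomes:

def view_decision(item, user):
    """Return 'show', 'suppress' or 'confirm' for one tagged article/image.
    user is None for anonymous readers, or a dict that may hold a
    'blocked' set of classifications they have opted out of."""
    if not item["tags"]:
        return "show"
    if user is None or "blocked" not in user:
        # No preferences on record: show the "tagged as potentially
        # offensive ... click OK to confirm" prompt instead of the content.
        return "confirm"
    if item["tags"] & user["blocked"]:
        return "suppress"
    return "show"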

 

I note that movies are often made with variants for different countries.
Sometimes that's simply a matter of being dubbed into another language but
it can also include the deletion (or replacement) of certain scenes or
language that would be offensive in those countries. So it is not as if we
are reinventing the wheel here, just customising it to Wikipedia.

 

Kerry

 
