Hi Kerry,

Sad as it is to be the bearer of dispiriting news...

A proposal more or less similar to this was made by the Board in 2011
(some kind of image filtering on a user-selected basis) -
http://wikimediafoundation.org/wiki/Resolution:Controversial_content

The debate about whether (and/or how) to implement it was pretty
vicious, pretty angry, and went on for the best part of a year. A
September 2011 community poll gave interestingly mixed results -
https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-09-05/News_and_notes
and the development of any software was suspended pending further
discussion. In mid-2012, the Board then formally rescinded the
"develop a filter system" request -
http://wikimediafoundation.org/wiki/Resolution:_Personal_Image_Hiding_Feature
- and it has more or less been dead in the water since then.

There's been no significant attempt to revive it, but I think this is
in part because the wounds are still fresh - I suspect that were it to
be reopened now you'd get much the same result: a lot of heat that
eventually stalls.

It's worth noting that a very small-scale version of this is in use
for some wikis - it's been pointed out that some sexual topics on
Arabic Wikipedia have a "click to expand" field which conceals an
image - but this is pretty rare and done on a page-by-page, not
image-by-image, basis; it also has no user-level customisability.

Andrew.

On 24 July 2014 02:51, Kerry Raymond <kerry.raym...@gmail.com> wrote:
> I agree that offensiveness is in the eye of the beholder. And while there
> may be all manner of very niche groups who find strange things
> offensive (maybe some people object to seeing refrigerators or reading
> about cakes), nonetheless we know that there are a lot of widespread
> categories of offensiveness that generate the bulk of discussions about the
> inclusion of items on Wikipedia or Commons.
>
>
>
> What we could do is to have some system of classification (like the one used
> for movies) for articles, images, and/or categories, indicating that they are
> potentially offensive for various reasons. Perhaps along similar lines to
> the “content advisories” in IMDB, e.g.
>
>
>
> http://www.imdb.com/title/tt0295297/parentalguide?ref_=tt_stry_pg
>
>
>
> People could then put in their profiles that all classifications are
> acceptable to them, or specify the classifications they don’t want to
> see (e.g. Sex and Nudity, Gore and Violence, Profanity, etc. – obviously our
> classifications might not be identical to IMDB’s, as we are dealing with
> different kinds of content, but you get the idea). When that person searches
> Wikipedia or Commons, those articles, images and categories that they
> would find offensive are not returned. When a person reads an article
> containing an image categorised as offensive-to-them, the image is simply not
> displayed, or is replaced by a placeholder saying “Suppressed at your request (Sex and
> Nudity)”. We could possibly bundle these finer classifications into
> common collections, e.g. Inappropriate for Children, Suitable for Muslims,
> or whatever, so for many people it’s a simple tick-one-box.
>
>
>
> For anonymous users or users who have not explicitly set their preferences,
> rendering of an article or image could first ask “This article/image has
> been tagged as potentially offensive for SuchAndSuch reason, click OK to
> confirm you want to view it”. If they are a logged-in user, it could also
> offer a link to set their preferences for future use.
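>
> To make the idea concrete, here is a rough sketch in Python of how such a
> per-user check might work. It is purely illustrative – the classification
> names, the UserPrefs structure and the render_image function are all made
> up for this example, not anything that exists in MediaWiki today:
>
>     from dataclasses import dataclass, field
>
>     # Coarse classifications, loosely modelled on IMDB's content advisories.
>     CLASSIFICATIONS = {"sex_and_nudity", "gore_and_violence", "profanity"}
>
>     @dataclass
>     class UserPrefs:
>         # Classifications this user has asked not to see; empty = show everything.
>         suppressed: set = field(default_factory=set)
>
>     def render_image(image_tags, prefs):
>         """Decide how to present an image to a given user.
>
>         image_tags: classifications editors have applied to the image.
>         prefs: the user's stored preferences, or None for anonymous users.
>         """
>         hits = image_tags & CLASSIFICATIONS
>         if not hits:
>             return "show image"          # nothing tagged as potentially offensive
>         if prefs is None:
>             # Anonymous, or no preferences set: ask before showing.
>             return "click OK to confirm (%s)" % ", ".join(sorted(hits))
>         if hits & prefs.suppressed:
>             return "Suppressed at your request (%s)" % ", ".join(sorted(hits))
>         return "show image"
>
>     # Example: a logged-in user who has opted out of sexual content.
>     prefs = UserPrefs(suppressed={"sex_and_nudity"})
>     print(render_image({"sex_and_nudity"}, prefs))   # suppressed placeholder
>     print(render_image({"sex_and_nudity"}, None))    # click-to-confirm prompt
>
> The same check could run over search results: anything whose tags intersect
> the user's suppressed set simply isn't returned.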
>
>
>
> I note that movies are often made with variants for different countries.
> Sometimes that’s simply a matter of being dubbed into another language but
> it can also include the deletion (or replacement) of certain scenes or
> language that would be offensive in those countries. So it is not as if we
> are reinventing the wheel here, just customising it to Wikipedia.
>
>
>
> Kerry
>
>
>
> ________________________________
>
> From: gendergap-boun...@lists.wikimedia.org
> [mailto:gendergap-boun...@lists.wikimedia.org] On Behalf Of Ryan Kaldari
> Sent: Thursday, 24 July 2014 7:11 AM
> To: Addressing gender equity and exploring ways to increase the
> participation of women within Wikimedia projects.
> Subject: Re: [Gendergap] Sexualized environment on Commons
>
>
>
> Personally, I don't think it's worth having a discussion here about the
> merits of deleting these images. There's no chance in hell they are going to
> be deleted from Commons. What I'm more interested in is the locker-room
> nature of the discussions and how/if this can be addressed, as I think that
> is actually more likely to dissuade female contributors than the images
> themselves.
>
> Ryan Kaldari
>
>
>
> On Wed, Jul 23, 2014 at 2:01 PM, Pete Forsyth <petefors...@gmail.com> wrote:
>
> Ryan, thanks for bringing this up for discussion. I've put a lot of thought
> into the series of photos this comes from over the years, and it's well
> worth some discussion. I'd like to hear what others think about this. Here
> is a link to the category for the larger collection; warning, there's lots
> of nudity and sexual objectification here, so don't click if you don't want
> to see that:
> https://commons.wikimedia.org/wiki/Category:Nude_portrayals_of_computer_technology
>
> First, I agree with Ryan that in the (various) deletion discussions I've
> seen around this and similar topics, there is often a toxic level of
> childish and offensive comments. I think that's a significant problem, and I
> don't know what can be done to improve it. Scolding people in those
> discussions often backfires, and serves only to amplify the offensive
> commentary. But silence can imply tacit consent. How should one participate
> in the discussion, promoting an outcome one believes in, without
> contributing to or enabling the toxic nature of the discourse? I think I've
> done a decent job of walking that line in similar discussions, but I'm sure
> there's a lot of room for better approaches. I would love to hear what has
> worked for others, here and/or privately.
>
>
>
> Also, my initial reaction to these images is that they are inherently
> offensive; my gut reaction is to keep them off Commons.
>
>
>
> But after thinking it through and reading through a number of deletion
> discussions, the conclusion I've come to (at least so far) is that the
> decision to keep them (in spite of the childish and offensive commentary
> along the way) is the right decision. These strike me as the important
> points:
>
> * We have a collection of more than 20 million images, intended to support a
> wide diversity of educational projects. Among those 20 million files are a
> great many that would be offensive to some audience. (For instance, if I
> understand correctly, *all images portraying people* are offensive to at
> least some devout Muslims.)
> * Were these images originally intended to promote objectification of women?
> To support insightful commentary on objectification of women? Something
> else? I can't see into the minds of their creators, but I *can* imagine them
> being put to all kinds of uses, some of which would be worthwhile. The
> intent of the photographer and models, I've come to believe, is not relevant
> to the decision (apart from the basic issue of consent in the next bullet
> point).
>
> * Unlike many images on Commons, I see no reason to doubt that these were
> produced by consenting adults, and intended for public distribution.
>
> If they are to be deleted, what is the principle under which we would delete
> them? To me, that's the key question. If it's simply the fact that we as
> individuals find them offensive, I don't think that's sufficient. If it's
> out of a belief that they inherently cause more harm than good, I think the
> reasons for that would need to be fleshed out before they could be
> persuasive.
>
> Art is often meant to be provocative, to challenge our assumptions and
> sensibilities, to prompt discussion. We host a lot of art on Commons. On
> what basis would we delete these, but keep other controversial works of art?
> Of course it would be terrible to use these in, for instance, a Wikipedia
> article about HTML syntax. But overall, does it cause harm to simply have
> them exist in an image repository? My own conclusion with regard to this
> photo series is that the net value of maintaining a large and diverse
> collection of media, without endorsing its contents per se, outweighs other
> considerations.
>
>
>
> (For anybody interested in the deletion process on Commons, the kinds of
> things that are deliberated, and the way the discussions go, you might be
> interested in my related blog post from a couple months ago:
> http://wikistrategies.net/wikimedia-commons-is-far-from-ethically-broken/ )
>
>
>
> -Pete
>
> [[User:Peteforsyth]]
>
>
>
>
>
> On Wed, Jul 23, 2014 at 1:03 PM, Ryan Kaldari <rkald...@wikimedia.org>
> wrote:
>
> If anyone ever needs a good example of the locker-room environment on
> Wikimedia Commons, I just came across this old deletion discussion:
>
> https://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Radio_button_and_female_nude.jpg
>
> The last two keep votes are especially interesting. One need look no farther
> than the current Main Page talk page for more of the same (search for
> "premature ejaculation").
>
> Kaldari
>
>
>



-- 
- Andrew Gray
  andrew.g...@dunelm.org.uk

_______________________________________________
Gendergap mailing list
Gendergap@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/gendergap
