The new hovercards (which I otherwise love) have created another problem,
in that lead images show up when your cursor hovers over a wikilink.

This will only happen when the pages linked from the article you're reading
contain potentially offensive lead images, so it won't be a problem across
the board. But it's easy to find yourself hovering over a link without
intending to, so in an article that contains such links you can suddenly
have images of genitalia on your screen without ever having clicked
anything.

Sarah


On Fri, Jul 25, 2014 at 3:08 PM, Andreas Kolbe <jayen...@gmail.com> wrote:

> Thanks to Andrew Gray for covering some of the history.
>
> Kerry, there is further material that you might find of interest in a
> recent (May 2014) discussion on the Wikimedia-l mailing list:
>
>
> http://www.gossamer-threads.com/lists/engine?do=post_view_flat;post=466380;page=1;mh=-1;list=wiki;sb=post_latest_reply;so=ASC
>
> Best,
> Andreas
>
>
> On Fri, Jul 25, 2014 at 10:53 PM, Kerry Raymond <kerry.raym...@gmail.com>
> wrote:
>
>> Well, I am unsurprised that it has been considered before, as it's the
>> obvious solution. Sad that the Board lacked the will to see it through.
>>
>> But that doesn't mean it could not or should not be raised again. Social
>> justice issues rarely succeed on their first attempt. If we took that
>> attitude, women still wouldn't have the vote.
>>
>> The group we should be most concerned about is younger children. With more
>> and more children having smartphones, it is far harder for parents to
>> supervise the content they are viewing (unlike a desktop, which can be
>> positioned where the parent can keep an eye on things). At the same time,
>> WMF is putting increasing effort into the mobile platforms and the WMF
>> metrics show consistent uptrends in mobile access. The two trends suggest
>> that Wikipedia and Commons are now a lot more likely to be accessed by
>> children in an unsupervised context.
>>
>> Kerry
>>
>> -----Original Message-----
>> From: shimg...@gmail.com [mailto:shimg...@gmail.com] On Behalf Of Andrew
>> Gray
>> Sent: Saturday, 26 July 2014 4:08 AM
>> To: kerry.raym...@gmail.com; Addressing gender equity and exploring ways to
>> increase the participation of women within Wikimedia projects.
>> Subject: Re: [Spam] Re: [Gendergap] Sexualized environment on Commons
>>
>> Hi Kerry,
>>
>> Sad as it is to be the bearer of dispiriting news...
>>
>> A proposal broadly similar to this was made by the Board in 2011
>> (some kind of image filtering on a user-selected basis) -
>> http://wikimediafoundation.org/wiki/Resolution:Controversial_content
>>
>> The debate about whether (and/or how) to implement it was pretty
>> vicious, pretty angry, and went on for the best part of a year. A
>> September 2011 community poll gave interestingly mixed results -
>>
>> https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-09-05/News_and_notes
>> and the development of any software was suspended pending further
>> discussion. In mid-2012, the Board then formally rescinded the
>> "develop a filter system" request -
>>
>> http://wikimediafoundation.org/wiki/Resolution:_Personal_Image_Hiding_Feature
>> - and it has more or less been dead in the water since then.
>>
>> There's been no significant attempt to revive it, but I think this is in
>> part because the wounds are still fresh - were it to be reopened now, I
>> think you'd get much the same result: a lot of heat that eventually stalls.
>>
>> It's worth noting that a very small-scale version of this is in use
>> for some wikis - it's been pointed out that some sexual topics on
>> Arabic Wikipedia have a "click to expand" field which conceals an
>> image - but this is pretty rare and done on a page-by-page, not
>> image-by-image, basis; it also has no user-level customisability.
>>
>> Andrew.
>>
>> On 24 July 2014 02:51, Kerry Raymond <kerry.raym...@gmail.com> wrote:
>> > I agree that offensiveness is in the eye of the beholder. And while there
>> > may be all manner of very niche groups who find strange things offensive -
>> > maybe some people object to seeing refrigerators or reading about cakes -
>> > nonetheless we know that there are a lot of widespread categories of
>> > offensiveness that generate the bulk of discussions about the inclusion of
>> > items on Wikipedia or Commons.
>> >
>> >
>> >
>> > What we could do is to have some system of classification (like the
>> > movies) for articles, images, and/or categories, indicating that they are
>> > potentially offensive for various reasons - perhaps along similar lines to
>> > the "content advisories" in IMDB, e.g.
>> >
>> >
>> >
>> > http://www.imdb.com/title/tt0295297/parentalguide?ref_=tt_stry_pg
>> >
>> >
>> >
>> > People could then put in their profiles either that all classifications are
>> > acceptable to them or that these are the classifications they don't want to
>> > see (e.g. Sex and Nudity, Gore and Violence, Profanity, etc. - obviously our
>> > classifications might not be identical to IMDB's, as we are dealing with
>> > different kinds of content, but you get the idea). When that person searches
>> > Wikipedia or Commons, those articles, images and categories that they would
>> > find offensive are not returned. When a person reads an article containing
>> > an offensive-to-them categorised image, it is simply not displayed, or is
>> > replaced by a placeholder saying "Suppressed at your request (Sex and
>> > Nudity)". We could possibly bundle these finer classifications into common
>> > collections, e.g. Inappropriate for Children, Suitable for Muslims, or
>> > whatever, so for many people it's a simple tick-one-box.
>> >
>> >
>> >
>> > For anonymous users or users who have not explicitly set their preferences,
>> > rendering of an article or image could first ask "This article/image has
>> > been tagged as potentially offensive for SuchAndSuch reason; click OK to
>> > confirm you want to view it". If they are a logged-in user, it could also
>> > offer a link to set their preferences for future use.
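>> >
>> > To make the mechanics concrete, here is a rough sketch in Python. The
>> > tag names, the data model and the bundle are all made up for
>> > illustration (the real thing would have to live inside MediaWiki), but
>> > it shows the three behaviours: suppress a tagged image for a reader who
>> > opted out, ask an anonymous reader to confirm, and filter search
>> > results:
>> >
>> > # Rough sketch only: classifications, the data model and the string
>> > # "rendering" are invented for illustration; this is not MediaWiki's
>> > # actual architecture.
>> > from dataclasses import dataclass, field
>> >
>> > BUNDLES = {
>> >     "Inappropriate for Children":
>> >         {"Sex and Nudity", "Gore and Violence", "Profanity"},
>> > }
>> >
>> > @dataclass
>> > class Image:
>> >     title: str
>> >     content_tags: set = field(default_factory=set)
>> >
>> > def blocked_tags(prefs):
>> >     """Classifications this reader asked not to see (prefs is None
>> >     for anonymous readers)."""
>> >     if prefs is None:
>> >         return set()
>> >     tags = set(prefs.get("blocked_tags", []))
>> >     for bundle in prefs.get("blocked_bundles", []):  # the one tick-box
>> >         tags |= BUNDLES.get(bundle, set())
>> >     return tags
>> >
>> > def render_image(image, prefs):
>> >     hits = image.content_tags & blocked_tags(prefs)
>> >     if hits:
>> >         return "Suppressed at your request (%s)" % ", ".join(sorted(hits))
>> >     if prefs is None and image.content_tags:
>> >         # Anonymous reader, tagged image: confirm before displaying.
>> >         return "Tagged %s - click OK to view" % ", ".join(
>> >             sorted(image.content_tags))
>> >     return "[image: %s]" % image.title
>> >
>> > def search(pages, prefs):
>> >     """Drop pages/images carrying a classification the reader blocked."""
>> >     blocked = blocked_tags(prefs)
>> >     return [p for p in pages if not (p.content_tags & blocked)]
>> >
>> > # A reader who ticked the one box:
>> > prefs = {"blocked_bundles": ["Inappropriate for Children"]}
>> > img = Image("Example.jpg", {"Sex and Nudity"})
>> > print(render_image(img, prefs))  # Suppressed at your request (Sex and Nudity)
>> > print(render_image(img, None))   # Tagged Sex and Nudity - click OK to view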
>> >
>> >
>> >
>> > I note that movies are often made with variants for different countries.
>> > Sometimes that's simply a matter of being dubbed into another language, but
>> > it can also include the deletion (or replacement) of certain scenes or
>> > language that would be offensive in those countries. So it is not as if we
>> > are reinventing the wheel here, just customising it to Wikipedia.
>> >
>> >
>> >
>> > Kerry
>> >
>> >
>> >
>> > ________________________________
>> >
>> > From: gendergap-boun...@lists.wikimedia.org
>> > [mailto:gendergap-boun...@lists.wikimedia.org] On Behalf Of Ryan Kaldari
>> > Sent: Thursday, 24 July 2014 7:11 AM
>> > To: Addressing gender equity and exploring ways to increase the
>> > participation of women within Wikimedia projects.
>> > Subject: Re: [Gendergap] Sexualized environment on Commons
>> >
>> >
>> >
>> > Personally, I don't think it's worth having a discussion here about the
>> > merits of deleting these images. There's no chance in hell they are going
>> > to be deleted from Commons. What I'm more interested in is the locker-room
>> > nature of the discussions and how/if this can be addressed, as I think that
>> > is actually more likely to dissuade female contributors than the images
>> > themselves.
>> >
>> > Ryan Kaldari
>> >
>> >
>> >
>> > On Wed, Jul 23, 2014 at 2:01 PM, Pete Forsyth <petefors...@gmail.com> wrote:
>> >
>> > Ryan, thanks for bringing this up for discussion. I've put a lot of thought
>> > into the series of photos this comes from over the years, and it's well
>> > worth some discussion. I'd like to hear what others think about this. Here
>> > is a link to the category for the larger collection; warning, there's lots
>> > of nudity and sexual objectification here, so don't click if you don't want
>> > to see that:
>> >
>> > https://commons.wikimedia.org/wiki/Category:Nude_portrayals_of_computer_technology
>> >
>> > First, I agree with Ryan that in the (various) deletion discussions I've
>> > seen around this and similar topics, there is often a toxic level of
>> > childish and offensive comments. I think that's a significant problem, and
>> > I don't know what can be done to improve it. Scolding people in those
>> > discussions often backfires, and serves only to amplify the offensive
>> > commentary. But silence can imply tacit consent. How should one participate
>> > in the discussion, promoting an outcome one believes in, without
>> > contributing to or enabling the toxic nature of the discourse? I think I've
>> > done a decent job of walking that line in similar discussions, but I'm sure
>> > there's a lot of room for better approaches. I would love to hear what has
>> > worked for others, here and/or privately.
>> >
>> >
>> >
>> > Also, my initial reaction to these images is that they are inherently
>> > offensive; my gut reaction is to keep them off Commons.
>> >
>> >
>> >
>> > But after thinking it through and reading through a number of deletion
>> > discussions, the conclusion I've come to (at least so far) is that the
>> > decision to keep them (in spite of the childish and offensive commentary
>> > along the way) is the right decision. These strike me as the important
>> > points:
>> >
>> > * We have a collection of more than 20 million images, intended to support
>> > a wide diversity of educational projects. Among those 20 million files are
>> > a great many that would be offensive to some audience. (For instance, if I
>> > understand correctly, *all images portraying people* are offensive to at
>> > least some devout Muslims.)
>> > * Were these images originally intended to promote objectification of
>> > women? To support insightful commentary on objectification of women?
>> > Something else? I can't see into the minds of their creators, but I *can*
>> > imagine them being put to all kinds of uses, some of which would be
>> > worthwhile. The intent of the photographer and models, I've come to
>> > believe, is not relevant to the decision. (Apart from the basic issue of
>> > consent in the next bullet point.)
>> >
>> > * Unlike many images on Commons, I see no reason to doubt that these were
>> > produced by consenting adults, and intended for public distribution.
>> >
>> > If they are to be deleted, what is the principle under which we would
>> > delete them? To me, that's the key question. If it's simply the fact that
>> > we as individuals find them offensive, I don't think that's sufficient. If
>> > it's out of a belief that they inherently cause more harm than good, I
>> > think the reasons for that would need to be fleshed out before they could
>> > be persuasive.
>> >
>> > Art is often meant to be provocative, to challenge our assumptions and
>> > sensibilities, to prompt discussion. We host a lot of art on Commons. On
>> > what basis would we delete these, but keep other controversial works of
>> > art? Of course it would be terrible to use these in, for instance, a
>> > Wikipedia article about HTML syntax. But overall, does it cause harm to
>> > simply have them exist in an image repository? My own conclusion with
>> > regard to this photo series is that the net value of maintaining a large
>> > and diverse collection of media, without endorsing its contents per se,
>> > outweighs other considerations.
>> >
>> >
>> >
>> > (For anybody interested in the deletion process on Commons, the kinds of
>> > things that are deliberated, and the way the discussions go, you might be
>> > interested in my related blog post from a couple months ago:
>> >
>> > http://wikistrategies.net/wikimedia-commons-is-far-from-ethically-broken/ )
>> >
>> >
>> >
>> > -Pete
>> >
>> > [[User:Peteforsyth]]
>> >
>> >
>> >
>> >
>> >
>> > On Wed, Jul 23, 2014 at 1:03 PM, Ryan Kaldari <rkald...@wikimedia.org>
>> > wrote:
>> >
>> > If anyone ever needs a good example of the locker-room environment on
>> > Wikimedia Commons, I just came across this old deletion discussion:
>> >
>> > https://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Radio_button_and_female_nude.jpg
>> >
>> > The last two keep votes are especially interesting. One need look no
>> > farther than the current Main Page talk page for more of the same (search
>> > for "premature ejaculation").
>> >
>> > Kaldari
>> >
>> >
>> >
>> > _______________________________________________
>> > Gendergap mailing list
>> > Gendergap@lists.wikimedia.org
>> > https://lists.wikimedia.org/mailman/listinfo/gendergap
>> >
>>
>>
>>
>> --
>> - Andrew Gray
>>   andrew.g...@dunelm.org.uk
>>
>
>
_______________________________________________
Gendergap mailing list
Gendergap@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/gendergap
