NYT > opinion > Newitz > We Forgot About the Most Important Job on the Internet

2020-03-14 Thread nettime's_elderly_janitor


< https://www.nytimes.com/2020/03/13/opinion/sunday/online-comment-moderation.html >

Opinion

We Forgot About the Most Important Job on the Internet

Content moderators are essential gatekeepers, but also our
greeters, paramedics, law enforcers, teachers and curators.

Annalee Newitz

March 13, 2020

Most of the internet is made of comments. Some are like the
old-fashioned ones you can see accompanying this article
online, but others take the form of memes, gamer
live-streams, or even breaking news from people on the
ground at a disaster scene.

And yet, the more ubiquitous comments are, the more that
tech companies treat them like the detritus of the internet
-- little more than raw data to be mined and analyzed for
political candidates or marketers, or mechanically sorted by
algorithms for posting or rejecting.

It doesn't have to be this way. In all these efforts to
process comments in various ways, we've lost sight of one of
the most crucial jobs created by the internet economy: the
moderator.

We need to put human moderators back at the center of our
social media, where they belong. But to do it, we'll need to
acknowledge what moderators have already done, and what the
job actually involves.

"Moderator" became a tech job in the early 2000s, right
around the time when people started joking, "Never read the
comments," because they were so unbearable. Companies hired
moderators to prevent abuse, report illegal content to law
enforcement, ban commenters who broke the rules and
generally keep the peace.

But the gig was more than that. Jessamyn West, a librarian
who was a moderator for 10 years at MetaFilter, said the job
is like what Catskill entertainers of the mid-20th century
called a tummler, "the person in the room who isn't quite
the M.C. but walks around and makes sure you're doing OK."
Tummlers were basically professional minglers at shows and
social gatherings. If you were feeling shy, they'd even help
you strike up a conversation with other vacationers at the
resort.

Then, as the number of commenters soared, behemoth platforms
like Facebook and YouTube had a tough time scaling up the
tummler model. They also needed a new kind of moderator, one
who was more like a paramedic than a social director.

These moderators are the people who review abuse complaints,
usually on posts that have been flagged by users. Like
paramedics in real life, they see a lot of things they wish
they could unsee. Sarah T. Roberts, an information studies
professor at the University of California, Los Angeles, has
interviewed moderators who report spending days at a time
looking at videos of animal torture, child abuse and worse.
In her recent book, "Behind the Screen," she found that
moderators suffer traumas that are very similar to those
felt by rescue workers at a disaster scene.

To cope, some companies have tried to replace human
moderators with algorithms. The results have been mixed at
best. Some of the most high-profile failures were at
Facebook, where algorithms censored archaeological images
showing a 30,000-year-old nude figurine, while allowing live
video of suicides to circulate widely. Facebook promised
last year to hire thousands of human moderators -- and, in
some cases, to provide them with trauma therapy.

Those are good first steps for disaster-response moderation,
but we also need to revive what Ms. West called the tummler
part of the job. It's a tough gig, but it can be done.
Especially if companies admit that there is no
one-size-fits-all solution for moderation.

This is why human moderators are so valuable: they can
understand what's important to the community they're
moderating. On the Reddit forum r/science, for example,
moderators will delete posts that aren't based on
peer-reviewed scientific research. And on the fan-fiction
forum An Archive of Our Own, where many people prefer to
post stories under pseudonyms, members can be banned for
revealing the legal name of another member.

A well-trained moderator enforces these rules not just to
delete abuse, but also to build up a unique community. At
AO3, for example, there is a class of moderator called a
"tag wrangler," whose job is to make sure stories are
labeled properly for users who don't want "Iron Man" fic
mixed in with "Iron Giant" fic. Or "Iron Chef"! The forum is
also recruiting bilingual moderators who can answer
questions and post items of interest for its growing
community on Weibo, China's most popular microblogging site.

Monique Judge, an editor at the black news site The Root,
told me that she and her colleagues are inundated with
racist comments. But instead of banning the commenters, or
deleting their words, The Root lets them stand. "We let
those stay so that people can see how ignorant they are,"
she said. "I feel like those comments are just our 

Re: nettime: down & up and the need for long-term archiving

2020-03-14 Thread nettime's mods


Dear Nettimers,

First and foremost: many thanks to everyone for the offers and
suggestions, on- and off-list.

In retrospect, our description of our infrastructure as brittle was
overstated.

The issue isn't really technical. Over more than 25 years, nettime has
relied on extensive networks of trans-atlantic friendship, and they've
served the list amazingly well. For many years now, nettime-l has been
run by kein.org, while the server (nettime.org) for the archive and for
the other nettime lists (such as nettime-nl) has been run by
bitnik.org. We're very happy -- and grateful -- about this.

The reason we raise the archiving question has less to do with any
breakdown of technical infrastructure than with the fact that the value
of the list changes at some point, from a living engagement to something
more historical. The earlier model is sustained by day-to-day attention
and care, but this later model needs a more institutional kind of care.
But we should note that this isn't a synchronous process; for some, the
'nettime' project ceased to be interesting long ago, while for others
it's currently of interest, or they see it as in a lull from which it
could become interesting again in the future.

So when we speak about looking for an archival solution, we're thinking
about something that has a reasonable chance of lasting, let's say
arbitrarily, 25 years after the last 'nettimer' stops caring.


All the best. Felix & Ted



On 05.03.20 08:43, nettime's mods wrote:
> Dear Nettimers,
> 
> Very few people have noticed that the list was down for more than a month.
> Indeed, even we didn't notice this immediately. It's up again now, but...





#  distributed via <nettime>: no commercial use without permission
#  <nettime> is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mx.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nett...@kein.org
#  @nettime_bot tweets mail w/ sender unless #ANON is in Subject:


no go topics in state media

2020-03-14 Thread János Sugár



Hungarian state media bosses told staff they need permission to report
on Greta Thunberg and EU politics, and banned coverage of reports from
leading human rights organizations, according to internal emails
obtained by POLITICO.

Editors working in state media are provided with lists of sensitive
topics, and any coverage related to the issues mentioned requires staff
to send draft content for approval from higher up, the internal
correspondence shows. In the case of Thunberg, the Swedish climate
activist, journalists were told they need permission before they even
start writing, according to one email.

Hungary is currently subject to the EU's Article 7 censure procedure,
triggered when the bloc's fundamental values are considered at risk in
a member country. The European Parliament launched the procedure in
2018, citing media freedom as one of many issues that gave cause for
alarm. Prime Minister Viktor Orbán's government has dismissed such
concerns.



https://www.politico.eu/article/hungarian-state-media-not-free-to-report-on-greta-thunberg-human-rights/

#  distributed via <nettime>: no commercial use without permission
#  <nettime> is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mx.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nett...@kein.org
#  @nettime_bot tweets mail w/ sender unless #ANON is in Subject: