Kerry, I think that I agree with you. A while back, my impression from
English Wikipedia arbitration pages was that there is a relatively small
number of users who stir up trouble repeatedly and are sometimes sanctioned
but rarely blocked. I don't want to speak for the Arbitration Committee,
and since Arbcom changes membership periodically I'm reluctant to criticize
current arbcom members for decisions of the committee in prior years. My
impression is that over the years Arbcom has become more willing to
sanction administrators who use their admin tools in ways that Arbcom feels
are not okay, which I think is progress, but there's much more besides
dealing with problematic administrators that ideally would be done to
address incivility, personal attacks, and harassment.

That brings me to Chris' email, and unfortunately I don't have answers for
most of his points. Differing interpretations and values are likely to be a
fact of life in the Wikiverse regardless of good intentions. I think that
some of us have more emotional armor than others, and some of us are more
willing than others to participate in uncomfortable or contentious
discussions. Similarly, people have a variety of emotional triggers that,
from my perspective, have little to do with reason and a lot to do with
other factors, some of which we probably don't control any more than we
control our autonomic reflexes. I don't think it is other people's
responsibility to work delicately around someone's reflexes (which
I would guess vary significantly from person to person and are often
unpredictable), but neither should one intentionally try to trigger someone
else, and people who accidentally overreact when triggered should apologize
for doing so (I can recall making such an apology myself on one occasion,
and I think I've gotten better over the years about handling myself in
difficult situations). Public discourse in the Wikiverse, in politics, and
in any number of other venues requires a certain willingness to take risks
and to hear things that we might not want to hear and might find
offensive. In attempting to reduce the frequency and
intensity of personal attacks and harassment, I think that we need to be
careful that we don't go so far as to say that people "have a right not to
be offended", since others' beliefs and statements are very likely to seem
different or strange or alienating from time to time. However, I also hope
that we can reduce some of the more aggressive behavior that, I think by
broad consensus, serves no purpose compatible with -- or at least not
opposed to -- Wikimedia's goals.

That brings me back to the training of the AI, and what it will be flagging
for admins to review. I recall getting the impression from Maggie's
presentation at a metrics meeting that the AI was catching some edits that
come across to me as very likely to meet the ENWP definition of a personal
attack, and I think that having an AI that could help admins might indeed
be useful. However, there's another dimension to this problem which we
haven't addressed, which is the limited human resource capacity of the
admin corps, and the limited number of individuals who are willing to spend
their free time policing Wikimedia and dealing with controversial or even
dangerous situations. So I think that the AI, and attempts to detoxify
Wikimedia, if designed well, can indeed be good -- but I can't help but
wonder whether they will be insufficient unless the admin corps is also
expanded, with skilled and selfless administrators, in proportion to the
need, and I'm not sure what the solution to that problem will be. Human
resources are a constraint throughout the Wikiverse, and I think they may
constrain detoxification efforts as well.

Chris, returning to your point about emotional literacy: I don't know how
to address that systemically, although perhaps training might be
beneficial. I get the impression that in the western world, police officers
and military personnel (who seem to be disproportionately male, although
perhaps slightly less so than Wikipedia's population) are increasingly
trained in emotional resilience, communications, and other psychological
issues. Perhaps training is something that we could think about doing on a
large scale, although that would be complicated. WMF has already started
some limited training for functionaries, and I think that expanding
training might indeed be useful. Training probably won't be a cure, but it
might help to move the needle a bit. I would encourage WMF to consider
doing research into what kind of training might be beneficial for
Wikimedia's social environment, and how best to deliver that training, on a
large scale.

Pine
_______________________________________________
Wiki-research-l mailing list
Wiki-research-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wiki-research-l