Unfortunately (but not surprisingly), not a very in-depth interview, and
rather US-centric.

https://www.nytimes.com/2021/09/23/technology/wikipedia-misinformation.html
*Give us a sense of your direction and vision for Wikimedia, especially in
such a fraught information landscape and in this polarized world.*

There are a few core principles of Wikimedia projects, including Wikipedia,
that I think are important starting points. It’s an online encyclopedia.
It’s not trying to be anything else. It’s certainly not trying to be a
traditional social media platform in any way. It has a structure that is
led by volunteer editors. And as you may know, the foundation has no
editorial control. This is very much a user-led community, which we support
and enable.

The lessons to learn, not just from what we’re doing but from how we
continue to iterate and improve, start with this idea of radical
transparency. Everything on Wikipedia is cited. It’s debated on our talk
pages. So even when people may have different points of view, those debates
are public and transparent, and in some cases really allow for the right
kind of back and forth. I think that’s the need in such a polarized society
— you have to make space for the back and forth. But how do you do that in
a way that’s transparent and ultimately leads to a better product and
better information?

And the last thing that I’ll say is, you know, this is a community of
extremely humble and honest people. As we look to the future, how do we
build on those attributes in terms of what this platform can continue to
offer society and provide free access to knowledge? How do we make sure
that we are reaching the full diversity of humanity in terms of who is
invited to participate, who is written about? How are we really making sure
that our collective efforts reflect more of the global south, reflect more
women and reflect the diversity of human knowledge, to be more reflective
of reality?

*What is your take on how Wikipedia fits into the widespread problem of
disinformation online?*

Many of the core attributes of this platform are very different from some
of the traditional social media platforms. If you take misinformation
around Covid, the Wikimedia Foundation entered into a partnership with the
World Health Organization. A group of volunteers came together around what
was called WikiProject Medicine, which is focused on medical content and
creating articles that are then very carefully monitored, because these
are the kinds of topics where you want to be mindful of misinformation.

Another example is that the foundation put together a task force ahead of
the U.S. elections, again, trying to be very proactive. [The task force
supported 56,000 volunteer editors watching and monitoring key election
pages.] And the fact that there were only 33 reversions on the main U.S.
election page was an example of how to be very focused on key topics where
misinformation poses real risks.

Then another example that I just think is really cool is there’s a podcast
called “The World According to Wikipedia.” And on one of the episodes,
there’s a volunteer who is interviewed, and she really has made it her job
to be one of the main watchers of the climate change pages.

We have tech that alerts these editors when changes are made to any of the
pages so they can go see what the changes are. If there’s a risk that,
actually, misinformation may be creeping in, there’s an opportunity to
temporarily lock a page. Nobody wants to do that unless it’s absolutely
necessary. The climate change example is useful because the talk pages
behind that have massive debate. Our editor is saying: “Let’s have the
debate. But this is a page I’m watching and monitoring carefully.”
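The alerting described here is built on the public MediaWiki Action API, which exposes each page's revision history. A minimal sketch of how a watcher could poll a page's recent edits, assuming the standard `action=query&prop=revisions` endpoint (the endpoint and parameters are real; the helper functions and sample data are mine, for illustration only):

```python
# Sketch (not an official Wikimedia tool): polling a page's recent
# revisions via the public MediaWiki Action API, roughly the data that
# watchlist-style alerts are built on.
import urllib.parse

API_ENDPOINT = "https://en.wikipedia.org/w/api.php"

def build_revisions_url(title: str, limit: int = 5) -> str:
    """Build an Action API query URL for a page's most recent revisions."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids|timestamp|user|comment",
        "rvlimit": str(limit),
        "format": "json",
    }
    return API_ENDPOINT + "?" + urllib.parse.urlencode(params)

def extract_revisions(response_json: dict) -> list:
    """Flatten the API's nested pages structure into a list of revisions."""
    pages = response_json.get("query", {}).get("pages", {})
    revisions = []
    for page in pages.values():
        revisions.extend(page.get("revisions", []))
    return revisions

# Canned response in the API's response shape, so the parsing can be
# exercised without a network call (the values are made up):
sample = {
    "query": {
        "pages": {
            "5042951": {
                "title": "Climate change",
                "revisions": [
                    {"revid": 101, "user": "ExampleEditor",
                     "timestamp": "2021-09-23T12:00:00Z",
                     "comment": "copyedit"},
                ],
            }
        }
    }
}

url = build_revisions_url("Climate change")
revs = extract_revisions(sample)
```

In practice a watcher would fetch that URL on a schedule (or subscribe to the EventStreams feed) and diff the newest `revid` against the last one seen, alerting only on changes.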

*One big debate that is currently happening on these social media platforms
is this issue of the censorship of information. There are people who claim
that biased views take precedence on these platforms and that more
conservative views are taken down. As you think about how to handle these
debates once you’re at the head of Wikipedia, how do you make judgment
calls with this happening in the background?*

For me, what’s been inspiring about this organization and these communities
is that there are core pillars that were established on Day 1 in setting up
Wikipedia. One of them is this idea of presenting information with a
neutral point of view, and that neutrality requires understanding all sides
and all perspectives.

It’s what I was saying earlier: Have the debates on talk pages on the side,
but then come to an informed, documented, verifiable, citable
conclusion in the articles. I think this is a core principle that, again,
could potentially offer something to others to learn from.

*Having come from a progressive organization fighting for women’s rights,
have you thought much about misinformers weaponizing your background to say
it may influence the calls you make about what is allowed on Wikipedia?*

I would say two things. I would say that the really relevant aspects of the
work that I’ve done in the past are volunteer-led movements, which is
probably a lot harder than others might think, and that I played a really
operational role in understanding how to build systems, build culture and
build processes that I think are going to be relevant for an organization
and a set of communities that are trying to increase their scale and reach.

The second thing that I would say is, again, I’ve been on my own learning
journey and invite you to be on a learning journey with me. How I choose to
be in the world is that we interact with others with an assumption of good
faith and that we engage in respectful and civilized ways. That doesn’t
mean other people are going to do that. But I think that we have to hold on
to that as an aspiration and as a way to, you know, be the change that we
want to see in the world as well.

*When I was in college, I would do a lot of my research on Wikipedia, and
some of my professors would say, ‘You know, that’s not a legitimate
source.’ But I still used it all the time. I wondered if you had any
thoughts about that!*

I think now most professors admit that they sneak onto Wikipedia as well to
look for things!

You know, we’re celebrating the 20th year of Wikipedia this year. On the
one hand, here was this thing that I think people mocked and said wouldn’t
go anywhere. And it’s now become legitimately the most referenced source in
all of human history. I can tell you just from my own conversations with
academics that the narrative around the sources on Wikipedia and using
Wikipedia has changed.
_______________________________________________
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/PVTVLFF5DH3JWGUMFIAO7KPSUDJGHIU2/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org