[Wiki-research-l] Re: Talk on open participation and information quality in Wikipedia (Benjamin Mako Hill, 2023-02-15)

2023-02-08 Thread Tristan Miller

Greetings.

On 08/02/2023 21.29, Leila Zia wrote:
> Hi Tristan,
>
> Thank you for sharing this with us. If the talk is recorded, I'd
> appreciate it if you could share a link with us once it becomes
> available after the session.


Thanks for your interest in the talk! You're not the only one who wrote
to ask whether there will be a recording.


We always ask our guest speakers whether they consent to having their
talks recorded. If they agree, we post the videos on our YouTube channel:
https://www.youtube.com/@ofai


If this particular talk ends up being recorded, I'll follow up here on
the mailing list with a link to the video.


Regards,
Tristan

--
Dr.-Ing. Tristan Miller, Research Scientist
Austrian Research Institute for Artificial Intelligence (OFAI)
Freyung 6/6, 1010 Vienna, Austria | Tel: +43 1 5336112 12
https://logological.org/ | https://punderstanding.ofai.at/
___
Wiki-research-l mailing list -- wiki-research-l@lists.wikimedia.org
To unsubscribe send an email to wiki-research-l-le...@lists.wikimedia.org


[Wiki-research-l] Re: Talk on open participation and information quality in Wikipedia (Benjamin Mako Hill, 2023-02-15)

2023-02-08 Thread Leila Zia
Hi Tristan,

Thank you for sharing this with us. If the talk is recorded, I'd
appreciate it if you could share a link with us once it becomes available
after the session.

Looking forward to learning more. :)

Leila

On Wed, Feb 8, 2023 at 2:23 AM Tristan Miller wrote:

> The Wikimedia Foundation has developed a set of ML/AI systems that have
> been shaping editing behaviour on Wikipedia. How these tools have
> impacted the efficiency and fairness of moderation work will be
> discussed in "Balancing Open Participation and Information Quality in
> Wikipedia Using Machine Learning", a talk by Benjamin Mako Hill of the
> University of Washington. The talk is part of the 2023 Lecture Series of
> the Austrian Research Institute for Artificial Intelligence:
> https://www.ofai.at/events/lectures2023
___
Wiki-research-l mailing list -- wiki-research-l@lists.wikimedia.org
To unsubscribe send an email to wiki-research-l-le...@lists.wikimedia.org


[Wiki-research-l] [Wikimedia Research Showcase] February 15 at 9:30AM PT, 17:30 UTC

2023-02-08 Thread Emily Lescak
Hello everyone,

The next Research Showcase will be livestreamed next Wednesday, February 15
at 9:30AM PT / 17:30 UTC. The theme is The Free Knowledge Ecosystem.

YouTube stream: https://www.youtube.com/watch?v=8VJmR-3lTac

We welcome you to join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

The evolution of humanitarian mapping in OpenStreetMap (OSM) and how it
affects map completeness and inequalities in OSM
By *Benjamin Herfort, Heidelberg Institute for Geoinformation Technology*

Mapping efforts of communities in OpenStreetMap (OSM) over the previous
decade have created a unique global geographic database, which is
accessible to all with no licensing costs. The collaborative maps of OSM
have been used to support humanitarian efforts around the world as well as
to fill important data gaps for implementing major development frameworks
such as the Sustainable Development Goals (SDGs). Besides the well-examined
Global North - Global South bias in OSM, the OSM data as of 2023 shows a
much more spatially diverse spread pattern than previously considered,
which was shaped by regional, socio-economic and demographic factors across
several scales. Humanitarian mapping efforts of the previous decade have
already made OSM more inclusive, contributing to diversifying and expanding
the spatial footprint of the areas mapped. However, methods to quantify and
account for the remaining biases in OSM’s coverage are needed so that
researchers and practitioners will be able to draw the right conclusions,
e.g. about progress towards the SDGs in cities.


Dataset reuse: Toward translating principles to practice
By *Laura Koesten, University of Vienna*

The web provides access to millions of datasets. These data can have
additional impact when used beyond the context for which they were
originally created. But using a dataset beyond the context in which it
originated remains challenging. Simply making data available does not mean
it will be or can be easily used by others. At the same time, we have
little empirical insight into what makes a dataset reusable and which of
the existing guidelines and frameworks have an impact.

In this talk, I will discuss our research on what makes data reusable in
practice. This is informed by a synthesis of literature on the topic, our
studies on how people evaluate and make sense of data, and a case study on
datasets on GitHub. In the case study, we describe a corpus of more than
1.4 million data files from over 65,000 repositories. Building on reuse
features from the literature, we use GitHub’s engagement metrics as proxies
for dataset reuse and devise an initial model, using deep neural networks,
to predict a dataset’s reusability. This demonstrates the practical gap
between principles and actionable insights that might allow data publishers
and tool designers to implement functionalities that facilitate reuse.

We hope you can join us!

Warm regards,
Emily


-- 
Emily Lescak (she / her)
Senior Research Community Officer
The Wikimedia Foundation
___
Wiki-research-l mailing list -- wiki-research-l@lists.wikimedia.org
To unsubscribe send an email to wiki-research-l-le...@lists.wikimedia.org


[Wiki-research-l] Talk on open participation and information quality in Wikipedia (Benjamin Mako Hill, 2023-02-15)

2023-02-08 Thread Tristan Miller
The Wikimedia Foundation has developed a set of ML/AI systems that have 
been shaping editing behaviour on Wikipedia. How these tools have 
impacted the efficiency and fairness of moderation work will be 
discussed in "Balancing Open Participation and Information Quality in 
Wikipedia Using Machine Learning", a talk by Benjamin Mako Hill of the 
University of Washington. The talk is part of the 2023 Lecture Series of 
the Austrian Research Institute for Artificial Intelligence: 
https://www.ofai.at/events/lectures2023


Members of the public are cordially invited to attend the talk via Zoom 
on Wednesday, 15 February at 18:30 CET (UTC+1):


URL: 
https://us06web.zoom.us/j/84282442460?pwd=NHVhQnJXOVdZTWtNcWNRQllaQWFnQT09

Meeting ID: 842 8244 2460
Passcode: 678868

Talk abstract: Peer-produced information goods like free/open source
software and Wikipedia are both increasingly important and increasingly 
under threat. This talk will describe how Wikipedia has sought to 
balance its commitment to open editing and its desire to allow 
participation from unvetted and anonymous users with its need to 
maintain high information quality in its articles. I will focus on the 
way that a set of ML/AI systems developed by the Wikimedia Foundation 
allow scholars to measure the value of contributions from anonymous 
users and the surprising way that these systems can also be used by the 
Wikipedia community to shape editing behavior. I will argue that use of
these ML/AI systems can improve the efficiency of moderation work while
also making moderation actions fairer to anonymous contributors, who are
the source of substantial vandalism, by reducing reliance on social
signals and making norm violations by everyone else more visible.


Speaker biography: Benjamin Mako Hill is an Associate Professor in the 
University of Washington Department of Communication and an Adjunct 
Associate Professor in the Department of Human-Centered Design & 
Engineering, the Paul G. Allen School of Computer Science & Engineering, 
and the Information School. He is a member of the Community Data Science
Collective, which he founded with Aaron Shaw. At UW, he is also Affiliate
Faculty in the Center for Statistics and the Social Sciences, the
eScience Institute, and the "Design Use Build" (DUB) group that supports
research on human-computer interaction. He is also a Faculty
Associate at the Berkman Klein Center for Internet and Society at 
Harvard University and an affiliate of the Institute for Quantitative 
Social Science at Harvard.


--
Dr.-Ing. Tristan Miller, Research Scientist
Austrian Research Institute for Artificial Intelligence (OFAI)
Freyung 6/6, 1010 Vienna, Austria | Tel: +43 1 5336112 12
https://logological.org/ | https://punderstanding.ofai.at/
___
Wiki-research-l mailing list -- wiki-research-l@lists.wikimedia.org
To unsubscribe send an email to wiki-research-l-le...@lists.wikimedia.org