[Wikitech-l] Applications for WikiCite 2017 (Vienna, May 23-25, 2017) close in 5 days

2017-02-23 Thread Dario Taraborelli
A reminder that applications to attend WikiCite 2017
<https://meta.wikimedia.org/wiki/WikiCite_2017> close on *February 27, 2017*.

Please consider applying
<https://docs.google.com/forms/d/e/1FAIpQLScWnCLfAt88cUWKSu_E-lU8m3te_r4P3ngJtCaPs7cewSwkew/viewform>
if you work on sources and citations (or related tools) in Wikipedia,
Wikidata, Wikisource or other Wikimedia projects. If there are other people
in your network we should consider inviting to the event, please let us
know. You can contact the organizing committee at: wikic...@wikimedia.org.

Best,
Dario
 -- on behalf of the organizers


On Thu, Feb 9, 2017 at 3:44 PM, Dario Taraborelli <
dtarabore...@wikimedia.org> wrote:

> Dear all,
>
> I am happy to announce that applications to attend WikiCite ’17 officially
> open today <https://goo.gl/forms/Kb9Wl6Xfw2EmFqEr2>.
>
> About the event
>
> WikiCite 2017 <https://meta.wikimedia.org/wiki/WikiCite_2017> is a 3-day
> conference, summit and hack day to be hosted in Vienna, Austria, on May
> 23-25, 2017. It expands on efforts started last year at WikiCite 2016
> <https://meta.wikimedia.org/wiki/WikiCite_2016/Report> to design a
> central bibliographic repository, as well as tools and strategies to
> improve information quality and verifiability in Wikimedia projects.
>
> Our goal is to bring together Wikimedia contributors, data modelers,
> information and library science experts, software engineers, designers and
> academic researchers who have experience working with Wikipedia's citations
> and bibliographic data.
>
> WikiCite 2017 will be a venue to:
>
>    - Day 1 (Conference) – present progress on existing work and initiatives
>      for citations and bibliographic data across Wikimedia projects
>    - Day 2 (Summit) – discuss technical, social, outreach and policy
>      directions
>    - Day 3 (Hack) – get together to build, based on new ideas and
>      applications
>
> More information on the event can be found here:
> <https://meta.wikimedia.org/wiki/WikiCite_2017>
>
> How to apply
>
> Participation in this year's event is limited to 100 individuals. To be
> considered for participation, please fill out the following form
> <https://goo.gl/forms/Kb9Wl6Xfw2EmFqEr2> and provide us with some
> information about yourself, your interests, and your expected contribution.
> PLEASE NOTE THIS IS NOT THE FINAL REGISTRATION FORM. Your application will
> be reviewed and the organizing committee will extend an invitation by March
> 10, 2017. The application form helps us determine the best mix of
> attendees. Not everyone who applies will receive an invitation, but there
> will be a waitlist.
>
> Important dates
>
>    - February 9, 2017: applications open
>    - February 27, 2017: applications close, waitlist opens
>    - March 10, 2017: all final notifications of acceptance are issued,
>      waitlist processing begins
>    - March 31, 2017: attendee list is finalized
>
> Travel support
>
> Like last year, limited funding to cover travel costs of prospective
> participants will be available. Requests for travel support should be
> submitted via the application form
> <https://goo.gl/forms/Kb9Wl6Xfw2EmFqEr2>. We will confirm by March 10
> whether we can provide you with travel support.
>
> Contact
>
> For any questions, you can contact the organizing committee via:
> wikic...@wikimedia.org
>
> We look forward to seeing you in Vienna!
>
> The WikiCite 2017 organizing committee
>
> Dario Taraborelli
>
> Jonathan Dugan
>
> Lydia Pintscher
>
> Daniel Mietchen
>
> Cameron Neylon
>
>
>
> *Dario Taraborelli  *Director, Head of Research, Wikimedia Foundation
> wikimediafoundation.org • nitens.org • @readermeter
> <http://twitter.com/readermeter>
>



-- 

*Dario Taraborelli  *Director, Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter
<http://twitter.com/readermeter>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Research FAQ gets a facelift

2016-06-20 Thread Dario Taraborelli
We just released a new version of Research:FAQ on Meta [1], significantly
expanded and updated, to make our processes at WMF more transparent and to
meet an explicit FDC request to clarify the roles and responsibilities of
individual teams involved in research across the organization.

The previous version – written from the perspective of the (now inactive)
Research:Committee, and mostly obsolete since the release of WMF's open
access policy [2] – can still be found here [3].

Comments and bold edits to the new version of the document are welcome. For
any questions or concerns, you can drop me a line or ping my username on-wiki.

Thanks,
Dario

[1] https://meta.wikimedia.org/wiki/Research:FAQ
[2] https://wikimediafoundation.org/wiki/Open_access_policy
[3] https://meta.wikimedia.org/w/index.php?title=Research:FAQ&oldid=15176953


*Dario Taraborelli  *Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter
<http://twitter.com/readermeter>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] An early preview from WikiCite

2016-06-01 Thread Dario Taraborelli
Hey all,

Wikimedia Deutschland and the Wikimedia Foundation hosted the WikiCite
event in Berlin last week, bringing together a large group of Wikidatans,
Wikipedians, librarians, developers and researchers from all over the world.

The event built a lot of momentum around the definition of data models,
workflows and technology needed to better represent source and citation
data from Wikimedia projects, Wikidata in particular.

While we're still drafting a human-readable report, I thought I'd share
a preview of the notes from the various workgroups, to give you a sense of
what we worked on and to let everyone join the discussion:

Main workgroups

   - Modeling bibliographic source metadata: discuss and draft data models
     to represent different types of sources as Wikidata items
   - Reference extraction and metadata lookup tools: design or improve tools
     to extract identifiers and bibliographic data from Wikipedia citation
     templates, and to look up and retrieve metadata
   - Representing citations and citation events: discuss how to express the
     citation of a source in a Wikimedia artifact (such as a Wikipedia
     article, a Wikidata statement, etc.) and review alternative ways to
     represent them
   - (Semi-)automated ways to add references to Wikidata statements: improve
     tools for semi-automated statement and reference creation (StrepHit,
     ContentMine)
   - Use cases for source-related queries: identify use cases for SPARQL
     queries involving source metadata, and obtain a small openly licensed
     bibliographic and citation graph dataset to build a proof of concept of
     the querying and visualization potential of source metadata in Wikidata
     (a sketch of such a query follows below)
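
Here is that sketch – a minimal, illustrative query of the kind this
workgroup has in mind, sent to the public Wikidata Query Service from
Python. The item and property IDs used (Q13442814 "scholarly article",
P31 "instance of", P356 "DOI") are my own choices for the example and
should be double-checked against Wikidata before relying on them:

    # Minimal sketch: list a few scholarly articles on Wikidata that carry a DOI.
    # Assumes the public SPARQL endpoint at query.wikidata.org.
    import requests

    query = """
    SELECT ?work ?workLabel ?doi WHERE {
      ?work wdt:P31 wd:Q13442814 .   # instance of: scholarly article
      ?work wdt:P356 ?doi .          # DOI
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
    }
    LIMIT 10
    """

    response = requests.get(
        "https://query.wikidata.org/sparql",
        params={"query": query, "format": "json"},
        headers={"User-Agent": "wikicite-sparql-sketch/0.1"},
    )
    for row in response.json()["results"]["bindings"]:
        print(row["doi"]["value"], "-", row["workLabel"]["value"])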

Additional workgroups

   - Wikidata as the central hub for license information on databases: add
     license information to Wikidata so it can serve as the central hub for
     license information on databases
   - Using citations and bibliographic source metadata: merge the groups
     working on citation structure and source metadata models and integrate
     their recommendations
   - Citoid-Wikidata integration: extend Citoid to write source metadata
     into Wikidata

We're opening up the wikicite-disc...@wikimedia.org mailing list to anyone
interested in interacting with the participants in the event (we encouraged
them to use the official wikidata list for anything of interest to the
broader community). Phabricator also has a dedicated tag for related
initiatives.

The event was generously funded by the Alfred P. Sloan Foundation, the
Gordon and Betty Moore Foundation, and Crossref. We'll be exploring the
feasibility of a follow-up event in the next 6-12 months to continue the
work we started in Berlin and to bring in more people than we could host
this time due to funding and capacity constraints.

Best,

Dario
on behalf of the organizers
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Analytics] 'Unique Devices' Data Visualizations Available

2016-05-30 Thread Dario Taraborelli
neat, well done people

> On May 24, 2016, at 9:51 PM, Nuria Ruiz  wrote:
> 
> 
> Hello!
> 
> 
> The analytics team would like to announce that we have a new visualization
> for Unique Devices data. As you know, Unique Devices [1] is our best proxy
> for calculating unique users. We would like to reiterate that the data is
> available in a public API that anyone can access [2]. We calculate uniques
> daily and monthly.
>   
> 
> See, for example, "Daily Unique Devices" for Spanish Wikipedia versus French
> Wikipedia:
> https://vital-signs.wmflabs.org/#projects=frwiki,eswiki/metrics=UniqueDevices 
> <https://vital-signs.wmflabs.org/#projects=frwiki,eswiki/metrics=UniqueDevices>
> 
> FYI, that dashboard will not work on IE, only on Edge.
> 
> Thanks, 
> 
> Nuria
> 
> [1] https://meta.wikimedia.org/wiki/Research:Unique_Devices 
> <https://meta.wikimedia.org/wiki/Research:Unique_Devices>
> [2] https://wikitech.wikimedia.org/wiki/Analytics/Unique_Devices 
> <https://wikitech.wikimedia.org/wiki/Analytics/Unique_Devices>
> 
> 
> ___
> Analytics mailing list
> analyt...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/analytics
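
(For anyone who'd rather script against this than use the dashboard, here is
a minimal sketch pulling the same data from the public API referenced in [2]
above. The exact REST path and response fields are my assumption – please
check the wikitech page for the current interface.)

    # Minimal sketch: fetch monthly unique-devices counts for Spanish Wikipedia.
    # The endpoint path and field names are assumed from the Analytics docs [2];
    # verify them against the wikitech page before relying on them.
    import requests

    url = ("https://wikimedia.org/api/rest_v1/metrics/unique-devices/"
           "es.wikipedia.org/all-sites/monthly/20160101/20160601")
    response = requests.get(url, headers={"User-Agent": "unique-devices-demo/0.1"})
    for item in response.json().get("items", []):
        print(item["timestamp"], item["devices"])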



Dario Taraborelli  Head of Research, Wikimedia Foundation
wikimediafoundation.org <http://wikimediafoundation.org/> • nitens.org 
<http://nitens.org/> • @readermeter <http://twitter.com/readermeter>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Applications open for WikiCite (Berlin, May 25-26, 2016)

2016-03-29 Thread Dario Taraborelli
Citations and references are the building blocks of Wikimedia projects.
However, as of today, they are still treated as second-class citizens.
Structured databases such as Wikidata offer a unique opportunity
<https://www.wikidata.org/wiki/Wikidata:WikiProject_Source_MetaData> to
realize over a decade of efforts to bring the sum of all citations and
bibliographic metadata together in a centralized repository. To
coordinate upcoming work in this space, we're organizing a technical event
in late May and opening up applications for prospective participants.

*WikiCite 2016 <https://meta.wikimedia.org/wiki/WikiCite_2016>* is a
hands-on event focused on designing data models and technology to *improve
the coverage, quality, standards-compliance and machine-readability of
citations and source metadata in Wikipedia, Wikidata and other Wikimedia
projects*. Our goal, in particular, is to define a technical roadmap for
building a repository of all Wikimedia references in Wikidata.

We are bringing together Wikidatans, Wikipedians, software engineers, data
modelers, and information and library science experts from organizations
including *Crossref*, *Zotero*, *CSL*, *ContentMine*, *Google*, *Datacite*,
*NISO*, *OCLC* and the *NIH*. We are also inviting academic researchers
with experience working with Wikipedia's citations and bibliographic data.

WikiCite will be hosted in *Berlin* on *May 25-26, 2016*. Participation in
the event is capped at about 50 people and we expect to have a number of
open slots for applicants:

   - if you were pre-invited and have already filled in a form, you will
   receive a separate note from the organizers
   - if you have not been invited but would like to participate, please
   fill in this application form <http://goo.gl/forms/Yv6rve2wCt> to give
   us some information about yourself, your interests, and your expected
   contribution to the event.

Please help us pass this on to anyone who has done important technical work
on Wikimedia references and citations.

*Important dates*

   - *March 29, 2016*: applications open
   - *April 11, 2016*: applications close
   - *April 15, 2016*: notifications of acceptance are issued (if you
   applied for a travel grant, we'll be able to confirm by this date if we can
   cover the costs of your trip)


For any questions, you can contact the organizing committee at:
wikic...@wikimedia.org

The organizers,

Dario Taraborelli
Jonathan Dugan
Lydia Pintscher
Daniel Mietchen
Cameron Neylon


*Dario Taraborelli  *Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter
<http://twitter.com/readermeter>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] What Wikimedia Research is up to in the next quarter

2015-12-18 Thread Dario Taraborelli
Hey all,

I’m glad to announce that the Wikimedia Research team’s goals
<https://www.mediawiki.org/wiki/Wikimedia_Research/Goals#January_-_March_2016_.28Q3.29>
for
the next quarter (January - March 2016) are up on wiki.

The Research and Data
<https://www.mediawiki.org/wiki/Wikimedia_Research#Research_and_Data> team
will continue to work with our volunteers and collaborators on revision
scoring as a service <https://meta.wikimedia.org/wiki/R:Revscoring>, adding
support for five new languages and prototyping new models (including an edit
type classifier
<https://meta.wikimedia.org/wiki/Research:Automated_classification_of_edit_types>).
We will also continue to iterate on the design of article creation
recommendations
<https://meta.wikimedia.org/wiki/Research:Increasing_article_coverage>,
running a dedicated campaign in coordination with existing editathons to
improve the quality of these recommendations. Finally, we will extend a
research project we started in November aimed at understanding the behavior
of Wikipedia readers
<https://meta.wikimedia.org/wiki/Research:Characterizing_Wikipedia_Reader_Behaviour>
, by combining qualitative survey data with behavioral analysis from our
HTTP request logs.

The Design Research
<https://www.mediawiki.org/wiki/Wikimedia_Research#Design_Research> team
will conduct an in-depth study of user needs (particularly readers) on the
ground in February. We will continue to work with other Wikimedia
Engineering teams throughout the quarter to ensure the adoption of
human-centered design principles and pragmatic personas
<https://www.mediawiki.org/wiki/Personas_for_product_development> in our
product development cycle. We’re also excited to start a collaboration
<https://meta.wikimedia.org/wiki/Research:Publicly_available_online_learning_resource_survey>
with
students at the University of Washington to understand what free online
information resources (including, but not limited to, Wikimedia projects)
students use.

I am also glad to report that two papers on link and article
recommendations (the result of a formal collaboration with a team at
Stanford) were accepted for presentation at WSDM '16 and WWW ’16 (preprints
will be made available shortly). An overview of revision scoring as a
service
<http://blog.wikimedia.org/2015/11/30/artificial-intelligence-x-ray-specs/>
was published a few weeks ago on the Wikimedia blog and received some good
media coverage
<https://meta.wikimedia.org/wiki/Research:Revision_scoring_as_a_service/Media>
.

We're constantly looking for contributors and as usual we welcome feedback
on these projects via the corresponding talk pages on Meta. You can contact
us for any question on IRC via the #wikimedia-research channel and follow
@WikiResearch <https://twitter.com/WikiResearch> on Twitter for the latest
Wikipedia and Wikimedia research updates hot off the press.

Wishing you all happy holidays,

Dario and Abbey on behalf of the team


*Dario Taraborelli  *Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter
<http://twitter.com/readermeter>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Revision scoring as a service launched

2015-11-30 Thread Dario Taraborelli
We just published an announcement on the Wikimedia blog marking the official 
launch of revision scoring as a service 
<https://meta.wikimedia.org/wiki/Research:Revision_scoring_as_a_service> and I 
wanted to say a few words about this project:

Blog post: 
https://blog.wikimedia.org/2015/11/30/artificial-intelligence-x-ray-specs/ 
<https://blog.wikimedia.org/2015/11/30/artificial-intelligence-x-ray-specs/>
Docs on Meta: https://meta.wikimedia.org/wiki/ORES 
<https://meta.wikimedia.org/wiki/ORES> 

First off: what’s revision scoring
<https://meta.wikimedia.org/wiki/Research:Revision_scoring_as_a_service#Rationale>?
On the surface, it’s a set of open APIs allowing you to automatically “score”
any edit and measure its probability of being a damaging or a good-faith
contribution. The real goal behind this project, though, is to fix the damage
that vandal-fighting bots and tools indirectly inflict on good-faith
contributors, and to bring back a collaborative dimension to how we do
quality control on Wikipedia. I invite you to read the whole blog post
<https://blog.wikimedia.org/2015/11/30/artificial-intelligence-x-ray-specs/>
if you want to know more about the motivations and expected outcomes of this
project.

I am thrilled this project is coming to fruition and I’d like to congratulate 
Aaron Halfaker <https://wikimediafoundation.org/wiki/User:Ahalfaker> and all 
the project contributors 
<https://meta.wikimedia.org/wiki/Research:Revision_scoring_as_a_service#Team> 
on hitting this big milestone: revision scoring started as Aaron’s side project 
well over a year ago and it has been co-designed (as in – literally – 
conceived, implemented, tested, improved and finally adopted) by a distributed 
team of volunteer developers, editors, and researchers. We worked with 
volunteers in 14 different Wikipedia language editions and, as of today, revision
scores are integrated
<https://meta.wikimedia.org/wiki/Research:Revision_scoring_as_a_service#Tools_that_use_ORES>
into the workflow of several quality control interfaces, WikiProjects and
third-party tools. The project would not have seen the light of day without the
technical support provided by the TechOps team (Yuvi in particular) and seminal
funding provided by the WMF IEG program and Wikimedia Germany.

So, here you go: the next time someone tells you that LLAMAS GROW ON TREES 
<https://en.wikipedia.org/w/index.php?diff=prev&oldid=642215410> you can 
confidently tell them they should stop damaging 
<http://ores.wmflabs.org/scores/enwiki/damaging/642215410/> Wikipedia.
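
If you want to poke at the API from code, here is a minimal sketch that
fetches the damaging score for that very edit, using the public scores URL
linked above; it just prints the raw JSON, since the exact response fields
may evolve as the service matures:

    # Minimal sketch: ask ORES how likely revision 642215410 on enwiki is to
    # be damaging, via the public scores endpoint linked above.
    import json
    import requests

    url = "http://ores.wmflabs.org/scores/enwiki/damaging/642215410/"
    response = requests.get(url, headers={"User-Agent": "ores-demo/0.1"})
    # The response is keyed by revision ID and carries the model's prediction
    # and class probabilities for the "damaging" model.
    print(json.dumps(response.json(), indent=2))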

Dario


Dario Taraborelli  Head of Research, Wikimedia Foundation
wikimediafoundation.org <http://wikimediafoundation.org/> • nitens.org 
<http://nitens.org/> • @readermeter <http://twitter.com/readermeter>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] WikiTracer: call for developers

2009-02-27 Thread Dario Taraborelli
WikiTracer needs your help!

WikiTracer [1] is a new Web service providing cross-platform visual  
analytics and comparative statistics for wikis. Its goal is to offer  
wiki administrators an easy way to monitor and evaluate the growth and  
performance of their own wiki. The service is meant to foster wiki  
adoption and should be of interest to a variety of users, including  
wiki researchers, developers and practitioners.

The project was introduced last year at WikiSym and its development is  
currently in alpha stage: we are working on data collection and  
validation with a small number of wikis to be able to implement the  
visualization tools with real data soon.

We are looking for developers to help us write a MediaWiki extension  
for this service. The extension will need to retrieve a number of  
statistics for the wiki (possibly relying on the MediaWiki API) and  
expose them in XML format (validated against a public XML schema) for  
the service to harvest them.
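
To make the intended data flow concrete, here is a rough sketch – in Python
rather than PHP, and purely illustrative – of pulling site statistics from
the MediaWiki API and serializing them as XML. The real extension would run
inside MediaWiki and validate against the public schema [2]; the root
element name below is just a placeholder:

    # Rough sketch: fetch site statistics via the MediaWiki API and expose
    # them as XML. Illustrative only; the real extension runs inside MediaWiki
    # and must validate against the public schema [2].
    import requests
    import xml.etree.ElementTree as ET

    api = "https://en.wikipedia.org/w/api.php"
    params = {"action": "query", "meta": "siteinfo",
              "siprop": "statistics", "format": "json"}
    stats = requests.get(api, params=params,
                         headers={"User-Agent": "wikitracer-sketch/0.1"}).json()

    root = ET.Element("wikitracer")  # placeholder root element, not the real schema
    for key, value in stats["query"]["statistics"].items():
        ET.SubElement(root, "statistic", name=key).text = str(value)
    print(ET.tostring(root, encoding="unicode"))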

The specs [2] and some preliminary thoughts on implementing a  
MediaWiki extension [3] are available on the project documentation  
website. We can assist you with the development of the extension and  
the validation of the output.

If you wish to learn more about this project or fancy contributing, you
can drop me a line at info [at] wikitracer [dot] com, join the (low
traffic) WT community mailing list [4], or pop by the WT IRC channel
on freenode [5].

Best,

Dario

[1] http://wikitracer.com/
[2] http://wikitracer.com/docs/WTPluginDraft
[3] http://wikitracer.com/docs/WTMediaWiki
[4] http://dev.wikitracer.com/mailman/listinfo/community_dev.wikitracer.com
[5] irc://irc.freenode.net/wikitracer


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l