[Wikidata] Re: Fwd: [Talk-GB] Welsh Government is now using a Welsh-language baselayer largely based on OSM

2022-10-17 Thread Federico Leva (Nemo)

On 17/10/22 13:56, Andy Mabbett wrote:

Forwarding for interest:


Thank you, this is excellent!


They take data via Mapio Cymru rather than from OSM directly because
we apply some additional rules and bring in names from Wikidata where
appropriate. This also means that as we improve Welsh name
identification they'll get the benefit of that.


Does this mean that the name on Wikidata takes precedence over any name 
on OSM, and that there's no additional data source beyond OSM and 
Wikidata? In other words, are they contributing any changes directly to 
OSM and Wikidata?


Federico
___
Wikidata mailing list -- wikidata@lists.wikimedia.org
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikidata@lists.wikimedia.org/message/C4YRSOU7AOSO6IWVN4QTGJSISH7PYQ2F/
To unsubscribe send an email to wikidata-le...@lists.wikimedia.org


[Wikidata] Heather Meeker on why CC-0 is good for open data

2022-04-22 Thread Federico Leva (Nemo)

Good summary and overview of the usual arguments:

https://plus.pli.edu/Details/Details?fq=id:(352066-ATL2)

«Developers of data sets need to be educated about the importance of 
releasing their rights under the public domain—or, at a minimum, using 
licenses with no notice conditions. Currently, this can be done via 
Creative Commons Zero or the Open Data Commons Public Domain Dedication 
and License.»


Federico


[Wikidata] Fwd: Using wikidata to extract additional information about entities

2021-12-16 Thread Federico Leva (Nemo)

Forwarding to the list a question sent to -owners.

Federico

 Forwarded Message 
Subject: Using wikidata to extract additional information about entities
Date: Thu, 16 Dec 2021 10:33:49 +
From: Ricardo Pinto

Good afternoon!

My name is Ricardo Pinto, I am currently doing my thesis, and I am trying
to make calls to Wikidata. I am presently using SPARQLWrapper in python to
make the calls, but I am having trouble with some things that I hope you
can help me with.

1. How can I get the entity identifier? Given an entity (e.g. Portugal), I
need to get its identifier (e.g. Q45). Currently I am doing this
by searching on Google, but I need this to be systematic. Can you help me?
2. While searching for entities related to another entity, I need to make
calls using properties. Is there a way to know what the most used
properties are? I need a systematic way to choose properties to search. Can
you help me?

Best regards,
Ricardo Pinto.
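For question 1, a systematic route is the `wbsearchentities` module of the MediaWiki API. Below is a minimal sketch using only the Python standard library against the standard Wikidata endpoint; the helper names are mine, not part of any official client:

```python
import json
import urllib.parse
import urllib.request

API = "https://www.wikidata.org/w/api.php"

def build_search_url(label: str, lang: str = "en") -> str:
    # wbsearchentities matches labels and aliases and returns ranked hits.
    params = urllib.parse.urlencode({
        "action": "wbsearchentities",
        "search": label,
        "language": lang,
        "format": "json",
    })
    return API + "?" + params

def search_entity_ids(label: str, lang: str = "en") -> list:
    # Wikimedia's API etiquette asks for a descriptive User-Agent header.
    req = urllib.request.Request(
        build_search_url(label, lang),
        headers={"User-Agent": "qid-lookup-example/0.1"},
    )
    with urllib.request.urlopen(req) as resp:
        return [hit["id"] for hit in json.load(resp)["search"]]

# search_entity_ids("Portugal") returns a ranked list of Q-ids;
# "Q45" is typically the first hit.
```

For question 2 there is no single "most used properties" endpoint, but a bounded SPARQL query per item (e.g. counting the `wdt:` predicates on `wd:Q45` via SPARQLWrapper) stays cheap, and property-usage counts are also published on-wiki in the property database reports.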


[Wikidata] Re: Change list policy for call for papers postings?

2021-09-19 Thread Federico Leva (Nemo)

On 19/09/21 13:10, Peter Patel-Schneider wrote:

"In accordance with funding body requirements, Elsevier does
offer alternative open access publishing options. Visit our open access
page for full information."


I did read it, and it says "This journal has an embargo period of 24 
months". Of course one can simply ignore such abusive requirements and 
archive the so-called preprint anyway under a CC BY license (it will be 
99 % the same thing), but authors may not know that. Advertising such 
journals on this mailing list might be appropriate if the poster 
explains how to ignore abusive requests from the publisher.


In this specific case, the publisher admits some exceptions for Plan S 
compliance, but only for certain authors funded by certain funders. The 
result is a very complicated situation and a very low open access rate 
of some 20 %. I don't mean to single out JWS as particularly egregious: 
this is typical of most venues controlled by closed-access publishers 
(including ACM, IEEE etc.). I only mentioned JWS because it was recently 
advertised on this list (and Wiktionary-l).


I don't see any benefit in using Wikimedia properties to advertise 
for-profit endeavours which are clearly incompatible with the Wikimedia 
mission and values, as well as Wikidata's very reason of existence. The 
anti-OA venues usually have enough marketing power to get known without 
our help.


On 19/09/21 13:24, Dan Brickley wrote:
> I guess refining policing wiki rules is what some folks do for fun around
> here, so maybe I should switch to listening mode at this point…

Personally I found everyone's contributions to this discussion useful so 
far. The most effective policy will be one which enjoys consensus among 
researchers and wiki contributors alike.


Federico


[Wikidata] Re: Change list policy for call for papers postings?

2021-09-19 Thread Federico Leva (Nemo)

On 19/09/21 11:10, Jan Ainali wrote:

I would be okay with them if the person mailing introduced it with a
sentence or two why they believe it to be specifically interesting for the
Wikidata community.


I agree.

The Wikidata community also can't benefit from those publications unless 
they're made (libre) open access, so I think it would be fair to require 
that the announcements include a mention or link clearly showing that 
all the papers will be OA (preferably), or explaining how the authors can 
archive them (for free) under a free license (libre green OA), à la:

https://www.coalition-s.org/rights-retention-strategy/
https://cyber.harvard.edu/hoap/How_to_make_your_own_work_open_access

From a search it's easy to find good and bad examples. Bad is e.g. 
 
(claims embargoes and all sorts of restrictions); rather good is e.g. 
 
(states CC-BY).


Recently I've started nudging frequent posters who neglect to explain 
how a CfP is relevant to the list. Most do not respond, so in practice 
the only option is placing them on moderation.


If there are no objections, I'd also like to experiment with Mailman 
topics. We could place all obvious CfP in their own topic, and 
subscribers would then be able to set their preferences to not receive 
them. A more drastic alternative would be to discard all such messages 
as spam, but that might end up having quite some collateral damage.


Federico


[Wikidata] Re: Wikimedia Deutschland to receive a grant from Arcadia to work on minority languages

2021-07-08 Thread Federico Leva (Nemo)
Excellent! This makes sense coming from Arcadia, because languages are a 
common resource, like biodiversity or the atmosphere.


Remember CLDR. :)
https://translatewiki.net/wiki/CLDR

Federico

On 08/07/21 14:53, Léa Lacroix wrote:

Hello all,

We would like to give you a heads up on a quite exciting project that will
be starting soon. Wikimedia Deutschland just received a grant from the
organization Arcadia . The goal of this
grant is to make our software more usable by cultures underrepresented in
technology, people of the Global South and speakers of minority languages.
This project will be defined over the coming months and will run for 3
years, ideally in close collaboration with other Wikimedia organizations.

Our first milestone will be to support the creation of a new development
team that will function together with Wikimedia Deutschland to improve
parts of our codebase and to develop new functionality. With this
experiment, we would like to share and transfer the knowledge of the
Wikidata development team to other organizations around the globe.

If you have any questions, feel free to reach out to us directly by email.
Cheers,



[Wikidata] Re: [discovery] Upcoming Search Platform Office Hours—July 14th, 2021

2021-07-08 Thread Federico Leva (Nemo)

On 07/07/21 22:47, Trey Jones wrote:

Google Meet link:https://meet.google.com/vyc-jvgq-dww

Join by phone in the US: +1 786-701-6904 PIN: 262 122 849#


I'm pretty sure it's possible to dial in by phone from an EU number too; 
I did so in a WMF-hosted meeting only a few months ago. I forget how to 
find the number, despite the instructions:

https://support.google.com/meet/answer/9518557
https://support.google.com/meet/answer/9683440

I think I found it in a Calendar invitation back then, could someone 
maybe share it from there?


Thanks,
Federico


[Wikidata] Fwd: wikidata mailing list migration complete

2021-05-10 Thread Federico Leva (Nemo)

FYI

No action required, but feel free to click the links to get some extra 
joy, courtesy of Ladsgroup and Legoktm. :)


Speaking of which, should we have some more information in the mailing 
list description?



Federico

 Forwarded Message 
Subject: wikidata mailing list migration complete
Date: Mon, 10 May 2021 17:17:30 +
From:

Dear list administrator,

Your mailing list, wikidata, has been migrated to Mailman3. You can
access it at 
.


Please create an account at .

Finally, review 
, which

has some items to check.

Happy emailing!


Re: [Wikidata] Delta Dumps Production?

2021-02-25 Thread Federico Leva (Nemo)

Kingsley Idehen via Wikidata, 25/02/21 19:26:

Is there a mechanism in place for producing and publishing delta-centric
dumps for Wikidata?


There's
https://phabricator.wikimedia.org/T72246

Magnus Manske used to maintain some biweekly dumps as part of his WDQ 
service, IIRC.


Federico

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Identifiers for WDQS queries

2021-02-19 Thread Federico Leva (Nemo)

James Heald, 18/02/21 15:33:
It also might be one step forward towards creating a place like Quarry 
(https://quarry.wmflabs.org/) where users could save their queries, 
share them,


PetScan does at least part of this. I've sometimes used it to share queries.

I suppose it's not forbidden to create a Wikidata item about a query. A 
query is an intellectual work, and you can probably (ab)use some 
properties to add the actual query either as a string (it might be a bit 
too long) or as a link.


Federico



Re: [Wikidata] [wikicite-discuss] Death by a thousand cuts - Wikicite and Wikidata

2021-02-16 Thread Federico Leva (Nemo)

Mark Graham, 16/02/21 20:40:

I will also share that, from the Internet Archive’s perspective (including Open 
Library) collaborations with other Wikipedians and the Wikimedia Foundation(s) 
have never been better.


Nice to hear! (And thank you for joining the mailing list.)

It's quite amusing to see in this thread how we sometimes manage to 
quarrel while saying the same thing.


The corpus of tens of millions of "bibliographic" items on Wikidata, 
however one may choose to call it, has now been in limbo for a while. I 
say "limbo" because it's not being rolled back (so far), but it's 
semi-(un?)officially prohibited from continuing. I dare anyone to claim 
the contrary (i.e. that it's already clear it's going to be removed, or 
to be "completed"). Some people are fine with this freeze, while others 
feel a sense of urgency about it (I suspect everyone in this discussion 
does).


Let's hope that more clarity will come as the linked project proceeds:
https://meta.wikimedia.org/wiki/WikiCite/Shared_Citations

Federico



Re: [Wikidata] academic/scientific articles on Wikidata

2020-09-20 Thread Federico Leva (Nemo)
Andrew Su, 20/09/20 07:21:
> Is anyone aware of a list of academic articles that use Wikidata?

Manual lists and keyword searches tend to be a bit messy, but with
enough work you might be able to find what you're looking for. Do you
find you have too many results, or too few? (There are about
2000-2500 results for "wikidata" on generic academic search engines like
BASE or CORE.)

For a slightly curated list you could use citations of a general
article, like:
https://www.lens.org/lens/scholar/article/019-729-680-880-124/citations/citing

If there are too many citations, it's sometimes possible to sort them by
"citation intent":
https://www.semanticscholar.org/paper/dab7e605237ad4f4fe56dcba2861b8f0a57112be#citing-papers

You could also take a look at the references of a suitable article; in
your case, maybe something about WikiCite like
https://doi.org/10.3897/rio.5.e35820

Works by sufficiently meticulous authors may also be found through their
citation of a software library or other software for using Wikidata,
like these:
https://www.base-search.net/Search/Results?filter[]=f_dctypenorm%3A%226%22=wikidata

For instance https://doi.org/10.5281/zenodo.60708 has one citation at
https://www.lens.org/lens/scholar/article/083-423-725-270-071/citations/citing
and your very own https://doi.org/10.5281/zenodo.3621065 finds a
citation from http://doi.org/10.7554/elife.52614 .

Federico



[Wikidata] Community open letter on renaming Wikimedia

2020-06-23 Thread Federico Leva (Nemo)
[To all Wikimedia projects, minus mailing lists with an active
discussion already.]

In August 2020, the Wikimedia Foundation board of trustees may decide on a
rename to "Wikipedia Foundation", among various other things.


Following a community meeting, a proposed open letter was written:


«We ask the Wikimedia Foundation to pause or stop its current movement
renaming activities, due to persistent shortcomings in the current
rebranding process. Future work should be restarted only in a way that
ensures equitable decision-making.»

(Sorry for the crossposting. When replying, be mindful of cc. Do
consider forwarding to language-specific discussion venues with a short
translated introduction, or translate the pages on Meta.)

Cheers,
Federico aka Nemo



Re: [Wikidata] Language codes 'mul' and 'mis' not recognized

2020-06-22 Thread Federico Leva (Nemo)
Thomas Francart, 22/06/20 16:21:
> I also cannot use these 2 codes when editing through the human interface.

Language code validation on Wikidata is sometimes confusing. See also:
https://phabricator.wikimedia.org/T39459

Why "mis" instead of "und"? See also
https://phabricator.wikimedia.org/T230833#6103004

Federico



Re: [Wikidata] [libraries] COVID-19 & EduWiki response: Wikipedia & Education User Group Open Meeting

2020-04-01 Thread Federico Leva (Nemo)

LiAnna Davis, 26/03/20 19:45:
The board of the Wikipedia & Education User Group invites you to attend 
our user group's next Open Meeting, one week from today, on Thursday, 
April 2, at 15:00 UTC, as always via Zoom


Thank you. What are you doing to mitigate the privacy risks of such 
proprietary software in general, and Zoom specifically?


In such a stressful period, I think it's important to protect the 
vulnerable from such risky tools.





Federico



Re: [Wikidata] Notability and classing "notable" properties

2019-11-21 Thread Federico Leva (Nemo)

Thad Guidry, 19/11/19 23:15:

Some of them we are familiar with such as "award received"
  or "notable work"
.


It seems that first of all you should create a super-class of 
 so that such properties can be 
listed; otherwise, how could any software tell them apart?


Federico



Re: [Wikidata] References to newspaper articles behind paywalls like newspapers.com

2019-11-13 Thread Federico Leva (Nemo)

Let's not spread publisher FUD on Wikimedia lists, please.

Liam Wyatt, 13/11/19 02:03:

is there any bot that is systematically going
around and collecting new URLs that are added with the Reference URL
property (P854), adding them to Internet Archive,


As long as MediaWiki is not broken, and the URLs are announced in the 
expected venues (I believe 
), 
yes, the Internet Archive immediately crawls them. This is documented at 
 (sort of).


Federico



Re: [Wikidata] Google's stake in Wikidata and Wikipedia

2019-09-20 Thread Federico Leva (Nemo)

Sebastian Hellmann, 20/09/19 11:22:
Maybe somebody could enlighten me about the overall strategy and 
connections here.


You can add more links to grants and other Wikimedia pages on 
.


Google and the Wikimedia movement are on opposite sides for most things, 
but occasionally some of their employees (or algorithms!) happen to be 
interested in the same things as us, so we end up doing things together 
and a few breadcrumbs travel towards WMF. What matters to me is that 
they don't abuse our brands.


Sadly WMF is not always careful about communication, for instance 
 still has an appalling 
sentence "Working with partners like Google" right under the heading 
"Partner for change".


Federico



Re: [Wikidata] Personal news: a new role

2019-09-19 Thread Federico Leva (Nemo)

Denny Vrandečić, 19/09/19 19:56:
I had used my 20% time to support such teams. The requests became more 
frequent, and now I am moving to a new role in Google Research, akin to 
a Wikimedian in Residence


That's very interesting! Is this the first free culture project for which 
something of the sort has happened? From what you write, I understand it 
will be something separate from the Google Open Source office, right?


Federico



Re: [Wikidata] Dead or alive ? Probably dead

2019-09-07 Thread Federico Leva (Nemo)

Fabrizio Carrai, 07/09/19 09:53:
Since the oldest know person was 122, what about to set "date of death = 
unknown value" for all the persons resulting older such age ?


It seems to me a sensible thing to do. It's good you asked because it's 
better to avoid the risk of conflicting mass changes.


I wonder if we need a qualifier to identify this as an inferred piece of 
data: do people sometimes state "unknown value" when someone is known to 
be dead but we don't know when they died? I would place a date of death 
with a precision of a decade or century in such a case, but I haven't 
checked the frequency of such qualifiers yet.


Federico



Re: [Wikidata] "Wikidata item" link to be moved in the menu column on Wikimedia projects

2019-08-08 Thread Federico Leva (Nemo)
Nice! The "tools" section of the sidebar is an unrelated mess to avoid 
(ah, from the task it looks like 2014 me agrees).


The overall crowdedness of the sidebar has only got worse in the last 
few years, so there's still more to do in the future to make sure it's 
not just a cemetery of unused links.


Federico



Re: [Wikidata] dcatap namespace in WDQS

2019-07-29 Thread Federico Leva (Nemo)

Stas Malyshev, 29/07/19 04:14:

As part of our Wikidata Query Service setup, we maintain the namespace
serving DCAT-AP (DCAT Application Profile) data[1].


How many of the endpoints we federate with support DCAT-AP? I suppose 
federated queries may benefit the most from it.


DCAT-AP is allegedly taking off, and it's given great importance, for 
instance, in the EU Open Data Maturity ranking: 
Italy, which is otherwise a laggard on open data, was scored high 
because its national portal embraced DCAT-AP.


Federico



Re: [Wikidata] Language codes for Chinese

2019-06-18 Thread Federico Leva (Nemo)

Vladimir Ryabtsev, 19/06/19 03:04:
How can I get the COMPLETE list of language codes (desirably with 
description) for Chinese that is supported by Wikidata?


The languages supported by a MediaWiki instance are all expected to be 
listed under siprop=languages in the API:
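For instance, a sketch of fetching that list from the standard Wikidata endpoint and filtering it down to the Chinese variants (the helper names are mine; the API call itself is the standard `meta=siteinfo` query):

```python
import json
import urllib.request

SITEINFO_URL = ("https://www.wikidata.org/w/api.php"
                "?action=query&meta=siteinfo&siprop=languages&format=json")

def chinese_variants(codes):
    # Keep "zh" itself plus script/region variants such as "zh-hans", "zh-tw".
    return sorted(c for c in codes if c == "zh" or c.startswith("zh-"))

def fetch_language_codes():
    # A descriptive User-Agent is requested by Wikimedia's API etiquette.
    req = urllib.request.Request(
        SITEINFO_URL, headers={"User-Agent": "langcode-example/0.1"})
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # Each entry in query.languages carries a "code" key.
    return [lang["code"] for lang in data["query"]["languages"]]

# chinese_variants(fetch_language_codes()) would list zh, zh-hans, zh-hant, ...
```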




Federico



Re: [Wikidata] Are we ready for our future

2019-05-12 Thread Federico Leva (Nemo)

Erik Paulson, 12/05/19 01:54:
It's probably less about splitting the dumps up and more about starting 
to split the main wikidata namespace into more discrete areas [...]


In fact that was one of the proposals in "The future of bibliographic 
data in Wikidata: 4 possible federation scenarios".

https://www.wikidata.org/wiki/Wikidata:WikiCite/Roadmap

Federico



Re: [Wikidata] [Wikidata-tech] wb_terms redesign

2019-04-25 Thread Federico Leva (Nemo)

Alaa Sarhan, 25/04/19 17:38:
Full migration is not possible unfortunately due to the current capacity 
of database master node.


Can you clarify whether it would also be too much load to write to both 
the new table and the old wb_terms table for a transition period 
(controlled by a configuration setting)?


(I'm not advocating for it, just asking because we did something of the 
sort in the past for other transitions.)


Federico



Re: [Wikidata] Birth dates in Wikidata

2019-03-12 Thread Federico Leva (Nemo)

Andrew Black, 12/03/19 18:36:

Is it possible to change this behavior?


Someone made a mistake in the corresponding mix'n'match catalog:
https://www.wikidata.org/?diff=830322643

See:
https://tools.wmflabs.org/mix-n-match/#/catalog/76

Federico



Re: [Wikidata] Handling roles in multiple languages

2019-02-17 Thread Federico Leva (Nemo)

Thank you for this interesting question.

Darren Cook, 14/02/19 14:13:

The idea of adding lots of new property children, descended from P169,
for each possible combination of the Japanese and English name for the
title, feels very wrong.


What about the opposite? We could have a super-property, à la "key 
person" (term used in some infoboxes), and add a qualifier for each of 
the areas or roles they have responsibility for.


For instance, in Italy the law defines a rather clear role of "legal 
representative", held normally by the "president". Entities can arrange 
it differently, but there's only one way to describe it: a person either 
is, or is not, the legal representative. If you use such a legal concept 
for each country, there can be dozens or hundreds of them, but not an 
unlimited number.


Federico



Re: [Wikidata] Company data and properties

2019-01-23 Thread Federico Leva (Nemo)

Darren Cook, 23/01/19 12:07:

I wondered if anyone else is actively working on fleshing out the
company data within Wikidata?


There's also 
.


What always puzzles me is that Wikidata has tons of details about 
entities but almost nothing when it comes to the basics, such as 
revenues/budget and number of employees, the kind of information which 
is most often updated in infoboxes.


Federico



Re: [Wikidata] Wikibase as a decentralized perspective for Wikidata

2018-11-28 Thread Federico Leva (Nemo)

Yuri Astrakhan, 29/11/18 04:14:
The "Q" prefix has a strong identity in itself.  Anyone will instantly 
say - yes, it's a Wikidata identifier


But that's because most people only know one Wikibase installation, not 
the other way around.


Federico



Re: [Wikidata] Wikidata Logo on DBpedia HTML Website

2018-10-29 Thread Federico Leva (Nemo)

Sebastian Hellmann, 29/10/2018 13:20:
If nobody objects, we would just go ahead and use the logo to link to 
Wikidata. In my opinion only good things can come from that and it will 
help with new editors for Wikidata.


Logos next to links are OK, at least when it comes to the trademark 
policy.

https://meta.wikimedia.org/wiki/Trademark_policy#policy-linkstous

Federico



Re: [Wikidata] Looking for "data quality check" bots

2018-09-26 Thread Federico Leva (Nemo)

Ettore RIZZA, 26/09/2018 15:31:
I'm looking for Wikidata bots that perform accuracy audits. For example, 
comparing the birth dates of persons with the same date indicated in 
databases linked to the item by an external-id.


This is mostly a screen-scraping job, because most external databases are 
only accessible in unstructured or poorly structured HTML form.


Federico



Re: [Wikidata] Mapping Wikidata to other ontologies

2018-09-22 Thread Federico Leva (Nemo)

Maarten Dammers, 22/09/2018 14:28:
What ontologies are important because these are used a lot? Some of the 
ones I came across:

* https://www.w3.org/2009/08/skos-reference/skos.html
* http://xmlns.com/foaf/spec/
* http://schema.org/


Since 2016 there has been some progress:
https://github.com/schemaorg/schemaorg/issues/280
https://github.com/schemaorg/schemaorg/issues/1186

The last time I looked into it was for music:
https://www.wikidata.org/?oldid=297764900#schema.org/MusicRecording

Mapping properties is tedious, but a relatively small amount of work 
(tens of hours rather than hundreds) can make a significant difference.


Federico



Re: [Wikidata] Wikidata in the LOD Cloud

2018-06-29 Thread Federico Leva (Nemo)

Tom Morris, 29/06/2018 18:09:



A complete aside, but who chose the date format for that URL? That's wacky!


Whoever it was, clearly not a xkcd fan. :)
https://xkcd.com/1179/

Federico



Re: [Wikidata] Wikidata in the LOD Cloud

2018-06-27 Thread Federico Leva (Nemo)

Maarten Dammers, 27/06/2018 23:26:
Excellent news! https://lod-cloud.net/dataset/wikidata seems to contain 
the info in a more human-readable (and machine-readable) way. If we add 
some URI link, does it automagically appear, or does Lucas have to do some 
manual work? I assume Lucas has to do some manual work.


I'd also be curious what to do when a property does not have a node in 
the LOD cloud; for instance, P2948 is among the 77 results for P1921, but 
I don't see any corresponding URL in 
http://lod-cloud.net/versions/2018-30-05/lod-data.json


Federico



Re: [Wikidata] Solve legal uncertainty of Wikidata

2018-05-18 Thread Federico Leva (Nemo)

Olaf Simons, 18/05/2018 23:48:

Facts don't*have*  licenses. They have sources, and we track those.

Works well in an environment of no original research. Things get onto another 
level if I have a fact that is of value in other fields of research.


I'm not sure what your message means, but I suspect you're confusing 
copyright with moral rights (copyright is not the same as author rights) 
or even patents/inventions.


Federico



Re: [Wikidata] [Wikimedia-l] Solve legal uncertainty of Wikidata

2018-05-18 Thread Federico Leva (Nemo)

Info WorldUniversity, 18/05/2018 20:45:

Wikidata may be heading to
https://creativecommons.org/licenses/by-sa/4.0/
which allows for a) sharing b) adapting and even c) commercially


No way. CC-BY-SA-4.0 handles, but doesn't waive, the sui generis 
database rights. It might be fine for folks in the USA, but it would leave 
EU people under water.


Federico



Re: [Wikidata] DBpedia Databus (alpha version)

2018-05-14 Thread Federico Leva (Nemo)

Sebastian Hellmann, 08/05/2018 14:29:
Working with data is hard and repetitive. We envision a hub, where 
everybody can upload data and then useful operations like versioning, 
cleaning, transformation, mapping, linking, merging, hosting is done 


Sounds like Wikidata!


automagically


Except this. There is always some market for pixie dust.

Federico



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Federico Leva (Nemo)

Andy Mabbett, 05/05/2018 20:50:

The statement I questioned was "never able"; that's not a matter of "a
long way to go".


I see. I'm not sure about the long run. On the other hand, in the long 
run we're all dead.


Federico



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Federico Leva (Nemo)

Andy Mabbett, 05/05/2018 17:33:

Both Wikidata and DBpedia surely can, and should, coexist because we'll
never be able to host in Wikidata the entirety of the Wikipedias.

Can you give an example of something that can be represented in
DBpedia, but not Wikidata?


More simply, there's still a long way to go until Wikidata imports all 
the data contained in Wikipedia infoboxes (or equivalent data from other 
sources), let alone the rest.


So, as Gerard mentions, DBpedia has something more/different to offer. 
(The same is true for the various extractions of structured data from 
Wiktionary vs. Wiktionary's own unstructured data.)


That said, the LOD cloud is about links, as far as I understand; 
Wikidata should be a very interesting node in it.


Federico



Re: [Wikidata] Wikiata and the LOD cloud

2018-04-30 Thread Federico Leva (Nemo)

Peter F. Patel-Schneider, 30/04/2018 23:32:

Does the way that Wikidata serves RDF
(https://www.wikidata.org/wiki/Special:EntityData/Q5200.rdf) satisfy this
requirement?


I think that part was already settled with:
https://lists.wikimedia.org/pipermail/wikidata/2017-October/011314.html

More information:
https://phabricator.wikimedia.org/T85444
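For reference, Special:EntityData serves per-entity RDF in several serializations selected by the file extension, as in the URL quoted above. A minimal sketch (the helper names are mine):

```python
import urllib.request

def entity_data_url(entity_id, fmt="ttl"):
    # Common formats include ttl, nt, rdf (RDF/XML) and json.
    return f"https://www.wikidata.org/wiki/Special:EntityData/{entity_id}.{fmt}"

def fetch_entity_rdf(entity_id, fmt="ttl"):
    req = urllib.request.Request(
        entity_data_url(entity_id, fmt),
        headers={"User-Agent": "entitydata-example/0.1"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# e.g. fetch_entity_rdf("Q5200", "rdf") retrieves the document discussed above.
```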

Federico



Re: [Wikidata] Election data

2018-03-13 Thread Federico Leva (Nemo)

Yuri Astrakhan, 13/03/2018 01:23:

Something I wish was available is the voting record


This is available for some parliaments in open data:

https://www.votewatch.eu/blog/guide-to-votewatcheu/

Or  from .

Federico




Re: [Wikidata] About the support of Arabic dialects by Wikidata

2018-01-13 Thread Federico Leva (Nemo)
How many of those are supported by MediaWiki? Can you help find 
translators for the missing ones?



Federico



Re: [Wikidata] Wiki Workshop @ The Web Conference 2018: call for contributions

2018-01-01 Thread Federico Leva (Nemo)

Leila Zia, 01/01/2018 15:16:
The 5th annual Wiki Workshop [1] will take place in Lyon on April 24, 
2018 and as part of The Web Conference 2018 (a.k.a. WWW2018)


To clarify, does this mean that attendance requires registration to 
WWW2018 (https://twc2018.insight-outside.fr/ )?


Federico



Re: [Wikidata] stats on WD edits and WDQS uptime

2017-12-20 Thread Federico Leva (Nemo)

Andrew Su, 20/12/2017 20:19:

In any case, pointers on the two stats I'm looking for are still welcome!


Ok. I don't think the number of edits is especially meaningful, but 
Wikidata easily wins this metric over any other project, even if you 
combine all Wikipedias.


https://stats.wikimedia.org/wikispecial/EN/TablesWikipediaWIKIDATA.htm 
shows 14 M/month (although G>I must be an error) and in 
https://stats.wikimedia.org/EN/TablesDatabaseEdits.htm under Σ you can 
see all Wikipedias combined are around 10.


Federico



Re: [Wikidata] stats on WD edits and WDQS uptime

2017-12-20 Thread Federico Leva (Nemo)

Andrew Su, 20/12/2017 20:11:
I've scanned stats.wikimedia.org but can't 
find that summarized. This link [1] seems to say that there were 18726 
WD edits in Nov 2017


That's the number of *users* making at least one edit. We generally 
consider the 5+ figure, which is a bit less than half that.


Federico



Re: [Wikidata] New dashboard on the Wikidata Concept Monitor: Geo dashboard

2017-12-19 Thread Federico Leva (Nemo)

Léa Lacroix, 19/12/2017 17:29:
The usage of the items being the count of the number of pages in the 
Wikimedia projects where the Wikidata item is used.


What do you mean by "used"? That some statement from the item is 
fetched/transcluded on the page?


Federico



Re: [Wikidata] Identifiers: Multiple VIAF numbers

2017-12-18 Thread Federico Leva (Nemo)

Finn Aarup Nielsen, 12/12/2017 21:29:

Yes, I think you should add all VIAFs. That is what I have done.


When I edit an item manually, I usually also mark one of the identifiers 
as preferred. In Wikimedia Italia events I regularly mention this as one 
of the benefits of Wikidata: it's the only real "broker" of identifiers, 
where links and mistakes or duplicates can emerge.


Federico



Re: [Wikidata] Where are wikidata to wikipedia links available

2017-12-12 Thread Federico Leva (Nemo)

Rune Stilling, 11/12/2017 17:55:
Another question. What happens if a Wikipedia title/url changes. How do 
you handle this? Do you update Wikidata manually (the link from Wikidata 
to Wikipedia)?


If the page is renamed, the same action also updates the linked Wikidata 
entity, so there is no additional work for editors most of the time.


I don't know what you're supposed to do as a data consumer: if you 
produce links and want to ensure they keep working, then I guess you 
should convert to page IDs and use a link of the form

https://en.wikipedia.org/?curid=8091
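A sketch of that approach in Python (the helper names are mine; only the `curid` URL form shown above and the standard MediaWiki action API are assumed):

```python
from urllib.parse import urlencode

def curid_permalink(lang: str, pageid: int) -> str:
    # Page IDs survive renames, so this link keeps working after a move.
    return f"https://{lang}.wikipedia.org/?curid={pageid}"

def pageid_lookup_url(lang: str, title: str) -> str:
    # action=query&prop=info returns the page ID for a title;
    # fetch this with any HTTP client and read query.pages.*.pageid.
    params = urlencode({"action": "query", "prop": "info",
                        "titles": title, "format": "json"})
    return f"https://{lang}.wikipedia.org/w/api.php?{params}"

print(curid_permalink("en", 8091))
# https://en.wikipedia.org/?curid=8091
```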

Federico



Re: [Wikidata] An answer to Lydia Pintscher regarding its considerations on Wikidata and CC-0

2017-12-02 Thread Federico Leva (Nemo)

Leila Zia, 02/12/2017 22:48:
​(​Side-note. We should take this part offline but for the record: I 
couldn't find a place where transparency was listed as an agreed upon 
and shared value of our movement as a whole. There are subgroups that 
consider it a core value or one of the guiding principles, and it's of 
course built in in many of the things we do in Wikimedia, but I'm 
hesitant to call it /a core value of our movement/ given that it's not 
listed somewhere as such. btw, for the record, it's high on my personal 
and professional list of values.)


Transparency is one of the 6 main Wikimedia values as listed in the 
"canonical" values document:

https://meta.wikimedia.org/w/index.php?title=Values&oldid=15348985

I know that since 2013 things have become increasingly confusing, with 
other texts and qualifiers popping up, but I consider that to be just 
background noise.


Federico



Re: [Wikidata] An answer to Lydia Pintscher regarding its considerations on Wikidata and CC-0

2017-12-01 Thread Federico Leva (Nemo)

mathieu stumpf guntz, 01/12/2017 03:00:
Actually, as far as I know, CC-by-sa-3.0-undeed states nothing about 
/sui generis/ rights


I don't know what's -undeed, but 3.0-it and 4.0 do, which is for 
instance why ISTAT data can be imported in Wikidata despite the less 
than ideal license (CC-BY-3.0-it).


Federico



Re: [Wikidata] Wikidata is becoming a proper citizen of the linked open data web

2017-10-26 Thread Federico Leva (Nemo)

Lydia Pintscher, 26/10/2017 20:21:

I’m looking
forward to seeing what new things are going to be built with this and
how we will show up on http://lod-cloud.net.


Me too! Things seem to be changing rapidly over there, unless it's just 
an optical effect. For instance my main take-away used to be that 
GeoNames is very central (see e.g. 
) but in 
the latest versions I can't even locate it.


Federico



Re: [Wikidata] Wikidata prefix search is now Elastic

2017-10-26 Thread Federico Leva (Nemo)

Thanks!

Stas Malyshev, 25/10/2017 23:22:

Wikidata and Search Platform teams are happy to announce that Wikidata
prefix search (aka wbsearchentities API aka the thing you use when you
type into that box on the top right or any time you edit an item or
property and use the selector widget) 


Is the "selector widget" some gadget or non-default preference, or do 
you just mean the dropdown suggestions in the field for the value of a 
property? When I select the property I still see a wbsgetsuggestions 
request (which is good because I get suggestions of common properties); 
only when I switch to the next field I see some 
wbsearchentities/wbgetentities/other requests.




- better language support (matches along fallback chain and also can
match in any language, with lower score)


Useful! I tested with https://www.wikidata.org/wiki/Q20241614 , 
Q12756715 and Q997741 (random items without any label or description in 
my languages or fallbacks thereof) and I can still match them when I try 
to add them as values on another item.


Federico



Re: [Wikidata] Kickstartet: Adding 2.2 million German organisations to Wikidata

2017-10-15 Thread Federico Leva (Nemo)
This is an area where I would very much like to see some important 
properties created and populated, to the benefit e.g. of various 
infoboxes on Wikipedias which contain data in need of frequent updates 
(especially income, revenue, market capitalization, number of employees, 
links to most recent financial statements and other corporate information).

https://www.wikidata.org/w/index.php?title=Wikidata:Property_proposal/Organization&oldid=307430401
https://www.wikidata.org/wiki/Wikidata:List_of_properties/Organization

Even data for companies listed in stock exchanges is terribly outdated 
most of the times.


Nemo



Re: [Wikidata] Writing a Bot for data Import

2017-06-28 Thread Federico Leva (Nemo)

Thanks for asking those questions.

Marisa Nest, 28/06/2017 20:01:
- The first section is for all bots, thus also for our bot. One of these 
requirements is to be able to set a limit for maximum edits per minute. 
But what does "edit" exactly mean in that case? Are creating an item and 
adding a label each a separate edit?


Yes, an edit is a revision. Bots (and even non-bots) go quite fast on 
Wikidata, so it doesn't matter so much if your counting method varies by 
a factor of 2; what matters is that if your bot would run at 1000 edits 
per minute you're able to limit it to 100 or 10 as needed.
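A hypothetical sketch of such a throttle in Python (the class and its knob are illustrative, not part of any bot framework):

```python
import time

class EditThrottle:
    """Cap edits at a configurable number per minute by sleeping
    between calls; dialing max_per_minute from 1000 down to 100 or 10
    is a one-line change."""
    def __init__(self, max_per_minute: int):
        self.interval = 60.0 / max_per_minute
        self._last = float("-inf")

    def wait(self) -> None:
        # Sleep just long enough since the previous edit.
        delay = self._last + self.interval - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        self._last = time.monotonic()

throttle = EditThrottle(max_per_minute=100)
# call throttle.wait() before each API write
```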


- In the second section are requirements for “Langlink import bots”. In 
which case are these requirements related to our bot? In addition, there 
is a link in this section for a full list of requirements for "import 
bots". Which of these entries are requirements, and which are merely 
recommended?


Almost nothing is relevant for your case, especially for the newly 
created entities. You'll mostly need to check for duplicate statements 
when you expand existing items.
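As an illustration (representing each claim as a bare (property, value) pair, which is my simplification, not the real data model), the duplicate check might look like:

```python
def statements_to_add(existing, candidates):
    """Return the candidate claims not already present on the item,
    preserving order and dropping duplicates within the batch too."""
    seen = set(existing)
    fresh = []
    for claim in candidates:
        if claim not in seen:
            seen.add(claim)
            fresh.append(claim)
    return fresh

print(statements_to_add([("P31", "Q5")],
                        [("P31", "Q5"), ("P569", "+1952-03-11")]))
# [('P569', '+1952-03-11')]
```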


- In the third section “Statement adding bot” is one requirement 
"Monitor constraint violation reports for possible errors generated or 
propagated by your bot”. Should that be implemented as well or is that 
rather a task for the bot operator?


That's something the operator should do. If you're not validating all 
your statements beforehand, make sure to watchlist the constraint 
violation report pages and check whether there are significant increases 
after your first run. Try to clean up if there are unexpected errors.


Nemo



Re: [Wikidata] Coursework involving Wikidata?

2017-06-06 Thread Federico Leva (Nemo)

Daniel Mietchen, 06/06/2017 05:24:

The process of building these puzzles is actually more instructive
than solving them, but it's not yet simple to leverage Wikidata for
the building process. Would be great to have some Wikidata games/
tools similar to
https://en.wikipedia.org/wiki/Wikipedia:Six_degrees_of_Wikipedia#External_links


I agree it's important to have something ready and engaging (or at least 
not frustrating) for course attendees to make some first edits quickly 
on Wikidata.


I have tried the primary sources tool once, but mix'n'match/Wikidata 
game worked quite well with librarians. It's important to start with a 
task which is self-contained and understandable for the audience 
(nothing that requires checking dozen pages) and which doesn't require 
wrestling with the interface.


For instance, I prefer to work on small Wikidata items because in the 
bigger ones it's so easy to miss the "add statement" button that I'm 
forced to babysit every attendee individually until their scrolling 
abilities reach biathlon level (https://phabricator.wikimedia.org/T142082).


Nemo



Re: [Wikidata] wikipedia site link to a subtitle wrong

2017-06-05 Thread Federico Leva (Nemo)

The usual
https://phabricator.wikimedia.org/T54564
https://phabricator.wikimedia.org/T74347

If Wikidata cannot be fixed to support the needs for interwiki links, I 
guess you must use the traditional interwiki syntax for those cases. A 
blacklist on the corresponding Wikidata item for the sites handled 
locally may be in order.


Nemo



Re: [Wikidata] Coursework involving Wikidata?

2017-06-04 Thread Federico Leva (Nemo)

For a six hours course on Wikidata or more, I see
https://meta.wikimedia.org/wiki/Workshop_Wikidata_SUPSI

There was also recently something in eastern Europe but I forgot the 
details.


For something a bit smaller, most Wikidata teaching WMIT does is for 
librarians, often as part of courses which mostly focus on Wikipedia and 
other sister projects. There were some specifically on Wikidata though, 
like:

http://www.spaghettiopendata.org/content/wikidata-la-banca-di-conoscenza-libera-casa-wikimedia
http://www.aib.it/struttura/sezioni/toscana/2015/47168-wikidata-riuso-dati-aperti-tra-wikipedia-e-biblioteche/

Nemo



Re: [Wikidata] Extending public wikidata with a stand alone version

2017-01-08 Thread Federico Leva (Nemo)

Satya Gadepalli, 08/01/2017 09:23:

I have done a local  wikidata stand-alone service. Now i want to make
this as an intermediary on top of public wikidata and extend few
concepts related to enter specific Item / labels.


Cf.:
* 
https://www.wikidata.org/wiki/Wikidata:Project_chat/Archive/2016/01#Federated_Wikibase.2FWikidata
* "Federation prototype" 
https://www.mediawiki.org/wiki/Wikimedia_Engineering/2016-17_Q3_Goals#Wikidata


Reminder: http://www.eagle-network.eu/wiki/ is one actively used 
Wikibase repository. :)


Nemo



Re: [Wikidata] Wikidata ontology

2017-01-07 Thread Federico Leva (Nemo)

Markus Kroetzsch, 08/01/2017 00:12:

The subclass of and instance of statements are actually used in very
many WDQS queries, often with * expressions to navigate the hierarchy.


I think that's what Gerard meant: you don't have to know what's under 
the hood, as long as it works. When you get some unexpected result, you 
go check what went wrong in the chain of subclasses etc. This is at 
least what I do, although I also work with some more traditional people 
who want to know the full ontology before even entering their first 
statement (of course they get lost for a few months).


Nemo



Re: [Wikidata] Master thesis

2016-11-05 Thread Federico Leva (Nemo)
For those who have ideas, there's also 
http://wikipapers.referata.com/wiki/List_of_open_questions to expand.


Nemo



Re: [Wikidata] Tool to identify the articles versions across Wikipedia language editions?

2016-10-31 Thread Federico Leva (Nemo)

Reem Al-Kashif, 31/10/2016 20:24:

I'm wondering if there is a way/tool to identify the articles that exist
in one edition of Wikipedia and have counterparts in another.


Reading your request literally, Special:MostInterwikis suffices. 
https://en.wikipedia.org/wiki/Special:MostInterwikis
Usually people do the opposite: 
https://tools.wmflabs.org/not-in-the-other-language/



I'm
also wondering if there is a way to generate a list of these articles'
titles for certain categories.


Again, usually the objective is the opposite: 
https://tools.wmflabs.org/missingtopics/


Your objective is unclear, so it's hard to tell whether you just need a 
titles dump, a SPARQL query or something simpler.
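For the SPARQL route, a hedged sketch (the query follows the schema:about sitelink modelling used by the Wikidata Query Service; restricting it to certain categories would need a separate tool, since categories are not in the query service):

```python
from urllib.parse import urlencode

def missing_in_language(have: str, lack: str, limit: int = 100) -> str:
    """Items with an article on one Wikipedia but none on another."""
    return f"""SELECT ?item ?title WHERE {{
  ?article schema:about ?item ;
           schema:isPartOf <https://{have}.wikipedia.org/> ;
           schema:name ?title .
  FILTER NOT EXISTS {{
    ?other schema:about ?item ;
           schema:isPartOf <https://{lack}.wikipedia.org/> .
  }}
}} LIMIT {limit}"""

def wdqs_url(query: str) -> str:
    """GET URL for the public query service endpoint."""
    return "https://query.wikidata.org/sparql?" + urlencode(
        {"query": query, "format": "json"})
```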


Nemo



Re: [Wikidata] Using wikibase for creating structured vocabularies collaboratively

2016-10-12 Thread Federico Leva (Nemo)

Claudia Müller-Birn, 12/10/2016 21:11:

wonder if there are other projects than Wikidata that are using Wikibase for 
structuring their data


http://www.eagle-network.eu/wiki/index.php/Main_Page , as documented on 
http://wikiba.se/projects/ . You can ask on this list if you have questions.


Nemo



Re: [Wikidata] signing license declarations

2016-10-05 Thread Federico Leva (Nemo)

Benjamin Good, 05/10/2016 23:33:

somewhere the world can see


Like http://www.uniprot.org/help/license



Re: [Wikidata] signing license declarations

2016-10-05 Thread Federico Leva (Nemo)

Benjamin Good, 05/10/2016 19:44:

As a specific example, we have informal (e.g. an email to us) permission
to import data from the Disease Ontology [2] and UniProt [3] but would
like to make those informal agreements 'official' and public.


Just make them add such a note to http://www.uniprot.org/help/license or 
equivalent? E.g. http://www.beic.it/it/articoli/copyright releases some 
parts in CC-0.


Nemo



Re: [Wikidata] Request for Property help (Schema.org mapping taskforce)

2016-09-14 Thread Federico Leva (Nemo)

Thad Guidry, 13/09/2016 19:38:

There is no need, therefore, to replicate those same definitions in
Wikidata by creating a new Wikidata item (topic / entity) for each
external vocabulary class or subclass or property.

Instead, the best practice is to simply POINT to those external
definitions, such as those in Schema.org, DBPedia.org, MusicBrainz,
etc., etc.


Agreed. If the external definition is compatible, just link it. A 
specific item needs to be created only where the two definitions are 
incompatible. I would define two items to be equivalent iff swapping 
them doesn't change the truth value of any statement or inference on 
Wikidata 
(https://en.wikipedia.org/wiki/Equivalence_relation#Well-definedness_under_an_equivalence_relation). 
Makes sense?


I don't know if Q474191 and Q26869695 are incompatible definitions; I 
understood you claimed they were. If the two definitions are compatible, 
we can merge the two items and just consider Q474191 equivalent to 
SchemaOrg's definition of "Diet".


Nemo



Re: [Wikidata] Request for Property help (Schema.org mapping taskforce)

2016-09-12 Thread Federico Leva (Nemo)

Thad Guidry, 12/09/2016 21:29:

The reason is that our definition in Schema.org is slightly different
than
  which is the biological
definition (a mixture of food sources that sustains a living thing) and
from that derives the nutritional definition, the one Schema.org has,
like a specific diet


Better now?
https://www.wikidata.org/w/index.php?title=Q26869695&diff=375889582&oldid=375889390

Do we need a help page on how hypernymy/hyponymy is handled on Wikidata? 
I know it can be confusing.


Nemo



Re: [Wikidata] Request for Property help (Schema.org mapping taskforce)

2016-09-12 Thread Federico Leva (Nemo)

Thad Guidry, 12/09/2016 15:46:

Not sure I understand what you mean by 'item' ?  Do you mean a Wikidata
topic ?


I don't know what a "Wikidata topic" is, no such term is found on 
https://www.wikidata.org/wiki/Wikidata:Glossary .


Your example Q474191 already has an "equivalent class" link to 
SchemaOrg, can you make an example of an item (Q-number) and an external 
thingy which you are unable to link? Thanks.


Nemo



Re: [Wikidata] Request for Property help (Schema.org mapping taskforce)

2016-09-12 Thread Federico Leva (Nemo)

Thad Guidry, 10/09/2016 17:03:

Does anyone know if a useful property like 'external subclass' is
available to use to help us with our mapping ?


Why can't you create an item for the subclass and link from there?

Nemo



Re: [Wikidata] List of WP-languages and language short code

2016-09-07 Thread Federico Leva (Nemo)

Markus Bärlocher, 07/09/2016 16:16:

I need the relation between ISO 639-3 and WP-shortcuts


I still have no idea what you mean by WP-shortcuts, but if you mean the 
MediaWiki/Wikimedia language codes then 
https://www.mediawiki.org/wiki/Language_code has all the pointers you 
need and https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes has most 
of the equivalencies between ISO 639-1 and ISO 639-3.


Nemo



Re: [Wikidata] List of WP-languages and language short code

2016-09-07 Thread Federico Leva (Nemo)

Markus Bärlocher, 07/09/2016 13:27:

But I'm not a programmer - I need a format like CSV, or TXT as a table,
with these columns:
- ISO-639-3 short code
- name of language in English
- name of language local in local writing system


https://meta.wikimedia.org/wiki/Special:SiteMatrix , copy and paste in 
LibreOffice calc.
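The same table can also be pulled from the API (action=sitematrix) and written as CSV; a sketch under the assumption that 'localname' carries the English name and 'name' the autonym (note the codes are Wikimedia language codes, mostly ISO 639-1, not ISO 639-3):

```python
import csv, io

def sitematrix_rows(matrix: dict):
    """Yield (code, English name, local name) from the numbered
    language entries of an action=sitematrix response."""
    for key, entry in matrix.items():
        if key.isdigit():
            yield entry["code"], entry.get("localname", ""), entry.get("name", "")

# Tiny inline sample standing in for the real API response:
sample = {
    "count": 2,
    "0": {"code": "eo", "name": "Esperanto", "localname": "Esperanto"},
    "1": {"code": "nap", "name": "Napulitano", "localname": "Neapolitan"},
}
out = io.StringIO()
csv.writer(out).writerows(sitematrix_rows(sample))
print(out.getvalue())
```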


Nemo



Re: [Wikidata] A property to exemplify SPARQL queries associated with a property

2016-08-23 Thread Federico Leva (Nemo)

Dario Taraborelli, 23/08/2016 20:35:

would it stretch too much the scope of properties of properties?


One can always use the talk page, like with all the templates 
documenting usage and monitoring.


Nemo



Re: [Wikidata] Info box proposal

2016-08-03 Thread Federico Leva (Nemo)

Brill Lyle, 03/08/2016 19:20:

I am not saying editing Wiki Markup on Wikidata. Is that what you are
describing?


No.

Nemo



Re: [Wikidata] Info box proposal

2016-08-03 Thread Federico Leva (Nemo)

Brill Lyle, 03/08/2016 18:53:

Speaking of: Where's the user documentation for Authority Control? Have
you tried to update and/or add Authority Control on Wikidata manually?


Sure. I've also taught dozens of persons and none of them preferred 
entering said data via wikitext (despite being taught that option too).


Nemo



Re: [Wikidata] Info box proposal

2016-08-03 Thread Federico Leva (Nemo)

Brill Lyle, 03/08/2016 13:30:

Huge barrier for Wikipedia end-users.


What makes you think so? Did you interview or observe users editing? In 
my experience, Wikidata is much easier for newbies to grasp than 
wikitext or even VisualEditor: like VisualEditor's template editor, 
Wikidata resembles a standard form, which people are used to.


We only need to make sure there are direct deep links from each piece of 
displayed (or missing) information to the statement on Wikidata where 
they are (or should be); and later add dialogs for direct editing from 
the client wikis, as was done long ago with the interlanguage links.


Nemo



Re: [Wikidata] Communicating corrections back to data source

2016-06-11 Thread Federico Leva (Nemo)

Sandra Fauconnier, 10/06/2016 21:12:

Recently his team has developed an initiative called ResourceSync, 
which seems to be addressing exactly this - keeping distributed 
databases on the web mutually up to date.
It’s the closest thing I’ve ever seen that seems to address what we (and
the entire interlinked web) would need in this area. I might have missed
other initiatives, but this one gave me a big AHA moment!


That's more like a backup/mirroring system 
https://www.openarchives.org/rs/1.0/resourcesync#MotivExamples (similar 
to LOCKSS etc.), while here we're talking of data reconciliation.


Avoiding data loss, linkrot and single points of failure for the web is 
indeed something that interests Wikimedia; I believe the most advanced 
proposal in this area is 
http://brewster.kahle.org/2015/08/11/locking-the-web-open-a-call-for-a-distributed-web-2/


Nemo



Re: [Wikidata] Possibility of data lock

2016-06-09 Thread Federico Leva (Nemo)

john cummings, 09/06/2016 19:19:

I don't know what the solution is but the current situation doesn't seem
to work, I spent at least 40 hours (+ Nav's time) to import the 670
items and the data I imported has already started to become less correct
through user edits and bot edits.


If you start from the assumption that the starting data is perfect, of 
course it can only get worse. I'm not sure what was your purpose in 
importing this dataset (is there a page describing the effort?) but you 
might want to look into another method.


All the datasets I've seen imported on Wikidata have been improved 
significantly on the wiki. Of course, one has to live with the fact that 
the dataset will diverge. That's hardly a novel challenge: free software 
flourishes with forks, and each language edition of Wikipedia has its 
own version of each article.


Nemo



Re: [Wikidata] WDQS URL shortener

2016-06-01 Thread Federico Leva (Nemo)

Dario Taraborelli, 01/06/2016 10:33:

I don't, it probably depends on what shorteners are most used for spam
purposes across Wikimedia projects. Maybe someone familiar with URL
blacklisting from major wikis can comment?


Nearly all URL shorteners get blacklisted, eventually.

Nemo



Re: [Wikidata] Is there a dump of wiki media items?

2016-05-29 Thread Federico Leva (Nemo)

Melvin Carvalho, 29/05/2016 19:31:

Is there a way I can get a dump of media items in wikidata or wikimedia
commons


https://meta.wikimedia.org/wiki/Mirroring_Wikimedia_project_XML_dumps#Media_tarballs

Nemo



Re: [Wikidata] Ontology

2016-05-14 Thread Federico Leva (Nemo)

Gerard Meijssen, 14/05/2016 15:39:

When an external ontology says that something is a disease and the DSM-5
says it is not. There is a huge problem.


Until recently DSM called homosexuality a disease, we must live with 
conflicts.

https://en.wikipedia.org/wiki/Diagnostic_and_Statistical_Manual_of_Mental_Disorders#DSM-III-R_.281987.29

Nemo



Re: [Wikidata] [Wikimedia-l] ArticlePlaceholder now live on first 4 Wikipedias

2016-05-13 Thread Federico Leva (Nemo)

Lydia Pintscher, 13/05/2016 18:28:

Ok search integration should be fixed now.


I see a link at the very end of 
https://eo.wikipedia.org/?search=Paolo+Monti (not so convenient; using 
the right side, as for the interwiki search results, may have been 
better) but nothing with phrase search, after the 3 full-text matches: 
https://eo.wikipedia.org/?search="Paolo+Monti"


Nemo



Re: [Wikidata] ArticlePlaceholder now live on first 4 Wikipedias

2016-05-11 Thread Federico Leva (Nemo)

Thanks!

Lydia Pintscher, 11/05/2016 21:48:

* Odia:
https://or.wikipedia.org/wiki/%E0%AC%AC%E0%AC%BF%E0%AC%B6%E0%AD%87%E0%AC%B7:AboutTopic?entityid=Q131074
* Neapolitan:
https://nap.wikipedia.org/wiki/Speci%C3%A0le:AboutTopic?entityid=Q2613697
* Esperanto:
https://eo.wikipedia.org/wiki/Speciala%C4%B5o:AboutTopic?entityid=Q12345
* Haitian Creole:
https://ht.wikipedia.org/wiki/Espesyal:AboutTopic?entityid=Q12345



What URLs are linked from what other places of the Wikipedia interface, so 
far? Is a URL like 
https://nap.wikipedia.org/wiki/Speci%C3%A0le:AboutTopic/Q2613697 
recorded in pageviews, so that users can see how many visits they receive?


Would also be nice to have a page somewhere (on Meta, probably) to 
describe the current situation, e.g. if 
https://phabricator.wikimedia.org/T113954#1777562 is still the criterion 
for search. I don't see anything in 
https://eo.wikipedia.org/?search=Paolo+Monti for instance, nor any way 
to reach the special page (pending 
https://phabricator.wikimedia.org/T109437 ).


Nemo



Re: [Wikidata] Aren't pages too long?

2016-05-03 Thread Federico Leva (Nemo)

My main issues are
* there isn't a TOC, which is sorely needed to be able to jump to the 
various sections (e.g. identifiers, sitelinks);
* there is only one "add statement" button and it's in an unpredictable 
position.
The two issues are very easy to fix IMHO: just add a TOC and a second 
button at the top.


Nemo



Re: [Wikidata] Photographers' Identities Catalog (& WikiData)

2016-04-03 Thread Federico Leva (Nemo)

David Lowe, 03/04/2016 18:42:

Actually, Nemo, I haven't yet added them to mix-n-match (though we've
discussed previously).


Ah, sorry. Then follow James' suggestion. I recommended QuickStatements 
to the national central library of Italy (BNCF) too, AFAIK it worked 
well for them. One advantage is that all edits are attributed to you in 
the history.


Nemo



Re: [Wikidata] Photographers' Identities Catalog (& WikiData)

2016-04-03 Thread Federico Leva (Nemo)

David Lowe, 03/04/2016 17:10:

Also, I should have pointed out that I have 11,000 Wikidata
entities matched in PIC already. Is there a quick & easy way to get
those in?


IIRC you said you already added those matches to mix-n-match. When the 
property exists, Magnus is usually very fast at importing the 
associations into Wikidata with his bot. :)


Nemo



Re: [Wikidata] Photographers' Identities Catalog (& WikiData)

2016-04-03 Thread Federico Leva (Nemo)

David Lowe, 15/12/2015 00:00:

Once PIC is actually launched and at a permanent url (January sometime),
I'd love to get PIC ID #s into the WD records.


Have you proposed the property yet? If not please do: 
https://www.wikidata.org/wiki/Wikidata:Property_proposal/Authority_control


Nemo



Re: [Wikidata] from Freebase to Wikidata: the great migration

2016-02-18 Thread Federico Leva (Nemo)

Lydia Pintscher, 18/02/2016 15:59:

Thomas, Denny, Sebastian, Thomas, and I have published a paper which was
accepted for the industry track at WWW 2016. It covers the migration
from Freebase to Wikidata. You can now read it here:
http://research.google.com/pubs/archive/44818.pdf



Nice!

> Concluding, in a fairly short amount of time, we have been
> able to provide the Wikidata community with more than
> 14 million new Wikidata statements using a customizable

I must admit that, despite knowing the context, I wasn't able to 
understand whether this is the number of "mapped"/"translated" 
statements or the number of statements actually added via the primary 
sources tool. I assume the latter given paragraph 5.3:


> after removing duplicates and facts already contained in Wikidata, we
> obtain 14 million new statements. If all these statements were added to
> Wikidata, we would see a 21% increase of the number of statements in
> Wikidata.

Nemo



Re: [Wikidata] Strict mode for TeX/Math extension (was: upcoming deployments/features)

2016-02-05 Thread Federico Leva (Nemo)

Daniel Kinzler, 04/02/2016 19:02:

I agree that it would be nice to have a "strict TeX mode" for the math
extension. I'm not sure whether we would enable that on wikidata. While it would
make the life of consumers easier, it would make the life of people importing
from wikipedia harder. But perhaps it would be worth it, especially if the use
of "extra stuff" is actually rare on wikipedia.

Moritz, how hard would it be to add a "strict mode" that would disallow any
non-standard syntax?


I second the question, which I think warrants a new thread. Remember to 
file a task in Phabricator when the dust settles down. :)


Nemo



Re: [Wikidata] Other sites

2016-02-03 Thread Federico Leva (Nemo)

Lydia Pintscher, 03/02/2016 14:47:


What sites are allowed in the item "other sites" section? I haven't
found documentation about it.

Currently MediaWiki, Wikispecies, Meta, Wikidata and Commons are in this
section.


What section are we talking about? Is this part of how entries are 
presented on Wikidata, or the sidebar on client wikis, or something else?


Nemo



Re: [Wikidata] Maintenance scripts for clients

2016-01-31 Thread Federico Leva (Nemo)

John Erling Blad, 06/08/2015 17:13:

A couple of guys at nowiki has started using this tool, and if they
continue at present speed the list will be emptied in two weeks time.

Can you please add nnwiki too and I will inform the community there
that there is a tool available.


4941 items in the list, so it looks like that didn't happen. How come 
bots are slacking? I thought they cared a lot about creating new 
Wikidata items for new articles!


Nemo



On Thu, Aug 6, 2015 at 10:55 AM, Magnus Manske
 wrote:

John: List mode!

https://tools.wmflabs.org/wikidata-todo/duplicity.php?wiki=nowiki=list

On Thu, Aug 6, 2015 at 8:16 AM Zolo ...  wrote:


About missing labels: in frwiki, most Wikidata data are added using
Module:Wikidata. The module adds a generic "to be translated" category when
there is no French label. With Wikidata usage picking up speed, the
community is finally coming to grips with it, as can be seen from the stats
at Catégorie:Page utilisant des données de Wikidata à traduire.
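The check Module:Wikidata performs could be sketched like this in Python rather than Lua (an illustrative sketch, not the module's actual code; the function name `missing_labels` is made up, but the `labels` JSON shape matches what the `wbgetentities` API returns with `props=labels`):

```python
def missing_labels(entity: dict, languages: list) -> list:
    """Return the requested language codes for which `entity` has no label.

    `entity` follows the JSON shape returned by wbgetentities with
    props=labels, e.g. {"labels": {"en": {"language": "en", "value": "..."}}}.
    """
    labels = entity.get("labels", {})
    return [lang for lang in languages if lang not in labels]


# No French label here, so a client wiki could file the page in a
# "to be translated" maintenance category.
entity = {"labels": {"en": {"language": "en", "value": "Douglas Adams"}}}
print(missing_labels(entity, ["fr", "en"]))  # → ['fr']
```

On the wiki itself the same test drives a category link: if the returned list is non-empty, the module emits the maintenance category instead of (or alongside) the fallback label.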

On Tue, Aug 4, 2015 at 7:06 PM, John Erling Blad  wrote:


Nice solution, I'll post a link at Wikipedia:Torget.
It is a bit like making a traffic statistic by using a road cam, so it
wasn't really what I was looking for..

On Tue, Aug 4, 2015 at 5:18 PM, Magnus Manske
 wrote:

I set up one of my tools for you (nowiki) for [1] :
https://tools.wmflabs.org/wikidata-todo/duplicity.php

It doesn't give you a list (though I could add that), rather presents
you
with a random one and tries to find a matching item. Basically, what
you
need to do anyway for due diligence.


Not quite sure what else you need, too much "somehow" in your
description...


On Tue, Aug 4, 2015 at 4:01 PM John Erling Blad 
wrote:


We lack several maintenance scripts for the clients, that is, human
readable special pages with reports on which pages lack special
treatment. In no particular order: we need some way to identify
unconnected pages in general (the present one does not work [1]), we
need some way to identify pages that are unconnected but have some
language links, we need to identify items that are used in some
language and lack labels (almost like [2], but on the client and for
items that are somehow connected to pages on the client), and we need
to identify items that lack specific claims while the client pages use
a specific template.

There are probably more such maintenance pages; these are the most
urgent ones. Now users have started to create categories to hack around
the missing maintenance pages, which creates a bunch of categories.[3]
At Norwegian Bokmål there are just a few scripts that utilize data
from Wikidata, yet the number of categories is already growing large.

For us at the "receiving end" this is a show stopper. We can't
convince the users that this is a positive addition to the pages
without the maintenance scripts, because then we are more or less in
the dark when we try to fix errors. We can't rely on random pages to
stumble upon something that is wrong; we must be able to search for
the errors and fix them.

This summer we (nowiki) have added about ten (10) properties to the
infoboxes, some with scripts and some with the property parser
function. Most of my time I have not been coding, and I have not been
fixing errors. I have been trying to explain to the community why
Wikidata is a good idea. At one point the changes were even reverted
because someone disagreed with what we had done. The whole thing
basically revolves around "my article got a Q-id in the infobox and I
don't know how to fix it". We know how to fix it, and I have explained
that to the editors at nowiki several times. They still don't get it,
so we need some way to fix it, and we don't have maintenance scripts
to do it.

Right now we don't need more wild ideas that will swamp the
development for months and years to come, we need maintenance scripts,
and we need them now!

[1] https://no.wikipedia.org/wiki/Spesial:UnconnectedPages
[2] https://www.wikidata.org/wiki/Special:EntitiesWithoutLabel
[3]

https://no.wikipedia.org/wiki/Spesial:Prefiksindeks/Kategori:Artikler_hvor

John Erling Blad
/jeblad


[Wikidata] Fwd: [Wikimania-l] WMF Scholarship Deadline for Wikimania is Today

2016-01-09 Thread Federico Leva (Nemo)




 Messaggio inoltrato 
Oggetto:[Wikimania-l] WMF Scholarship Deadline for Wikimania is Today
Data:   Sat, 9 Jan 2016 14:55:30 -0600
Mittente:   Ellie Young



Reminder:
Scholarship applications for Wikimania 2016 which is being held in Esino
Lario, Italy on June 22–27, 2016 are now being accepted. Applications
are open until Saturday, January 09 2016 23:59 UTC. Applicants will be
able to apply for a partial or full scholarship. A full scholarship will
cover the cost of an individual's round-trip travel, shared
accommodation, and conference registration fees as arranged by the
Wikimedia Foundation. A partial scholarship will cover conference
registration fees and shared accommodation. Applicants will be rated
using a pre-determined selection process and selection criteria
established by the Scholarship Committee and the Wikimedia Foundation,
who will determine which applications are successful. To learn more
about Wikimania 2016 scholarships, please visit:
https://wikimania2016.wikimedia.org/wiki/Scholarships
To apply for a scholarship, fill out the multi-language application form on:
https://scholarships.wikimedia.org/apply
It is highly recommended that
applicants review all the material on the Scholarships page and the
associated FAQ (
https://wikimania2016.wikimedia.org/wiki/Scholarships/FAQ) before
submitting an application. If you have any questions, please contact:
wikimania-scholarships at wikimedia.org
or leave a message at:
https://wikimania2016.wikimedia.org/wiki/Talk:Scholarships


--
Ellie Young
Events Manager
Wikimedia Foundation
eyo...@wikimedia.org 






Re: [Wikidata] Duplicates in Wikidata

2015-12-27 Thread Federico Leva (Nemo)

Is this something for a Wikidata game? :)

Nemo



Re: [Wikidata] Wikidata Analyst, a tool to comprehensively analyze quality of Wikidata

2015-12-09 Thread Federico Leva (Nemo)
Useful and very pretty; I can't wait for the analysis by import source. 
I'll try to dig into the data to find interesting evidence/examples of 
data to make more use of.


Nemo



Re: [Wikidata] data access for Wikispecies, MediaWiki, Meta and Wikinews is coming

2015-12-01 Thread Federico Leva (Nemo)

Lydia Pintscher, 01/12/2015 16:28:

We chatted with fundraising and agreed to postpone this a bit on Meta
to not interfere with fundraising on the most successful days of the
year.


Funny how WMF always gets in the way of sister projects.


We'll move it to the 15th.


Let's cross fingers!

Nemo



Re: [Wikidata] Preferred rank -- choices for infoboxes, versus SPARQL

2015-11-28 Thread Federico Leva (Nemo)

Gerard Meijssen, 28/11/2015 07:05:

A big city is what? A city with more than a given number of inhabitants?
If so it is redundant because it can be inferred.


Criteria might be defined by local law and/or require some 
administrative act. That's how it works in Italy, for instance.


Nemo



Re: [Wikidata] Mix'n'match: how to preserve manually audited items for posterity?

2015-11-22 Thread Federico Leva (Nemo)

Dario Taraborelli, 21/11/2015 18:34:

I spent most of my time manually auditing automatically matched entries
from the Dizionario Biografico degli Italiani [2].


Thank you! That's very useful. I did some thousands too. :)


My favorite example? Mix'n'match suggested a match between /Giulio
Baldigara/ (Q1010811) and /Giulio Baldigara/ (DBI) which looked totally
legitimate: these two individuals are both Italian architects from the
16th century with the same name, they were both born around the same
years in the same city, they were both active in Hungary at the same
time: strong indication that they are the same person, right? It turns
out they are brothers, and the full name of the person referenced in
Wikidata is /Giulio Cesare Baldigara/ (the least known in a family of
architects). I unmatched the suggestion and flagged the DBI entry as
non-existing in Wikidata.


Yes, this happens every now and then with Europeans of that period, also 
with a father and son having the very same name and the very same field 
of activity, or even the same publications.
	Creating an item is good, as long as you have at least one piece of 
distinguishing information.
	The standard practice (at least on it.wiki) in such very ambiguous 
cases is to add a disambiguation page or note, even if the target 
article doesn't exist yet. In 
https://it.wikipedia.org/wiki/Antonio_Montanari I went the extra mile 
and also added more information, including a source: you can go into any 
level of detail, unlike on Wikidata.
	I encourage you to create a disambiguation page at 
https://it.wikipedia.org/wiki/Giulio_Baldigara with all the information 
you told us here (policy reference: 
https://it.wikipedia.org/wiki/Aiuto:Disambiguazione#Link_rossi ).


Nemo



Re: [Wikidata] Freebase to Wikidata: Results from Tpt internship

2015-10-02 Thread Federico Leva (Nemo)

Thad Guidry, 02/10/2015 21:44:

​To my eyes, it shows that the Asia continent is still generally void of
any useful machine-readable Knowledge, in either Freebase or Wikidata.
(or anywhere else)​  But this is already a known state of affairs and
probably will not improve until 1 Million USA students learn Mandarin. :)


It also shows that Wikidata and Freebase have different opinions on 
where the centre of Europe is (or maybe one of the two has tons of 
statements on Cape Town! too lazy to manually calculate labels on the axes).


Nemo



Re: [Wikidata] Importing Freebase (Was: next Wikidata office hour)

2015-09-29 Thread Federico Leva (Nemo)

Denny Vrandečić, 28/09/2015 23:27:

Actually, my suggestion would be to switch on Primary Sources as a
default tool for everyone.


Yes, it's a desirable aim to have one-click suggested actions (à la 
Wikidata game) embedded into items for everyone. As for this tool, 
independently of the data used, at least its slowness and misleading 
messaging need to be fixed first: 
https://www.wikidata.org/wiki/Wikidata_talk:Primary_sources_tool


(Compare: we already have very easy "remove" buttons on all statements 
on all items. So the interface for large-scale easy correction of 
mistakes is already there, while for *insertion* it's still missing. 
Which is also the gist of Gerard's argument, I believe. I agree with 
Lydia we can eventually do both, of course.)


Nemo



Re: [Wikidata] Importing Freebase (Was: next Wikidata office hour)

2015-09-29 Thread Federico Leva (Nemo)

Thomas Steiner, 28/09/2015 23:32:

Note: as far as I can tell, the stats available at
https://tools.wmflabs.org/wikidata-primary-sources/status.html  so far
do not differentiate between "fact wrong" (as in "Barack Obama is
president of Croatia" [fact wrong]) and "source wrong" ("Barack Obama
is president of the United States", "according to
http://www.theonion.com/; [fact correct, source wrong]).


Indeed. I only briefly tested "primary sources" because it's 
frustratingly slow, but the statements I rejected were not wrong, just 
ugly: for instance redundant references where we already had some. I'd 
dare calling them formatting issues, which a bot can certainly filter. 
But maybe I was lucky!
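The "redundant references which a bot can certainly filter" idea could be sketched as follows (a hypothetical helper, not any existing bot's code; the function name `dedupe_references` is made up, but the `snaks` structure matches the references in the Wikibase JSON data model):

```python
import json


def dedupe_references(references: list) -> list:
    """Drop references whose snaks duplicate those of an earlier reference.

    Each reference is shaped like a `references` entry in the Wikibase
    JSON model, e.g. {"snaks": {...}, "snaks-order": [...]}. Illustrative
    only: a real bot would also compare normalized values, not raw JSON.
    """
    seen = set()
    kept = []
    for ref in references:
        # Serialize the snaks deterministically so equal dicts compare equal.
        key = json.dumps(ref.get("snaks", {}), sort_keys=True)
        if key not in seen:
            seen.add(key)
            kept.append(ref)
    return kept


refs = [
    {"snaks": {"P854": [{"datavalue": "http://example.org"}]}},
    {"snaks": {"P854": [{"datavalue": "http://example.org"}]}},  # duplicate
]
print(len(dedupe_references(refs)))  # → 1
```

Such pre-filtering would keep purely cosmetic cases out of the human review queue, leaving only genuinely doubtful statements for the tool to show.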


Nemo



Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-29 Thread Federico Leva (Nemo)

Peter F. Patel-Schneider, 28/09/2015 22:27:

>I'm aguing against making such inference part of wikibase/wikidata core
>functionality, and hiding it's working ("magic").
>
>However, I very much hope for a whole ecosystem of tools that apply and use 
such
>inference, and make the results obvious to users, both integrated with
>wikidata.org and outside.


Has anyone argued for performing inference and then hiding that it happened?


Some did, I think. :) Anything that doesn't create a recentchanges entry 
is "hiding that it happened".


Nemo



Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Federico Leva (Nemo)

Thad Guidry, 27/09/2015 18:51:

OK, it seems that Wikipedia does have a few nice features. :)

I was able to quickly search History


Congratulations for using action=history! Those interested in learning 
more features of a wiki can check https://www.mediawiki.org/wiki/Principles


> then you would have to fully describe what the property Sister City
> really means...


Good idea. Please add your point of view to the discussion on the 
matter: https://www.wikidata.org/wiki/Property_talk:P190


Nemo


